Faculty of 1000 evaluation for Re-assessment of mitigation strategies for deliberate releases of anthrax using a real-time outbreak characterization tool.

Author(s):  
Edward Feil ◽  
Katy Turner


2022 ◽
Author(s):  
Alex D. Washburne ◽  
Nathaniel Hupert ◽  
Nicole Kogan ◽  
William Hanage ◽  
Mauricio Santillana

Characterizing the dynamics of epidemic trajectories is critical to understanding the potential impacts of emerging outbreaks and to designing appropriate mitigation strategies. As the COVID-19 pandemic evolves, however, the emergence of SARS-CoV-2 variants of concern has complicated our ability to assess in real time the potential effects of imminent outbreaks, such as those presently caused by the Omicron variant. Here, we report that SARS-CoV-2 outbreaks across regions exhibit strain-specific times from onset to peak, specifically for the Delta and Omicron variants. Our findings may facilitate real-time identification of peak medical demand and may help fine-tune the deployment of ongoing and future outbreak mitigation efforts.
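As a toy illustration of the quantity this abstract tracks, the sketch below estimates the time from onset to peak in a daily case series. The smoothing window, the onset threshold (first day exceeding 10% of the eventual peak), and the synthetic wave are all assumptions for illustration, not the authors' method.

```python
import numpy as np

def onset_to_peak_days(cases, onset_frac=0.1):
    """Days from outbreak onset to peak incidence.

    Onset is operationalized as the first day smoothed cases exceed
    `onset_frac` of the eventual smoothed peak; real analyses may
    define onset differently.
    """
    cases = np.asarray(cases, dtype=float)
    # 7-day moving average to damp reporting noise
    kernel = np.ones(7) / 7.0
    smooth = np.convolve(cases, kernel, mode="same")
    peak_idx = int(np.argmax(smooth))
    threshold = onset_frac * smooth[peak_idx]
    onset_idx = int(np.argmax(smooth >= threshold))  # first crossing
    return peak_idx - onset_idx

# Synthetic wave: slow rise to a peak around day 60
t = np.arange(120)
wave = 1000 * np.exp(-((t - 60) / 15.0) ** 2)
print(onset_to_peak_days(wave))
```

In practice, a strain-specific onset-to-peak time learned from earlier regional outbreaks would be applied to a newly rising curve to project when demand peaks.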


Fire ◽  
2021 ◽  
Vol 4 (3) ◽  
pp. 55
Author(s):  
Gary L. Achtemeier ◽  
Scott L. Goodrick

Abrupt changes in wind direction and speed caused by thunderstorm-generated gust fronts can, within a few seconds, transform slow-spreading low-intensity flanking fires into high-intensity head fires. Flame heights and spread rates can more than double. Fire mitigation strategies are challenged and the safety of fire crews is put at risk. We propose a class of numerical weather prediction models that incorporate real-time radar data and can provide fire response units with accurate very short-range forecast images of gust front locations and intensities. Real-time weather radar data are coupled with a wind model that simulates density currents over complex terrain. We then simulate two convective systems, from formation and merger to gust front arrival, at the location of the 2013 wildfire at Yarnell, Arizona. We present map images showing the progress of the gust fronts toward the fire; such images can be transmitted to fire crews to assist decision-making. We conclude that very short-range gust front prediction models incorporating real-time radar data show promise as a means of providing critical weather information on gust front propagation for fire operations, and that such tools warrant further study.
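The wind model in this abstract simulates gust fronts as density currents. A far simpler, textbook stand-in for that physics is the classic shallow density-current speed relation, sketched below; the depth, potential-temperature deficit, and Froude number are illustrative assumptions, not values from the study.

```python
import math

def gust_front_speed(depth_m, theta_env_K, theta_deficit_K, froude=0.9):
    """Idealized density-current propagation speed (m/s).

    U = Fr * sqrt(g * H * dtheta / theta_env), the standard shallow
    density-current relation. The Froude number (~0.7-1.1 empirically)
    and the bulk temperature deficit are assumed inputs.
    """
    g = 9.81  # gravitational acceleration, m/s^2
    return froude * math.sqrt(g * depth_m * theta_deficit_K / theta_env_K)

# A 1 km deep outflow that is 5 K colder than a 300 K environment
print(round(gust_front_speed(1000.0, 300.0, 5.0), 1))  # prints 11.5
```

Speeds of this order (roughly 10 m/s) are why a gust front can cross several kilometers and overrun a fire line within minutes, motivating the very short-range forecasts the authors propose.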


2015 ◽  
Vol 15 (6) ◽  
pp. 2985-3005 ◽  
Author(s):  
J.-E. Petit ◽  
O. Favez ◽  
J. Sciare ◽  
V. Crenn ◽  
R. Sarda-Estève ◽  
...  

Abstract. Aerosol mass spectrometer (AMS) measurements have been successfully used towards a better understanding of the chemical properties of non-refractory submicron (PM1) aerosols in short-term campaigns. The recently developed Aerosol Chemical Speciation Monitor (ACSM) has been designed to deliver similar artifact-free chemical information at lower cost and to perform robust monitoring over long periods. When deployed in parallel with real-time black carbon (BC) measurements, the combined data set allows a quasi-comprehensive description of the whole PM1 fraction in near real time. Here we present 2-year-long ACSM and BC data sets, obtained between mid-2011 and mid-2013 at the French SIRTA atmospheric supersite, which is representative of background PM levels in the region of Paris. This large data set reveals intense and time-limited (a few hours) pollution events during wintertime in the region of Paris, pointing to local carbonaceous emissions (mainly combustion sources). A non-parametric wind regression analysis was performed on this 2-year data set for the major PM1 constituents (organic matter, nitrate, sulfate and source-apportioned BC) and for ammonia, in order to better refine their geographical origins and to assess local, regional and advected contributions, information that is essential for efficient mitigation strategies. While ammonium sulfate typically shows a clearly advected pattern, ammonium nitrate partially displays a similar feature but, less expectedly, also exhibits a significant contribution from regional and local emissions. The contribution of regional background organic aerosols (OA) is significant in spring and summer, while a more pronounced local origin is evidenced during wintertime, a pattern also observed for BC originating from domestic wood burning.
Using time-resolved ACSM and BC information, seasonally differentiated weekly diurnal profiles of these constituents were investigated and helped to identify the main parameters controlling their temporal variations (sources, meteorological conditions). Finally, all the major pollution episodes observed over the region of Paris between 2011 and 2013 were carefully investigated and classified in terms of chemical composition and of the BC-to-sulfate ratio, used here as a proxy for the local/regional/advected contribution to PM. In conclusion, these first 2 years of quality-controlled ACSM measurements clearly demonstrate the instrument's potential to monitor aerosol sources and their geographical origins on a long-term basis and to provide strategic information in near real time during pollution episodes. They also support the ACSM as a robust and credible alternative to filter-based sampling techniques for long-term monitoring strategies.
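The non-parametric wind regression named in this abstract is essentially a kernel-weighted average of concentration as a function of wind direction. The minimal one-dimensional (direction-only) sketch below conveys the idea; the published method also smooths over wind speed, and the kernel width and toy data here are assumptions.

```python
import numpy as np

def nwr(wind_dir_deg, conc, grid_deg, sigma=10.0):
    """Kernel-weighted mean concentration as a function of wind direction.

    A minimal direction-only sketch of non-parametric wind regression;
    sigma (deg) is an assumed Gaussian kernel width.
    """
    wd = np.radians(np.asarray(wind_dir_deg, float))
    grid = np.radians(np.asarray(grid_deg, float))
    conc = np.asarray(conc, float)
    # Angular difference via sine/cosine handles the 360/0 wraparound
    dtheta = np.arctan2(np.sin(grid[:, None] - wd), np.cos(grid[:, None] - wd))
    w = np.exp(-0.5 * (np.degrees(dtheta) / sigma) ** 2)  # Gaussian kernel
    return (w @ conc) / w.sum(axis=1)

# Toy data: high concentrations when the wind blows from ~90 deg
wd = np.array([80.0, 90.0, 100.0, 260.0, 270.0, 280.0])
c = np.array([40.0, 50.0, 45.0, 5.0, 6.0, 4.0])
est = nwr(wd, c, grid_deg=[90.0, 270.0])
print(np.round(est, 1))  # high estimate for the 90 deg sector, low for 270
```

Peaks in the regressed concentration field point toward source sectors, which is how the analysis separates local emissions from advected pollution.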


2004 ◽  
Vol 70 (4) ◽  
pp. 2296-2306 ◽  
Author(s):  
G. Douglas Inglis ◽  
Lisa D. Kalischuk

ABSTRACT Campylobacter species are fastidious to culture, and the ability to directly quantify their biomass in microbiologically complex substrates using real-time quantitative (RTQ) PCR may enhance our understanding of their biology and facilitate the development of efficacious mitigation strategies. This study reports the use of nested RTQ-PCR to directly quantify Campylobacter jejuni and Campylobacter lanienae in cattle feces. For C. jejuni, the single-copy mapA gene was selected; for C. lanienae, the three-copy 16S rRNA gene was targeted. RTQ-PCR primers were tested alone or nested within species-specific primers, and amplification products were detected using the intercalating dye SYBR Green. Nesting did not increase the specificity or sensitivity of C. jejuni quantification, and the limit of quantification was 19 to 25 genome copies (≈3 × 10³ CFU/g of feces). In contrast, nested RTQ-PCR targeting the 16S rRNA gene was necessary to confer specificity for C. lanienae. The limit of quantification was 1.8 genome copies (≈250 CFU/g of feces), and there was no discernible difference between the two C. lanienae secondary primer sets evaluated. Detection and quantification of C. jejuni in naturally infected cattle feces by RTQ-PCR were comparable to the results of culture-based methods. In contrast, culturing failed to detect C. lanienae in 6 of 10 fecal samples positive for the bacterium and substantially underestimated cell densities relative to nested RTQ-PCR. The results of this study illustrate that RTQ-PCR can be used to directly quantify campylobacters, including very fastidious species, in a microbiologically and chemically complex substrate. Furthermore, targeting a multicopy universal gene provided highly sensitive quantification of C. lanienae, although nested RTQ-PCR was necessary to confer specificity.
This method will facilitate subsequent studies elucidating the impact of this group of bacteria within the gastrointestinal tracts of livestock, as well as studies of the factors that influence colonization success and shedding.
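Quantification in real-time PCR generally rests on a linear standard curve relating cycle threshold (Ct) to the log of template copies. The sketch below shows that generic conversion; the slope and intercept are illustrative assumptions (a slope of -3.32 corresponds to 100% amplification efficiency), not the curve fitted in this study.

```python
def genome_copies(ct, slope=-3.32, intercept=38.0):
    """Convert a real-time PCR cycle threshold to genome copies via a
    linear standard curve Ct = slope * log10(copies) + intercept.

    Both curve parameters here are assumed for illustration; real
    assays fit them from a dilution series of known standards.
    """
    return 10 ** ((ct - intercept) / slope)

# A sample crossing threshold at cycle 28 under the assumed curve
print(round(genome_copies(28.0)))  # on the order of 1e3 copies
```

Targeting a multicopy gene such as 16S rRNA shifts this curve toward earlier Ct values per cell, which is why it yields the more sensitive quantification reported for C. lanienae.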


Molecules ◽  
2021 ◽  
Vol 26 (16) ◽  
pp. 4761
Author(s):  
Brian T. Buckley ◽  
Rachel Buckley ◽  
Cathleen L. Doherty

Many of the current innovations in instrument design have focused on making instruments smaller, more rugged, and ultimately field-transportable. The ultimate application is obvious: carrying the instrument to the field for real-time sample analysis without the need for a support laboratory. Real-time data are priceless when screening either biological or environmental samples, as mitigation strategies can be initiated immediately upon the discovery that contaminant metals are present in a location where they were not intended to be. Additionally, smaller "handheld" instruments generally require less sample for analysis, possibly increasing sensitivity, another advantage of instrument miniaturization. While many other instruments can be made smaller simply by using available micro-technologies (e.g., the eNose), shrinking an ICP-MS or AES to something one might carry in a backpack or pocket is now closer to reality than in the past, and the path there can be traced through a component-by-component evaluation. While optical and mass spectrometers continue to shrink in size, the ion/excitation source remains a challenge, as a tradeoff exists between excitation capabilities and the power requirements for the plasma's generation. Other supporting elements have only recently become small enough for transport. A systematic review of both where the plasma spectrometer started and the evolution of technologies currently available may provide the roadmap necessary to miniaturize the spectrometer. We identify criteria, on a component-by-component basis, that need to be addressed in designing a miniaturized device and recognize components (e.g., the source) that probably require further optimization. For example, the excitation/ionization source must be energetic enough to take a metal from a solid state to its ionic state. Previously, a plasma required a radio-frequency generator or a high-power DC source, but excitation can now be accomplished with non-thermal (cold) plasma sources.
Sample introduction, for solids, liquids, and gases, presents challenges for all sources in a field instrument. Next, the interface between the source and a mass detector usually requires pressure-reduction techniques to transfer an ion from the plasma to the spectrometer. Currently, plasma mass spectrometers are field-ready but not necessarily handheld. Optical emission spectrometers are already capable of getting photons to the detector and could eventually be connected to a phone. Inert plasma gas generation is close to field-ready if nitrogen generators can be miniaturized. Many of these components are already commercially available or have at least been reported in the literature. Comparisons with other "handheld" elemental analysis devices that employ XRF, LIBS, and electrochemical methods (and their limitations) demonstrate that a "cold" plasma-based spectrometer can be more than competitive. Migrating the cold plasma from an emission-only source to a mass spectrometer source would allow both analyte identification and, potentially, source apportionment through isotopic fingerprinting, and may be the last major hurdle to overcome. Finally, we offer a possible design to aid in making the cold plasma source more applicable to field deployment.


2020 ◽  
Vol 68 (9) ◽  
pp. 750-764
Author(s):  
Christoph Brosinsky ◽  
Rainer Krebs ◽  
Dirk Westermann

Abstract. Emerging real-time applications in information technology and operational technology enable new, innovative concepts for designing and operating cyber-physical systems. A promising approach, recently recognized as a key technology by several industries, is the Digital Twin (DT) concept. A DT connects the virtual representation of a physical object, system or process to available information and sensor data streams, which makes it possible to gather new information about the mirrored system by applying analytic functions. The DT technology can thereby help to fill sensor data gaps, e.g., to support anomaly detection, and to predict future operating conditions and system states. This paper discusses a dynamic power system DT as a cornerstone of a new generation of Energy Management Systems (EMS), and a prospective new EMS architecture to support the increasingly complex operation of electric power systems. Unlike in traditional offline power system models, the parameters are updated dynamically using measurement information from supervisory control and data acquisition (SCADA) and a wide-area monitoring system (WAMS) to tune the model. This allows a highly accurate virtual representation of the mirrored physical objects to be derived. A simulation engine, the Digital Dynamic Mirror (DDM), is introduced to reproduce the state of a reference network in real time. The approach is validated in a case study. In a closed loop within EMS applications, the DDM can help to assess contingency mitigation strategies and thus support the decision-making process under variable system conditions. The next generation of control centre EMS can benefit from this development through augmented dynamic observability and increased operator situation awareness.
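The core of the "dynamic model tuning" idea in this abstract is keeping model parameters aligned with streaming measurements. A minimal, one-parameter recursive least-squares filter can illustrate that update step; the line-admittance framing, variable names, and measurement model below are assumptions for illustration, far simpler than fusing SCADA and WAMS data in a real EMS.

```python
import numpy as np

def rls_update(theta, P, x, y, lam=0.98):
    """One scalar recursive-least-squares step with forgetting factor lam.

    Tracks theta in the assumed measurement model y = theta * x + noise.
    """
    k = P * x / (lam + x * P * x)     # gain
    theta = theta + k * (y - theta * x)
    P = (P - k * x * P) / lam
    return theta, P

rng = np.random.default_rng(1)
true_b = 2.5                          # "true" parameter of the mirrored element
theta, P = 0.0, 100.0                 # uninformed initial estimate
for _ in range(200):
    x = rng.uniform(0.9, 1.1)         # e.g. a measured input quantity
    y = true_b * x + rng.normal(scale=0.05)  # noisy measured output
    theta, P = rls_update(theta, P, x, y)
print(round(theta, 2))                # estimate converges near 2.5
```

The forgetting factor discounts old measurements, so the estimate can follow slow parameter drift, which is the behavior a digital twin needs to stay synchronized with the physical system.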


Author(s):  
Angela E. Kitali ◽  
Priyanka Alluri ◽  
Thobias Sando ◽  
Wensong Wu

Secondary crashes (SCs) have increasingly been recognized as a major problem leading to reduced capacity and additional traffic delays. However, limited knowledge of the nature and characteristics of SCs has largely impeded their mitigation strategies. There are two main issues with analyzing SCs. First, the relevant variables are unknown, and, at the same time, most of the variables considered in the models are highly correlated. Second, only a small proportion of incidents results in SCs, making this an imbalanced classification problem. This study developed a reliable SC risk prediction model using the Least Absolute Shrinkage and Selection Operator (LASSO) penalized logistic regression model with the Synthetic Minority Oversampling TEchnique-Nominal Continuous (SMOTE-NC). The proposed model is expected to improve the predictive accuracy of the SC risk model because it accounts for the asymmetric nature of SCs, performs variable selection, and removes highly correlated variables. The study data were collected over 3 years on a 35-mi section of I-95 in Jacksonville, Florida. SCs were identified based on real-time speed data. The results indicated that real-time traffic variables and primary incident characteristics significantly affect the likelihood of SCs. The most influential variables included the mean of detector occupancy, the coefficient of variation of equivalent hourly volume, the mean of speed, primary incident type, percentage of lanes closed, incident occurrence time, shoulder blockage, number of responding agencies, incident impact duration, incident clearance duration, and roadway alignment. The study results can be used by agencies to develop SC mitigation strategies and thereby improve the operational and safety performance of freeways.
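The modeling recipe in this abstract, L1 (LASSO) penalized logistic regression on an imbalanced outcome, can be sketched as below. SMOTE-NC (from the imbalanced-learn package) is replaced here by scikit-learn's class_weight="balanced" as a lighter stand-in for the imbalance correction, and the features are synthetic, not the study's traffic variables.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 2000
X = rng.normal(size=(n, 6))  # stand-ins for occupancy, speed, volume CV, ...
# Rare outcome (secondary crash) driven by only two of the six features
logit = -4.0 + 1.5 * X[:, 0] + 1.0 * X[:, 1]
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

# L1 penalty performs variable selection; class_weight="balanced"
# reweights the rare positive class (a simpler substitute for SMOTE-NC)
model = LogisticRegression(penalty="l1", solver="liblinear",
                           class_weight="balanced", C=0.5)
model.fit(X, y)
# Shrinkage should retain the two informative coefficients and
# push the four uninformative ones toward zero
print(np.round(model.coef_, 2))
```

The same pattern, penalized fit plus imbalance correction, extends directly to the mixed categorical/continuous incident data described in the study, which is where SMOTE-NC (rather than plain SMOTE) would matter.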

