Development of a Bayesian network for probabilistic risk assessment of pesticides

2021
Author(s):  
Sophie Mentzel ◽  
Merete Grung ◽  
Knut Erik Tollefsen ◽  
Marianne Stenrod ◽  
Karina Petersen ◽  
...  

Conventional environmental risk assessment of chemicals is based on a calculated risk quotient, representing the ratio of exposure to effects of the chemical, in combination with assessment factors to account for uncertainty. Probabilistic risk assessment approaches can offer more transparency, by using probability distributions for exposure and/or effects to account for variability and uncertainty. In this study, a probabilistic approach using Bayesian network (BN) modelling is explored as an alternative to traditional risk calculation. BNs can serve as meta-models that link information from several sources and offer a transparent way of incorporating the required characterization of uncertainty for environmental risk assessment. To this end, a BN has been developed and parameterised for the pesticides azoxystrobin, metribuzin, and imidacloprid. We illustrate the development from deterministic (traditional) risk calculation, via intermediate versions, to fully probabilistic risk characterisation using azoxystrobin as an example. We also demonstrate seasonal risk calculation for the three pesticides.
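The shift described above, from a single deterministic risk quotient (exposure divided by effect) to a probability that the quotient exceeds one, can be sketched in a few lines of Python via Monte Carlo sampling. All distribution parameters below are hypothetical placeholders, not values from the study:

```python
import random

def risk_quotient(exposure, effect):
    """Deterministic risk quotient: ratio of predicted exposure
    to the effect (e.g. no-effect) concentration."""
    return exposure / effect

def probability_of_risk(n=100_000, seed=42):
    """Monte Carlo estimate of P(RQ > 1) when exposure and effect are
    both lognormally distributed (illustrative parameters only)."""
    rng = random.Random(seed)
    exceedances = 0
    for _ in range(n):
        exposure = rng.lognormvariate(-1.0, 0.8)  # hypothetical, e.g. ug/L
        effect = rng.lognormvariate(0.5, 0.6)     # hypothetical, e.g. ug/L
        if exposure / effect > 1.0:
            exceedances += 1
    return exceedances / n
```

Instead of a single ratio with assessment factors, the output is an exceedance probability that carries the variability of both inputs, which is the quantity a Bayesian network node can then condition on.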

Author(s):  
A. A. Flippen ◽  
R. J. Navarro ◽  
A. M. Larsen ◽  
M. Stamatelatos

The safety of the public, the astronaut crew, Agency assets, other payloads, and the environment are NASA’s priorities when assessing the adequacy of space flight designs. While Probabilistic Risk Assessment (PRA) has been successfully applied to Space Shuttle and Space Station vehicle risk decision-making, the mandated use of a non-probabilistic rule-based approach is unique to the safety certification of NASA’s habitable payloads. A 1997 survey of historical safety policies with NASA’s Payload Safety Review Panel (PSRP) revealed that the non-probabilistic approach for habitable payloads was not arbitrary but founded on informed risk decisions made 20 years earlier by then-NASA Headquarters policy makers. Based on a sound payload safety track record, there has been no compelling reason, until recently, to consider expanding from the present NSTS 1700.7B rule-based approach to include risk-based PRA as a viable alternative. However, with the Agency’s increased focus on structured risk management, the establishment of a Risk Assessment Program at NASA Headquarters, and refined PRA guidelines and techniques, PRA is now formally recognized as an essential method for evaluating complex and high-risk systems. The PSRP recognizes a growing need and an opportunity to evaluate the efficacy of risk-based PRA methods for application to increasingly complex next-generation payload technologies. Therefore, it is timely to revisit the potential application of PRA to habitable payloads. This paper discusses PRA as a risk-based method that, when properly implemented, will result in equivalent or improved safety compared with the rule-based failure-tolerance requirements for achieving the Agency’s “Safety First” core value. The benefits and cautions associated with infusing PRA methodology into the PSRP safety certification process are also discussed, as well as a proposed deployment strategy for how PRA might be prudently tailored and applied to habitable payloads.
The use of PRA for assessing payload reliability is unrestricted at NASA but this is beyond the scope of the present discussion of payload safety applications.



2018
Author(s):  
Virgile Baudrot ◽  
Sandrine Charles

Providing reliable environmental quality standards (EQSs) is a challenging issue in environmental risk assessment (ERA). These EQSs are derived from toxicity endpoints estimated from dose-response models to identify and characterize the environmental hazard of chemical compounds such as those released by human activities. These toxicity endpoints include the classical x% effect/lethal concentrations at a specific time t (EC/LC(x,t)) and the new multiplication factors applied to environmental exposure profiles leading to an x% effect reduction at a specific time t (MF(x,t), denoted LP(x,t) by EFSA). However, classical dose-response models used to estimate toxicity endpoints have some weaknesses, such as their dependency on observation time points, which are likely to differ between species (e.g., experiment duration). Furthermore, real-world exposure profiles are rarely constant over time, which makes the use of classical dose-response models difficult and compromises the derivation of MF(x,t). When dealing with survival or immobility toxicity test data, these issues can be overcome with the general unified threshold model of survival (GUTS), a toxicokinetic-toxicodynamic (TKTD) model that provides an explicit framework to analyse both time- and concentration-dependent data sets and to obtain a mechanistic derivation of EC/LC(x,t) and MF(x,t) regardless of x and at any time t of interest. In addition, the assessment of risk is inherently built upon probability distributions, such that the next critical step for ERA is to characterize the uncertainties of toxicity endpoints and, consequently, those of EQSs. With this perspective, we investigated the use of a Bayesian framework to obtain the uncertainties from the calibration process and to propagate them to model predictions, including LC(x,t) and MF(x,t) derivations. We also explored the mathematical properties of LC(x,t) and MF(x,t) as well as the impact of different experimental designs in order to provide recommendations for a robust derivation of toxicity endpoints leading to reliable EQSs: avoid computing LC(x,t) and MF(x,t) for extreme x values (0 or 100%), where uncertainty is maximal; compute MF(x,t) after a long period of time so as to take depuration time into account; and test survival under a few correlated and uncorrelated pulses of the contaminant with respect to depuration.
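As a rough illustration of the reduced stochastic-death variant of GUTS (GUTS-RED-SD) that underlies the abstract, the sketch below integrates the scaled-damage and hazard equations under a constant exposure and recovers an LC(x,t) by bisection. All parameter values are invented for illustration and are not from the paper:

```python
import math

def guts_red_sd_survival(conc, t_end, kd=0.5, bw=0.3, zw=1.0, hb=0.0, dt=0.01):
    """Survival probability at t_end under constant exposure `conc`,
    using GUTS-RED-SD with hypothetical parameters:
        scaled damage:  dD/dt = kd * (C - D)
        hazard rate:    h(t)  = bw * max(D(t) - zw, 0) + hb
        survival:       S(t)  = exp(-integral of h from 0 to t)
    Integrated with a simple forward-Euler scheme."""
    D, H = 0.0, 0.0  # scaled damage and cumulative hazard
    for _ in range(int(t_end / dt)):
        D += kd * (conc - D) * dt
        H += (bw * max(D - zw, 0.0) + hb) * dt
    return math.exp(-H)

def lc(x, t, lo=0.0, hi=100.0, **params):
    """LC(x,t): the concentration giving x% mortality at time t, found by
    bisection (survival decreases monotonically with concentration)."""
    target = 1.0 - x / 100.0
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if guts_red_sd_survival(mid, t, **params) > target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

Because the model is mechanistic in time and concentration, the same two functions yield LC(x,t) for any x and t, which is the property the abstract highlights; a Bayesian calibration would additionally place posterior distributions over kd, bw, zw, and hb and propagate them through these computations.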


Author(s):  
Shinyoung Kwag ◽  
Abhinav Gupta

Conventional probabilistic risk assessment (PRA) methodologies (USNRC, 1983; IAEA, 1992; EPRI, 1994; Ellingwood, 2001) conduct risk assessment for different external hazards by considering each hazard separately and independently of the others. The risk metric for a specific hazard is evaluated by a convolution of the fragility and hazard curves. The fragility curve for a basic event is obtained using empirical, experimental, and/or numerical simulation data for a particular hazard. Treating each hazard as an independent, mutually exclusive event can be inappropriate in some cases, as certain hazards are statistically correlated or dependent. Examples of such correlated events include, but are not limited to, flooding-induced fire, seismically induced internal or external flooding, or even seismically induced fire. In current practice, system-level risk and consequence sequences are typically calculated using a Fault Tree Analysis (FTA), which uses logic gates to express the causative relationships between events. Furthermore, the basic events in an FTA are considered independent. Therefore, conducting a multi-hazard PRA using a fault tree is quite impractical. In some cases, using an FTA to conduct a multi-hazard PRA can even be inaccurate, because an FTA cannot account for uncertainties in events, and the use of logic gates limits the consideration of statistical correlations or dependencies between events. An additional limitation of FTA-based PRA is its inability to easily accommodate newly observed data and to calculate updated risk or accident scenarios under newly available information. Finally, an FTA is not well suited to addressing beyond-design-basis vulnerabilities. Therefore, in this paper, we present the results of a study on multi-hazard risk assessment conducted using a Bayesian network (BN) with Bayesian inference.
The framework can consider general relationships among risks from multiple hazards, allows updating with newly available data/information at any level, and evaluates scenarios for vulnerabilities due to beyond-design-basis events.
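A minimal sketch of the kind of Bayesian-network reasoning described above: a hypothetical three-node network in which a seismic event can induce flooding and both contribute to system failure, queried by exact enumeration. Every probability in the conditional tables is invented for illustration:

```python
# Conditional probability tables (all numbers hypothetical).
P_SEISMIC = 0.01                                    # P(seismic event)
P_FLOOD = {True: 0.30, False: 0.02}                 # P(flood | seismic)
P_FAIL = {(True, True): 0.20, (True, False): 0.05,  # P(failure | seismic, flood)
          (False, True): 0.03, (False, False): 0.001}

def joint(seismic, flood, failure):
    """Joint probability of one configuration of the BN
    Seismic -> Flood, (Seismic, Flood) -> Failure.
    Note the correlated hazards: flooding depends on the seismic event,
    which a fault tree with independent basic events cannot express."""
    p = P_SEISMIC if seismic else 1.0 - P_SEISMIC
    pf = P_FLOOD[seismic]
    p *= pf if flood else 1.0 - pf
    pe = P_FAIL[(seismic, flood)]
    p *= pe if failure else 1.0 - pe
    return p

def query(failure=True):
    """Marginal P(failure) and posterior P(seismic | failure),
    computed by exact enumeration over the hidden nodes."""
    total = seismic_and_fail = 0.0
    for s in (True, False):
        for f in (True, False):
            p = joint(s, f, failure)
            total += p
            if s:
                seismic_and_fail += p
    return total, seismic_and_fail / total
```

Observing a failure sharply raises the posterior probability of a seismic event above its 1% prior, illustrating the Bayesian updating step; in a full multi-hazard PRA, the same mechanism would absorb newly observed data at any node of the network.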

