Using sensitivity analysis to identify key factors for the propagation of a plant epidemic

2018, Vol 5 (1), pp. 171435
Author(s): Loup Rimbaud, Claude Bruchou, Sylvie Dallot, David R. J. Pleydell, Emmanuel Jacquot, et al.

Identifying the key factors underlying the spread of a disease is an essential but challenging prerequisite for designing management strategies. To tackle this issue, we propose an approach based on sensitivity analyses of a spatiotemporal stochastic model simulating the spread of a plant epidemic. This work is motivated by the spread of sharka, caused by Plum pox virus, in a real landscape. We first carried out a broad-range sensitivity analysis, ignoring any prior information on six epidemiological parameters, to assess their intrinsic influence on model behaviour. A second analysis benefited from the available knowledge on sharka epidemiology and was thus restricted to more realistic values. The broad-range analysis revealed that the mean duration of the latent period is the most influential parameter of the model, whereas the sharka-specific analysis uncovered the strong impact of the connectivity of the first infected orchard. In addition to demonstrating the value of sensitivity analyses for stochastic models, this study highlights the impact of the variation ranges of target parameters on the outcome of a sensitivity analysis. With regard to sharka management, our results suggest that surveillance may benefit from paying closer attention to highly connected patches, whose infection could trigger serious epidemics.
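The variance-based sensitivity analyses the abstract describes rank parameters by how much of the output variance each one explains (first-order Sobol' indices). The sketch below estimates such indices with the standard pick-freeze (Saltelli-type) Monte Carlo estimator; the simple additive "model" is a hypothetical stand-in for the spatiotemporal epidemic simulator, chosen so the true indices are known analytically.

```python
import numpy as np

rng = np.random.default_rng(42)
N, d = 50_000, 3  # Monte Carlo sample size, number of uncertain parameters

# Hypothetical stand-in for the epidemic model output (e.g. final epidemic
# size) as a function of three parameters; a real application would call the
# spatiotemporal simulator here. The coefficients are arbitrary.
def model(x):
    return 4.0 * x[:, 0] + 2.0 * x[:, 1] + 1.0 * x[:, 2]

# Two independent input samples on [0, 1]^d (broad-range uniform priors)
A = rng.uniform(size=(N, d))
B = rng.uniform(size=(N, d))
yA, yB = model(A), model(B)
var_y = np.var(np.concatenate([yA, yB]))

# Pick-freeze estimator of the first-order Sobol' index S_i: replace only
# column i of A with the corresponding column of B and correlate.
S = np.empty(d)
for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]
    S[i] = np.mean(yB * (model(ABi) - yA)) / var_y
print(S)  # for this additive toy model, S = (16, 4, 1)/21 analytically
```

For this additive toy the indices sum to one; in an interacting model the gap between first-order and total-order indices would flag interactions.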

2019, Vol 109 (7), pp. 1184-1197
Author(s): Loup Rimbaud, Sylvie Dallot, Claude Bruchou, Sophie Thoyer, Emmanuel Jacquot, et al.

Improvement of management strategies of epidemics is often hampered by constraints on experiments at large spatiotemporal scales. A promising approach consists of modeling the biological epidemic process and human interventions, which both impact disease spread. However, few methods enable the simultaneous optimization of the numerous parameters of sophisticated control strategies. To do so, we propose a heuristic approach (i.e., a practical improvement method approximating an optimal solution) based on sequential sensitivity analyses. In addition, we use an economic improvement criterion based on the net present value, accounting for both the cost of the different control measures and the benefit generated by disease suppression. This work is motivated by sharka (caused by Plum pox virus), a vector-borne disease of Prunus trees (especially apricot, peach, and plum), the management of which in orchards is mainly based on surveillance and tree removal. We identified the key parameters of a spatiotemporal model simulating sharka spread and control and approximated optimal values for these parameters. The results indicate that the current French management of sharka efficiently controls the disease, but it can be economically improved using alternative strategies that are identified and discussed. The general approach should help policy makers to design sustainable and cost-effective strategies for disease management.
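The economic criterion mentioned above, the net present value, is simply the discounted sum of yearly benefits minus control costs. A minimal sketch, with purely illustrative cash flows and a hypothetical 4% discount rate (not values from the study):

```python
# Net present value of a management strategy: discounted sum of yearly net
# cash flows (benefit of disease suppression minus control costs).
def npv(cash_flows, rate):
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

# Year 0: up-front surveillance/removal cost; years 1-4: net benefit.
# All numbers are illustrative.
flows = [-100.0, 30.0, 30.0, 30.0, 30.0]
print(round(npv(flows, 0.04), 3))  # a positive NPV favours the strategy
```

Comparing strategies by NPV is what allows surveillance costs incurred now to be traded off against epidemic losses avoided years later.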


Author(s): T.J. Gilliland, T. Ball, D. Hennessy

This review addresses key factors and impediments that govern the efficient transfer of nutrient energy from primary producing grassland to ruminant milk and meat. The review focuses on permanent improved grasslands, defined as “swards maintained at a high production potential by grass-to-grass renewal”, frequently of a 5- to 10-yr longevity. Breeding progress to date is examined as are the primary objectives for the next generation of cultivars. This involves aligning grass productivity to ruminant demand in three primary aspects, namely intake potential, nutritional value and productivity profile. The opportunity to selectively improve plant traits affecting sward structure, chemical composition, seasonality and ability to persist and perform under farm conditions is evaluated. The EU context involves appraising the impact of variables such as grass species and cultivar, regional abiotic stresses (water, temperature, nutrients, soil type, etc.), biotic stresses from disease and pests, regional diversity in sward management strategies, and the opportunity to minimise the environmental footprint of ruminant farming.


2017, Vol 74 (6), pp. 894-906
Author(s): Abbey E. Camaclang, Janelle M.R. Curtis, Ilona Naujokaitis-Lewis, Mark S. Poesch, Marten A. Koops

We developed a spatially explicit simulation model of poaching behaviour to quantify the relative influence of the intensity, frequency, and spatial distribution of poaching on metapopulation viability. We integrated our model of poaching with a stochastic, habitat-based, spatially explicit population model, applied it to examine the impact of poaching on northern abalone (Haliotis kamtschatkana) metapopulation dynamics in Barkley Sound, British Columbia, Canada, and quantified model sensitivity to input parameters. While demographic parameters remained important in predicting extinction probabilities for northern abalone, our simulations indicate that the odds of extinction are twice as high when populations are subjected to poaching. Viability was influenced by poaching variables that affect the total number of individuals removed. Of these, poaching mortality was the most influential in predicting metapopulation viability, with each 0.1 increase in mortality rate resulting in a 22.6% increase in the odds of extinction. By contrast, the location and spatial correlation of events were less important predictors of viability. When data are limited, simulation models of poaching combined with sensitivity analyses can be useful in informing management strategies and future research directions.
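The reported effect size corresponds to an odds ratio of 1.226 per 0.1 increase in poaching mortality. Odds ratios compound multiplicatively, so larger changes scale the odds by a power of that ratio, a point worth making explicit when interpreting such results:

```python
# Odds ratio of 1.226 per 0.1 increase in poaching mortality (from the
# abstract). For a 0.3 increase, the per-step ratios multiply:
or_per_step = 1.226
increase_0_3 = or_per_step ** 3
print(f"+0.3 mortality multiplies the odds of extinction by {increase_0_3:.3f}")
```

That is, three 22.6% steps raise the odds by roughly 84%, not 3 × 22.6% = 68%.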


2021, Vol 13 (6), pp. 3455
Author(s): Simon Rahn, Marion Gödel, Rainer Fischer, Gerta Köster

Protest demonstrations are a manifestation of fundamental rights. Authorities are responsible for guiding protesters safely along predefined routes, typically set in an urban built environment. Microscopic crowd simulations support decision-makers in finding sustainable crowd management strategies. Planning routes usually requires knowledge about the length of the demonstration march. This case study quantifies the impact of two uncertain parameters, the number of protesters and the standard deviation of their free-flow speeds, on the length of a protest march through Kaiserslautern, Germany. Over 1000 participants walking through more than 100,000 m² lead to a computationally demanding model that cannot be analyzed with a standard Monte Carlo ansatz. We select and apply analysis methods that are efficient for large topographies. This combination constitutes the main novelty of this paper: We compute Sobol’ indices with two different methods, based on polynomial chaos expansions, for a down-scaled version of the original set-up and compare them to Monte Carlo computations. We employ the more accurate of the approaches for the full-scale scenario. The global sensitivity analysis reveals a shift in the governing parameter from the number of protesters to the standard deviation of their free-flow speeds over time, stressing the benefits of a time-dependent analysis. We discuss typical actions, for example floats that reduce the variation of the free-flow speed, and their effectiveness in view of the findings.
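The "shift in the governing parameter over time" can be illustrated with a much simpler surrogate than the crowd simulator: for a near-linear model, squared standardized regression coefficients (SRCs) approximate first-order Sobol' indices, and computing them at several time points shows which input dominates when. Everything in the sketch below (the linear march-length surrogate, its coefficients, the input ranges) is hypothetical, not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(1)
M = 20_000

# Uncertain inputs (hypothetical ranges): crowd size and the standard
# deviation of free-flow walking speeds.
n_protesters = rng.uniform(800, 1200, M)
sigma_v = rng.uniform(0.05, 0.30, M)  # m/s

# Toy surrogate: initial length set by crowd size, plus dispersion that
# grows in time with the spread of free-flow speeds.
def march_length(t):
    return 0.025 * n_protesters + 2.0 * sigma_v * t

def squared_src(t):
    """Squared standardized regression coefficients at time t."""
    X = np.column_stack([n_protesters, sigma_v])
    y = march_length(t)
    Xs = (X - X.mean(0)) / X.std(0)
    ys = (y - y.mean()) / y.std()
    beta, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
    return beta ** 2  # [sensitivity to crowd size, sensitivity to sigma_v]

early, late = squared_src(t=10.0), squared_src(t=600.0)
print(early, late)  # governing parameter shifts from crowd size to speed spread
```

Early on, crowd size dominates the length; after ten simulated minutes, the speed spread does, mirroring the time-dependent pattern the authors report.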


2018
Author(s): Loup Rimbaud, Sylvie Dallot, Claude Bruchou, Sophie Thoyer, Emmanuel Jacquot, et al.

Abstract Improvement of management strategies of epidemics is often hampered by constraints on experiments at large spatiotemporal scales. A promising approach consists of modelling the biological epidemic process and human interventions, which both impact disease spread. However, few methods enable the simultaneous optimisation of the numerous parameters of sophisticated control strategies. To do so, we propose a heuristic approach (i.e., a practical improvement method approximating an optimal solution) based on sequential sensitivity analyses. In addition, we use an economic improvement criterion, based on the net present value, accounting for both the cost of the different control measures and the benefit generated by disease suppression. This work is motivated by sharka (caused by Plum pox virus), a vector-borne disease of Prunus trees (especially apricot, peach and plum) whose management in orchards is mainly based on surveillance and tree removal. We identified the key parameters of a spatiotemporal model simulating sharka spread and control, and approximated optimal values for these parameters. The results indicate that the current French management of sharka efficiently controls the disease, but can be economically improved using alternative strategies that are identified and discussed. The general approach should help policymakers to design sustainable and cost-effective strategies for disease management.


Blood, 2017, Vol 130 (Suppl_1), pp. 707-707
Author(s): Suzanne F Fustolo-Gunnink, K Fijnvandraat, I M Ree, C Caram-Deelder, P Andriessen, et al.

Abstract Introduction Limited evidence supports the widely used practice of administering platelet transfusions to prevent major bleeding in preterm thrombocytopenic neonates. Only 1 randomized controlled trial addressed this issue, but it used thresholds higher than those currently used in clinical practice. In order to assess the impact of platelet transfusions on bleeding risk, the primary objective of this study was to develop a prediction model for bleeding. Platelet transfusion was included as a variable in this model. In these secondary analyses, we further explored the impact of platelet transfusions on bleeding risk. Materials and methods In this multicenter cohort study, neonates with a gestational age (GA) <34 weeks at birth, admitted to a neonatal intensive care unit (NICU), who developed a platelet count <50×10⁹/L were included. The main study endpoint was major bleeding, defined as intraventricular hemorrhage (IVH) grade 3, IVH with parenchymal involvement, other types of intracranial hemorrhage visible on ultrasound scans, pulmonary hemorrhage, or any other type of bleeding requiring immediate intervention. The prediction model was developed using landmarking, in which multiple Cox models at regular time points were combined into 1 supermodel. To further explore the impact of platelet transfusions on bleeding risk, we performed 3 sensitivity analyses by selecting specific transfusions (instead of all transfusions). Sensitivity analysis 1: transfusions according to protocol, defined as transfusions for platelet counts >20×10⁹/L only allowed in case of GA <32 weeks and <1500 grams and presence of NEC, sepsis, or treatment with mechanical ventilation, or in case of invasive procedures. Sensitivity analysis 2: transfusions with fair increments, defined as a platelet count ≥50×10⁹/L within 24 hours. Sensitivity analysis 3: transfusion dose 11 ml/kg or higher. Results A total of 640 neonates were included, with a median gestational age of 28 weeks. 70 neonates developed a major bleed. IUGR, postnatal age, platelet count, and mechanical ventilation were independent predictors of bleeding. The model allowed calculation of two bleeding risks for individual neonates: one in case of platelet transfusion and one in case of no platelet transfusion. 1361 platelet transfusions were administered to 449 of 640 (70%) neonates, of which 87 were hyperconcentrates. The hazard ratio for transfusion in the original model was 1.0, indicating no predictive power. Sensitivity analysis 1: 704 (52%) transfusions were given according to protocol. When selecting these transfusions, the hazard ratio for transfusion changed from 1.0 to 0.5, but the p-value remained >0.05. Sensitivity analysis 2: 764 (56%) of transfusions resulted in a count >50×10⁹/L within 24 hours. When selecting these transfusions, the hazard ratio for transfusion changed from 1.0 to 0.25, but the p-value remained >0.05. 115 (8%) transfusions did not have a follow-up platelet count within 24 hours. Sensitivity analysis 3: of the non-hyperconcentrated platelet transfusions, 517 of 1274 (41%) were ≥11 ml/kg. When selecting these transfusions, the hazard ratio for transfusion changed from 1.0 to 0.1, with a p-value of 0.05. Conclusion With this tool, the absolute risk of bleeding in individual preterm thrombocytopenic neonates can be calculated. Additionally, the risk of bleeding can be assessed for 2 scenarios: with and without platelet transfusion. This can help clinicians in deciding whether or not to transfuse a patient. In the primary model, platelet transfusion was not a predictor of bleeding risk. However, the findings of the sensitivity analyses suggest that transfusions with a dose >11 ml/kg may have a more profound effect on bleeding risk. Disclosures No relevant conflicts of interest to declare.


2021, pp. 1-26
Author(s): Silvana M. Pesenti, Alberto Bettini, Pietro Millossovich, Andreas Tsanakas

Abstract The Scenario Weights for Importance Measurement (SWIM) package implements a flexible sensitivity analysis framework, based primarily on results and tools developed by Pesenti et al. (2019). SWIM provides a stressed version of a stochastic model, subject to model components (random variables) fulfilling given probabilistic constraints (stresses). Possible stresses can be applied on moments, probabilities of given events, and risk measures such as Value-at-Risk and Expected Shortfall. SWIM operates upon a single set of simulated scenarios from a stochastic model, returning scenario weights, which encode the required stress and allow monitoring the impact of the stress on all model components. The scenario weights are calculated to minimise the relative entropy with respect to the baseline model, subject to the stress applied. As well as calculating scenario weights, the package provides tools for the analysis of stressed models, including plotting facilities and evaluation of sensitivity measures. SWIM does not require additional evaluations of the simulation model or explicit knowledge of its underlying statistical and functional relations; hence, it is suitable for the analysis of black box models. The capabilities of SWIM are demonstrated through a case study of a credit portfolio model.
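The core principle — minimum relative-entropy scenario weights under a stress — can be sketched without the package itself (SWIM is distributed for R; the Python below is an independent illustration, not its API). For a mean stress, the entropy-minimising weights take an exponential-tilt form, with the tilt parameter solved so the weighted mean hits the target; the same weights then let one monitor the stress's effect on any other component.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000
x = rng.normal(0.0, 1.0, n)            # simulated scenarios of the stressed component
y = 0.5 * x + rng.normal(0.0, 1.0, n)  # a second model component, monitored under stress

target = 0.5  # stress: raise the mean of x from 0.0 to 0.5

# Under a mean constraint, the weights minimising relative entropy with
# respect to the baseline are an exponential tilt w_i ∝ exp(theta * x_i);
# theta is found by bisection so the weighted mean hits the target.
def weighted_mean(theta):
    w = np.exp(theta * x)
    return (w / w.sum()) @ x

lo, hi = -10.0, 10.0
for _ in range(80):
    mid = 0.5 * (lo + hi)
    if weighted_mean(mid) < target:
        lo = mid
    else:
        hi = mid
theta = 0.5 * (lo + hi)

w = np.exp(theta * x)
w /= w.sum()
print(w @ x)  # weighted mean of x: the stress is met
print(w @ y)  # impact of the stress on the correlated component
```

No re-simulation is needed: the single baseline sample is reweighted, which is what makes this approach suitable for black-box models.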


2021, Vol 21 (1)
Author(s): Ping-Tee Tan, Suzie Cro, Eleanor Van Vogt, Matyas Szigeti, Victoria R. Cornelius

Abstract Background Missing data are common in randomised controlled trials (RCTs) and can bias results if not handled appropriately. A statistically valid analysis under the primary missing-data assumptions should be conducted, followed by sensitivity analyses under alternative justified assumptions to assess the robustness of results. Controlled multiple imputation (MI) procedures, including delta-based and reference-based approaches, have been developed for analysis under missing-not-at-random assumptions. However, it is unclear how often these methods are used, how they are reported, and what their impact is on trial results. This review evaluates the current use and reporting of MI and controlled MI in RCTs. Methods A targeted review of phase II-IV RCTs (non-cluster randomised) published in two leading general medical journals (The Lancet and the New England Journal of Medicine) between January 2014 and December 2019 using MI. Data were extracted on imputation methods, analysis status, and reporting of results. Results of primary and sensitivity analyses for trials using controlled MI analyses were compared. Results A total of 118 RCTs (9% of published RCTs) used some form of MI. MI under missing-at-random was used in 110 trials; this was for the primary analysis in 43/118 (36%) and in sensitivity analyses for 70/118 (59%) (3 used it in both). Sixteen studies performed controlled MI (1.3% of published RCTs), either with a delta-based (n = 9) or a reference-based approach (n = 7). Controlled MI was mostly used in sensitivity analyses (n = 14/16). Two trials used controlled MI for the primary analysis, including one reporting no sensitivity analysis, whilst the other reported similar results without imputation. Of the 14 trials using controlled MI in sensitivity analyses, 12 yielded results comparable to the primary analysis, whereas 2 demonstrated contradictory results. Only 5/110 (5%) trials using missing-at-random MI and 5/16 (31%) trials using controlled MI reported complete details of their MI methods. Conclusions Controlled MI enables the impact of accessible, contextually relevant missing-data assumptions on trial results to be examined. The use of controlled MI is increasing but is still infrequent and poorly reported where used. There is a need for improved reporting on the implementation of MI analyses and the choice of controlled MI parameters.
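The delta-based approach the review discusses works by imputing under missing-at-random and then shifting the imputed values by a fixed offset delta to encode a missing-not-at-random assumption, sweeping delta to see when conclusions change. A deliberately minimal sketch with simulated data: a single mean imputation stands in for proper multiple imputation (a real analysis would draw several imputed datasets and pool estimates with Rubin's rules).

```python
import numpy as np

rng = np.random.default_rng(3)
n = 1_000
y = rng.normal(10.0, 2.0, n)       # simulated trial outcome (hypothetical)
missing = rng.random(n) < 0.3      # ~30% of outcomes missing

# Delta-based controlled imputation: impute under MAR, then shift the
# imputed values by delta. Single mean imputation keeps the sketch short;
# real analyses use multiple imputation and Rubin's rules.
def delta_adjusted_mean(delta):
    imputed = y.copy()
    imputed[missing] = y[~missing].mean() + delta
    return imputed.mean()

# Tipping-point style sweep: how does the estimate move with delta?
for delta in (-2.0, 0.0, 2.0):
    print(delta, round(delta_adjusted_mean(delta), 2))
```

The estimate shifts by delta times the missingness fraction, making explicit how strongly the conclusion depends on the assumed departure from missing-at-random.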


2017, Vol 8 (4), pp. 557-575
Author(s): Manjula Devak, C. T. Dhanya

Abstract Different hydrological models provide diverse perspectives on the system being modeled and, inevitably, are imperfect representations of reality. Irrespective of the choice of model, the major source of error in any hydrological modeling is uncertainty in the determination of model parameters, owing to the mismatch between model complexity and available data. Sensitivity analysis (SA) methods help to identify the parameters that have a strong impact on the model outputs and hence influence the model response. In addition, SA assists in analyzing the interactions between parameters, their preferable ranges, and their spatial variability, which in turn influence the model outcomes. Various methods are available to perform SA, and their perturbation techniques vary widely. This study attempts to categorize SA methods according to the assumptions and methodologies involved. The pros and cons associated with each SA method are discussed, the sensitivity pertaining to the impact of space and time resolutions on model results is highlighted, and the applicability of different SA approaches for various purposes is examined. This study further elaborates the objectives behind the selection and application of SA approaches in hydrological modeling, providing valuable insights into limitations, knowledge gaps, and future research directions.
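Among the perturbation techniques such reviews categorize, the simplest is local one-at-a-time (OAT) analysis: perturb each parameter around a nominal point and record the relative output change. A minimal sketch, using a toy linear-reservoir-style "model" with hypothetical parameters (not from the review):

```python
# One-at-a-time (OAT) local sensitivity: perturb each parameter by ±10%
# around a nominal point and compute a central-difference elasticity
# (relative output change per relative parameter change).
def reservoir_peakflow(k, area, rain):
    # Toy stand-in for a hydrological model: discharge scales with
    # catchment area and rainfall, and inversely with a storage constant k.
    return area * rain / k

nominal = {"k": 5.0, "area": 20.0, "rain": 8.0}

def oat_sensitivity(perturb=0.10):
    base = reservoir_peakflow(**nominal)
    out = {}
    for name in nominal:
        hi = dict(nominal, **{name: nominal[name] * (1 + perturb)})
        lo = dict(nominal, **{name: nominal[name] * (1 - perturb)})
        out[name] = (reservoir_peakflow(**hi) - reservoir_peakflow(**lo)) / (2 * perturb * base)
    return out

print(oat_sensitivity())  # elasticities: +1 for area and rain, about -1 for k
```

OAT is cheap but purely local and blind to parameter interactions, which is exactly why global methods (variance-based, regression-based, screening) occupy the rest of such a categorization.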


2019, Vol 109 (7), pp. 1198-1207
Author(s): Coralie Picard, Samuel Soubeyrand, Emmanuel Jacquot, Gaël Thébaud

Epidemiological models are increasingly used to predict epidemics and improve management strategies. However, they rarely consider landscape characteristics, even though such characteristics can influence epidemic dynamics and, thus, the effectiveness of disease management strategies. Here, we present a generic in silico approach that assesses the influence of landscape aggregation on the costs associated with an epidemic and on improved management strategies. We apply this approach to sharka, one of the most damaging diseases of Prunus trees, for which a management strategy is already applied in France. Epidemic simulations were carried out with a spatiotemporal stochastic model under various management strategies in landscapes differing in patch aggregation. Using sensitivity analyses, we highlight the impact of management parameters on the economic output of the model. We also show that the sensitivity analysis can be exploited to identify several strategies that are, according to the model, more profitable than the current French strategy. Some of these strategies are specific to a given aggregation level, which shows that management strategies should generally be tailored to each specific landscape. However, we also identified a strategy that is efficient for all levels of landscape aggregation. This one-size-fits-all strategy has important practical implications because of its simple applicability at a large scale.

