Main features of the tool SUSA 4.0 for uncertainty and sensitivity analyses

Author(s): Martina Kloos


1997 · Vol 17 (03) · pp. 166-169
Author(s): Judith O’Brien, Wendy Klittich, J. Jaime Caro

Summary: Despite evidence from 6 major clinical trials that warfarin effectively prevents strokes in atrial fibrillation, clinicians and health care managers may remain reluctant to support anticoagulant prophylaxis because of its perceived costs. Yet doing nothing also has a price. To assess this, we carried out a pharmacoeconomic analysis of warfarin use in atrial fibrillation. The course of the disease, including the occurrence of cerebral and systemic emboli and of intracranial and other major bleeding events, was modeled, and a meta-analysis of the clinical trials and other relevant literature was carried out to estimate the required probabilities with and without warfarin use. The cost of managing each event, including acute and subsequent care, home care equipment and MD costs, was derived by estimating the cost per resource unit, the proportion of patients consuming each resource and the volume of use. Unit costs and volumes of use were determined from established US government databases, all charges were adjusted using cost-to-charge ratios, and a 3% discount rate was applied to costs incurred beyond the first year. The proportions of patients consuming each resource were estimated by fitting a joint distribution to the clinical trial data, stroke outcome data from a recent Swedish study and aggregate ICD-9-specific Massachusetts discharge data. If nothing is done, 3.2% more patients will suffer serious emboli annually, and the expected annual cost of managing a patient will increase by DM 2,544 (1996 German Marks), from DM 4,366 to DM 6,910. Extensive multiway sensitivity analyses revealed that the higher price of doing nothing persists except for very extreme combinations of inputs unsupported by the literature or clinical standards. The price of doing nothing is thus so high, both in health and economic terms, that cost-consciousness as well as clinical considerations mandate warfarin prophylaxis in atrial fibrillation.
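The arithmetic behind such a comparison is an expected-value calculation over event probabilities and per-event management costs, with costs beyond the first year discounted at 3%. A minimal sketch follows; the event probabilities and unit costs are invented placeholders, and only the 3% discount rate comes from the abstract.

```python
# Minimal sketch of the expected-cost comparison in the abstract.
# Event probabilities and per-event costs are invented placeholders;
# only the 3% discount rate is taken from the text.

DISCOUNT_RATE = 0.03  # applied to costs incurred beyond the first year

def expected_annual_cost(event_probs, event_costs):
    """Expected annual management cost per patient: sum of p(event) * cost(event)."""
    return sum(p * event_costs[e] for e, p in event_probs.items())

def present_value(annual_cost, years):
    """Discounted cost of `years` equal annual outlays; year 1 is undiscounted."""
    return sum(annual_cost / (1.0 + DISCOUNT_RATE) ** t for t in range(years))

costs = {"embolus": 90_000.0, "major_bleed": 40_000.0}    # DM per event (invented)
with_warfarin = {"embolus": 0.014, "major_bleed": 0.013}  # annual probabilities
do_nothing    = {"embolus": 0.046, "major_bleed": 0.005}  # (invented)

delta = (expected_annual_cost(do_nothing, costs)
         - expected_annual_cost(with_warfarin, costs))
print(f"incremental annual cost of doing nothing: DM {delta:,.0f}")
print(f"10-year discounted burden: DM {present_value(delta, 10):,.0f}")
```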


AIAA Journal · 1999 · Vol 37 · pp. 653-656
Author(s): Ying Gu, Lifen Chen, Wenliang Wang

2019
Author(s): Siobhan Hugh-Jones, Sophie Beckett, Pavan Mallikarjun

Schools are promising sites for the delivery of prevention and early intervention programs to reduce child and adolescent anxiety. It is unclear whether universal or targeted approaches are most effective. This review and meta-analysis examines the effectiveness of school-based indicated interventions and was registered with PROSPERO [CRD42018087628]. MEDLINE, EMBASE, PsycINFO and the Cochrane Library were searched for randomised controlled trials comparing indicated school programs for child and adolescent anxiety to active or inactive control groups. Twenty original studies, with 2076 participants, met the inclusion criteria, and 18 were suitable for meta-analysis. Sub-group and sensitivity analyses explored intervention intensity, delivery agent and control type. A small beneficial effect was found for indicated programs compared to controls on self-reported anxiety symptoms at post-test (g = -0.28, CI = -0.50, -0.05, k = 18). The small effect was maintained at 6 months (g = -0.35, CI = -0.58, -0.13, k = 9) and 12 months (g = -0.24, CI = -0.48, 0.00, k = 4). Based on two studies, effects beyond 12 months were very small (g = -0.01, CI = -0.38, 0.36). No differences were found based on intervention intensity, delivery agent or control type. There was evidence of publication bias and a relatively high risk of contamination in the included studies. The findings support the value of school-based indicated programs for child and adolescent anxiety; effects at 12 months outperform those of many universal programs. High-quality randomised controlled and pragmatic trials are needed, with attention control groups and diagnostic assessments beyond 12 months.
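Pooled effects of this kind (Hedges' g with 95% CIs over k studies) typically come from a random-effects meta-analysis. As an illustration only, here is a minimal DerSimonian-Laird pooling routine; the per-study effects and variances are invented, and the review's own analysis may differ in detail.

```python
import math

def dersimonian_laird(effects, variances):
    """Random-effects pooled estimate (DerSimonian-Laird), as commonly used
    to combine Hedges' g across k trials. Inputs: per-study g and var(g)."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)   # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]  # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Invented toy data for three studies (g, variance) -- not the review's data.
g, ci = dersimonian_laird([-0.4, -0.2, -0.3], [0.02, 0.03, 0.025])
print(f"pooled g = {g:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
```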


2019 · Vol 3 (Special Issue on First SACEE'19) · pp. 173-180
Author(s): Giorgia Di Gangi, Giorgio Monti, Giuseppe Quaranta, Marco Vailati, Cristoforo Demartino

This paper investigates the seismic performance of timber light-frame shear walls, focusing on the energy dissipation and ductility provided by sheathing-to-framing connections. An original parametric finite element model was developed to perform sensitivity analyses. The model accounts for the design variables affecting the racking load-carrying capacity of the wall: aspect ratio (height-to-width), fastener spacing, number of vertical studs and cross-section size of the framing elements. A failure criterion, based on observation of both the global behaviour of the wall and the local behaviour of the fasteners, was defined to identify the ultimate displacement of the wall. The equivalent viscous damping was assessed numerically by estimating the damping factor used in the capacity spectrum method. Finally, an in-depth analysis of the sensitivity results led to a simplified analytical procedure able to predict the capacity curve of a timber light-frame shear wall.
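For context, the equivalent viscous damping used in the capacity spectrum method is conventionally computed from a force-displacement hysteresis cycle as xi_eq = E_D / (4·pi·E_S0), where E_D is the energy dissipated per cycle (the loop area) and E_S0 is the elastic strain energy at peak response. A minimal sketch of that textbook formula, assuming digitized force and displacement arrays for one closed cycle (not the paper's implementation):

```python
import math

def equivalent_viscous_damping(force, disp):
    """xi_eq = E_D / (4*pi*E_S0): E_D is the hysteresis loop area (energy
    dissipated per cycle), E_S0 = 0.5 * F_max * u_max the strain energy at
    peak response. Textbook definition, not the paper's implementation."""
    # Loop area via the shoelace formula over the closed cycle.
    e_d = 0.5 * abs(sum(disp[i] * force[i + 1] - disp[i + 1] * force[i]
                        for i in range(-1, len(disp) - 1)))
    e_s0 = 0.5 * max(abs(f) for f in force) * max(abs(u) for u in disp)
    return e_d / (4.0 * math.pi * e_s0)

# Toy rectangular (rigid-perfectly-plastic) loop, invented numbers:
# should recover the theoretical maximum 2/pi ~= 0.64.
U = [30.0, 30.0, -30.0, -30.0]   # displacements, mm
F = [12.0, -12.0, -12.0, 12.0]   # forces, kN
print(f"xi_eq = {equivalent_viscous_damping(F, U):.3f}")
```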


2020 · Vol 24 (1) · pp. 47-56
Author(s): Ove Oklevik, Grzegorz Kwiatkowski, Mona Kristin Nytun, Helene Maristuen

The quality of any economic impact assessment largely depends on the adequacy of its input variables and chosen assumptions. This article presents a direct economic impact assessment of a music festival hosted in Norway, together with sensitivity analyses of two study design assumptions: the estimated number of attendees and the chosen definition (size) of the affected area. Empirically, the article draws on a state-of-the-art framework for economic impact analysis and uses primary data from 471 event attendees. The results show, first, that an economic impact analysis is a complex task requiring high precision in assessing the different monetary flows entering and leaving the host region, and second, that the study design assumptions exert a strong influence on the final estimates. Accordingly, the study offers practical guidance for local destination marketing organizations and event managers on how to conduct reliable economic impact assessments, and explains which elements of such analyses are particularly important for the final estimates.
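To make the two tested assumptions concrete, the toy sketch below varies the attendee estimate and the share of attendees counted as non-local (drawing the affected area wider turns more attendees into locals, shrinking that share). All figures are invented, and the function is a bare-bones stand-in for a full impact framework.

```python
# Toy sensitivity sweep over the two study-design assumptions discussed
# above. All numbers are invented; a real framework tracks many more flows.

def direct_impact(attendees, share_non_local, avg_spend, leakage):
    """New money entering the host region: only non-local attendees inject
    fresh spending, and spending that leaks to outside suppliers is removed."""
    return attendees * share_non_local * avg_spend * (1.0 - leakage)

for attendees in (8_000, 10_000, 12_000):   # attendee-count assumption
    for share in (0.6, 0.4):                # narrow vs. wide affected area
        impact = direct_impact(attendees, share, avg_spend=1_200.0, leakage=0.25)
        print(f"{attendees:>6} attendees, non-local share {share:.0%}: "
              f"NOK {impact:,.0f}")
```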


2014 · Vol 28 (3) · pp. 421-454
Author(s): John W. Mortimer, Linda R. Henderson

SYNOPSIS: While retired government employees clearly depend on public sector defined benefit pension funds, these plans also contribute significantly to U.S. state and national economies. Growing public concern about the funding adequacy of these plans, hard hit by the Great Recession, raises questions about their future viability. After several years of study, the Governmental Accounting Standards Board (GASB) approved two new standards, GASB 67 and 68, with the goal of substantially improving the accounting for, and transparency of financial reporting of, state/municipal public employee defined benefit pension plans. GASB 68, the focus of this paper, requires state/municipal governments to calculate and report a net pension liability based on a single discount rate that combines the rate of return on funded plan assets with a low-risk index rate on the unfunded portion of the liability. This paper illustrates the calculation of estimates for GASB 68 reportable net pension liabilities, funded ratios, and single discount rates for 48 fiscal-year state employee defined benefit plans by using an innovative valuation model and readily available data. The results show statistically significant increases in reportable net pension liabilities and decreases in the estimated hypothetical GASB 68 funded ratios and single discount rates. Our sensitivity analyses examine the effect of changes in the low-risk rate and time period on these results. We find that the reported discount rates of weaker plans approach the low-risk rate, resulting in higher pension liabilities and creating policy incentives to increase risky assets in pension portfolios.
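The single-rate mechanics can be sketched in a few lines: each projected benefit payment is discounted at the expected return while projected assets still cover payments, and at the low-risk index rate after the crossover; the net pension liability is the resulting present value minus current assets. The cash flows and rates below are invented, and the paper's valuation model is considerably richer.

```python
# Simplified sketch of the GASB 68 crossover idea -- invented inputs,
# not the paper's valuation model.

def total_pension_liability(payments, assets, r_funded, r_lowrisk):
    """Discount each projected benefit payment at the expected return while
    projected assets remain positive, and at the low-risk rate afterwards."""
    pv = 0.0
    for t, pmt in enumerate(payments, start=1):
        rate = r_funded if assets > 0.0 else r_lowrisk  # crossover switch
        pv += pmt / (1.0 + rate) ** t
        assets = assets * (1.0 + r_funded) - pmt        # roll assets forward
    return pv

assets0 = 800.0                     # current plan fiduciary net position
tpl = total_pension_liability([100.0] * 30, assets0,
                              r_funded=0.075, r_lowrisk=0.035)
print(f"net pension liability = {tpl - assets0:,.1f}")  # NPL = TPL - assets
```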


Author(s): Po Ting Lin, Wei-Hao Lu, Shu-Ping Lin

In the past few years, researchers have begun to investigate the existence of arbitrary uncertainties in design optimization problems. Most traditional reliability-based design optimization (RBDO) methods transform the design space to the standard normal space for reliability analysis, but this may not work well when the random variables are arbitrarily distributed, because the transformation to the standard normal space cannot be determined when the distribution type is unknown. The methods of Ensemble of Gaussian-based Reliability Analyses (EoGRA) and Ensemble of Gradient-based Transformed Reliability Analyses (EGTRA) have been developed to estimate the joint probability density function using an ensemble of kernel functions. EoGRA performs a series of Gaussian-based kernel reliability analyses and merges them to compute the reliability of the design point. EGTRA transforms the design space to a single-variate space along the constraint gradient, where the kernel reliability analyses become much less costly. In this paper, a series of comprehensive investigations was performed to study the similarities and differences between EoGRA and EGTRA. The results showed that EGTRA performs accurate and efficient reliability analyses for both linear and nonlinear problems. When the constraints are highly nonlinear, EGTRA may encounter some difficulties but can still be effective if started from deterministic optimal points. On the other hand, the sensitivity analyses of EoGRA may be ineffective when the random distribution lies completely inside the feasible or infeasible space. However, EoGRA can find acceptable design points when starting from deterministic optimal points. Moreover, EoGRA is capable of delivering the estimated failure probability of each constraint during the optimization process, which may be convenient for some applications.
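To illustrate the kernel-ensemble idea in its simplest form: place one Gaussian kernel on each observed sample of the arbitrarily distributed variables, then estimate the failure probability by sampling the resulting mixture. The sketch below is generic kernel-density reliability estimation with invented data and an invented limit state, not the authors' EoGRA or EGTRA algorithms.

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrarily distributed samples (lognormal here) whose parametric form is
# treated as unknown -- the setting that motivates kernel-based methods.
samples = rng.lognormal(mean=0.0, sigma=0.4, size=(500, 2))

def kde_failure_probability(samples, limit_state, bandwidth=0.15, n=20_000):
    """Fit one Gaussian kernel per sample, draw from the mixture, and
    estimate P[g(x) <= 0]. Generic KDE reliability sketch, not EoGRA/EGTRA."""
    idx = rng.integers(len(samples), size=n)   # pick kernels uniformly
    draws = samples[idx] + bandwidth * rng.standard_normal((n, samples.shape[1]))
    return float(np.mean(limit_state(draws) <= 0.0))

g = lambda x: 3.0 - x[:, 0] - x[:, 1]          # hypothetical limit state
print(f"estimated failure probability: {kde_failure_probability(samples, g):.3f}")
```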


2019 · Vol 4 (6) · pp. e001817
Author(s): Apostolos Tsiachristas, David Gathara, Jalemba Aluvaala, Timothy Chege, Edwine Barasa, ...

Introduction: Neonatal mortality is an urgent policy priority to improve global population health and reduce health inequality. As health systems in Kenya and elsewhere seek to tackle increased neonatal mortality by improving the quality of care, one option is to train and employ neonatal healthcare assistants (NHCAs) to support professional nurses by taking up low-skill tasks.
Methods: Monte Carlo simulation was performed to estimate the potential impact of introducing NHCAs in neonatal nursing care in four public hospitals in Nairobi on effectively treated newborns and staff costs over a period of 10 years. The simulation was informed by data from 3 workshops with >10 stakeholders each, hospital records and scientific literature. Two univariate sensitivity analyses were performed to further address uncertainty.
Results: Stakeholders perceived that 49% of a nurse full-time equivalent could be safely delegated to NHCAs in standard care, 31% in intermediate care and 20% in intensive care. A skill-mix with nurses and NHCAs would require ~2.6 billion Kenyan Shillings (KES) (US$26 million) to provide quality care to 58% of all newborns in need (i.e., the current level of coverage in Nairobi) over a period of 10 years. The same skill-mix configuration would require ~6 billion KES (US$61 million) to provide quality care to almost all newborns in need over 10 years.
Conclusion: Changing the skill-mix in hospital care by introducing NHCAs may be an affordable way to reduce neonatal mortality in low/middle-income countries. This option should be considered in ongoing policy discussions and supported by further evidence.
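As a method illustration only (not the study's model or data), a toy Monte Carlo of the skill-mix question might look like the sketch below; salaries, staffing needs and the sampling range for the delegable share are invented, with the 49% mode loosely taken from the standard-care estimate above.

```python
import random

random.seed(1)

# Toy Monte Carlo in the spirit of the study. All parameters are invented;
# only the 49% mode echoes the abstract's standard-care delegation estimate.

N_RUNS, YEARS = 10_000, 10
NURSE_COST, NHCA_COST = 1.0, 0.4   # annual cost per FTE (nurse normalized to 1)
FTE_NEEDED = 100.0                 # nursing FTEs required for target coverage

totals = []
for _ in range(N_RUNS):
    delegable = random.triangular(0.30, 0.60, 0.49)  # share shiftable to NHCAs
    cost = FTE_NEEDED * ((1.0 - delegable) * NURSE_COST + delegable * NHCA_COST)
    totals.append(YEARS * cost)

totals.sort()
print(f"mean 10-year cost: {sum(totals) / N_RUNS:.1f} (nurse-FTE-years)")
print(f"95% interval: {totals[250]:.1f} - {totals[9_749]:.1f}")
```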

