Multiplex tests to identify gastrointestinal bacteria, viruses and parasites in people with suspected infectious gastroenteritis: a systematic review and economic analysis

2017 ◽  
Vol 21 (23) ◽  
pp. 1-188 ◽  
Author(s):  
Karoline Freeman ◽  
Hema Mistry ◽  
Alexander Tsertsvadze ◽  
Pam Royle ◽  
Noel McCarthy ◽  
...  

Background Gastroenteritis is a common, transient disorder usually caused by infection and characterised by the acute onset of diarrhoea. Multiplex gastrointestinal pathogen panel (GPP) tests simultaneously identify common bacterial, viral and parasitic pathogens using molecular testing. By providing test results more rapidly than conventional testing methods, GPP tests might positively influence the treatment and management of patients presenting in hospital or in the community. Objective To systematically review the evidence for GPP tests [xTAG® (Luminex, Toronto, ON, Canada), FilmArray (BioFire Diagnostics, Salt Lake City, UT, USA) and Faecal Pathogens B (AusDiagnostics, Beaconsfield, NSW, Australia)] and to develop a de novo economic model to compare the cost-effectiveness of GPP tests with conventional testing in England and Wales. Data sources Multiple electronic databases including MEDLINE, EMBASE, Web of Science and the Cochrane Database were searched from inception to January 2016 (with supplementary searches of other online resources). Review methods Eligible studies included patients with acute diarrhoea, compared GPP tests with standard microbiology techniques, and reported patient, management, test accuracy or cost-effectiveness outcomes. Quality assessment of eligible studies used tailored Quality Assessment of Diagnostic Accuracy Studies-2, Consolidated Health Economic Evaluation Reporting Standards and Philips checklists. The meta-analysis estimated positive and negative agreement for each pathogen. A de novo decision tree model compared patients managed with GPP testing or comparable coverage with patients managed using conventional tests, within the Public Health England pathway. Economic models included hospital and community management of patients with suspected gastroenteritis. The model estimated costs (in 2014/15 prices) and quality-adjusted life-year losses from an NHS and Personal Social Services perspective. Results Twenty-three studies informed the review of clinical evidence (17 of xTAG, four of FilmArray, two of both xTAG and FilmArray, and none of Faecal Pathogens B). No study provided an adequate reference standard with which to compare the test accuracy of GPP with conventional tests. A meta-analysis (of 10 studies) found considerable heterogeneity; however, GPP testing produces a greater number of pathogen-positive findings than conventional testing. It is unclear whether or not these additional ‘positives’ are clinically important. The review identified no robust evidence to inform consequent clinical management of patients. There is considerable uncertainty about the cost-effectiveness of GPP panels used to test for suspected infectious gastroenteritis in hospital and community settings. Uncertainties in the model include length of stay, assumptions about false-positive findings and the costs of tests. Although there is potential for cost-effectiveness in both settings, key modelling assumptions need to be verified and model findings remain tentative. Limitations No test–treat trials were retrieved. The economic model reflects one pattern of care, which will vary across the NHS. Conclusions The systematic review and cost-effectiveness model identify uncertainties about the adoption of GPP tests within the NHS. GPP testing will generally correctly identify pathogens identified by conventional testing; however, these tests also generate considerable additional positive results of uncertain clinical importance. 
Future work An independent reference standard may not exist to evaluate alternative approaches to testing. A test–treat trial might ascertain whether or not additional GPP ‘positives’ are clinically important or result in overdiagnosis, whether or not earlier diagnosis leads to earlier discharge of patients, and what the health consequences of earlier intervention are. Future work might also consider the public health impact of different testing approaches, as test results form the basis for public health surveillance. Study registration This study is registered as PROSPERO CRD42016033320. Funding The National Institute for Health Research Health Technology Assessment programme.
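The report's economic comparison is a decision tree: each testing arm has an expected cost and an expected QALY loss, and arms are compared via an incremental cost-effectiveness ratio (ICER). A minimal sketch of that structure follows; the probabilities, costs and QALY losses are illustrative placeholders, not values from the report.

```python
# Minimal decision-tree sketch of a test-versus-test cost-effectiveness
# comparison. All numbers are illustrative placeholders, NOT the report's.

def arm(p_positive, cost_test, cost_pos_path, cost_neg_path,
        qaly_loss_pos, qaly_loss_neg):
    """Expected cost and QALY loss for one testing arm of the tree."""
    cost = cost_test + p_positive * cost_pos_path + (1 - p_positive) * cost_neg_path
    qaly_loss = p_positive * qaly_loss_pos + (1 - p_positive) * qaly_loss_neg
    return cost, qaly_loss

# Hypothetical arms: a rapid GPP-style panel versus conventional culture.
gpp_cost, gpp_loss = arm(p_positive=0.35, cost_test=150.0,
                         cost_pos_path=1000.0, cost_neg_path=700.0,
                         qaly_loss_pos=0.010, qaly_loss_neg=0.004)
conv_cost, conv_loss = arm(p_positive=0.20, cost_test=40.0,
                           cost_pos_path=1100.0, cost_neg_path=800.0,
                           qaly_loss_pos=0.012, qaly_loss_neg=0.005)

d_cost = gpp_cost - conv_cost
d_qaly = conv_loss - gpp_loss          # QALYs gained = QALY loss averted
icer = d_cost / d_qaly if d_qaly > 0 else float("inf")
print(f"Incremental cost £{d_cost:.2f}, QALYs gained {d_qaly:.4f}, "
      f"ICER £{icer:,.0f}/QALY")
```

In the report's actual model, each positive or negative branch expands into a management pathway (length of stay, treatment, isolation), which is where the key uncertainties noted above sit.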

2014 ◽  
Vol 18 (58) ◽  
pp. 1-406 ◽  
Author(s):  
Tristan Snowsill ◽  
Nicola Huxley ◽  
Martin Hoyle ◽  
Tracey Jones-Hughes ◽  
Helen Coelho ◽  
...  

Background Lynch syndrome (LS) is an inherited autosomal dominant disorder characterised by an increased risk of colorectal cancer (CRC) and other cancers, and caused by mutations in the deoxyribonucleic acid (DNA) mismatch repair genes. Objective To evaluate the accuracy and cost-effectiveness of strategies to identify LS in newly diagnosed early-onset CRC patients (aged < 50 years). Cascade testing of relatives is employed in all strategies for individuals in whom LS is identified. Data sources and methods Systematic reviews were conducted of the test accuracy of microsatellite instability (MSI) testing or immunohistochemistry (IHC) in individuals with CRC at risk of LS, and of economic evidence relating to diagnostic strategies for LS. Reviews were carried out in April 2012 (test accuracy) and in February 2012, repeated in February 2013 (economic evaluations). Databases searched included MEDLINE (1946 to April week 3, 2012), EMBASE (1980 to week 17, 2012) and Web of Science (inception to 30 April 2012), and risk of bias for test accuracy was assessed using the Quality Assessment of Diagnostic Accuracy Studies-2 (QUADAS-2) quality appraisal tool. A de novo economic model of diagnostic strategies for LS was developed. Results Inconsistencies in study designs precluded pooling of diagnostic test accuracy results from a previous systematic review and nine subsequent primary studies. These were of mixed quality, with significant methodological concerns identified for most. IHC and MSI can both play a part in diagnosing LS, but neither is a gold standard. No UK studies evaluated the cost-effectiveness of diagnosing and managing LS, although studies from other countries generally found some strategies to be cost-effective compared with no testing. The de novo model demonstrated that all strategies were cost-effective compared with no testing at a threshold of £20,000 per quality-adjusted life-year (QALY), with the most cost-effective strategy utilising MSI and BRAF testing [incremental cost-effectiveness ratio (ICER) = £5491 per QALY]. The maximum health benefit to the population of interest would be obtained using universal germline testing, but this would not be a cost-effective use of NHS resources compared with the next best strategy. When the age limit was raised from 50 to 60 and 70 years, the ICERs compared with no testing increased but remained below £20,000 per QALY (except for universal germline testing with an age limit of 70 years). The total net health benefit increased with the age limit as more individuals with LS were identified. Uncertainty was evaluated through univariate sensitivity analyses, which suggested that the parameters substantially affecting cost-effectiveness were: the risk of CRC for individuals with LS; the average number of relatives identified per index patient; the effectiveness of colonoscopy in preventing metachronous CRC; the cost of colonoscopy; the duration of the psychological impact of genetic testing on health-related quality of life (HRQoL); and the impact of prophylactic hysterectomy and bilateral salpingo-oophorectomy on HRQoL (this had the potential to make all testing strategies more expensive and less effective than no testing). Limitations The absence of high-quality data for the impact of prophylactic gynaecological surgery and the psychological impact of genetic testing on HRQoL is an acknowledged limitation. Conclusions Results suggest that reflex testing for LS in newly diagnosed CRC patients aged < 50 years is cost-effective. 
Such testing may also be cost-effective in newly diagnosed CRC patients aged < 60 or < 70 years. Results are subject to uncertainty due to a number of parameters, for some of which good estimates were not identified. We recommend future research to estimate the cost-effectiveness of testing for LS in individuals with newly diagnosed endometrial or ovarian cancer, and the inclusion of aspirin chemoprevention. Further research is required to accurately estimate the impact of interventions on HRQoL. Study registration This study is registered as PROSPERO CRD42012002436. Funding The National Institute for Health Research Health Technology Assessment programme.
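The strategy comparison described above follows a standard incremental analysis: order the strategies by cost, discard dominated options, and compute each ICER against the next-best non-dominated strategy. A minimal sketch is below; the strategy names and numbers are illustrative placeholders chosen only to reproduce the qualitative pattern (a cheap MSI/BRAF-style strategy with a low ICER, universal germline testing effective but not cost-effective), not the study's inputs.

```python
# Incremental cost-effectiveness analysis across diagnostic strategies.
# Strategy names and numbers are illustrative, not the study's inputs.

def incremental_analysis(strategies, threshold=20_000.0):
    """strategies: list of (name, expected cost, expected QALYs).
    Drops simply dominated options (costlier, no more effective) and
    returns ICERs versus the next cheaper frontier strategy.
    (Extended dominance is ignored for brevity.)"""
    s = sorted(strategies, key=lambda t: t[1])          # ascending cost
    frontier = [s[0]]
    for name, cost, qalys in s[1:]:
        if qalys > frontier[-1][2]:                     # adds health benefit
            frontier.append((name, cost, qalys))
    results = []
    for (_, c0, q0), (name, c1, q1) in zip(frontier, frontier[1:]):
        icer = (c1 - c0) / (q1 - q0)
        results.append((name, icer, icer <= threshold))
    return results

strategies = [                        # (name, cost per index patient, QALYs)
    ("no testing",            0.0, 20.000),
    ("MSI + BRAF",          500.0, 20.095),
    ("IHC then germline",   650.0, 20.110),
    ("universal germline", 2400.0, 20.130),
]
for name, icer, ok in incremental_analysis(strategies):
    print(f"{name}: ICER £{icer:,.0f}/QALY; cost-effective at £20k: {ok}")
```

With these toy inputs the MSI/BRAF-style strategy comes out around £5,300 per QALY while universal germline testing, though most effective, exceeds the threshold against the next-best strategy, mirroring the abstract's finding.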


Diabetes Care ◽  
2010 ◽  
Vol 33 (9) ◽  
pp. 2126-2128 ◽  
Author(s):  
N. Laiteerapong ◽  
E. S. Huang

2009 ◽  
Vol 20 (3) ◽  
pp. 73-77 ◽  
Author(s):  
Mark J Kearns ◽  
Sabrina S Plitt ◽  
Bonita E Lee ◽  
Joan L Robinson

BACKGROUND: There are limited recent data on rubella immunity in women of childbearing age in Canada. In the present paper, the proportions of rubella seroreactivity and redundant testing (testing of women previously seropositive when tested by the same physician) in the Alberta prenatal rubella screening program were studied. METHODS: In the present retrospective observational study, data on all specimens submitted for prenatal screening in Alberta between August 2002 and December 2005 were extracted from the Provincial Laboratory for Public Health database. The proportions of rubella screening and immunoglobulin G (IgG) seroreactivity were determined. Demographic variables were compared between rubella seroreactors and nonseroreactors. The proportion of redundant testing was determined. RESULTS: Of 159,046 prenatal specimens, 88.3% (n=140,473) were screened for rubella immunity. In total, 8.8% of specimens tested negative for rubella IgG. Younger women (23.2% of women younger than 20 years of age versus 4.7% of women between 35 and 39 years of age; P<0.001) and women from northern Alberta (11.9% versus 8.1% overall; P<0.001) were significantly more likely to have seronegative specimens. Of the 20,044 women who had multiple rubella immunity screenings, 88.1% (n=17,651) had multiple positive test results. In total, 20.7% of the 42,274 specimens submitted from women with multiple screenings were deemed redundant. DISCUSSION: Younger women were most likely to be seronegative for rubella. The public health significance of women entering their childbearing years with low or undetectable rubella IgG levels remains to be determined. A large number of women with documented rubella immunity were unnecessarily retested.
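The headline proportions follow directly from the counts reported in the abstract; a quick arithmetic check (the 20.7% redundant share is taken from the abstract, the other proportions recomputed from the published counts):

```python
# Recomputing the reported screening proportions from the published counts.
total_specimens = 159_046
screened        = 140_473        # specimens screened for rubella immunity
print(f"screened: {screened / total_specimens:.1%}")          # -> 88.3%

multi_women = 20_044             # women with multiple immunity screenings
multi_pos   = 17_651             # of those, repeatedly seropositive
print(f"repeat-positive: {multi_pos / multi_women:.1%}")      # -> 88.1%

multi_specimens = 42_274         # specimens from multiply screened women
redundant_share = 0.207          # reported proportion deemed redundant
print(f"redundant specimens: ~{round(multi_specimens * redundant_share):,}")
```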


2019 ◽  
Author(s):  
Joseph B. Babigumira ◽  
Solomon J. Lubinga ◽  
Mindy M. Cheng ◽  
James K. Karichu ◽  
Louis P. Garrison

Abstract Background HIV viral load (VL) monitoring informs antiretroviral therapy failure and helps to guide regimen changes. Typically, VL monitoring is performed using dried blood spot (DBS) samples transported and tested in a centralized laboratory. Novel sample collection technologies based on dried plasma stored on a plasma separation card (PSC) have become available. The cost-effectiveness of these different testing approaches to monitor VL is uncertain, especially in resource-limited settings. The objective of this study was to evaluate the potential cost-effectiveness of HIV VL testing approaches with PSC samples compared to DBS samples in Malawi. Methods We developed a decision-tree model to evaluate the cost-effectiveness of two different sample collection and testing methods—DBS and PSC samples transported and tested at central laboratories. The analysis used data from the published literature and was performed from the Malawi Ministry of Health perspective. We estimated costs of sample collection, transportation, and testing. The primary clinical outcome was test accuracy (the proportion of patients correctly classified with or without treatment failure). Sensitivity analysis was performed to assess the robustness of results. Results The estimated test accuracy for a DBS testing approach was 87.5%, compared to 97.4% for an approach with PSC. The estimated total cost per patient of a DBS testing approach was $19.39, compared to $17.73 for a PSC approach. Based on this, a PSC-based testing approach “dominates” a DBS-based testing approach (i.e., lower cost and higher accuracy). Conclusion The base-case analysis shows that a testing approach using PSC samples is less costly and more accurate (correctly classifies more patients with or without treatment failure) than a DBS approach. Our study suggests that a PSC testing approach is likely an optimal strategy for routine HIV VL monitoring in Malawi. However, given the limited data regarding sample viability, additional real-world data are needed to validate the results.
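The dominance conclusion rests on two quantities per arm: overall accuracy, which decomposes by the prevalence of treatment failure, and total cost per patient. A sketch is below; the sensitivities, specificities and failure prevalence are assumptions chosen only so the outputs land near the reported accuracies, while the per-patient costs are the figures quoted in the abstract.

```python
# Accuracy-and-cost comparison behind the dominance claim. Sensitivities,
# specificities and failure prevalence are illustrative assumptions; the
# per-patient costs are the figures quoted in the abstract.

def overall_accuracy(sensitivity, specificity, prevalence):
    """Proportion of patients correctly classified with/without failure."""
    return sensitivity * prevalence + specificity * (1 - prevalence)

p_fail = 0.15                         # assumed prevalence of treatment failure
dbs = {"accuracy": overall_accuracy(0.80, 0.889, p_fail), "cost": 19.39}
psc = {"accuracy": overall_accuracy(0.95, 0.978, p_fail), "cost": 17.73}

print(f"DBS: {dbs['accuracy']:.1%} accuracy at ${dbs['cost']:.2f}")
print(f"PSC: {psc['accuracy']:.1%} accuracy at ${psc['cost']:.2f}")
if psc["cost"] <= dbs["cost"] and psc["accuracy"] >= dbs["accuracy"]:
    print("PSC dominates DBS (cheaper and more accurate); no ICER needed")
else:
    print("No dominance; compute an ICER instead")
```

Because one option is both cheaper and more accurate, no ICER is needed; sensitivity analysis then probes how far the inputs can move before the dominance breaks.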


2018 ◽  
Vol 28 (5) ◽  
pp. 725-729
Author(s):  
Deepa Prasad ◽  
Joni Steinberg ◽  
Christopher Snyder

Abstract Introduction Newborn atrial flutter can be treated by medications, pacing, or direct current cardioversion. The purpose of this study is to compare the cost-effectiveness of digoxin, pacing, and direct current cardioversion for the treatment of atrial flutter in neonates. Materials and methods A decision tree model was developed comparing the efficacy and cost of digoxin, pacing, and direct current cardioversion based on a meta-analysis of published studies of success rates of cardioversion of neonatal atrial flutter (age < 2 months). Patients who failed the initial attempt at cardioversion progressed to the next methodology until successful. Data were analysed to assess the cost-effectiveness of these methods, with cost estimates obtained from 2015 Medicare reimbursement rates. Results The cost analysis for cardioversion of atrial flutter found the most efficient method to be direct current cardioversion at a cost of $10,304; pacing was next at $11,086; and the least cost-effective was digoxin at $14,374. The majority of additional cost, regardless of method, was from additional neonatal ICU days owing either to digoxin loading or to failure to convert. Direct current cardioversion remains the most cost-effective strategy by sensitivity analyses performed on the pacing conversion rate and the cost of the neonatal ICU day. Direct current cardioversion remains cost-effective until the assumed conversion rate is below 64.6%. Conclusion The most cost-efficient method of cardioverting a neonate with atrial flutter is direct current cardioversion. It has the highest success rate based on the meta-analysis and a shorter length of stay in the neonatal ICU owing to its success, and it results in cost savings ranging from $800 to $4000 when compared with alternative approaches.
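The model's key mechanic is sequential fallback: every patient starts on one method, and failures progress to the next until cardioversion succeeds, accruing extra neonatal ICU cost along the way. A minimal sketch of that expected-cost calculation is below; the success rates and costs are illustrative placeholders, not the study's meta-analysis estimates.

```python
# Expected cost of a sequential cardioversion strategy: everyone reaching a
# step pays its attempt cost; failures accrue extra NICU cost and progress
# to the next method. Success rates and costs are illustrative placeholders.

def expected_cost(sequence):
    """sequence: list of (name, p_success, attempt_cost, fail_cost)."""
    total, p_reach = 0.0, 1.0
    for name, p_success, attempt_cost, fail_cost in sequence:
        total += p_reach * attempt_cost                 # all who reach this step
        total += p_reach * (1 - p_success) * fail_cost  # extra NICU days on failure
        p_reach *= 1 - p_success                        # only failures progress
    # (a small residual p_reach remains after the last method; a full model
    #  would treat persistent failures explicitly)
    return total

dcc_first     = [("DCC", 0.87, 2500, 3500), ("pacing", 0.70, 1800, 3500),
                 ("digoxin", 0.55, 1200, 3500)]
pacing_first  = [("pacing", 0.70, 1800, 3500), ("DCC", 0.87, 2500, 3500),
                 ("digoxin", 0.55, 1200, 3500)]
digoxin_first = [("digoxin", 0.55, 1200, 3500), ("DCC", 0.87, 2500, 3500),
                 ("pacing", 0.70, 1800, 3500)]

for label, seq in [("DCC first", dcc_first), ("pacing first", pacing_first),
                   ("digoxin first", digoxin_first)]:
    print(f"{label}: expected cost ${expected_cost(seq):,.0f}")
```

Even with these toy numbers, leading with the highest-success method costs least, because fewer patients fall through to further attempts and extra ICU days, which is the ordering the study reports.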


2019 ◽  
Vol 40 (7) ◽  
pp. 721-731 ◽  
Author(s):  
Stefan V Danilla ◽  
Rocio P Jara ◽  
Felipe Miranda ◽  
Francisco Bencina ◽  
Marcela Aguirre ◽  
...  

Abstract Background Breast implant-associated anaplastic large cell lymphoma (BIA-ALCL) is an emergent disease that threatens patients with texturized breast implants. Major concerns about the safety of these implants are leading to global changes to restrict the utilization of this product. The principal alternative is to perform breast augmentation utilizing smooth implants, given their lack of association with BIA-ALCL. The implications and costs of this intervention are unknown. Objectives The authors of this study determined the cost-effectiveness of smooth implants compared with texturized implants for breast augmentation surgery. Methods A decision tree model was utilized to analyze the cost-effectiveness. Model input parameters were derived from published sources. The capsular contracture (CC) rate was calculated from a meta-analysis. Effectiveness measures were life-years, avoided BIA-ALCL cases, avoided deaths, and avoided reoperations. A sensitivity analysis was performed to test the robustness of the model. Results To avoid one case of BIA-ALCL, the incremental cost was $18,562,003 for smooth implants over texturized implants. The incremental cost-effectiveness ratios for life-years, avoided deaths, and avoided reoperations were negative. The sensitivity analysis revealed that, to avoid one case of BIA-ALCL, the utilization of smooth implants would be cost-effective for a risk of developing BIA-ALCL equal to or greater than 1:196 and a probability of CC with smooth implants equal to or less than 0.096. Conclusions The utilization of smooth implants to prevent BIA-ALCL is not cost-effective. Banning texturized implants to prevent BIA-ALCL may involve additional consequences, which should be considered in light of the higher CC rates and more frequent reoperations associated with smooth implants than with texturized implants.
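The headline figure is an incremental cost per BIA-ALCL case avoided: the extra cohort cost of smooth implants (driven by higher capsular contracture and reoperation rates) divided by the expected cases avoided, a denominator that is tiny because the disease is rare. A sketch with illustrative placeholder inputs (not the study's) shows why the ratio is so large:

```python
# Incremental cost per BIA-ALCL case avoided when switching a cohort from
# texturized to smooth implants. All inputs are illustrative placeholders.

def cost_per_case_avoided(n_patients, risk_textured,
                          extra_cc_cost_per_patient, extra_reop_cost_per_patient):
    """Extra cohort cost of smooth implants divided by ALCL cases avoided.
    Smooth implants are assumed to carry ~zero BIA-ALCL risk but add cost
    through higher capsular contracture and reoperation rates."""
    cases_avoided = n_patients * risk_textured
    extra_cost = n_patients * (extra_cc_cost_per_patient + extra_reop_cost_per_patient)
    return extra_cost / cases_avoided

# A rare disease makes the denominator tiny, so the ratio explodes:
print(f"${cost_per_case_avoided(100_000, 1 / 30_000, 150.0, 400.0):,.0f} per case avoided")
# The abstract's threshold logic runs this in reverse: raise risk_textured
# until the ratio drops below a willingness-to-pay (reported as ~1:196).
```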


2019 ◽  
Vol 3 (Supplement_1) ◽  
Author(s):  
Reina Engle-Stone ◽  
Stephen Vosti ◽  
Laura Meinzen-Dick ◽  
Sika Kumordzie

Abstract Objectives We aimed to estimate the potential effects, costs, and cost-effectiveness of a programmatic transition from distribution of iron-folic acid (IFA) tablets to distribution of multiple micronutrient supplements (MMS) to pregnant women, using Bangladesh and Burkina Faso as case studies. Methods For each country, we developed an 11-year predictive model using baseline demographic information from the Lives Saved Tool and effect sizes from a recent meta-analysis of trials of MMS compared to IFA supplementation during pregnancy. We predicted the number of stillbirths, infant deaths, adverse birth outcomes (low birth weight, small-for-gestational age, and preterm birth), and DALYs averted by replacing IFA with MMS at current levels of IFA coverage (∼50% nationally in Bangladesh; ∼10% in Burkina Faso). We estimated initial program transition costs and the annual marginal cost of MMS compared to IFA supplements, and calculated cost-effectiveness measures for scenarios with varied numbers of tablets received and consumed by pregnant women. Results In Bangladesh, immediate replacement of IFA with MMS at current coverage (assuming all covered pregnancies receive 180 tablets) was predicted to avert >73,800 deaths and >178,500 cases of preterm birth over 11 years at a cost of $5.0 to $14.2 per DALY averted; costs would increase by ∼9% with the addition of programmatic transition costs. In Burkina Faso, the same scenario would avert >5700 deaths and >6600 cases of preterm birth over 11 years at a cost of $3.6 to $15.5 per DALY averted. Assuming that benefits of supplementation accrue only above a given threshold (e.g., 180 tablets per pregnancy), accounting for supplement consumption above or below this threshold (e.g., consumption of 30 tablets or 270 tablets) could substantially reduce the cost-effectiveness of the IFA-MMS switch in comparison with a scenario in which all covered pregnancies consume exactly 180 tablets, although the cost per DALY averted remained below $105 in all scenarios. Conclusions This modeling analysis suggests that the cost per DALY averted by transitioning from IFA to MMS is low relative to other prenatal interventions designed to save lives. Improvements in program delivery and supplement adherence would improve the cost-effectiveness of replacing IFA with MMS. Funding Sources Sight and Life; Sackler Institute for Nutrition Science.
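The cost-per-DALY figures reduce to a simple ratio: the marginal cost of switching covered pregnancies from IFA to MMS (plus any transition cost) over the DALYs the switch averts. A sketch with illustrative placeholder inputs (not the model's) is below:

```python
# Cost per DALY averted for an IFA-to-MMS switch: marginal supplement cost
# (plus transition cost) over DALYs averted. All inputs are illustrative.

def cost_per_daly_averted(births_per_year, coverage, tablets_per_pregnancy,
                          marginal_cost_per_tablet, dalys_averted_per_1000_covered,
                          years=11, transition_cost=0.0):
    covered = births_per_year * coverage * years
    extra_cost = (covered * tablets_per_pregnancy * marginal_cost_per_tablet
                  + transition_cost)
    dalys = covered / 1000 * dalys_averted_per_1000_covered
    return extra_cost / dalys

icer = cost_per_daly_averted(
    births_per_year=3_000_000, coverage=0.50,     # Bangladesh-like scale
    tablets_per_pregnancy=180,
    marginal_cost_per_tablet=0.005,               # assumed MMS premium over IFA
    dalys_averted_per_1000_covered=12.0,
    transition_cost=2_000_000,
)
print(f"~${icer:.1f} per DALY averted")
```

The adherence scenarios in the abstract vary tablets_per_pregnancy (and the DALYs credited) around the 180-tablet threshold, which is why consumption of 30 or 270 tablets shifts the ratio so much.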


2018 ◽  
Vol 36 (6) ◽  
pp. 554-562 ◽  
Author(s):  
Young Chandler ◽  
Clyde B. Schechter ◽  
Jinani Jayasekera ◽  
Aimee Near ◽  
Suzanne C. O’Neill ◽  
...  

Purpose Gene expression profile (GEP) testing can support chemotherapy decision making for patients with early-stage, estrogen receptor–positive, human epidermal growth factor receptor 2–negative breast cancers. This study evaluated the cost effectiveness of one GEP test, Oncotype DX (Genomic Health, Redwood City, CA), in community practice with test-eligible patients age 40 to 79 years. Methods A simulation model compared 25-year societal incremental costs and quality-adjusted life-years (QALYs) of community Oncotype DX use from 2005 to 2012 versus usual care in the pretesting era (2000 to 2004). Inputs included Oncotype DX and chemotherapy data from an integrated health care system and national and published data on Oncotype DX accuracy, chemotherapy effectiveness, utilities, survival and recurrence, and Medicare and patient costs. Sensitivity analyses varied individual parameters; results were also estimated for ideal conditions (ie, 100% testing and adherence to test-suggested treatment, perfect test accuracy, considering test effects on reassurance or worry, and lowest costs). Results Twenty-four percent of test-eligible patients had Oncotype DX testing. Testing was higher in younger patients and patients with stage I disease (v stage IIA), and 75.3% and 10.2% of patients with high and low recurrence risk scores received chemotherapy, respectively. The cost-effectiveness ratio for testing (v usual care) was $188,125 per QALY. Considering test effects on worry versus reassurance decreased the cost-effectiveness ratio to $58,431 per QALY. With perfect test accuracy, the cost-effectiveness ratio was $28,947 per QALY, and under ideal conditions, it was $39,496 per QALY. Conclusion GEP testing is likely to have a high cost-effectiveness ratio on the basis of community practice patterns. However, realistic variations in assumptions about key variables could result in GEP testing having cost-effectiveness ratios in the range of other accepted interventions. The differences in cost-effectiveness ratios on the basis of community versus ideal conditions underscore the importance of considering real-world implementation when assessing the new technology.
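The community-versus-ideal contrast comes from applying the same incremental ratio under observed uptake and adherence and again under perfect conditions. A minimal sketch is below; the uptake figure echoes the 24% reported above, but the test cost, adherence, avoided-chemotherapy cost and QALY gains are illustrative placeholders, not the study's estimates.

```python
# Same ICER formula evaluated under community practice (observed uptake and
# adherence) and under ideal conditions. Inputs other than uptake are
# illustrative placeholders, not the study's estimates.

def gep_icer(uptake, adherence, test_cost,
             chemo_cost_avoided_per_adherent, qalys_gained_per_adherent):
    """Incremental cost per QALY of offering GEP testing versus usual care,
    expressed per test-eligible patient."""
    tested = uptake
    effective = tested * adherence     # tested AND treatment followed the score
    d_cost = tested * test_cost - effective * chemo_cost_avoided_per_adherent
    d_qaly = effective * qalys_gained_per_adherent
    return d_cost / d_qaly

community = gep_icer(uptake=0.24, adherence=0.70, test_cost=3400,
                     chemo_cost_avoided_per_adherent=2000,
                     qalys_gained_per_adherent=0.012)
ideal = gep_icer(uptake=1.00, adherence=1.00, test_cost=3400,
                 chemo_cost_avoided_per_adherent=2000,
                 qalys_gained_per_adherent=0.030)
print(f"community: ${community:,.0f}/QALY   ideal: ${ideal:,.0f}/QALY")
```

Even in this toy version, partial uptake and adherence shrink the QALY denominator faster than the cost numerator, which is why the community ratio sits several-fold above the ideal one.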

