Estimating Cost-Effectiveness of Confirmatory Oral Food Challenges in the Diagnosis of Children With Food Allergy

2019 ◽  
Vol 6 ◽  
pp. 2333794X1989129
Author(s):  
Abdullah Alsaggaf ◽  
James Murphy ◽  
Sydney Leibel

Introduction. Food allergies affect 8% of the pediatric population in the United States, with an estimated annual cost of US$25 billion. The low specificity of some of the main food allergy tests used in diagnosis may generate false positives, incurring unnecessary costs. We examined the cost-effectiveness of oral food challenges (OFC) as confirmatory tests in the diagnosis of food allergy. Methods. We constructed a decision tree with a Markov model comparing the long-term (15-year) cost and effectiveness, measured in quality-adjusted life years (QALY), of confirmatory OFCs versus immediate allergenic food elimination (FE) after a skin prick test or blood immunoglobulin E (IgE) level in children with suspected food allergy. For costs, we included the costs of OFCs and the reported annual costs of having a food allergy, including direct medical costs and costs borne by families. Results. Over the length of the model, the OFC strategy cost $8671 compared with $18 012 for the FE strategy, a saving of $9341, and yielded a total of 21.942 QALYs compared with 21.740 for the FE strategy, a gain of 0.202. Conclusion. Our study shows that the confirmatory OFC strategy dominated the FE strategy and that a confirmatory OFC for children, within a year of diagnosis, is a cost-effective strategy that decreases costs and appears to improve quality of life.
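The dominance claim in this abstract can be checked directly from the reported figures. The sketch below is an illustrative calculation, not part of the published model; it uses only the cost and QALY totals stated above.

```python
# Cost-effectiveness comparison using the figures reported in the abstract.
# A strategy "dominates" another when it is both cheaper and more effective.
def compare(cost_a, qaly_a, cost_b, qaly_b):
    """Return (cost savings, QALY gain, dominance flag) of strategy A vs. B."""
    d_cost = cost_b - cost_a   # savings from choosing A
    d_qaly = qaly_a - qaly_b   # extra effectiveness of A
    return d_cost, d_qaly, (d_cost > 0 and d_qaly > 0)

# OFC strategy vs. food-elimination (FE) strategy over the 15-year model.
savings, qaly_gain, dominates = compare(8671, 21.942, 18012, 21.740)
print(savings, round(qaly_gain, 3), dominates)  # 9341 0.202 True
```

Because the OFC strategy is simultaneously cheaper and more effective, no incremental cost-effectiveness ratio is needed; dominance alone decides the comparison.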

2021 ◽  
Vol 3 (1) ◽  
pp. 3-7 ◽  
Author(s):  
Justin Greiwe

A verified food allergy can be an impactful life event that leads to increased anxiety and measurable effects on quality of life. Allergists play a key role in framing this discussion and can help alleviate underlying fears by promoting confidence and clarifying safety concerns. Correctly diagnosing a patient with an immunoglobulin E (IgE)-mediated food allergy remains a nuanced process fraught with the potential for error and confusion. This is especially true in situations in which the clinical history is not classic and allergists rely too heavily on food allergy testing to provide a confirmatory diagnosis. A comprehensive medical history is critical in the diagnosis of food allergy and should be used to determine subsequent testing and interpretation of the results. Oral food challenge (OFC) is a critical procedure to identify patients with an IgE-mediated food allergy when the history and testing are not specific enough to confirm the diagnosis, and it can be a powerful teaching tool regardless of outcome. Although the safety and feasibility of performing OFC in a busy allergy office have always been a concern, in the hands of an experienced and trained provider, OFC is a safe and reliable procedure for patients of any age. With food allergy rates increasing, and with recent data suggesting that allergists across the United States are not providing this resource consistently to their patients, more emphasis needs to be placed on food challenge education and hands-on experience. The demand for OFCs will only continue to increase, especially with the growing popularity of oral immunotherapy programs; therefore, it is essential that allergists become familiar with the merits and limitations of current testing modalities and open their doors to using OFCs in the office.


2012 ◽  
Vol 30 (4_suppl) ◽  
pp. 226-226
Author(s):  
Roman Casciano ◽  
Maruit Chulikavit ◽  
Allison Perrin ◽  
Zhimei Liu ◽  
Xufang Wang ◽  
...  

226 Background: Everolimus and sunitinib were recently approved to treat patients with advanced, progressive pancreatic neuroendocrine tumors (pNETs). This analysis examines the projected cost-effectiveness of everolimus versus sunitinib in this setting from a US payer perspective. Methods: A lifetime Markov model was developed to simulate a cohort of advanced, progressive pNET patients and to estimate the cost per incremental life-years gained (LYG) and quality-adjusted life years (QALYs) gained when treating with everolimus as compared to sunitinib. Absent head-to-head trials, efficacy data were based on a weight-adjusted indirect comparison of the two agents using the respective phase 3 trial data. Disease or health states considered in the model included: stable disease without adverse events, stable disease with adverse events, disease progression, and death. Costs of anti-tumor therapies, symptomatic care drugs, and post-progression therapy were based on wholesale acquisition cost. Other costs including physician visits, tests, infusions, hospitalizations, adverse event costs, and end-of-life care costs were obtained from literature and/or standard sources such as the Healthcare Cost and Utilization Project and Medicare reimbursement rates. Utility inputs were based on a UK time trade-off study. Sensitivity analyses were conducted to test the model’s robustness. Results: In the base case analysis, the estimated gain of everolimus over sunitinib was 0.448 LYs (0.304 QALYs) at an incremental cost of $12,673, resulting in an incremental cost-effectiveness ratio (ICER) of $28,281/LYG ($41,702/QALY gained), which fell within the cost per QALY range for many other widely used oncology drugs. The analysis was sensitive to the uncertainty of the sunitinib trial results; however, a probabilistic sensitivity analysis showed the results were consistent across simulations. 
Conclusions: While the analysis is limited by its reliance on an indirect comparison of two phase 3 studies rather than a single head-to-head trial, everolimus is expected to be cost-effective relative to sunitinib in advanced pNET.
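The incremental cost-effectiveness ratio (ICER) reported above is simply the incremental cost divided by the incremental effect. The sketch below is an illustrative recomputation from the rounded increments stated in the abstract; it lands within a fraction of a percent of the published figures, with the small gaps reflecting rounding of the inputs.

```python
def icer(delta_cost, delta_effect):
    """Incremental cost-effectiveness ratio: extra dollars per unit of benefit."""
    return delta_cost / delta_effect

# Everolimus vs. sunitinib, using the increments reported in the abstract:
# $12,673 extra cost for 0.448 life-years (0.304 QALYs) gained.
cost_per_lyg = icer(12673, 0.448)    # ~ $28,288/LYG  (published: $28,281/LYG)
cost_per_qaly = icer(12673, 0.304)   # ~ $41,688/QALY (published: $41,702/QALY)
```

Both ratios sit well below commonly cited US willingness-to-pay thresholds (e.g. $100,000/QALY), which is the basis of the cost-effectiveness conclusion.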


2018 ◽  
Vol 10 (3) ◽  
pp. 152
Author(s):  
Tonny Tanus ◽  
Sunny Wangko

Abstract: Food allergies have been increasing in prevalence for years and affect all ages. Disease severity and complexity have also increased, especially in the pediatric population. Several types of reactions are discussed: immunoglobulin E (IgE)-mediated allergies and anaphylaxis, food-triggered atopic dermatitis, eosinophilic esophagitis, and non-IgE-mediated gastrointestinal food allergic disorders such as food protein-induced enterocolitis syndrome (FPIES). Though allergy testing, both skin and IgE testing, has been around for years, it is burdened by significant false positives and false negatives and is useful in only some food allergies. Beyond avoidance, there is no proven therapy for food allergy. A variety of immunotherapies have been studied via subcutaneous, epicutaneous, oral, and sublingual routes; at best they produce a temporary state of desensitization and carry many safety issues. Biologic agents that block critical cytokines/interleukins (IL) and molecules in food allergic reactions appear to be a promising option. Anti-IgE has been used in asthma and chronic urticaria; anti-IL-4 and anti-IL-13, which inhibit IgE production, are indicated for atopic dermatitis; and the anti-eosinophil agent anti-IL-5 has successfully reduced asthma exacerbations. Other biologics are being studied for a variety of allergic and immunologic conditions, but whether these expensive treatments will prove effective and practical in food allergy remains unknown.
Keywords: food allergy, allergic reaction, food allergy therapy


Author(s):  
Wenyin Loh ◽  
Mimi Tang

There is a lack of high-quality evidence based on the gold standard of oral food challenges to determine food allergy prevalence. Nevertheless, studies using surrogate measures of food allergy, such as health service utilization and clinical history, together with allergen-specific immunoglobulin E (sIgE), provide compelling data that the prevalence of food allergy is increasing in both Western and developing countries. In Western countries, challenge-diagnosed food allergy has been reported to be as high as 10%, with the greatest prevalence noted among younger children. There is also growing evidence of increasing prevalence in developing countries, with rates of challenge-diagnosed food allergy in China and Africa reported to be similar to those in Western countries. An interesting observation is that children of East Asian or African descent born in a Western environment are at higher risk of food allergy compared to Caucasian children; this intriguing finding emphasizes the importance of genome-environment interactions and forecasts future increases in food allergy in Asia and Africa as economic growth continues in these regions. While cow's milk and egg allergy are two of the most common food allergies in most countries, diverse patterns of food allergy can be observed in individual geographic regions, shaped by each country's feeding patterns. More robust studies investigating food allergy prevalence, particularly in Asia and the developing world, are necessary to understand the extent of the food allergy problem and identify preventive strategies to cope with the potential increase in these regions.


2009 ◽  
Vol 29 (6) ◽  
pp. 690-706 ◽  
Author(s):  
Christina M. L. Kelton ◽  
Margaret K. Pasquale

Cost-effectiveness analysis (CEA) has been widely used in evaluating treatments for osteoporosis. To study the claim of enhanced persistence, this research determined the effects of persistence (the proportion of individuals who remain on treatment) and efficacy on incremental cost-effectiveness ratios (ICERs) for bisphosphonate treatment relative to no bisphosphonate treatment in the United States. For 2 age groups, 55 to 59 and 75 to 79, the authors relied on published fracture rates and applied them to 1000 postmenopausal osteoporotic patients with bone mineral density (BMD) T score ≤−2.5 during 3 years of treatment. After developing an algebraic ICER, with effectiveness measured by either quality-adjusted life years (QALYs) gained or number of fractures averted, they determined the effects of persistence and efficacy and then calibrated the model to variable estimates from the literature. For the younger (older) cohort, the cost per fracture averted was $66,606 ($18,256), consistent with a validated Markov simulation model. The effect of a 1 percentage point change in vertebral efficacy was 24 (5) times the effect of a 1 percentage point change in persistence for the younger cohort when QALYs (fractures) were involved. Nonvertebral efficacy had approximately 27 (9) times the effect of persistence. For the older cohort, the ratios were 15 (4.5) and 33 (10) for vertebral and nonvertebral fractures, respectively. In evaluating the claim of enhanced persistence, formulary decision makers need to exercise caution to ensure that efficacy is not compromised. Two drugs would have to be virtually identical in efficacy for better persistence to improve cost-effectiveness.


2018 ◽  
Vol 9 ◽  
Author(s):  
Sayantani Sindher ◽  
Andrew J. Long ◽  
Natasha Purington ◽  
Madeleine Chollet ◽  
Sara Slatkin ◽  
...  

Background: Double-blind placebo-controlled food challenges (DBPCFCs) remain the gold standard for the diagnosis of food allergy; however, challenges require significant time and resources and place the patient at an increased risk for severe allergic adverse events. There have been continued efforts to identify alternative diagnostic methods to replace or minimize the need for oral food challenges (OFCs) in the diagnosis of food allergy. Methods: Data were extracted for all IRB-approved, Stanford-initiated clinical protocols involving standardized screening OFCs to a cumulative dose of 500 mg protein to any of 11 food allergens in participants with elevated skin prick test (SPT) and/or specific IgE (sIgE) values to the challenged food across 7 sites. Baseline population characteristics, biomarkers, and challenge outcomes were analyzed to develop diagnostic criteria predictive of positive OFCs across multiple allergens in our multi-allergic cohorts. Results: A total of 1247 OFCs completed by 427 participants were analyzed in this cohort. Eighty-five percent of all OFCs were positive. A history of atopic dermatitis and multiple food allergies were significantly associated with a higher risk of positive OFCs. The majority of food-specific SPT, sIgE, and sIgE/total IgE (tIgE) thresholds calculated from cumulative tolerated dose (CTD)-dependent receiver operating characteristic (ROC) curves discriminated OFC outcome well (areas under the curve > 0.75). Participants with values above the thresholds were more likely to have positive challenges. Conclusions: This is the first study, to our knowledge, not only to adjust for tolerated allergen dose in predicting OFC outcome but also to use this method to establish biomarker thresholds. The presented findings suggest that readily obtainable biomarker values and patient demographics may be of use in the prediction of OFC outcome and food allergy. In the subset of patients with SPT or sIgE values above the thresholds, values appear highly predictive of a positive OFC and true food allergy. While these thresholds are relatively high, they may serve as an appropriate substitute for food challenges in clinical and research settings.


Endoscopy ◽  
2019 ◽  
Vol 51 (11) ◽  
pp. 1051-1058 ◽  
Author(s):  
Hailey J. James ◽  
Theodore W. James ◽  
Stephanie B. Wheeler ◽  
Jennifer C. Spencer ◽  
Todd H. Baron

Abstract Background Roux-en-Y gastric bypass (RYGB) surgery is the second most common weight loss surgery in the United States. Treatment of pancreaticobiliary disease in this patient population is challenging due to the altered anatomy, which limits the use of standard instruments and techniques. Both nonoperative and operative modalities are available to overcome these limitations, including device-assisted (DAE) endoscopic retrograde cholangiopancreatography (ERCP), laparoscopic-assisted (LA) ERCP, and endoscopic ultrasound-directed transgastric ERCP (EDGE). The aim of this study was to compare the cost-effectiveness of ERCP-based modalities for treatment of pancreaticobiliary diseases in post-RYGB patients. Methods A decision tree model with a 1-year time horizon was used to analyze the cost-effectiveness of EDGE, DAE-ERCP, and LA-ERCP in post-RYGB patients. Monte Carlo simulation was used to assess a plausible range of incremental cost-effectiveness ratios, net monetary benefit calculations, and a cost-effectiveness acceptability curve. One-way sensitivity analyses and probabilistic sensitivity analyses were also performed to assess how changes in key parameters affected model conclusions. Results EDGE resulted in the lowest total costs and highest total quality-adjusted life-years (QALY) for a total of $5188/QALY, making it the dominant alternative compared with DAE-ERCP and LA-ERCP. In probabilistic analyses, EDGE was the most cost-effective modality compared with LA-ERCP and DAE-ERCP in 94.4 % and 97.1 % of simulations, respectively. Conclusion EDGE was the most cost-effective modality in post-RYGB anatomy for treatment of pancreaticobiliary diseases compared with DAE-ERCP and LA-ERCP. Sensitivity analysis demonstrated that this conclusion was robust to changes in important model parameters.


2021 ◽  
Author(s):  
Jarett Anderson ◽  
Torunn Sivesind

Eczema is a common and taxing condition, with an estimated prevalence of 10.7% among pediatric patients in the United States and a cost of 5 billion USD annually [1]. Eczema has a known association with food allergies, with both conditions most commonly developing during the first year of life. The cost of care and the daily attention required to treat both eczema and food allergy represent significant burdens to individuals and families. Without a global standard for neonatal or infant skin care, and with few emollient studies performed in term infants, this Cochrane review [2] provides a much-needed assessment of the evidence for the use of emollients and other interventions to prevent eczema, as well as their effects on the development of food allergy. The systematic review assessed 33 randomized controlled trials (n=25,827), all of which studied term (>37 weeks) infants (<12 months) without a pre-existing diagnosis of eczema, food allergy, or other skin condition. The review concludes that skin care interventions do not change the risk of developing eczema by the age of one to two years (risk ratio (RR) 1.03, 95% confidence interval (CI) 0.81-1.31; 7 trials, n=3075), nor do they delay its onset (hazard ratio 0.86, 95% CI 0.65-1.14; 9 trials, n=3349), but they were associated with a higher number of skin infections (RR 1.34, 95% CI 1.02-1.77; 6 trials, n=2728). These findings were all reported with moderate-certainty evidence. There was limited evidence concerning the impact of skin care interventions on IgE-mediated food allergies (RR 2.53, 95% CI 0.99-6.47; 1 trial, n=996) or sensitization to food allergens at 1-2 years (RR 0.86, 95% CI 0.28-2.69; 2 trials, n=1055); the few trials that investigated these outcomes produced broad confidence intervals that failed to achieve statistical significance. Further work is warranted to identify the effects of different skin care interventions on the prevention of eczema and their effects on food allergy.
A number of clinical trials are currently ongoing to assess skin care interventions for the prevention of atopic dermatitis and food allergy, and one recently concluded trial found no evidence that the use of daily emollients reduces the risk of eczema by the age of two years in high-risk patients (those with first-degree relatives with a history of eczema, asthma, or allergic rhinitis) [3]. The incidence of dry skin and eczema has increased, especially since the onset of the coronavirus pandemic, as frequent hand-washing and hand hygiene have become increasingly common practices among individuals and families [4]. In the years prior to the pandemic, an increase in the incidence of eczema in the pediatric population was reported, most prominently among infants [5]. With this in mind, it is important for clinicians to familiarize themselves with treatment regimens that are supported by data, preferably from large systematic reviews like those performed by the Cochrane Review Groups. By synthesizing information from numerous studies simultaneously, meta-analyses such as the one summarized here enable physicians to apply evidence to clinical practice and make sound recommendations to patients.


2019 ◽  
Author(s):  
Jifan Wang ◽  
Michelle A. Lee Bravatti ◽  
Elizabeth J. Johnson ◽  
Gowri Raman

Abstract Background Heart disease is the leading cause of death in the United States. The U.S. Food and Drug Administration approved the health claim that 1.5 oz. (42.5 g) daily nut intake may reduce the risk of cardiovascular disease (CVD). Previous studies have focused on the cost-effectiveness of other foods or dietary factors in primary CVD prevention, but not of almond consumption. This study aimed to examine the cost-effectiveness of almond consumption in CVD prevention. Methods A decision model was developed comparing 42.5 g of almonds per day versus no almond consumption and CVD in the U.S. population. Parameters in the model were derived from the literature and included the probabilities of increasing LDL-C, developing acute myocardial infarction (MI) and stroke, treating MI, and dying from the disease and surgery, as well as the costs of the disease and procedures in the U.S. population and the quality-adjusted life years (QALY). The cost of almonds was based on the current price in the U.S. market. Sensitivity analyses were conducted for the timing of almond consumption, ten-year risk prevention, patients with or without CVD, sex-specific probabilities for MI, and varying costs of procedures and almonds. Results The almond strategy cost $302 less and gained 0.02 QALYs compared with the non-almond strategy in the main analysis. The annual net monetary benefit (NMB) of almond consumption was $1,360 higher per person than no almond consumption when the willingness-to-pay threshold was set at $50,000 for annual health care expenditure. The almond strategy was more cost-effective than the non-almond strategy in CVD prevention in all the sensitivity analyses, except for the time of almond consumption as a morning or afternoon snack or when costs of almonds increased from $0.47 to $1.41 per day. Conclusion Consuming 42.5 g of almonds per day is a cost-effective approach to prevent CVD in the short term and potentially in the long term.
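Net monetary benefit converts a QALY gain into dollars at a chosen willingness-to-pay (WTP) threshold. The sketch below assumes the standard definition NMB = WTP x delta-QALY minus delta-cost; plugging in the rounded figures from the abstract gives a value near the reported $1,360 per person, with the gap reflecting rounding of the published inputs.

```python
def net_monetary_benefit(delta_qaly, delta_cost, wtp):
    """NMB = (QALY gain valued at the willingness-to-pay threshold) - extra cost."""
    return wtp * delta_qaly - delta_cost

# Almond vs. no-almond: 0.02 QALYs gained and $302 SAVED (so delta_cost = -302),
# at the $50,000/QALY threshold used in the abstract.
nmb = net_monetary_benefit(0.02, -302, 50_000)
print(round(nmb))  # 1302 from these rounded inputs; the abstract reports ~$1,360
```

A positive NMB at the chosen threshold is equivalent to the intervention being cost-effective at that threshold; here it is positive even before counting the cost savings, since the QALY gain alone is worth $1,000 at $50,000/QALY.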


2017 ◽  
Vol 4 (3) ◽  
Author(s):  
Nicole T Shen ◽  
Jared A Leff ◽  
Yecheskel Schneider ◽  
Carl V Crawford ◽  
Anna Maw ◽  
...  

Abstract Background Systematic reviews with meta-analyses and meta-regression suggest that timely probiotic use can prevent Clostridium difficile infection (CDI) in hospitalized adults receiving antibiotics, but the cost-effectiveness is unknown. We sought to evaluate the cost-effectiveness of probiotic use for the prevention of CDI versus no probiotic use in the United States. Methods We programmed a decision analytic model using published literature and national databases with a 1-year time horizon. The base case was modeled as a hypothetical cohort of hospitalized adults (mean age 68) receiving antibiotics with and without concurrent probiotic administration. Projected outcomes included quality-adjusted life-years (QALYs), costs (2013 US dollars), incremental cost-effectiveness ratios (ICERs; $/QALY), and cost per infection avoided. One-way, two-way, and probabilistic sensitivity analyses were conducted, and scenarios of different age cohorts were considered. ICERs less than $100,000 per QALY were considered cost-effective. Results Probiotic use dominated (was more effective and less costly than) no probiotic use. Results were sensitive to probiotic efficacy (relative risk <0.73), the baseline risk of CDI (>1.6%), the risk of probiotic-associated bacteremia/fungemia (<0.26%), probiotic cost (<$130), and age (>65). In probabilistic sensitivity analysis, at a willingness-to-pay threshold of $100,000/QALY, probiotics were the optimal strategy in 69.4% of simulations. Conclusions Our findings suggest that probiotic use may be a cost-effective strategy to prevent CDI in hospitalized adults receiving antibiotics aged 65 or older or when the baseline risk of CDI exceeds 1.6%.

