Premedication Prior to Peg-Asparaginase Is Cost-Effective for Pediatric Leukemia Patients

Blood ◽  
2020 ◽  
Vol 136 (Supplement 1) ◽  
pp. 7-7
Author(s):  
Meghan McCormick ◽  
Jillian Lapinski ◽  
Erika Friehling ◽  
Ken Smith

Background: Asparaginase is a critical therapy component for childhood acute lymphoblastic leukemia (ALL). Hypersensitivity reactions and silent inactivation by neutralizing antibodies can lead to withholding further doses. Reactions occur in 10-20% of children receiving the commonly used, widely available pegylated asparaginase. The less immunogenic Erwinia asparaginase may allow continued administration, but requires more frequent dosing and is subject to limitations in availability. Inability to receive all recommended asparaginase doses decreases disease-free survival. Premedication with antihistamines, antipyretics and steroids decreases the frequency of hypersensitivity reactions, reducing the need for alternative agents. The cost-effectiveness of premedication strategies in childhood ALL is unclear. Methods: We used a Markov model to estimate strategy costs and quality-adjusted life years (QALYs) for two patient scenarios: a 3-year-old with standard-risk ALL receiving 2 asparaginase doses, and a 15-year-old with high-risk ALL receiving 7 asparaginase doses, over a 5-year time horizon. Patients entering the model received premedication with serum asparaginase level monitoring, monitoring only, or no premedication/monitoring. Literature data were used for hypersensitivity reaction and silent inactivation risks following each asparaginase dose. Silent inactivation was not identified in the non-monitoring strategy. Disease outcomes, therapy and associated additional care costs, and health state quality-of-life utilities were obtained from the literature and US databases. Evaluation took the societal perspective, with costs and effectiveness discounted at 3%/yr. Multiple sensitivity analyses were performed. Results: In both the standard-risk and high-risk analyses, premedication was the least costly strategy. In the standard-risk model, premedication with monitoring cost $4,586 less than monitoring alone, resulted in 8% fewer changes to Erwinia, and gained 0.01 additional QALYs. It cost $1,993 less than no premedication/monitoring, resulted in 3% fewer changes and 0.08 additional QALYs. In the high-risk model, premedication cost $29,757 less than monitoring alone, resulted in 7% fewer medication changes and 0.01 fewer QALYs; thus, monitoring alone was expensive, costing >$2 million/QALY gained compared to premedication and monitoring. Premedication cost $11,255 less than no premedication/monitoring, resulted in 2% fewer changes and 0.07 additional QALYs. Individual variation of all model inputs did not change the favorability of premedication and monitoring for either model. In probabilistic sensitivity analyses varying all parameters simultaneously over distributions 1000 times, premedication and monitoring was favored in >86% of model iterations in both standard- and high-risk scenarios. Conclusion: Compared to other strategies, premedication use and asparaginase level monitoring in children with ALL is economically reasonable and potentially cost-saving. Disclosures No relevant conflicts of interest to declare.
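The abstract does not reproduce the model itself; as a rough illustration of the mechanics it describes, the sketch below accumulates discounted costs and QALYs for two strategies at the 3%/yr rate quoted. The yearly cost and utility streams are hypothetical placeholders, not the study's inputs.

```python
# Minimal sketch of a discounted cost/QALY comparison between two strategies,
# mirroring the 3%/yr discounting described in the abstract.
# All numeric inputs below are hypothetical placeholders, not the study's data.

def discounted_total(yearly_values, rate=0.03):
    """Sum a stream of yearly values, discounting each year at `rate`."""
    return sum(v / (1 + rate) ** year for year, v in enumerate(yearly_values))

# Hypothetical 5-year cost and utility streams per strategy
strategies = {
    "premedication_plus_monitoring": {
        "costs":     [12000, 3000, 1500, 1500, 1500],   # USD per year (illustrative)
        "utilities": [0.80, 0.85, 0.90, 0.92, 0.92],    # QALY weights (illustrative)
    },
    "monitoring_only": {
        "costs":     [15000, 3500, 1500, 1500, 1500],
        "utilities": [0.78, 0.84, 0.90, 0.92, 0.92],
    },
}

for name, s in strategies.items():
    cost = discounted_total(s["costs"])
    qalys = discounted_total(s["utilities"])
    print(f"{name}: cost = ${cost:,.0f}, QALYs = {qalys:.2f}")
```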

2002 ◽  
Vol 16 (12) ◽  
pp. 877-879 ◽  
Author(s):  
John K Marshall

The Canadian Coordinating Office for Health Technology Assessment (CCOHTA) published an economic analysis, using a Markov model, of infliximab therapy for Crohn’s disease that is refractory to other treatments. This was the first fully published economic analysis to address this treatment option. Health state transitions were based on data from Olmsted County, Minnesota, health state resource profiles were created using expert opinion, and a number of assumptions were made when designing the model. The analysis was rigorous, the best available efficacy and safety data were used, state-of-the-art sensitivity analyses were undertaken and an ‘acceptability curve’ was constructed. The model found that infliximab was effective in increasing quality-adjusted life years when offered in a variety of protocols, but it was associated with high incremental cost-utility ratios compared with usual care. The results should be interpreted, however, in view of a number of limitations. The time horizon for the analysis was short (one year) because of a lack of longer-term efficacy data, and might have led to an underestimation of the benefits from averting surgery. Because the analysis was performed from the perspective of a Canadian provincial ministry of health, only direct medical costs were considered. Patients with active Crohn’s disease are likely to incur significant indirect costs, which could be mitigated by this medication. The analysis should be updated as new data become available. Moreover, small changes in the cost of the medication could make the treatment cost-effective, according to this model. Economic analyses, such as the one undertaken by the CCOHTA, cannot by themselves solve dilemmas in the allocation of limited health care resources, and other considerations must be included when formulating policy. This is especially important for patients with severe Crohn’s disease, who have significant disability and for whom few therapeutic options exist.
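For readers unfamiliar with acceptability curves, a minimal sketch of how one is derived from probabilistic sensitivity analysis output follows. The incremental cost and QALY draws are simulated placeholders, not the CCOHTA model's results.

```python
# Sketch of the numbers behind a cost-effectiveness acceptability curve (CEAC):
# for each willingness-to-pay threshold, the share of probabilistic sensitivity
# analysis draws in which the intervention has positive incremental net monetary
# benefit. The incremental cost/QALY draws below are random placeholders.
import numpy as np

rng = np.random.default_rng(0)
n_draws = 5000
inc_cost = rng.normal(loc=30000, scale=8000, size=n_draws)  # CAD, illustrative
inc_qaly = rng.normal(loc=0.05, scale=0.03, size=n_draws)   # QALYs, illustrative

for wtp in (50_000, 100_000, 250_000, 500_000, 1_000_000):
    prob_ce = np.mean(wtp * inc_qaly - inc_cost > 0)
    print(f"WTP {wtp:>9,} CAD/QALY: P(cost-effective) = {prob_ce:.2f}")
# Plotting these probabilities against the threshold gives the acceptability curve.
```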


Open Heart ◽  
2019 ◽  
Vol 6 (1) ◽  
pp. e001037 ◽  
Author(s):  
Claudia I Rinciog ◽  
Laura M Sawyer ◽  
Alexander Diamantopoulos ◽  
Mitchell S V Elkind ◽  
Matthew Reynolds ◽  
...  

Objective: To evaluate the cost-effectiveness of insertable cardiac monitors (ICMs) compared with standard of care (SoC) for detecting atrial fibrillation (AF) in patients at high risk of stroke (CHADS2 >2), from a UK National Health Service (NHS) perspective. Methods: Using patient characteristics and clinical data from the REVEAL AF trial, a Markov model assessed the cost-effectiveness of detecting AF with an ICM compared with SoC. Costs and benefits were extrapolated across the modelled patient lifetime. Ischaemic and haemorrhagic strokes, intracranial and extracranial haemorrhages and minor bleeds were modelled. Diagnostic and device costs were included, plus the costs of treating stroke and bleeding events and the costs of oral anticoagulants (OACs). Costs and health outcomes, measured as quality-adjusted life years (QALYs), were discounted at 3.5% per annum. One-way deterministic and probabilistic sensitivity analyses (PSA) were undertaken. Results: The total per-patient cost for ICM was £13 360 versus £11 936 for SoC (namely, annual 24-hour Holter monitoring). ICMs generated a total of 6.50 QALYs versus 6.30 for SoC. The incremental cost-effectiveness ratio (ICER) was £7140/QALY gained, below the £20 000/QALY acceptability threshold. ICMs were cost-effective in 77.4% of PSA simulations. The number of ICMs needed to prevent one stroke was 21, and the number needed to cause a major bleed was 37. ICERs were sensitive to the assumed proportions of patients initiating or discontinuing OAC after AF diagnosis, the type of OAC used and the assumed intensity of traditional monitoring under SoC. Conclusions: The use of ICMs to identify AF in a high-risk population is cost-effective for the UK NHS.
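The headline ICER can be reproduced approximately from the rounded per-patient figures in the abstract; the short calculation below also shows the equivalent net-monetary-benefit check at the £20,000/QALY threshold (the small difference from the published £7140/QALY reflects rounding of the inputs).

```python
# Reproducing the ICER arithmetic from the rounded per-patient figures in the
# abstract (differences from the published £7140/QALY reflect rounding).
cost_icm, cost_soc = 13360.0, 11936.0     # GBP per patient
qaly_icm, qaly_soc = 6.50, 6.30           # QALYs per patient

icer = (cost_icm - cost_soc) / (qaly_icm - qaly_soc)
print(f"ICER ≈ £{icer:,.0f} per QALY gained")   # ≈ £7,120/QALY

# Equivalent decision rule at the £20,000/QALY threshold via net monetary benefit
wtp = 20000.0
inc_nmb = wtp * (qaly_icm - qaly_soc) - (cost_icm - cost_soc)
print(f"Incremental NMB at £20,000/QALY: £{inc_nmb:,.0f}")  # positive → cost-effective
```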


2017 ◽  
Vol 35 (1) ◽  
pp. 63-71 ◽  
Author(s):  
Caroline G. Watts ◽  
Anne E. Cust ◽  
Scott W. Menzies ◽  
Graham J. Mann ◽  
Rachael L. Morton

Purpose Clinical guidelines recommend that people at high risk of melanoma receive regular surveillance to improve survival through early detection. A specialized High Risk Clinic in Sydney, Australia was found to be effective for this purpose; however, wider implementation of this clinical service requires evidence of cost-effectiveness and data addressing potential overtreatment of suspicious skin lesions. Patients and Methods A decision-analytic model was built to compare the costs and benefits of specialized surveillance compared with standard care over a 10-year period, from a health system perspective. A high-risk standard care cohort was obtained using linked population data, comprising the Sax Institute’s 45 and Up cohort study, linked to Medicare Benefits Schedule claims data, the cancer registry, and hospital admissions data. Benefits were measured in quality-adjusted life-years gained. Sensitivity analyses were undertaken for all model parameters. Results Specialized surveillance through the High Risk Clinic was both less expensive and more effective than standard care. The mean saving was A$6,828 (95% CI, $5,564 to $8,092) per patient, and the mean quality-adjusted life-year gain was 0.31 (95% CI, 0.27 to 0.35). The main drivers of the differences were detection of melanoma at an earlier stage resulting in less extensive treatment and a lower annual mean excision rate for suspicious lesions in specialized surveillance (0.81; 95% CI, 0.72 to 0.91) compared with standard care (2.55; 95% CI, 2.34 to 2.76). The results were robust when tested in sensitivity analyses. Conclusion Specialized surveillance was a cost-effective strategy for the management of individuals at high risk of melanoma. There were also fewer invasive procedures in specialized surveillance compared with standard care in the community.
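Because specialized surveillance was both cheaper and more effective, no ICER needs to be reported; the toy helper below (not the study's code) makes that dominance classification explicit, using the point estimates quoted above.

```python
# Illustrative helper: classify a comparison as dominant, dominated, or report an ICER.
# The numbers in the example call are the point estimates quoted in the abstract.
def compare(inc_cost, inc_qalys):
    if inc_cost <= 0 and inc_qalys >= 0:
        return "new strategy dominates (cheaper and at least as effective)"
    if inc_cost >= 0 and inc_qalys <= 0:
        return "new strategy is dominated"
    return f"ICER = {inc_cost / inc_qalys:,.0f} per QALY"

# Specialized surveillance vs standard care: saves A$6,828 and gains 0.31 QALYs per patient
print(compare(inc_cost=-6828, inc_qalys=0.31))
```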


2016 ◽  
Vol 157 (29) ◽  
pp. 1161-1170
Author(s):  
Zoltán Vokó ◽  
Gergő Túri ◽  
Adriána Zsólyom

Introduction: The burden of oral cancer is high in Hungary. Aim: To study the cost-effectiveness of potential oral cancer screening in Hungary. Method: Three strategies were compared: no introduction of screening, organized yearly screening of 40-year-old males in general medical practice, and opportunistic screening of high-risk 40-year-old males in primary care. Local estimates of the health utilities and costs of each health state and of the screening programmes were identified. The main outcomes were total costs, quality-adjusted life years, and incremental cost-effectiveness ratios. Results: Depending on the efficacy of treatments for precancerous lesions and the participation rate, the screening strategies are cost-effective over a 15–20-year time horizon. Opportunistic screening of high-risk people is more cost-effective than the other strategies. Conclusions: Opportunistic screening of high-risk people would be cost-effective in Hungary. The uncertainty about the efficacy of treatments for precancerous lesions requires more research to support evidence-based health policy making. Orv. Hetil., 2016, 157(29), 1161–1170.
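When more than two strategies are compared, as here, ICERs are usually computed incrementally along the cost-ordered, non-dominated frontier. The sketch below illustrates that bookkeeping with placeholder costs and QALYs; it omits extended-dominance pruning and is not the Hungarian model.

```python
# Sketch of an incremental comparison across more than two strategies:
# sort by cost, drop strictly dominated options, then compute ICERs between
# successive strategies on the frontier. Costs/QALYs are placeholders.
def incremental_analysis(strategies):
    # strategies: dict name -> (cost, qalys)
    ordered = sorted(strategies.items(), key=lambda kv: kv[1][0])
    frontier = []
    for name, (cost, qalys) in ordered:
        # drop options that cost more but yield fewer or equal QALYs
        if frontier and qalys <= frontier[-1][2]:
            continue
        frontier.append((name, cost, qalys))
    results = []
    for (prev, pc, pq), (cur, cc, cq) in zip(frontier, frontier[1:]):
        results.append((cur, prev, (cc - pc) / (cq - pq)))
    return results

example = {
    "no screening":            (1000.0, 9.50),
    "opportunistic high-risk": (1200.0, 9.58),
    "organized yearly":        (1900.0, 9.60),
}
for cur, prev, icer in incremental_analysis(example):
    print(f"{cur} vs {prev}: ICER = {icer:,.0f} per QALY")
```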


2021 ◽  
Vol 11 ◽  
Author(s):  
Weiting Liao ◽  
Huiqiong Xu ◽  
David Hutton ◽  
Qiuji Wu ◽  
Kexun Zhou ◽  
...  

Background: The INVICTUS trial assessed the efficacy and safety of ripretinib compared with placebo in the management of advanced gastrointestinal stromal tumors. Method: We used a Markov model with three health states: progression-free disease, progressive disease, and death. We parameterized the model from time-to-event data (progression-free survival, overall survival) of the ripretinib and placebo arms in the INVICTUS trial and extrapolated to a patient’s lifetime horizon. Estimates of health state utilities and costs were based on clinical trial data and the published literature. The outcomes of this model were measured in quality-adjusted life-years (QALYs), costs, and incremental cost-effectiveness ratios (ICERs). Uncertainty was tested via univariate and probabilistic sensitivity analyses. Results: The base-case model projected improved outcomes (by 0.29 QALYs) and additional costs (by $70,251), yielding an ICER of $244,010/QALY gained for ripretinib versus placebo. The results were most sensitive to progression rates, the price of ripretinib, and health state utilities. The ICER was most sensitive to overall survival: when overall survival in the placebo group was lower, the ICER dropped to $127,399/QALY. The ICER dropped to $150,000/QALY when the monthly cost of ripretinib decreased to $14,057. Probabilistic sensitivity analyses revealed that ripretinib was cost-effective in 41.1% of simulations at a willingness-to-pay (WTP) threshold of $150,000/QALY. Conclusion: As a fourth- or further-line therapy for advanced gastrointestinal stromal tumors, ripretinib is not cost-effective in the US. Ripretinib would become cost-effective with a price discount of 56%, given its present effectiveness.
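As an illustration of the three-state structure described, the sketch below traces a cohort through progression-free, progressed, and dead states in monthly cycles. The transition probabilities, costs, utilities, 3% discount rate, and 10-year horizon are all assumed for illustration; none are parameters from the INVICTUS data.

```python
# Minimal three-state Markov cohort trace (progression-free, progressed, dead),
# loosely mirroring the model structure described in the abstract. All inputs
# below are hypothetical placeholders, not INVICTUS-derived parameters.
import numpy as np

# Hypothetical monthly transition matrix: rows = from-state, cols = to-state
P = np.array([
    [0.88, 0.09, 0.03],   # progression-free -> PF / progressed / dead
    [0.00, 0.90, 0.10],   # progressed      -> PF / progressed / dead
    [0.00, 0.00, 1.00],   # dead is absorbing
])
cost_per_month = np.array([12000.0, 4000.0, 0.0])      # USD, illustrative
utility_per_month = np.array([0.80, 0.60, 0.0]) / 12   # QALYs accrued per month

state = np.array([1.0, 0.0, 0.0])   # cohort starts progression-free
total_cost = total_qalys = 0.0
annual_discount = 0.03               # assumed discount rate
for month in range(12 * 10):         # assumed 10-year horizon
    d = 1.0 / (1 + annual_discount) ** (month / 12)
    total_cost += d * state @ cost_per_month
    total_qalys += d * state @ utility_per_month
    state = state @ P

print(f"Discounted cost ≈ ${total_cost:,.0f}, QALYs ≈ {total_qalys:.2f}")
```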


2017 ◽  
Vol 52 (1) ◽  
pp. 7
Author(s):  
Octaviana Simbolon ◽  
Yulistiani Yulistiani ◽  
I DG Ugrasena ◽  
Mariyatul Qibtiyah

Glucocorticoids play an important role in the treatment of acute lymphoblastic leukemia (ALL). However, supraphysiological doses may cause suppression of the adrenal gland. Adrenal suppression resulting in a reduced cortisol response may cause an inadequate host defence against infections, which remains a cause of morbidity and mortality in children with ALL. The occurrence of adrenal suppression before and after glucocorticoid therapy for childhood ALL is unclear. The aim of this study was to analyse the effect of glucocorticoids on cortisol levels during induction phase chemotherapy in children with acute lymphoblastic leukemia. A cross-sectional, observational prospective study was conducted to determine the effect of glucocorticoids on cortisol levels in children with acute lymphoblastic leukemia. Patients who met the inclusion criteria were given dexamethasone or prednisone therapy for 49 days according to the 2013 Indonesian Chemotherapy ALL Protocol. Cortisol levels were measured on days 0, 14, 28, 42 and 56 of induction phase chemotherapy. Of the 31 children recruited, 24 children with acute lymphoblastic leukemia were included. Before treatment, mean cortisol levels were 228.95 ng/ml in the standard-risk group (prednisone) and 199.67 ng/ml in the high-risk group (dexamethasone). In the standard-risk group, adrenal suppression occurred at about day 56. In the high-risk group, there was a significant decrease in cortisol levels on days 14, 28 and 42 compared with day 0 of the induction phase (p=0.001). The two groups displayed different peak cortisol levels after 6 weeks of the induction phase (p=0.028). Dexamethasone resulted in lower cortisol levels than prednisone during induction phase chemotherapy in children with acute lymphoblastic leukemia.


2021 ◽  
Vol 4 (1) ◽  
Author(s):  
Makayla Kirksey ◽  
Brownsyne Tucker Edmonds

Background/Objective: The optimal mode of delivery (MOD) for malpresentation in periviable deliveries (22-24 weeks) remains a source of debate. Neonatal and maternal complications can arise from both vaginal (VD) and cesarean delivery (CD), and the threat of maternal morbidity extends to subsequent pregnancies. It has been difficult to compare these risks while counseling patients about MOD options, so we sought to create a decision tree that maps probable outcomes associated with breech deliveries at 23- and 24-weeks’ gestation, as well as complications posed for subsequent pregnancies. Methods: An extensive literature review was conducted to identify risk estimates of periviable maternal and neonatal outcomes, along with elective repeat CD (ERCD) and trial of labor after cesarean (TOLAC) for subsequent pregnancies. Probabilities were entered into TreeAge software, starting with the primary maternal health states that may result from CD and VD (“death”, “hysterectomy”, or “no hysterectomy”), followed by the probability of neonatal health states (“death”, “severe morbidity”, or “no severe morbidity”). The likelihood of placenta previa or a normal placenta was considered for subsequent pregnancies. We factored in the possibility of ERCD or TOLAC and the associated maternal and neonatal risks for each. Results: The final design of the tree is complete and risk estimates have been entered. Primary analysis and sensitivity analyses are planned for August 2021. Ultimately, we will also be able to use measured utility values to calculate quality-adjusted life years (QALYs) for each health state. Conclusion and Clinical Impact: Whether CD or VD is optimal for breech presentation in periviable delivery is influenced by a complex array of factors, including future reproductive plans and maternal values related to potential neonatal and maternal morbidity and mortality. Quantifying the risks associated with each MOD will aid providers in their efforts to help families make informed decisions and reduce morbidity across the reproductive lifespan.
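The analysis is built in TreeAge, but the expected-value rollback such a tree performs can be sketched in a few lines. The toy tree below mirrors the maternal-then-neonatal branching described; all probabilities and terminal utilities are hypothetical placeholders, not the study's estimates.

```python
# Sketch of rolling back a small decision tree by expected value. The two
# branches are toy versions of the structure described (maternal state first,
# then neonatal state); every probability and utility is hypothetical.
def expected_value(node):
    """A node is either a terminal {'value': v} or {'branches': [(prob, node), ...]}."""
    if "value" in node:
        return node["value"]
    return sum(p * expected_value(child) for p, child in node["branches"])

cesarean = {"branches": [
    (0.02, {"value": 0.00}),                       # maternal death (utility 0)
    (0.98, {"branches": [                          # maternal survival
        (0.40, {"value": 0.30}),                   # neonatal death
        (0.35, {"value": 0.55}),                   # severe neonatal morbidity
        (0.25, {"value": 0.85}),                   # no severe morbidity
    ]}),
]}
vaginal = {"branches": [
    (0.01, {"value": 0.00}),
    (0.99, {"branches": [
        (0.55, {"value": 0.30}),
        (0.30, {"value": 0.55}),
        (0.15, {"value": 0.85}),
    ]}),
]}

for name, tree in [("cesarean", cesarean), ("vaginal", vaginal)]:
    print(f"{name}: expected utility = {expected_value(tree):.3f}")
```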


Blood ◽  
2004 ◽  
Vol 104 (11) ◽  
pp. 1953-1953 ◽  
Author(s):  
Susan R. Rheingold ◽  
Nancy J. Bunin ◽  
Richard Aplenc ◽  
Ann M. Leahey ◽  
Beverly J. Lange

Abstract Relapses of childhood ALL that occur on therapy are associated with a dismal survival. To improve the prognosis for these patients, we developed an intensive multiagent chemotherapy protocol consisting of an induction with idarubicin, vincristine (VCR), dexamethasone (DEX), and peg-asparaginase (PEG). Consolidation included high-dose cytarabine (ARA-C), etoposide (VP-16), and PEG followed by VCR and methotrexate (MTX). After an interim maintenance (IM), induction and consolidation were repeated, followed by maintenance therapy lasting two years. Maintenance and IM consisted of alternating two-week cycles of VP-16, ARA-C, and PEG with MTX, VCR, DEX, and PEG. Between 1992 and 2002, 53 pts (32M, 21F) received treatment according to this protocol, 21 of whom were treated as part of the original study (Leahey AM et al., Med Ped Onco 34(5):313–8, 2000). Median time to relapse was 37 months from diagnosis (range 12–86 mos). Twenty-one pts were on therapy at relapse and 25 pts were <36 months from diagnosis. Relapses included isolated bone marrow (BM) (32 pts), BM and central nervous system (CNS) (9 pts), BM and testicular (3 pts), and extramedullary (EM) (9 pts). By present-day criteria, 26 pts were standard risk (SR), 23 were high risk (HR), and 4 were infants. Two patients died in induction, and 2 never achieved a second remission. All others achieved remission by the end of induction (92%). Five-year event-free survival (EFS) and overall survival (OS) are both 56% +/−7% (CI 41%–68%), at a mean of 47 months from relapse (range 0–141 mos). Patients with a first complete remission (CR1) duration <36 months have an EFS of 40% +/−10% (CI 21%–58%); CR1 >36 months is associated with an EFS of 70% +/−9% (CI 50%–84%). Of the events in all pts who initiated therapy, 13 were from refractory/recurrent ALL (25%) and 10 pts died of toxicity (19%). Four pts died from chemotherapy-induced toxicity (8%), and 6 died from transplant (BMT)-related toxicities (11%). Fourteen pts in second remission proceeded to BMT at a mean of 5 months from relapse (range 4.5–8 mos), and 7 of these pts remain in CR2. Of the 39 pts who continued on chemotherapy, 6 pts (35%) with CR1 <36 months remain in CR2 and 16 pts (57%) with CR1 >36 months remain in CR2 or CR3. The 19 pts who were treated on modern standard-risk (SR) protocols (CCG-1881 to present) were more salvageable than their 20 high-risk (HR) counterparts (CCG-1882 to present). Five-year EFS for SR and HR pts is 67% +/−8% (CI 48%–80%) and 38% +/−12% (CI 16%–59%), respectively. Intensive rotating therapy with reinduction and reconsolidation improves EFS. Children with early EM relapses of ALL, CR1 >36 months, and SR patients all have very good long-term survival. Novel therapies need to be integrated into intensive relapse protocols for children with early BM relapse and children treated upfront on HR protocols.


2018 ◽  
Vol 146 (14) ◽  
pp. 1834-1840 ◽  
Author(s):  
A. Kowada

Abstract Gastric cancer is the third leading cause of cancer death worldwide. Gastric cancer screening using upper gastrointestinal series, endoscopy and serological testing has been performed in population-based (employee-based and community-based) and opportunistic cancer screening in Japan. There were 45 531 gastric cancer deaths in 2016, with low screening and detection rates. Helicobacter pylori (H. pylori) screening followed by eradication treatment is recommended in high-risk population settings to reduce gastric cancer incidence. The aim of this study was to evaluate the cost-effectiveness of H. pylori screening followed by eradication treatment for a high-risk population in the occupational health setting. Decision trees and Markov models were developed for two strategies: H. pylori antibody test (HPA) screening and no screening. The targeted populations were hypothetical cohorts of employees aged 20, 30, 40, 50 and 60 years, using a company health payer perspective over a lifetime horizon. Per-person costs and effectiveness (quality-adjusted life-years) were calculated and compared. HPA screening yielded greater benefits at lower cost than no screening. One-way and probabilistic sensitivity analyses using Monte Carlo simulation showed strong robustness of the results. H. pylori screening followed by eradication treatment is recommended to prevent gastric cancer in employees in Japan, on the basis of cost-effectiveness.
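A minimal sketch of the one-way sensitivity analysis mentioned follows: each parameter is varied across a plausible range while the others stay at base case, and the incremental net monetary benefit is recorded. The toy screening model, parameter names, values, and willingness-to-pay threshold are all assumptions for illustration, not the published Japanese model.

```python
# Sketch of a one-way sensitivity analysis: vary one parameter at a time over
# a range and record the incremental net monetary benefit of screening versus
# no screening. The toy model and all values are placeholders.
def incremental_nmb(params, wtp=5_000_000):      # JPY per QALY, assumed threshold
    p = params
    # Toy relationship: screening costs money up front but averts a share of cancers.
    cancers_averted = p["prevalence"] * p["eradication_uptake"] * p["risk_reduction"]
    inc_cost = p["screening_cost"] - cancers_averted * p["cancer_cost"]
    inc_qaly = cancers_averted * p["qaly_loss_per_cancer"]
    return wtp * inc_qaly - inc_cost

base = {
    "prevalence": 0.30, "eradication_uptake": 0.80, "risk_reduction": 0.40,
    "screening_cost": 3000.0, "cancer_cost": 2_000_000.0, "qaly_loss_per_cancer": 3.0,
}
ranges = {
    "prevalence": (0.15, 0.45),
    "risk_reduction": (0.25, 0.55),
    "screening_cost": (1500.0, 6000.0),
}

for name, (low, high) in ranges.items():
    results = [incremental_nmb(dict(base, **{name: value})) for value in (low, high)]
    print(f"{name}: incremental NMB ranges from {min(results):,.0f} to {max(results):,.0f} JPY")
```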


10.36469/9829 ◽  
2016 ◽  
Vol 4 (1) ◽  
pp. 90-102
Author(s):  
Louise Perrault ◽  
Dilip Makhija ◽  
Idal Beer ◽  
Suzanne Laplante ◽  
Sergio Iannazzo ◽  
...  

Background: Patients developing acute kidney injury (AKI) during critical illness or major surgery are at risk for renal sequelae such as costly and invasive acute renal replacement therapy (RRT) and chronic dialysis (CD). Rates of renal injury may be reduced with the use of chloride-restrictive intravenous (IV) resuscitation fluids instead of chloride-liberal fluids. Objectives: To compare the cost-effectiveness of chloride-restrictive versus chloride-liberal crystalloid fluids used during fluid resuscitation or for the maintenance of hydration among patients hospitalized in the US for critical illnesses or major surgery. Methods: Clinical outcomes and costs for a simulated patient cohort (starting age 60 years) receiving either chloride-restrictive or chloride-liberal crystalloids were estimated using a decision tree for the first 90-day period after IV fluid initiation, followed by a Markov model over the remainder of the cohort lifespan. Outcomes modeled in the decision tree were AKI development, recovery from AKI, progression to acute RRT, progression to CD, and death. Health states included in the Markov model were dialysis-free without prior AKI, dialysis-free following AKI, CD, and death. Estimates of clinical parameters were taken from a recent meta-analysis, other published studies, and the US Renal Data System. Direct healthcare costs (in 2015 USD) were included for IV fluids, RRT, and CD. US-normalized health-state utilities were used to calculate quality-adjusted life years (QALYs). Results: In the cohort of 100 patients, AKI was predicted to develop in the first 90 days in 36 patients receiving chloride-liberal crystalloids versus 22 receiving chloride-restrictive crystalloids. Higher costs of chloride-restrictive crystalloids were offset by savings from avoided renal adverse events. Chloride-restrictive crystalloids were dominant over chloride-liberal crystalloids, gaining 93.5 life-years and 81.4 QALYs while saving $298 576 over the cohort lifespan. One-way sensitivity analyses indicated that results were most sensitive to the relative risk for AKI development and relatively insensitive to fluid cost. In probabilistic sensitivity analyses with 1000 iterations, chloride-restrictive crystalloids were dominant in 94.7% of iterations, with incremental cost-effectiveness ratios below $50 000/QALY in 99.6%. Conclusions: This analysis predicts improved patient survival and fewer renal complications with chloride-restrictive IV fluids, yielding net savings versus chloride-liberal fluids. Results require confirmation in adequately powered head-to-head randomized trials.
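The probabilistic results quoted (share of iterations in which one strategy is dominant, share with an ICER below $50,000/QALY) can be tallied from PSA draws along the lines of the sketch below; the sampled incremental costs and QALYs are random placeholders, not the study's simulations.

```python
# Sketch of summarizing a probabilistic sensitivity analysis: the fraction of
# iterations in which the chloride-restrictive strategy is dominant (cheaper
# and more effective) and the fraction with an ICER below $50,000/QALY.
# Sampled incremental costs/QALYs are random placeholders.
import numpy as np

rng = np.random.default_rng(42)
n = 1000
inc_cost = rng.normal(-2500, 1500, size=n)    # USD per patient, illustrative
inc_qaly = rng.normal(0.8, 0.3, size=n)       # QALYs per patient, illustrative

dominant = (inc_cost < 0) & (inc_qaly > 0)
below_50k = dominant | ((inc_qaly > 0) & (inc_cost / inc_qaly < 50_000))

print(f"Dominant in {dominant.mean():.1%} of iterations")
print(f"Dominant or ICER below $50,000/QALY in {below_50k.mean():.1%} of iterations")
```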

