Simulated cohort: Recently Published Documents

TOTAL DOCUMENTS: 12 (five years: 3)
H-INDEX: 2 (five years: 0)

2021, Vol 39 (15_suppl), pp. 10028-10028
Author(s): Florence Lennie Wong, Janie M. Lee, Wendy M. Leisenring, Joseph Philip Neglia, Rebecca M. Howell, ...

Background: Female survivors of childhood Hodgkin lymphoma (HL) treated with ≥10 Gy of chest radiation are at high risk for breast cancer (BC). The Children’s Oncology Group (COG) guidelines recommend clinical breast examination (CBE) annually starting at puberty and semiannually from age 25, plus lifetime annual mammography (MAM) and breast magnetic resonance imaging (MRI) starting 8 years after chest radiation or at age 25, whichever is later. While the imaging-based screening recommendations are largely consistent with US guidelines for women at high BC risk, only the COG guidelines recommend CBE. The benefit of lifetime CBE starting at puberty in chest-irradiated HL survivors is unknown. Methods: Life-years (LYs) and lifetime BC mortality risk were estimated from a simulated cohort of 5 million HL survivors, using data from 5-year female survivors of HL in the Childhood Cancer Survivor Study (CCSS) treated with ≥10 Gy of chest radiation. The simulated cohort underwent annual MAM+MRI from age 25 for life, with and without annual CBE from age 11 (presumed age of puberty) to age 24, and with and without semiannual CBE from age 25 for life, assuming 100% adherence. BC included in-situ and invasive BC. Treatment-related BC incidence and non-BC mortality risks were estimated from the CCSS data. Risks at age <25 were extrapolated from the CCSS estimates, while risks beyond age 50 were additionally extrapolated using US population rates. CBE sensitivity (17.8%, in-situ and invasive BC) and specificity (98%), and MAM+MRI sensitivity (84.2-86.0%, in-situ; 96.7-97.1%, invasive) and specificity (75.3%), were obtained from the medical literature. Results: The CCSS cohort included 1057 female HL survivors. BC (all invasive) developed in three patients at age <25 (ages 23, 24, and 24). In the simulated cohort receiving no screening, lifetime BC risk was 40.8% and BC mortality was 17.5%.
HL survivors around age 50 were at a 7.4-fold higher risk of developing BC and a 5.2-fold higher risk of non-BC mortality compared with the general population. Compared with no annual CBE at ages 11-24, annual CBE did not increase the gain in LYs or the reduction in lifetime BC mortality relative to no screening (Table). Among those who survived to age ≥25, semiannual CBE from age 25 for life, compared with no semiannual CBE, likewise yielded little gain in LYs or reduction in lifetime BC mortality relative to no screening. Conclusions: Lifetime CBE starting at puberty in conjunction with MAM+MRI appears to add little survival benefit compared with no CBE, suggesting that the COG guidelines could be revised without adverse effect on long-term outcomes for chest-irradiated female survivors of childhood HL. [Table: see text]
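The intuition behind the near-zero marginal value of CBE can be caricatured in a few lines of code: when annual MAM+MRI already detects invasive cancers with roughly 97% sensitivity, CBE can only act on the small fraction imaging misses. The sketch below is a deliberately simplified one-round toy model, not the abstract's age-structured microsimulation; the lifetime risk (40.8%) and sensitivities (17.8% CBE, ~97% MAM+MRI for invasive BC) are the abstract's figures, everything else is illustrative.

```python
import random

def marginal_cbe_detection(n, bc_risk=0.408, mam_mri_sens=0.97,
                           cbe_sens=0.178, seed=1):
    """Toy one-round sketch: for each simulated survivor who develops
    breast cancer, MAM+MRI detects it with high sensitivity; semiannual
    CBE (two exams) only matters for the cancers imaging misses.
    Returns detected fractions without and with CBE."""
    rng = random.Random(seed)
    cancers = detected = detected_cbe = 0
    for _ in range(n):
        if rng.random() >= bc_risk:              # no breast cancer
            continue
        cancers += 1
        if rng.random() < mam_mri_sens:          # caught by imaging
            detected += 1
            detected_cbe += 1
        elif rng.random() < 1 - (1 - cbe_sens) ** 2:  # 2 CBEs per year
            detected_cbe += 1                    # caught by CBE alone
    return detected / cancers, detected_cbe / cancers

frac_imaging, frac_with_cbe = marginal_cbe_detection(100_000)
```

Because CBE only sees the ~3% of cancers imaging misses, its marginal detection yield is on the order of one percentage point, consistent with the abstract's finding of negligible added benefit.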



PLoS ONE, 2021, Vol 16 (3), pp. e0247941
Author(s): Philippe van Wilder, Irina Odnoletkova, Mehdi Mouline, Esther de Vries

Background Common variable immunodeficiency disorders (CVID), the most common form of primary antibody deficiency, are rare conditions associated with considerable morbidity and mortality. The clinical benefit of immunoglobulin replacement therapy (IgGRT) is substantial: timely treatment with appropriate doses significantly reduces mortality and the incidence of CVID complications such as major infections and bronchiectasis. Unfortunately, CVID patients still face a median diagnostic delay of 4 years. Their disease burden, expressed in annual loss of disability-adjusted life years, is 3-fold higher than in the general population. Hurdles to treatment access and reimbursement by healthcare payers may exist because the value of IgGRT is poorly documented. This paper aims to demonstrate the cost-effectiveness and cost-utility (effects on life expectancy and quality of life) of IgGRT in CVID. Methods and findings With input from a literature search, we built a health-economic model for cost-effectiveness and cost-utility assessment of IgGRT in CVID. We compared a mean literature-based dose (≥450 mg/kg/4wks) to a zero-or-low dose (0 to ≤100 mg/kg/4wks) in a simulated cohort of adult patients from the time of diagnosis until death; we also estimated the economic impact of diagnostic delay in this simulated cohort. Compared to no or minimal treatment, IgGRT showed an incremental benefit of 17 life-years (LYs) and 11 quality-adjusted life-years (QALYs), resulting in an incremental cost-effectiveness ratio (ICER) of €29,296/LY and €46,717/QALY. These results were robust in sensitivity analysis. Reducing the diagnostic delay by 4 years provided an incremental benefit of six LYs and four QALYs compared to simulated patients with delayed IgGRT initiation, resulting in an ICER of €30,374/LY and €47,495/QALY. Conclusions The health-economic model suggests that early initiation of IgGRT, compared to no or delayed IgGRT, is highly cost-effective.
CVID patients’ access to IgGRT should be facilitated, not only because of its proven clinical efficacy but also because of its demonstrated cost-effectiveness.
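The headline results follow from the standard ICER definition: incremental cost divided by incremental benefit. A minimal sketch; the incremental cost below is back-calculated from the reported €29,296/LY and 17 LYs, so it is an approximate implied figure, not a number taken from the paper.

```python
def icer(delta_cost, delta_effect):
    """Incremental cost-effectiveness ratio: additional cost per
    additional unit of benefit (life-year or QALY) vs. the comparator."""
    if delta_effect <= 0:
        raise ValueError("comparator not dominated; ICER undefined")
    return delta_cost / delta_effect

# Approximate incremental lifetime cost implied by the LY results:
# 17 LYs x EUR 29,296/LY, i.e. roughly EUR 0.5M per patient.
delta_cost = 17 * 29_296
ratio = icer(delta_cost, 17)   # EUR per life-year gained
```

Dividing the same (rounded) incremental cost by the 11 incremental QALYs gives a figure in the neighborhood of the reported €46,717/QALY; small discrepancies are rounding in the published values.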



2020, Vol 30 (4), pp. 543-552
Author(s): Sarah S. Romano, Kemi M. Doll

Objective: To assess the predicted performance of the American College of Obstetrics and Gynecology (ACOG)’s recommended endometrial thickness (ET) threshold of ≥4mm via transvaginal ultrasound (TVUS) for a simulated cohort of US Black women with postmenopausal bleeding (PMB). Methods: We used endometrial cancer parameters from the ET studies upon which guidelines are based, as well as documented population characteristics of US Black women, to simulate a cohort of US Black women with PMB. Annual endometrial cancer (EC) prevalence overall and by histology type (I and II), history and current diagnosis of uterine fibroids, and visibility of endometria were estimated. Sensitivity analyses were performed to assess performance changes with quality of baseline parameters and the impact of fibroids on ET visibility. Main Outcome Measures: Performance characteristics of the 3+, 4+, and 5+mm ET thresholds were assessed, including sensitivity, specificity, positive predictive value (PPV), negative predictive value (NPV), receiver operating characteristic (ROC) curves, and the area under the curve (AUC). Results: In the main model with the recommended 4+mm threshold, TVUS ET showed a sensitivity of 47.5% (95% CI: 46.0-49.0%), specificity of 64.9% (95% CI: 64.4-65.3%), PPV of 13.1% (95% CI: 12.5-13.6%), NPV of 91.7% (95% CI: 91.4-92.1%), and AUC of .57 (95% CI: .56-.57). Conclusions: Among a simulated cohort of US Black women, the recommended 4+mm ET threshold to trigger diagnostic biopsy for EC diagnosis performed poorly, with more than 50% of cases missed and an 8-fold higher frequency of false negative results than reported for the general population. Ethn Dis. 2021;30(4):543-552; doi:10.18865/ed.30.4.543
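The reported PPV and NPV follow from Bayes' theorem applied to sensitivity, specificity and the cancer prevalence in the simulated cohort. A sketch; the ~10% prevalence used below is our own assumption, chosen because it roughly reproduces the reported predictive values, whereas the paper derives prevalence from documented population parameters.

```python
def ppv_npv(sens, spec, prevalence):
    """Predictive values from test characteristics and prevalence."""
    tp = sens * prevalence               # true positives
    fp = (1 - spec) * (1 - prevalence)   # false positives
    tn = spec * (1 - prevalence)         # true negatives
    fn = (1 - sens) * prevalence         # false negatives
    return tp / (tp + fp), tn / (tn + fn)

# Reported 4+mm threshold characteristics: sens 47.5%, spec 64.9%.
ppv, npv = ppv_npv(0.475, 0.649, 0.10)   # assumed ~10% EC prevalence
```

With these inputs the PPV comes out near 13% and the NPV near 92%, matching the reported 13.1% and 91.7%; at such a prevalence, a sensitivity below 50% is what drives the high false-negative frequency the authors highlight.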



Author(s): Jonathan A Michaels, Matt D Stevenson

ABSTRACT: There has been extensive speculation about the apparent differences in mortality between countries reporting confirmed cases of and deaths due to Covid-19. A number of explanations have been suggested, but there is no clear evidence about how apparent fatality rates may be expected to vary with different testing regimes, admission policies and other variables. An individual patient simulation model was developed to address this question. Parameters and sensitivity analyses were based upon recent international data sources for Covid-19, and results were averaged over 100 iterations for a simulated cohort of over 500,000 patients. Different testing regimes for Covid-19 were considered: testing admitted patients only, various rates of community testing of symptomatic cases, and active contact-tracing and screening. In the base case analysis, apparent mortality ranged from 10.5% under a policy of testing only admitted patients to 0.4% with intensive contact tracing and community testing. These findings were sensitive to assumptions regarding admission rates and the rate of spread: more selective admission policies and suppression of spread increased apparent mortality, which could exceed 18% under some circumstances. Under all scenarios, the proportion of patients tested in the community had the greatest impact on apparent mortality. Whilst differences in mortality due to health service and demographic factors cannot be excluded, the current international differences in reported mortality are all consistent with differences in practice regarding screening, community testing and admission policies.
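The core mechanism is that apparent mortality depends on the denominator of confirmed cases, which in turn depends on who is tested. A sketch with illustrative numbers (all inputs below are hypothetical, not the model's parameters), assuming all admitted patients are tested and deaths occur among admitted, hence confirmed, patients:

```python
def apparent_cfr(deaths, true_cases, admitted_share, community_test_rate):
    """Apparent case-fatality ratio: deaths divided by confirmed cases,
    where all admitted patients are tested but only a fraction of
    community cases are."""
    confirmed = true_cases * (admitted_share
                              + (1 - admitted_share) * community_test_rate)
    return deaths / confirmed

# 1,000,000 true infections, 10,000 deaths, 5% admitted:
only_admitted = apparent_cfr(10_000, 1_000_000, 0.05, 0.0)   # 20.0%
broad_testing = apparent_cfr(10_000, 1_000_000, 0.05, 0.5)   # ~1.9%
```

The same underlying fatality yields an order-of-magnitude range of apparent mortality purely from testing policy, the pattern the base-case analysis reports (10.5% down to 0.4%).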



2019, Vol 2019, pp. 1-10
Author(s): Markus Hodel, Patricia R. Blank, Petra Marty, Olav Lapaire

In Switzerland, 2.3% of pregnant women develop preeclampsia. Quantification of the soluble fms-like tyrosine kinase-1 (sFlt-1) to placental growth factor (PlGF) ratio has shown diagnostic value in the second and third trimesters of pregnancy, in particular in ruling out preeclampsia within one week. We estimated the economic impact of implementing sFlt-1/PlGF ratio evaluation, in addition to the standard of care (SOC), for women with suspected preeclampsia from the Swiss healthcare system's perspective. A decision tree model was developed to estimate the direct medical costs of diagnosis and management of a simulated cohort of Swiss pregnant women with suspected preeclampsia (median week of gestation: 32) until delivery. The model compared SOC vs. SOC plus the sFlt-1/PlGF ratio, using clinical inputs from a large multicenter study (PROGNOSIS). Resource use data and unit costs were obtained from hospital records and public sources. The assumed cost for sFlt-1/PlGF evaluation was €141. Input parameters were validated by clinical experts in Switzerland. The model utilized a simulated cohort of 6084 pregnant women with suspected preeclampsia (representing 7% of all births in Switzerland in 2015, n=86,919). In the SOC scenario, 36% of women were hospitalized, of whom 27% developed preeclampsia and remained hospitalized until birth. In the sFlt-1/PlGF test scenario, 76% of women had an sFlt-1/PlGF ratio of ≤38 (2% hospitalized), 11% had a ratio of >38 to <85 (55% hospitalized), and 13% had a ratio of ≥85 (65% hospitalized). Average total costs per pregnant woman (including birth) were €10,925 (SOC) vs. €10,579 (sFlt-1/PlGF), and total costs were €66,469,362 vs. €64,363,060 (sFlt-1/PlGF). Implementation of sFlt-1/PlGF evaluation would potentially achieve annual savings of €2,105,064 (€346/patient), mainly due to a reduction in unnecessary hospitalization. sFlt-1/PlGF evaluation appears economically promising in predicting the short-term absence of preeclampsia in Swiss practice.
Improved diagnostic accuracy and reduction in unnecessary hospitalization could lead to significant cost savings in the Swiss healthcare system.
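The cost comparison reduces to an expected-cost calculation over the decision tree's risk strata. A sketch using the stratum shares and hospitalization probabilities quoted in the abstract; the €20,000 and €3,000 pathway costs below are hypothetical placeholders, whereas the real model uses Swiss hospital records and unit costs.

```python
def expected_cost(strata):
    """Expected cost per woman over decision-tree strata:
    sum of P(stratum) * [P(hosp)*cost_hosp + (1-P(hosp))*cost_ambulatory]."""
    return sum(p * (p_hosp * c_hosp + (1 - p_hosp) * c_amb)
               for p, p_hosp, c_hosp, c_amb in strata)

# sFlt-1/PlGF arm; shares and hospitalization rates from the abstract,
# per-pathway costs hypothetical.
strata = [(0.76, 0.02, 20_000, 3_000),   # ratio <= 38
          (0.11, 0.55, 20_000, 3_000),   # ratio > 38 to < 85
          (0.13, 0.65, 20_000, 3_000)]   # ratio >= 85
cost_per_woman = expected_cost(strata)
```

Multiplying the per-woman expected cost by the cohort size (6084) gives the arm's total cost; the reported saving is simply the difference between the two arms' totals computed this way.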



2019, Vol 26 (1), pp. 103-111
Author(s): Diana M Negoescu, Eva A Enns, Brooke Swanhorst, Bonnie Baumgartner, James P Campbell, ...

Proactive therapeutic drug monitoring of infliximab is a marginally cost-effective strategy for Crohn’s disease, whereas reactive therapeutic drug monitoring is cost-effective. As the cost of infliximab decreases, a proactive strategy of dosing infliximab becomes more cost-effective.



Blood, 2018, Vol 132 (Supplement 1), pp. 1041-1041
Author(s): Gabriel Tremblay, Qayyim Said, Beilei Cai, Shan Ashton Garib, Dimitrios Tomaras, ...

Abstract Background: Severe aplastic anemia (SAA) is a rare bone marrow disorder characterized by inadequate levels of peripheral, multi-lineage blood cells. Of the two available first-line treatments for SAA, allogeneic hematopoietic stem cell transplantation is limited by patient eligibility and donor availability, while immunosuppressive therapy (IST) is characterized by a significant proportion of non-responders, toxicity, and a risk of transformation to diseases such as acute myelogenous leukemia. Patients who do not respond to treatment become transfusion dependent, which has a significant impact on patients' quality of life as well as on healthcare costs. Eltrombopag is the only thrombopoietin receptor (TPO-R) agonist approved for the treatment of refractory SAA. In a Phase I/II clinical trial, eltrombopag given in association with IST showed efficacy in patients with treatment-naive SAA, offering a significantly improved first-line alternative for patients affected by SAA (Townsley, et al 2017). In the US healthcare environment, there is a need to compare costs and consequences to understand value. Objective: To evaluate eltrombopag as a first-line treatment versus IST alone for SAA from the American private healthcare system perspective. Methods: A responder model for newly diagnosed SAA patients was created to assess the treatment pathway and economic impact of including eltrombopag in addition to IST (antithymocyte globulin and cyclosporine A) as a first-line treatment. A simulated cohort with two treatment arms underwent 6 months of treatment with either eltrombopag in addition to IST or IST alone and was followed over a 3-year time period. Each arm received a diagnostic test measuring response at 6 months. Patients who achieved complete or partial response in either arm received low-dose cyclosporine A as maintenance therapy for an additional 6 months of treatment.
Patients who did not respond in either arm continued with eltrombopag monotherapy as a second-line therapy for an additional 6 months. First-line therapy (eltrombopag with IST, IST alone), maintenance therapy (low-dose cyclosporine A), second-line therapy (eltrombopag monotherapy), administration, routine care, mortality and adverse event costs were included in the analysis. Workplace productivity related costs were not considered. Response rates, mortality, dosing, treatment duration and adverse event rates for each arm were based on a phase I/II trial (Townsley, et al 2017). Drug costs were obtained from a large online database (REDBOOK Online). Administration costs were based on the 2017 CMS Medical Fee Schedule. Routine care rates (visits, hospitalizations, tests and transfusions) were based on published data (Peffault De Latour, et al, 2017). Routine care, mortality and adverse event costs were based on CPT codes from the American Medical Association, HCUPnet and published data (Toner, et al 2011). All cost data are reported in 2018 US dollars. See Figure 1 for details. Results: In a simulated cohort with a population of one million, the annual incidence of aplastic anemia was 0.000234% and SAA accounted for 83.8% of those cases. The two treatment paths were compared for their consequences. Based on the clinical trial data, 94% of patients in the eltrombopag plus IST arm experienced treatment response, versus only 66% in the IST arm. Further, patients in the eltrombopag plus IST arm had an annual risk of mortality reduced by 0.3% relative to the IST arm. Use of eltrombopag as a first-line therapy produced a cost increase of $77,442 over 3 years. First-line drug costs accounted for an increase of $109,147, while improvements in response rates led to cost offsets for second-line drugs and produced $29,663 in savings.
Adverse event, routine care and mortality costs had relatively negligible effects on either treatment arm over the 3-year time period. Sensitivity analyses confirmed the robustness of the analysis. Conclusion: When following the treatment approaches specified in clinical studies, high response rates, combined with a reduced risk of mortality, less usage of rescue medication, and a low disease incidence, are likely to lead to manageable economic consequences with eltrombopag + IST therapy from the American private healthcare system perspective. In a simulated cohort with a population of one million, this was estimated to be $77,442 over 3 years. Disclosures: Said: Novartis: Employment. Cai: Novartis: Employment. Forsythe: Novartis: Consultancy. Roy: Novartis: Employment.



Blood, 2015, Vol 126 (23), pp. 5154-5154
Author(s): Francois-Xavier Mahon, An Tran-Duy, Raechelle G. Ocampo, David Ray, Estella Mendelson, ...

Abstract Background: Tyrosine kinase inhibitors (TKIs) have dramatically improved outcomes in patients with Ph+ CML, in part by allowing achievement of sustained, low levels of BCR-ABL1 transcripts (quantified on the International Scale [IS]). Treatment-free remission (TFR) studies (eg, STIM, ENESTfreedom) evaluate whether some of these patients can stop TKI therapy and maintain a therapeutic molecular response off treatment. Here, we evaluated BCR-ABL1IS transcript levels of patients treated with frontline nilotinib (NIL) 300 mg twice daily or imatinib (IM) 400 mg once daily for ≥ 1 year to predict time to TFR eligibility according to the ENESTfreedom trial criteria. Methods: A statistical model was developed to predict the probability of future premature treatment discontinuation (due to adverse events, progression, or suboptimal response) and BCR-ABL1IS transcript levels at any time point after 1 year of treatment. The 5-year data from the ENESTnd clinical trial, in which BCR-ABL1IS transcript levels were assessed every 3 months, formed the basis for this model. Probabilities of premature treatment discontinuation were modeled using parametric survival methods; early molecular response (EMR; BCR-ABL1IS ≤ 10% at 3 months) status was used as a predictor. For patients remaining on treatment, a second-order Markov chain model was used to predict the probabilities of BCR-ABL1IS transcript levels falling in each of 5 clinically relevant categories (≤ 0.0032% [MR4.5], > 0.0032% to ≤ 0.01% [MR4], > 0.01% to ≤ 0.1%, > 0.1% to ≤ 10%, and > 10%) at any time point after 1 year of therapy. Probabilities were a function of EMR status, the proportion of previous BCR-ABL1IS observations at or below MR4, and the BCR-ABL1IS categories from the previous 2 assessments. A simulated cohort of 1000 patients was created to match the distribution of EMR status and BCR-ABL1IS categories in the first year of therapy in each of the trial populations (NIL or IM).
Premature treatment discontinuation and BCR-ABL1IS categories were randomly drawn at each 3-month interval based on their corresponding predicted probabilities. TFR eligibility was defined as: last BCR-ABL1IS assessment of MR4.5, none of the prior 3 assessments worse than MR4, and no more than 2 of the prior 3 assessments between MR4 and MR4.5. Results: For years 2 to 5 of the ENESTnd trial, the observed distribution of BCR-ABL1IS categories over time showed reasonable agreement with the computer-simulated cohort. Simulation results (Figure) demonstrated that more patients on NIL than on IM were eligible for TFR by year 5 (52% vs 38%, respectively) and by year 10 (72% vs 64%, respectively; P < .0001 for both time points). Conclusion: Patients in our simulated cohort received a minimum of 3 years of frontline treatment with NIL or IM prior to TFR eligibility evaluation, similar to the current consensus in clinical disease management. Treatment with NIL resulted in significantly more patients becoming eligible for TFR by all time points vs treatment with IM. These data suggest that TFR as a therapeutic goal may be more attainable with NIL than with IM. Studies evaluating the duration of TFR are presently being conducted. Disclosures: Mahon: Novartis: Consultancy, Honoraria; Bristol-Myers Squibb: Consultancy, Honoraria; ARIAD: Consultancy; Pfizer: Consultancy. Tran-Duy: Pharmerit International: Consultancy. Ray: Novartis Pharmaceutical Corporation/Rutgers University: Other: I am currently a fellow with Rutgers University, conducting my "field" experience at Novartis. Mendelson: Novartis Pharmaceutical Corporation: Employment, Equity Ownership. Buchbinder: Novartis Pharmaceutical Corporation: Employment, Equity Ownership. Edrich: Novartis Pharma AG: Employment. Snedecor: Pharmerit International: Employment, Other: Institution received payment to conduct this study.
Saglio: Pfizer: Consultancy, Honoraria; ARIAD: Consultancy, Honoraria; Bristol-Myers Squibb: Consultancy, Honoraria; Novartis Pharmaceutical Corporation: Consultancy, Honoraria.
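The eligibility rule applied at each simulated assessment can be made precise as a check over the last four BCR-ABL1IS category draws. A sketch under our own category encoding (0 = MR4.5, 1 = MR4 but not MR4.5, 2-4 = the three higher transcript-level categories); the rule itself is the one stated in the abstract.

```python
def tfr_eligible(history):
    """TFR eligibility per the rule in the abstract: last assessment in
    MR4.5, none of the prior 3 assessments worse than MR4, and at most
    2 of the prior 3 between MR4 and MR4.5.  `history` is a list of
    category codes, oldest first (0 = MR4.5, 1 = MR4, 2-4 = higher)."""
    if len(history) < 4:
        return False                       # need 4 assessments
    last, prior3 = history[-1], history[-4:-1]
    if last != 0:
        return False                       # last must be MR4.5
    if any(c > 1 for c in prior3):
        return False                       # a prior worse than MR4
    if sum(1 for c in prior3 if c == 1) > 2:
        return False                       # too many between MR4/MR4.5
    return True
```

In the simulation, this check would be run on each patient's drawn category sequence at every 3-month interval; the first interval where it passes is that patient's time to TFR eligibility.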




