Cost-efficacy of Knee Cartilage Defect Treatments in the United States

2019 ◽  
Vol 48 (1) ◽  
pp. 242-251 ◽  
Author(s):  
Joshua S. Everhart ◽  
Andrew B. Campbell ◽  
Moneer M. Abouljoud ◽  
J. Caid Kirven ◽  
David C. Flanigan

Background: Multiple knee cartilage defect treatments are available in the United States, although the cost-efficacy of these therapies in various clinical scenarios is not well understood. Purpose/Hypothesis: The purpose was to determine the cost-efficacy of cartilage therapies in the United States with available mid- or long-term outcomes data. The authors hypothesized that cartilage treatment strategies currently approved for commercial use in the United States will be cost-effective, as defined by a cost <$50,000 per quality-adjusted life-year over 10 years. Study Design: Systematic review. Methods: A systematic search was performed for prospective cartilage treatment outcome studies of therapies commercially available in the United States with minimum 5-year follow-up and report of pre- and posttreatment International Knee Documentation Committee subjective scores. Cost-efficacy over 10 years was determined with Markov modeling and consideration of early reoperation or revision surgery for treatment failure. Results: Twenty-two studies were included, with available outcomes data on microfracture, osteochondral autograft, osteochondral allograft (OCA), autologous chondrocyte implantation (ACI), and matrix-induced ACI. Mean improvement in International Knee Documentation Committee subjective scores at final follow-up ranged from 17.7 for microfracture of defects >3 cm2 to 36.0 for OCA of bipolar lesions. Failure rates ranged from <5% for osteochondral autograft for defects requiring 1 or 2 plugs to 46% for OCA of bipolar defects. All treatments were cost-effective over 10 years in the baseline model, and they remained cost-effective if costs were increased 50% or if failure rates were increased an additional 15%.
However, if efficacy was decreased by a minimum clinically important amount, then ACI (periosteal cover) of femoral condylar lesions ($51,379 per quality-adjusted life-year), OCA of bipolar lesions ($66,255) or the patella ($66,975), and microfracture of defects >3 cm2 ($127,782) became cost-ineffective over 10 years. Conclusion: Currently employed treatments for knee cartilage defects in the United States are cost-effective in most clinically acceptable applications. Microfracture is not a cost-effective initial treatment of defects >3 cm2. OCA transplantation of the patella or bipolar lesions is potentially cost-ineffective and should be used judiciously.
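The review's cost-efficacy criterion reduces to a simple ratio test. The sketch below illustrates that criterion with hypothetical numbers; the treatment cost and QALY gain are made up for illustration and are not taken from the included studies.

```python
# Illustrative sketch of the cost-efficacy criterion used in the review:
# a therapy is cost-effective if it costs less than $50,000 per
# quality-adjusted life-year (QALY) gained over 10 years.

THRESHOLD = 50_000  # $/QALY over 10 years, per the stated definition

def cost_per_qaly(total_cost: float, qalys_gained: float) -> float:
    """Cost-efficacy ratio: dollars spent per QALY gained."""
    return total_cost / qalys_gained

# Hypothetical therapy: $30,000 total 10-year cost, 0.8 QALYs gained.
ratio = cost_per_qaly(30_000, 0.8)
print(ratio, ratio < THRESHOLD)  # 37500.0 True -> cost-effective
```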

2019 ◽  
Vol 70 (7) ◽  
pp. 1353-1363 ◽  
Author(s):  
Emily P Hyle ◽  
Justine A Scott ◽  
Paul E Sax ◽  
Lucia R I Millham ◽  
Caitlin M Dugdale ◽  
...  

Abstract Background US guidelines recommend genotype testing at human immunodeficiency virus (HIV) diagnosis (“baseline genotype”) to detect transmitted drug resistance (TDR) to nonnucleoside reverse transcriptase inhibitors (NNRTIs), nucleoside reverse transcriptase inhibitors (NRTIs), and protease inhibitors. With integrase strand transfer inhibitor (INSTI)-based regimens now recommended as first-line antiretroviral therapy (ART), the role of the baseline genotype is uncertain. Methods We used the Cost-effectiveness of Preventing AIDS Complications model to examine the clinical impact and cost-effectiveness of baseline genotype compared to no baseline genotype for people starting ART with dolutegravir (DTG) and an NRTI pair. For people with no TDR (83.8%), baseline genotype does not alter regimen selection. Among people with transmitted NRTI resistance (5.8%), baseline genotype guides NRTI selection and informs subsequent ART after adverse events (DTG AEs, 14%). Among people with transmitted NNRTI resistance (7.2%), baseline genotype influences care only for people with DTG AEs switching to NNRTI-based regimens. The 48-week virologic suppression varied (40%–92%) depending on TDR. Costs included $320/genotype and $2500–$3000/month for ART. Results Compared to no baseline genotype, baseline genotype resulted in <1 additional undiscounted quality-adjusted life-day (QALD), cost an additional $500/person, and was not cost-effective (incremental cost-effectiveness ratio: $420 000/quality-adjusted life-year). In univariate sensitivity analysis, clinical benefits of baseline genotype never exceeded 5 QALDs for all newly diagnosed people with HIV. Baseline genotype was cost-effective at current TDR prevalence only under unlikely conditions, e.g., DTG-based regimens achieving ≤50% suppression of transmitted NRTI resistance. Conclusions With INSTI-based first-line regimens in the United States, baseline genotype offers minimal clinical benefit and is not cost-effective.
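The headline ratio follows from simple incremental cost-effectiveness (ICER) arithmetic. In the sketch below, the 0.435 quality-adjusted life-day increment is a back-calculated assumption (the abstract reports only "<1 QALD" and the rounded ratio), used to show how the pieces fit together.

```python
# ICER arithmetic behind the abstract's $420,000/QALY figure.
# delta_qald below is an assumed value consistent with "<1 QALD".

DAYS_PER_YEAR = 365.25

def icer(delta_cost: float, delta_qaly: float) -> float:
    """Incremental cost-effectiveness ratio, $ per QALY gained."""
    return delta_cost / delta_qaly

delta_cost = 500                   # $/person, baseline genotype vs none
delta_qald = 0.435                 # assumed QALDs gained (<1, per abstract)
delta_qaly = delta_qald / DAYS_PER_YEAR

print(round(icer(delta_cost, delta_qaly), -4))  # 420000.0, as reported
```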


2022 ◽  
pp. 1-2
Author(s):  
Markus Stücker

<b>Importance:</b> One-year outcomes from the Early Venous Reflux Ablation (EVRA) randomized trial showed accelerated venous leg ulcer healing and greater ulcer-free time for participants who are treated with early endovenous ablation of lower extremity superficial reflux. <b>Objective:</b> To evaluate the clinical and cost-effectiveness of early endovenous ablation of superficial venous reflux in patients with venous leg ulceration. <b>Design, Setting, and Participants:</b> Between October 24, 2013, and September 27, 2016, the EVRA randomized clinical trial enrolled 450 participants (450 legs) with venous leg ulceration of less than 6 months’ duration and superficial venous reflux. Initially, 6555 patients were assessed for eligibility, and 6105 were excluded for reasons including ulcer duration greater than 6 months, healed ulcer by the time of randomization, deep venous occlusive disease, and insufficient superficial venous reflux to warrant ablation therapy, among others. A total of 426 of 450 participants (94.7%) from the vascular surgery departments of 20 hospitals in the United Kingdom were included in the analysis for ulcer recurrence. Surgeons, participants, and follow-up assessors were not blinded to the treatment group. Data were analyzed from August 11 to November 4, 2019. <b>Interventions:</b> Patients were randomly assigned to receive compression therapy with early endovenous ablation within 2 weeks of randomization (early intervention, n  =  224) or compression with deferred endovenous treatment of superficial venous reflux (deferred intervention, n  =  226). Endovenous modality and strategy were left to the preference of the treating clinical team. <b>Main Outcomes and Measures:</b> The primary outcome for the extended phase was time to first ulcer recurrence. Secondary outcomes included ulcer recurrence rate and cost-effectiveness. 
<b>Results:</b> The early-intervention group consisted of 224 participants (mean [SD] age, 67.0 [15.5] years; 127 men [56.7%]; 206 White participants [92%]). The deferred-intervention group consisted of 226 participants (mean [SD] age, 68.9 [14.0] years; 120 men [53.1%]; 208 White participants [92%]). Of the 426 participants whose leg ulcer had healed, 121 (28.4%) experienced at least 1 recurrence during follow-up. There was no clear difference in time to first ulcer recurrence between the 2 groups (hazard ratio, 0.82; 95% CI, 0.57–1.17; P  =  .28). Ulcers recurred at a lower rate of 0.11 per person-year in the early-intervention group compared with 0.16 per person-year in the deferred-intervention group (incidence rate ratio, 0.658; 95% CI, 0.480–0.898; P  =  .003). Time to ulcer healing was shorter in the early-intervention group for primary ulcers (hazard ratio, 1.36; 95% CI, 1.12–1.64; P  =  .002). At 3 years, early intervention was 91.6% likely to be cost-effective at a willingness to pay of £20 000 ($26 283) per quality-adjusted life year and 90.8% likely at a threshold of £35 000 ($45 995) per quality-adjusted life year. <b>Conclusions and Relevance:</b> Early endovenous ablation of superficial venous reflux was highly likely to be cost-effective over a 3-year horizon compared with deferred intervention. Early intervention accelerated the healing of venous leg ulcers and reduced the overall incidence of ulcer recurrence. <b>Trial Registration:</b> ISRCTN registry identifier: ISRCTN02335796.
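The two willingness-to-pay thresholds quoted in the results are internally consistent: the exchange rate implied by the first pound/dollar pair reproduces the dollar value of the second, as this small check shows.

```python
# Consistency check on the two willingness-to-pay thresholds quoted
# in the abstract (pounds converted to dollars at one implied rate).

implied_rate = 26_283 / 20_000       # ~ $1.314 per GBP 1
print(round(35_000 * implied_rate))  # 45995, matching the quoted $45,995
```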


Circulation ◽  
2020 ◽  
Vol 141 (15) ◽  
pp. 1214-1224 ◽  
Author(s):  
Dhruv S. Kazi ◽  
Brandon K. Bellows ◽  
Suzanne J. Baron ◽  
Changyu Shen ◽  
David J. Cohen ◽  
...  

Background: In patients with transthyretin amyloid cardiomyopathy, tafamidis reduces all-cause mortality and cardiovascular hospitalizations and slows decline in quality of life compared with placebo. In May 2019, tafamidis received expedited approval from the US Food and Drug Administration as a breakthrough drug for a rare disease. However, at $225 000 per year, it is the most expensive cardiovascular drug ever launched in the United States, and its long-term cost-effectiveness and budget impact are uncertain. We therefore aimed to estimate the cost-effectiveness of tafamidis and its potential effect on US health care spending. Methods: We developed a Markov model of patients with wild-type or variant transthyretin amyloid cardiomyopathy and heart failure (mean age, 74.5 years) using inputs from the ATTR-ACT trial (Transthyretin Amyloidosis Cardiomyopathy Clinical Trial), published literature, US Food and Drug Administration review documents, healthcare claims, and national survey data. We compared no disease–specific treatment (“usual care”) with tafamidis therapy. The model reproduced 30-month survival, quality of life, and cardiovascular hospitalization rates observed in ATTR-ACT; future projections used a parametric survival model in the control arm, with constant hazards reduction in the tafamidis arm. We discounted future costs and quality-adjusted life-years by 3% annually and examined key parameter uncertainty using deterministic and probabilistic sensitivity analyses. The main outcomes were lifetime incremental cost-effectiveness ratio and annual budget impact, assessed from the US healthcare sector perspective. This study was independent of the ATTR-ACT trial sponsor. 
Results: Compared with usual care, tafamidis was projected to add 1.29 (95% uncertainty interval, 0.47–1.75) quality-adjusted life-years at an incremental cost of $1 135 000 (872 000–1 377 000), resulting in an incremental cost-effectiveness ratio of $880 000 (697 000–1 564 000) per quality-adjusted life-year gained. Assuming a threshold of $100 000 per quality-adjusted life-year gained and current drug price, tafamidis was cost-effective in 0% of 10 000 probabilistic simulations. A 92.6% price reduction from $225 000 to $16 563 would be necessary to make tafamidis cost-effective at $100 000/quality-adjusted life-year. Results were sensitive to assumptions related to long-term effectiveness of tafamidis. Treating all eligible patients with transthyretin amyloid cardiomyopathy in the United States with tafamidis (n=120 000) was estimated to increase annual healthcare spending by $32.3 billion. Conclusions: Treatment with tafamidis is projected to produce substantial clinical benefit but would greatly exceed conventional cost-effectiveness thresholds at the current US list price. On the basis of recent US experience with high-cost cardiovascular medications, access to and uptake of this effective therapy may be limited unless there is a large reduction in drug costs.
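Two of the abstract's headline numbers can be reproduced directly from the figures it reports: the base-case ICER from the incremental cost and QALYs, and the price reduction required to reach the cost-effective price.

```python
# Reproducing two arithmetic claims from the tafamidis abstract.

delta_cost = 1_135_000   # $ incremental lifetime cost vs usual care
delta_qaly = 1.29        # incremental QALYs gained

icer = delta_cost / delta_qaly
print(round(icer, -4))   # 880000.0 per QALY, as reported

list_price, target_price = 225_000, 16_563   # $ per year
reduction = 1 - target_price / list_price
print(f"{reduction:.1%}")  # 92.6%, as reported
```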


2012 ◽  
Vol 75 (7) ◽  
pp. 1292-1302 ◽  
Author(s):  
SANDRA HOFFMANN ◽  
MICHAEL B. BATZ ◽  
J. GLENN MORRIS

In this article we estimate the annual cost of illness and quality-adjusted life year (QALY) loss in the United States caused by 14 of the 31 major foodborne pathogens reported on by Scallan et al. (Emerg. Infect. Dis. 17:7–15, 2011), based on their incidence estimates of foodborne illness in the United States. These 14 pathogens account for 95% of illnesses and hospitalizations and 98% of deaths due to identifiable pathogens estimated by Scallan et al. We estimate that these 14 pathogens cause $14.0 billion (ranging from $4.4 billion to $33.0 billion) in cost of illness and a loss of 61,000 QALYs (ranging from 19,000 to 145,000 QALYs) per year. Roughly 90% of this loss is caused by five pathogens: nontyphoidal Salmonella enterica ($3.3 billion; 17,000 QALYs), Campylobacter spp. ($1.7 billion; 13,300 QALYs), Listeria monocytogenes ($2.6 billion; 9,400 QALYs), Toxoplasma gondii ($3 billion; 11,000 QALYs), and norovirus ($2 billion; 5,000 QALYs). A companion article attributes losses estimated in this study to the consumption of specific categories of foods. To arrive at these estimates, for each pathogen we create disease outcome trees that characterize the symptoms, severities, durations, outcomes, and likelihoods of health states associated with that pathogen. We then estimate the cost of illness (medical costs, productivity loss, and valuation of premature mortality) for each pathogen. We also estimate QALY loss for each health state associated with a given pathogen, using the EuroQol 5D scale. Construction of disease outcome trees, outcome-specific cost of illness, and EuroQol 5D scoring are described in greater detail in a second companion article.
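The "roughly 90%" claim can be checked against the per-pathogen figures given in the abstract: summing the five named pathogens and dividing by the reported totals ($14.0 billion; 61,000 QALYs per year) recovers the stated share.

```python
# Share of total burden attributable to the five named pathogens,
# using the per-pathogen figures reported in the abstract.

top5 = {  # pathogen: (annual cost in $, annual QALYs lost)
    "Salmonella enterica": (3.3e9, 17_000),
    "Campylobacter spp.": (1.7e9, 13_300),
    "Listeria monocytogenes": (2.6e9, 9_400),
    "Toxoplasma gondii": (3.0e9, 11_000),
    "Norovirus": (2.0e9, 5_000),
}
cost_share = sum(c for c, _ in top5.values()) / 14.0e9
qaly_share = sum(q for _, q in top5.values()) / 61_000
print(f"{cost_share:.0%} of cost, {qaly_share:.0%} of QALY loss")
```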


Blood ◽  
2016 ◽  
Vol 128 (22) ◽  
pp. 2366-2366
Author(s):  
Gabriel Tremblay ◽  
Anna Forsythe ◽  
Vasudha Bal ◽  
Snigdha Santra ◽  
Andrew Briggs

Abstract Background In the Phase III COMPLEMENT 2 study, ofatumumab (OFA) plus fludarabine (F) and cyclophosphamide (C) demonstrated significantly improved median progression-free survival (PFS), by 54% compared to FC treatment alone (HR=0.67, p=0.0032), in patients with relapsed chronic lymphocytic leukemia (rCLL). However, the relative value of OFA in rCLL has not been formally assessed. The objective of this study was to estimate the incremental cost per (quality-adjusted) life-year of utilizing OFA+FC vs. FC for rCLL in the US. Methods A partition survival model was developed to estimate the expected outcomes and costs of treatment with OFA+FC vs. FC for rCLL over a lifetime horizon. The model includes 4 health states: PFS on treatment, PFS off treatment, post-progression, and death. Time in PFS beyond the protocol-defined treatment duration of 6 months was considered a treatment-free period in the model. Data on PFS, OS, and frequencies of adverse events (AEs) were obtained from the Phase III clinical trial for OFA (COMPLEMENT 2). For the extrapolation of OS and PFS, a piecewise approach was used, where efficacy was based on patient-level data (Kaplan-Meier survivor function) until the trial cut-off and a tail extrapolation thereafter (gamma distribution). Health state utilities and disutilities for AEs were obtained from previously published vignette studies. Costs incorporated in the model included drug and administration costs for primary and follow-up therapies, adverse event treatments, medical costs for hospitalizations and physician visits, and end-of-life costs. The costs were derived from databases (AnalySource Online, AHRQ, CMS). Results Treatment with OFA+FC led to an increase of 0.803 life-years and 0.543 quality-adjusted life-years (QALYs) relative to FC. The total cost of OFA+FC was higher by $6,693 per patient relative to FC.
Although the addition of OFA to FC led to higher drug and adverse event costs, these were partially offset by lower follow-up costs compared to FC. The ICER per LY and per QALY gained with OFA+FC vs. FC was $8,333 and $12,322, respectively. Based on probabilistic sensitivity analyses, there was an 85% probability that OFA+FC was cost-effective compared to FC at a societal willingness-to-pay threshold of $100,000 per QALY saved. Conclusions Our analysis suggests treatment with OFA+FC compared to FC is highly cost-effective based on a Phase 3 within-trial analysis. These results are driven by the improved PFS and OS of OFA+FC vs. FC, as well as the treatment-free period, during which patients experienced PFS without the burden of treatment AEs or costs. Future direct comparisons of OFA+FC versus other treatment options will further clarify the cost-effectiveness of OFA+FC to inform coverage and reimbursement policy decisions. Disclosures Tremblay: Novartis Pharmaceuticals Corporation: Consultancy. Forsythe: Novartis Pharmaceuticals Corporation: Consultancy. Bal: Novartis Pharmaceuticals: Employment. Santra: Novartis Pharmaceuticals Corporation: Employment. Briggs: Novartis Pharmaceuticals Corporation: Consultancy.
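The reported ICERs follow from the incremental figures given in the results. The sketch below recomputes them; small differences from the published $8,333 and $12,322 are expected, since the inputs here are themselves rounded.

```python
# Recomputing the reported ICERs from the abstract's incremental figures.

delta_cost = 6_693   # $ per patient, OFA+FC vs FC
delta_ly = 0.803     # life-years gained
delta_qaly = 0.543   # quality-adjusted life-years gained

print(round(delta_cost / delta_ly))    # close to the published $8,333
print(round(delta_cost / delta_qaly))  # close to the published $12,322
```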


2019 ◽  
Vol 71 (1) ◽  
pp. 53-62 ◽  
Author(s):  
Gloria H Hong ◽  
Ana M Ortega-Villa ◽  
Sally Hunsberger ◽  
Ploenchan Chetchotisakd ◽  
Siriluck Anunnatsiri ◽  
...  

Abstract Background The natural history of anti-interferon-γ (IFN-γ) autoantibody-associated immunodeficiency syndrome is not well understood. Methods Data on 74 patients with anti-IFN-γ autoantibodies at Srinagarind Hospital, Thailand, were collected annually (median follow-up duration, 7.5 years). Annual data for 19 patients and initial data for 4 patients with anti-IFN-γ autoantibodies at the US National Institutes of Health were collected (median follow-up duration, 4.5 years). Anti-IFN-γ autoantibody levels were measured in plasma samples. Results Ninety-one percent of US patients were of Southeast Asian descent; female predominance was stronger among US (91%) than Thai (64%) patients. Mycobacterium abscessus (34%) and Mycobacterium avium complex (83%) were the most common nontuberculous mycobacteria in Thailand and the United States, respectively. Skin infections were more common in Thailand (P = .001), whereas bone (P < .0001), lung (P = .002), and central nervous system (P = .03) infections were more common in the United States. Twenty-four percent of Thai patients died, most from infections. None of the 19 US patients with follow-up data died. Anti-IFN-γ autoantibody levels decreased over time in Thailand (P < .001) and the United States (P = .017), with either cyclophosphamide (P = .01) or rituximab therapy (P = .001). Conclusions Patients with anti-IFN-γ autoantibodies in Thailand and the United States had distinct demographic and clinical features. While titers generally decreased with time, anti-IFN-γ autoantibody disease had a chronic clinical course with persistent infections and death. Close long-term surveillance for new infections is recommended.


2020 ◽  
Vol 79 (Suppl 1) ◽  
pp. 1027.2-1027
Author(s):  
A. R. Broder ◽  
W. Mowrey ◽  
A. Valle ◽  
B. Goilav ◽  
K. Yoshida ◽  
...  

Background: The development of ESRD due to lupus nephritis is one of the most common and serious complications of SLE. Mortality among SLE ESRD patients is 4-fold higher compared to lupus nephritis patients with preserved renal function.1 Mortality in SLE ESRD is also twice as high compared with non-SLE ESRD, even though SLE patients develop ESRD at a significantly younger age. In the absence of ESRD-specific guidelines, medication utilization in SLE ESRD is unknown. Objectives: The objective of this study was to investigate the real-world, US-wide patterns of medication prescribing among lupus nephritis patients with new-onset ESRD enrolled in the United States Renal Data System (USRDS) registry. We specifically focused on HCQ and corticosteroids (CS) as the medications most commonly used to treat SLE. Methods: Inclusion: USRDS patients 18 years and above with SLE as the primary cause of ESRD (International Classification of Diseases, 9th Revision (ICD-9) diagnostic code 710.0, previously validated2) who developed ESRD between January 1, 2006, and July 31, 2011 (to ensure at least 6 months of follow-up in the USRDS). Patients had to be enrolled in Medicare Part D (to capture pharmacy claims). The last follow-up date was defined as either the last date of continuous Part D coverage or the end of the study period, December 31, 2013. Results: Of the 2579 patients included, 1708 (66%) were HCQ- at baseline, and 871 (34%) were HCQ+ at baseline. HCQ+ patients at baseline had a slightly shorter duration of follow-up compared to HCQ- patients, median (IQR) of 2.32 (1.33, 3.97) years vs 2.55 (1.44, 4.25) years, p = 0.02. During the follow-up period, only 778 (30%) continued HCQ either intermittently or continuously to the last follow-up date, 1306 (51%) were never prescribed HCQ after baseline, and 495 (19%) discontinued HCQ before the last follow-up date.
Of the 1801 patients who were either never prescribed HCQ or discontinued it early after ESRD onset, 713 (40%) were prescribed CS to the end of the follow-up period: 55% were receiving a low dose (<10 mg daily), and 43% were receiving a moderate dose (10–20 mg daily). Conclusion: HCQ may be underprescribed and CS may be overprescribed in SLE ESRD. Changing the current prescribing practices may improve outcomes in SLE ESRD. References: [1] Yap DY et al., NDT 2012. [2] Broder A et al., AC&R 2016. Acknowledgments: The data reported here have been supplied by the United States Renal Data System (USRDS). The interpretation and reporting of these data are the responsibility of the author(s) and in no way should be seen as an official policy or interpretation of the U.S. government. Funding: NIH/NIAMS K23 AR068441 (A Broder), NIH/NIAMS R01 AR 057327 and K24 AR 066109 (KH Costenbader). Disclosure of Interests: Anna R. Broder: None declared, Wenzhu Mowrey: None declared, Anna Valle: None declared, Beatrice Goilav: None declared, Kazuki Yoshida: None declared, Karen Costenbader Grant/research support from: Merck, Consultant of: Astra-Zeneca


CHEST Journal ◽  
2021 ◽  
Author(s):  
Kim M. Kerr ◽  
C. Greg Elliott ◽  
Kelly Chin ◽  
Raymond L. Benza ◽  
Richard N. Channick ◽  
...  
