Abstract 261: Applying Clinical Trial Data to Real-World: Apixaban, Dabigatran, and Rivaroxaban

Author(s):  
Alpesh Amin ◽  
Michael Stokes ◽  
Ning Wu ◽  
Elyse Gatt ◽  
Dinara Makenbaeva ◽  
...  

BACKGROUND: Data from randomized controlled trials and a real-world sample of non-valvular atrial fibrillation patients were combined to estimate the absolute effect of each new oral anticoagulant (NOAC; apixaban, dabigatran, and rivaroxaban) versus warfarin on stroke and major bleeding rates in real-world clinical practice. METHODS: Non-valvular atrial fibrillation patients were selected from Medco health plans during 2007-2010. Reference rates for stroke and for major bleeding excluding intracranial hemorrhage (to avoid double counting) were calculated for real-world Medco patients during warfarin use. Real-world event rates for the NOACs were estimated by multiplying the corresponding relative risk from the randomized clinical trials by each reference rate. Absolute risk reductions and numbers needed to treat (NNT) or numbers needed to harm (NNH) for each NOAC vs. warfarin were then estimated. The reduction in net clinical outcome was calculated by summing the absolute risk reductions for stroke and major bleeding excluding intracranial hemorrhage for each NOAC versus warfarin. RESULTS: Each NOAC reduced stroke events compared with warfarin in the real world (TABLE). Apixaban was the only NOAC to reduce the rate of major bleeding excluding intracranial hemorrhage compared with warfarin. The NNT to avoid one net clinical outcome (stroke plus major bleeding excluding intracranial hemorrhage) per year was 32 for apixaban and 84 for dabigatran. Rivaroxaban increased the net clinical outcome (NNH=166). CONCLUSIONS: If the relative risk reductions from randomized clinical trials persist in the real world, apixaban would provide the greatest clinical benefit versus warfarin of all the NOACs in terms of stroke and major bleeding excluding intracranial hemorrhage events avoided.
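The estimation method in this abstract — projecting each trial relative risk onto a real-world warfarin reference rate, then inverting the summed absolute risk reductions — can be sketched in a few lines of Python. The rates and relative risks below are illustrative placeholders, not the study's values:

```python
def real_world_rate(trial_rr, warfarin_ref_rate):
    """Project a trial relative risk onto a real-world warfarin reference rate."""
    return trial_rr * warfarin_ref_rate

def nnt_or_nnh(net_arr):
    """NNT if the net absolute risk reduction is positive, else NNH."""
    label = "NNT" if net_arr > 0 else "NNH"
    return label, round(1 / abs(net_arr))

# Hypothetical inputs (events per patient-year) -- NOT the study's values.
warfarin_stroke, warfarin_bleed = 0.020, 0.040   # real-world warfarin reference rates
rr_stroke, rr_bleed = 0.79, 0.69                 # illustrative trial relative risks

arr_stroke = warfarin_stroke - real_world_rate(rr_stroke, warfarin_stroke)
arr_bleed = warfarin_bleed - real_world_rate(rr_bleed, warfarin_bleed)
print(nnt_or_nnh(arr_stroke + arr_bleed))
```

A NOAC whose relative risk for bleeding exceeds 1 would make the net absolute risk reduction negative, flipping the label to NNH as reported for rivaroxaban.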

Blood ◽  
2015 ◽  
Vol 126 (23) ◽  
pp. 2335-2335
Author(s):  
Shammim Haji ◽  
Jignesh P Patel ◽  
Vivian Auyeung ◽  
Lara N Roberts ◽  
Julia Czuprynska ◽  
...  

Abstract Do the safety and efficacy outcomes reported in the clinical trials of direct oral anticoagulants (DOACs) translate to the 'real world'? Background: A number of DOACs are now available for clinicians to prescribe in clinical practice. Whilst the results from large clinical trials demonstrate that these agents are as effective as vitamin K antagonists, there is some concern that the patients studied in the trials were not representative of the patients clinicians encounter in everyday practice. The aim of our study was to compare a real-world clinic population commenced on a DOAC with the populations of the clinical trials for these agents, in order to assess potential differences in safety and efficacy. Patients and methods: A retrospective observational cohort study was undertaken. Patients initiated on a DOAC (apixaban, dabigatran or rivaroxaban) at a large teaching hospital in South East London between 1st August 2012 and 31st July 2014 were identified through pharmacy issue data, and those followed up for a minimum of 6 months were included. Baseline demographic data, rates of stroke/VTE, and rates of major and non-major clinically relevant (NMCR) bleeding (ISTH definition) were assessed and compared with pooled data reported from the corresponding Phase III trials. Differences between groups were compared using t-tests or chi-squared tests. Results: During the review period, 748 patients were initiated on a DOAC: 365 for atrial fibrillation (AF) and 383 for venous thromboembolism (VTE). In terms of demographic differences, the real-world AF population comprised more females and was significantly older, with poorer renal function and a lower body weight. In contrast, the real-world VTE population typically had a higher body weight and poorer renal function compared with the trial population (Table 1). Efficacy of DOACs was found to be similar across both the VTE and AF populations.
With respect to safety, the real-world AF population experienced similar rates of major bleeding and a significantly lower rate of NMCR bleeding compared with the trial populations. In contrast, the real-world VTE population experienced a significantly higher rate of major bleeding, particularly gastrointestinal bleeding. Although the rate of NMCR bleeding was similar, there was a significantly higher rate of urogenital bleeding in the real-world VTE population, specifically heavy menstrual bleeding in women. Conclusions: The efficacy outcomes of DOAC use in a real-world AF and VTE population are consistent with the Phase III trials, despite some significant differences in baseline characteristics. However, a significantly increased rate of major bleeding was observed in the real-world VTE population, which requires further investigation.

Table 1. Baseline demographic characteristics, efficacy and safety outcomes in the real-world population versus the trial population

|                                      | AF trial+ (N=28,342) | AF real-world (N=365) | VTE trial++ (N=8,716) | VTE real-world (N=383) |
| ------------------------------------ | -------------------- | --------------------- | --------------------- | ---------------------- |
| Baseline demographics, mean (SD) unless otherwise specified | | | | |
| Age, years                           | 72 (9.6)             | 76.8* (12.1)          | 56.9 (14.2)           | 55.6 (18.7)            |
| Female, n (%)                        | 10,451 (36.9)        | 215* (58.9)           | 3,753 (43.1)          | 184 (48.0)             |
| Weight, kg                           | 82.7 (19.5)          | 77.3* (22.6)          | 84.9 (19.6)           | 88.2* (23.0)           |
| Creatinine clearance, mL/min         | 69 (26.7)            | 58.1* (26.9)          | 105.8 (40.7)          | 91.1* (37.6)           |
| Concomitant aspirin therapy, n (%)   | 10,341 (36.5)        | 49* (13.4)            | -                     | 0 (0)                  |
| Previous VKA use, n (%)              | 15,711 (55.4)        | 193 (52.9)            | -                     | 85 (22.2)              |
| Efficacy, n (%)                      |                      |                       |                       |                        |
| All-cause mortality                  | 1,695 (6.0)          | 37* (9.1)             | 160 (1.8)             | 10 (2.5)               |
| Stroke                               | 676 (2.4)            | 8 (2.0)               | -                     | 1 (0.3)                |
| VTE                                  | 39 (0.1)             | 1 (0.2)               | 192 (2.2)             | 7 (1.8)                |
| Safety, n (%)                        |                      |                       |                       |                        |
| Major bleeding                       | 1,419 (5.0)          | 17 (4.2)              | 79 (0.9)              | 15* (3.8)              |
| Intracranial                         | 170 (0.6)            | 1 (0.2)               | 6 (0.1)               | 2* (0.5)               |
| Gastrointestinal                     | 644 (2.3)            | 8 (2.0)               | 8 (0.1)               | 8* (2.0)               |
| NMCR bleeding                        | 4,824 (17.0)         | 30* (7.4)             | 540 (6.2)             | 26 (6.6)               |
| Gastrointestinal (NMCR)              | -                    | 9 (2.2)               | 53 (4.2)              | 10 (2.5)               |
| Urogenital (NMCR)                    | 296 (4.2)            | 16 (3.9)              | 100 (2.5)             | 38* (9.6)              |

+ Pooled data from the ARISTOTLE, RE-LY and ROCKET-AF trials. ++ Pooled data from the AMPLIFY, RE-COVER and EINSTEIN-PE/DVT trials. * p<0.05.

Disclosures Patel: Bayer plc: Research Funding. Auyeung: Bayer PLC: Research Funding. Arya: Bayer plc: Research Funding.
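The between-group comparisons in Table 1 use chi-squared tests for proportions. A minimal Pearson chi-squared sketch for one such 2x2 comparison — major bleeding in the VTE cohorts, 79/8,716 trial patients vs 15/383 real-world patients — assuming independent samples and a large-sample approximation:

```python
def chi2_2x2(events_a, total_a, events_b, total_b):
    """Pearson chi-squared statistic for a 2x2 table of events vs non-events."""
    table = [[events_a, total_a - events_a],
             [events_b, total_b - events_b]]
    grand = total_a + total_b
    col_events = events_a + events_b
    col_none = grand - col_events
    stat = 0.0
    for row, row_total in zip(table, (total_a, total_b)):
        for observed, col_total in zip(row, (col_events, col_none)):
            expected = row_total * col_total / grand
            stat += (observed - expected) ** 2 / expected
    return stat

stat = chi2_2x2(79, 8716, 15, 383)
# 3.84 is the 0.05 critical value for chi-squared with 1 degree of freedom.
print(stat > 3.84)
```

The statistic well exceeds the 1-df critical value, consistent with the asterisked p<0.05 on that row of the table.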


2015 ◽  
Vol 33 (3_suppl) ◽  
pp. 673-673
Author(s):  
Ziwei Wang ◽  
Lindsay Hwang ◽  
James Don Murphy

673 Background: Randomized clinical trials play a central role in clinical research, though only a small fraction of patients take part in clinical studies. Questions thus arise regarding the generalizability of clinical trial results to the remainder of the population. This study evaluated whether patient survival from randomized clinical trials in metastatic colorectal cancer reflects real-world outcomes. Methods: A PubMed search was used to identify randomized phase III clinical trials of first-line treatment for metastatic colorectal cancer published between 2005 and 2010. We excluded secondary or pooled analyses, second-line treatments, non-metastatic patients, non-English-language publications, and non-randomized studies. Thirty-one clinical trials met these criteria, comprising 79 distinct clinical trial arms. Overall survival among clinical trial patients was compared to metastatic colorectal cancer patients within the Surveillance, Epidemiology, and End Results (SEER) program. Within SEER, we restricted the analysis time period and age of patients to match the enrollment period and age of patients within each individual clinical trial. Results: The clinical trials enrolled a total of 16,614 patients. Across all clinical trial arms the median survival ranged from 6.7 to 62 months, 1-year survival from 30% to 97%, and 2-year survival from 6% to 88%. Compared to SEER, the median survival was higher in 95% of the individual clinical trial arms by an average of 5.4 months (p<0.0001). The 1-year survival was higher in 94% of the clinical trial arms by an average of 16.7% (p<0.0001). The 2-year survival was higher in 71% of the clinical trial arms by an average of 7.2% (p<0.0001).
Conclusions: This study found substantially improved survival among clinical trial participants compared to patients in the SEER database, suggesting that survival estimates from clinical trials may not generalize to the “real world.” Potential patient factors such as differences in underlying comorbidity, performance status, and disease burden, as well as variation in treatment, could not be addressed in this study, though these factors likely explain some of the observed survival differences.
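The matching step in the methods — restricting the registry cohort to each trial's enrollment window and age range before comparing medians — can be sketched as follows. The registry rows, arm definition, and published arm median are invented for illustration:

```python
from statistics import median

# Illustrative registry records: (diagnosis_year, age, survival_months).
registry = [
    (2006, 61, 10.0), (2007, 58, 14.5), (2007, 72, 8.0),
    (2008, 66, 19.0), (2009, 70, 7.5), (2010, 55, 12.0),
]

def matched_median(arm_years, arm_ages, cohort):
    """Median survival among registry patients matching a trial arm's
    enrollment window (inclusive years) and age range (inclusive)."""
    lo_y, hi_y = arm_years
    lo_a, hi_a = arm_ages
    months = [m for (y, a, m) in cohort
              if lo_y <= y <= hi_y and lo_a <= a <= hi_a]
    return median(months)

trial_arm_median = 18.0  # hypothetical published arm median, months
seer_median = matched_median((2007, 2009), (55, 70), registry)
print(trial_arm_median - seer_median)  # survival advantage of the trial arm
```

Repeating this per arm and averaging the differences mirrors the study's arm-level comparison, although the real analysis works from full SEER survival data rather than a handful of rows.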


PLoS ONE ◽  
2021 ◽  
Vol 16 (10) ◽  
pp. e0258487
Author(s):  
Agoston Gyula Szabo ◽  
Tobias Wirenfeldt Klausen ◽  
Mette Bøegh Levring ◽  
Birgitte Preiss ◽  
Carsten Helleberg ◽  
...  

Most patients cannot be included in randomized clinical trials. We report real-world outcomes of all Danish patients with multiple myeloma (MM) treated with daratumumab-based regimens until 1 January 2019. Methods Information on 635 patients treated with daratumumab was collected retrospectively and included lines of therapy (LOT), hematologic responses according to the International Myeloma Working Group recommendations, time to next treatment (TNT) and the cause of discontinuation of treatment. Baseline characteristics were acquired from the validated Danish Multiple Myeloma Registry (DMMR). Results Daratumumab was administered as monotherapy (Da-mono) in 27.7%, in combination with immunomodulatory drugs (Da-IMiD) in 57.3%, in combination with proteasome inhibitors (Da-PI) in 11.2% and in other combinations (Da-other) in 3.8% of patients. The median number of lines of therapy given before daratumumab was 5 for Da-mono, 3 for Da-IMiD, 4 for Da-PI, and 2 for Da-other. In Da-mono, the overall response rate (ORR) was 44.9% and the median time to next treatment (mTNT) was 4.9 months. In Da-IMiD, ORR was 80.5% and mTNT was 16.1 months. In Da-PI, ORR was 60.6% and mTNT was 5.3 months. In patients treated with Da-other, ORR was 54.2% and mTNT was 5.6 months. The use of daratumumab in early LOT was associated with longer TNT (p<0.0001). Patients with amplification 1q had outcomes comparable to standard-risk patients, while patients with t(4;14), t(14;16) or del17p had worse outcomes (p = 0.0001). Multivariate analysis indicated that the timing of treatment (the position of daratumumab in the sequence of all LOT that patients received throughout the course of their disease) was the most important factor for outcome (p<0.0001). Conclusion The real-world outcomes of multiple myeloma patients treated with daratumumab are worse than the results of clinical trials. Outcomes achieved with daratumumab were best when it was used in combination with IMiDs and in early LOT.
Patients with high-risk cytogenetic abnormalities (CA) had worse outcomes, but patients with amp1q had outcomes similar to standard-risk patients.


PLoS ONE ◽  
2021 ◽  
Vol 16 (12) ◽  
pp. e0261684
Author(s):  
Eung Gu Lee ◽  
Tae-Hee Lee ◽  
Yujin Hong ◽  
Jiwon Ryoo ◽  
Jung Won Heo ◽  
...  

Background Idiopathic pulmonary fibrosis (IPF) is a chronic, progressive fibrosing interstitial pneumonia of unknown etiology. In several randomized clinical trials, and in clinical practice, pirfenidone has been used to treat IPF effectively and safely. However, the dose of pirfenidone used in clinical trials is sometimes difficult to achieve in practice. This study evaluated the effects of low-dose pirfenidone on IPF disease progression and patient survival in the real world. Methods This retrospective, observational study enrolled IPF patients seen at the time of diagnosis at a single center from 2008 to 2018. Longitudinal clinical and laboratory data were prospectively collected. We compared clinical characteristics, survival, and pulmonary function decline between patients treated with various doses of pirfenidone and untreated patients. Results Of 295 IPF patients, 100 (33.9%) received pirfenidone and 195 (66.1%) received no antifibrotic agent. Of the 100 patients who received pirfenidone, 24 (24%), 50 (50%), and 26 (26%) were given 600, 1200, and 1800 mg pirfenidone daily, respectively. The mean survival time was 57.03 ± 3.90 months in the no-antifibrotic-drug group and 73.26 ± 7.87 months in the pirfenidone-treated group (p = 0.027). In the unadjusted analysis, survival was significantly better in the patients given pirfenidone (hazard ratio [HR] = 0.69, 95% confidence interval [CI]: 0.48–0.99, p = 0.04). After adjusting for age, gender, body mass index, and the GAP score [based on gender (G), age (A), and two physiological lung parameters (P)], survival remained better in the patients given pirfenidone (HR = 0.56, 95% CI: 0.37–0.85, p = 0.006). In terms of pulmonary function, the decreases in forced vital capacity (%), forced expiratory volume in 1 s (%), and the diffusing capacity of the lung for carbon monoxide (%) were significantly smaller in the patients given pirfenidone (p < 0.001, p = 0.001, and p = 0.007, respectively).
Conclusions Low-dose pirfenidone provided beneficial effects on survival and pulmonary function decline in real-world practice.


Circulation ◽  
2015 ◽  
Vol 132 (suppl_3) ◽  
Author(s):  
João Carmo ◽  
Francisco M Costa ◽  
Jorge Ferreira ◽  
Miguel Mendes

Background: In the RE-LY clinical trial, dabigatran showed a better efficacy/safety profile than warfarin, but clinical trial populations are poorly representative of the real world. We aimed to assess whether dabigatran showed a better profile than warfarin in real-world patients with atrial fibrillation (AF), through a systematic review and meta-analysis of observational studies comparing it with vitamin K antagonists. Methods: The PubMed, Embase and Scopus databases were searched through December 2014. We included observational studies comparing dabigatran to warfarin for non-valvular AF that reported clinical events during follow-up for dabigatran 75 mg, 110 mg or 150 mg and for warfarin. We extracted and analysed data on thromboembolic events, bleeding and mortality. Data were pooled by meta-analysis using a random-effects model. Results: We selected 9 studies involving a total of 291,703 patients: 85,399 treated with dabigatran and the remaining 206,304 with warfarin. The incidence of stroke was 1.71/100 patient-years for dabigatran and 2.44/100 patient-years for warfarin (relative risk [RR] 0.91, 95% CI 0.66 to 1.27, p=0.58). The major bleeding rate was 3.90/100 patient-years for dabigatran and 3.92/100 patient-years for warfarin (RR 0.90; 0.78 to 1.03, p=0.11). All-cause mortality (RR 0.81, 0.75 to 0.88, p<0.001) and intracranial hemorrhage (RR 0.45, 0.27 to 0.76, p=0.002) were significantly lower in patients treated with dabigatran than in those treated with warfarin. There were no significant differences in the risk of myocardial infarction (RR 0.55; 0.29 to 1.07, p=0.08), total hemorrhage (RR 1.00; 0.57 to 1.77, p=0.99), or gastrointestinal bleeding (RR 1.14; 0.78 to 1.69, p=0.50).
Conclusions: In this pooled analysis of real-world observational studies, dabigatran was associated with a risk of stroke, myocardial infarction, major bleeding, total bleeding and gastrointestinal bleeding similar to that of warfarin, and with a lower risk of intracranial hemorrhage and mortality.
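The random-effects model named in the methods is conventionally implemented as DerSimonian-Laird pooling of log relative risks; a compact sketch follows. The three studies and their confidence intervals are invented for illustration, not the nine studies in this meta-analysis:

```python
import math

def pool_random_effects(studies):
    """DerSimonian-Laird pooled RR from per-study (rr, ci_low, ci_high) tuples."""
    logs = [math.log(rr) for rr, lo, hi in studies]
    # Back out the standard error of log(RR) from the 95% CI width.
    ses = [(math.log(hi) - math.log(lo)) / (2 * 1.96) for rr, lo, hi in studies]
    w = [1 / se**2 for se in ses]                        # fixed-effect weights
    fixed = sum(wi * li for wi, li in zip(w, logs)) / sum(w)
    q = sum(wi * (li - fixed) ** 2 for wi, li in zip(w, logs))  # heterogeneity
    df = len(studies) - 1
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c) if c > 0 else 0.0      # between-study variance
    w_star = [1 / (se**2 + tau2) for se in ses]          # random-effects weights
    pooled = sum(wi * li for wi, li in zip(w_star, logs)) / sum(w_star)
    return math.exp(pooled)

studies = [(0.80, 0.60, 1.07), (1.05, 0.85, 1.30), (0.90, 0.70, 1.16)]
print(round(pool_random_effects(studies), 2))
```

When the heterogeneity statistic q does not exceed its degrees of freedom, tau2 collapses to zero and the estimate reduces to fixed-effect inverse-variance pooling.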


Circulation ◽  
2015 ◽  
Vol 132 (suppl_3) ◽  
Author(s):  
Shital Kamble ◽  
Xianying Pan ◽  
Hemant Phatak ◽  
Hugh Kawabata ◽  
Cristina Masseria ◽  
...  

Aim: Limited data are available on the real-world safety of non-vitamin K antagonist oral anticoagulants (NOACs). The purpose of this study was to compare the risk of a first major bleeding event among non-valvular atrial fibrillation (NVAF) patients newly initiated on dose-adjusted warfarin versus apixaban 5 mg BID, dabigatran 150 mg BID, or rivaroxaban 20 mg QD. Methods: A retrospective cohort study was conducted using the MarketScan® commercial & Medicare supplemental database from 01/2012 to 12/2013. NVAF patients aged 18+ years with ≥1 year of baseline data who were newly prescribed an oral anticoagulant from 01/01/2013 to 12/31/2013 were included. Major bleeding was defined as bleeding requiring hospitalization while on the index drug, during the supply duration or within 30 days after the last supply day of the last prescription. A Cox proportional hazards model was used to estimate the hazard ratios (HR) of major bleeding adjusted for age, sex, baseline comorbidities and comedications. Results: Among 26,604 patients, 2,057 (7.73%) were newly initiated on apixaban 5 mg, 3,768 (14.16%) on dabigatran 150 mg, 8,066 (30.32%) on rivaroxaban 20 mg and 12,713 (47.79%) on warfarin. Patients initiating warfarin (72.5±11.9 yrs) and apixaban 5 mg (67.0±11.4 yrs) were older compared with those initiating rivaroxaban 20 mg (65.2±11.4 yrs) and dabigatran 150 mg (65.4±11.5 yrs). Patients initiating warfarin had a higher CHA2DS2-VASc score (3.22±1.65) and Charlson comorbidity index score (2.37±2.33) (P<0.0001 across all treatments) compared with those initiating NOACs. After adjusting for baseline characteristics, patients newly initiated on apixaban 5 mg BID had a significantly lower risk of major bleeding (HR: 0.53, 95% CI: 0.29-0.97, P=0.0399) compared with those initiated on warfarin (Table). Conclusion: Among newly anticoagulated NVAF patients in a real-world setting, compared with dose-adjusted warfarin, only initiation of apixaban 5 mg BID was associated with a significantly lower risk of major bleeding.
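The exposure-window logic in the bleeding definition — an event counts as on-treatment if it occurs during drug supply or within 30 days after the last supply day — can be sketched with simple date arithmetic. The dates below are invented for illustration:

```python
from datetime import date, timedelta

def on_treatment(event_day, last_supply_day, grace_days=30):
    """True if the event falls on or before last supply day + grace period."""
    return event_day <= last_supply_day + timedelta(days=grace_days)

last_supply = date(2013, 6, 15)  # hypothetical last day of drug supply
print(on_treatment(date(2013, 7, 10), last_supply))  # within the 30-day window
print(on_treatment(date(2013, 8, 1), last_supply))   # outside the window
```

Events falling outside the window are censored rather than attributed to the index drug, which is what makes the 30-day grace period a consequential design choice in claims-based safety studies.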


EP Europace ◽  
2021 ◽  
Vol 23 (Supplement_3) ◽  
Author(s):  
WY Ding ◽  
JM Rivera-Caravaca ◽  
F Marin ◽  
G Li ◽  
V Roldan ◽  
...  

Abstract Funding Acknowledgements Type of funding sources: None. Background The benefit of oral anticoagulation (OAC) in atrial fibrillation (AF) must be balanced against any potential risk of harm. We aimed to evaluate the "NNT for net effect" (NNTnet) using the Calculator of Absolute Stroke Risk (CARS) in anticoagulated patients with AF. Methods We used patient-level data from the real-world Murcia AF Project and the AMADEUS clinical trial. Baseline risk of stroke was calculated using CARS, while major bleeding risk was estimated from prior studies. Stroke and major bleeding events at 1 year were determined. NNTnet was calculated as the reciprocal of the net absolute risk reduction with OAC: NNTnet = 1 / (ARRstroke - ARIbleeding). Results 3,511 patients were included (1,306 [37.2%] real-world patients and 2,205 [62.8%] clinical trial patients). The absolute 1-year stroke risk was similar across both cohorts, and the main results are presented in the Table. In both cohorts, the NNTnet was significantly lower in patients with an excess stroke risk of ≥2% by CARS. Among real-world patients with a very high (>10%) baseline stroke risk, the use of OAC was associated with an ARRstroke of 10.9% and an ARIbleeding of 1.2%, giving an overall NNTnet of 11. In the clinical trial, the use of OAC was associated with an ARRstroke of 11.0% and an ARIbleeding of 0.6%, giving an overall NNTnet of 10. Conclusion Overall, the NNTnet approach in AF incorporates information on the baseline risks of stroke and major bleeding and the relative effects of OAC, with the potential to include multiple additional outcomes and to weight events by their perceived importance to individual patients. This simple and intuitive metric may be useful to improve communication and optimise patient-centred management of AF.
NNT in Real-World and Clinical Trial cohorts

|                                       | Real-World            | Clinical Trial        |
| ------------------------------------- | --------------------- | --------------------- |
| Ischaemic stroke risk at 1 year       |                       |                       |
| Baseline risk without anticoagulation | 5.7% (95% CI 5.5-6.0) | 5.1% (95% CI 4.9-5.3) |
| Anticoagulation-mediated risk         | 1.7% (95% CI 1.1-2.6) | 1.3% (95% CI 0.8-1.8) |
| Absolute risk reduction               | 4.0%                  | 3.8%                  |
| NNTbenefit                            | 25                    | 27                    |
| Major bleeding risk at 1 year         |                       |                       |
| Baseline risk without anticoagulation | 2.3%                  | 2.3%                  |
| Anticoagulation-mediated risk         | 3.3% (95% CI 2.4-4.4) | 3.9% (95% CI 3.1-4.8) |
| Absolute risk increase                | 1.0%                  | 1.6%                  |
| NNTharm                               | 100                   | 63                    |
| NNTnet                                | 34                    | 46                    |
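The NNTnet arithmetic can be checked directly from the table's figures (real-world: ARRstroke 4.0%, ARIbleeding 1.0%; clinical trial: 3.8%, 1.6%). Rounding the reciprocal up is an assumption here, but it is consistent with the table's NNTbenefit and NNTharm values (e.g. 1/0.038 = 26.3 reported as 27):

```python
import math

def nnt_net(arr_stroke, ari_bleeding):
    """NNT for net effect: reciprocal of the net absolute risk reduction,
    rounded up to a whole number of patients."""
    return math.ceil(1 / (arr_stroke - ari_bleeding))

print(nnt_net(0.040, 0.010))  # real-world cohort -> 34, matching the table
print(nnt_net(0.038, 0.016))  # clinical trial cohort -> 46, matching the table
```

A negative net absolute risk reduction would make the reciprocal negative, i.e. the metric flips from patients treated per net benefit to patients treated per net harm.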


2021 ◽  
Vol 10 (15) ◽  
pp. 3357
Author(s):  
Wern Yew Ding ◽  
José Miguel Rivera-Caravaca ◽  
Francisco Marin ◽  
Christian Torp-Pedersen ◽  
Vanessa Roldán ◽  
...  

Our ability to evaluate residual stroke risk despite anticoagulation in atrial fibrillation (AF) is currently lacking. The Calculator of Absolute Stroke Risk (CARS) has been proposed to predict 1-year absolute stroke risk in non-anticoagulated patients. We aimed to determine whether a modified CARS (mCARS) may be used to assess the residual stroke risk in anticoagulated AF patients from ‘real-world’ and ‘clinical trial’ cohorts. We studied patient-level data of anticoagulated AF patients from the real-world Murcia AF Project and the AMADEUS clinical trial. Individual mCARS were estimated for each patient. None of the patients were treated with non-vitamin K antagonist oral anticoagulants. The predicted residual stroke risk was compared to the actual stroke risk. 3,503 patients were included (2,205 [62.9%] clinical trial and 1,298 [37.1%] real-world). There was wide variation in CARS for each category of CHA2DS2-VASc score in both cohorts. The average predicted residual stroke risk by mCARS (1.8 ± 1.8%) was identical to the actual stroke risk (1.8% [95% CI, 1.3–2.4]) in the clinical trial, and broadly similar in the real world (2.1 ± 1.9% vs. 2.4% [95% CI, 1.6–3.4]). AUCs of mCARS for the prediction of stroke events in the clinical trial and real-world cohorts were 0.678 (95% CI, 0.598–0.758) and 0.712 (95% CI, 0.618–0.805), respectively. mCARS was able to refine stroke risk estimation for each point of the CHA2DS2-VASc score in both cohorts. Personalised residual 1-year absolute stroke risk in anticoagulated AF patients may be estimated using mCARS, thereby allowing an assessment of the absolute risk reduction of treatment and facilitating a patient-centred approach in the management of AF. Such identification of patients with high residual stroke risk could help target more aggressive interventions and follow-up.
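The reported AUCs are concordance statistics: the probability that a randomly chosen patient who had a stroke received a higher predicted risk than one who did not. A minimal rank-based (Mann-Whitney) AUC sketch; the risk scores and outcome labels below are invented for illustration, not mCARS output:

```python
def auc(scores, outcomes):
    """Probability that a random event patient scores above a random
    non-event patient; ties count one half."""
    pos = [s for s, y in zip(scores, outcomes) if y == 1]
    neg = [s for s, y in zip(scores, outcomes) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

risk = [0.8, 2.1, 0.5, 3.4, 1.2, 2.8]  # hypothetical predicted risks (%)
stroke = [0, 1, 0, 1, 1, 0]            # hypothetical 1-year stroke outcomes
print(auc(risk, stroke))
```

An AUC of 0.5 would mean the score is uninformative; the paper's 0.678 and 0.712 indicate moderate discrimination in both cohorts.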


2020 ◽  
Vol 26 (1) ◽  
pp. 5-9
Author(s):  
Monika Kozieł ◽  
Gregory Y. H. Lip ◽  
Tatjana S. Potpara

Real-world registries of patients with atrial fibrillation (AF) have provided important evidence on contemporary AF management and adherence to guidelines in real-world patients across most regions of Europe. While prospective randomized clinical trials are the ‘gold standard’ of evidence, we recognize that trials have specific inclusion/exclusion criteria and many groups of patients can be under-represented. Thus, real-world evidence is needed to supplement and augment the trial evidence, especially for patient groups (e.g. the very elderly and frail, ethnic minorities, patients with end-stage renal failure, those in nursing homes, and those with cognitive impairment) that have been largely under-represented in, or excluded from, clinical trials. The BALKAN-AF survey is the largest prospective, multicenter (a total of 49 centres), observational AF dataset from the Balkans, a European region inhabited by about 10% of the European population that has been under-represented in many prior clinical trials and registries. In BALKAN-AF, data on consecutive subjects with electrocardiographically documented non-valvular AF were collected in seven Balkan countries (Albania, Bosnia & Herzegovina, Bulgaria, Croatia, Montenegro, Romania and Serbia) by a cardiologist or, where a cardiologist was not available, an internal medicine specialist. The Serbian Atrial Fibrillation Association created and conducted the BALKAN-AF survey (performed from December 2014 to February 2015).

