Outcomes associated with apixaban vs warfarin in patients with renal dysfunction

2020 ◽  
Vol 4 (11) ◽  
pp. 2366-2371 ◽  
Author(s):  
Claudia Hanni ◽  
Elizabeth Petrovitch ◽  
Mona Ali ◽  
Whitney Gibson ◽  
Christopher Giuliano ◽  
...  

Abstract Use of apixaban in patients with impaired renal function is supported by limited data. Landmark clinical trials evaluating apixaban in patients with atrial fibrillation and/or acute venous thromboembolism excluded patients with creatinine clearance (CrCl) <25 mL/min. This multicenter, retrospective chart review was conducted to evaluate the safety and effectiveness of apixaban compared with warfarin in patients with CrCl <25 mL/min. Included patients were newly initiated on apixaban or warfarin for at least 45 days and had a CrCl <25 mL/min. Patients were evaluated for thrombosis and bleeding outcomes for 6 months following initiation of anticoagulation. The primary outcome was time to first bleeding or thrombosis event. A total of 128 patients met inclusion criteria in the apixaban group and 733 in the warfarin group. Time to first bleeding or thrombosis event differed significantly between the apixaban and warfarin groups. A Cox proportional hazards model was used to control for potential confounding factors for the primary outcome. After controlling for atrial fibrillation and coronary artery bypass grafting, the risk of thrombotic and bleeding events was lower in the apixaban group (hazard ratio, 0.47; 95% confidence interval, 0.25-0.92). There was no statistically significant difference in time to thrombosis (83 days vs 54 days, P = .648), rate of thrombosis (5.5% vs 10.3%, P = .08), time to bleeding (46 days vs 54 days, P = .886), or rate of bleeding (5.5% vs 10.9%, P = .06). The severity of bleeding and thrombotic events did not differ between groups. Apixaban may serve as a reasonable alternative to warfarin in patients with severe renal dysfunction.
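Several abstracts in this listing report time-to-first-event analyses like the one above. As an illustrative aside, the survival curve underlying such an analysis can be estimated with the Kaplan-Meier method; the sketch below is a minimal pure-Python version, and every follow-up time and event flag in it is hypothetical, not data from the study.

```python
# Minimal Kaplan-Meier estimator for time-to-first-event data.
# All follow-up times and event flags below are hypothetical.

def kaplan_meier(times, events):
    """times: days to first event or censoring; events: 1 = event, 0 = censored.
    Returns (time, event-free probability) pairs at each distinct event time."""
    pairs = sorted(zip(times, events))
    surv = 1.0
    curve = []
    seen = set()
    for t, _ in pairs:
        if t in seen:
            continue
        seen.add(t)
        d = sum(e for tt, e in pairs if tt == t)   # events at time t
        n = sum(1 for tt, _ in pairs if tt >= t)   # patients still at risk at t
        if d > 0:
            surv *= 1 - d / n                      # Kaplan-Meier product step
            curve.append((t, surv))
    return curve

# Hypothetical cohort: four events, four patients censored at 180 days
times = [46, 54, 83, 120, 180, 180, 180, 180]
events = [1, 1, 1, 1, 0, 0, 0, 0]
for t, s in kaplan_meier(times, events):
    print(f"day {t}: event-free probability {s:.3f}")
```

On this toy cohort the event-free probability steps down at each event time, reaching 0.5 by day 120, since half the hypothetical patients have had an event by then.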

Circulation ◽  
2014 ◽  
Vol 130 (suppl_2) ◽  
Author(s):  
Raffaele De Caterina ◽  
Ulrika Andersson ◽  
John H Alexander ◽  
M.Cecilia Bahit ◽  
Patrick J Commerford ◽  
...  

Background: A history of bleeding weighs heavily in decisions about anticoagulation. We analyzed outcomes in relation to history of bleeding and randomized treatment in patients with atrial fibrillation (AF) in the ARISTOTLE trial. Methods: The on-treatment safety population included 18,140 patients receiving ≥1 dose of study drug, either apixaban 5 mg bd (2.5 mg bd if ≥2 of the following: age ≥80 yrs, body weight ≤60 kg, or creatinine ≥133 μmol/L) or warfarin targeting an INR of 2.0-3.0 (median TTR 66%), for a median of 1.8 yrs. Adjudicated outcomes in relation to randomization and history of bleeding were analyzed using a Cox proportional hazards model. Efficacy endpoints were analyzed in the intention-to-treat population. Results: A history of bleeding was reported in 3033 patients (16.7%), who were more often male (68% vs 64%, p <0.0005) and more often had a history of prior stroke/TIA/systemic embolism (23% vs 19%, p <0.0001) and diabetes (27% vs 24%, p=0.0010); they had a higher CHADS2 score (CHADS2 >3: 35% vs 29%), age (mean [SD] 71 [9] vs 69 [10], p <0.0001), and body weight (86 [21] vs 84 [21], p <0.0001), and a lower creatinine clearance (77 [33] vs 80 [33], p=0.0007) and mean systolic blood pressure (131 [17] vs 132 [16], p=0.0027). Calcium channel blockers, statins, non-steroidal anti-inflammatory drugs, and proton pump inhibitors were used more often in patients with vs without a history of bleeding. Major bleeding was the only outcome event occurring more frequently in patients with vs without a history of bleeding: HR 1.7 (95% CI 1.4-2.3) with apixaban and 1.5 (1.2-1.0) with warfarin. For primary efficacy and safety outcomes in relation to randomization, see the Table. Conclusions: In patients with AF, a history of bleeding was associated with several risk factors for stroke and bleeding and, accordingly, a higher bleeding risk during anticoagulation. The benefits of apixaban vs warfarin for stroke, mortality, and major bleeding were, however, consistent irrespective of bleeding history.


2019 ◽  
Vol 40 (Supplement_1) ◽  
Author(s):  
M Fukunaga ◽  
K Hirose ◽  
A Isotani ◽  
T Morinaga ◽  
K Ando

Abstract Background The relationship between atrial fibrillation (AF) and heart failure (HF) is often compared with the proverbial question of which came first, the chicken or the egg. Some patients with AF at HF admission are restored to sinus rhythm (SR) by discharge. Whether restoration of SR during hospitalization prevents rehospitalization has not been well elucidated. Purpose To investigate the impact of restoration of SR during hospitalization on the readmission rate of HF patients with AF. Methods We enrolled 640 consecutive HF patients hospitalized from January 2015 to December 2015. Patient data were retrospectively collected from medical records. Patients with AF on admission that had not previously been recognized were defined as “incident AF”; patients with AF diagnosed before admission were defined as “prevalent AF”. The primary endpoint was a composite of death from cardiovascular disease or hospitalization for worsening heart failure. Secondary endpoints were death from cardiovascular disease, unplanned hospitalization related to heart failure, and any hospitalization. Results During a mean follow-up of 19 months, 139 patients (22%) were categorized as incident AF and 145 patients (23%) as prevalent AF. Among the 239 patients with AF on admission, 44 were discharged in SR (39 with incident AF and 5 with prevalent AF). Among incident AF patients, the primary composite endpoint occurred significantly less often in those discharged in SR (19% vs. 42% at 1-year and 23% vs. 53% at 2-year follow-up, p=0.005). In a Cox proportional hazards model of risk factors for readmission due to HF, AF only during hospitalization (hazard ratio [HR] = 0.37, p<0.01) and prevalent AF (HR = 1.67, p=0.04) were significantly associated with readmission. There was no significant difference depending on LVEF.
Conclusion Newly diagnosed AF with restoration of SR during hospitalization was a good marker for forecasting prognosis.


2019 ◽  
Vol 41 (3) ◽  
pp. 347-356 ◽  
Author(s):  
Emmanuel Sorbets ◽  
Kim M Fox ◽  
Yedid Elbez ◽  
Nicolas Danchin ◽  
Paul Dorian ◽  
...  

Abstract Aims Over the last decades, the profile of chronic coronary syndrome has changed substantially. We aimed to determine the characteristics and management of patients with chronic coronary syndrome in the contemporary era, as well as outcomes and their determinants. Methods and results Data from 32 703 patients (45 countries) with chronic coronary syndrome enrolled in the prospective observational CLARIFY registry (November 2009 to June 2010), with a 5-year follow-up, were analysed. The 5-year rate of the primary outcome [cardiovascular death or non-fatal myocardial infarction (MI)] was 8.0% [95% confidence interval (CI) 7.7–8.3] overall [male 8.1% (7.8–8.5); female 7.6% (7.0–8.3)]. A Cox proportional hazards model showed that the main independent predictors of the primary outcome were prior hospitalization for heart failure, current smoking, atrial fibrillation, living in Central/South America, prior MI, prior stroke, diabetes, current angina, and peripheral artery disease. There was an interaction between angina and prior MI (P = 0.0016); among patients with prior MI, angina was associated with a higher primary event rate [11.8% (95% CI 10.9–12.9) vs. 8.2% (95% CI 7.8–8.7) in patients with no angina, P < 0.001], whereas among patients without prior MI, event rates were similar for patients with [6.3% (95% CI 5.4–7.3)] or without angina [6.4% (95% CI 5.9–7.0)], P > 0.99. Prescription rates of evidence-based secondary prevention therapies were high. Conclusion This description of the spectrum of chronic coronary syndrome patients shows that, despite high rates of prescription of evidence-based therapies, patients with both angina and prior MI are an easily identifiable high-risk group who may deserve intensive treatment. Clinical registry ISRCTN43070564


2020 ◽  
Vol 48 (10) ◽  
pp. 030006052096234
Author(s):  
Ying Liu ◽  
Jie Wang ◽  
Wen-zhen Zeng ◽  
Qing-shan Lyu

Objective This study aimed to examine the relationship between total bilirubin levels and initial ischemic stroke in patients with non-valvular atrial fibrillation. Methods This was a retrospective study. Atrial fibrillation was diagnosed by 24-hour Holter electrocardiography, and serum total bilirubin levels were divided into quintiles. Ischemic stroke was diagnosed by symptoms, signs, and medical imaging. A multivariate Cox proportional hazards model and survival analysis were used to estimate the association of total bilirubin with initial ischemic stroke. Results We studied 316 patients with non-valvular atrial fibrillation. During follow-up, there were 42 (13.29%) first ischemic strokes. After multivariate adjustment, each 1 µmol/L increase in total bilirubin raised the risk of first ischemic stroke by 4% (hazard ratio 1.04; 95% confidence interval [CI]: 1.01, 1.07). With the first quintile as the reference, the hazard ratios for first ischemic stroke in the second to fifth quintiles were 0.52 (95% CI: 0.17, 1.65), 0.23 (95% CI: 0.06, 0.87), 0.92 (95% CI: 0.32, 2.67), and 1.33 (95% CI: 1.09, 4.41), respectively. The optimal cut-off point of total bilirubin for the lowest risk of ischemic stroke was 17.0 µmol/L. Conclusions Total bilirubin levels are nonlinearly associated with initial ischemic stroke in patients with non-valvular atrial fibrillation.
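As a side note on the quintile analysis above: dividing a biomarker into quintiles before modelling risk per quintile reduces to ranking each value within the cohort's distribution. The function and bilirubin values below are a hypothetical sketch, not the study's data or code.

```python
# Assign a value to a quintile (1-5) of a sorted reference distribution.
# The bilirubin measurements below are made up for illustration.

def quintile(value, sorted_values):
    """Return which fifth (1-5) of sorted_values the given value falls in."""
    n = len(sorted_values)
    rank = sum(1 for v in sorted_values if v <= value)  # values at or below
    q = (rank - 1) * 5 // n + 1
    return min(max(q, 1), 5)  # clamp values outside the reference range

# Hypothetical serum total bilirubin measurements (µmol/L)
bilirubin = sorted([5.1, 8.4, 9.9, 12.0, 13.5, 15.2, 17.0, 19.8, 23.4, 30.6])
print(quintile(17.0, bilirubin))
```

In a real analysis the quintile label, rather than the raw value, then enters the Cox model as a categorical covariate with the first quintile as the reference group.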


Blood ◽  
2012 ◽  
Vol 120 (21) ◽  
pp. 365-365
Author(s):  
Kevin M Francis ◽  
Kimberly Siu ◽  
Chen Yu ◽  
Hasmik Alvrtsyan ◽  
Yajing Rao ◽  
...  

Abstract 365 Objective: Oral anticoagulation (OAC) therapy is the primary tool for reducing stroke risk in patients with non-valvular atrial fibrillation (NVAF), yet it is under-utilized. Patients who are non-persistent with therapy contribute to this underutilization. The recent introduction of dabigatran provides an alternative to warfarin for anticoagulation, but no studies to date have examined the relative patterns of persistence among patients starting therapy. The objective was to compare persistence rates on warfarin versus dabigatran among newly diagnosed AF patients new to OAC therapy. Methods: Claims data from the US Department of Defense were used to identify patients who received either dabigatran or warfarin between October 28, 2010 and June 30, 2012. Patient records were examined for a minimum of 12 months prior to the first prescription claim for dabigatran or warfarin to determine whether patients were naïve to treatment and newly diagnosed with atrial fibrillation (AF). Patients whose first AF claim was within 3 months of their first dabigatran or warfarin prescription were considered newly diagnosed. Persistence was defined as time on therapy until discontinuation. Patients were considered discontinued when the medication refill gap exceeded 30 days or when they switched therapies. Propensity score matching was used to compare cohorts. Kaplan-Meier curves were used to depict persistence over time. A Cox proportional hazards model was used to determine the predictors of persistence, including stroke risk as estimated by CHADS2 score and bleeding risk as estimated by HEMORR2HAGES score. Results: Fewer newly diagnosed, treatment-naïve patients were started on warfarin (1,775) than on dabigatran (3,370). Overall median persistence was 389 days for dabigatran and 135 days for warfarin. Persistence rates at six months (64% vs. 50%) and 1 year (41% vs. 24%) were higher for dabigatran than for warfarin.
Of those who discontinued, 9.4% switched from warfarin to dabigatran, and 5.9% switched from dabigatran to warfarin. Patients on dabigatran with a low risk of stroke (CHADS2 ≤1) or a higher bleeding risk (HEMORR2HAGES score greater than 3) had a higher likelihood of non-persistence (hazard ratio [HR], 1.1; 95% CI, 1.0 to 1.3; P=0.02 and HR, 1.2; 95% CI, 1.1 to 1.4; P=0.004, respectively). Patients on warfarin with a low risk of stroke (CHADS2 ≤1) had a 1.2 times higher likelihood of non-persistence (HR, 1.2; 95% CI, 1.0 to 1.3; P=0.04) (see Figure 1). Conclusion: Patients who began dabigatran treatment were more persistent than patients who began warfarin treatment. Within each cohort, patients with a low risk of stroke were more likely to discontinue therapy. Disclosures: Francis: Trinity Partners, LLC: Boehringer Ingelheim sponsored the study Other, Consultancy. Siu: Boehringer Ingelheim Pharmaceuticals, Inc: Employment. Yu: Trinity Partners, LLC: Boehringer Ingelheim sponsored the study Other, Consultancy. Alvrtsyan: Trinity Partners, LLC: Boehringer Ingelheim sponsored the study Other, Consultancy. Rao: Trinity Partners, LLC: Boehringer Ingelheim sponsored the study Other, Consultancy. Walker: Boehringer Ingelheim Pharmaceutical Inc.: Employment. Sander: Boehringer Ingelheim Pharmaceutical Inc.: Employment. Zalesak: Trinity Partners, LLC: Boehringer Ingelheim sponsored the study Other, Consultancy. Miyasato: Trinity Partners, LLC: Boehringer Ingelheim sponsored the study Other, Consultancy. Sanchez: Trinity Partners, LLC: Boehringer Ingelheim sponsored the study Other, Consultancy.
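The 30-day refill-gap rule used above to define discontinuation can be made concrete with a short sketch. The fill dates and days-supply values below are invented for illustration; this is not the study's claims-processing code, and the carryover rule for early refills is one plausible convention among several.

```python
from datetime import date, timedelta

def days_persistent(fills, gap_limit=30):
    """fills: (fill_date, days_supply) pairs. Returns days on therapy until
    the first refill gap exceeding gap_limit, with early-refill carryover."""
    fills = sorted(fills)
    start = fills[0][0]
    covered_until = start + timedelta(days=fills[0][1])
    for fill_date, supply in fills[1:]:
        if (fill_date - covered_until).days > gap_limit:
            return (covered_until - start).days  # discontinued at coverage end
        # next supply starts when the current one runs out, or at the fill date
        covered_until = max(covered_until, fill_date) + timedelta(days=supply)
    return (covered_until - start).days

# Hypothetical claims: the >30-day gap before the May fill ends persistence
fills = [(date(2011, 1, 1), 30), (date(2011, 2, 1), 30), (date(2011, 5, 1), 30)]
print(days_persistent(fills))  # 61 days on therapy
```

The resulting per-patient durations are exactly the kind of time-on-therapy values summarized by the Kaplan-Meier persistence curves the abstract describes.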


Circulation ◽  
2015 ◽  
Vol 132 (suppl_3) ◽  
Author(s):  
Steven Deitelzweig ◽  
Amanda Bruno ◽  
Natalie Tate ◽  
Augustina Ogbonnaya ◽  
Manan Shah ◽  
...  

Real-world evidence highlighting the risks and benefits of novel oral anticoagulants (NOACs) is lacking. This study compared major and clinically relevant non-major (CRNM) bleeding risk and costs among non-valvular atrial fibrillation (NVAF) patients newly treated with apixaban, dabigatran, rivaroxaban, or warfarin. A retrospective analysis of NVAF patients newly treated with apixaban, dabigatran, rivaroxaban, or warfarin was conducted using PharMetrics Plus data from 1/2012 to 9/2014. Patients were indexed on the date of the first anticoagulant prescription and were required to be ≥18 years old with a CHA2DS2-VASc score >0 and ≥1 month of follow-up. Patients were followed until discontinuation (≥30-day gap in treatment), treatment switch, end of continuous enrollment, 1 year post-index, or end of study. Major and CRNM bleeding, and bleeding-related costs, were measured. A Cox proportional hazards model was used to examine the association between anticoagulants and risk of bleeding, and a generalized linear model (GLM) was used to evaluate bleeding-related costs. The study included 24,573 NVAF patients, distributed as apixaban 11.7%, dabigatran 12.0%, rivaroxaban 36.7%, and warfarin 39.6%. Mean age was 64.4 years, and 66.5% were male. HAS-BLED and CHA2DS2-VASc scores averaged 2.0 and 2.7, respectively. After adjusting for differences in baseline characteristics, compared with apixaban patients, rivaroxaban (HR: 1.5; P=0.0013) and warfarin (HR: 1.7; P<0.0001) patients were more likely to have major bleeding, and dabigatran (HR: 1.3; P=0.0030), rivaroxaban (HR: 1.7; P<0.0001), and warfarin (HR: 1.4; P<0.0001) patients were more likely to have CRNM bleeding. Major bleeding risk was similar between apixaban and dabigatran patients.
Major and CRNM bleeding costs, compared with apixaban patients ($154 and $18), were significantly higher for dabigatran ($457; P<0.0001 and $39; P<0.0001), rivaroxaban ($420; P<0.0001 and $61; P<0.0001), and warfarin ($511; P<0.0001 and $63; P<0.0001) patients. Among anticoagulant-naive, moderate-to-high-risk NVAF patients encountered in a real-world clinical setting, major bleeding was lower with apixaban than with warfarin or rivaroxaban. Bleeding costs were lower with apixaban than with the alternative NOACs or warfarin.


Circulation ◽  
2015 ◽  
Vol 132 (suppl_3) ◽  
Author(s):  
Steve Deitelzweig ◽  
Amanda Bruno ◽  
Kiran Gupta ◽  
Jeffrey Trocio ◽  
Natalie Tate

To compare the risk of hospitalization among non-valvular atrial fibrillation (NVAF) patients newly initiated on an oral anticoagulant (OAC): apixaban, dabigatran, rivaroxaban, or warfarin. Retrospective cohort study using Humana Medicare Advantage data from 7/1/2009 to 9/30/2014. NVAF patients ≥18 years receiving one OAC on the index date, with 6 months of continuous enrollment prior to the index prescription date and 3 months post-index, were eligible. Hospitalizations were identified by standard codes for inpatient admission. Bleeding-related hospitalizations required an additional code for major/clinically relevant non-major (CRNM) bleeding. A Cox proportional hazards model was used to estimate the hazard ratios (HR) of hospitalization, adjusted for age, sex, region, comorbidities, and comedications. Adherence for each OAC was also calculated using a proportion-of-days-covered approach to understand medication-taking behaviors. Among the 53,168 patients initiated on an OAC, 2,028 (3.8%) received apixaban, 5,644 (10.6%) dabigatran, 7,667 (14.4%) rivaroxaban, and 37,829 (71.1%) warfarin. Patients in the apixaban cohort were older (mean 75.5 years, P<0.05) with a higher mean CHA2DS2-VASc score (P<0.05). Apixaban patients had a higher mean HAS-BLED score vs. dabigatran (P<0.0001), a lower mean score vs. warfarin (P<0.0001), and no significant difference vs. rivaroxaban (P=0.46). Patients receiving apixaban had a significantly lower risk of all-cause hospitalization across cohorts, and a significantly lower risk of bleeding-related hospitalization vs. patients receiving rivaroxaban or warfarin (Table). Adherence ranged from 87.8% to 90.4% across cohorts. In a real-world setting, initiation with apixaban was associated with a significantly lower risk of all-cause hospitalization, and a significantly lower risk of bleeding-related hospitalization compared with rivaroxaban or warfarin. Table: Adjusted Hazard Ratios of All-cause and Bleeding-related Hospitalizations


2013 ◽  
Vol 34 (5) ◽  
pp. E12 ◽  
Author(s):  
Kelly Wright ◽  
Polly Young ◽  
Cristina Brickman ◽  
Teena Sam ◽  
Neeraj Badjatia ◽  
...  

Object The authors evaluated the rates of ventriculostomy-related infections (VRIs) after antibiotic-coated extraventricular drains (ac-EVDs) were introduced as the standard of care. Methods A retrospective chart review was conducted of adult patients admitted to the NewYork-Presbyterian Hospital neurological intensive care unit in whom an EVD was placed between February 2007 and November 2009, excluding individuals receiving EVDs due to infection of a primary device. Three time periods were defined by the type of EVD in use: Period 1, conventional EVDs; Period 2, either ac-EVDs or conventional EVDs; and Period 3, ac-EVDs. Definite/probable VRIs occurring during the 3 periods were evaluated, and determinants of VRI were assessed using a Cox proportional hazards model. Per institutional policy, prolonged systemic antibiotics were given for the duration of EVD placement in all 3 periods. Results Data from 141 individuals were evaluated; mean patient age was 53.8 ± 17.2 years and 54% were female. There were 2 definite and 19 probable VRIs. The incidence of definite/probable VRI (per 1000 person-catheter days) decreased from Period 1 to Period 3 (24.5, 16.2, and 4.4 in Periods 1, 2, and 3, respectively; p < 0.0001). Patients with VRIs were more likely to be female than male (23.7% vs 3.1%, p < 0.003) and to have had an EVD in place for a longer duration, although EVD duration did not differ significantly among the 3 periods (7.9 ± 6.7 [Period 1], 8.1 ± 7.1 [Period 2], and 8.6 ± 5.8 [Period 3] mean days; p = 0.87, ANOVA). Analysis of effect modification in a stepwise model showed that period, age, and an age-by-female-sex interaction were significant predictors of VRIs. Period was the strongest predictor of VRI (p = 0.0075). After adjustment for age and the age-by-sex interaction, the survival rate was 53% at the end of Period 2 and 91% at the end of Period 3.
Conclusions Rates of VRIs decreased with the addition of ac-EVDs to the routine use of prolonged systemic antibiotics at the authors' institution.
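The incidence figures above are rates per 1,000 person-catheter days, which reduce to a simple exposure-adjusted calculation; the counts in this sketch are hypothetical, not the study's actual numerators or denominators.

```python
# Infection incidence per 1,000 person-catheter days (hypothetical counts).

def rate_per_1000(events, person_catheter_days):
    """Events divided by total exposure time, scaled to 1,000 catheter-days."""
    return 1000 * events / person_catheter_days

# e.g. 2 infections observed over 450 catheter-days of exposure
print(round(rate_per_1000(2, 450), 1))  # 4.4 per 1,000 catheter-days
```

Normalizing by catheter-days rather than by patient count is what makes periods with different numbers of patients and drain durations comparable.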


2018 ◽  
Vol 15 (12) ◽  
pp. 895-899
Author(s):  
Yasuhiko Kubota ◽  
Alvaro Alonso ◽  
Amil M. Shah ◽  
Lin Y. Chen ◽  
Aaron R. Folsom

Background: There has been no research on the association of television (TV) watching with atrial fibrillation (AF). Methods: From 1987 to 1989, the authors obtained information on the frequency of TV watching in 14,458 participants, aged 45–64 years, without a history of AF. The authors used a Cox proportional hazards model to estimate hazard ratios and 95% confidence intervals for AF according to the frequency of TV watching (“never or seldom,” “sometimes,” “often,” or “very often”). Results: During 294,553 person-years of follow-up, the authors identified 2,476 AF events. Adjustment for potential confounding factors, including physical activity, did not change the associations: watching TV “very often” carried 1.28 (95% confidence interval, 1.09–1.50) times the AF risk of watching TV “never or seldom” (P for trend = .002). Even among individuals who met the recommended level of physical activity, watching TV “very often” carried 1.36 (1.02–1.82) times the AF risk of watching TV “never or seldom.” Conclusion: Greater frequency of TV watching was independently associated with increased risk of AF even after adjusting for physical activity. Moreover, a recommended level of physical activity did not eliminate the increased AF risk associated with frequent TV watching. Avoiding frequent TV watching might be beneficial for AF prevention.


Crisis ◽  
2018 ◽  
Vol 39 (1) ◽  
pp. 27-36 ◽  
Author(s):  
Kuan-Ying Lee ◽  
Chung-Yi Li ◽  
Kun-Chia Chang ◽  
Tsung-Hsueh Lu ◽  
Ying-Yeh Chen

Abstract. Background: We investigated the age at exposure to parental suicide and the risk of subsequent suicide completion in young people. The impact of parental and offspring sex was also examined. Method: Using a cohort study design, we linked Taiwan's Birth Registry (1978–1997) with Taiwan's Death Registry (1985–2009) and identified 40,249 children who had experienced maternal suicide (n = 14,431), paternal suicide (n = 26,887), or the suicide of both parents (n = 281). Each exposed child was matched to 10 children of the same sex and birth year whose parents were still alive. This yielded a total of 398,081 children for our non-exposed cohort. A Cox proportional hazards model was used to compare the suicide risk of the exposed and non-exposed groups. Results: Compared with the non-exposed group, offspring who were exposed to parental suicide were 3.91 times (95% confidence interval [CI] = 3.10–4.92) more likely to die by suicide after adjusting for baseline characteristics. The risk of suicide appeared lower in older male offspring (HR = 3.94, 95% CI = 2.57–6.06) but higher in older female offspring (HR = 5.30, 95% CI = 3.05–9.22). Stratified analyses based on parental sex revealed patterns similar to those of the combined analysis. Limitations: As only register-based data were used, we were not able to explore the impact of variables not contained in the data set, such as the role of mental illness. Conclusion: Our findings suggest a prominent elevation in the risk of suicide among offspring who lost their parents to suicide. The risk elevation differed according to the sex of the afflicted offspring as well as their age at exposure.

