Anticoagulation Quality of Warfarin and the Role of Physician–Pharmacist Collaborative Clinics in the Treatment of Patients Receiving Warfarin: A Retrospective, Observational, Single-Center Study

2021 ◽  
Vol 11 ◽  
Author(s):  
Sha Qiu ◽  
Na Wang ◽  
Chi Zhang ◽  
Zhi-Chun Gu ◽  
Yan Qian

Background: The management of patients receiving warfarin is complicated. This study evaluated the anticoagulation quality of warfarin, explored potential predictors associated with poor anticoagulation quality, and elucidated the role of clinical pharmacists in the management of warfarin treatment. Methods: We retrospectively collected data on patients who either initially received warfarin or returned to warfarin after withdrawal between January 1, 2015 and January 1, 2020. The primary outcome was time in therapeutic range (TTR), and a TTR of ≥60% was considered to indicate good anticoagulation quality. The secondary outcomes included thromboembolic and bleeding events during follow-up. We assessed the TTR of each participant and investigated potential predictors of poor anticoagulation quality (TTR < 60%) using logistic regression analysis. Additionally, we compared warfarin anticoagulation quality and the incidence of clinical adverse events between atrial fibrillation patients in physician–pharmacist collaborative clinics (PPCCs) and those in general clinics. Results: In total, 378 patients were included. The mean TTR was 42.6 ± 29.8%, and only 32% of patients achieved good anticoagulation quality. During a mean follow-up of 192 ± 92 days, we found no significant differences in the incidence of thromboembolic events (5.0% vs. 5.1%, p = 0.967) or bleeding events (1.7% vs. 4.7%, p = 0.241) between patients with good and those with poor anticoagulation quality. Management in a PPCC (odds ratio [OR]: 0.47, 95% confidence interval [CI]: 0.25–0.90, p = 0.022) was an independent protective factor against poor anticoagulation quality, whereas the presence of more than four comorbidities (OR: 1.98, 95% CI: 1.22–3.24, p = 0.006) and an average international normalized ratio monitoring interval of >30 days (OR: 1.74, 95% CI: 1.10–2.76, p = 0.019) were independent risk factors for poor anticoagulation quality. Compared with atrial fibrillation patients in general clinics, patients in PPCCs had a significantly higher mean TTR (48.4% ± 25.7% vs. 38.0% ± 27.6%, p = 0.014). Conclusion: The anticoagulation quality of warfarin was relatively low at our institution. The presence of more than four comorbidities and an average international normalized ratio monitoring interval of >30 days independently contributed to poor anticoagulation quality, whereas the PPCC model improved the anticoagulation quality of warfarin.
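As an illustrative aside, time in therapeutic range is most often computed with the Rosendaal linear-interpolation method; the abstract does not state which TTR method was used, so the following Python sketch is an assumption, with hypothetical visit dates and INR values.

```python
# Hedged sketch of Rosendaal linear-interpolation TTR (an assumption about the
# method; the study only reports TTR values, not how they were computed).
from datetime import date

def rosendaal_ttr(measurements, low=2.0, high=3.0):
    """measurements: list of (date, INR) tuples sorted by date.
    Returns the percentage of interpolated days with INR inside [low, high]."""
    in_range_days = 0.0
    total_days = 0.0
    for (d0, inr0), (d1, inr1) in zip(measurements, measurements[1:]):
        span = (d1 - d0).days
        if span <= 0:
            continue
        total_days += span
        for step in range(span):
            # Linearly interpolate the INR for each day between two visits.
            inr = inr0 + (inr1 - inr0) * step / span
            if low <= inr <= high:
                in_range_days += 1
    return 100.0 * in_range_days / total_days if total_days else float("nan")

# Hypothetical patient whose TTR falls below the study's 60% "good quality" cut-off.
visits = [(date(2020, 1, 1), 1.8), (date(2020, 1, 15), 2.4), (date(2020, 2, 20), 3.6)]
print(f"TTR = {rosendaal_ttr(visits):.1f}%")  # ~56%, i.e. poor anticoagulation quality
```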

2019 ◽  
Vol 40 (Supplement_1) ◽  
Author(s):  
M Proietti ◽  
G F Romiti ◽  
B Olshansky ◽  
G Y H Lip

Abstract Introduction Quality of anticoagulation control is essential to ensure better clinical outcomes in patients with atrial fibrillation (AF). Time in therapeutic range (TTR) is recommended as a measure of the quality of anticoagulation control. International normalized ratio (INR) variability has been suggested as an alternative index, although large independent validations of this index are still lacking. Purpose To provide validation of the clinical usefulness of INR variability as a measure of the quality of anticoagulation control in a large cohort of AF patients. Methods Data from the Atrial Fibrillation Follow-up Investigation of Rhythm Management (AFFIRM) trial were analysed. INR variability was defined as the standard deviation (SD) of the INR values (INR-SD) recorded throughout the follow-up observation for each patient. All patients with available INR values were included in the analysis. Stroke, major bleeding, cardiovascular (CV) death and all-cause death were the study outcomes. Results Among the original 4060 patients, a total of 3185 (78.4%) were available for analysis. Mean (SD) INR-SD was 0.58 (0.25). Patients were categorized into four quartiles according to INR-SD. Mean (SD) CHA2DS2-VASc score increased across quartiles (p=0.040), with no difference in the proportion of patients with CHA2DS2-VASc ≥2 (p=0.582) between the subgroups. A significant inverse correlation was found between INR-SD and TTR (Spearman's Rho: −0.536, p<0.001). After multiple adjustments, continuous INR-SD was inversely associated with TTR (standardized beta: −0.451, p<0.001) and directly associated with the SAMe-TT2R2 score (standardized beta: 0.084, p<0.001). A fully adjusted multivariate Cox regression analysis found that continuous INR-SD was directly associated with an increased risk of stroke, major bleeding and all-cause death, and that an INR-SD ≥0.85 was directly associated with all the study outcomes. Cox regression analysis, HR (95% CI), continuous INR-SD vs. INR-SD ≥0.85: stroke 2.52 (1.34–4.67) vs. 1.62 (1.00–2.63); major bleeding 2.43 (1.49–3.96) vs. 1.61 (1.10–2.36); CV death 1.50 (0.87–2.59) vs. 1.54 (1.07–2.24); all-cause death 1.79 (1.21–2.66) vs. 1.55 (1.17–2.05). CI = Confidence Interval; CV = Cardiovascular; HR = Hazard Ratio; INR-SD = International Normalized Ratio Standard Deviation. Conclusions INR variability, expressed as INR-SD, was significantly correlated and associated with TTR. Both continuous INR-SD and an INR-SD ≥0.85 were significantly associated with a higher risk of all study adverse outcomes. Acknowledgement/Funding None
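As a side note, the INR variability index studied here reduces to the standard deviation of each patient's INR series over follow-up; a minimal Python sketch (not the AFFIRM analysis code, and using invented INR values) is shown below.

```python
# Illustrative only: INR-SD as the standard deviation of one patient's INR
# measurements, compared against the 0.85 cut-off reported above.
import statistics

def inr_sd(inr_values):
    """Standard deviation of a patient's INR measurements (needs >= 2 values)."""
    return statistics.stdev(inr_values)

patient_inrs = [2.1, 3.4, 1.8, 2.9, 2.2, 3.8]  # hypothetical follow-up INRs
sd = inr_sd(patient_inrs)
print(f"INR-SD = {sd:.2f}; high variability (>= 0.85): {sd >= 0.85}")
```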


2021 ◽  
Vol 24 (3) ◽  
pp. E456-E460
Author(s):  
Guangpu Fan ◽  
Jing Liu ◽  
Suixin Dong ◽  
Yu Chen

Objective: To evaluate the risk factors and explore the mid-term outcomes of postoperative atrial fibrillation (POAF) after minimally invasive direct coronary artery bypass (MIDCAB). Methods: A total of 165 patients who underwent isolated MIDCAB from 2012 to 2015 were enrolled in the study and retrospectively reviewed. Patients with preoperative arrhythmia or concomitant surgical procedures were excluded. All patients were continuously monitored for POAF until discharge, and two groups were formed: the non-POAF group (140 patients, 71.4% men, mean age 58.83±10.3 years) and the POAF group (25 patients, 84.0% men, mean age 64.52±9.0 years). Early and mid-term outcomes were evaluated, perioperative factors associated with POAF were analyzed with a binary logistic regression model, and the relationship between POAF and major adverse cardiac events (MACE) was analyzed with a Cox regression model. Results: The incidence of POAF in this study was 15.15%. Patients in the POAF group had a significantly higher risk of re-entry to the ICU (2 vs. 2 cases; 8.0% vs. 1.4%, P = 0.049), renal failure (2 vs. 1 case; 8.0% vs. 0.7%, P = 0.018), and death (1 vs. 0 cases; 4.0% vs. 0%, P = 0.018). Binary logistic regression showed that male gender and age were independent risk factors for POAF (P = 0.038, 95% confidence interval 1.082-16.286; P = 0.011, 95% confidence interval 1.015-1.117, respectively), while preoperative ACEI or ARB use was a protective factor against POAF (P = 0.010, 95% confidence interval 0.113-0.748). Over a 5-year follow-up, the overall MACE rate showed no statistically significant difference between the two groups (P = 0.067). Conclusions: POAF after MIDCAB was related to postoperative morbidities such as re-entry to the ICU, renal failure, and death. Male gender and age were independent risk factors, while preoperative ACEI or ARB use was a protective factor. POAF was not associated with the occurrence of MACE at mid-term follow-up.
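For readers unfamiliar with the modelling step, the following is a hedged Python sketch of the kind of binary logistic regression described (POAF regressed on age, male sex and preoperative ACEI/ARB use); the toy data and variable names are hypothetical, not the study dataset.

```python
# Illustrative binary logistic regression with statsmodels; the simulated effect
# directions loosely follow the abstract (older age and male sex raise risk,
# ACEI/ARB use lowers it), but all numbers are invented.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 165
X = np.column_stack([
    rng.normal(60, 10, n),      # age in years
    rng.integers(0, 2, n),      # male sex (1 = male)
    rng.integers(0, 2, n),      # preoperative ACEI/ARB use
])
logit_p = -6.5 + 0.08 * X[:, 0] + 0.8 * X[:, 1] - 0.9 * X[:, 2]
y = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(float)  # simulated POAF

model = sm.Logit(y, sm.add_constant(X)).fit(disp=False)
print(np.exp(model.params))     # odds ratios for intercept, age, male sex, ACEI/ARB
print(model.conf_int())         # 95% confidence intervals on the log-odds scale
```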


2017 ◽  
Vol 117 (12) ◽  
pp. 2261-2266 ◽  
Author(s):  
María Esteve-Pastor ◽  
José Rivera-Caravaca ◽  
Alena Shantsila ◽  
Vanessa Roldán ◽  
Gregory Lip ◽  
...  

Background The HAS-BLED (hypertension, abnormal renal/liver function, previous stroke, bleeding history or predisposition, labile international normalized ratio [INR], elderly, and drugs/alcohol consumption) score has been validated in several scenarios, but the recent European guidelines do not recommend any clinical score to assess bleeding risk in atrial fibrillation (AF) patients and focus only on modifiable clinical factors. Purpose The aim of this study was to test the hypothesis that the HAS-BLED score would perform at least as well as an approach based only on modifiable bleeding risk factors (i.e. a ‘modifiable bleeding risk factors score’) for predicting bleeding events. Methods We compared the HAS-BLED score and the new ‘modifiable bleeding risk factors score’ in a post hoc analysis of 4,576 patients included in the AMADEUS trial. Results After 347 (interquartile range, 186–457) days of follow-up, 597 patients (13.0%) experienced a clinically relevant bleeding event and 113 (2.5%) had a major bleeding event. Only the HAS-BLED score was significantly associated with the risk of any clinically relevant bleeding (Cox analysis for HAS-BLED ≥ 3: hazard ratio 1.38; 95% confidence interval [CI], 1.10–1.72; p = 0.005). A ‘modifiable bleeding risk factors score’ ≥ 2 was not significantly associated with clinically relevant bleeding. The two scores had modest ability in predicting bleeding events. The HAS-BLED score performed best in predicting any clinically relevant bleeding (c-index for HAS-BLED, 0.545 [95% CI, 0.530–0.559] vs. the ‘modifiable bleeding risk factors score’, 0.530 [95% CI, 0.515–0.544]; c-index difference 0.015, z-score = 2.063, p = 0.04). The HAS-BLED score with one, two and three modifiable factors performed significantly better than the ‘modifiable bleeding risk factors score’ with one, two and three modifiable risk factors. Conclusion Compared with an approach based only on the modifiable bleeding risk factors proposed by the European Society of Cardiology (ESC) AF guidelines, the HAS-BLED score performed significantly better in predicting any clinically relevant bleeding in this clinical trial cohort. While modifiable bleeding risk factors should be addressed in all AF patients, a formal bleeding risk score (HAS-BLED) has better predictive value for bleeding risk and would support decision-making by identifying ‘high risk’ patients for scheduled review and follow-up.
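For orientation, the HAS-BLED total is conventionally a simple tally of the factors spelled out in the acronym above, one point each (renal and liver dysfunction, and drugs and alcohol, counted separately, for a maximum of 9). The sketch below encodes that tally and the ≥3 cut-off used in the study's Cox analysis; the field names are illustrative and this is not the study's analysis code.

```python
# Minimal HAS-BLED tally (one point per factor); presented as an illustration.
def has_bled(hypertension, abnormal_renal, abnormal_liver, prior_stroke,
             bleeding_history, labile_inr, age_over_65, drugs, alcohol):
    return sum([hypertension, abnormal_renal, abnormal_liver, prior_stroke,
                bleeding_history, labile_inr, age_over_65, drugs, alcohol])

score = has_bled(True, False, False, False, True, True, True, False, False)
print(score, "high bleeding risk" if score >= 3 else "lower bleeding risk")
# A score >= 3 is the threshold associated with clinically relevant bleeding above.
```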


Author(s):  
Zsuzsanna Kis ◽  
Astrid Amanda Hendriks ◽  
Taulant Muka ◽  
Wichor M. Bramer ◽  
Istvan Kovacs ◽  
...  

Introduction: Atrial Fibrillation (AF) is associated with remodeling of the atrial tissue, which leads to fibrosis that can contribute to the initiation and maintenance of AF. Delayed-Enhanced Cardiac Magnetic Resonance (DE-CMR) imaging for atrial wall fibrosis detection has been used in several studies to guide AF ablation. The aim of the present study was to systematically review the literature on the role of atrial fibrosis detected by DE-CMR imaging in AF ablation outcome. Methods: Eight bibliographic electronic databases were searched to identify all relevant studies published up to March 21, 2016. The scientific literature was searched for studies describing DE-CMR imaging of atrial fibrosis in AF patients who underwent Pulmonary Vein Isolation (PVI). Results: Of the 763 citations reviewed for eligibility, 5 articles (enrolling a total of 1040 patients) were included in the final analysis. The overall recurrence of AF ranged from 24.4% to 40.9%, with a median follow-up of 324 to 540 days after PVI. With less than 5–10% fibrosis in the atrial wall, there was a maximum of 10% recurrence of AF after ablation; with more than 35% fibrosis in the atrial wall, there was 86% recurrence of AF after ablation. Conclusion: Our analysis suggests that more extensive left atrial wall fibrosis prior to ablation predicts a higher arrhythmia recurrence rate after PVI. DE-CMR imaging appears to be a useful method for identifying the ideal candidate for catheter ablation. Our findings encourage wider use of DE-CMR in selected AF patients in a pre-ablation setting.
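As a toy illustration only, the thresholds summarised above can be read as a crude lookup from pre-ablation fibrosis extent to reported recurrence; everything other than the quoted figures is an assumption, and this is not a validated risk model.

```python
# Maps DE-CMR atrial wall fibrosis extent to the recurrence figures quoted in
# this review; purely illustrative.
def expected_recurrence(fibrosis_percent):
    if fibrosis_percent < 10:
        return "up to ~10% AF recurrence reported after PVI"
    if fibrosis_percent > 35:
        return "~86% AF recurrence reported after PVI"
    return "intermediate extent (recurrence not quantified in this summary)"

for extent in (5, 20, 40):
    print(f"{extent}% fibrosis -> {expected_recurrence(extent)}")
```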


Perfusion ◽  
2021 ◽  
pp. 026765912199599
Author(s):  
Esther Dreier ◽  
Maximilian Valentin Malfertheiner ◽  
Thomas Dienemann ◽  
Christoph Fisser ◽  
Maik Foltan ◽  
...  

Background: The role of venovenous extracorporeal membrane oxygenation (VV ECMO) in patients with COVID-19-induced acute respiratory distress syndrome (ARDS) remains unclear. Our aim was to investigate the clinical course and outcome of these patients and to identify factors associated with the need for prolonged ECMO therapy. Methods: A retrospective single-center study of patients receiving VV ECMO for COVID-19-associated ARDS was performed. Baseline characteristics, ventilatory and ECMO parameters, and laboratory and virological results were evaluated over time. Six-month follow-up was assessed. Results: Eleven of 16 patients (68.8%) survived to 6-month follow-up, with four patients requiring short-term (<28 days) and seven requiring prolonged (⩾28 days) ECMO support. Lung compliance before ECMO was higher in the prolonged than in the short-term group (28.1 (28.8–32.1) ml/cmH2O vs 18.7 (17.7–25.0) ml/cmH2O, p = 0.030). Mechanical ventilation before ECMO was longer (19 (16–23) days vs 5 (5–9) days, p = 0.002) and the SOFA score was higher (12.0 (10.5–17.0) vs 10.0 (9.0–10.0), p = 0.002) in non-survivors compared with survivors. A low viral load during the first days on ECMO tended to indicate worse outcomes. Seroconversion against SARS-CoV-2 occurred in all patients but did not affect outcome. Conclusions: VV ECMO support for COVID-19-induced ARDS is justified if initiated early and at an experienced ECMO center. Prolonged ECMO therapy might be required in these patients. Although no relevant predictive factors for the duration of ECMO support were found, the decision to stop therapy should not be made dependent on the length of ECMO treatment.


2021 ◽  
Vol 28 (Supplement_1) ◽  
Author(s):  
TE Graca Rodrigues ◽  
N Cunha ◽  
P Silverio-Antonio ◽  
P Couto Pereira ◽  
B Valente Silva ◽  
...  

Abstract Funding Acknowledgements Type of funding sources: None. Introduction There is some evidence suggesting that an exaggerated hypertensive response to exercise (HRE) may be associated with a higher risk of future cardiovascular events; however, the relationship between systolic blood pressure (SBP) during exercise testing and stroke is not fully understood. Purpose To evaluate the ability to predict the risk of stroke in patients with an HRE on exercise testing. Methods Single-center retrospective study of consecutive patients who underwent exercise testing from 2012 to 2015 and showed an HRE. HRE was defined as a peak systolic blood pressure (PSBP) > 210 mmHg in men and > 190 mmHg in women, a rise in SBP of 60 mmHg in men or 50 mmHg in women, a diastolic blood pressure > 90 mmHg, or a rise in diastolic blood pressure of 10 mmHg. Patients' demographics, baseline clinical characteristics, vital signs during the stress test and the occurrence of stroke during follow-up were analysed. Results We included 458 patients with HRE (76% men, 57.5 ± 10.83 years). The most frequent comorbidities were hypertension (83%), dyslipidaemia (61%), previously known coronary disease (32%), diabetes (28%) and smoking (38%). Atrial fibrillation was present in 5.9% of patients. During a mean follow-up of 60 ± 2 months, the incidence of stroke was 2.1% (n = 8), all of ischemic origin. Among the parameters analysed on exercise testing, only PSBP was an independent predictor of stroke (HR 1.042, 95% CI 1.002-1.084, p = 0.039), with moderate ability to predict stroke (AUC 0.735, p = 0.0016) and a most discriminatory value of 203 mmHg (sensitivity 56%, specificity 67%). Regarding baseline characteristics, after adjustment for age, sex and comorbidities, previously controlled hypertension was found to be an independent protective factor against stroke (OR 4.247, 95% CI 0.05-0.9, p = 0.036) and atrial fibrillation was an independent predictor of stroke occurrence (HR 8.1, 95% CI 1.4-46.9, p = 0.018). Atrial fibrillation was also associated with hospitalization for cardiovascular causes and with the occurrence of major cardiovascular events (mortality, coronary syndrome and stroke). Baseline SBP was associated with the development of atrial fibrillation (p = 0.008). Conclusion According to our results, PSBP during exercise testing is an independent predictor of stroke occurrence and should be considered a potential additional tool to predict stroke, particularly in high-risk patients. The identification of diagnosed hypertension as a protective factor against stroke may be explained by the cardioprotective effect of antihypertensive drugs.
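The HRE definition quoted in the Methods can be encoded directly; the sketch below is illustrative only (the parameter names are invented, and whether the rise thresholds are inclusive is an assumption).

```python
# Checks the study's stated HRE criteria: peak SBP > 210/190 mmHg (men/women),
# an SBP rise of 60/50 mmHg (men/women), DBP > 90 mmHg, or a DBP rise of 10 mmHg.
def is_hre(sex, peak_sbp, sbp_rise, peak_dbp, dbp_rise):
    """sex: 'M' or 'F'; all pressures in mmHg."""
    sbp_peak_limit = 210 if sex == "M" else 190
    sbp_rise_limit = 60 if sex == "M" else 50
    return (peak_sbp > sbp_peak_limit
            or sbp_rise >= sbp_rise_limit
            or peak_dbp > 90
            or dbp_rise >= 10)

print(is_hre("M", peak_sbp=215, sbp_rise=40, peak_dbp=85, dbp_rise=5))   # True
print(is_hre("F", peak_sbp=180, sbp_rise=30, peak_dbp=80, dbp_rise=5))   # False
```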


2021 ◽  
Vol 12 ◽  
Author(s):  
Ilaria Righi ◽  
Valentina Vaira ◽  
Letizia Corinna Morlacchi ◽  
Giorgio Alberto Croci ◽  
Valeria Rossetti ◽  
...  

Chronic lung allograft dysfunction (CLAD) is the main cause of poor survival and low quality of life in lung transplant recipients. Several studies have addressed the role of dendritic cells, macrophages, T cells, donor-specific and anti-HLA antibodies, and interleukins in CLAD, but the expression and function of immune checkpoint molecules have not yet been analyzed, especially in the two CLAD subtypes: BOS (bronchiolitis obliterans syndrome) and RAS (restrictive allograft syndrome). To shed light on this topic, we conducted an observational study of eight consecutive grafts explanted from patients who underwent lung re-transplantation for CLAD. The expression of a panel of immune molecules (PD1/CD279, PDL1/CD274, CTLA4/CD152, CD4, CD8, hFoxp3, TIGIT, TOX, B-Cell-Specific Activator Protein) was analyzed by immunohistochemistry in these grafts and in six control lungs. Results showed that RAS grafts, compared with BOS grafts, were characterized by 1) an inversion of the CD4/CD8 ratio; 2) a higher percentage of T lymphocytes expressing the PD-1, PD-L1, and CTLA4 checkpoint molecules; and 3) a significant reduction of exhausted PD-1-expressing T lymphocytes (PD-1pos/TOXpos) and of exhausted Treg (PD-1pos/FOXP3pos) T lymphocytes. These results, although based on a limited number of cases, suggest a role for checkpoint molecules in the development of graft rejection and offer a possible immunological explanation for the worse prognosis of RAS. Our data, which will need to be validated in larger cohorts of patients, raise the possibility that the evaluation of immune checkpoints during follow-up offers a prognostic advantage in monitoring the onset of rejection, and suggest that compounds that modulate the function of checkpoint molecules could be evaluated in the management of chronic rejection in LTx patients.


2020 ◽  
Vol 4 (2) ◽  
pp. 211-220
Author(s):  
Legiman

This research aims to determine the role of the head of the Madrasah in improving the quality of education at MTs Negeri 4 Kulon Progo, and to help improve the competency of teachers in learning activities through a concrete, measurable, controlled and well-directed program for achieving the learning objectives. The research uses a qualitative descriptive method, with data collected through observation, interviews and documentation. The results of this research are: 1) the role of the head of MTs Negeri 4 Kulon Progo covers that of educator, manager, administrator, supervisor, leader, innovator, and motivator; 2) quality improvement is pursued by increasing the professionalism of teachers and by giving teachers opportunities to develop; 3) factors affecting the head of the Madrasah's work include planning, implementation and follow-up carried out continuously with the teachers and education personnel.


Author(s):  
S Sunil Kumar ◽  
Oliver Joel Gona ◽  
Nagaraj Desai ◽  
B Shyam Prasad Shetty ◽  
KS Poornima ◽  
...  

Introduction: Vitamin K Antagonists (VKAs) have been in use for more than 50 years. They remain a mainstay therapy for the prevention of thromboembolic events in atrial fibrillation, mechanical heart valves and venous thromboembolism. Despite many years of clinical experience with VKAs, the quality of anticoagulation achieved in routine clinical practice is suboptimal. Aim: To study the effects of structured Anticoagulation Clinic (ACC) interventions on patient-centred outcomes in subjects taking VKAs. Materials and Methods: A retrospective study was conducted among patients taking VKAs enrolled in the ACC. A total of 169 patients who had received VKAs for at least six months, had 4 INR (International Normalised Ratio) values and had completed 12 months of follow-up were analysed. Anticoagulation-related quality measures such as Time in the Therapeutic Range (TTR) and Percentage of International Normalised Ratios in the Therapeutic Range (PINRR), and clinical outcomes such as stroke, systemic embolic events and bleeding, were analysed at the time of enrolment and compared with those during ACC care. Results: Among 352 patients enrolled in the ACC, 169 patients were eligible for analysis. The mean age of the study population was 55.62±13.77 years. Atrial fibrillation (59%) was the most common indication for VKA therapy. Hypertension (66.3%) was the most common co-morbidity. Mean TTR was significantly higher during ACC care than during pre-ACC care at 12 months of follow-up (77.58±8.85% vs 51.01±16.7%, p<0.0001). There was a significant improvement in TTR as early as three months after ACC intervention (73.18±13.56%). At the time of enrolment, 21.9% of patients had individual TTRs (i-TTR) >70%, which increased to 70.4% at 12 months of follow-up. INR testing was done more frequently during ACC care. Adverse clinical events were more frequent during pre-ACC care than during ACC care (4.7% vs 2.4%, p>0.05). Major bleeding and thromboembolic events were also more frequent during pre-ACC care than during ACC care (1.8% vs. 0.6% and 2.4% vs. 0.6%, respectively). Conclusion: ACC services help achieve better anticoagulation control as measured by time in therapeutic range, translating into better clinical outcomes.
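Of the two quality measures named above, PINRR is the simpler: the fraction of recorded INR values lying inside the therapeutic range. The sketch below assumes a 2.0-3.0 target range and invented INR series; it is illustrative, not the study's code.

```python
# Percentage of INRs in the therapeutic range (PINRR); target range assumed.
def pinrr(inr_values, low=2.0, high=3.0):
    in_range = sum(low <= v <= high for v in inr_values)
    return 100.0 * in_range / len(inr_values)

pre_acc = [1.6, 2.2, 3.5, 1.9, 2.8]      # hypothetical INRs before ACC enrolment
during_acc = [2.3, 2.6, 2.1, 2.9, 3.0]   # hypothetical INRs under ACC care
print(f"PINRR pre-ACC: {pinrr(pre_acc):.0f}%, during ACC: {pinrr(during_acc):.0f}%")
```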


2021 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Rachel M. Lofthouse ◽  
Anthea Rose ◽  
Ruth Whiteside

Purpose The research demonstrates the role of activity systems based in Cultural Historical Activity Theory as a means of analysing the characteristics and efficacy of specific provisions of coaching in education. Design/methodology/approach Three examples of coaching in education were selected, involving 51 schools in England. The three examples were re-analysed using activity systems. This drew on existing evaluation evidence, gathered through interviews, questionnaires, focus groups and recordings of coaching. Findings In each example, the object of the coaching was to address a specific challenge to secure the desired quality of education. Using activity systems it is possible to demonstrate that coaching has a range of functions (both intended and consequential). The individual examples illustrate the potential of coaching to support change in complex and diverse education settings. Research limitations/implications The use of existing data from evaluations means that direct comparisons between examples are not made. While data were collected throughout the duration of each coaching programme, no follow-up data were available. Practical implications The analysis of the examples of coaching using activity systems provides evidence of the efficacy of specific coaching provision in achieving individually defined objectives related to sustaining and improving specific educational practices. Originality/value The research offers insights into how coaching in education might be better tuned to the specific needs of contexts and the challenges experienced by the individuals working in them. In addition, it demonstrates the value of activity systems as an analytical tool to make sense of coaching efficacy.

