Abstract 114: Relationship Between Duration of QRS Complex and Long Term Mortality in U.S. Veterans With Atherothrombotic Risk Factors

Author(s):  
Amish Patel ◽  
Jonathan Pollock ◽  
Edward Sam Roberto ◽  
Thein Tun Aung ◽  
Ronald Markert ◽  
...  

Background: The 12-lead electrocardiogram (ECG) remains a cost-effective diagnostic tool in risk stratification for cardiovascular disease. Little is known about the prognostic value of QRS duration, but recent reports suggest that a prolonged QRS duration may be associated with adverse outcomes. We investigated the relationship between QRS duration and long-term mortality in Veterans with atherothrombotic risk factors. Methods: We retrospectively collected data from a Veterans Affairs (VA) medical center for consecutive patients (October 2001 to January 2005) to determine the long-term mortality rates associated with different intervals of QRS duration in patients who presented for coronary angiography. Results: Of the 1193 charts reviewed, 1186 had a QRS duration recorded. For these 1186 patients the mean follow-up period was 103±52 months. Mean age was 63.2±10.8 years, and 98% were male. Mean body mass index was 30.0±5.9. The prevalence of comorbidities was: hypertension (88%), hyperlipidemia (79%), obstructive coronary artery disease (73%), left ventricular hypertrophy (50.4%), diabetes mellitus (45%), peripheral vascular disease (17%), and cerebrovascular accident (8%). Mean left ventricular ejection fraction (LVEF) was 47±13%, and mean PR interval was 172.5±30.5 milliseconds (ms). Most patients were on a beta-blocker (82%). Among patients with bundle branch block (BBB), left BBB was present in 4.6% and right BBB in 6.9%. Mean QRS duration was 102.2±23.6 ms. As QRS duration increased in 10-ms intervals, the mortality rate increased [≤100 ms: 40.7%, 101 to 110 ms: 51.3%, 111 to 120 ms: 66.3%, >120 ms: 71.2%; p<0.001]. Among patients with QRS duration >120 ms, mortality was higher in those >150 ms than in those 121 to 150 ms (79.7% vs. 65.7%, p=0.045). While QRS duration was a significant univariate predictor of mortality, it was not significant when adjusted for the 10 covariates listed above (odds ratio = 1.00 [95% CI = 0.98 to 1.01], p = 0.72). Conclusion: Long-term mortality was higher as QRS duration increased. QRS duration had utility in predicting mortality within this cohort of US Veterans with atherothrombotic risk factors.
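For illustration, a minimal Python sketch of the QRS-interval grouping described above (the column names qrs_ms and died are assumptions, not the authors' data dictionary):

import pandas as pd

def qrs_group(qrs_ms: float) -> str:
    # Bin a QRS duration (ms) into the intervals used in the abstract.
    if qrs_ms <= 100:
        return "<=100 ms"
    if qrs_ms <= 110:
        return "101-110 ms"
    if qrs_ms <= 120:
        return "111-120 ms"
    return ">120 ms"

def mortality_by_qrs(df: pd.DataFrame) -> pd.Series:
    # df is assumed to carry 'qrs_ms' and 'died' (0/1) columns.
    groups = df["qrs_ms"].apply(qrs_group)
    return df.groupby(groups)["died"].mean() * 100  # crude mortality rate (%) per bin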

EP Europace ◽  
2019 ◽  
Vol 21 (7) ◽  
pp. 1070-1078 ◽  
Author(s):  
Wagner L Gali ◽  
Alvaro V Sarabanda ◽  
José M Baggio ◽  
Eduardo F Silva ◽  
Gustavo G Gomes ◽  
...  

Aims Data on long-term follow-up of patients with Chagas’ heart disease (ChHD) receiving a secondary prevention implantable cardioverter-defibrillator (ICD) are limited, and its benefit is controversial. The aim of this study was to evaluate the long-term outcomes of ChHD patients who received a secondary prevention ICD. Methods and results We assessed the outcomes of consecutive ChHD patients referred to our institution from 2006 to 2014 for a secondary prevention ICD [89 patients; 58 men; mean age 56 ± 11 years; left ventricular ejection fraction (LVEF), 42 ± 12%]. The primary outcome was a composite of death from any cause or heart transplantation. After a mean follow-up of 59 ± 27 months, the primary outcome occurred in 23 patients (5.3% per year). Multivariate analysis showed that LVEF < 35% [hazard ratio (HR) 4.64; P < 0.01] and age ≥ 65 years (HR 3.19; P < 0.01) were independent predictors of the primary outcome. Using these two risk factors, a risk score was developed, and lower- (no risk factors), intermediate- (one risk factor), and higher-risk (two risk factors) groups were recognized, with annual rates of the primary outcome of 1.4%, 7.4%, and 20.4%, respectively. A high burden of appropriate ICD therapies (16% per year) and electrical storms was documented; however, ICD interventions did not affect the primary outcome. Conclusion Among ChHD patients receiving a secondary prevention ICD, older age (≥65 years) and left ventricular dysfunction (LVEF < 35%) portended a poor outcome and were associated with an increased risk of death or heart transplantation. Most patients received appropriate ICD therapies; however, ICD interventions did not affect the primary outcome.
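A minimal Python sketch of the two-factor risk score described above (function and variable names are assumptions; the event rates are those reported in the abstract):

def chhd_risk_group(age_years: float, lvef_percent: float) -> str:
    # One point each for age >= 65 years and LVEF < 35%, per the abstract.
    points = int(age_years >= 65) + int(lvef_percent < 35)
    return {0: "lower", 1: "intermediate", 2: "higher"}[points]

# Annual primary-outcome rates reported for each group (% per year).
ANNUAL_EVENT_RATE = {"lower": 1.4, "intermediate": 7.4, "higher": 20.4}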


2021 ◽  
Vol 39 (15_suppl) ◽  
pp. e24060-e24060
Author(s):  
Alma Farooque ◽  
Cibele Carroll ◽  
Amye Juliet Tevaarwerk ◽  
Priyanka Avinash Pophali

e24060 Background: Anthracyclines are known to cause long-term cardiotoxicity. There are no specific guidelines for cardiovascular (CV) screening and follow-up of adolescent and young adult (AYA) patients treated with anthracyclines. Pediatric guidelines focus on long-term imaging surveillance, while for adults, left ventricular ejection fraction (LVEF) assessment prior to anthracyclines is recommended. Multiple studies have demonstrated that LVEF assessment rarely impacts treatment decisions, especially in the absence of CV symptoms or risk factors, and that it adds unnecessary cost and delays treatment initiation. Our study aimed to determine pre-treatment LVEF assessment practices in AYA lymphoma patients treated with anthracyclines and their association with long-term cardiotoxicity. Methods: AYA survivors diagnosed with lymphoma >5 years ago and treated with anthracyclines at age 15-39 years were identified in a retrospective single-institution registry. To ensure adequate follow-up, at least 2 follow-up visits during 2015-19 were required. Data abstracted on eligible subjects included documentation of pre-treatment LVEF evaluation, clinical rationale, and treatment regimen. CV risk factors and events were collected pre-treatment and during follow-up. Descriptive statistics were used to summarize data. Results: 64/115 (56%) of AYA lymphoma patients underwent pre-treatment LVEF assessment. Rationale for or against LVEF assessment was rarely documented: low CV risk was recorded as the rationale for no LVEF assessment in 2 subjects. Among AYAs who underwent pre-treatment LVEF assessment, no significant abnormalities were detected and no changes in subsequent treatment plans were found. During a median follow-up of 6.7 (interquartile range 5.4-9.5) years, 6/115 (5%) experienced CV events. Only 2 (1.7%) survivors experienced potential anthracycline-related CV events: 1 moderate cardiomyopathy at 9 years, and 1 peri-partum cardiomyopathy and atrial fibrillation due to post-radiation SVC occlusion at 15 years post-treatment. Both of these AYAs (aged 38 and 31 years at the time of CV events) also had other CV risk factors: family history, smoking, obesity, and hyperlipidemia. Four (3.5%) survivors experienced CV events (1 sinus tachyarrhythmia, 1 junctional rhythm, 2 acute/asymptomatic drops in LVEF) unrelated to anthracyclines with a clear alternative etiology, e.g. sepsis/symptom burden. There was no correlation between having a pre-treatment LVEF assessment and the occurrence of CV events. 13/115 (11.3%) developed new CV risk factors: 4 hypertension, 6 hyperlipidemia, 3 diabetes. Conclusions: Pre-treatment LVEF assessment is done inconsistently in AYA lymphoma patients but does not impact initial treatment or predict late cardiotoxicity. CV events in long-term AYA lymphoma survivors are rare, but evaluation of CV risk factors, early detection, and management may be more important than focusing on LVEF assessment.


Angiology ◽  
2020 ◽  
Vol 71 (10) ◽  
pp. 903-908
Author(s):  
Nihat Polat ◽  
Mustafa Oylumlu ◽  
Mehmet Ali Işik ◽  
Bayram Arslan ◽  
Mehmet Özbek ◽  
...  

In patients with unstable angina pectoris (UAP) or non-ST elevation myocardial infarction (NSTEMI), long-term mortality remains high despite improvements in diagnosis and treatment. In this study, we investigated whether serum albumin level is a useful predictor of long-term mortality in patients with UAP/NSTEMI. Consecutive patients (n = 403) who were hospitalized with a diagnosis of UAP/NSTEMI were included in the study. Patients were divided into 2 groups based on the presence of hypoalbuminemia, and the relationship between hypoalbuminemia and mortality was analyzed. Hypoalbuminemia was detected in 34% of the patients. The median follow-up period was 35 months (up to 45 months). The long-term mortality rate was 32% in the hypoalbuminemia group and 8.6% in the group with normal serum albumin levels (P < .001). On multivariate analysis, hypoalbuminemia, decreased left ventricular ejection fraction, and increased age were found to be independent predictors of mortality (P < .05). A cutoff value of 3.10 g/dL for serum albumin predicted mortality with a sensitivity of 74% and a specificity of 67% (receiver operating characteristic area under the curve: 0.753, 95% CI: 0.685-0.822). All-cause long-term mortality rates were significantly increased in patients with hypoalbuminemia. On-admission albumin level was an independent predictor of mortality in patients with UAP/NSTEMI.
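As an illustration of how a single albumin cutoff yields the reported sensitivity and specificity, a hedged Python sketch (column names are hypothetical; this is not the authors' analysis code):

import pandas as pd

def albumin_cutoff_performance(df: pd.DataFrame, cutoff_g_dl: float = 3.10):
    # df is assumed to carry 'albumin_g_dl' (on-admission albumin) and 'died' (0/1) columns.
    predicted_high_risk = df["albumin_g_dl"] < cutoff_g_dl
    sensitivity = predicted_high_risk[df["died"] == 1].mean()     # share of deaths flagged as high risk
    specificity = (~predicted_high_risk)[df["died"] == 0].mean()  # share of survivors not flagged
    return sensitivity, specificity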


QJM ◽  
2019 ◽  
Vol 112 (12) ◽  
pp. 900-906 ◽  
Author(s):  
X -B Wei ◽  
Z -D Su ◽  
Y -H Liu ◽  
Y Wang ◽  
J -L Huang ◽  
...  

Summary Background Older age, renal dysfunction, and low left ventricular ejection fraction are accepted predictors of poor outcome in patients with infective endocarditis (IE). This study aimed to investigate the prognostic significance of the age, creatinine and ejection fraction (ACEF) score in IE. Methods The study involved 1019 IE patients, who were classified into three groups according to the tertiles of the ACEF score: low ACEF (<0.6, n = 379), medium ACEF (0.6–0.8, n = 259), and high ACEF (>0.8, n = 381). The ACEF score was calculated as follows: age (years)/ejection fraction (%) + 1 (if the serum creatinine value was >2 mg/dL). The relationship between the ACEF score and adverse events was analyzed. Results In-hospital mortality was 8.2% and increased with increasing ACEF score (4.2% vs. 5.0% vs. 14.4% for the low-, medium- and high-ACEF groups, respectively; P < 0.001). The ACEF score had good discriminative ability for predicting in-hospital death [area under the curve (AUC) 0.706, P < 0.001]. The ACEF score predicted in-hospital death significantly better in surgically treated than in conservatively treated patients (AUC 0.812 vs. 0.625; P = 0.001). Multivariable analysis revealed that the ACEF score was independently associated with in-hospital mortality (adjusted odds ratio, 2.82; P < 0.001) and long-term mortality (adjusted hazard ratio, 2.51; P < 0.001). Conclusion The ACEF score was an independent predictor of in-hospital and long-term mortality in IE patients and could be considered a useful tool for risk stratification. The ACEF score was more suitable for assessing the risk of in-hospital mortality in surgical patients.
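A minimal Python sketch of the ACEF calculation and tertile grouping as stated in the abstract (function and variable names are assumptions):

def acef_score(age_years: float, lvef_percent: float, creatinine_mg_dl: float) -> float:
    # ACEF = age (years) / ejection fraction (%), plus 1 if serum creatinine > 2 mg/dL.
    return age_years / lvef_percent + (1.0 if creatinine_mg_dl > 2.0 else 0.0)

def acef_group(score: float) -> str:
    # Tertile-based groups used in the study.
    if score < 0.6:
        return "low"
    if score <= 0.8:
        return "medium"
    return "high"

# Example: a 70-year-old with LVEF 50% and creatinine 2.4 mg/dL scores 70/50 + 1 = 2.4 ("high").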


2020 ◽  
Vol 41 (Supplement_2) ◽  
Author(s):  
G Bugani ◽  
E Tonet ◽  
R Pavasini ◽  
M Serenelli ◽  
D Mele ◽  
...  

Abstract Background The number of older patients presenting with acute coronary syndrome (ACS) is increasing. Routine percutaneous coronary intervention (PCI) is performed to improve outcome, but comorbidities associated with aging lead to a higher risk of treatment complications. Contrast-induced acute kidney injury (CI-AKI) represents a potential harm in older and frail patients, but its impact on long-term prognosis is not clear. Purpose To evaluate the occurrence, predictors, and impact on long-term outcome of CI-AKI in elderly patients presenting with ACS. Methods A prospective cohort of 392 older (≥70 years) ACS patients who underwent coronary angiography was enrolled. CI-AKI was defined as an increase in serum creatinine of ≥0.3 mg/dL within 48 h or ≥50% within 7 days. According to our department protocol, prophylactic hydration with isotonic saline was administered to all patients, given intravenously at a rate of 1 mL/kg body weight/h (0.5 mL/kg for patients with left ventricular ejection fraction <35%) for 12 h before (except for emergent patients) and 24 h after PCI. Median follow-up was 4 [3.0–4.1] years. Long-term adverse outcomes included all-cause mortality and any hospitalization for cardiovascular causes (ACS, heart failure, arrhythmia, cerebrovascular accident). Results CI-AKI was observed in 72 patients (18.4%). Between patients who did and did not develop CI-AKI, no difference was found in clinical presentation (non-ST-segment elevation myocardial infarction (NSTEMI) vs. STEMI), left ventricular ejection fraction, or multivessel coronary disease. Estimated glomerular filtration rate (odds ratio (OR) 3.59, confidence interval (CI) 1.79–7.20, p<0.001), contrast media volume (OR 1.006, CI 1.002–1.009, p=0.001), white blood cells (OR 1.18, CI 1.10–1.27, p<0.001), haemoglobin level (OR 0.81, CI 0.70–0.94, p=0.005) and chronic obstructive pulmonary disease (OR 5.37, CI 2.24–12.90, p<0.001) were independent predictors of CI-AKI. Patients with CI-AKI had an increased mortality rate both at 30 days (2.7% vs. 0%, p=0.038) and at 4-year follow-up (all-cause death 23.6% vs. 11.6%, p=0.013) (Figure 1: long-term adverse outcomes). Multivariable Cox proportional hazards analysis revealed that diabetes (hazard ratio (HR) 1.99, CI 1.33–2.97, p=0.001), atrial fibrillation (HR 2.49, CI 1.59–3.91, p<0.001), Killip class >1 (HR 2.20, CI 1.32–3.67, p=0.003) and haemoglobin level (HR 0.84, CI 0.76–0.92, p<0.001) were independently associated with adverse outcome, while CI-AKI was a risk factor only on univariate analysis. Conclusions CI-AKI is a common complication among older adults undergoing coronary angiography for ACS. Patients who developed CI-AKI had a worse outcome at long-term follow-up. However, the occurrence of CI-AKI was not identified as an independent predictor of long-term adverse outcome; it may represent a marker of severe comorbidity and consequent poor prognosis rather than a causal agent itself. Figure 1. Kaplan-Meier curve Funding Acknowledgement Type of funding source: None
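A minimal Python sketch of the CI-AKI criterion and the prophylactic hydration rate stated in the abstract (function and variable names are assumptions, not the study protocol code):

def is_ci_aki(baseline_cr: float, cr_within_48h: float, cr_within_7d: float) -> bool:
    # CI-AKI: serum creatinine rise >= 0.3 mg/dL within 48 h, or >= 50% within 7 days.
    return (cr_within_48h - baseline_cr >= 0.3) or (cr_within_7d >= 1.5 * baseline_cr)

def hydration_rate_ml_per_h(weight_kg: float, lvef_percent: float) -> float:
    # Prophylactic isotonic saline: 1 mL/kg/h, halved when LVEF < 35%.
    return weight_kg * (0.5 if lvef_percent < 35 else 1.0)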

