Automated Electrocardiogram Analysis Identifies Novel Predictors of Ventricular Arrhythmias in Brugada Syndrome

2021 ◽  
Vol 7 ◽  
Author(s):  
Gary Tse ◽  
Sharen Lee ◽  
Andrew Li ◽  
Dong Chang ◽  
Guangping Li ◽  
...  

Background: Patients suffering from Brugada syndrome (BrS) are at an increased risk of life-threatening ventricular arrhythmias. Whilst electrocardiographic (ECG) variables have been used for risk stratification with varying degrees of success, automated measurements have not been tested for their ability to predict adverse outcomes in BrS. Methods: BrS patients presenting in a single tertiary center between 2000 and 2018 were analyzed retrospectively. ECG variables on vector magnitude, axis, amplitude and duration from all 12 leads were determined. The primary endpoint was spontaneous ventricular tachycardia/ventricular fibrillation (VT/VF) on follow-up. Results: This study included 83 patients [93% male, median presenting age: 56 (41–66) years old, 45% type 1 pattern], with 12 developing the primary endpoint (median follow-up: 75 (Q1–Q3: 26–114) months). Cox regression showed that QRS frontal axis > 70.0 degrees, QRS horizontal axis > 57.5 degrees, R-wave amplitude (lead I) < 0.67 mV, R-wave duration (lead III) > 50.0 ms, S-wave amplitude (lead I) < −0.144 mV, S-wave duration (lead aVL) > 35.5 ms, QRS duration (lead V3) > 96.5 ms, QRS area in lead I < 0.75 Ashman units, ST slope (lead I) > 31.5 degrees, T-wave area (lead V1) < −3.05 Ashman units and PR interval (lead V2) > 157 ms were significant predictors. A weighted score based on dichotomized values provided good predictive performance (hazard ratio: 1.59, 95% confidence interval: 1.27–2.00, P-value < 0.0001, area under the curve: 0.84). Conclusions: Automated ECG analysis revealed novel risk markers in BrS. These markers should be validated in larger prospective studies.
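For readers who want to reproduce this kind of analysis, the sketch below shows how dichotomized automated ECG variables can be combined into a simple risk score and evaluated with Cox regression and an ROC curve in Python (lifelines and scikit-learn). The file name, column names, cut-off subset and unweighted scoring are illustrative assumptions, not the authors' actual pipeline.

import pandas as pd
from lifelines import CoxPHFitter
from sklearn.metrics import roc_auc_score

# Hypothetical data: one row per patient with automated ECG measurements,
# follow-up time in months and a VT/VF event indicator.
df = pd.read_csv("brs_ecg_measurements.csv")  # assumed file, not from the paper

# Dichotomize a few ECG variables at illustrative cut-offs (direction, value).
cutoffs = {
    "qrs_frontal_axis": (">", 70.0),   # degrees
    "r_amp_lead_I":     ("<", 0.67),   # mV
    "qrs_dur_V3":       (">", 96.5),   # ms
}
for col, (direction, value) in cutoffs.items():
    df[col + "_flag"] = (df[col] > value) if direction == ">" else (df[col] < value)

flag_cols = [c for c in df.columns if c.endswith("_flag")]
# Unweighted sum of risk flags; the published score weighted each component.
df["risk_score"] = df[flag_cols].sum(axis=1)

# Cox regression of VT/VF on the score.
cph = CoxPHFitter()
cph.fit(df[["followup_months", "vt_vf_event", "risk_score"]],
        duration_col="followup_months", event_col="vt_vf_event")
cph.print_summary()

# Crude discrimination of the score for the event (ignores censoring times;
# a concordance index would be the time-to-event analogue of the reported AUC).
print("AUC:", roc_auc_score(df["vt_vf_event"], df["risk_score"]))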

2019 ◽  
Vol 40 (Supplement_1) ◽  
Author(s):  
V Probst ◽  
S Anys ◽  
F Sacher ◽  
J Briand ◽  
B Guyomarch ◽  
...  

Abstract Introduction Brugada syndrome (BrS) is an inherited arrhythmia syndrome with an increased risk of sudden cardiac death (SCD) despite a structurally normal heart. Many parameters have been suggested to be associated with the risk of ventricular arrhythmias, but only previous symptoms and a spontaneous type 1 ECG pattern have been consistently associated with the risk of ventricular arrhythmia occurrence. Objective The aim of this study was to evaluate the association of these parameters with arrhythmic events in the largest cohort of BrS patients ever described. Methods Consecutive patients affected with BrS were recruited in a multicentric prospective registry in France (15 centers) between 1994 and 2016. Data were prospectively collected with an average follow-up of 6.5±4.7 years. ECGs were reviewed by 2 physicians blinded to clinical status. Results In this study, we enrolled a total of 1613 patients (mean age 45±15 years; 1119 males, 69%). At baseline, 462 patients (29%) were symptomatic (51 (3%) aborted SCD, 257 (16%) syncope). A spontaneous type 1 ECG pattern was present in 505 patients (31%). An implantable cardioverter defibrillator (ICD) was implanted in 477 patients (30%). During the follow-up, 91 patients (6%) experienced arrhythmic events (16 SCD (10%), 48 appropriate ICD therapies (3%) and 27 ventricular arrhythmias (2%)). Thirty-six patients (2%) died of non-arrhythmic causes. Mean age at the first event was 44±15 years. In our cohort, event predictors were SCD (HR: 18.3; 95% CI: 11.2–29.8; p<0.0001), syncope (HR: 2.9; 95% CI: 1.8–4.9; p<0.0001), age >60 years (HR: 0.11; 95% CI: 0.032–0.377; p=0.0004), gender (HR: 2.96; 95% CI: 1.6–5.4; p=0.0005), spontaneous type 1 (HR: 2.14; 95% CI: 1.42–3.23; p=0.0003), type 1 ST elevation in a peripheral ECG lead (HR: 3.6; 95% CI: 1.9–7.1; p=0.0001), fragmented QRS (HR: 3.37; 95% CI: 1.37–8.32; p=0.008), aVR sign (HR: 2.2; 95% CI: 1.4–3.8; p=0.0007), QRS >120 ms in lead D2 (HR: 2.2; 95% CI: 1.4–3.6; p=0.001) and QRS >90 ms in V6 (HR: 2.1; 95% CI: 1.3–3.3; p=0.001). All other parameters, including early repolarization pattern (ERP) and electrophysiological study (EPS), were not predictors of events. Conclusion In the largest cohort of BrS patients ever described, we confirmed that symptoms, age, gender, spontaneous type 1, type 1 ST elevation in a peripheral ECG lead, fragmented QRS, aVR sign, QRS >120 ms in D2 and QRS >90 ms in V6 are associated with arrhythmic events, whereas ERP and EPS were not.
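For context, the hazard ratios reported throughout these registry analyses come from the Cox proportional hazards model; in generic notation (not taken from the paper),

\[
h(t \mid \mathbf{x}) = h_0(t)\,\exp\!\left(\beta_1 x_1 + \cdots + \beta_p x_p\right),
\qquad
\mathrm{HR}_j = e^{\beta_j},
\]

where h_0(t) is the unspecified baseline hazard and the x_j are covariates such as spontaneous type 1 pattern or fragmented QRS. For a binary covariate, HR_j is the ratio of the hazard with the marker present to the hazard with it absent, assumed constant over follow-up (so, for example, HR 2.14 for spontaneous type 1 means a roughly two-fold instantaneous event rate at any given time).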


2020 ◽  
Vol 41 (Supplement_2) ◽  
Author(s):  
A Kartas ◽  
A Samaras ◽  
D Vasdeki ◽  
G Dividis ◽  
G Fotos ◽  
...  

Abstract Background The association of heart failure (HF) with the prognosis of atrial fibrillation (AF) remains unclear. Objectives To assess all-cause mortality in patients following hospitalization with comorbid AF in relation to the presence of HF. Methods We performed a cross-sectional analysis of data from 977 patients discharged from the cardiology ward of a single tertiary center between 2015 and 2018 and followed for a median of 2 years. The association between HF and the primary endpoint of death from any cause was assessed using multivariable Cox regression. Results HF was documented in 505 (51.7%) of AF cases at discharge, including HFrEF (17.9%), HFmrEF (16.5%) and HFpEF (25.2%). A primary endpoint event occurred in 212 patients (42%) in the AF-HF group and in 86 patients (18.2%) in the AF-no HF group (adjusted hazard ratio [aHR] 2.27; 95% confidence interval [CI], 1.65 to 3.13; P<0.001). HF was associated with a higher risk of the composite secondary endpoint of death from any cause or AF- or HF-specific hospitalization (aHR 1.69; 95% CI 1.32 to 2.16; p<0.001). The associations of HF with the primary and secondary endpoints were significant and similar for AF-HFrEF, AF-HFmrEF and AF-HFpEF. Conclusions HF was present in half of the patients discharged from the hospital with comorbid AF. The presence of HF on top of AF was independently associated with a significantly higher risk of all-cause mortality than absence of HF, irrespective of HF subtype. Funding Acknowledgement Type of funding source: None


2020 ◽  
Vol 41 (Supplement_2) ◽  
Author(s):  
E Havers-Borgersen ◽  
J.H Butt ◽  
M Groening ◽  
M Smerup ◽  
G.H Gislason ◽  
...  

Abstract Introduction Patients with tetralogy of Fallot (ToF) are considered at high risk of infective endocarditis (IE) as a result of altered hemodynamics and multiple surgical and interventional procedures including pulmonary valve replacement (PVR). The overall survival of patients with ToF has increased in recent years. However, data on the risk of adverse outcomes including IE are sparse. Purpose To investigate the risk of IE in patients with ToF compared with controls from the background population. Methods In this nationwide observational cohort study, all patients with ToF born in 1977–2017 were identified using Danish nationwide registries and followed from date of birth until occurrence of an outcome of interest (i.e. first-time IE), death, or end of study (July 31, 2017). The comparative risk of IE among ToF patients versus age- and sex-matched controls from the background population was assessed. Results A total of 1,156 patients with ToF were identified and matched with 4,624 controls from the background population. Among patients with ToF, 266 (23.0%) underwent PVR during follow-up. During a median follow-up time of 20.4 years, 38 (3.3%) patients and 1 (0.03%) control were admitted with IE. The median time from date of birth to IE was 10.8 years (25th–75th percentile 2.8–20.9 years). The incidence rates of IE per 1,000 person-years were 2.2 (95% confidence interval (CI) 1.6–3.0) and 0.01 (95% CI 0.0001–0.1) among patients and controls, respectively. In multivariable Cox regression models, in which age, sex, pulmonary valve replacement, and relevant comorbidities (i.e. chronic renal failure, diabetes mellitus, presence of cardiac implantable electronic devices, other valve surgeries) were included as time-varying covariates, the risk of IE was significantly higher among patients compared with controls (HR 171.5, 95% CI 23.2–1266.7). Moreover, PVR was associated with an increased risk of IE (HR 3.4, 95% CI 1.4–8.2). Conclusions Patients with ToF have a substantial risk of IE, and the risk is significantly higher compared with the background population. In particular, PVR was associated with an increased risk of IE. With an increasing life expectancy of these patients, intensified awareness, preventive measures, and surveillance of this patient group are advisable. Figure 1: Cumulative incidence of IE. Funding Acknowledgement Type of funding source: None
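Since the comparison here rests on incidence rates per 1,000 person-years, a minimal sketch of how such a rate and an approximate 95% CI can be computed is shown below; the person-time value is made up purely to be roughly consistent with 38 events and a rate near 2.2, and is not reported in the abstract.

import math

def incidence_rate_per_1000(events, person_years):
    # Crude rate with a normal-approximation 95% CI on the log scale (events > 0).
    rate = events / person_years * 1000
    se_log = 1 / math.sqrt(events)
    lower = rate * math.exp(-1.96 * se_log)
    upper = rate * math.exp(1.96 * se_log)
    return rate, lower, upper

# Hypothetical person-time; prints roughly (2.2, 1.6, 3.0) per 1,000 person-years.
print(incidence_rate_per_1000(38, 17_300))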


Antioxidants ◽  
2021 ◽  
Vol 10 (7) ◽  
pp. 1102
Author(s):  
Angelica Rodriguez-Niño ◽  
Diego O. Pastene ◽  
Adrian Post ◽  
M. Yusof Said ◽  
Antonio W. Gomes-Neto ◽  
...  

Carnosine affords protection against oxidative and carbonyl stress, yet high concentrations of the carnosinase-1 enzyme may limit this protection. We recently reported that high urinary carnosinase-1 is associated with kidney function decline and albuminuria in patients with chronic kidney disease. We prospectively investigated whether urinary carnosinase-1 is associated with a high risk for development of late graft failure in kidney transplant recipients (KTRs). Carnosine and carnosinase-1 were measured in 24 h urine in a longitudinal cohort of 703 stable KTRs and 257 healthy controls. Cox regression was used to analyze the prospective data. Urinary carnosine excretions were significantly decreased in KTRs (26.5 [IQR 21.4–33.3] µmol/24 h versus 34.8 [IQR 25.6–46.8] µmol/24 h; p < 0.001). In KTRs, high urinary carnosinase-1 concentrations were associated with an increased risk of undetectable urinary carnosine (OR 1.24, 95%CI [1.06–1.45]; p = 0.007). During a median follow-up of 5.3 [4.5–6.0] years, 84 (12%) KTRs developed graft failure. In Cox regression analyses, high urinary carnosinase-1 excretions were associated with an increased risk of graft failure (HR 1.73, 95%CI [1.44–2.08]; p < 0.001) independent of potential confounders. Since urinary carnosine is depleted and urinary carnosinase-1 imparts a higher risk for graft failure in KTRs, future studies determining the potential of carnosine supplementation in these patients are warranted.


Diabetologia ◽  
2021 ◽  
Author(s):  
Peter Ueda ◽  
Viktor Wintzell ◽  
Mads Melbye ◽  
Björn Eliasson ◽  
Ann-Marie Svensson ◽  
...  

Abstract Aims/hypothesis Concerns have been raised regarding a potential association of use of the incretin-based drugs dipeptidyl peptidase 4 (DPP4) inhibitors and glucagon-like peptide-1 (GLP-1)-receptor agonists with risk of cholangiocarcinoma. We examined this association in nationwide data from three countries. Methods We used data from nationwide registers in Sweden, Denmark and Norway, 2007–2018, to conduct two cohort studies, one for DPP4 inhibitors and one for GLP-1-receptor agonists, to investigate the risk of incident cholangiocarcinoma compared with an active-comparator drug class (sulfonylureas). The cohorts included patients initiating treatment episodes with DPP4 inhibitors vs sulfonylureas, and GLP-1-receptor agonists vs sulfonylureas. We used Cox regression models, adjusted for potential confounders, to estimate hazard ratios from day 366 after treatment initiation to account for cancer latency. Results The main analyses of DPP4 inhibitors included 1,414,144 person-years of follow-up from 222,577 patients receiving DPP4 inhibitors (median [IQR] follow-up time, 4.5 [2.6–7.0] years) and 123,908 patients receiving sulfonylureas (median [IQR] follow-up time, 5.1 [2.9–7.8] years), during which 350 cholangiocarcinoma events occurred. Use of DPP4 inhibitors, compared with sulfonylureas, was not associated with a statistically significant increase in risk of cholangiocarcinoma (incidence rate 26 vs 23 per 100,000 person-years; adjusted HR, 1.15 [95% CI 0.90, 1.46]; absolute rate difference 3 [95% CI -3, 10] events per 100,000 person-years). The main analyses of GLP-1-receptor agonists included 1,036,587 person-years of follow-up from 96,813 patients receiving GLP-1-receptor agonists (median [IQR] follow-up time, 4.4 [2.4–6.9] years) and 142,578 patients receiving sulfonylureas (median [IQR] follow-up time, 5.5 [3.2–8.1] years), during which 249 cholangiocarcinoma events occurred. Use of GLP-1-receptor agonists was not associated with a statistically significant increase in risk of cholangiocarcinoma (incidence rate 26 vs 23 per 100,000 person-years; adjusted HR, 1.25 [95% CI 0.89, 1.76]; absolute rate difference 3 [95% CI -5, 13] events per 100,000 person-years). Conclusions/interpretation In this analysis using nationwide data from three countries, use of DPP4 inhibitors and GLP-1-receptor agonists, compared with sulfonylureas, was not associated with a significantly increased risk of cholangiocarcinoma.
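One design detail worth making concrete is the latency handling: hazard ratios are estimated only from day 366 after treatment initiation. A minimal sketch of treating this as delayed entry in a Cox model with the lifelines package is shown below; the file and column names are hypothetical, covariates are reduced for brevity, and this is not the authors' actual code.

import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("incretin_cohort.csv")  # assumed analysis file

LAG_DAYS = 365
# Exclude follow-up ending within the latency window; remaining subjects
# enter the risk set at day 366 (left truncation / delayed entry).
df = df[df["followup_days"] > LAG_DAYS].copy()
df["entry_day"] = LAG_DAYS + 1

cph = CoxPHFitter()
cph.fit(df[["followup_days", "cholangiocarcinoma", "dpp4i_vs_su", "age", "sex", "entry_day"]],
        duration_col="followup_days",
        event_col="cholangiocarcinoma",
        entry_col="entry_day")   # follow-up contributes only from day 366 onwards
cph.print_summary()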


2020 ◽  
Vol 41 (Supplement_2) ◽  
Author(s):  
S.O Troebs ◽  
A Zitz ◽  
S Schwuchow-Thonke ◽  
A Schulz ◽  
M.W Heidorn ◽  
...  

Abstract Background Global longitudinal strain (GLS) has demonstrated a superior prognostic value over left ventricular ejection fraction (LVEF) in acute heart failure (HF). Its prognostic value across American Heart Association (AHA) stages of HF – especially when accounting for conventional echocardiographic measures of systolic and diastolic function – has not yet been comprehensively evaluated. Purpose To evaluate the prognostic value of GLS for HF-specific outcome across AHA HF stages A to D. Methods Data from the MyoVasc study (n=3,289) were analysed. Comprehensive clinical phenotyping was performed during a five-hour investigation in a dedicated study centre. GLS was measured offline utilizing QLab 9.0.1 (PHILIPS, Germany) in participants presenting with sinus rhythm during echocardiography. Worsening of HF (comprising transition from asymptomatic to symptomatic HF, HF hospitalization, and cardiac death) was assessed during a structured follow-up with subsequent validation and adjudication of endpoints. AHA stages were defined according to current guidelines. Results Complete information on GLS was available in 2,400 participants, of whom 2,186 categorized as AHA stage A to D were available for analysis. Overall, 434 individuals were classified as AHA stage A, 629 as stage B and 1,123 as stage C/D. Mean GLS increased across AHA stages of HF: it was lowest in stage A (−19.44±3.15%), −18.01±3.46% in stage B and highest in AHA stage C/D (−15.52±4.64%, P for trend <0.0001). During a follow-up period of 3.0 [1.3/4.0] years, GLS denoted an increased risk for worsening of HF after adjustment for age and sex (hazard ratio, HRGLS [per standard deviation (SD)] 1.97 [95% confidence interval 1.73/2.23], P<0.0001) in multivariable Cox regression analysis. After additional adjustment for cardiovascular risk factors, clinical profile, LVEF and E/E' ratio, GLS was the strongest echocardiographic predictor of worsening of HF (HRGLS [per SD] 1.47 [1.20/1.80], P=0.0002) in comparison to LVEF (HRLVEF [per SD] 1.23 [1.02/1.48], P=0.031) and E/E' ratio (HRE/E' [per SD] 1.12 [0.99/1.26], P=0.083). Interestingly, when stratifying for AHA stages, GLS denoted a similarly increased risk for worsening of HF in individuals classified as AHA stage A/B (HRGLS [per SD] 1.63 [1.02/2.61], P=0.039) and in those classified as AHA stage C/D (HRGLS [per SD] 1.95 [1.65/2.29], P<0.0001) after adjustment for age and sex. For further evaluation, Cox regression models with interaction analysis indicated no significant interaction for (i) AHA stage A/B vs C/D (P=0.83) and (ii) NYHA functional class <II vs ≥II in individuals classified as AHA stage C/D (P=0.12). Conclusions GLS demonstrated a higher predictive value for worsening of HF than conventional echocardiographic measures of systolic and diastolic function. Interestingly, GLS indicated an increased risk for worsening of HF across AHA stages, highlighting its potential value to advance risk prediction in chronic HF. Funding Acknowledgement Type of funding source: Public grant(s) – National budget only. Main funding source(s): German Center for Cardiovascular Research (DZHK), Center for Translational Vascular Biology (CTVB) of the University Medical Center of the Johannes Gutenberg-University Mainz
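Because the hazard ratios above are reported per standard deviation of the echocardiographic measure, the short sketch below shows one common way to obtain such an estimate: standardize the predictor before fitting the Cox model. File and column names are hypothetical, covariates are reduced to age and sex for brevity, and sex is assumed to be numerically coded.

import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("myovasc_echo.csv")  # assumed extract with GLS and the HF outcome

# Scale GLS to zero mean and unit SD so that exp(coef) is the HR per 1-SD change.
df["gls_per_sd"] = (df["gls"] - df["gls"].mean()) / df["gls"].std()

cph = CoxPHFitter()
cph.fit(df[["followup_years", "worsening_hf", "gls_per_sd", "age", "sex"]],
        duration_col="followup_years", event_col="worsening_hf")
print(cph.hazard_ratios_["gls_per_sd"])  # hazard ratio per SD of GLS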


Circulation ◽  
2021 ◽  
Vol 143 (Suppl_1) ◽  
Author(s):  
Jeffrey R Misialek ◽  
Elizabeth R Stremke ◽  
Elizabeth Selvin ◽  
Sanaz Sedaghat ◽  
James S Pankow ◽  
...  

Introduction: Diabetes is a major risk factor for cardiovascular disease. Osteocalcin is a vitamin K-dependent, bone-derived hormone that functions as an endocrine regulator of energy metabolism, male fertility, and cognition. Early studies of the endocrine effects of osteocalcin showed that genomic deletion of osteocalcin in mice resulted in a diabetic phenotype (i.e. glucose intolerance and insulin resistance). However, results from clinical studies have shown mixed associations between blood levels of osteocalcin and risk of incident type 2 diabetes mellitus. Hypothesis: Lower values of plasma osteocalcin would be associated with an increased risk of diabetes. Methods: A total of 11,557 ARIC participants without diabetes at baseline were followed from ARIC visit 3 (1993-1995) through 2018. Diabetes cases were identified through self-report on annual and semi-annual follow-up phone calls. Plasma osteocalcin was measured using an aptamer-based proteomic profiling platform (SomaLogic). We used Cox regression to evaluate the association of quintiles of plasma osteocalcin with incident diabetes. The primary model adjusted for age, sex, and race-center. Results: Participants were aged 60 ± 5.6 years at visit 3; 56% identified as female and 21% identified as Black. There were 3,031 incident diabetes cases over a median follow-up of 17.9 years. Mean ± SD plasma osteocalcin was 10.053 ± 0.775. When comparing the highest quintile of plasma osteocalcin (values 10.42 to 14.66) with the lowest quintile (values 9.03 to 9.52), there was no association with incident diabetes (HR [95% CI]: 0.92 [0.81, 1.02]). There was also no significant trend across the quintiles (p = 0.19). Results were similar when adjusting for additional potential confounders, and when limiting the follow-up time to 10 years. Conclusions: These data do not support the hypothesis that total plasma osteocalcin, as measured by the SomaLogic proteomic panel, is a biomarker associated with diabetes risk. It is possible that total plasma or serum osteocalcin and/or other isoforms of osteocalcin protein (i.e. gamma-carboxylated or uncarboxylated osteocalcin) measured via other validated methodologies may be linked to diabetes.
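A minimal sketch of the quintile approach described above, comparing the top with the bottom quintile of a biomarker in a Cox model, is shown below; the file and column names are hypothetical, covariates are reduced for brevity, and sex is assumed to be numerically coded.

import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("aric_osteocalcin.csv")  # assumed analysis extract

# Form biomarker quintiles (0 = lowest, 4 = highest) and use the lowest as reference.
df["oc_quintile"] = pd.qcut(df["osteocalcin"], 5, labels=False)
quintile_dummies = pd.get_dummies(df["oc_quintile"], prefix="q", drop_first=True).astype(float)

model_df = pd.concat(
    [df[["followup_years", "incident_diabetes", "age", "sex"]], quintile_dummies],
    axis=1,
)

cph = CoxPHFitter()
cph.fit(model_df, duration_col="followup_years", event_col="incident_diabetes")
cph.print_summary()  # the q_4 row is the HR for the highest vs the lowest quintile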


Author(s):  
Rongrong Wei ◽  
Xinyu Du ◽  
Jing Wang ◽  
Qi Wang ◽  
Xiaojie Zhu ◽  
...  

Introduction: The incidence and prognostic impact of subsequent primary gastric cancer (GC) in a population of other cancer survivors is unclear. We aimed to evaluate susceptibility to subsequent primary GC in cancer survivors and the prognosis of GC with a prior cancer history. Methods: 2,211 and 23,416 GC cases with and without a prior cancer history were retrospectively selected from the Surveillance, Epidemiology and End Results (SEER) database. The potential risk of developing subsequent primary GC was assessed through standardized incidence ratios (SIRs). Cox regression was used to analyze the influence of prior cancer history and clinical characteristics on the prognosis of subsequent primary GC. A nomogram was established to predict overall survival (OS). Propensity score matching (PSM) was conducted to eliminate possible bias. Results: Compared with the general population, cancer survivors had an increased risk of subsequent primary GC (SIR 1.17, 95% CI 1.15-1.20, P<0.05). Prior cancer history was related to poorer OS of GC [adjusted hazard ratio (aHR) 1.12, 95% CI 1.06-1.19, P<0.001], but not cancer-specific survival (aHR 0.97, 95% CI 0.89-1.05, P=0.441). In addition, age, grade, stage, year of diagnosis, surgery, TNM stage and tumor size were independent prognostic factors for OS in GC cases with prior cancers. The concordance index of the nomogram was 0.72 (95% CI 0.71-0.74), and calibration curves showed good agreement between prediction by the nomogram and actual observation. Conclusions: Cancer survivors are at increased risk of developing subsequent primary GC and should undergo strengthened monitoring and follow-up for subsequent primary gastric cancer.
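For clarity, the standardized incidence ratio used here compares the observed number of subsequent gastric cancers with the number expected from general-population rates; in generic notation (not the paper's),

\[
\mathrm{SIR} = \frac{O}{E}, \qquad E = \sum_{s} n_s \,\lambda_s ,
\]

where O is the observed count among cancer survivors, n_s is the person-time at risk in stratum s (age, sex, calendar period), and \lambda_s is the corresponding general-population incidence rate. An SIR of 1.17 therefore means about 17% more subsequent gastric cancers were observed than would be expected from population rates.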


2020 ◽  
Vol 35 (10) ◽  
pp. 949-960
Author(s):  
Else Toft Würtz ◽  
Johnni Hansen ◽  
Oluf Dimitri Røe ◽  
Øyvind Omland

Abstract Environmental asbestos exposure and occupational asbestos exposure increase the risk of several types of cancer, but the role of such exposures for haematological malignancies remains controversial. We aimed to examine the risk of haematological malignancies: first, in subjects exposed early in life, independently of any occupational exposure occurring later; second, in subjects exposed occupationally. We established an environmentally exposed cohort from four schools located near the only former asbestos cement production plant in Denmark. We identified nearly all pupils in the seventh grade and created an age- and sex-matched 1:9 reference cohort from the Danish Central Population Register. Participants were born 1940–1970 and followed up in national registers until the end of 2015. Occupational asbestos exposure was assessed for all participants using two different job exposure matrices. The school cohort included 12,111 participants (49.7% girls) and the reference cohort 108,987 participants. Eight subgroups of haematological malignancy were identified in the Danish Cancer Registry. These cases were analysed as combined overall haematological malignancy, a combined subgroup of lymphomas and a combined subgroup of leukaemias. The data were analysed using Cox regression (hazard ratios, HR), with other cancers and death treated as competing risks. Haematological malignancy was identified in 1125 participants. The median follow-up was 49.3 years (0.1–63.4). Early environmental asbestos exposure was not associated with an increased risk of haematological malignancy. Long-term occupational asbestos exposure was associated with overall haematological malignancy (HR 1.69, 95% CI 1.04–2.73), in particular for the leukaemia subgroup (HR 2.14, 95% CI 1.19–3.84). This large follow-up study suggests that long-term occupational asbestos exposure is associated with increased leukaemia risk. However, further studies are needed to confirm these observations.
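Because other cancers and death are treated as competing risks, a standard complement to the regression analysis is the cumulative incidence function. The sketch below uses the Aalen-Johansen estimator from the lifelines package for that purpose; it illustrates the competing-risks idea rather than reproducing the authors' Cox models, and the file and column names are hypothetical.

import pandas as pd
from lifelines import AalenJohansenFitter

df = pd.read_csv("asbestos_cohort.csv")  # assumed analysis file

# event_code: 0 = censored, 1 = haematological malignancy,
#             2 = competing event (other cancer or death)
ajf = AalenJohansenFitter()
ajf.fit(durations=df["followup_years"],
        event_observed=df["event_code"],
        event_of_interest=1)
print(ajf.cumulative_density_.tail())  # cumulative incidence of haematological malignancy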


2020 ◽  
Vol 8 (1) ◽  
pp. e001325 ◽  
Author(s):  
Ramachandran Rajalakshmi ◽  
Coimbatore Subramanian Shanthi Rani ◽  
Ulagamathesan Venkatesan ◽  
Ranjit Unnikrishnan ◽  
Ranjit Mohan Anjana ◽  
...  

Introduction Previous epidemiological studies have reported on the prevalence of diabetic kidney disease (DKD) and diabetic retinopathy (DR) from India. The aim of this study is to evaluate the effect of DKD on the development of new-onset DR and sight-threatening diabetic retinopathy (STDR) in Asian Indians with type 2 diabetes (T2D). Research design and methods The study was done on anonymized electronic medical record data of people with T2D who had undergone screening for DR and renal work-up as part of routine follow-up at a tertiary care diabetes center in Chennai, South India. The baseline data retrieved included clinical and biochemical parameters including renal profiles (serum creatinine, estimated glomerular filtration rate (eGFR) and albuminuria). Grading of DR was performed using the modified Early Treatment Diabetic Retinopathy Study grading system. STDR was defined as the presence of proliferative diabetic retinopathy (PDR) and/or diabetic macular edema. DKD was defined by the presence of albuminuria (≥30 µg/mg) and/or reduction in eGFR (<60 mL/min/1.73 m2). Cox regression analysis was used to evaluate the hazard ratio (HR) for DR and STDR. Results Data of 19 909 individuals with T2D (mean age 59.6±10.2 years, mean duration of diabetes 11.1±12.1 years, 66.1% male) were analyzed. At baseline, DR was present in 7818 individuals (39.3%), of whom 2249 (11.3%) had STDR. During the mean follow-up period of 3.9±1.9 years, 2140 (17.7%) developed new-onset DR and 980 individuals with non-proliferative DR (NPDR) at baseline progressed to STDR. Higher serum creatinine (HR 1.5, 95% CI 1.3 to 1.7; p<0.0001), eGFR <30 mL/min/1.73 m2 (HR 4.9, 95% CI 2.9 to 8.2; p<0.0001) and presence of macroalbuminuria >300 µg/mg (HR 3.0, 95% CI 2.4 to 3.8; p<0.0001) at baseline were associated with increased risk of progression to STDR. Conclusions DKD at baseline is a risk factor for progression to STDR. Physicians should promptly refer their patients with DKD to ophthalmologists for timely detection and management of STDR.

