Electrocardiographic parameters among beta-thalassemia major patients

2020, Vol 41 (Supplement_2). Author(s): D Patsourakos, C Aggeli, K Gatzoulis, S Delicou, Y Dimitroglou, et al.

Abstract Background/Introduction The majority of beta-thalassemia major (β-TM) patients suffer from cardiac disorders, and a significant proportion of them die suddenly. Twelve-lead and signal-averaged electrocardiography (SAECG) are simple, inexpensive, readily available tools for identifying an unfavorable arrhythmogenic substrate by detecting arrhythmias, conduction abnormalities and late potentials (LPs) in these patients. Methods 47 β-TM patients and 30 healthy controls underwent twelve-lead and signal-averaged electrocardiography. Basal rhythm, heart rate, PR interval duration, QRS complex duration and morphology, QTc interval duration and the prevalence of LPs were recorded. Results β-TM patients demonstrated a longer PR interval (167.74 msec vs. 147.07 msec, p=0.043), a higher prevalence of PR prolongation (21.05% vs. 0%, p=0.013) and a higher prevalence of LPs (18/47, 38.3% vs. 2/30, 6.7%, p=0.002) compared with controls. In particular, every SAECG parameter differed significantly between patients and controls. Among patients, left ventricular ejection fraction was marginally lower and QTc more prolonged in the LP-positive subgroup compared with the LP-negative subgroup. The prevalence of atrial fibrillation among β-TM patients was estimated at 10.64%. Conclusions β-TM patients have a higher prevalence of PR prolongation, atrial fibrillation and LPs. Twelve-lead ECG and SAECG were feasible in all subjects and constitute a readily available tool for assessing myocardial electrophysiological alterations in this patient group, which could have a significant impact on survival and quality of life through timely application of appropriate treatment. Funding Acknowledgement Type of funding source: None

2020. Author(s): Aubrey E. Jones, Zameer Abedin, Olesya Ilkun, Rebeka Mukherjee, Mingyuan Zhang, et al.

Abstract Background Clinical decision support tools for atrial fibrillation (AF) should include CHA2DS2-VASc scores to guide oral anticoagulant (OAC) treatment. Objective We compared automated, electronic medical record (EMR)-generated CHA2DS2-VASc scores to clinician-documented scores, and report the resulting proportions of patients in the OAC treatment group. Methods Patients were included if they had both a clinician-documented and an EMR-generated CHA2DS2-VASc score on the same day. EMR scores were based on billing codes, left ventricular ejection fraction from echocardiograms, and demographics; documented scores were identified using natural language processing. Patients were deemed "re-classified" if the EMR score was ≥2 but the documented score was <2, or vice versa. For the overall cohort and subgroups (sex and age group), we compared mean scores using paired t-tests and re-classification rates using chi-squared tests. Results Among 5,767 patients, mean scores were higher using EMR-generated compared to documented scores (4.05 [SD 2.1] versus 3.13 [SD 1.8]; p<0.01) for the full cohort and all subgroups (p<0.01 for all comparisons). If EMR scores were used to determine OAC treatment instead of documented scores, 8.3% (n=479, p<0.01) of patients would be re-classified, with 7.2% moving into and 1.1% moving out of the treatment group. Among 2,322 women, 4.7% (n=109, p<0.01) would be re-classified, with 4.1% into and 0.7% out of the treatment group. Among 3,445 men, 10.7% (n=370, p<0.01) would be re-classified, with 9.2% into and 1.5% out of the treatment group. Among 2,060 patients <65 years old, 18.1% (n=372, p<0.01) would be re-classified, with 15.8% into and 2.3% out of the treatment group. Among 1,877 patients 65-74 years old, 5.4% (n=101, p<0.01) would be re-classified, with 4.4% into and 1.0% out of the treatment group. Among 1,830 patients ≥75 years old, <1% would move into the treatment group and none would move out. Conclusions EMR-based CHA2DS2-VASc scores were, on average, almost a full point higher than clinician-documented scores. Using EMR scores in lieu of documented scores would move a significant proportion of patients into the treatment group, with the highest re-classification rates in men and in patients <65 years old.
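For readers unfamiliar with the scoring, the minimal Python sketch below shows how a CHA2DS2-VASc score and the ≥2 treatment threshold used for re-classification could be computed. The `Patient` fields and helper names are hypothetical illustrations and do not reflect the study's actual EMR extraction pipeline.

```python
# Hypothetical sketch of CHA2DS2-VASc scoring and the re-classification rule.
# Field names are illustrative assumptions, not the study's EMR schema.
from dataclasses import dataclass

@dataclass
class Patient:
    age: int
    female: bool
    chf: bool              # congestive heart failure / LV dysfunction
    hypertension: bool
    diabetes: bool
    stroke_tia: bool       # prior stroke, TIA or thromboembolism
    vascular_disease: bool

def cha2ds2_vasc(p: Patient) -> int:
    """Standard CHA2DS2-VASc score (range 0-9)."""
    score = 0
    score += 1 if p.chf else 0
    score += 1 if p.hypertension else 0
    score += 2 if p.age >= 75 else (1 if 65 <= p.age < 75 else 0)
    score += 1 if p.diabetes else 0
    score += 2 if p.stroke_tia else 0
    score += 1 if p.vascular_disease else 0
    score += 1 if p.female else 0
    return score

def reclassified(emr_score: int, documented_score: int, threshold: int = 2) -> bool:
    """True if the two scores fall on opposite sides of the OAC treatment threshold."""
    return (emr_score >= threshold) != (documented_score >= threshold)
```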


Hematology, 2009, Vol 2009 (1), pp. 664-672. Author(s): Heather A. Leitch, Linda M. Vickars

Abstract The myelodysplastic syndromes (MDS) are characterized by cytopenias and a risk of transformation to acute myeloid leukemia (AML). Although new treatments are available, a mainstay in MDS remains supportive care, which aims to minimize the impact of cytopenias and includes transfusion of blood products. Red blood cell (RBC) transfusions place patients at risk of iron overload (IOL). In beta-thalassemia major (BTM), IOL from chronic RBC transfusions inevitably leads to organ dysfunction and death. With iron chelation therapy (ICT), survival in BTM improved from the second decade of life to near normal and correlated with ICT compliance. Effects of ICT in BTM include reversal of cardiac arrhythmias, improvement in left ventricular ejection fraction, arrest of hepatic fibrosis, and reduction of glucose intolerance. It is not clear whether these specific outcomes are applicable to MDS. Although retrospective, recent studies in MDS suggest an adverse effect of transfusion dependence and IOL on survival and AML transformation, and that lowering iron minimizes this impact. These data raise important points that warrant further study. ICT is potentially toxic, cumbersome and costly, and in MDS patients it should be initiated only after weighing potential risks against benefits until further data are available to better justify its use. Since most MDS patients eventually require RBC transfusions, the public health implications of both transfusion dependence and ICT in MDS are considerable. This paper summarizes the impact of cytopenias in MDS and treatment approaches to minimize their impact, with a focus on RBC transfusions and their complications, particularly with respect to iron overload.


Author(s): Akshay Ashok Bafna, Hetan C. Shah

Background: To evaluate myocardial function and its correlation with serum ferritin and the number of transfusions in beta-thalassemia major patients using standard echocardiography and left ventricular strain imaging. Methods: This was a cross-sectional exploratory study of 56 beta-thalassemia patients conducted at a tertiary-care center in India between September 2016 and August 2017. Patients younger than 18 years, diagnosed with thalassemia major, recipients of >20 units of blood transfusions, and with normal left ventricular (LV) function on 2D echocardiography were included in the study. Severity of iron overload was determined using serum ferritin levels, and LV strain imaging parameters were evaluated using strain values of the 17 LV segments. Results: A total of 56 beta-thalassemia patients were included in the study. Of these, 29 (51.8%) were boys and 27 (48.2%) were girls, with a mean age of 7.8±1.84 years. The average serum ferritin level was 4089.83 ng/mL. Strain values of the basal lateral wall of the left ventricle were significantly more abnormal in patients who received more (>80) transfusions compared with those who received fewer transfusions (p=0.025 and p=0.045, respectively). Patients with serum ferritin >6000 ng/mL had impaired strain (p=0.03). Conclusions: Conventional echocardiographic parameters and left ventricular ejection fraction (LVEF) do not provide adequate information about LV dysfunction. Systolic strain index imaging of the LV indicated the presence of early LV systolic dysfunction in patients who received a greater number of blood transfusions and in patients with higher serum ferritin levels.


2021, Vol 42 (Supplement_1). Author(s): D Patsourakos, C Aggeli, K Gatzoulis, S Delicou, Y Dimitroglou, et al.

Abstract Introduction Atrial cardiomyopathy is present in a significant proportion of beta-thalassemia major (β-TM) patients, complicating their clinical condition. The diagnosis of atrial cardiomyopathy is challenging using conventional echocardiographic techniques. Purpose In our study we aimed to identify the presence of atrial cardiomyopathy by applying novel echocardiographic techniques in these patients. Methods 56 β-TM patients (mean age 39.3±9 years, 50% male) and 30 age- and sex-matched healthy controls were examined by transthoracic echocardiography. Conventional echocardiographic parameters were estimated alongside deformation indices (left atrial strain during the reservoir (LASr), conduit (LAScd) and contraction (LASct) phases, as well as left ventricular global longitudinal strain (GLS)). T2* was calculated by cardiac magnetic resonance imaging in β-TM patients. Results LAVI, E/e' ratio, GLS and left atrial deformation parameters differed between patients and controls. In the patient group, left atrial deformation indices correlated with LAVI, E/e' ratio, GLS and T2* (Table 1). GLS also correlated with LAVI, but not with T2* or the E/e' ratio. T2* correlated only with left atrial deformation indices. Patients with prior episodes of atrial fibrillation were older, had increased E/e' and LAVI and impaired left atrial deformation indices, but did not differ in terms of GLS or T2* (Figure 1). Patients with iron overload differed only in terms of left atrial deformation parameters. Conclusions Atrial deformation indices could be of clinical use in the early detection of atrial cardiomyopathy. Impaired left atrial strain may be associated with silent atrial fibrillation and be indicative of myocardial iron overload. Funding Acknowledgement Type of funding sources: None. (Table 1: correlation table; Figure 1: scatter plot of T2* and LASr.)


Blood, 2008, Vol 112 (11), pp. 3886-3886. Author(s): Renzo Galanello, Antonio Piga, Maria-Eliana Lai, Gianluca Forni, Fabrice Danjou, et al.

Abstract Introduction: Cardiac iron overload secondary to red blood cell transfusion is a common complication of thalassemia major despite the wide use of chelation therapy, and cardiac disease still accounts for up to 70% of deaths in these patients. Myocardial siderosis can be accurately assessed by cardiovascular magnetic resonance (CMR) using the T2* sequence. The available chelators appear to have distinct profiles of organ iron removal. We report a retrospective analysis of the effect of three available chelators on cardiac iron in patients with thalassemia major. Methods: Fifty-two patients on subcutaneous desferrioxamine (DFO), 28 on deferiprone (DFP) and 28 on deferasirox (DFX) were included in the study. DFO was administered at a mean dosage of 36 ± 8 mg/kg/d for 10 to 14 hours per day for 21 ± 13 months, DFP at 86 ± 10 mg/kg/d for 23 ± 12 months and DFX at 27 ± 6 mg/kg/d for 15 ± 5 months. T2* was measured according to Westwood et al (2005). Entry criteria required, for each patient, a T2* value lower than 20 ms at baseline or at the final assessment. Left ventricular ejection fraction (LVEF) was measured by echocardiography. Results: At baseline, the 3 treatment groups did not show any significant difference in age, blood consumption or cardiac T2* (DFO group: 13.4 ± 4.5 ms, DFP 13.9 ± 3.7 and DFX 12.8 ± 3.7; p=0.573). At the last evaluation, mean cardiac T2* was slightly increased in patients on DFO (13.9 ± 8.6 ms, p=0.8) and in those on DFX (13.8 ± 4.4 ms, p=0.04), while it was substantially increased in patients on DFP (21.7 ± 9.2 ms, p=0.001). To correct for the different durations of treatment, we calculated the percentage monthly cardiac T2* change, which was significantly higher in patients on DFP (1.84 ± 1.94) than in patients on DFO (0.23 ± 2.15) or DFX (0.45 ± 1.49) (p<0.001). No differences were detected between mean LVEF at baseline and at the last assessment in any of the 3 groups. Conclusions: In this retrospective study, monotherapy with deferiprone was significantly more effective than desferrioxamine and deferasirox in alleviating myocardial siderosis in patients with beta-thalassemia major. Further studies are needed to understand whether the cardiac T2* changes are influenced by the chelator dosages.
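The abstract does not spell out how the percentage monthly T2* change was derived; the sketch below is a minimal Python illustration under the assumption that it is the total percentage change divided by months of treatment. Group means are used purely as example inputs, so the result need not match the per-patient means reported above.

```python
# Illustrative calculation of a percentage monthly T2* change (assumed formula:
# total percentage change in T2* divided by months of treatment).
def monthly_t2star_change(t2_baseline_ms: float, t2_final_ms: float, months: float) -> float:
    total_pct_change = (t2_final_ms - t2_baseline_ms) / t2_baseline_ms * 100.0
    return total_pct_change / months

# Example with the DFP group means: 13.9 ms -> 21.7 ms over 23 months, roughly +2.4 %/month.
print(round(monthly_t2star_change(13.9, 21.7, 23), 2))
```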


2020, Vol 41 (Supplement_2). Author(s): K.V Bunting, S Gill, A Sitch, S Mehta, K O'Connor, et al.

Abstract Introduction Echocardiography is essential for the management of patients with atrial fibrillation (AF), but current methods are time-consuming and lack evidence of reproducibility. Purpose To compare conventional averaging of consecutive beats with an index-beat approach, in which systolic and diastolic measurements are taken once after two prior beats with a similar RR interval (no more than 60 ms difference). Methods Transthoracic echocardiography was performed using a standardized and blinded protocol in patients enrolled in the RAte control Therapy Evaluation in permanent AF randomised controlled trial (RATE-AF; NCT02391337). AF was confirmed in all patients with a preceding 12-lead ECG. A minimum of 30-beat loops were recorded. Left ventricular function was determined using the recommended averaging of 5 and 10 beats and using the index-beat method, with observers blinded to clinical details. Complete loops were used to calculate the within-beat coefficient of variation (CV) and intraclass correlation coefficient (ICC) for Simpson's biplane left ventricular ejection fraction (LVEF), global longitudinal strain (GLS) and filling pressure (E/e'). Results 160 patients (median age 75 years (IQR 69–82); 46% female) were included, with a median heart rate of 100 beats/min (IQR 86–112). For LVEF, the index beat had the lowest CV of 32%, compared to 51% for 5 consecutive beats and 53% for 10 consecutive beats (p<0.001). The index beat also had the lowest CV for GLS (26% versus 43% and 42%; p<0.001) and E/e' (25% versus 41% and 41%; p<0.001; see Figure for ICC comparison). Intra-operator reproducibility, assessed by the same operator from two different recordings in 50 patients, was superior for the index beat, with a GLS bias of −0.5 and narrow limits of agreement (−3.6 to 2.6), compared to −1.0 for 10 consecutive beats (−4.0 to 2.0). For inter-operator variability, assessed in 18 random patients, the index beat also showed the smallest bias with narrow confidence intervals (CI). Using a single index beat did not impact the validity of LVEF, GLS or E/e' measurement when correlated with natriuretic peptides. Index-beat analysis substantially shortened analysis time: 35 seconds (95% CI 35 to 39 seconds) for measuring E/e' with the index beat versus 98 seconds (95% CI 92 to 104 seconds) for 10 consecutive beats (see Figure). Conclusion Index-beat determination of left ventricular function improves reproducibility, saves time and does not compromise validity compared to conventional quantification in patients with heart failure and AF. After independent validation, the index-beat method should be adopted into routine clinical practice. (Figure: comparison of measurements of E/e'.) Funding Acknowledgement Type of funding source: Public grant(s) – National budget only. Main funding source(s): National Institute of Health Research UK
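The index-beat rule described above (measure only beats preceded by two RR intervals that differ by no more than 60 ms) is straightforward to express in code. The sketch below is an illustrative Python version, together with the coefficient-of-variation metric used for the reproducibility comparison; it is not the RATE-AF analysis software.

```python
# Minimal sketch of index-beat selection and the coefficient-of-variation metric.
import statistics

def index_beat_indices(rr_ms: list[float], max_diff_ms: float = 60.0) -> list[int]:
    """Return indices of beats whose two preceding RR intervals differ by <= max_diff_ms."""
    return [i for i in range(2, len(rr_ms))
            if abs(rr_ms[i - 1] - rr_ms[i - 2]) <= max_diff_ms]

def coefficient_of_variation(values: list[float]) -> float:
    """CV (%) = standard deviation / mean * 100."""
    return statistics.stdev(values) / statistics.mean(values) * 100.0

# Example: an irregular AF recording with RR intervals in milliseconds.
rr = [620, 810, 790, 950, 700, 720, 730, 1010]
print(index_beat_indices(rr))  # -> [3, 6, 7]: beats preceded by two similar RR intervals
```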


2021, Vol 10 (Supplement_1). Author(s): H Santos, T Vieira, J Fernandes, AR Ferreira, M Rios, et al.

Abstract Funding Acknowledgements Type of funding sources: None. Introduction The development of cardiogenic shock (CS) is associated with a worse prognosis and can produce several hemodynamic manifestations. It is therefore not surprising that new-onset atrial fibrillation (AF) occurs in these patients. Purpose To evaluate previous cardiovascular history, clinical signs and diagnostic findings at admission as predictors of new-onset AF in CS. Methods Single-centre retrospective study of patients hospitalized for CS between 01/01/2014 and 30/10/2018. A total of 222 patients with CS were included, 40 of whom presented with new-onset AF. The chi-squared test, Student's t-test and Mann-Whitney U test were used to compare categorical and continuous variables. Multiple logistic regression analysis was performed to evaluate predictors of new-onset AF in CS patients. Results CS patients without AF had a mean age of 61.08 ± 13.77 years, whereas patients with new-onset AF in the setting of CS had a mean age of 67.02 ± 14.21 years (p = 0.016). Nevertheless, no differences between the two groups were detected regarding sex, cardiovascular history (namely arterial hypertension, diabetes, dyslipidemia, obesity, smoking status, alcohol intake, previous acute coronary syndrome, history of angina, previous cardiomyopathy), history of neoplasia, cardiac arrest during CS, clinical signs at admission (such as heart rate, blood pressure, respiratory rate), blood results (hemoglobin, leukocytes, troponin, creatinine, C-reactive protein), left ventricular ejection fraction and the culprit lesion. New-onset AF in CS patients had no impact on mortality rates. Multiple logistic regression revealed that only age was a predictor of new-onset AF in CS patients (odds ratio 1.032, confidence interval 1.004-1.060, p = 0.024). Conclusions Age was the best predictor of new-onset AF in CS patients. The presence of this arrhythmia can have a hemodynamic impact; however, it did not seem to influence the final outcome.
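As a rough illustration of how an odds ratio with a confidence interval for age could be obtained from such data, the sketch below fits a logistic regression with statsmodels; the column names and choice of package are assumptions, not the authors' actual analysis code.

```python
# Hedged sketch: odds ratio with 95% CI for age as a predictor of new-onset AF.
# 'new_af' (0/1) and 'age' are assumed column names.
import numpy as np
import pandas as pd
import statsmodels.api as sm

def age_odds_ratio(df: pd.DataFrame) -> pd.DataFrame:
    X = sm.add_constant(df[["age"]])           # intercept + age
    model = sm.Logit(df["new_af"], X).fit(disp=0)
    or_ci = np.exp(model.conf_int())           # exponentiate log-odds to odds-ratio scale
    or_ci["OR"] = np.exp(model.params)
    return or_ci.rename(columns={0: "2.5%", 1: "97.5%"})
```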


2020, Vol 41 (Supplement_2). Author(s): S Ikeda, M Iguchi, H Ogawa, Y Aono, K Doi, et al.

Abstract Background Hypertension is one of the major risk factors for cardiovascular events in patients with atrial fibrillation (AF). However, the relationship between diastolic blood pressure (DBP) and cardiovascular events in AF patients remains unclear. Methods The Fushimi AF Registry is a community-based prospective survey of AF patients in Japan. Follow-up data were available for 4,466 patients, and 4,429 patients with available DBP data were examined. We divided the patients into three groups, G1 (DBP<70 mmHg, n=1,946), G2 (70≤DBP<80, n=1,321) and G3 (80≤DBP, n=1,162), and compared clinical background and outcomes between the groups. Results The proportion of women was greater in the G1 group, and the patients in the G1 group were older and had a higher prevalence of heart failure (HF), diabetes mellitus (DM) and chronic kidney disease (CKD). Prescription of beta blockers was higher in the G1 group, but that of renin-angiotensin system inhibitors and calcium channel blockers was comparable. During the median follow-up of 1,589 days, in Kaplan-Meier analysis, the incidence rates of cardiovascular events (composite of cardiac death, ischemic stroke and systemic embolism, major bleeding and HF hospitalization during follow-up) were higher in the G1 and G3 groups than in the G2 group (Figure 1). When we divided the patients based on systolic blood pressure (SBP) at baseline (≥130 mmHg or <130 mmHg), the incidence rates of cardiovascular events were comparable between groups. Multivariate Cox proportional hazards regression analysis including female sex, age (≥75 years), higher SBP (≥130 mmHg), DM, pre-existing HF, CKD, low left ventricular ejection fraction (<40%) and DBP group (G1, G2, G3) revealed that DBP was an independent determinant of cardiovascular events (G1 vs. G2: hazard ratio (HR) 1.40, 95% confidence interval (CI) 1.19–1.64; G3 vs. G2: HR 1.23, 95% CI 1.01–1.49). When we examined the impact of DBP in 10 mmHg increments, patients with very low DBP (<60 mmHg) (HR 1.50, 95% CI 1.24–1.80) and very high DBP (≥90 mmHg) (HR 1.51, 95% CI 1.15–1.98) had a higher incidence of cardiovascular events than patients with DBP of 70–79 mmHg (Figure 2). However, when we examined the impact of SBP in 20 mmHg increments, SBP at baseline was not associated with the incidence of cardiovascular events (Figure 3). Conclusion In Japanese patients with AF, DBP exhibited a J-curve association with the incidence of cardiovascular events. Funding Acknowledgement Type of funding source: None
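A hedged sketch of the DBP grouping and Cox model described above is shown below, using pandas and the lifelines package; the column names, package choice and coding of covariates are assumptions rather than the registry's actual analysis.

```python
# Illustrative sketch: three DBP groups (<70, 70-79, >=80 mmHg; G2 as reference)
# entered into a Cox proportional hazards model with the covariates named above.
import pandas as pd
from lifelines import CoxPHFitter

def fit_dbp_cox(df: pd.DataFrame) -> CoxPHFitter:
    """df: one row per patient with 'time' (days), 'event' (0/1), 'dbp' and 0/1 covariates."""
    df = df.copy()
    df["dbp_group"] = pd.cut(df["dbp"], bins=[0, 70, 80, 300],
                             right=False, labels=["G1", "G2", "G3"])
    dummies = (pd.get_dummies(df["dbp_group"], prefix="dbp")
                 .drop(columns=["dbp_G2"])      # G2 (70-79 mmHg) is the reference group
                 .astype(int))
    covariates = ["female", "age_ge75", "sbp_ge130", "diabetes",
                  "heart_failure", "ckd", "lvef_lt40"]   # assumed column names
    model_df = pd.concat([df[["time", "event"] + covariates], dummies], axis=1)
    cph = CoxPHFitter()
    cph.fit(model_df, duration_col="time", event_col="event")
    return cph  # cph.hazard_ratios_ gives HRs for dbp_G1 and dbp_G3 versus G2
```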


2021, Vol 10 (9), pp. 1829. Author(s): Marcin Wełnicki, Iwona Gorczyca, Wiktor Wójcik, Olga Jelonek, Małgorzata Maciorowska, et al.

Background: Hyperuricemia is an established risk factor for cardiovascular disease, including atrial fibrillation (AF). The prevalence of hyperuricemia and its clinical significance in patients with already diagnosed AF remain unexplored. Methods: The Polish Atrial Fibrillation (POL-AF) registry includes consecutive patients with AF hospitalized in 10 Polish cardiology centers from January to December 2019. This analysis included patients in whom serum uric acid (SUA) was measured. Results: Of the 3999 POL-AF patients, 1613 were included in the analysis. The mean age of the subjects was 72 ± 11.6 years, and the mean SUA was 6.88 ± 1.93 mg/dL. Hyperuricemia was found in 43% of the patients. Eighty-four percent of the patients were assigned to the high cardiovascular risk group, and 45% of these had SUA >7 mg/dL. Comparison of the extreme SUA groups (<5 mg/dL vs. >7 mg/dL) showed significant differences in renal parameters, total cholesterol concentration, and left ventricular ejection fraction (EF). Multivariate regression analysis showed that SUA >7 mg/dL (OR 1.74, 95% CI 1.32–2.30) and GFR <60 mL/min/1.73 m2 (OR 1.94, 95% CI 1.46–2.48) are significant markers of an EF <40% in the study population. Female sex was a protective factor (OR 0.74, 95% CI 0.56–0.97). The SUA cut-off point indicative of an EF <40% with 60% sensitivity and specificity was 6.9 mg/dL. Conclusions: Although rarely assessed, hyperuricemia appears to be common in patients with AF. High SUA levels may be a significant biomarker of reduced left ventricular EF in AF patients.
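As an illustration of how a sensitivity/specificity-balanced SUA cut-off could be derived, the sketch below uses scikit-learn's ROC utilities; it is an assumption-laden example, not the POL-AF analysis code.

```python
# Hedged sketch: find the SUA threshold at which sensitivity and specificity for
# predicting EF < 40% are most nearly equal (e.g. the ~60%/60% point reported above).
import numpy as np
from sklearn.metrics import roc_curve

def sua_cutoff(sua: np.ndarray, ef_below_40: np.ndarray) -> float:
    """sua: serum uric acid values; ef_below_40: 1 if EF < 40%, else 0."""
    fpr, tpr, thresholds = roc_curve(ef_below_40, sua)  # higher SUA treated as "positive"
    specificity = 1.0 - fpr
    balance = np.abs(tpr - specificity)                 # 0 where sensitivity == specificity
    return float(thresholds[np.argmin(balance)])
```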

