Preoperative beta-blocker in ventricular dysfunction patients: need a more granular quality metric

2021 ◽  
Vol 21 (1) ◽  
Author(s):  
Hanwei Tang ◽  
Kai Chen ◽  
Jianfeng Hou ◽  
Xiaohong Huang ◽  
Sheng Liu ◽  
...  

Abstract Background The use of preoperative beta-blockers has been accepted as a quality standard for patients undergoing coronary artery bypass graft (CABG) surgery. However, conflicting results from recent studies have raised questions concerning the effectiveness of this quality metric. We sought to determine the influence of preoperative beta-blocker administration before CABG in patients with left ventricular dysfunction. Methods The authors analyzed all cases of isolated CABG in patients with a left ventricular ejection fraction less than 50%, performed between January 2012 and June 2017 at 94 centres recorded in the China Heart Failure Surgery Registry database. In addition to multivariate regression models, a 1:1 propensity score-matched analysis was performed. Results Of 6116 eligible patients, 61.7% received a preoperative beta-blocker. No difference in operative mortality was found between the two cohorts (3.7% for the non-beta-blocker group vs. 3.0% for the beta-blocker group; adjusted odds ratio [OR] 0.82 [95% CI 0.58–1.15]). Few differences in the incidence of other postoperative clinical end points were observed as a function of preoperative beta-blocker use, except for stroke (0.7% for the non-beta-blocker group vs. 0.3% for the beta-blocker group; adjusted OR 0.39 [95% CI 0.16–0.96]). Results of propensity-matched analyses were broadly consistent. Conclusions In this study, administration of beta-blockers before CABG was not associated with improved operative mortality or fewer complications, except for a lower incidence of postoperative stroke, in patients with left ventricular dysfunction. A more granular quality metric that would guide the use of beta-blockers should be developed.
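The odds ratios quoted above come from multivariable models, but the basic arithmetic of an odds ratio and its Wald confidence interval can be sketched from a 2x2 table. The counts below are hypothetical and are not the study's data; the study's reported ORs are covariate-adjusted, which this simple unadjusted calculation does not reproduce.

```python
import math

def odds_ratio_ci(events_a, total_a, events_b, total_b, z=1.96):
    """Odds ratio of group A vs. group B with a Wald 95% CI on the log scale."""
    a, b = events_a, total_a - events_a          # group A: events / non-events
    c, d = events_b, total_b - events_b          # group B: events / non-events
    or_ = (a / b) / (c / d)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # standard error of ln(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical 2x2 table: 100 deaths among 3,774 beta-blocker patients
# vs. 80 deaths among 2,342 patients without a beta-blocker.
or_, lo, hi = odds_ratio_ci(100, 3774, 80, 2342)
```

If the confidence interval spans 1.0, the unadjusted difference is not statistically significant at the 5% level, mirroring how the abstract interprets its adjusted OR of 0.82 [0.58–1.15].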



2021 ◽  
Vol 23 (Supplement_E) ◽  
pp. E28-E32
Author(s):  
Irma Bisceglia ◽  
Maria Laura Canale ◽  
Domenico Cartoni ◽  
Sabrina Matera ◽  
Sandro Petrolati

Abstract Prevention of left ventricular dysfunction, predominantly induced by anthracyclines and/or trastuzumab, still represents a challenge for cardio-oncology today. Indeed, this complication threatens to limit the significant gain in cancer survival achieved to date. Oncology strategies with cumulative dose limitation, continuous infusion, dexrazoxane, and liposomal formulations have been shown to decrease the risk of anthracycline cardiotoxicity. The preventive use of ACE inhibitors, sartans, and/or beta-blockers has not yet provided convincing evidence, and the positive effect on the decline in left ventricular ejection fraction appears poor, without clear clinical relevance. Assessment of the cardiovascular risk profile is a key aspect of the baseline evaluation of any patient scheduled for cancer therapy. Control and/or correction of modifiable cardiovascular risk factors is the first form of primary prevention of cardiotoxicity. It will be necessary to select populations at higher risk of developing cardiac dysfunction and to identify patients genetically predisposed to cardiotoxicity in order to build the most appropriate strategies for targeting cardioprotective therapies correctly and in a timely manner.


2021 ◽  
Vol 21 (1) ◽  
Author(s):  
Hanwei Tang ◽  
Jianfeng Hou ◽  
Kai Chen ◽  
Xiaohong Huang ◽  
Sheng Liu ◽  
...  

Abstract Background Data on the effect of smoking on in-hospital outcomes in patients with left ventricular dysfunction undergoing coronary artery bypass graft (CABG) surgery are limited. We sought to determine the influence of smoking on CABG patients with left ventricular dysfunction. Methods A retrospective study was conducted using data from the China Heart Failure Surgery Registry database. Eligible patients with a left ventricular ejection fraction less than 50% who underwent isolated CABG were included. In addition to multivariate regression models, a 1:1 propensity score-matched analysis was performed. Our study (n = 6531) consisted of 3635 smokers and 2896 non-smokers. Smokers were further divided into ex-smokers (n = 2373) and current smokers (n = 1262). Results The overall in-hospital mortality was 3.9%. Interestingly, current smokers had lower in-hospital mortality than non-smokers [2.3% vs. 4.9%; adjusted odds ratio (OR) 0.612 (95% CI 0.395–0.947)]. No difference in mortality was detected between ex-smokers and non-smokers [3.6% vs. 4.9%; adjusted OR 0.974 (0.715–1.327)]. No significant differences in other clinical end points were observed. Results of propensity-matched analyses were broadly consistent. Conclusions Paradoxically, current smokers had lower in-hospital mortality than non-smokers. Future studies should be performed to further understand the biological mechanisms that may explain this ‘smoker’s paradox’ phenomenon.
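The 1:1 propensity score-matched analysis mentioned in the Methods can be illustrated with a minimal greedy nearest-neighbour matcher. This is a generic sketch, not the registry study's actual algorithm; the example scores and the 0.05 caliper are assumptions for illustration only.

```python
import random

def greedy_match(treated_ps, control_ps, caliper=0.05):
    """Greedy 1:1 nearest-neighbour matching on the propensity score.

    Returns a list of (treated_index, control_index) pairs; each control
    subject is used at most once, and matches farther apart than the
    caliper are discarded (leaving that treated subject unmatched).
    """
    available = dict(enumerate(control_ps))      # control index -> score
    pairs = []
    order = list(range(len(treated_ps)))
    random.shuffle(order)                        # random order avoids order bias
    for i in order:
        if not available:
            break
        # nearest remaining control to this treated subject's score
        j = min(available, key=lambda k: abs(available[k] - treated_ps[i]))
        if abs(available[j] - treated_ps[i]) <= caliper:
            pairs.append((i, j))
            del available[j]
    return pairs

# Hypothetical scores: two treated subjects, three potential controls.
pairs = greedy_match([0.30, 0.70], [0.31, 0.69, 0.50])
```

Real analyses typically estimate the scores with logistic regression on baseline covariates and then compare outcomes within the matched pairs; only the matching step is sketched here.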


2020 ◽  
Vol 41 (Supplement_2) ◽  
Author(s):  
N Enzan ◽  
S Matsushima ◽  
T Ide ◽  
H Kaku ◽  
T Higo ◽  
...  

Abstract Background Withdrawal of optimal medical therapy has been reported to cause relapse of cardiac dysfunction in patients with dilated cardiomyopathy (DCM) whose cardiac function had improved. However, it is unknown whether beta-blockers can prevent deterioration of cardiac function in those patients. Purpose We examined the effect of beta-blockers on left ventricular ejection fraction (LVEF) in recovered DCM. Methods We analyzed the clinical personal records of DCM, a national database of the Japanese Ministry of Health, Labour and Welfare, between 2003 and 2014. Recovered DCM was defined as a previously documented LVEF <40% and a current LVEF ≥40%. Patients with recovered DCM were divided into two groups according to the use of beta-blockers. The primary outcome was defined as a decrease in LVEF >10% at two years of follow-up. A one-to-one propensity score-matched analysis was used, and a per-protocol analysis was also performed. Considering intra- and inter-observer variability of echocardiographic evaluations, we also examined outcomes with a multivariable logistic regression model after changing the inclusion criteria as follows: (1) previous LVEF <40% and current LVEF ≥40%; (2) previous LVEF <35% and current LVEF ≥40%; (3) previous LVEF <30% and current LVEF ≥40%; (4) previous LVEF <40% and current LVEF ≥50%. Outcomes were also varied as (1) a decrease in LVEF ≥5%, (2) a decrease in LVEF ≥10%, and (3) a decrease in LVEF ≥15%. An analysis combining multiple imputation with inverse probability of treatment weighting was also conducted to assess the effects of missing data and of selection bias attributable to propensity score matching. Results From 2003 to 2014, 40,794 consecutive patients with DCM were screened. Of 5,338 eligible patients, 4,078 received beta-blockers. Propensity score matching yielded 998 pairs. Mean age was 61.7 years, and 1,497 (75.0%) were male. Mean LVEF was 49.1±8.1%.
The primary outcome was observed less frequently in the beta-blocker group than in the no-beta-blocker group (18.0% vs. 23.5%; odds ratio [OR] 0.72; 95% confidence interval [CI] 0.58–0.89; P=0.003). The prevalence of increases in LVDd (11.5% vs. 15.8%; OR 0.70; 95% CI 0.54–0.91; P=0.007) and LVDs (23.1% vs. 27.2%; OR 0.80; 95% CI 0.65–0.99; P=0.041) was also lower in the beta-blocker group. Similar results were obtained in the per-protocol analysis, and these results were robust to several sensitivity analyses. By preventing a decrease in LVEF, the use of beta-blockers also prevented deterioration to HFrEF (23.6% vs. 30.6%). Subgroup analysis demonstrated that beta-blockers prevented a decrease in LVEF regardless of atrial fibrillation. Conclusion Use of beta-blockers was associated with prevention of a decrease in left ventricular ejection fraction in patients with recovered DCM. Funding Acknowledgement Type of funding source: Public grant(s) – National budget only. Main funding source(s): Health Sciences Research Grants from the Japanese Ministry of Health, Labour and Welfare (Comprehensive Research on Cardiovascular Diseases)
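As a sketch of the abstract's definitions, the helpers below encode "recovered DCM" (previous LVEF <40% and current LVEF ≥40%) and the primary outcome (a decrease in LVEF >10% at two years). The threshold parameters are exposed so that the sensitivity analyses' alternative cut-offs (e.g. previous LVEF <35% or <30%, drops of ≥5% or ≥15%) can be plugged in; the function names are illustrative, not from the study.

```python
def is_recovered_dcm(previous_lvef, current_lvef, prev_cut=40.0, curr_cut=40.0):
    """Recovered DCM per the abstract: previously documented LVEF below
    prev_cut (%) and current LVEF at or above curr_cut (%)."""
    return previous_lvef < prev_cut and current_lvef >= curr_cut

def primary_outcome(baseline_lvef, followup_lvef, drop=10.0):
    """Primary outcome: a decrease in LVEF of more than `drop` percentage
    points between baseline and the two-year follow-up echocardiogram."""
    return (baseline_lvef - followup_lvef) > drop

# Hypothetical patient: previous LVEF 35%, current 45% -> recovered DCM;
# a fall from 49% to 38% at follow-up would meet the primary outcome.
recovered = is_recovered_dcm(35, 45)
event = primary_outcome(49, 38)
```

Varying `prev_cut`, `curr_cut`, and `drop` over the grids listed in the Methods reproduces the structure (though obviously not the data) of the study's sensitivity analyses.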


2021 ◽  
Vol 10 (14) ◽  
pp. 3013
Author(s):  
Juyoun Kim ◽  
Jae-Sik Nam ◽  
Youngdo Kim ◽  
Ji-Hyun Chin ◽  
In-Cheol Choi

Background: Left ventricular dysfunction (LVD) can occur immediately after mitral valve repair (MVr) for degenerative mitral regurgitation (DMR), even in some patients with normal preoperative left ventricular ejection fraction (LVEF). This study investigated whether forward LVEF, calculated as left ventricular outflow tract stroke volume divided by left ventricular end-diastolic volume, could predict LVD immediately after MVr in patients with DMR and normal LVEF. Methods: Echocardiographic and clinical data were retrospectively evaluated in 234 patients with DMR ≥ moderate and preoperative LVEF ≥ 60%. LVD and non-LVD were defined as LVEF < 50% and ≥ 50%, respectively, as measured by echocardiography after MVr and before discharge. Results: Of the 234 patients, 52 (22.2%) developed LVD at a median of three days (interquartile range: 3–4 days). Preoperative forward LVEF in the LVD and non-LVD groups was 24.0% (18.9–29.5%) and 33.2% (26.4–39.4%), respectively (p < 0.001). Receiver operating characteristic (ROC) analysis showed that forward LVEF was predictive of LVD, with an area under the ROC curve of 0.79 (95% confidence interval: 0.73–0.86); the optimal cut-off was 31.8% (sensitivity: 88.5%, specificity: 58.2%, positive predictive value: 37.7%, negative predictive value: 94.6%). Preoperative forward LVEF correlated significantly with preoperative mitral regurgitant volume (correlation coefficient [CC] = −0.86, p < 0.001) and regurgitant fraction (CC = −0.98, p < 0.001), but not with preoperative LVEF (CC = 0.112, p = 0.088). Conclusion: Preoperative forward LVEF could be useful in predicting postoperative LVD immediately after MVr in patients with DMR and normal LVEF, with an optimal cut-off of 31.8%.
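The study defines forward LVEF as LVOT stroke volume divided by LV end-diastolic volume, and selects its 31.8% cut-off from an ROC analysis. A minimal sketch of both steps follows; the Youden-index rule for picking the cut-off is a common choice but an assumption here, since the abstract does not state how the optimum was selected, and all example values are hypothetical.

```python
def forward_lvef(lvot_stroke_volume_ml, lv_end_diastolic_volume_ml):
    """Forward LVEF (%) per the study's definition:
    LVOT stroke volume / LV end-diastolic volume."""
    return 100.0 * lvot_stroke_volume_ml / lv_end_diastolic_volume_ml

def youden_cutoff(values, labels):
    """Threshold maximising sensitivity + specificity - 1 (Youden's J),
    predicting the event (label 1) when the value is AT OR BELOW the
    threshold, since low forward LVEF predicts postoperative LVD."""
    best_t, best_j = None, -1.0
    for t in sorted(set(values)):
        tp = sum(1 for v, y in zip(values, labels) if v <= t and y == 1)
        fn = sum(1 for v, y in zip(values, labels) if v > t and y == 1)
        tn = sum(1 for v, y in zip(values, labels) if v > t and y == 0)
        fp = sum(1 for v, y in zip(values, labels) if v <= t and y == 0)
        sens = tp / (tp + fn) if tp + fn else 0.0
        spec = tn / (tn + fp) if tn + fp else 0.0
        j = sens + spec - 1.0
        if j > best_j:
            best_t, best_j = t, j
    return best_t

# Hypothetical patient: LVOT stroke volume 50 mL, LVEDV 160 mL.
f_lvef = forward_lvef(50, 160)
```

In practice the cut-off would be read off a full ROC curve over all 234 patients; this toy version scans every observed value as a candidate threshold.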


Circulation ◽  
2020 ◽  
Vol 142 (Suppl_3) ◽  
Author(s):  
Daniel N Silverman ◽  
Jeanne d de Lavallaz ◽  
Timothy B Plante ◽  
Margaret M Infeld ◽  
Markus Meyer

Introduction: Recent investigation has identified that discontinuation of beta-blockers in subjects with normal left ventricular ejection fraction (LVEF) leads to a reduction in natriuretic peptide levels. We investigated whether a similar trend would be seen in a hypertension clinical trial cohort. Methods: In 9,012 hypertensive subjects without a history of symptomatic heart failure, known LVEF <35%, or recent heart failure hospitalization enrolled in the Systolic Blood Pressure Intervention Trial (SPRINT), we compared the incidence of loop diuretic initiation and the time to initiation following the start of a new antihypertensive medication. The categorical relationship (new antihypertensive class followed by loop diuretic use) and the temporal relationship (time to loop diuretic initiation) were each analyzed, the former with Pearson’s chi-squared test and the latter with the Wilcoxon rank sum test. Bonferroni-corrected p-values were used for all comparisons. Results: Among the 9,012 subjects analyzed, the incidence of loop diuretic initiation was greatest following the start of a beta-blocker (16.6%) compared with the other antihypertensive medication classes (calcium channel blocker 13.8%, angiotensin-converting enzyme inhibitor/angiotensin receptor blocker 12.9%, and thiazide diuretic 10.2%; p<0.001). In addition, the median time between starting a new antihypertensive medication and a loop diuretic was shortest for beta-blockers and longest for thiazides (both p<0.01). No significant differences in renal function were identified between groups. Conclusion: Compared with other major classes of antihypertensive agents, starting beta-blockers was associated with more frequent and earlier initiation of loop diuretics in a population without heart failure at baseline. This finding may suggest beta-blocker-induced heart failure in a population with a predominantly normal ejection fraction.
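A Pearson chi-squared statistic for comparing initiation rates across drug classes, as in the Methods above, can be computed by hand for a 2 x k table. This is a generic sketch with hypothetical counts, not the SPRINT data; the resulting statistic would be compared against a chi-squared critical value for k-1 degrees of freedom (e.g. 3.84 for df = 1 at alpha = 0.05), with the alpha level tightened under Bonferroni correction for multiple comparisons.

```python
def chi_squared_2xk(events, totals):
    """Pearson chi-squared statistic (no continuity correction) for a
    2 x k table given per-group event counts and per-group totals."""
    non_events = [t - e for e, t in zip(events, totals)]
    grand_total = sum(totals)
    p_event = sum(events) / grand_total          # pooled event rate
    stat = 0.0
    for e, ne, t in zip(events, non_events, totals):
        exp_e = t * p_event                      # expected events
        exp_ne = t * (1.0 - p_event)             # expected non-events
        stat += (e - exp_e) ** 2 / exp_e + (ne - exp_ne) ** 2 / exp_ne
    return stat

# Hypothetical counts: loop-diuretic initiations per 100 starters of
# two drug classes (20/100 vs. 10/100).
stat = chi_squared_2xk([20, 10], [100, 100])
```

With k = 2 groups there is one degree of freedom, so a statistic above 3.84 would be significant at the unadjusted 5% level; under a Bonferroni correction for m comparisons the threshold corresponds to alpha/m instead.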


Chemotherapy ◽  
2018 ◽  
Vol 63 (6) ◽  
pp. 315-320 ◽  
Author(s):  
Matteo Sarocchi ◽  
Eleonora Arboscello ◽  
Giorgio Ghigliotti ◽  
Roberto Murialdo ◽  
Claudia Bighin ◽  
...  

Background: Patients developing cancer treatment-related left ventricular dysfunction (CTrLVD) require prompt therapy. Hypotension, dizziness, and fatigue often limit the use of angiotensin-converting enzyme inhibitors (ACEi), angiotensin receptor blockers (ARB), and β-blockers (BB) in cancer patients, who may already be afflicted by these symptoms. Ivabradine is a heart rate-lowering drug that does not cause hypotension and may be used in heart failure with reduced left ventricular ejection fraction (LVEF). Objective: The aim of this paper was to investigate the role of ivabradine in treating CTrLVD. Methods: A retrospective analysis was performed in a cohort of 30 patients with CTrLVD (LVEF < 50%) receiving ivabradine on top of the maximal tolerated dose of ACEi/ARB and BB. We evaluated cardiovascular treatment, oncologic treatment, LVEF, functional class (New York Heart Association [NYHA]), and fatigue during the study period. Results: Ivabradine was started at a dose of 2.5 mg b.i.d. in most patients and then carefully titrated. Hypotension (70%) and fatigue (77%) were the main factors limiting treatment with ACEi/ARB and BB. After a mean follow-up of 6.5 months, LVEF increased from 45.1% (SD = 6.4) to 53.2% (SD = 3.9; p < 0.001). When patients were analyzed according to the type of cancer therapy, no difference in LVEF changes across the groups was found. NYHA class improved in 11 patients, and fatigue improved in 8 patients. No serious cardiovascular side effects were reported. Conclusions: The ability to improve symptoms and LVEF in unfit cancer patients makes ivabradine a reasonable pharmacological tool for treating CTrLVD.


2009 ◽  
Vol 297 (2) ◽  
pp. H743-H749 ◽  
Author(s):  
Alexandru B. Chicos ◽  
Prince J. Kannankeril ◽  
Alan H. Kadish ◽  
Jeffrey J. Goldberger

Depressed parasympathetic activity has been proposed to be associated with an increased risk of sudden death. Parasympathetic effects (PE) on cardiac electrophysiology during exercise and recovery have not been studied in patients with left ventricular dysfunction. We performed noninvasive electrophysiological studies (NI-EPS) and characterized the electrophysiological properties of the sinus node, atrioventricular (AV) node, and ventricle in subjects with depressed left ventricular ejection fraction and dual-chamber defibrillators. NI-EPS were performed during rest, exercise, and recovery, at baseline and after parasympathetic blockade with atropine, to assess PE (the difference between parameter values in the two conditions). Ten subjects (9 men; age 60 ± 9 yr; left ventricular ejection fraction 29 ± 8%) completed the study. All NI-EPS parameters decreased during exercise and trended toward resting values during recovery. PE at rest, during exercise, and during recovery, respectively, were: on sinus cycle length, 320 ± 71 (P = 0.0001), 105 ± 60 (P = 0.0003), and 155 ± 82 ms (P = 0.0002); on AV block cycle length, 137 ± 136 (P = 0.09), 37 ± 19 (P = 0.002), and 61 ± 39 ms (P = 0.006); on AV interval, 58 ± 32 (P = 0.035), 22 ± 13 (P = 0.002), and 36 ± 20 ms (P = 0.001); on ventricular effective refractory period, 15.8 ± 11.3 (P = 0.02), 4.7 ± 15.2 (P = 0.38), and 6.8 ± 15.5 ms (P = 0.20); and on QT interval, 13 ± 12 (P = 0.13), 3 ± 17 (P = 0.6), and 20 ± 23 ms (P = 0.04). In conclusion, we describe for the first time the changes in cardiac electrophysiology and PE during rest, exercise, and recovery in subjects with left ventricular dysfunction. PE are preserved in these patients. Thus, the role of autonomic changes in the pathophysiology of sudden death requires further exploration.

