Serratus anterior plane block reduces perioperative sedation use and discharge analgesia in subcutaneous implantable defibrillator implantation

EP Europace ◽  
2021 ◽  
Vol 23 (Supplement_3) ◽  
Author(s):  
A Mistry ◽  
V Poornanachandran ◽  
H Dhutia ◽  
R Chelliah ◽  
R Pathmanathan

Abstract Funding Acknowledgements Type of funding sources: None. Background The subcutaneous implantable cardioverter-defibrillator (S-ICD) has become a well-recognised alternative to traditional devices and can be used as a first-line option, avoiding the risks associated with a transvenous lead. Standard implantation is performed either under general anaesthesia or under sedation. Ultrasound-guided serratus anterior plane block (SAPB) has recently been introduced to provide periprocedural anaesthesia and analgesia. Purpose To assess whether SAPB reduces periprocedural analgesia/anaesthesia and post-operative analgesia in S-ICD implantation compared with standard perioperative analgesia/anaesthesia. Methods One hundred and twenty patients eligible for ICD implantation for standard indications were offered an S-ICD over a five-year period (2014-2019) at a single tertiary cardiac centre. From July 2014 to September 2018, consecutive cases were performed with standard analgesia/anaesthesia using a standard two-incision technique. From October 2018 onwards, SAPB was performed in addition to standard perioperative analgesia/anaesthesia. This involved ultrasound-guided infiltration of 50 ml prilocaine into the interfascial plane between the serratus anterior muscle and latissimus dorsi at the mid-axillary line over the level of the 5th rib. Data were collected at the six-week follow-up, with all data obtained from a routinely collected local registry. Results The mean age at implant was 52.0 years (±15.9 years) and 102 (85.0%) were male. The mean body mass index was 27.9 (±5.2). 85 (70.8%) had a primary prevention indication. 64 (52.3%) patients had a left ventricular ejection fraction (LVEF) of <35%. 79 (65.8%) patients underwent standard implantation without SAPB (SAPB-) and 41 (34.2%) with SAPB (SAPB+).
There were no significant differences in age, sex, BMI, left ventricular ejection fraction, comorbidities, aetiology or indication between the SAPB- and SAPB+ cohorts. In the SAPB+ cohort, a greater proportion of procedures were performed under conscious sedation (97.5% vs 84.8%; p = 0.036) with a lower required dose of midazolam (3.3 mg vs 6.4 mg; p < 0.001). 34 (83%) patients in the SAPB+ cohort required no analgesia at discharge compared with 42 (53.2%) in the SAPB- cohort (p = 0.042). There was a trend towards lower use of periprocedural morphine (6.2 mg vs 7.4 mg; p = 0.071) and reduced hospital stay (0.7 days vs 1.1 days; p = 0.102) in the SAPB+ cohort. The use of SAPB did not significantly increase total procedural time (63 mins vs 57 mins; p = 0.110), defined as the total duration of SAPB administration and S-ICD implantation. There were no periprocedural complications and no complications at follow-up. Conclusion The use of SAPB significantly reduces the dose of sedation required for S-ICD implantation as well as the need for analgesia at discharge, without a significant impact on procedure duration.

Author(s):  
Rory Hachamovitch ◽  
Benjamin Nutter ◽  
Manuel D Cerqueira

Background. The use of implantable cardiac defibrillators (ICDs) has been associated with improved survival in several well-defined patient (pt) subsets. Their utilization for primary prevention in eligible pts, however, is unclear. We sought to examine the frequency of ICD implantation (ICD-IMP) for primary prevention in a cohort enrolled in a prospective, multicenter registry of ICD candidates. Methods. We identified 961 pts enrolled in the AdreView Myocardial Imaging for Risk Evaluation in Heart Failure (ADMIRE-HF) study, a prospective, multicenter study evaluating the prognostic usefulness of 123I-mIBG scintigraphy in a heart failure population. Inclusion criteria limited patients to those meeting guideline criteria for ICD implantation, including left ventricular ejection fraction ≤35% and New York Heart Association functional class II-III. We excluded pts with an ICD at the time of enrollment, leaving a study cohort of 934 patients. Pts were followed up for 24 months after enrollment. Pts undergoing ICD-IMP after enrollment for secondary prevention were censored at the time of intervention. The association between ICD-IMP utilization and demographic, clinical, laboratory, and imaging data was examined using Cox proportional hazards analysis (CPH). Results. Of 934 pts, 196 (21%) were referred for ICD-IMP over a mean follow-up of 612±242 days. Implantations occurred 167±164 days after enrollment. Patients referred for ICD were younger (61±12 vs. 63±12 years), but did not differ with respect to the proportion of females (17% vs. 21%), African-American race (12% vs. 15%), or diabetics (37% vs. 36%) (all p=NS). The frequency of ICD-IMP did not differ as a function of age, race, sex, LVEF, or imaging result (all p=NS). CPH revealed that a model including age, race, sex, diabetes, smoking, BMI, NYHA class, hypertension, heart failure etiology, and prior MI identified none of these as predictive of ICD-IMP.
Conclusion: This analysis of prospective registry data reveals that among patients who are guideline-defined candidates for ICD-IMP, only about one in five receives an ICD over a two-year follow-up interval. Multivariable modeling failed to identify any factor associated with ICD use.


Author(s):  
Parisa Gholami ◽  
Shoutzu Lin ◽  
Paul Heidenreich

Background: BNP testing is now common, though it is not clear whether the test results are used to improve patient care. A high BNP may indicate that the left ventricular ejection fraction (LVEF) is low (<40%), such that the patient would benefit from life-prolonging therapy. Objective: To determine how often clinicians obtained a measure of LVEF (echocardiography, nuclear) following a high BNP value when the LVEF was not known to be low (<40%). Methods and Results: We reviewed the medical records of 296 consecutive patients (inpatient or outpatient) with a BNP value of at least 200 pg/ml at a single medical center (tertiary hospital with 8 community clinics). A prior diagnosis of heart failure was made in 65%, while 42% had diabetes, 79% had hypertension, 59% had ischemic heart disease and 31% had chronic lung disease. The mean age was 73 ± 12 years; 75% were white, 10% black and 15% other; the mean BNP was 810 ± 814 pg/ml. The LVEF was known to be <40% in 84 patients (28%, mean BNP 1094 ± 969 pg/ml). Of the remaining 212 patients without a known low LVEF, 161 (76%) had a prior LVEF ≥40% (mean BNP 673 ± 635 pg/ml), and 51 (24%) had no prior LVEF documented (mean BNP 775 ± 926 pg/ml). Following the high BNP, a measure of LVEF was obtained (including outside studies documented by the primary care provider) within 6 months in only 53% (113 of 212) of those with an LVEF not known to be low. Of those with a follow-up echocardiogram, the LVEF was <40% in 18/113 (16%) and ≥40% in 95/113 (84%). There was no significant difference in mean initial BNP values between those with a follow-up LVEF <40% (872 ± 940 pg/ml), ≥40% (704 ± 737 pg/ml), or not done (661 ± 649 pg/ml; p=0.5). Conclusions: Follow-up measures of LVEF did not occur in almost 50% of patients with a high BNP, where the information may have led to institution of life-prolonging therapy.
Of those who did have a follow-up study, a new diagnosis of depressed LVEF was noted in 16%. Screening of existing BNP and LVEF data may be an efficient strategy to identify patients who may benefit from life-prolonging therapy for heart failure.


2020 ◽  
Vol 1 (1) ◽  
pp. 12-17
Author(s):  
Mehmet Küçükosmanoğlu ◽  
Cihan Örem

Introduction: The myocardial performance index (MPI) is an echocardiographic parameter that reflects global left ventricular function. NT-proBNP is an important diagnostic and prognostic factor in heart failure. In this study, we aimed to investigate the prognostic significance of serum NT-proBNP levels and MPI in patients with STEMI. Method: A total of 104 patients with a diagnosis of STEMI were included in the study. Patients were followed for 30 days and questioned for the presence of symptoms of heart failure (HF) and cardiac death. Patients were invited for outpatient control after 30 days and were divided into two groups: HF (+) and HF (-). Results: All 104 patients with STEMI were hospitalized in the coronary intensive care unit. Of these patients, 17 were female (16%) and 87 were male (84%); the mean age was 58.9±10.8 years. During the 30-day follow-up, 28 (27%) of the 104 patients developed HF. The mean age, hypertension ratio and anterior STEMI rate were significantly higher in the HF (+) group than in the HF (-) group. Ejection time (ET) and left ventricular ejection fraction (LVEF) were significantly lower, and MPI significantly higher, in the HF (+) group. When the values on the first and sixth days were compared, NT-proBNP levels decreased in both groups. There was no significant difference between the two groups in the change in MPI values between the first and sixth days. Multiple regression analysis showed that the presence of anterior MI, first-day NT-proBNP level and LVEF were independently associated with the development of HF and death. Conclusion: In our study, NT-proBNP levels were positively associated with MPI in patients with acute STEMI. We conclude that the NT-proBNP level, particularly on the first day, is more valuable than MPI in determining HF development and prognosis after STEMI.
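The abstract does not restate how MPI is computed; the conventional Tei-index definition is the sum of the isovolumic contraction and relaxation times divided by the ejection time, which also makes clear why the shorter ET observed in the HF (+) group pushes MPI up. A minimal illustrative sketch (the function name and sample intervals are our own, not study data):

```python
def myocardial_performance_index(ivct_ms: float, ivrt_ms: float, et_ms: float) -> float:
    """Tei index: (isovolumic contraction time + isovolumic relaxation time) / ejection time.

    All intervals in milliseconds; higher values indicate worse combined
    systolic/diastolic performance.
    """
    if et_ms <= 0:
        raise ValueError("ejection time must be positive")
    return (ivct_ms + ivrt_ms) / et_ms

# Illustrative only: a shorter ejection time raises MPI for the same isovolumic times.
print(round(myocardial_performance_index(70, 90, 280), 3))  # 0.571
print(round(myocardial_performance_index(70, 90, 240), 3))  # 0.667
```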


2021 ◽  
Vol 42 (Supplement_1) ◽  
Author(s):  
R Karmous ◽  
E Bennour ◽  
I Kammoun ◽  
A Sghaier ◽  
W Chaieb ◽  
...  

Abstract Background Cardio-oncology has arisen as one of the most rapidly expanding fields of cardiovascular medicine. The accumulated evidence on the possibilities of early diagnosis of cardiotoxicity provided by imaging techniques, as well as on the benefits of preventive and therapeutic interventions, is also increasing. Objective This study reports our echocardiography lab's initial experience with a cardio-oncology follow-up program. Methods We prospectively studied the outcomes of 107 patients diagnosed with breast cancer who attended our follow-up program between 2017 and 2020. Echocardiographic monitoring was performed according to the chemotherapy protocol. Cancer therapy-related cardiac dysfunction (CTRCD) is defined, according to the 2016 European Society of Cardiology (ESC) guidelines, as a drop in left ventricular ejection fraction (LVEF) by >10 percentage points from baseline to a value <50%. A newer entity, subclinical systolic dysfunction, is defined as a drop in global longitudinal strain (GLS) by >15% from baseline while LVEF remains >50%. The diagnosis should be confirmed by a second echocardiogram after 2–3 weeks. Results We enrolled 107 patients diagnosed with breast cancer and receiving anthracycline and/or trastuzumab. 27 patients were excluded: 17 were lost to follow-up and 10 had inadequate echo imaging (8 had follow-up without measurement of GLS and 2 were poorly echogenic). Eighty patients were finally retained. The average age was 49.9±10.8 years. The mean left ventricular ejection fraction (LVEF) was 64±4.4%. The incidence of CTRCD was 6%. The mean delay from the onset of chemotherapy to diagnosis was 174 days. CTRCD was reversible in 60% of cases after the initiation of cardioprotective treatment, which allowed continuation of the anti-cancer treatment. The incidence of subclinical cardiac dysfunction was 25%.
The mean delay between the initiation of anti-cancer treatment and the diagnosis was 314.5 days. Cardioprotective treatment with beta-blockers and angiotensin-converting enzyme (ACE) inhibitors was prescribed, and all these patients recovered a normal GLS within a mean of three months and pursued their chemotherapy. Conclusion We showed that timely cardiovascular evaluation, intervention and close monitoring in the context of a structured service allowed the majority of patients (99%) to pursue their anti-cancer treatment and to avoid progression to CTRCD in patients diagnosed with subclinical cardiac dysfunction. Funding Acknowledgement Type of funding sources: None. Figure: Treated subclinical cardiac dysfunction
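The ESC 2016 thresholds quoted above amount to a simple decision rule over baseline and follow-up measurements. The sketch below is our own illustrative encoding of that rule (the guideline additionally requires confirmation on a second echocardiogram after 2–3 weeks, which is not modelled here, and all names and sample values are hypothetical):

```python
def classify_cardiotoxicity(lvef_base: float, lvef_now: float,
                            gls_base: float, gls_now: float) -> str:
    """Encode the ESC 2016 criteria described in the abstract.

    CTRCD: LVEF drop >10 percentage points from baseline to a value <50%.
    Subclinical systolic dysfunction: relative GLS drop >15% from baseline
    with LVEF remaining >50%. GLS is negative; magnitudes are compared.
    """
    lvef_drop = lvef_base - lvef_now
    gls_relative_drop = (abs(gls_base) - abs(gls_now)) / abs(gls_base)
    if lvef_drop > 10 and lvef_now < 50:
        return "CTRCD"
    if gls_relative_drop > 0.15 and lvef_now > 50:
        return "subclinical systolic dysfunction"
    return "no dysfunction"

print(classify_cardiotoxicity(64, 48, -20.0, -18.0))  # CTRCD
print(classify_cardiotoxicity(64, 60, -20.0, -16.0))  # subclinical systolic dysfunction
```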


Circulation ◽  
2014 ◽  
Vol 130 (suppl_2) ◽  
Author(s):  
Yoshimori An ◽  
Kenji Ando ◽  
Michio Nagashima ◽  
Masato Fukunaga ◽  
Kenichi Hiroshima ◽  
...  

Background: There are still limited data on long-term mortality and the clinical factors influencing appropriate therapies in Japanese patients with an implantable cardioverter-defibrillator (ICD) for primary prevention who satisfied the criteria of the Multicenter Automatic Defibrillator Implantation Trial 2 (MADIT2). Methods: Between January 2000 and December 2012, a total of 436 patients without a prior ventricular arrhythmic event underwent ICD implantation for primary prevention at our institution. Among these patients, we enrolled 122 consecutive patients (69±10 years, male: 84%, biventricular pacing: 54%, median follow-up: 1390 days) who met the MADIT2 criteria: left ventricular ejection fraction (LVEF) ≤30% with ischemic heart disease, more than 4 weeks after myocardial infarction. Results: At 3 years of follow-up, the mortality rate (21%) was comparable with that of the original MADIT2 ICD group (20%). The Kaplan-Meier event rate for appropriate ICD therapy (shock and anti-tachycardia pacing) (35%) was also similar to that of the original MADIT2 ICD group (32%). Multivariate analysis by Cox regression revealed that left ventricular diastolic diameter (LVDd) ≥60 mm (Hazard Ratio [HR]: 1.65, 95% Confidence Interval [CI]: 1.16-2.14, P=0.004) and non-sustained ventricular tachycardia (NSVT) (HR: 1.55, 95% CI: 1.13-2.15, P=0.007) were independent predictors of appropriate ICD therapy. On the other hand, LVEF, NYHA class, biventricular pacing, amiodarone use and inducibility of ventricular arrhythmia were not associated with appropriate ICD therapy. Conclusion: Appropriate ICD therapy was delivered in Japanese primary prevention patients as often as in the original MADIT2 ICD group and was strongly predicted by a dilated left ventricle and NSVT.
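The Kaplan-Meier event rates compared above come from the standard product-limit estimator. The following self-contained sketch (on made-up follow-up data, not study data) shows how an event-free survival curve is built; the cumulative event rate at a given time is 1 minus the survival value:

```python
def kaplan_meier(times, events):
    """Product-limit estimate of event-free survival.

    times: follow-up time per patient; events: 1 = event (e.g. appropriate
    ICD therapy), 0 = censored. Returns (time, survival) pairs at event times.
    """
    at_risk = len(times)
    survival = 1.0
    curve = []
    # At tied times, process events before censorings (standard convention).
    for t, e in sorted(zip(times, events), key=lambda te: (te[0], -te[1])):
        if e:
            survival *= (at_risk - 1) / at_risk
            curve.append((t, survival))
        at_risk -= 1
    return curve

# Five illustrative patients: events at t=1, 3, 5; censored at t=2, 4.
print(kaplan_meier([1, 2, 3, 4, 5], [1, 0, 1, 0, 1]))
```

Censored patients drop out of the risk set without lowering the survival estimate, which is why censoring at secondary-prevention implantation (as in the ADMIRE-HF analysis above) does not bias the curve downward.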


Blood ◽  
2014 ◽  
Vol 124 (21) ◽  
pp. 1300-1300
Author(s):  
Deborah L Enns ◽  
David M Aboulafia

Abstract Purpose: Doxorubicin-based chemotherapy (DOX) is commonly administered to patients with diffuse large B-cell lymphoma (DLBCL). Prior to chemotherapy, left ventricular ejection fraction (LVEF) is routinely measured to assess left ventricular dysfunction. While LVEF screening is recommended by many national regulatory bodies, evidence supporting the usefulness of measuring LVEF prior to administering DOX is lacking. Our goal was to perform a retrospective analysis of patients with DLBCL to establish (1) how often LVEF was measured prior to administering DOX, and (2) whether the chemotherapy regimen was modified based on LVEF values. As cumulative doxorubicin doses greater than 400 mg/m2 have been associated with an increased risk of congestive heart failure (CHF), we also determined the incidence of CHF in patients with DLBCL who received DOX. Patients and Methods: We identified 268 patients diagnosed with DLBCL at Virginia Mason Medical Center between 2001 and 2012 and collected the following data: age at diagnosis; stage of lymphoma; type of chemotherapy given; cumulative doxorubicin dose (mg/m2); LVEF status; and incidence of CHF or cardiac disease. We also compared the number of CHF risk factors between patients who did and did not have LVEF measured. Statistical analyses included Fisher's exact or chi-squared tests to compare study groups as well as the number of CHF risk factors. The level of significance was set at a P value of <0.05. Results: LVEF was measured in 238 patients (89%) prior to initiation of chemotherapy. LVEF values were normal in 225 patients (95%) and low (<50%) in 13 patients (5%). Of the patients with normal LVEF, 193 received DOX (86%), and of these, 14 developed CHF post-treatment (7%). Of the 13 patients with low LVEF, 8 received DOX (62%) and 1 developed post-treatment CHF (13%). The remaining 30 patients did not have LVEF measured, and none received DOX.
Of the 268 patients studied, 176 are alive (66%) and 3 were lost to follow-up. The mean follow-up time was 43 months (range 3 d to 12.1 y). The mean number of CHF risk factors did not differ between patients who did and did not have LVEF measured (1.70 vs. 1.65, P = 0.87). Conclusion: Our results suggest that the decision to administer DOX was not directly affected by LVEF values. These findings challenge the existing policy of routinely screening patients with DLBCL with echocardiograms or MUGA scans prior to treatment. Disclosures No relevant conflicts of interest to declare.


EP Europace ◽  
2020 ◽  
Vol 22 (Supplement_1) ◽  
Author(s):  
A Zupan Meznar ◽  
D Zizek ◽  
U D Breskvar Kac ◽  
K Writzl ◽  
M Jan ◽  
...  

Abstract Background Truncating variants in the FLNC gene are associated with an overlapping phenotype of arrhythmogenic and dilated cardiomyopathy. There are reports of high arrhythmia propensity, with sudden cardiac death (SCD) often being the first symptom of the disease. It has been suggested that the current European guidelines' primary prevention (PP) recommendation on implantable cardioverter-defibrillator (ICD) implantation might not be applicable in these patients and that earlier intervention should be considered. Purpose We sought to investigate the arrhythmic burden in FLNC truncation carriers in our centre. Methods Adult FLNC truncation carriers diagnosed in our centre between 2018 and 2019 were included in the study. We retrospectively analysed clinical data and ICD follow-up reports in the cohort. Patients implanted with an ICD were divided into 3 groups: group A (secondary prevention ICD implantation), group B (PP indication according to the current guidelines: left ventricular ejection fraction (LVEF) below 35%) and group C (early PP: FLNC truncation carrier, LVEF <50% and late gadolinium enhancement on cardiac magnetic resonance). We report the number of patients experiencing SCD and the number of appropriate and inappropriate ICD therapies per group. Results Twenty-four adult patients from 3 families with three distinct FLNC truncating variants were identified. Ten (42%) were male; the average age was 45 ± 14 years. There were 3 (13%) SCDs in one family (two males and one female, 29-42 years old) and two (8%) aborted SCDs in the remaining two families (one male and one female, 66 and 51 years old). Altogether, eleven (46%) patients were implanted with an ICD. There were three patients in group A (2 aborted SCDs and 1 sustained ventricular tachycardia (VT)), two patients in group B and six patients in group C. Average ICD follow-up times were 42, 48 and 6 months for groups A, B and C, respectively.
Eight appropriate ICD therapies occurred in 3 patients (27%). In group A, there were four sustained VT episodes successfully converted with anti-tachycardia pacing (ATP) in two patients (67%); the average time to first therapy was 33 months. In group B, there was one appropriate shock for ventricular fibrillation (VF) and three sustained VT episodes in one patient (50%); time to first therapy was 60 months. After six months of follow-up, no appropriate therapies were registered in group C. Two patients (18%) experienced inappropriate shocks due to sinus tachycardia, one in group A and one in group C. Conclusion One-fifth of FLNC truncation carriers in our cohort experienced SCD. When patients received an ICD according to the current guidelines, the majority experienced appropriate ICD therapy. Further clinical studies with longer follow-up will be needed to define appropriate risk stratification and the optimal timing of prophylactic ICD intervention in these patients.


2019 ◽  
Vol 30 (3) ◽  
pp. 443-450
Author(s):  
Homare Okamura ◽  
Mamoru Arakawa ◽  
Naoyuki Kimura ◽  
Koichi Yuri ◽  
Atsushi Yamaguchi

Abstract OBJECTIVES We investigated the clinical and haemodynamic outcomes of elderly patients undergoing composite aortic root replacement. METHODS Between 2005 and 2017, 135 patients underwent aortic root surgery at our hospital. Of these, 47 patients aged ≥65 years were included in this study. Pathologies included aneurysm in 31, chronic aortic dissection in 6, acute aortic dissection in 4 and other causes in 6 patients. A bioprosthesis was used in 27 and a mechanical valve in 20 patients. The mean age was 71.0 ± 4.3 years. The mean follow-up period was 61 ± 35 months. Follow-up echocardiographic data (on average 48 months after surgery) were collected in 35 patients (74%). RESULTS The in-hospital mortality rate was 2.1% (1 patient). Seven late deaths occurred during follow-up. The 1-, 5- and 8-year overall survival was 93.6%, 82.9% and 82.9%, respectively. Infective endocarditis, Marfan syndrome and diabetes were independent predictors of poorer survival. During follow-up, thromboembolism occurred in 1 patient, major bleeding events in 5 patients, and proximal reoperation for prosthetic valve endocarditis in 1 patient. The type of valve, mechanical or biological, did not affect late mortality or morbidity. Follow-up echocardiography revealed significantly improved left ventricular ejection fraction compared with that at discharge. CONCLUSIONS Composite aortic root replacement provided satisfactory midterm outcomes in patients aged ≥65 years. Further studies with longer follow-up are warranted to evaluate late valve-related events.


Circulation ◽  
2014 ◽  
Vol 130 (suppl_2) ◽  
Author(s):  
Giselle L Peixoto ◽  
Rodrigo O Madia ◽  
Sérgio F Siqueira ◽  
Mariana M Lensi ◽  
Silvana D Nishioka ◽  
...  

Introduction: Chagas disease causes different clinical expressions in the heart, and permanent pacing due to bradycardia is not uncommon. Objective: Predictors of death in patients with Chagas cardiomyopathy requiring a pacemaker (PM) are unknown; identifying them is the aim of this study. Methods: We prospectively evaluated 529 patients included in the Pacinchagas Study (Risk Stratification in Pacemaker Patients with Chagas Cardiomyopathy), whose primary objective is to create a risk score to predict death in this population. Patients completed an extensive questionnaire covering clinical (NYHA class, symptoms, comorbidities and medications), functional (electrocardiography, Holter and echocardiography) and electronic variables (burden of pacing and arrhythmias). Patients with at least 6 months of follow-up were included in this preliminary analysis. Results: The cohort included 337 (63.7%) females; the mean age was 62.3±11.9 years and 63.1% were in NYHA class I. The indication for PM implantation was atrioventricular block, sick sinus syndrome, atrial fibrillation with slow ventricular response, and unknown in 72.0%, 20.4%, 5.1% and 2.5%, respectively. During a mean follow-up of 1.5±0.6 years, 62 (11.7%) patients died. The mean time since PM implantation did not differ between patients who died and survivors (11.9±9.0 years versus 11.1±8.6 years, P=0.503). Twenty-five deaths (40.3%) were sudden, 22 (35.5%) were due to heart failure, 6 (9.7%) were due to other cardiovascular causes, and 7 (11.3%) were due to noncardiovascular causes. The cause of death could not be determined in two patients (3.2%). Cox proportional hazards analysis identified three predictors of death: NYHA class III/IV (Hazard Ratio [HR] 5.661; 95% Confidence Interval [95% CI] 2.617-12.245; P<0.001); left ventricular ejection fraction (LVEF) ≤42% (HR 2.779; 95% CI 1.299-5.945; P=0.008) and chronic kidney disease (HR 2.635; 95% CI 1.167-5.948; P=0.020).
Conclusions: This analysis of the Pacinchagas study identified, over a mean follow-up of one and a half years, three predictors of death in PM users with Chagas cardiomyopathy: NYHA class III/IV, LVEF ≤42% and chronic kidney disease.


2019 ◽  
Vol 65 (11) ◽  
pp. 1391-1396
Author(s):  
Luiz Carlos Santana Passos ◽  
Rodrigo Morel Vieira de Melo ◽  
Yasmin Menezes Lira ◽  
Natalia Ferreira Cardoso de Oliveira ◽  
Thiago Trindade ◽  
...  

SUMMARY BACKGROUND: Cardiac resynchronization therapy (CRT) is a therapeutic modality for patients with heart failure (HF). The effectiveness of this treatment for event reduction is based on clinical trials in which patients with Chagas disease (CD) are underrepresented. OBJECTIVE: To evaluate the prognosis after CRT in a population in which CD is an endemic cause of HF. METHODS: A retrospective cohort conducted between January 2015 and December 2016 that included patients with HF and a left ventricular ejection fraction (LVEF) of less than 35% undergoing CRT. Clinical and demographic data were collected to search for predictors of the combined outcome of death or hospitalization for HF at one year after CRT implantation. RESULTS: Fifty-four patients were evaluated, and 13 (24.1%) had CD as the etiology of HF. The mean LVEF was 26.2±6.1%, and 36 (66.7%) patients presented with functional class III or IV HF. After a mean follow-up of 15 (±6.9) months, 17 (32.1%) patients reached the combined outcome. In univariate analysis, CD was associated with the combined event when compared with other etiologies of HF (8 (47%) vs. 9 (13.5%), RR: 3.91, CI: 1.46–10.45, p=0.007), as were lower values of LVEF. In multivariate analysis, CD and LVEF remained independent risk factors for the combined outcome. CONCLUSION: In a population of HF patients undergoing CRT, CD was independently associated with mortality and hospitalization for HF.

