Subsequent shock delivery and outcomes in out-of-hospital cardiac arrests with initial unshockable rhythm

EP Europace ◽  
2021 ◽  
Vol 23 (Supplement_3) ◽  
Author(s):  
Y Goto ◽  
A Funada ◽  
T Maeda ◽  
Y Goto

Abstract Funding Acknowledgements Type of funding sources: Public grant(s) – National budget only. Main funding source(s): The Japan Society for the Promotion of Science (Grant-in-Aid for Scientific Research) Background The conversion from an initial non-shockable to a shockable rhythm during cardiopulmonary resuscitation (CPR) by emergency medical service (EMS) providers may be associated with neurologically intact survival after out-of-hospital cardiac arrest (OHCA). However, the prognostic significance of rhythm conversion according to the type of initial non-shockable rhythm is unclear. Purpose To determine the association of subsequent shock after conversion to a shockable rhythm with neurologically intact survival after OHCA, stratified by shock delivery time (time from EMS-initiated CPR to first shock delivery), in patients with two types of initial non-shockable rhythm. Methods We analyzed the records of 90,334 adult patients with witnessed OHCA of cardiac origin who were treated by EMS providers and had an initial non-shockable rhythm. Data were obtained from a prospectively recorded Japanese nationwide Utstein-style database for a 5-year period (2013–2017). The primary outcome was 1-month neurologically intact survival, defined as a cerebral performance category score of 1 or 2. Patients were divided into initial pulseless electrical activity (PEA) (n = 37,977 [42.0%]) and initial asystole (n = 52,357 [58.0%]) groups. Results In the initial PEA group, the crude rate of 1-month neurologically intact survival was significantly higher in the subsequently shocked than in the non-shocked patients (4.2% [121/2,896] vs. 2.4% [857/35,081], p < 0.0001). After adjustment for ten prehospital variables, the adjusted odds ratios (aORs) of subsequent shock for 1-month neurologically intact survival compared to no shock delivery were as follows: shock delivery time <10 min, 2.21 (95% confidence interval [CI], 1.77–2.77; p < 0.0001); 10–14 min, 1.43 (0.89–2.28; p = 0.14); and ≥15 min, 0.36 (0.16–0.81; p = 0.013). In the initial asystole group, the crude rate of 1-month neurologically intact survival was significantly higher in the subsequently shocked than in the non-shocked patients (1.7% [47/2,687] vs. 0.4% [203/49,670], p < 0.0001). A multivariate logistic regression model showed that subsequent shock with a shock delivery time <10 min was associated with increased odds of neurologically intact survival compared to no shock delivery (aOR, 5.67; 95% CI, 3.92–8.18; p < 0.0001). However, there were no significant differences in neurological outcomes between subsequently shocked and non-shocked patients when the shock delivery time was 10–14 min (p = 0.21) or ≥15 min (p = 0.91). Conclusions In patients with witnessed OHCA of cardiac origin and an initial non-shockable rhythm, subsequent shock after conversion to a shockable rhythm during CPR was associated with increased odds of 1-month neurologically intact survival only when the shock was delivered <10 min from EMS-initiated CPR, regardless of the type of initial rhythm. Further, in patients with initial PEA, subsequent shock was associated with decreased odds of neurologically intact survival when the shock was delivered ≥15 min from EMS-initiated CPR.
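
The adjusted odds ratios above come from a multivariate logistic regression. As a rough, hedged illustration of how such aORs are typically obtained (not the authors' actual code; the covariate names and synthetic data below are hypothetical), one could fit the model and exponentiate the coefficient of interest:

```python
# Hypothetical sketch: adjusted odds ratio (aOR) of subsequent shock for
# 1-month neurologically intact survival, controlling for prehospital covariates.
# Variable names and the synthetic data are illustrative, not the study dataset.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 5000
df = pd.DataFrame({
    "survival_cpc12": rng.binomial(1, 0.03, n),    # 1-month CPC 1-2
    "subsequent_shock": rng.binomial(1, 0.08, n),  # shocked after rhythm conversion
    "age": rng.normal(70, 15, n),
    "bystander_cpr": rng.binomial(1, 0.5, n),
})

model = smf.logit(
    "survival_cpc12 ~ subsequent_shock + age + bystander_cpr", data=df
).fit(disp=False)

aor = np.exp(model.params["subsequent_shock"])
ci_low, ci_high = np.exp(model.conf_int().loc["subsequent_shock"])
print(f"aOR {aor:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")
```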

EP Europace ◽  
2021 ◽  
Vol 23 (Supplement_3) ◽  
Author(s):  
Y Goto ◽  
A Funada ◽  
T Maeda ◽  
Y Goto

Abstract Funding Acknowledgements Type of funding sources: Public grant(s) – National budget only. Main funding source(s): The Japan Society for the Promotion of Science (Grant-in-Aid for Scientific Research) Background/Introduction: The rhythm conversion from an initial non-shockable to a shockable rhythm during cardiopulmonary resuscitation (CPR) by emergency medical services (EMS) providers may be associated with neurologically intact survival after out-of-hospital cardiac arrest (OHCA) in children with an initial non-shockable rhythm. However, the prognostic significance of rhythm conversion stratified by the type of initial non-shockable rhythm is still unclear. Purpose We aimed to investigate the association of subsequent shock after conversion to a shockable rhythm with neurologically intact survival, stratified by shock delivery time (time from EMS-initiated CPR to first shock delivery) and by the type of initial non-shockable rhythm, in children with OHCA. Methods We analysed the records of 19,095 children (age <18 years) with OHCA treated by EMS providers. Data were obtained from a prospectively recorded Japanese nationwide Utstein-style database for a 13-year period (2005–2017). The primary outcome measure was 1-month neurologically intact survival, defined as a cerebral performance category score of 1 or 2. Patients were divided into the initial pulseless electrical activity (PEA) (n = 3,326 [17.4%]) and initial asystole (n = 15,769 [82.6%]) groups. Results The proportion of patients who received subsequent shock after conversion to a shockable rhythm was significantly higher in the initial PEA group than in the initial asystole group (3.3% [109/3,326] vs. 1.4% [227/15,769], p < 0.0001). The shock delivery time was significantly shorter in the initial PEA group than in the initial asystole group (median [IQR], 8 min [5–12 min] vs. 10 min [6–16 min], p < 0.01). Among the initial PEA patients, there was no significant difference between subsequently shocked (10.0% [11/109]) and subsequently non-shocked patients (6.0% [192/3,217], p = 0.10) regarding the rate of 1-month neurologically intact survival. However, after adjusting for 9 pre-hospital variables, subsequent shock with a delivery time of <10 min was associated with increased odds of neurologically intact survival compared with no shock delivery (adjusted odds ratio [OR], 2.45; 95% confidence interval [CI], 1.16–5.16; p = 0.018). Among the initial asystole patients, the rate of 1-month neurologically intact survival was significantly higher in the subsequently shocked (4.4% [10/227]) than in the subsequently non-shocked patients (0.7% [106/15,542], p < 0.0001). A multivariate logistic regression model showed that subsequent shock with a delivery time of <10 min was associated with increased odds of neurologically intact survival compared with no shock delivery (adjusted OR, 9.77; 95% CI, 4.2–22.5; p < 0.0001). Conclusions In children with OHCA and an initial non-shockable rhythm, subsequent shock after conversion to a shockable rhythm during CPR was associated with increased odds of 1-month neurologically intact survival only when the shock was delivered <10 min from EMS-initiated CPR, regardless of the type of initial rhythm.


Circulation ◽  
2019 ◽  
Vol 140 (Suppl_2) ◽  
Author(s):  
Hill Stoecklein ◽  
Andrew Pugh ◽  
Michael Stroud ◽  
Scott Youngquist

Introduction: Recognition and rapid defibrillation of shockable rhythms is strongly associated with increased survival from out-of-hospital cardiac arrest (OHCA). The Salt Lake City Fire Department (SLCFD) adopted ECG rhythm-filtering technology in 2011, along with a protocol to rapidly defibrillate shockable rhythms without awaiting the end of the 2-minute CPR epoch. Paramedics were also trained to empirically shock asystole, as studies have shown poor agreement in cases of fine and moderate amplitude ventricular fibrillation (VF). Hypothesis: We hypothesized that the mandate to shock perceived asystole plus the use of filtering technology would result in high case sensitivity for shockable rhythms at the expense of an unknown frequency of shock delivery to organized rhythms. Methods: Prospectively collected defibrillator data from cardiac arrest cases treated by SLCFD between Dec 2011 and June 2019 were analyzed. Timing of rhythm changes and defibrillation events was manually abstracted using the manufacturer’s review software. The gold standard for rhythm interpretation was post-incident physician interpretation. Results: Paramedics attempted resuscitation in 942 OHCAs. We excluded 41 pediatric cases, 140 cases of BLS or bystander-only AED resuscitation, and 65 cases in which the defibrillator file was unavailable. Overall, 696 adult cardiac arrests with 1,389 shocks delivered were available for analysis. Shocks were delivered to 958 (69%) shockable, 261 (19%) asystole, 158 (11%) PEA, 4 (0.3%) SVT, and 8 (0.6%) unknown underlying rhythms. In 280 cases, no shock was delivered; 3 of these cases had an initial shockable rhythm. Shock delivery case sensitivity was 180/183 (0.98, 95% confidence interval [CI] 0.97-1.0), with a false positive proportion of delivered shocks of 158/1,389 (0.11, 95% CI 0.10-0.13) for PEA only and 419/1,389 (0.30, 95% CI 0.28-0.33) for combined PEA and asystole. Neurologically intact (CPC 1-2) overall and Utstein survival rates were 15% and 46%, respectively. Conclusions: Using ECG rhythm-filtering technology and an aggressive protocol to defibrillate VF and empirically shock asystole, we demonstrated high case sensitivity for VF at the expense of an 11% rate of shock delivery to underlying PEA.
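
The case sensitivity and false-positive proportions above are binomial proportions with 95% confidence intervals. A minimal sketch of that computation, using the counts quoted in the abstract, is shown below; the Wilson interval is an assumption, since the abstract does not state which interval method was used.

```python
# Sketch: case sensitivity and false-positive shock proportions with 95% CIs.
# Counts are taken from the abstract; the Wilson interval is an assumed method.
from statsmodels.stats.proportion import proportion_confint

def prop_with_ci(successes, total, label):
    low, high = proportion_confint(successes, total, alpha=0.05, method="wilson")
    print(f"{label}: {successes}/{total} = {successes / total:.2f} "
          f"(95% CI {low:.2f}-{high:.2f})")

prop_with_ci(180, 183, "Case sensitivity for shockable rhythms")
prop_with_ci(158, 1389, "Shocks delivered to underlying PEA")
prop_with_ci(419, 1389, "Shocks delivered to PEA or asystole")
```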


CJEM ◽  
2018 ◽  
Vol 20 (S1) ◽  
pp. S7-S7
Author(s):  
A. Cournoyer ◽  
E. Notebaert ◽  
S. Cossette ◽  
J. Morris ◽  
L. de Montigny ◽  
...  

Introduction: Patients suffering from out-of-hospital cardiac arrest (OHCA) with an initial shockable rhythm (ventricular tachycardia or ventricular fibrillation) have higher odds of survival than those with a non-shockable rhythm (asystole or pulseless electrical activity). Because of that prognostic significance, patients with an initial non-shockable rhythm are often not considered for advanced resuscitation therapies such as extracorporeal resuscitation. However, the prognostic significance of the conversion from an initially non-shockable rhythm to a shockable rhythm remains uncertain. This study aimed to determine the degree of association between the conversion (or not) of a non-shockable rhythm to a shockable rhythm and resuscitation outcomes in patients with OHCA. It was hypothesized that such a conversion would be associated with higher survival to discharge. Methods: The present study used a registry of adult OHCA between 2010 and 2015 in Montreal, Canada. Adult patients with non-traumatic OHCA and an initial non-shockable rhythm were included. The primary outcome measure was survival to hospital discharge, and the secondary outcome measure was prehospital return of spontaneous circulation (ROSC). The associations of interest were evaluated with univariate logistic regressions and multivariate models controlling for demographic and clinical variables (e.g. age, gender, type of initial non-shockable rhythm, witnessed arrest, bystander cardiopulmonary resuscitation). Assuming a survival rate of 3% and that 25% of the variability would be explained by the control variables, including more than 4580 patients would allow detection of an absolute difference of 4% in survival between the two groups with a power of more than 90%. Results: A total of 4893 patients (2869 men and 2024 women) with a mean age of 70 years (standard deviation 17) were included, of whom 450 (9.2%) experienced a conversion to a shockable rhythm during the course of their prehospital resuscitation. Among all patients, 146 (3.0%) survived to discharge and 633 (12.9%) experienced prehospital ROSC. In the univariate models, there was no association between the conversion to a shockable rhythm and survival (odds ratio [OR] 1.14 [95% confidence interval {CI} 0.66-1.95]), but a significant association was observed with ROSC (OR 2.00 [95% CI 1.57-2.55], p<0.001). However, there was no independent association between the conversion to a shockable rhythm and either survival (adjusted OR [AOR] 0.92 [95% CI 0.51-1.66], p=0.78) or prehospital ROSC (AOR 1.30 [95% CI 0.98-1.72], p=0.073). Conclusion: There is no clinically significant association between the conversion to a shockable rhythm and resuscitation outcomes in patients suffering from OHCA. The initial rhythm remains a much better outcome predictor than subsequent rhythms and should be preferred when evaluating eligibility for advanced resuscitation procedures.
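
The sample-size statement in the Methods corresponds to a two-proportion power calculation (3% vs. 7% survival) with an unequal allocation ratio and an allowance for covariate-explained variability. The sketch below illustrates that kind of calculation under stated assumptions (an allocation ratio of roughly 1:10 and a Hsieh-style inflation for the 25% of variability explained by covariates); it is not the authors' actual computation and will only approximate the quoted figure of 4580.

```python
# Sketch: approximate sample size to detect 3% vs. 7% survival with 90% power,
# assuming ~1 rhythm conversion per 10 non-conversions and a Hsieh-style
# inflation for covariates explaining 25% of the variability (both assumptions;
# the abstract does not detail the authors' exact method).
from statsmodels.stats.power import NormalIndPower
from statsmodels.stats.proportion import proportion_effectsize

p_no_conversion, p_conversion = 0.03, 0.07                      # +4% absolute difference
effect = proportion_effectsize(p_conversion, p_no_conversion)   # Cohen's h

n_conversion = NormalIndPower().solve_power(
    effect_size=effect, alpha=0.05, power=0.90, ratio=10, alternative="two-sided"
)
total_n = n_conversion * (1 + 10)
total_adjusted = total_n / (1 - 0.25)                           # inflate for covariates
print(f"~{total_n:.0f} unadjusted, ~{total_adjusted:.0f} after covariate inflation")
```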


Circulation ◽  
2014 ◽  
Vol 130 (suppl_2) ◽  
Author(s):  
Robert J Preston ◽  
Chris Stratford ◽  
Peter Taillac ◽  
Scott Youngquist

BACKGROUND: The identification and treatment of a reversible cause of arrest is considered an important step in post-arrest care. Observational data suggest a survival benefit from the performance of coronary angiography in post-cardiac arrest patients who survive to hospital admission. We hypothesized that the benefit of this therapy would be diminished in victims who presented with non-shockable rhythms. METHODS: We analyzed a prospectively collected registry of all cardiac arrests in the state of Utah from July 1, 2012, through December 31, 2013. We compared outcomes of patients surviving to hospital admission, stratified by presenting rhythm and performance of angiography. Logistic regression was used to control for confounders. RESULTS: During the study period, 464/1,534 (30.2%) out-of-hospital cardiac arrest victims survived to hospital admission. Coronary angiography was performed in 133 (28.7%). Neurologically intact survival was higher in patients with shockable rhythms who underwent angiography compared with those who did not (73% vs 32%, p<0.0001). However, this survival benefit was not apparent in patients with a presenting rhythm of asystole or pulseless electrical activity (17% vs 20%, p=0.7). When stratified by presenting rhythm and controlling for age, witnessed arrest, bystander CPR, and gender, only victims with an initial shockable rhythm demonstrated increased odds of neurologically intact survival (OR 5.9, 95% CI 2.9-11.7, p<0.0001). CONCLUSIONS: Our analysis suggests that post-arrest angiography, on average, provides a survival advantage only among victims with an initial shockable rhythm. However, these data are observational and prone to selection bias and residual confounding.


2020 ◽  
Vol 41 (Supplement_2) ◽  
Author(s):  
A Riva ◽  
A Camporeale ◽  
F Sturla ◽  
S Pica ◽  
L Tondi ◽  
...  

Abstract Background Ischemic cardiomyopathy (ICM) is often associated with negative LV remodelling after myocardial infarction, sometimes resulting in impaired LV function and dilation (iDCM). 4D Flow CMR has recently been used to assess intracardiac hemodynamic changes in the presence of LV remodelling. Purpose To quantify 4D Flow intracardiac kinetic energy (KE) and viscous energy loss (EL) and investigate their relation with LV dysfunction and remodelling. Methods Patients with prior anterior myocardial infarction underwent a CMR study with acquisition of 4D Flow sequences; they were divided into ICM (n=10) and iDCM (n=10, EDV >208 ml and EF <40%) groups. Ten controls were used for comparison. The LV was semi-automatically segmented using short-axis CMR stacks and co-registered with the 4D Flow data. Global KE and EL were computed over the cardiac cycle. NT-proBNP measurements were correlated with average and peak values during systole and diastole. Results Both LV volume and EF differed significantly (P<0.0001) between iDCM (EDV=294±56 ml, EF=24±8%), ICM (EDV=181±32 ml, EF=34±6%) and controls (EDV=124±29 ml, EF=72±5%). Compared with controls, both ICM and iDCM showed significantly lower KE (P≤0.0008); EL, though lower than in controls, was higher in iDCM than in ICM. Within the iDCM subgroup, diastolic mean KE and peak EL showed good inverse correlations with NT-proBNP (r=−0.75 and r=−0.69, respectively). EL indexed to average KE during systole (ELI) was higher in the entire ischemic group than in controls (ELI(ischemic) = 0.17 vs. ELI(controls) = 0.10, P=0.0054). Conclusions 4D Flow analyses effectively mapped post-ischemic LV energetic changes, highlighting the disproportionate intraventricular EL relative to the KE produced; the preliminary correlation between LV energetic changes and NT-proBNP deserves further investigation as a possible contributor to early detection of heart failure. Funding Acknowledgement Type of funding source: Public grant(s) – National budget only. Main funding source(s): Italian Ministry of Health
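
The abstract does not give the exact formulations of KE and EL. The expressions below are the standard definitions commonly used in 4D Flow studies (voxel-wise kinetic energy and the rate of viscous energy loss from the dissipation function of incompressible flow) and are offered only as a plausible reading of the Methods:

```latex
% Assumed standard 4D Flow definitions (not stated in the abstract):
% kinetic energy summed over LV voxels, and the rate of viscous energy loss
% from the viscous dissipation function of incompressible flow.
\[
\mathrm{KE}(t) = \sum_{i \in \mathrm{LV}} \tfrac{1}{2}\,\rho\, V_i\, \lvert \mathbf{v}_i(t) \rvert^{2},
\qquad
\dot{\mathrm{EL}}(t) = \mu \sum_{i \in \mathrm{LV}} \Phi_i(t)\, V_i,
\qquad
\Phi = \tfrac{1}{2} \sum_{j,k} \left( \frac{\partial v_j}{\partial x_k} + \frac{\partial v_k}{\partial x_j} \right)^{\!2},
\]
where $\rho$ is blood density, $\mu$ dynamic viscosity, $V_i$ the voxel volume and
$\mathbf{v}_i$ the measured velocity; EL over a cardiac phase is the time integral of
$\dot{\mathrm{EL}}(t)$.
```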


2021 ◽  
Vol 22 (Supplement_1) ◽  
Author(s):  
B Domenech-Ximenos ◽  
M Sanz-De La Garza ◽  
A Sepulveda-Martinez ◽  
D Lorenzatti ◽  
F Simard ◽  
...  

Abstract Funding Acknowledgements Type of funding sources: Public grant(s) – National budget only. Main funding source(s): Plan Nacional I.D., Del Programa Estatal de Fomento De La Investigación Científica y Técnica de Excelencia, Subprograma De Generación Del Conocimiento, Ministerio de Economía y Competitividad 2013. Background Myocardial deformation integrated with cardiac dimensions provides a comprehensive assessment of the ventricular remodelling patterns induced by the cumulative effects of intensive exercise. Feature tracking (FT) can measure myocardial deformation from cardiac magnetic resonance (CMR) cine sequences; however, its accuracy is still scarcely validated. Purpose Our aim was to compare FT’s accuracy and reproducibility to speckle tracking echocardiography (STE) in highly trained endurance athletes (EAs). Methods 93 EAs (>12 hours training/week during the last 5 years, 52% male, 35 ± 5.1 years) and 72 age-matched controls underwent resting CMR and transthoracic echocardiography to assess biventricular exercise-induced remodelling and biventricular global longitudinal strain (GLS) by CMR-FT and STE. Results High endurance training load was associated with larger bi-ventricular and bi-atrial sizes and mildly reduced systolic function of both ventricles (p < 0.05). Strain values (both by CMR-FT and STE) proportionally decreased with increasing ventricular volumes, potentially depicting the increased volumetric and functional biventricular reserve that characterizes the EA heart. Strain values were lower when assessed by CMR-FT as compared to STE (p < 0.001), with good reproducibility for the LV (bias = 3.94%, LOA = ±4.27%) but wider variability for RV strains (Figure 2). Conclusions Biventricular longitudinal strain values were lower when assessed by FT compared to STE. Both methods were comparable when measuring LV strain but not RV strain. These differences might be justified by FT’s lower in-plane spatial and temporal resolution, which is particularly relevant for the complex anatomy of the RV. Abstract figure: Bland-Altman plots, FT vs. STE.
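
The bias and limits of agreement (LOA) quoted for LV strain come from a Bland-Altman analysis. A minimal sketch of that computation on hypothetical paired strain values (placeholders, not study data) is shown below:

```python
# Sketch: Bland-Altman bias and 95% limits of agreement for paired strain
# measurements from two modalities. The example values are placeholders.
import numpy as np

ft_gls = np.array([-17.2, -18.5, -16.9, -19.1, -18.0])   # CMR feature tracking
ste_gls = np.array([-20.8, -21.4, -20.1, -22.6, -21.0])  # speckle tracking echo

diff = ft_gls - ste_gls
bias = diff.mean()
loa = 1.96 * diff.std(ddof=1)            # half-width of the limits of agreement
print(f"bias = {bias:.2f}%, LOA = ±{loa:.2f}% "
      f"({bias - loa:.2f}% to {bias + loa:.2f}%)")
```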


2021 ◽  
Vol 28 (Supplement_1) ◽  
Author(s):  
Y Imai ◽  
M Sakurai ◽  
H Nakagawa ◽  
A Hirata ◽  
Y Murakami ◽  
...  

Abstract Funding Acknowledgements Type of funding sources: Public grant(s) – National budget only. Main funding source(s): H20–Junkankitou [Seishuu]–Ippan–013; H23–Junkankitou [Seishuu]–Ippan–005; H26-Junkankitou [Seisaku]-Ippan-001; H29–Junkankitou–Ippan–003 and 20FA1002 OnBehalf EPOCH-JAPAN Introduction Lifetime risk (LTR), an absolute risk measure, is a useful estimate for risk communication compared with short-term or relative risk, especially for young people. Proteinuria is a leading cause of end-stage kidney disease (ESKD) and an independent risk factor for cardiovascular disease (CVD). Although non-proteinuric renal disease contributes substantially to the global burden of ESKD, it has received little attention. To date, there have been no reports on the impact of proteinuria and low eGFR on the LTR of CVD death in an Asian population. Purpose We aimed to estimate the LTR of CVD death stratified by the status of proteinuria and low eGFR. Methods We used a modified Kaplan-Meier approach to estimate the remaining lifetime risk of cardiovascular death based on the EPOCH-JAPAN (Evidence for Cardiovascular Prevention From Observational Cohorts in Japan) database. LTR was estimated at each index age starting from 40 years for those with and without proteinuria, stratified by low eGFR, defined as eGFR <60 ml/min/1.73 m². Participants were classified into three groups: those with proteinuria (Proteinuria (+)), those without proteinuria with low eGFR (Proteinuria (-)/Low eGFR (+)), and those without proteinuria and without low eGFR (Proteinuria (-)/Low eGFR (-)). Results A total of 47,292 participants from 9 cohorts were included in the analysis. The mean follow-up period was 14.6 years, with 690,463 person-years, and total CVD deaths numbered 1,075 in men and 1,193 in women. The LTRs at the index age of 40 years were as follows: 17.7% (95% confidence interval: 15.4–19.0%) in the Proteinuria (-)/Low eGFR (-) group, 26.2% (20.2–31.1%) in the Proteinuria (-)/Low eGFR (+) group, and 24.5% (15.1–29.3%) in the Proteinuria (+) group for men; and 15.3% (13.7–16.5%), 29.9% (14.7–46.8%), and 28.3% (19.4–34.7%), respectively, for women. Conclusions We observed that those without proteinuria but with low eGFR had an LTR as high as that of those with proteinuria. These results indicate that even in the absence of proteinuria, low eGFR has a high impact on LTR. Lifestyle modification from a young age is necessary to prevent renal dysfunction.
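
A "modified Kaplan-Meier approach" for lifetime risk typically uses age as the time scale and treats death from non-CVD causes as a competing risk. The sketch below shows one common way to compute such a cumulative incidence (an Aalen-Johansen-style estimator) on synthetic data; it is an illustration of the general technique, not the EPOCH-JAPAN analysis code.

```python
# Sketch: competing-risk cumulative incidence of CVD death with age as the time
# scale (one way to implement a "modified Kaplan-Meier" lifetime-risk estimate).
# Synthetic data; event codes: 0 = censored, 1 = CVD death, 2 = non-CVD death.
import numpy as np

rng = np.random.default_rng(1)
n = 10_000
age_at_exit = rng.uniform(40, 95, n)                      # age at event or censoring
event = rng.choice([0, 1, 2], size=n, p=[0.55, 0.10, 0.35])

def lifetime_risk_cvd(age_exit, event, index_age=40.0):
    """Aalen-Johansen-style estimate of cumulative incidence of event code 1."""
    keep = age_exit > index_age
    age_exit, event = age_exit[keep], event[keep]
    order = np.argsort(age_exit)
    event = event[order]

    n_at_risk = len(event)
    surv = 1.0          # all-cause survival just before the current age
    cif = 0.0           # cumulative incidence of CVD death
    for ev in event:
        if ev == 1:
            cif += surv * (1.0 / n_at_risk)   # hazard of CVD death at this age
        if ev in (1, 2):
            surv *= 1.0 - 1.0 / n_at_risk     # any death depletes survival
        n_at_risk -= 1
    return cif

print(f"Estimated lifetime risk of CVD death from age 40: "
      f"{100 * lifetime_risk_cvd(age_at_exit, event):.1f}%")
```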


2021 ◽  
Vol 28 (Supplement_1) ◽  
Author(s):  
I Chaikovsky ◽  
A Popov ◽  
D Fogel ◽  
A Kazmirchyk

Abstract Funding Acknowledgements Type of funding sources: Public grant(s) – National budget only. Main funding source(s): National Academy of Science of Ukraine Background The electrocardiogram (ECG) is still the primary source of diagnostic and prognostic information about cardiovascular diseases. The concept of "normal ECG" parameters is crucial for reliable diagnosis, since it provides a reference for the ECG under examination. With the development of new methods and tools for ECG feature extraction and classification based on artificial intelligence (AI), it becomes possible to identify subtle changes in heart activity and detect possible abnormalities at an early stage. The challenge of this work is to identify deviations in the ECGs of clinically healthy persons from a conditional "population" norm. Methods The normal ECG is described as a feature vector composed of the time-magnitude parameters of the signal-averaged ECG (SAECG). To identify subjects that may deviate from the "population" norm, an outlier detection approach is proposed: first, a cloud of feature vectors was constructed from a set of normal ECGs obtained from young, clinically similar healthy persons. A particular ECG is then considered deviant, and requiring the attention of a clinician, when it is an outlier relative to this cloud of normal ECGs. In the experiment, SAECGs from a group of 139 young subjects (male, age 18-28 years) with no reported cardiovascular problems were used to extract 34 features from the SAECG leads (magnitudes and durations of ECG waves, durations of ECG segments, etc.). The ECGs were routinely previewed by qualified physicians, and no obvious anomalies were noticed. The Isolation Forest anomaly detection method was used with variable numbers of trees and different contamination parameters. Results The proportion of outliers ranged from 5 to 10% (7-12 subjects) across the various numbers of estimator trees. Seven outlier SAECGs appeared repeatedly across settings. Of these, 4 subjects were the oldest persons in the group examined, and the 3 others had rare ventricular premature beats during routine ECG examination. Conclusion The proposed method is promising for application in routine and express ECG tests, since it is able to quantify subtle deviations from the normal ECG group.
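
The abstract names the Isolation Forest method with varying numbers of trees and contamination settings. A minimal sketch of that procedure with scikit-learn is shown below; the 139 x 34 feature matrix is synthetic, a stand-in for the real SAECG measurements.

```python
# Sketch: Isolation Forest outlier detection over SAECG feature vectors.
# The 139 x 34 feature matrix is synthetic; in the study it would hold
# time-magnitude parameters extracted from each subject's signal-averaged ECG.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)
X = rng.normal(size=(139, 34))            # placeholder SAECG feature vectors

for n_trees in (50, 100, 200):
    for contamination in (0.05, 0.075, 0.10):
        model = IsolationForest(n_estimators=n_trees,
                                contamination=contamination,
                                random_state=0)
        labels = model.fit_predict(X)     # -1 = outlier, 1 = inlier
        outliers = np.flatnonzero(labels == -1)
        print(f"trees={n_trees}, contamination={contamination}: "
              f"{len(outliers)} outliers -> subjects {outliers.tolist()}")
```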


2020 ◽  
Vol 41 (Supplement_2) ◽  
Author(s):  
T Koopsen ◽  
N Van Osta ◽  
E Willemen ◽  
F.A Van Nieuwenhoven ◽  
J Gorcsan ◽  
...  

Abstract Background/Introduction The mechanical properties of infarcted myocardium are important determinants of cardiac pump function and of the risk of developing heart failure following myocardial infarction (MI). Purpose To better understand, using a computational model, the effects of infarct stiffness on compensatory hypertrophy and dilation of non-infarcted tissue in the left (LV) and right ventricle (RV). Methods The CircAdapt computational model of the human heart and circulation was applied to simulate an acute MI involving 20% of LV wall mass. The simulation was validated using previously published experimental data. Subsequently, two degrees of increased infarct stiffness were simulated. In all three simulations, a model of structural myocardial adaptation of the non-infarcted tissue was applied, based on sensing of mechanical loading of myocytes and extracellular matrix (ECM). Results Mild and severe stiffening of the infarct reduced the increase of LV end-diastolic volume (EDV) from +23 mL to +17 mL and +16 mL, respectively, and the increase of LV non-infarcted tissue mass from +31% to +21% and +18%. RV EDV decreased after adaptation, and mild and severe infarct stiffening reduced the decrease of RV EDV from −21 mL to −12 mL and −10 mL, respectively. The increase of RV tissue mass was reduced from +13% to +8% and +7% with mild and severe infarct stiffening. In the LV, reduced dilation and hypertrophy were driven mainly by a reduction of maximum stress in the ECM and a higher stress between the myocytes and ECM following infarct stiffening. The decreased RV hypertrophy, but not the EDV reduction, was caused by a reduction of maximum RV ECM stress and maximum RV active myofiber stress. Conclusions Model simulations predicted that a stiffened LV infarct reduces both LV and RV non-infarcted tissue hypertrophy as well as LV dilation. In LV remodeling, maximum ECM stress and stress between myocyte and ECM played a more prominent role than in RV remodeling, while maximum active stress was more important in the RV. Figure: overview of all model simulations. Funding Acknowledgement Type of funding source: Public grant(s) – National budget only. Main funding source(s): This work was funded by the Netherlands Organisation for Scientific Research and the Dutch Heart Foundation.

