In-hospital gastrointestinal bleeding in patients with acute myocardial infarction: incidence, outcomes and risk factors analysis from China Acute Myocardial Infarction Registry

BMJ Open, 2021, Vol 11 (9), pp. e044117
Author(s): Wence Shi, Xiaoxue Fan, Jingang Yang, Lin Ni, Shuhong Su, ...

Objective: To investigate the incidence of gastrointestinal bleeding (GIB) in patients with acute myocardial infarction (AMI), clarify the association between GIB and adverse clinical outcomes, and identify risk factors for in-hospital GIB after AMI.
Design: Retrospective cohort study.
Setting: 108 hospitals across three levels in China.
Participants: From 1 January 2013 to 31 August 2014, after excluding 2659 patients because of incorrect age or missing GIB data, 23 794 patients with AMI from 108 hospitals enrolled in the China Acute Myocardial Infarction Registry were divided into GIB-positive (n=282) and GIB-negative (n=23 512) groups and compared.
Primary and secondary outcome measures: Major adverse cardiovascular and cerebrovascular events (MACCEs), a composite of all-cause death, reinfarction and stroke. The association between GIB and endpoints was examined using multivariate logistic regression and Cox proportional hazards models. Independent risk factors associated with GIB were identified using multivariate logistic regression analysis.
Results: The incidence of in-hospital GIB in patients with AMI was 1.19%. GIB was significantly associated with an increased risk of MACCEs both in-hospital (OR 2.314; p<0.001) and at 2-year follow-up (HR 1.407; p=0.0008). Glycoprotein IIb/IIIa (GPIIb/IIIa) receptor inhibitor use, percutaneous coronary intervention (PCI) and thrombolysis were novel independent risk factors for GIB identified in the Chinese AMI population (p<0.05).
Conclusions: GIB is associated with both in-hospital and follow-up MACCEs. Gastrointestinal prophylactic treatment should be administered to patients with AMI who receive primary PCI, thrombolytic therapy or a GPIIb/IIIa receptor inhibitor.
Trial registration number: NCT01874691.
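
A minimal sketch of the two-model analysis described above (multivariate logistic regression for the in-hospital endpoint, a Cox proportional hazards model for the 2-year follow-up), assuming a hypothetical patient-level extract; the file name, covariate set and column names are illustrative, not taken from the registry:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from lifelines import CoxPHFitter

df = pd.read_csv("cami_registry.csv")  # hypothetical one-row-per-patient extract

# In-hospital MACCE: multivariate logistic regression with GIB as the exposure.
covariates = ["gib", "age", "killip_class", "prior_mi"]  # illustrative covariates
logit_fit = sm.Logit(df["macce_inhospital"],
                     sm.add_constant(df[covariates])).fit(disp=0)
print(np.exp(logit_fit.params["gib"]))  # odds ratio for GIB (reported OR 2.314)

# 2-year MACCE: Cox proportional hazards on follow-up time.
cph = CoxPHFitter()
cph.fit(df[["followup_years", "macce_2yr"] + covariates],
        duration_col="followup_years", event_col="macce_2yr")
cph.print_summary()  # exp(coef) for gib corresponds to the reported HR 1.407
```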

BMJ Open, 2019, Vol 9 (9), pp. e025628
Author(s): Ailifeire Maimaiti, Yang Li, Yong-Tao Wang, Xiang Yang, Xiao-Mei Li, ...

Objective: Insufficient myocardial reperfusion during primary percutaneous coronary intervention (PPCI) for acute myocardial infarction (AMI) strongly influences prognosis. The aim of this study was to investigate the association of the platelet-to-lymphocyte ratio (PLR) with myocardial reperfusion and in-hospital major adverse cardiac events (MACEs) in patients with AMI undergoing PPCI.
Design: Retrospective cohort study.
Setting: Two tertiary hospitals.
Participants: A total of 445 consecutive patients with AMI who underwent PPCI between January 2015 and December 2017 were enrolled. Patients were divided into two groups based on the PLR value: those with PLR values in the third tertile were defined as the high-PLR group (n=150), and those in the lower two tertiles as the low-PLR group (n=295). Explicit inclusion and exclusion criteria were applied.
Interventions: No interventions.
Primary and secondary outcome measures: Primary outcome measures were cardiovascular death, reinfarction or target vessel revascularisation. Secondary outcome measures were stroke, non-lethal myocardial infarction, ventricular tachycardia/ventricular fibrillation and in-hospital mortality.
Results: Compared with the low-PLR group, the high-PLR group had a higher rate of insufficient myocardial perfusion (23% vs 13%, p=0.003), a higher rate of postprocedural thrombolysis in myocardial infarction (TIMI) flow grade 0–2 (17% vs 10%, p=0.037), a higher rate of myocardial blush grade 0–1 (11% vs 4%, p=0.007) and higher B-type natriuretic peptide (BNP) (614±600 vs 316±429, p<0.001). Multivariate logistic regression analysis indicated that the independent risk factors for impaired myocardial perfusion were high PLR (OR 1.256, 95% CI 1.003 to 1.579, p=0.056) and high BNP (OR 1.328, 95% CI 1.056 to 1.670, p=0.015). The high-PLR group also had significantly more MACEs (43% vs 32%, p=0.029).
Conclusions: This study suggested that high PLR and high BNP were independent risk factors for insufficient myocardial reperfusion in patients with AMI. Higher PLR was related to advanced heart failure and in-hospital MACEs in patients with AMI undergoing PPCI.
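
The tertile-based grouping on PLR described in the participants section translates into a few lines of code; a small sketch assuming hypothetical column names (platelet and lymphocyte counts in the same units):

```python
import pandas as pd

df = pd.read_csv("ppci_cohort.csv")  # hypothetical extract of the 445 patients
df["plr"] = df["platelet_count"] / df["lymphocyte_count"]

# Third tertile = high-PLR group; lower two tertiles = low-PLR group.
tertile = pd.qcut(df["plr"], q=3, labels=[1, 2, 3])
df["high_plr"] = (tertile == 3).astype(int)
print(df["high_plr"].value_counts())  # expect roughly a 1:2 split (150 vs 295)
```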


2021, pp. 1-11
Author(s): Yini Wang, Xueqin Gao, Zhenjuan Zhao, Ling Li, Guojie Liu, ...

Abstract
Background: Type D personality and depression are independent psychological risk factors for adverse outcomes in cardiovascular patients. The aim of this study was to examine the combined effect of Type D personality and depression on clinical outcomes in patients suffering from acute myocardial infarction (AMI).
Methods: This prospective cohort study included 3568 patients diagnosed with AMI between February 2017 and September 2018. Type D personality and depression were assessed at baseline, while the major adverse cardiac event (MACE) rate (cardiac death, recurrent non-fatal myocardial infarction, revascularization, and stroke) and in-stent restenosis (ISR) rate were analyzed after a 2-year follow-up period.
Results: A total of 437 patients developed MACEs and 185 had ISR during the follow-up period. The Type D (+) depression (+) and Type D (+) depression (−) groups had a higher risk of MACE [95% confidence interval (CI) 1.74–6.07 and 1.25–2.96, respectively] and of ISR (95% CI 3.09–8.28 and 1.85–6.22, respectively). Analysis of Type D and depression as continuous variables indicated that the main effects of Type D and depression and their combined effect were significantly associated with MACE and ISR. Moreover, Type D (+) depression (+) and Type D (+) depression (−) emerged as significant risk factors for MACE and ISR in males, while only Type D (+) depression (+) was associated with MACE and ISR in female patients.
Conclusions: These findings suggest that patients with both depression and Type D personality are at a higher risk of adverse cardiovascular outcomes. Individual assessment of Type D personality and depression, and comprehensive interventions, are required.
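
One common way to estimate a combined effect like the one reported above is to code the four Type D/depression combinations as a single categorical exposure with Type D (−) depression (−) as the reference. A minimal sketch, assuming hypothetical 0/1 columns type_d and depression and a binary MACE indicator; the study's actual models and covariates may differ:

```python
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("ami_psych_cohort.csv")  # hypothetical extract
df["group"] = (df["type_d"].map({0: "TypeD-", 1: "TypeD+"})
               + "/" + df["depression"].map({0: "Dep-", 1: "Dep+"}))

# Logistic model for MACE with the four combined groups as one factor;
# Treatment(reference=...) makes TypeD-/Dep- the comparison group.
fit = smf.logit("mace ~ C(group, Treatment(reference='TypeD-/Dep-')) + age + sex",
                data=df).fit(disp=0)
print(fit.summary())  # exponentiate the group coefficients for per-group ORs
```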


2021
Author(s): Tao Chang, Xigang Yan, Chao Zhao, Yufu Zhang, Bao Wang, ...

Abstract
Background: There are few studies on the development and effect of coagulopathy in patients with a traumatic brain injury (TBI) during the early post-operative period. We determined the risk factors and neurologic outcomes in patients with a TBI and coagulopathy diagnosed by routine laboratory tests within 72 hours post-operatively.
Methods: The baseline characteristics, intra-operative management, and follow-up results of 462 patients with TBIs treated from January 2015 to June 2019 were obtained and retrospectively analyzed by multivariate logistic regression. Coagulopathy was defined as an activated partial thromboplastin time (aPTT) > 40 seconds, an international normalized ratio (INR) > 1.4, or a platelet count < 100×10⁹/L.
Results: Multivariate logistic regression analysis revealed that the Glasgow Coma Scale (GCS) score at admission, Injury Severity Score (ISS) at admission, pupil mydriasis, duration of surgery, intra-operative blood loss, and intra-operative crystalloid resuscitation were independent risk factors for developing a coagulopathy post-operatively. There were statistically significant differences in mortality (p = 0.049), the Glasgow Outcome Scale-Extended (GOS-E; p = 0.024), and the modified Rankin Scale (p = 0.043) between patients with and without coagulopathy 1 week after surgery. Coagulopathy within 72 h after surgery showed a trend toward higher mortality at 1 week (66.7%), 3 months (71.4%), and 6 months (76.2%). Furthermore, coagulopathy and contusion expansion in the early post-operative period were independent risk factors for TBI mortality after surgery. Intra-operative crystalloid resuscitation had substantial diagnostic accuracy in predicting coagulopathy within 72 h post-operatively (area under the curve [AUC] = 0.972).
Conclusion: Coagulopathy within 72 h post-operatively in patients with a TBI predicted worse disease progression and unfavorable neurologic outcomes. Hence, practical and reasonable measures to manage these risk factors may protect patients with a TBI from post-operative coagulopathy.
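
The laboratory definition of coagulopathy used above translates directly into code; a trivial sketch with illustrative function and parameter names:

```python
def is_coagulopathic(aptt_s: float, inr: float, platelets_e9_per_l: float) -> bool:
    """Study definition: aPTT > 40 s, INR > 1.4, or platelets < 100 x 10^9/L."""
    return aptt_s > 40.0 or inr > 1.4 or platelets_e9_per_l < 100.0

# Example: an INR of 1.5 alone is enough to flag the patient.
print(is_coagulopathic(aptt_s=35.2, inr=1.5, platelets_e9_per_l=180.0))  # True
```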


2021, pp. 1-7
Author(s): Norimasa Ikeda, Shunsuke Fujibayashi, Bungo Otsuki, Kazutaka Masamoto, Takayoshi Shimizu, ...

OBJECTIVE The goal of this study was to investigate clinical outcomes and risk factors for the progression of sacroiliac joint (SIJ) degeneration and bone formation after S2 alar-iliac screw (S2AIS) insertion.
METHODS Using preoperative and follow-up CT scan findings (median follow-up 26 months, range 16–43 months), the authors retrospectively studied 100 SIJs in 50 patients who underwent S2AIS placement. The authors measured the progression of SIJ degeneration and bone formation after S2AIS insertion, postoperative new-onset SIJ pain, S2AIS-related reoperation, and instrumentation failures. Stepwise multivariate logistic regression modeling was performed to clarify the risk factors associated with the progression of SIJ degeneration.
RESULTS Significant progression of SIJ degeneration was observed in 10% of the group with preoperative SIJ degeneration (p = 0.01). Bone formation was observed in 6.9% of joints. None of the patients with these radiographic changes had new-onset SIJ pain or underwent reoperation related to instrumentation failures. Multivariate logistic regression analysis revealed that preoperative SIJ degeneration (p < 0.01) and a young age at surgery (p = 0.03) significantly affected the progression of SIJ degeneration.
CONCLUSIONS The progression of SIJ degeneration and bone formation neither led to major screw-related complications nor affected the postoperative clinical course during the median follow-up period of 26 months. Although S2AIS insertion is a safe procedure for most patients, the results of this study suggested that preoperative degeneration and younger age at surgery affected SIJ degeneration after S2AIS insertion. Further long-term observation may reveal other effects of S2AIS insertion on SIJ degeneration.
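
The stepwise multivariate logistic regression modeling mentioned in the methods can be sketched generically as forward selection by AIC; variable names are placeholders and the authors' actual candidate set and stopping criterion may differ:

```python
import pandas as pd
import statsmodels.formula.api as smf

def forward_stepwise_logit(df: pd.DataFrame, outcome: str, candidates: list) -> list:
    """Greedily add the predictor that most improves (lowers) the AIC."""
    selected, pool = [], list(candidates)
    best_aic = smf.logit(f"{outcome} ~ 1", data=df).fit(disp=0).aic
    improved = True
    while improved and pool:
        improved = False
        aic, var = min((smf.logit(f"{outcome} ~ " + " + ".join(selected + [v]),
                                  data=df).fit(disp=0).aic, v) for v in pool)
        if aic < best_aic:
            best_aic, improved = aic, True
            selected.append(var)
            pool.remove(var)
    return selected

# e.g. forward_stepwise_logit(df, "sij_progression",
#                             ["age", "preop_degeneration", "fusion_levels"])
```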


2020
Author(s): Sufen Zhou, Hongyan Guo, Heng Liu, Mingqun Li

Abstract
Background: This study aimed to investigate potential predictors, including the cerebroplacental ratio (CPR) and the middle cerebral artery (MCA)/uterine artery pulsatility index (PI) ratio, of adverse perinatal outcome in pregnancies at term.
Methods: This was an observational, prospective study of recruited pregnancies at term. Adverse perinatal outcome was set as the primary observational endpoint. Receiver operating characteristic (ROC) curves were plotted to investigate the predictive value and cut-off values of risk factors for adverse perinatal outcome. Independent predictors (maternal, neonatal, prenatal ultrasound and Doppler variables) of adverse perinatal outcome were evaluated by univariate and multivariate logistic regression analyses.
Results: A total of 392 pregnancies at term were included, and 19.4% of them experienced an adverse perinatal outcome. CPR (OR: 0.42, 95% CI: 0.20-0.93, P=0.032) and the MCA/uterine artery PI ratio (OR: 0.25, 95% CI: 0.16-0.42, P=0.032) were two independent predictors of adverse perinatal outcome by univariate and multivariate logistic regression analyses.
Conclusions: The MCA/uterine artery PI ratio is a good predictor of adverse perinatal outcome in pregnancies at term.
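
A minimal sketch of the ROC cut-off step described in the methods, assuming hypothetical column names and using Youden's J to pick the threshold (the abstract does not state which cut-off rule was used); note the PI ratio is protective (ORs below 1), so it is negated to act as a risk score:

```python
import numpy as np
import pandas as pd
from sklearn.metrics import roc_curve, roc_auc_score

df = pd.read_csv("term_pregnancy_cohort.csv")  # hypothetical extract
y = df["adverse_outcome"].to_numpy()           # 0/1 primary endpoint
score = -df["mca_uta_pi_ratio"].to_numpy()     # negate: lower ratio = higher risk

fpr, tpr, thresholds = roc_curve(y, score)
print("AUC:", roc_auc_score(y, score))

j = tpr - fpr                        # Youden's J = sensitivity + specificity - 1
cutoff = -thresholds[np.argmax(j)]   # undo the negation to recover the ratio
print("MCA/uterine artery PI ratio cut-off:", cutoff)
```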


Author(s): Daniel A Jones, Paul Wright, Momin A Alizadeh, Sadeer Fhadil, Krishnaraj S Rathod, ...

Abstract
Aim: Current guidelines recommend the use of a vitamin K antagonist (VKA) for up to 3–6 months for treatment of left ventricular (LV) thrombus post-acute myocardial infarction (AMI). However, based on evidence supporting non-inferiority of novel oral anticoagulants (NOACs) compared with VKA for other indications, such as deep vein thrombosis, pulmonary embolism and thromboembolic prevention in atrial fibrillation, NOACs are increasingly used off licence for the treatment of LV thrombus post-AMI. In this study, we investigated the safety and effect of NOACs compared with VKA on LV thrombus resolution in patients presenting with AMI.
Methods and results: This was an observational study of 2328 consecutive patients undergoing coronary angiography ± percutaneous coronary intervention (PCI) for AMI between May 2015 and December 2018 at a UK cardiac centre. Patients' details were collected from the hospital electronic database. The primary endpoint was the rate of LV thrombus resolution, with bleeding rates as a secondary outcome. LV thrombus was diagnosed in 101 (4.3%) patients. Sixty patients (59.4%) were started on VKA and 41 patients (40.6%) on NOAC therapy (rivaroxaban: 58.5%, apixaban: 36.5%, edoxaban: 5.0%). Both groups were well matched in terms of baseline characteristics, including age, previous cardiac history (previous myocardial infarction, PCI, coronary artery bypass grafting) and cardiovascular risk factors (hypertension, diabetes, hypercholesterolaemia). Over the follow-up period (median 2.2 years), the overall rate of LV thrombus resolution was 86.1%. There was greater and earlier LV thrombus resolution in the NOAC group than in patients treated with warfarin (82% vs. 64.4% at 1 year, P = 0.0018), which persisted after adjusting for baseline variables (odds ratio 1.8, 95% confidence interval 1.2–2.9). Major bleeding events during the follow-up period were lower in the NOAC group than in the VKA group (0% vs. 6.7%, P = 0.030), with no difference in rates of systemic thromboembolism (5% vs. 2.4%, P = 0.388).
Conclusion: These data suggest improved thrombus resolution in post-acute coronary syndrome (ACS) LV thrombosis in patients treated with NOACs compared with VKAs. This improvement in thrombus resolution was accompanied by a better safety profile for NOAC-treated versus VKA-treated patients, providing data to support a randomized trial to answer this question.


2019, Vol 40 (Supplement_1)
Author(s): R Hoogeveen, J P Belo Pereira, V Zampoleri, M J Bom, W Koenig, ...

Abstract
Background: Currently used models to predict cardiovascular event risk have limited value. It has been shown repeatedly that the addition of single biomarkers has modest impact. Recently we observed that a model consisting of a larger array of plasma proteins performed very well in predicting the presence of vulnerable plaques in primary prevention patients. However, the validation of this protein panel in predicting cardiovascular outcomes remains to be established.
Purpose: This study investigated the ability of 384 preselected protein biomarkers to predict acute myocardial infarction, using state-of-the-art machine learning techniques. Secondly, we compared the performance of this multi-protein risk model to traditional risk engines.
Methods: We selected 822 subjects from the EPIC-Norfolk prospective cohort study, of whom 411 suffered a myocardial infarction during follow-up (median 15 years), compared with 411 controls who remained event-free (median follow-up 20 years). The 384 proteins were measured using proximity extension assay technology. Machine learning algorithms (random forests) were used for the prediction of acute myocardial infarction (ICD codes I21–22). Performance of the model was tested against, and on top of, traditional risk factors for cardiovascular disease (refit Framingham). All performance measurements were averaged over several stability selection routines.
Results: Prediction of myocardial infarction using a machine-learning model consisting of 50 plasma proteins resulted in a ROC AUC of 0.74±0.14, compared with 0.69±0.17 using traditional risk factors (refit Framingham). Combining the proteins and refit Framingham resulted in a ROC AUC of 0.74±0.15. Focusing on events occurring within 3 years after baseline blood withdrawal, the ROC AUC increased to 0.80±0.09 using 50 plasma proteins, as opposed to 0.67±0.22 using refit Framingham (figure). Combining the protein model with refit Framingham resulted in a ROC AUC of 0.82±0.11 for these events.
[Figure: diagnostic performance for events occurring within 3 years.]
Conclusion: High-throughput proteomics outperforms traditional risk factors in the prediction of acute myocardial infarction. Prediction of myocardial infarction occurring within 3 years after inclusion showed the highest performance. The availability of affordable proteomic approaches and mature machine learning paves the way for clinical implementation of these models in cardiovascular risk prediction.
Acknowledgement/Funding: This study was funded by an ERA-CVD grant (JTC2017) and an EU Horizon 2020 grant (REPROGRAM, 667837).
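
A schematic version of the prediction pipeline described above: a random forest on protein features with ROC AUC averaged over repeated splits, standing in for the stability-selection averaging. The data here are synthetic placeholders for the 384-protein panel, so the numbers will not reproduce the reported AUCs:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import StratifiedShuffleSplit

# Synthetic stand-in: 822 subjects, 384 "proteins", ~50 informative features.
X, y = make_classification(n_samples=822, n_features=384, n_informative=50,
                           random_state=0)

aucs = []
splitter = StratifiedShuffleSplit(n_splits=25, test_size=0.25, random_state=0)
for train, test in splitter.split(X, y):
    rf = RandomForestClassifier(n_estimators=500, random_state=0, n_jobs=-1)
    rf.fit(X[train], y[train])
    aucs.append(roc_auc_score(y[test], rf.predict_proba(X[test])[:, 1]))

print(f"ROC AUC: {np.mean(aucs):.2f} +/- {np.std(aucs):.2f}")
```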


2019, Vol 30 (5), pp. 655-663
Author(s): Wei Shi, Shan Wang, Huifang Zhang, Guoqin Wang, Yi Guo, ...

OBJECTIVE Laminoplasty has been used in recent years as an alternative to laminectomy for preventing spinal deformity after resection of intramedullary spinal cord tumors (IMSCTs). However, controversy exists regarding its real role in maintaining postoperative spinal alignment. The purpose of this study was to examine the incidence of progressive spinal deformity in patients who underwent laminoplasty for resection of IMSCTs and to identify risk factors for progressive spinal deformity.
METHODS Data from patients with IMSCTs who had undergone laminoplasty at Beijing Tsinghua Changgung Hospital between January 2014 and December 2016 were retrospectively reviewed. Univariate tests and multivariate logistic regression analysis were used to assess the statistical relationship between postoperative spinal deformity and radiographic, clinical, and surgical variables.
RESULTS One hundred five patients (mean age 37.0 ± 14.5 years) met the criteria for inclusion in the study. Gross-total resection (> 95%) was obtained in 79 cases (75.2%). Twenty-seven (25.7%) of the 105 patients had spinal deformity preoperatively, and 10 (9.5%) new cases of postoperative progressive deformity were detected. The mean duration of follow-up was 27.6 months (SD 14.5 months, median 26.3 months, range 6.2–40.7 months). At last follow-up, the median functional scores of the patients who developed progressive spinal deformity were worse than those of the patients who did not (modified McCormick Scale: 3 vs 2, p = 0.04). In the univariate analysis, age (p = 0.01), preoperative spinal deformity (p < 0.01), extent of tumor involvement (p < 0.01), extent of abnormal tumor signal (p = 0.02), and extent of laminoplasty (p < 0.01) were identified as factors associated with postoperative progressive spinal deformity. However, in the subsequent multivariate logistic regression analysis, only age ≤ 25 years and preoperative spinal deformity emerged as independent risk factors, increasing the odds of postoperative progressive deformity by 4.1- and 12.4-fold, respectively (p < 0.05).
CONCLUSIONS Progressive spinal deformity was identified in 25.7% of patients who had undergone laminoplasty for IMSCT resection and was related to decreased functional status. Younger age (≤ 25 years) and preoperative spinal deformity increased the risk of postoperative progressive spinal deformity. This risk warrants serious consideration of concurrent fusion during IMSCT resection or close follow-up after laminoplasty.


Blood, 2014, Vol 124 (21), pp. 5809-5809
Author(s): Xiaoqin Feng, Lina Long, Chunfu Li

Abstract
Objective: This retrospective study evaluated the risk factors involved in changes in anti-HBs (HBsAb) status in patients with thalassemia major at a single center in China.
Methods: A total of 104 children who underwent allo-HSCT using the NF-08-TM transplant protocol in our center between January 2010 and June 2012 were recruited. Hepatitis B markers, including HBsAg, anti-HBs, HBeAg, anti-HBe and anti-HBc, were examined by TRFIA (time-resolved fluoroimmunoassay) or ELISA (enzyme-linked immunosorbent assay) for recipients before and after allo-HSCT (at least up to 6 months) and for donors prior to transplantation. HBsAg-positive recipients and donors received lamivudine antiviral therapy before allo-HSCT, and the treatment was continued in recipients up to 6 months post-transplantation. The demographic and clinical characteristics of the patients and their donors were summarized by descriptive statistics. To identify risk factors that influenced post-transplant anti-HBs loss and HBV reactivation, both univariate and multivariate logistic regression were used, and odds ratios (ORs) and 95% confidence intervals (CIs) were determined for covariates shown to be statistically significant. All tests were 2-sided, with the type I error rate fixed at 0.05. Statistical analyses were performed using IBM SPSS 20 (SPSS Statistics V20, IBM Corporation, Somers, New York).
Results: Of the 104 patients, 2 (1.9%) were positive for HBsAg and 102 (98.1%) were negative. Of the 102 patients negative for HBsAg before transplantation, 71 (69.6%) were anti-HBs positive. Of the 104 donors, 99 (95.2%) were negative for HBsAg and 5 (4.8%) were positive. Of the 99 HBsAg-negative donors, 72 (72.7%) had anti-HBs. After transplantation, 27 of 69 patients (39.1%) lost their HBV immunity over a median follow-up of 30 months (range: 21–45); the remaining 42 (60.9%) maintained immunity against HBV over a median follow-up of 28.5 months (range: 19–46). Thirty-three patients were anti-HBs negative before allo-HSCT: 11 with donors who had no anti-HBs and 22 with donors who had anti-HBs. After allo-HSCT, 15 of these 33 patients were found to have newly gained HBV immunity, as represented by the presence of anti-HBs. While 14 of those who developed adoptive immunity had immunized donors (63.6%; 14 of 22), 1 (9.1%; 1 of 11) with a non-immunized donor (a donor without anti-HBs) also developed HBV immunity. Multivariate logistic regression analysis of the 104 patients who underwent allo-HSCT revealed that a pre-HSCT anti-HBs titer < 257.47 mIU/mL (adjusted OR 10.5, 95% CI 2.1–53.3) and an anti-HBs-immunized donor (adjusted OR 51.3, 95% CI 2.8–938.6) were significant predictors of post-allo-HSCT loss and acquisition of HBV immunity, respectively. In addition, the post-transplant HBV reactivation rate was 11.1%.
Conclusions: Current results indicate that the pre-transplant anti-HBs titer is a key determinant of the loss of HBV immunity after allo-HSCT, and that anti-HBs-negative patients with immunized donors are more likely to gain HBV immunity after allo-HSCT than those with non-immunized donors. Further, preemptive antiviral treatment with lamivudine significantly reduces HBV reactivation. This is the first study to identify significant predictors of changes in anti-HBs status in children with thalassemia major.
Disclosures: No relevant conflicts of interest to declare.
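
As a worked check of the odds-ratio arithmetic behind findings like the donor-immunity result above, the crude OR from the quoted 2x2 counts (14/22 recipients with immunized donors vs 1/11 with non-immunized donors gained anti-HBs) can be computed directly; the paper's adjusted ORs come from multivariate models, so the crude value will differ:

```python
import math

a, b = 14, 22 - 14   # immunized donor: gained anti-HBs / did not
c, d = 1, 11 - 1     # non-immunized donor: gained anti-HBs / did not

or_crude = (a * d) / (b * c)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)          # Woolf's method
lo = math.exp(math.log(or_crude) - 1.96 * se_log_or)
hi = math.exp(math.log(or_crude) + 1.96 * se_log_or)
print(f"crude OR = {or_crude:.1f}, 95% CI {lo:.1f} to {hi:.1f}")
```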

