P6428 The waist-to-body mass index ratio is a better predictor for cardiovascular outcome in patients with established atherosclerotic cardiovascular disease - no U-shaped phenomenon

2019 ◽  
Vol 40 (Supplement_1) ◽  
Author(s):  
C F Hsuan ◽  
F J Lin ◽  
W K Tseng ◽  
Y W Wu ◽  
W H Yin ◽  
...  

Abstract Background Many studies have observed an "obesity paradox" in patients with established atherosclerotic cardiovascular disease (ASCVD), in which the body mass index (BMI)-mortality curve is U-shaped. Purpose To search for a better anthropometric parameter for predicting cardiovascular events in patients with ASCVD. Methods The study was conducted using the Taiwanese Secondary Prevention for patients with AtheRosCLErotic disease (T-SPARCLE) Registry. Adult patients with stable ASCVD were enrolled. The primary composite endpoint was the time to first major cardiovascular event, defined as cardiovascular death, nonfatal myocardial infarction or stroke, or cardiac arrest with resuscitation. The dose-response association between primary outcome events and the traditional anthropometric parameters, as well as a new parameter, the waist-to-BMI ratio, was examined using the Cox proportional hazards regression model. We used restricted cubic spline regression to investigate potential nonlinear relationships between each anthropometric measure and primary outcome events. Results A total of 6921 patients with ASCVD were included in this analysis and followed up for a median of 2.5 years. Multivariable Cox proportional hazards regression showed a significant positive association between the waist-to-BMI ratio and the primary outcome events (adjusted hazard ratio 1.67, 95% CI 1.12–2.49, p=0.01). Other traditional anthropometric parameters, such as BMI, weight, waist circumference and waist-hip ratio, did not show significant associations (p=0.10, 0.31, 0.90, and 0.52, respectively). In the restricted cubic spline regression, the positive dose-response association between the primary outcome and the waist-to-BMI ratio persisted across the full range of waist-to-BMI ratios and was nonlinear (likelihood ratio test for nonlinearity, p<0.001), with a much steeper increase in major cardiovascular events for waist-to-BMI ratios >3.6 cm·m²/kg. [Figure: dose-response curve of the waist-to-BMI ratio] Conclusion This study found the waist-to-BMI ratio to be a better predictor of major adverse cardiovascular events in patients with established ASCVD than the other traditional anthropometric parameters.
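A minimal R sketch of the modelling approach this abstract describes: a Cox proportional hazards model for the waist-to-BMI ratio plus a restricted cubic spline to probe nonlinearity. The data frame `d` and its column names (time, event, waist_cm, bmi, age, sex) are hypothetical placeholders, not T-SPARCLE variables.

```r
# Cox PH with a restricted cubic spline for the waist-to-BMI ratio.
library(survival)
library(rms)

d$waist_bmi <- d$waist_cm / d$bmi            # waist (cm) / BMI (kg/m^2) = cm.m^2/kg

dd <- datadist(d); options(datadist = "dd")  # required by rms for Predict()

# Linear term: adjusted hazard ratio per unit increase in waist-to-BMI ratio
fit_lin <- cph(Surv(time, event) ~ waist_bmi + age + sex, data = d)
exp(coef(fit_lin)["waist_bmi"])

# 4-knot restricted cubic spline; anova() reports Wald tests for the
# nonlinear spline terms (the abstract's likelihood ratio test would
# instead compare the nested linear and spline models)
fit_rcs <- cph(Surv(time, event) ~ rcs(waist_bmi, 4) + age + sex, data = d)
anova(fit_rcs)

# Dose-response curve on the hazard ratio scale
plot(Predict(fit_rcs, waist_bmi, fun = exp))
```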

Cardiology ◽  
2018 ◽  
Vol 139 (4) ◽  
pp. 212-218 ◽  
Author(s):  
Yun Shen ◽  
Xueli Zhang ◽  
Yiting Xu ◽  
Qin Xiong ◽  
Zhigang Lu ◽  
...  

Objectives: To investigate whether serum fibroblast growth factor 21 (FGF21) levels can be used to predict the future development of major adverse cardiovascular events (MACEs). Methods: This study included 253 patients who received subsequent follow-up, and complete data were collected for 234 patients. Independent predictors of MACEs were identified using Cox proportional-hazards regression analysis. The prognostic value of FGF21 levels for MACEs was evaluated by Kaplan-Meier survival analysis. Results: Of the 229 patients finally enrolled in the analysis, 27 of 60 patients without coronary artery disease (CAD) at baseline experienced a MACE, and 132 of 169 patients with CAD at baseline experienced a MACE. Among patients with CAD at baseline, serum FGF21 levels were significantly higher in patients with MACEs than in patients without MACEs (p < 0.05). Kaplan-Meier survival analysis showed that patients with higher serum FGF21 levels had significantly lower event-free survival (p = 0.001) than those with lower levels. A further Cox proportional-hazards regression analysis including the traditional risk factors for cardiovascular disease showed that serum FGF21 was an independent predictor of MACE occurrence. Conclusions: In patients with CAD at baseline, an elevated serum FGF21 level was associated with the future development of a MACE.
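A brief R sketch of the Kaplan-Meier and Cox steps this abstract describes. The cut point (the sample median) and the column names (fgf21, time, mace, and the adjustment covariates) are illustrative assumptions; the paper's actual threshold is not given here.

```r
# Kaplan-Meier comparison by dichotomized FGF21 plus a multivariable Cox model.
library(survival)

d$fgf21_high <- d$fgf21 > median(d$fgf21)               # assumed cut point

km <- survfit(Surv(time, mace) ~ fgf21_high, data = d)  # event-free survival curves
survdiff(Surv(time, mace) ~ fgf21_high, data = d)       # log-rank test

# FGF21 as an independent predictor after adjusting for traditional risk factors
cox <- coxph(Surv(time, mace) ~ fgf21 + age + sex + smoking + diabetes, data = d)
summary(cox)
```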


2020 ◽  
Vol 17 (1) ◽  
Author(s):  
Zhengbao Zhu ◽  
Daoxia Guo ◽  
Chongke Zhong ◽  
Aili Wang ◽  
Tan Xu ◽  
...  

Abstract Background Dickkopf-3 (Dkk-3) is implicated in the progression of atherosclerosis. This study aimed to investigate the association between serum Dkk-3 and the prognosis of ischemic stroke. Methods We measured serum Dkk-3 levels in 3344 ischemic stroke patients from CATIS (China Antihypertensive Trial in Acute Ischemic Stroke). The primary outcome was a combination of death and vascular events within 3 months after ischemic stroke. Results During 3 months of follow-up, the cumulative incidence rates of the primary outcome among ischemic stroke patients in the five quintiles of serum Dkk-3 (from low to high) were 4.49%, 3.74%, 2.54%, 5.23%, and 6.73%, respectively (log-rank p = 0.004). Multivariable Cox proportional hazards regression analyses showed that, compared with the third quintile of serum Dkk-3, the adjusted hazard ratios (95% confidence intervals) associated with the first and fifth quintiles were 3.49 (1.46–8.34) and 4.23 (1.86–9.64) for the primary outcome, 3.47 (1.06–11.36) and 5.30 (1.81–15.51) for death, and 2.66 (1.01–7.01) and 3.35 (1.33–8.40) for vascular events, respectively. A multivariable-adjusted Cox proportional hazards regression model with restricted cubic splines showed a U-shaped association between serum Dkk-3 and the risk of the primary outcome (p for nonlinearity = 0.030). Moreover, adding serum Dkk-3 to conventional risk factors improved the predictive power for the primary outcome (net reclassification improvement 28.44%, p < 0.001; integrated discrimination improvement 0.48%, p = 0.001). Conclusions Both low and high serum Dkk-3 levels are associated with increased risks of death and vascular events within 3 months after ischemic stroke, suggesting that serum Dkk-3 may play a distinctive role in the prognosis of ischemic stroke and might serve as a prognostic biomarker. Further studies are needed to replicate these findings and to determine the optimal range of serum Dkk-3.
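A short R sketch of the quintile analysis with the third quintile as the reference, as reported above. The data frame and covariate names (dkk3, time, outcome, age, nihss, sbp) are hypothetical stand-ins for the CATIS variables.

```r
# Quintile-based Cox regression with Q3 as the reference category.
library(survival)

d$dkk3_q <- cut(d$dkk3,
                breaks = quantile(d$dkk3, probs = seq(0, 1, 0.2)),
                include.lowest = TRUE, labels = paste0("Q", 1:5))
d$dkk3_q <- relevel(d$dkk3_q, ref = "Q3")

fit <- coxph(Surv(time, outcome) ~ dkk3_q + age + nihss + sbp, data = d)
exp(cbind(HR = coef(fit), confint(fit)))   # adjusted HRs with 95% CIs per quintile
```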


2021 ◽  
Vol 36 (Supplement_1) ◽  
Author(s):  
Yusaku Hashimoto ◽  
Takahiro Imaizumi ◽  
Akihiro Hori ◽  
Sawako Kato ◽  
Yoshinari Yasuda ◽  
...  

Abstract Background and Aims Drinking habits are among the most important modifiable lifestyle factors for preventing the development of chronic kidney disease (CKD). Previous studies showed that alcohol consumption was inversely associated with the risk of developing CKD, but the dose-response relationship between alcohol consumption and the development of CKD remains controversial. In the present study, we aimed to examine whether the amount of alcohol consumed on one occasion is associated with new-onset CKD in the general population. Method Study subjects were 11,162 Japanese aged 45 to 74 years, with an estimated glomerular filtration rate (eGFR) ≥60 mL/min/1.73 m², no proteinuria, and no history of cardiovascular disease, COPD, or liver disease. Drinking status was obtained by self-administered questionnaires. We categorized the study subjects into four groups based on the amount of alcohol consumed per occasion: <20 g/time of ethanol equivalent (lowest); 20-40 g/time (low intermediate); 40-60 g/time (high intermediate); >60 g/time (highest). Non-drinkers served as the reference category. The primary outcome was the incidence of CKD, defined as a 25% reduction in eGFR to less than 60 mL/min/1.73 m² and/or a dipstick urinalysis score of 1+ or greater (equivalent to ≥30 mg/dL) during the follow-up period. We employed Cox proportional hazards regression models to examine the dose-response relationship between baseline alcohol consumption and the risk of CKD. Trend tests were performed using Cox proportional hazards regression models that treated alcohol consumption as a continuous linear term. Results The lowest and low-intermediate groups were significantly associated with a decreased risk of CKD compared with non-drinkers (hazard ratio [HR] 0.84; 95% confidence interval [CI] 0.71–0.99 and HR 0.79; 95% CI 0.66–0.96, respectively). The high-intermediate group was associated with a decreased risk of CKD (HR 0.92; 95% CI 0.70–1.21) and the highest group with an increased risk of CKD (HR 1.28; 95% CI 0.84–1.95), but these associations did not reach statistical significance. There was no dose-response relationship between baseline alcohol consumption and risk of CKD (P-trend = 0.30). Conclusion A J-shaped association was observed between self-reported alcohol intake and the incidence of CKD. Moderate alcohol consumption on one occasion may help reduce the risk of CKD.
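A compact R sketch of the category-wise hazard ratios and the linear trend test described in the Methods; the exposure bounds follow the abstract, while the data frame and column names are illustrative.

```r
# Drinking categories versus non-drinkers, plus a P-for-trend model.
library(survival)

d$alc_cat <- cut(d$ethanol_g_per_time,
                 breaks = c(-Inf, 0, 20, 40, 60, Inf),
                 labels = c("non-drinker", "lowest", "low-intermediate",
                            "high-intermediate", "highest"))
d$alc_cat <- relevel(d$alc_cat, ref = "non-drinker")

# Category-wise hazard ratios versus the non-drinker reference
fit_cat <- coxph(Surv(time, ckd) ~ alc_cat + age + sex + bmi, data = d)
summary(fit_cat)

# Trend test: the same exposure entered as a continuous linear term
fit_trend <- coxph(Surv(time, ckd) ~ ethanol_g_per_time + age + sex + bmi, data = d)
summary(fit_trend)
```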


Blood ◽  
2010 ◽  
Vol 116 (21) ◽  
pp. 4302-4302 ◽  
Author(s):  
Leo R. Zacharski ◽  
Bruce Chow ◽  
Galina Shamayeva ◽  
Philip Lavori

Abstract 4302 Background: The hypothesis that increasing iron burden (assessed by the serum ferritin level) with age contributes to the pathogenesis of cardiovascular disease (CVD) and other diseases of aging was tested in a 24-hospital, 6-year prospective randomized single-blind clinical trial of iron reduction by calibrated phlebotomy (JAMA 2007;297:603-9; JNCI 2008;100:996-1002). The primary outcome was all-cause mortality and the secondary outcome death plus non-fatal MI and stroke (total n = 1,277 patients with PAD). Mean follow-up ferritin levels were significantly reduced in iron reduction patients (p<0.001). Although iron reduction had no significant effect on CVD outcomes overall, an association was observed for the secondary endpoint by age quartile (p for interaction = 0.004). The Cox proportional hazards regression model showed improved primary (p=0.02) and secondary (p<0.001) endpoints in patients in the youngest age quartile (age 43 to 61) randomized to iron reduction versus control. Analyzing age as a continuous variable in the Cox proportional hazards regression model, together with log relative hazard plots, revealed that age interacted nonlinearly with iron reduction for both the primary (p=0.04) and secondary (p<0.001) endpoints. HYPOTHESIS: Iron reduction has beneficial effects on overall CVD outcomes that can be masked by interactions between age and ferritin level. METHODS: Computer randomization to iron reduction (n = 636) versus control (n = 641) groups was stratified by age, ferritin level and other prognostic variables at entry. Data tracked prospectively (including data on compliance with the intervention) were subjected to a pre-planned, intent-to-treat analysis. RESULTS: Mean follow-up ferritin levels declined with increasing age at entry in control patients. Older age (p = 0.027) and higher ferritin levels (p<0.001) at entry predicted poorer compliance with phlebotomy in iron reduction patients. The separation of mean follow-up ferritin levels between groups diminished with increasing age at entry. Plots of mean follow-up ferritin levels versus the log relative hazard for the primary and secondary endpoints in iron reduction patients showed significantly improved outcomes with lower mean follow-up ferritin levels (p = 0.028 and 0.044, respectively). Improvement in the primary outcome with lower ferritin levels was also found on analysis of the entire cohort (p = 0.037). Kaplan-Meier analysis of mean follow-up ferritin levels for the entire cohort, comparing patients with ferritin levels above versus below the mean, showed improved primary and secondary outcomes with lower ferritin levels (p = 0.003 and 0.067, respectively). CONCLUSIONS: Lower body iron burden predicted improved clinical outcomes in patients with cardiovascular disease regardless of age or randomization status. Two factors seemed to account for the lack of an effect of the intervention in the overall cohort. First, serum ferritin levels decreased with increasing age in control patients, apparently because patients with higher iron stores were more likely to die earlier. Second, younger iron reduction patients were more likely to comply with the phlebotomy intervention. Consequently, iron reduction therapy had less of a potential effect in older compared with younger patients because mean follow-up ferritin levels in the two groups converged with increasing age.
An implication for future studies, and possibly for practice, is that adequate iron reduction targeted to patients with higher ferritin levels is more likely to be effective. These findings suggest cost-effective strategies for improving outcomes in diseases of aging. Disclosures: No relevant conflicts of interest to declare.
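A small R sketch of the age-interaction analyses described above: a treatment-by-age interaction term in a Cox model and a within-quartile subgroup fit. The data frame `d` and its columns (time, event, treat, age) are assumed, not the trial's actual dataset.

```r
# Treatment-by-age interaction and a youngest-quartile subgroup analysis.
library(survival)

# Does the effect of iron reduction vary with age? The treat:age
# coefficient tests the (linear) interaction.
fit_int <- coxph(Surv(time, event) ~ treat * age, data = d)
summary(fit_int)

# Effect of randomization within the youngest age quartile
q1 <- subset(d, age <= quantile(d$age, 0.25))
fit_q1 <- coxph(Surv(time, event) ~ treat, data = q1)
exp(coef(fit_q1))   # hazard ratio, iron reduction versus control
```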


2021 ◽  
pp. 1-21
Author(s):  
Anne Mette L. Würtz ◽  
Mette D. Hansen ◽  
Anne Tjønneland ◽  
Eric B. Rimm ◽  
Erik B. Schmidt ◽  
...  

ABSTRACT Intake of vegetables is recommended for the prevention of myocardial infarction (MI). However, vegetables make up a heterogeneous group, and subgroups of vegetables may be differentially associated with MI. The aim of this study was to examine the replacement of potatoes with other vegetables, or with subgroups of other vegetables, in relation to the risk of MI. Substitutions between subgroups of other vegetables and the risk of MI were also investigated. We followed 29,142 women and 26,029 men aged 50-64 years in the Danish Diet, Cancer and Health cohort. Diet was assessed at baseline using a detailed validated FFQ. Hazard ratios (HRs) with 95% CIs for the incidence of MI were calculated using Cox proportional hazards regression. During 13.6 years of follow-up, 656 female and 1,694 male cases were identified. Among women, the adjusted HR for MI was 1.02 (95% CI: 0.93, 1.13) per 500 g/week replacement of potatoes with other vegetables. For vegetable subgroups, the HR was 0.93 (95% CI: 0.77, 1.13) for replacement of potatoes with fruiting vegetables and 0.91 (95% CI: 0.77, 1.07) for replacement of potatoes with other root vegetables. A higher intake of cabbage replacing other vegetable subgroups was associated with a statistically non-significant higher risk of MI. A similar pattern of associations was found when intake was expressed in kcal/week. Among men, the pattern of associations was overall similar to that for women. This study supports food-based dietary guidelines recommending consumption of a variety of vegetables from all subgroups.
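A sketch of one common way to code such food substitutions in a Cox model: the leave-one-out specification, in which total intake and every component except the food being replaced are entered together, so the remaining component's coefficient estimates the substitution effect at constant total intake. This specification and all column names are assumptions for illustration, not necessarily the cohort's exact model.

```r
# Leave-one-out substitution: HR per 500 g/week of other vegetables
# replacing potatoes, holding total (potatoes + other vegetables) constant.
library(survival)

d$total_veg_500 <- (d$potatoes_g_wk + d$other_veg_g_wk) / 500
d$other_veg_500 <- d$other_veg_g_wk / 500      # the potatoes term is left out

fit <- coxph(Surv(time, mi) ~ other_veg_500 + total_veg_500 +
               age + smoking + energy_kcal, data = d)
exp(coef(fit)["other_veg_500"])   # HR per 500 g/week substitution
```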


2021 ◽  
Author(s):  
Sanhe Liu ◽  
Yongzhi Li ◽  
Diansheng Cui ◽  
Yuexia Jiao ◽  
Liqun Duan ◽  
...  

Abstract Background Different recurrence probabilities of non-muscle-invasive bladder cancer (NMIBC) call for different adjuvant treatments and follow-up strategies. However, no simple, intuitive, and generally accepted clinical model for predicting recurrence is available for NMIBC. This study aims to construct a predictive model for the recurrence of NMIBC based on demographics and clinicopathologic characteristics from two independent centers. Methods Demographics and clinicopathologic characteristics of 511 patients with NMIBC were retrospectively collected. Recurrence-free survival (RFS) was estimated using the Kaplan-Meier method and log-rank tests. Univariate Cox proportional hazards regression analysis was used to screen variables associated with RFS, and a multivariate Cox proportional hazards regression model with a stepwise procedure was used to identify the significant factors. A final nomogram model was built using the multivariable Cox method. The performance of the nomogram model was evaluated with respect to its calibration, discrimination, and clinical usefulness. Internal validation was assessed with bootstrap resampling. X-tile software was used for risk stratification based on the nomogram model. Results Independent prognostic factors, including tumor stage, recurrence status, and European Association of Urology (EAU) risk stratification group, were included in the nomogram model. The model showed acceptable calibration and discrimination (area under the receiver operating characteristic [ROC] curve, 0.85; concordance index [C-index], 0.79 [95% CI: 0.76 to 0.82]), which was superior to the EAU risk stratification group alone. Decision curve analysis also demonstrated good clinical usefulness. Moreover, the whole population could be stratified into three distinct risk groups by the nomogram model. Conclusions We established and validated a novel nomogram model that provides individual predictions of RFS for patients with NMIBC. This intuitive prognostic nomogram model may help clinicians in postoperative treatment and follow-up decision-making.
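A condensed R sketch of the nomogram workflow this abstract outlines, using the rms package: a multivariable Cox model, a nomogram for predicted RFS, a bootstrap-validated concordance index, and a calibration curve. The predictor and column names, and the 36-month horizon, are placeholders.

```r
# Cox model -> nomogram -> bootstrap validation -> calibration (rms package).
library(rms)

dd <- datadist(d); options(datadist = "dd")

fit <- cph(Surv(rfs_months, recurrence) ~ stage + recurrent_status + eau_risk,
           data = d, x = TRUE, y = TRUE, surv = TRUE, time.inc = 36)

# Nomogram mapping predictor values to predicted 36-month RFS
surv_fun <- Survival(fit)
nom <- nomogram(fit, fun = function(lp) surv_fun(36, lp),
                funlabel = "36-month RFS probability")
plot(nom)

validate(fit, method = "boot", B = 1000)  # optimism-corrected Dxy; C = Dxy/2 + 0.5
cal <- calibrate(fit, u = 36, B = 1000)   # predicted vs observed RFS at 36 months
plot(cal)
```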


Gut ◽  
2018 ◽  
Vol 68 (1) ◽  
pp. 62-69 ◽  
Author(s):  
Christopher M Stark ◽  
Apryl Susi ◽  
Jill Emerick ◽  
Cade M Nylund

Objective Gut microbiota alterations are associated with obesity. Early exposure to medications, including acid suppressants and antibiotics, can alter gut biota and may increase the likelihood of developing obesity. We investigated the association of antibiotic, histamine-2 receptor antagonist (H2RA) and proton pump inhibitor (PPI) prescriptions during early childhood with a diagnosis of obesity. Design We performed a cohort study of US Department of Defense TRICARE beneficiaries born from October 2006 to September 2013. Exposures were defined as having any dispensed prescription for antibiotic, H2RA or PPI medications in the first 2 years of life. A single-event analysis of obesity was performed using Cox proportional hazards regression. Results 333,353 children met inclusion criteria, with 241,502 (72.4%) children prescribed an antibiotic, 39,488 (11.8%) an H2RA and 11,089 (3.3%) a PPI. Antibiotic prescriptions were associated with obesity (HR 1.26; 95% CI 1.23 to 1.28). This association persisted regardless of antibiotic class and strengthened with each additional class of antibiotic prescribed. H2RA and PPI prescriptions were also associated with obesity, with a stronger association for each 30-day supply prescribed. The HR increased commensurately with exposure to each additional medication group prescribed. Conclusions Antibiotics, acid suppressants and the combination of multiple medications in the first 2 years of life are associated with a diagnosis of childhood obesity. Microbiota-altering medications administered in early childhood may influence weight gain.
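A minimal R sketch of the exposure coding and the single-event Cox analysis described in the Design section; the binary exposure flags, the class count used for the dose-response pattern, and all column names are illustrative.

```r
# Binary medication exposures plus a dose-response term for antibiotic classes.
library(survival)

d$any_abx <- d$n_abx_classes > 0   # any antibiotic in the first 2 years of life

fit_bin <- coxph(Surv(age_months, obesity) ~ any_abx + h2ra + ppi, data = d)
exp(coef(fit_bin))                 # HR for each medication group

# Dose-response: HR per additional antibiotic class prescribed
fit_dose <- coxph(Surv(age_months, obesity) ~ n_abx_classes + h2ra + ppi, data = d)
summary(fit_dose)
```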


2021 ◽  
Vol 8 ◽  
Author(s):  
Shenglan Huang ◽  
Dan Li ◽  
LingLing Zhuang ◽  
Liying Sun ◽  
Jianbing Wu

The actin-related protein 2/3 complex (Arp2/3) is a major actin nucleator that has been widely reported to promote the migration and invasion of various cancers. However, the expression patterns and prognostic values of Arp2/3 subunits in hepatocellular carcinoma (HCC) remain unclear. In this study, The Cancer Genome Atlas (TCGA) and UCSC Xena databases were used to obtain mRNA expression data and the corresponding clinical information, respectively. Differential expression of Arp2/3 subunits in HCC was analyzed using the "limma" package in R 4.0.4. The prognostic value of each subunit was evaluated using Kaplan-Meier survival analysis and Cox proportional hazards regression analyses. The results revealed that mRNA expression of Arp2/3 members (ACTR2, ACTR3, ARPC1A, ARPC1B, ARPC2, ARPC3, ARPC4, ARPC5, and ARPC5L) was upregulated in HCC. Higher expression of Arp2/3 members was significantly correlated with worse overall survival (OS) and shorter progression-free survival (PFS) in HCC patients. Cox proportional hazards regression analyses demonstrated that ACTR3, ARPC2, and ARPC5 were independent prognostic biomarkers of survival in patients with HCC. The relation between tumor immunocyte infiltration and the prognostic subunits was determined using the TIMER 2.0 platform and the GEPIA database. Gene set enrichment analysis (GSEA) was performed to explore the potential mechanisms of the prognostic subunits in the carcinogenesis of HCC. The results revealed that ACTR3, ARPC2, and ARPC5 were significantly positively correlated with the infiltration of immune cells in HCC. The GSEA results indicated that ACTR3, ARPC2, and ARPC5 are involved in multiple cancer-related pathways that promote the development of HCC. In brief, these analyses indicated that Arp2/3 complex subunits were significantly upregulated and predicted worse survival in HCC, and that ACTR3, ARPC2, and ARPC5 could be used as independent predictors of survival and might serve as promising molecular targets for the diagnosis and therapy of HCC.
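A brief R sketch of the limma tumour-versus-normal comparison mentioned above. The expression matrix `expr` (genes x samples, log2 scale, with gene symbols as row names) and the `group` factor are placeholder inputs standing in for the TCGA data.

```r
# Differential expression of Arp2/3 subunits with limma.
library(limma)

design <- model.matrix(~ 0 + group)   # group: factor with levels "normal", "tumor"
colnames(design) <- c("normal", "tumor")

fit <- lmFit(expr, design)
fit <- contrasts.fit(fit, makeContrasts(tumor - normal, levels = design))
fit <- eBayes(fit)

arp23 <- c("ACTR2", "ACTR3", "ARPC1A", "ARPC1B", "ARPC2",
           "ARPC3", "ARPC4", "ARPC5", "ARPC5L")
topTable(fit, number = Inf)[arp23, c("logFC", "adj.P.Val")]
```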


2021 ◽  
Vol 4 (4) ◽  
pp. 401-408
Author(s):  
M. C. Musa ◽  
O. E. Asiribo ◽  
H. G. Dikko ◽  
M. Usman ◽  
S. S. Sani

Under-five childhood mortality rates in Nigeria remain high, despite efforts of government at all levels to combat the menace. This study examined factors that significantly affect under-five child mortality. A sample of mothers with children under the age of five from the Nigeria Demographic and Health Survey (NDHS, 2013 and 2018) was used to assess the effect of selected predictor variables (covariates) on childhood survival. The Cox proportional hazards model is a regression model widely used to investigate the association between survival time and one or more predictor variables. The results from the final fitted Cox proportional hazards regression model showed that contraceptive use by the mother, state of residence, birth weight of the child, and type of toilet facility used by the household were significantly associated with under-five survival in the North Central Region of Nigeria. All calculations were performed using R.
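A minimal R sketch of the model described above, with the four significant covariates entered as factors and the standard proportional-hazards diagnostic; the column names are illustrative rather than the actual NDHS variable codes.

```r
# Cox PH model for under-five survival, plus a proportional-hazards check.
library(survival)

fit <- coxph(Surv(age_months, died) ~ contraceptive_use + state +
               birth_weight + toilet_type, data = d)
summary(fit)    # hazard ratios for each covariate level

cox.zph(fit)    # Schoenfeld-residual test of the PH assumption
```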

