Cumulative Consumption of Sulfur Amino Acids Intake and Incidence of Diabetes

2021 ◽  
Vol 5 (Supplement_2) ◽  
pp. 1110-1110
Author(s):  
Dong Zhen ◽  
John Richie Jr ◽  
Xiang Gao ◽  
Biyi Shen ◽  
David Orentreich

Abstract Objectives Increasing evidence from animal models and humans suggests that diets high in sulfur-containing amino acids (SAA) may be associated with a greater risk of type 2 diabetes (T2D). However, data from longitudinal human studies linking dietary SAA intake with T2D are lacking. The present study aimed to examine the association between long-term dietary intake of SAAs, including total SAAs, methionine, and cysteine, and incident T2D in participants of the Framingham Heart Study (FHS). Methods Adult participants were selected from two prospective FHS cohorts: the Offspring Cohort (followed from 1991 to 2015, n = 3799) and the Third Generation Cohort (followed from 2002 to 2011, n = 4096). Individuals with diabetes before baseline, missing diet or covariate data, or extreme reported daily energy intake were excluded. Energy-adjusted intake of dietary SAAs was calculated from responses to a 131-item food frequency questionnaire. Cox proportional hazards models were used to evaluate associations between intakes of SAAs (in quintiles) and risk of T2D in each cohort. A combined analysis was also performed pooling subjects from both cohorts. Results Overall, we documented 471 T2D events during 9–23 years of follow-up. In both cohorts, higher SAA intake was associated with a higher risk of T2D after adjustment for demographics, traditional risk factors, and related nutrients. Comparing participants in the highest quintile of intake with those in the lowest, adjusted hazard ratios (95% CI) for total intake were 1.98 (1.15–3.41) in the Offspring Cohort (P for trend = 0.04) and 4.37 (1.40–13.67) in the Third Generation Cohort (P for trend = 0.01). In the combined analysis of the two cohorts, adjusted hazard ratios (95% CI) were 1.98 (1.23–3.21) for total intake, 2.21 (1.38–3.53) for methionine, and 1.79 (1.12–2.87) for cysteine (P for trend < 0.03 for all). Conclusions Higher long-term SAA intake was associated with a higher risk of T2D in humans, suggesting that dietary patterns emphasizing low SAA intake are protective against the development of T2D. Funding Sources No funding.
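As an illustration of the analytic approach described above (energy-adjusted SAA intake split into quintiles, a Cox proportional hazards model per cohort, and a pooled analysis), a minimal sketch in Python with lifelines follows. It is not the authors' code; the file name and column names (saa_kcal_adj, followup_years, incident_t2d, cohort) are hypothetical, and the covariate adjustment is omitted for brevity.

```python
import pandas as pd
from lifelines import CoxPHFitter

fhs = pd.read_csv("fhs_ffq_followup.csv")  # hypothetical analysis file, one row per participant

def fit_saa_cox(df: pd.DataFrame) -> CoxPHFitter:
    """Cox PH model of incident T2D on quintiles of energy-adjusted total SAA intake."""
    df = df.copy()
    # Quintile 1 = lowest intake, quintile 5 = highest; modeled here as an ordinal trend
    # term (indicator variables would give the Q5-vs-Q1 hazard ratio instead)
    df["saa_quintile"] = pd.qcut(df["saa_kcal_adj"], 5, labels=False) + 1
    cph = CoxPHFitter()
    # Demographic, risk-factor, and nutrient covariates would be added to this column list
    cph.fit(df[["followup_years", "incident_t2d", "saa_quintile"]],
            duration_col="followup_years", event_col="incident_t2d")
    return cph

offspring = fit_saa_cox(fhs[fhs["cohort"] == "offspring"])
third_gen = fit_saa_cox(fhs[fhs["cohort"] == "third_generation"])
pooled = fit_saa_cox(fhs)          # combined analysis pooling both cohorts
pooled.print_summary()
```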

2020 ◽  
Author(s):  
Yingting Zuo ◽  
Anxin Wang ◽  
Shuohua Chen ◽  
Xue Tian ◽  
Shouling Wu ◽  
...  

Abstract Background The relationship between estimated glomerular filtration rate (eGFR) trajectories and myocardial infarction (MI) remains unclear in people with diabetes or prediabetes. We aimed to identify common eGFR trajectories in people with diabetes or prediabetes and to examine their association with MI risk. Methods The data for this analysis were derived from the Kailuan study, a prospective community-based cohort study. The eGFR trajectories of 24,723 participants from 2006 to 2012 were generated by latent mixture modeling. Incident cases of MI occurring from 2012 to 2017 were confirmed by review of medical records. Cox proportional hazards models were used to calculate hazard ratios (HRs) and their 95% confidence intervals (CIs) for the subsequent risk of MI across the eGFR trajectories. Results We identified 5 distinct eGFR trajectories and named them low-stable (9.4%), moderate-stable (31.4%), moderate-increasing (29.5%), high-decreasing (13.9%), and high-stable (15.8%) according to their range and pattern. During a mean follow-up of 4.61 years, there were 235 incident MI events. Although the high-decreasing group had eGFR levels similar to those of the moderate-stable group at the last exposure period, its risk was much higher (adjusted HR, 3.43; 95% CI, 1.56–7.54 versus adjusted HR, 2.82; 95% CI, 1.34–5.95). Notably, the moderate-increasing group, although it had reached the normal range, still had a significantly increased risk (adjusted HR, 2.55; 95% CI, 1.21–5.39). Conclusions eGFR trajectories were associated with MI risk in people with diabetes or prediabetes. Emphasis should be placed on early and long-term detection and control of eGFR decreases to further reduce MI risk.
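A rough sketch of the two-stage design (trajectory grouping over the 2006–2012 exposure period, then MI risk during 2012–2017 by group) is given below. The study used latent mixture modeling; scikit-learn's GaussianMixture on the repeated eGFR measurements is substituted here as a simple stand-in, and the file and column names are hypothetical.

```python
import pandas as pd
from sklearn.mixture import GaussianMixture
from lifelines import CoxPHFitter

kailuan = pd.read_csv("kailuan_egfr_wide.csv")   # hypothetical: one row per participant
egfr_cols = ["egfr_2006", "egfr_2008", "egfr_2010", "egfr_2012"]

# Stage 1: cluster eGFR trajectories into 5 groups (stand-in for latent mixture modeling)
gmm = GaussianMixture(n_components=5, random_state=0).fit(kailuan[egfr_cols])
kailuan["trajectory"] = gmm.predict(kailuan[egfr_cols])

# Stage 2: Cox PH model for incident MI (2012-2017) with trajectory-group indicators
design = pd.get_dummies(kailuan["trajectory"], prefix="traj", drop_first=True).astype(float)
design[["followup_years", "incident_mi"]] = kailuan[["followup_years", "incident_mi"]]
cph = CoxPHFitter().fit(design, duration_col="followup_years", event_col="incident_mi")
cph.print_summary()   # HRs relative to the reference trajectory group
```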


Cancers ◽  
2021 ◽  
Vol 13 (5) ◽  
pp. 1177
Author(s):  
In Young Choi ◽  
Sohyun Chun ◽  
Dong Wook Shin ◽  
Kyungdo Han ◽  
Keun Hye Jeon ◽  
...  

Objective: To our knowledge, no studies have yet examined how the risk of developing breast cancer (BC) varies with changes in metabolic syndrome (MetS) status. This study aimed to investigate the association between changes in MetS status and subsequent BC occurrence. Research Design and Methods: We enrolled 930,055 postmenopausal women aged 40–74 years who participated in a biennial National Health Screening Program in 2009–2010 and 2011–2012. Participants were categorized into four groups according to the change in MetS status between the two screenings: sustained non-MetS, transition to MetS, transition to non-MetS, and sustained MetS. We calculated multivariable-adjusted hazard ratios (aHRs) and 95% confidence intervals (CIs) for BC incidence using Cox proportional hazards models. Results: At baseline, MetS was associated with a significantly increased risk of BC (aHR 1.11, 95% CI 1.06–1.17), as were all of its components, and the risk of BC increased with the number of components present (aHR 1.46, 95% CI 1.26–1.61 for women with all five components). Compared with the sustained non-MetS group, the aHR (95% CI) for BC was 1.11 (1.04–1.19) in the transition to MetS group, 1.05 (0.96–1.14) in the transition to non-MetS group, and 1.18 (1.12–1.25) in the sustained MetS group. Conclusions: Significantly increased BC risk was observed in the sustained MetS and transition to MetS groups. These findings are clinically meaningful in that efforts to recover from MetS may lead to a reduced risk of BC.
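The four-group exposure definition and Cox model can be sketched as follows; this is an illustrative outline only, with hypothetical column names (mets_first, mets_second, incident_bc) and the sustained non-MetS group set as the reference.

```python
import pandas as pd
from lifelines import CoxPHFitter

screen = pd.read_csv("nhsp_two_screenings.csv")   # hypothetical: MetS status at both screenings

def mets_change(row) -> str:
    if not row["mets_first"]:
        return "transition_to_mets" if row["mets_second"] else "sustained_non_mets"
    return "sustained_mets" if row["mets_second"] else "transition_to_non_mets"

groups = ["sustained_non_mets", "transition_to_mets", "transition_to_non_mets", "sustained_mets"]
screen["change_group"] = pd.Categorical(screen.apply(mets_change, axis=1), categories=groups)

# Indicator variables with sustained non-MetS (the first category) dropped as the reference
design = pd.get_dummies(screen["change_group"], drop_first=True).astype(float)
design[["followup_years", "incident_bc"]] = screen[["followup_years", "incident_bc"]]

# Multivariable adjustment (age, parity, hormone use, ...) would add columns to the design
CoxPHFitter().fit(design, duration_col="followup_years",
                  event_col="incident_bc").print_summary()
```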


2020 ◽  
Vol 41 (Supplement_2) ◽  
Author(s):  
S. Kochav ◽  
R.C. Chen ◽  
J.M.D. Dizon ◽  
J.A.R. Reiffel

Abstract Background Theoretical concern exists regarding AV block (AVB) with class I antiarrhythmic drugs (AADs) when bundle branch block (BBB) is present. Whether this concern is substantiated in real-world populations is unknown. Purpose To determine the relationship between type of AAD and incidence of AVB in patients with preexisting BBB. Methods We retrospectively studied all patients with BBB who received class I or class III AADs between 1997 and 2019 to compare the incidence of AVB. We defined the index time as the first exposure to either drug class and excluded patients with prior AVB or exposure to both classes. The time-at-risk window ended at the first outcome occurrence or when patients were no longer observed in the database. We estimated hazard ratios for incident AVB using Cox proportional hazards models with propensity score stratification, adjusting for over 32,000 covariates from the electronic health record. Kaplan-Meier methods were used to determine treatment effects over time. Results Of 40,120 individuals with BBB, 148 were exposed to a class I AAD and 2401 to a class III AAD. Over nearly 4,200 person-years of follow-up, there were 22 and 620 outcome events in the class I and class III cohorts, respectively. In adjusted analyses, AVB risk was markedly lower in patients exposed to class I AADs compared with class III (HR 0.48 [95% CI 0.30–0.75]). Conclusion Among patients with BBB, exposure to class III AADs was strongly associated with a greater risk of incident AVB. This likely reflects differences in the natural history of patients receiving class I versus class III AADs rather than adverse class III effects; however, the lack of worse outcomes acutely with class I AADs suggests that they may be safer in BBB than suspected. Funding Acknowledgement Type of funding source: None
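A condensed sketch of a propensity-score-stratified Cox analysis of this kind is shown below. The actual study adjusted for more than 32,000 electronic-health-record covariates; the handful of covariates, the file name, and the column names here are hypothetical placeholders.

```python
import pandas as pd
import statsmodels.formula.api as smf
from lifelines import CoxPHFitter

cohort = pd.read_csv("bbb_aad_cohort.csv")   # hypothetical: one row per patient at index exposure
# class_i = 1 for class I AAD exposure, 0 for class III AAD exposure

# Propensity score for receiving a class I (vs class III) AAD
ps_model = smf.logit("class_i ~ age + C(sex) + heart_failure + prior_mi", data=cohort).fit(disp=0)
cohort["ps_stratum"] = pd.qcut(ps_model.predict(cohort), 5, labels=False)

# Cox model for incident AV block, stratifying the baseline hazard by propensity-score quintile
cph = CoxPHFitter()
cph.fit(cohort[["time_to_avb_or_censor", "incident_avb", "class_i", "ps_stratum"]],
        duration_col="time_to_avb_or_censor", event_col="incident_avb",
        strata=["ps_stratum"])
cph.print_summary()   # hazard ratio for class I vs class III exposure
```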


2011 ◽  
Vol 29 (30) ◽  
pp. 4029-4035 ◽  
Author(s):  
David J. Biau ◽  
Peter C. Ferguson ◽  
Robert E. Turcotte ◽  
Peter Chung ◽  
Marc H. Isler ◽  
...  

Purpose To examine the effect of age on the recurrence of soft tissue sarcoma in the extremities and trunk. Patients and Methods This was a multicenter study that included 2,385 patients with a median age at surgery of 57 years. The end points considered were local recurrence and metastasis. Cox proportional hazards models were used to estimate hazard ratios across the age range with and without adjustment for known confounding factors. Results Older patients presented with tumors that were larger (P < .001) and of higher grade (P < .001). The proportion of positive margins increased significantly with patient age (P < .001), yet radiation therapy was relatively underused in patients older than age 60 years. The 5-year cumulative incidences of local recurrence were 7.2% (95% CI, 4% to 11.7%) for patients age 30 years or younger and 12.9% (95% CI, 9.1% to 17.5%) for patients age 75 years or older. The corresponding 5-year cumulative incidences of metastasis were 17.5% (95% CI, 12.1% to 23.7%) and 33.9% (95% CI, 28.1% to 39.8%) for the same groups. In unadjusted models, age was significantly associated with local recurrence (P < .001) and metastasis (P < .001). After adjusting for imbalances in presentation and treatment variables, age remained significantly associated with local recurrence (P = .031) and metastasis (P = .019). Conclusion Older patients have worse outcomes because they tend to present with worse tumors and are treated less aggressively. However, there remained a significant increase in the risk of both local and systemic recurrence associated with increasing age that could not be explained by tumor or treatment characteristics.
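The 5-year cumulative incidence estimates and the unadjusted/adjusted age models could be reproduced along the lines below. This sketch uses lifelines' Aalen-Johansen estimator with death treated as a competing event (one common convention, which may differ from the authors' exact method) and hypothetical column names; it is not the authors' analysis code.

```python
import pandas as pd
from lifelines import AalenJohansenFitter, CoxPHFitter

sts = pd.read_csv("sts_cohort.csv")   # hypothetical soft tissue sarcoma analysis file
# event_code: 0 = censored, 1 = local recurrence, 2 = competing event (e.g., death)

# 5-year cumulative incidence of local recurrence for patients age 30 or younger
young = sts[sts["age"] <= 30]
ajf = AalenJohansenFitter().fit(young["years"], young["event_code"], event_of_interest=1)
print(ajf.cumulative_density_.loc[:5].tail(1))

# Age and local recurrence: unadjusted, then adjusted for presentation/treatment variables
base_cols = ["years", "local_recurrence", "age"]
adjusted_cols = base_cols + ["grade", "size_cm", "margin_positive", "radiation"]  # numeric codings
CoxPHFitter().fit(sts[base_cols], duration_col="years",
                  event_col="local_recurrence").print_summary()
CoxPHFitter().fit(sts[adjusted_cols], duration_col="years",
                  event_col="local_recurrence").print_summary()
```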


Circulation ◽  
2007 ◽  
Vol 116 (suppl_16) ◽  
Author(s):  
Toru Aoyama ◽  
Hideki Ishii ◽  
Hiroshi Takahashi ◽  
Takanobu Toriyama ◽  
...  

Background: Cardiovascular (CV) events and mortality are significantly higher in hemodialysis (HD) patients than in the general population. Although predicting CV events in long-term HD patients is of clinical importance, more powerful predictors are still under exploration. We investigated whether silent brain infarction (SBI) is a predictive factor for future CV events and mortality in a large cohort of long-term HD patients. Methods: After cranial magnetic resonance imaging to detect SBI, 202 long-term HD patients (HD duration 7.1 ± 5.9 years) without symptomatic stroke were prospectively followed up until the occurrence of a CV event (stroke, cardiac event, or death). We analyzed the prognostic role of SBI in CV events with the Kaplan-Meier method and Cox proportional hazards analysis. Results: The prevalence of SBI (71.8% of all patients) was considerably higher than in previous reports. Overall, 60 patients experienced CV disease (31 coronary artery disease, 7 congestive heart failure, 14 symptomatic stroke) and 29 patients died (16 of CV causes) during follow-up (mean 23 ± 13 months). In subgroup analysis by SBI status, the 4-year CV event-free survival rate was significantly lower in patients with SBI than in those without SBI (54.6% vs. 86.7%, p = 0.0003). CV and overall mortality were also significantly higher in SBI patients than in non-SBI patients (CV mortality 20.5% vs. 4.3%; overall mortality 29.0% vs. 9.1%; p < 0.01 for both). Cox proportional hazards models showed that the presence of SBI was a significant predictor of cerebrovascular and CV events and of CV and overall mortality, even after adjustment for other CV risk factors. Conclusion: SBI detected with MRI is a powerful predictor of CV events and mortality in long-term HD patients.
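A minimal sketch of the Kaplan-Meier comparison by SBI status, the corresponding log-rank test, and an adjusted Cox model follows; the file and column names are hypothetical, and the adjustment covariates are illustrative rather than those used in the study.

```python
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter
from lifelines.statistics import logrank_test

hd = pd.read_csv("hd_sbi_cohort.csv")   # hypothetical: 202 long-term HD patients
sbi, no_sbi = hd[hd["sbi"] == 1], hd[hd["sbi"] == 0]

# CV event-free survival curves by SBI status
for label, grp in [("SBI", sbi), ("no SBI", no_sbi)]:
    KaplanMeierFitter().fit(grp["months"], grp["cv_event"], label=label).plot_survival_function()

# Log-rank test for the difference between the two curves
print(logrank_test(sbi["months"], no_sbi["months"],
                   event_observed_A=sbi["cv_event"],
                   event_observed_B=no_sbi["cv_event"]).p_value)

# Cox model: SBI plus conventional CV risk factors (illustrative covariates)
CoxPHFitter().fit(hd[["months", "cv_event", "sbi", "age", "diabetes", "hypertension"]],
                  duration_col="months", event_col="cv_event").print_summary()
```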


Author(s):  
David A. Baran ◽  
Justin Lansinger ◽  
Ashleigh Long ◽  
John M. Herre ◽  
Amin Yehya ◽  
...  

Background: The opioid crisis has led to an increase in available donor hearts, although questions remain about the long-term outcomes associated with the use of these organs. Prior studies have relied on historical information without examining the toxicology results at the time of organ offer. The objectives of this study were to examine the long-term survival of heart transplants in the recent era, stratified by the results of toxicological testing at the time of organ offer, and to compare the toxicology at the time of donation with variables based on reported history. Methods: The United Network for Organ Sharing database was requested, along with the donor toxicology field. Between 2007 and 2017, 23,748 adult heart transplants were performed. United Network for Organ Sharing historical variables formed a United Network for Organ Sharing Toxicology Score, and the measured toxicology results formed a Measured Toxicology Score. Survival was examined by the United Network for Organ Sharing Toxicology Score and the Measured Toxicology Score, as well as with Cox proportional hazards models incorporating a variety of risk factors. Results: The number and percentage of donors with drug use increased significantly over the study period (P < 0.0001). Cox proportional hazards modeling of survival including toxicological and historical data did not demonstrate differences in post-transplant mortality. Combinations of drugs identified by toxicology were not associated with differences in survival. Lower donor age and shorter ischemic time were significantly associated with better survival (P < 0.0001). Conclusions: Among donors accepted for transplantation, neither history nor toxicological evidence of drug use was associated with significant differences in survival. Increasing use of such donors may help alleviate the chronic donor shortage.
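The score-based survival comparison could look roughly like the sketch below, where a simple count of positive toxicology screens stands in for the Measured Toxicology Score; the drug fields and other column names are hypothetical.

```python
import pandas as pd
from lifelines import CoxPHFitter

tx = pd.read_csv("unos_hearts_2007_2017.csv")   # hypothetical extract with donor toxicology fields

# Count of positive measured toxicology screens (stand-in for the Measured Toxicology Score)
drug_cols = ["tox_opioid", "tox_cocaine", "tox_amphetamine", "tox_thc"]
tx["measured_tox_score"] = tx[drug_cols].sum(axis=1)

# Post-transplant survival: toxicology score alongside donor age and ischemic time
cph = CoxPHFitter()
cph.fit(tx[["survival_years", "death", "measured_tox_score", "donor_age", "ischemic_time_hr"]],
        duration_col="survival_years", event_col="death")
print(cph.hazard_ratios_)   # the abstract reports no mortality difference by toxicology score,
                            # with donor age and ischemic time as the significant predictors
```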


2021 ◽  
Author(s):  
Lisa Mirel

This report describes a comparative analysis of the public-use and restricted-use NHANES Linked Mortality Files (LMFs). Cox proportional hazards models were used to estimate relative hazard ratios for a standard set of sociodemographic covariates, for all-cause as well as cause-specific mortality, using both the public-use and restricted-use NHANES LMFs.
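The comparison amounts to fitting the same Cox specification on both files and placing the hazard ratios side by side, as in this hypothetical sketch (the covariate names, codings, and file names are placeholders, and the covariates are assumed to be numerically coded).

```python
import pandas as pd
from lifelines import CoxPHFitter

covariates = ["age_group", "male", "race_ethnicity_code", "education_level"]  # hypothetical codings

def all_cause_hrs(df: pd.DataFrame) -> pd.Series:
    """All-cause mortality Cox model over a standard set of sociodemographic covariates."""
    cph = CoxPHFitter()
    cph.fit(df[["followup_months", "died"] + covariates],
            duration_col="followup_months", event_col="died")
    return cph.hazard_ratios_

public = pd.read_csv("nhanes_lmf_public.csv")          # hypothetical analysis files
restricted = pd.read_csv("nhanes_lmf_restricted.csv")

# Side-by-side hazard ratios from the two versions of the linked mortality files
print(pd.DataFrame({"public_use": all_cause_hrs(public),
                    "restricted_use": all_cause_hrs(restricted)}))
```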


Neurosurgery ◽  
2019 ◽  
Vol 87 (1) ◽  
pp. 63-70
Author(s):  
Haruhisa Fukuda ◽  
Daisuke Sato ◽  
Yoriko Kato ◽  
Wataro Tsuruta ◽  
Masahiro Katsumata ◽  
...  

Abstract BACKGROUND Flow diverters (FDs) have marked the beginning of innovations in the endovascular treatment of large unruptured intracranial aneurysms, but no multi-institutional studies have examined these devices from both the clinical and economic perspectives. OBJECTIVE To compare retreatment rates and healthcare expenditures between FDs and conventional coiling-based treatments in all eligible cases in Japan. METHODS We identified patients who had undergone endovascular treatment during the study period (October 2015-March 2018) from a national-level claims database. The outcome measures were retreatment rates and 1-year total healthcare expenditures, which were compared among patients who had undergone FD, coiling, and stent-assisted coiling (SAC) treatments. The coiling and SAC groups were further categorized according to the number of coils used. Retreatment rates were analyzed using Cox proportional hazards models, and total expenditures were analyzed using multilevel mixed-effects generalized linear models. RESULTS The study sample comprised 512 FD patients, 1499 coiling patients, and 711 SAC patients. The coiling groups with ≥10 coils and ≥9 coils had significantly higher retreatment rates than the FD group, with hazard ratios (95% CI) of 2.75 (1.30–5.82) and 2.52 (1.24–5.09), respectively. In addition, the coiling group with ≥10 coils and the SAC group with ≥10 coils had significantly higher 1-year expenditures than the FD group, with cost ratios (95% CI) of 1.30 (1.13–1.49) and 1.31 (1.15–1.50), respectively. CONCLUSION In this national-level study, FDs demonstrated significantly lower retreatment rates and total expenditures than conventional coiling with ≥9 coils.
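A compressed sketch of the two outcome models is given below: a Cox model for retreatment with treatment-group indicators (FD as reference), and, in place of the study's multilevel mixed-effects generalized linear model, a GEE with a Gamma family, log link, and hospital-level clustering as a rough stand-in for the expenditure analysis. The group labels, file, and column names are hypothetical.

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf
from lifelines import CoxPHFitter

claims = pd.read_csv("aneurysm_claims.csv")   # hypothetical patient-level extract
groups = ["FD", "coiling_lt10", "coiling_ge10", "SAC_lt10", "SAC_ge10"]   # hypothetical labels

# Retreatment: Cox PH model with treatment-group indicators, FD dropped as the reference
design = pd.get_dummies(pd.Categorical(claims["group"], categories=groups),
                        drop_first=True).astype(float)
design[["months", "retreated"]] = claims[["months", "retreated"]]
CoxPHFitter().fit(design, duration_col="months", event_col="retreated").print_summary()

# 1-year expenditures: GEE with Gamma family, log link, clustered by treating hospital
cost_model = smf.gee("total_cost_1y ~ C(group, Treatment(reference='FD'))",
                     groups="hospital_id", data=claims,
                     family=sm.families.Gamma(link=sm.families.links.Log()),
                     cov_struct=sm.cov_struct.Exchangeable()).fit()
print(cost_model.summary())   # exponentiated coefficients approximate cost ratios vs FD
```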


2016 ◽  
Vol 19 (16) ◽  
pp. 2991-2998 ◽  
Author(s):  
Jiang-Wei Sun ◽  
Xiao-Ou Shu ◽  
Hong-Lan Li ◽  
Wei Zhang ◽  
Jing Gao ◽  
...  

Abstract Objective To investigate the potential influence of dietary Se intake on mortality among Chinese populations. Design We prospectively evaluated all-cause, CVD, and cancer mortality risks associated with dietary Se intake in participants of the Shanghai Women's Health Study (SWHS) and the Shanghai Men's Health Study (SMHS). Dietary Se intake was assessed by validated FFQ during in-person interviews. Cox proportional hazards models were used to calculate hazard ratios (HR) and 95% CI. Setting Urban city in China. Subjects Chinese adults (n = 133,957). Results During an average follow-up of 13.90 years in the SWHS and 8.37 years in the SMHS, 5749 women and 4217 men died. The mean estimated dietary Se intake was 45.48 μg/d for women and 51.34 μg/d for men. Dietary Se intake was inversely associated with all-cause mortality and CVD mortality in both women and men, with respective HRs for the highest compared with the lowest quintile of 0.79 (95% CI 0.71, 0.88; P-trend < 0.0001) and 0.80 (95% CI 0.66, 0.98; P-trend = 0.0268) for women, and 0.79 (95% CI 0.70, 0.89; P-trend = 0.0001) and 0.66 (95% CI 0.54, 0.82; P-trend = 0.0002) for men. No significant associations were observed for cancer mortality in either women or men. Results were similar in subgroup and sensitivity analyses. Conclusions Dietary Se intake was inversely associated with all-cause and cardiovascular mortality in both sexes, but not with cancer mortality.
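The quintile contrasts and the P for trend could be computed along these lines; as elsewhere, the file and column names are hypothetical and the multivariable covariates are omitted.

```python
import pandas as pd
from lifelines import CoxPHFitter

shanghai = pd.read_csv("swhs_smhs_pooled.csv")   # hypothetical pooled analysis file

# Quintiles of dietary Se intake: indicator variables give the highest-vs-lowest HR,
# while the trend test uses each quintile's median intake as a single continuous term
shanghai["se_q"] = pd.qcut(shanghai["se_intake_ug"], 5, labels=False)
shanghai["se_q_median"] = shanghai.groupby("se_q")["se_intake_ug"].transform("median")

dummies = pd.get_dummies(shanghai["se_q"], prefix="q", drop_first=True).astype(float)
design = pd.concat([shanghai[["followup_years", "died"]], dummies], axis=1)
CoxPHFitter().fit(design, duration_col="followup_years", event_col="died").print_summary()

trend = CoxPHFitter().fit(shanghai[["followup_years", "died", "se_q_median"]],
                          duration_col="followup_years", event_col="died")
print(trend.summary["p"])   # P for trend across quintiles
```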


2021 ◽  
pp. 1-14 ◽  
Author(s):  
Olga Mitelman ◽  
Hoda Z. Abdel-Hamid ◽  
Barry J. Byrne ◽  
Anne M. Connolly ◽  
Peter Heydemann ◽  
...  

Background: Studies 4658-201/202 (201/202) evaluated the treatment effects of eteplirsen over 4 years in patients with Duchenne muscular dystrophy and confirmed mutations amenable to exon 51 skipping. The chart-review Study 4658-405 (405) further followed these patients while they received eteplirsen during usual clinical care. Objective: To compare long-term clinical outcomes of eteplirsen-treated patients from Studies 201/202/405 with those of external controls. Methods: Median total follow-up time was approximately 6 years of eteplirsen treatment. Outcomes included loss of ambulation (LOA) and percent-predicted forced vital capacity (FVC%p). Time to LOA was compared between eteplirsen-treated patients and standard of care (SOC) external controls and was measured from eteplirsen initiation in 201/202 or, in the SOC group, from the first study visit. Comparisons were conducted using univariate Kaplan-Meier analyses and log-rank tests, and multivariate Cox proportional hazards models with regression adjustment for baseline characteristics. Annual change in FVC%p was compared between eteplirsen-treated patients and natural history study patients using linear mixed models with repeated measures. Results: Data were included from all 12 patients in Studies 201/202 and from the 10 patients with available data in Study 405. Median age at LOA was 15.16 years. Eteplirsen-treated patients experienced a significantly longer median time to LOA, by 2.09 years (5.09 vs. 3.00 years, p < 0.01), and significantly attenuated rates of pulmonary decline compared with natural history patients (FVC%p change: –3.3 vs. –6.0 percentage points annually, p < 0.0001). Conclusions: Study 405 highlights the functional benefits of eteplirsen on ambulatory and pulmonary function outcomes over up to 7 years of follow-up in comparison with external controls.
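The two analyses described (time to LOA via Kaplan-Meier and log-rank tests, and annual FVC%p change via a linear mixed model with repeated measures) are sketched below with hypothetical data files and column names; the Cox regression adjustment for baseline characteristics is omitted for brevity.

```python
import pandas as pd
import statsmodels.formula.api as smf
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

loa = pd.read_csv("loa_timelines.csv")      # hypothetical: one row per patient (eteplirsen + SOC)
fvc = pd.read_csv("fvc_longitudinal.csv")   # hypothetical: repeated FVC%p measurements

etep, soc = loa[loa["eteplirsen"] == 1], loa[loa["eteplirsen"] == 0]

# Time to loss of ambulation: median times and a log-rank comparison
for label, grp in [("eteplirsen", etep), ("SOC controls", soc)]:
    km = KaplanMeierFitter().fit(grp["years_to_loa"], grp["lost_ambulation"], label=label)
    print(label, km.median_survival_time_)
print(logrank_test(etep["years_to_loa"], soc["years_to_loa"],
                   event_observed_A=etep["lost_ambulation"],
                   event_observed_B=soc["lost_ambulation"]).p_value)

# Annual FVC%p change: random intercept and slope per patient;
# the years:eteplirsen interaction estimates the difference in annual decline
mixed = smf.mixedlm("fvc_pct_pred ~ years * eteplirsen", data=fvc,
                    groups=fvc["patient_id"], re_formula="~years").fit()
print(mixed.summary())
```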

