Zinc could be a target nutrient in the prevention of physical impairment and frailty in older adults due to its anti-inflammatory/antioxidant properties. However, prospective studies addressing this question are scarce. Thus, we aimed to assess the association between zinc intake and impaired lower-extremity function (ILEF) and frailty among community-dwelling older adults.
We examined 2,963 adults aged ≥60 years from the Seniors-ENRICA cohort. At baseline (2008–2010) and subsequent follow-up (2012), zinc intake (mg/d) was estimated with a validated computerized face-to-face diet history and adjusted for total energy intake. From 2012 to 2017, the occurrence of ILEF was ascertained with the Short Physical Performance Battery, and of frailty according to the Fried phenotype criteria. Analyses were conducted using Cox proportional hazards models adjusted for relevant confounders, including lifestyle, comorbidity, and dietary factors.
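Energy adjustment of a nutrient intake is often carried out with the residual method (regressing nutrient on total energy and keeping the residuals); the abstract does not state which method was used, so the following numpy sketch with made-up data is only illustrative.

```python
import numpy as np

# Hypothetical data: energy-adjust zinc intake via the residual method.
# All values are simulated; nothing here comes from the study itself.
rng = np.random.default_rng(0)
energy = rng.normal(2000, 400, 200)             # total energy intake, kcal/d
zinc = 0.004 * energy + rng.normal(0, 1, 200)   # zinc intake, mg/d

# Regress zinc on energy and keep the residuals, re-centered at the
# zinc intake predicted for the mean energy intake.
X = np.column_stack([np.ones_like(energy), energy])
beta, *_ = np.linalg.lstsq(X, zinc, rcond=None)
adjusted_zinc = zinc - X @ beta + (beta[0] + beta[1] * energy.mean())

# By construction, the adjusted intake is uncorrelated with energy.
print(float(np.corrcoef(adjusted_zinc, energy)[0, 1]))
```

The re-centering step keeps the adjusted values on the original mg/d scale rather than leaving them as zero-mean residuals.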
During follow-up, we identified 515 incident cases of ILEF and 241 of frailty. Compared to participants in the lowest tertile of zinc intake (3.99–8.36 mg/d), those in the highest tertile (9.51–21.2 mg/d) had a lower risk of ILEF [fully-adjusted hazard ratio (95% confidence interval): 0.75 (0.58–0.97); p for trend: 0.03] and of frailty [0.63 (0.44–0.92); p for trend: 0.02]. No differences in the association were seen by strata of socio-demographic and lifestyle factors.
Higher zinc intake was prospectively associated with a lower risk of ILEF and frailty among older adults, suggesting that adequate zinc intake, which can be achieved through a healthy diet, may help preserve physical function and reduce the progression to frailty.
Previous studies have explored associations between betaine and diabetes, but few have considered genetic influences on this relationship. We aimed to examine the associations between serum betaine, methyl-metabolizing genetic polymorphisms, and the risk of type 2 diabetes in Chinese adults. This prospective study comprised 1565 subjects aged 40–75 years without type 2 diabetes at baseline. Serum betaine was measured by high-performance liquid chromatography–tandem mass spectrometry. Methyl-metabolizing genes were genotyped with Illumina ASA-750K arrays. Cox proportional hazards models were used to estimate hazard ratios (HRs) and 95% confidence intervals (CIs). During a median of 8.9 years of follow-up, 213 participants developed type 2 diabetes. Compared with participants in the lowest quartile of serum betaine, those in the highest quartile had a lower risk of type 2 diabetes (adjusted HR [95% CI]: 0.46 [0.31, 0.69]). For methylenetetrahydrofolate reductase (MTHFR) G1793A (rs2274976) and MTHFR A1298C (rs1801131), participants carrying the 1793GA + AA and 1298AC + CC genotypes had a lower risk of type 2 diabetes. Serum betaine also interacted with MTHFR G1793A and A1298C genotypes in influencing type 2 diabetes risk. Our findings indicate that higher serum betaine, variant genotypes of MTHFR G1793A and A1298C, and their joint effects are associated with a lower risk of type 2 diabetes.
Aims: Observational studies of various dose levels of direct oral anticoagulants (DOACs) for patients with atrial fibrillation (AF) found that a high proportion of patients received a dose lower than the target dose tested in randomized controlled trials. There is a need to compare low-dose DOACs with warfarin or other DOACs in terms of effectiveness and safety. Methods: Using administrative data from the province of Quebec, Canada, we built a cohort of new warfarin or DOAC users discharged from hospital between 2011 and 2017. We determined CHA2DS2-VASc and HAS-BLED scores and comorbidities over the 3 years prior to cohort entry. The primary effectiveness endpoint was a composite of ischemic stroke/systemic embolism (SE); secondary outcomes included a safety composite of major bleeding (MB) events and an effectiveness composite (stroke/SE, death) at 1-year follow-up. We contrasted each low-dose DOAC with warfarin or other DOACs as references, using inverse probability of treatment weighting to estimate marginal Cox hazard ratios (HRs). Results: The cohort comprised 22,969 patients (mean age: 80–86 years). We did not find a significant risk reduction for the stroke/SE primary effectiveness endpoint for DOACs vs. warfarin; however, we observed a significantly lower risk for low-dose dabigatran vs. warfarin (HR [95% CI]: 0.59 [0.42–0.81]) for the effectiveness composite, mainly due to a lower death rate. The differences in effectiveness and safety composites between low-dose rivaroxaban and warfarin were not significant. However, low-dose apixaban had a better safety composite (HR: 0.68 [0.53–0.88]) vs. warfarin. Comparisons of dabigatran vs. apixaban showed a lower risk of stroke/SE (HR: 0.53 [0.30–0.93]) and a 2-fold higher risk of MB. The MB risk was higher for rivaroxaban than for apixaban (HR: 1.58 [1.09–2.29]). Conclusions: The results of this population-based study suggest that low-dose dabigatran has a better effectiveness composite than warfarin.
Compared with apixaban, low-dose dabigatran had a better effectiveness composite but a worse safety profile. Low-dose apixaban had a better safety composite than warfarin and other low-dose DOACs. Given that the comparative effectiveness and safety seem to vary from one DOAC to another, pharmacokinetic data for specific populations are now warranted.
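The inverse probability of treatment weighting used in this design can be sketched in a few lines, assuming propensity scores have already been estimated from baseline covariates; the treatment indicators and scores below are entirely hypothetical.

```python
import numpy as np

# Hypothetical inputs: treatment received and estimated propensity scores.
treated = np.array([1, 1, 0, 0, 0])        # 1 = low-dose DOAC, 0 = warfarin
ps = np.array([0.8, 0.6, 0.3, 0.2, 0.5])   # estimated P(treatment | covariates)

# Each subject is weighted by the inverse probability of the treatment
# actually received, creating a pseudo-population in which treatment is
# independent of the measured covariates.
weights = np.where(treated == 1, 1.0 / ps, 1.0 / (1.0 - ps))
print(weights)
```

These weights would then be passed to a weighted Cox model to obtain the marginal hazard ratios; in practice, stabilized or truncated weights are often preferred when propensity scores are extreme.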
Several epidemiological studies have suggested that vitamin D status is associated with risk of dementia in general populations. However, due to the synergistic effect between diabetic pathology and neuroinflammation, and the prothrombotic profile in patients with diabetes, whether vitamin D is associated with risk of dementia among patients with diabetes is unclear. This study aimed to investigate the associations of circulating vitamin D levels with risks of all-cause dementia, Alzheimer disease (AD), and vascular dementia (VD) among adults with type 2 diabetes (T2D).
Methods and findings
This study included 13,486 individuals (≥60 years) with T2D and free of dementia at recruitment (2006–2010) from the UK Biobank study. Serum 25-hydroxyvitamin D (25[OH]D) concentrations were measured using the chemiluminescent immunoassay method at recruitment. Serum 25(OH)D ≥ 75 nmol/L was considered sufficient, according to the Endocrine Society Clinical Practice Guidelines. Incident all-cause dementia, AD, and VD cases were ascertained using electronic health records (EHRs). Each participant’s person-years at risk were calculated from the date of recruitment to the date that dementia was reported, date of death, date of loss to follow-up, or 28 February 2018, whichever occurred first. Among the 13,486 individuals with T2D (mean age, 64.6 years; men, 64.3%), 38.3% had vitamin D ≥ 50 nmol/L and only 9.1% had vitamin D ≥ 75 nmol/L. During a mean follow-up of 8.5 years, we observed 283 cases of all-cause dementia, including 101 AD and 97 VD cases. Restricted cubic spline analysis demonstrated a nonlinear relationship between serum 25(OH)D and risk of all-cause dementia (P for nonlinearity < 0.001) and VD (P for nonlinearity = 0.007), and the nonlinear association reached borderline significance for AD (P for nonlinearity = 0.06), with a threshold at around a serum 25(OH)D value of 50 nmol/L for all the outcomes. Higher serum levels of 25(OH)D were significantly associated with a lower risk of all-cause dementia, AD, and VD. The multivariate hazard ratios and 95% confidence intervals for participants who had serum 25(OH)D ≥ 50 nmol/L, compared with those who were severely deficient (25[OH]D < 25 nmol/L), were 0.41 (0.29–0.60) for all-cause dementia (P for trend < 0.001), 0.50 (0.27–0.92) for AD (P for trend = 0.06), and 0.41 (0.22–0.77) for VD (P for trend = 0.01). The main limitation of the current analysis was the potential underreporting of dementia cases, as the cases were identified via EHRs.
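The person-years bookkeeping described above (follow-up ends at the earliest of dementia diagnosis, death, loss to follow-up, or the administrative censoring date) can be sketched with the standard library; the dates below are illustrative, not study data.

```python
from datetime import date

# Administrative end of follow-up, as stated in the abstract.
ADMIN_END = date(2018, 2, 28)

def person_years(recruited, dementia=None, death=None, lost=None):
    """Person-years at risk: recruitment to the earliest terminating event."""
    candidates = [d for d in (dementia, death, lost, ADMIN_END) if d is not None]
    end = min(candidates)
    return (end - recruited).days / 365.25

# Hypothetical participant recruited mid-2008, diagnosed with dementia in 2015:
print(round(person_years(date(2008, 6, 1), dementia=date(2015, 6, 1)), 1))
```

Summing these contributions over all participants gives the denominator for incidence rates, while the event indicator at the censoring date feeds the Cox model.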
In this study, we observed that higher concentrations of serum 25(OH)D were significantly associated with a lower risk of all-cause dementia, AD, and VD among individuals with T2D. Our findings, if confirmed by replication, may have relevance for dementia prevention strategies that target improving or maintaining serum vitamin D concentrations among patients with T2D.
Aim: Studies have reported conflicting results regarding the ‘off-hours effect’ in nonvariceal upper gastrointestinal bleeding (NVUGIB). Materials & methods: A total of 301 patients with NVUGIB were divided into a regular-hours group and an off-hours group based on when they received endoscopic hemostasis, and the relationship between clinical outcomes and off-hours endoscopic hemostasis was evaluated. Results: Patients who received off-hours endoscopy were sicker and more likely to experience worse clinical outcomes. Off-hours endoscopic hemostasis was a significant predictor of the composite outcome in higher-risk patients (adjusted OR: 4.63; 95% CI: 1.35–15.90). However, it was not associated with the outcomes in lower-risk patients. Conclusion: The off-hours effect may affect outcomes of higher-risk NVUGIB patients (Glasgow-Blatchford score [GBS] ≥12) receiving endoscopic hemostasis.
Evidence suggests that chronic obstructive pulmonary disease (COPD) is associated with a higher risk of lung carcinoma. Using a territory-wide clinical electronic medical records system, we investigated the association between low-dose aspirin use (≤160 mg) among patients with COPD and incidence of lung carcinoma and the corresponding risk of bleeding.
Methods and findings
This is a retrospective cohort study conducted using the Clinical Data Analysis and Reporting System (CDARS), a territory-wide database developed by the Hong Kong Hospital Authority. Inverse probability of treatment weighting (IPTW) was used to balance baseline covariates between aspirin nonusers (35,049 patients) and new aspirin users (7,679 patients) among all eligible COPD patients attending any public hospital from 2005 to 2018. The median age of the cohort was 75.7 years (SD = 11.5), and 80.3% were male. Competing-risks regression with Cox proportional hazards models was performed to estimate the subdistribution hazard ratio (SHR) of lung carcinoma with low-dose aspirin and of the associated bleeding events. Of all eligible patients, 1,779 (4.2%; 1,526 nonusers and 253 users) were diagnosed with lung carcinoma over a median follow-up period of 2.6 years (interquartile range [IQR]: 1.4 to 4.8). Aspirin use was associated with a 25% lower risk of lung carcinoma (SHR = 0.75, 95% confidence interval [CI] 0.65 to 0.87, p < 0.001) and a 26% decrease in lung carcinoma–related mortality (SHR = 0.74, 95% CI 0.64 to 0.86, p < 0.001). Subgroup analysis revealed that aspirin was beneficial for patients aged both above and below 75 years, as well as among patients who were male, nondiabetic, and nonhypertensive. Aspirin use was not associated with an increased risk of upper gastrointestinal bleeding (UGIB) (SHR = 1.19, 95% CI 0.94 to 1.53, p = 0.16), but was associated with an increased risk of hemoptysis (SHR = 1.96, 95% CI 1.73 to 2.23, p < 0.001). The main limitations of the study were (i) that one group of patients may have been more likely to seek additional medical attention, although this was partially mitigated by the use of propensity score analysis; and (ii) that the observational nature of the study renders it unable to establish causality between aspirin use and lung carcinoma incidence.
In this study, we observed that low-dose aspirin use was associated with a lower risk of lung carcinoma and lung carcinoma–related mortality among COPD patients. While aspirin was not associated with an increased risk of UGIB, the risk of hemoptysis was elevated.
Adherence to the Mediterranean-DASH Intervention for Neurodegenerative Delay (MIND) diet has been linked to a decreased risk of dementia, but reverse causality and residual confounding by lifestyle may partly account for this link. We aimed to address these issues in two ways: by studying associations over cumulative time periods, which may provide insight into possible reverse causality, and by using both historical and more contemporary dietary data, which could shed light on confounding, since historical data may be less affected by lifestyle factors.
In the population-based Rotterdam Study, dietary intake was assessed using validated food frequency questionnaires in 5375 participants between 1989 and 1993 (baseline I) and in a largely non-overlapping sample in 2861 participants between 2009 and 2013 (baseline II). We calculated the MIND diet score and studied its association with the risk of all-cause dementia, using Cox models. Incident all-cause dementia was recorded until 2018.
During a mean follow-up of 15.6 years from baseline I, 1188 participants developed dementia. A higher MIND diet score at baseline I was associated with a lower risk of dementia over the first 7 years of follow-up (hazard ratio (HR) [95% confidence interval (CI)] per standard deviation (SD) increase, 0.85 [0.74, 0.98]), but associations disappeared over longer follow-up intervals. The mean follow-up from baseline II was 5.9 years during which 248 participants developed dementia. A higher MIND diet score at baseline II was associated with a lower risk of dementia over every follow-up interval, but associations slightly attenuated over time (HR [95% CI] for 7 years follow-up per SD increase, 0.76 [0.66, 0.87]). The MIND diet score at baseline II was more strongly associated with the risk of dementia than the MIND diet score at baseline I.
Better adherence to the MIND diet is associated with a decreased risk of dementia within the first years of follow-up, but this may in part be explained by reverse causality and residual confounding by lifestyle. Further research is needed to unravel to which extent the MIND diet may affect the risk of dementia.
Background The mechanisms underlying long-term sequelae following acute kidney injury (AKI) remain unclear. Vessel instability, an early response to endothelial injury, may reflect a shared mechanism and early trigger for chronic kidney disease (CKD) and heart failure.
Methods To investigate whether plasma angiopoietins, markers of vessel homeostasis, are associated with CKD progression and heart failure admissions after hospitalization in patients with and without AKI, we conducted a prospective cohort study analyzing the balance between angiopoietin-1 (Angpt-1), which maintains vessel stability, and angiopoietin-2 (Angpt-2), which promotes vessel destabilization. Angiopoietins were measured 3 months after discharge, and we evaluated their associations with the primary outcomes of CKD progression and heart failure, as well as the secondary outcome of all-cause mortality, occurring 3 months after discharge or later.
Results Median age for the 1503 participants was 65.8 years; 746 (50%) had AKI. Compared with the lowest quartile, the highest quartile of the Angpt-1:Angpt-2 ratio was associated with 72% lower risk of CKD progression (adjusted hazard ratio [aHR], 0.28; 95% confidence interval [95% CI], 0.15 to 0.51), 94% lower risk of heart failure (aHR, 0.06; 95% CI, 0.02 to 0.15), and 82% lower risk of mortality (aHR, 0.18; 95% CI, 0.09 to 0.35) for those with AKI. Among those without AKI, the highest quartile of the Angpt-1:Angpt-2 ratio was associated with 71% lower risk of heart failure (aHR, 0.29; 95% CI, 0.12 to 0.69) and 68% lower risk of mortality (aHR, 0.32; 95% CI, 0.15 to 0.68). There was no association with CKD progression in this group.
Conclusions A higher Angpt-1:Angpt-2 ratio was strongly associated with less CKD progression, heart failure, and mortality in the setting of AKI.
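The quartile grouping underlying the Q4-vs.-Q1 comparisons of the Angpt-1:Angpt-2 ratio can be illustrated with synthetic data; the variable names and lognormal distributions below are assumptions, not study values.

```python
import numpy as np

# Simulated biomarker levels for 1000 hypothetical participants.
rng = np.random.default_rng(1)
angpt1 = rng.lognormal(3.0, 0.5, 1000)   # hypothetical Angpt-1 concentrations
angpt2 = rng.lognormal(2.0, 0.5, 1000)   # hypothetical Angpt-2 concentrations
ratio = angpt1 / angpt2

# Sample quartile cut points, then group assignment: 0 = lowest quartile
# (the reference group), 3 = highest quartile.
cuts = np.quantile(ratio, [0.25, 0.5, 0.75])
quartile = np.digitize(ratio, cuts)
print(np.bincount(quartile))  # 250 participants per quartile
```

The resulting group indicator would enter the Cox model as a categorical covariate, with the lowest quartile as the reference level for the reported hazard ratios.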