Time-Varying Risk of Atrial Fibrillation in Patients With Medically and Surgically Treated Primary Aldosteronism: A Nationwide Cohort Study

2021 ◽  
Vol 5 (Supplement_1) ◽  
pp. A301-A301
Author(s):  
Kyoung Jin Kim ◽  
Namki Hong ◽  
Seunghyun Lee ◽  
Yumie Rhee ◽  
Jung Soo Lim

Abstract Evidence of increased cardiovascular risk, especially atrial fibrillation, has been accumulating among patients with primary aldosteronism (PA), but there is still limited information about long-term prognosis related to different treatment strategies. The aim of this study was not only to investigate the incidence of atrial fibrillation, but also to evaluate its time-dependent changes after adrenalectomy (surgery group) or mineralocorticoid receptor antagonist therapy (drug group) in patients with PA compared with those with essential hypertension (EH). From a nationwide cohort in Korea (2003–2017), patients with PA were individually matched for sex, age (±10 years), and index year in a 1:5 ratio with EH controls. The primary end point was the time-varying risk of new-onset atrial fibrillation (NOAF) among patients with PA according to treatment mode compared with EH. The secondary end points were the risks of major adverse cardiovascular events (a composite of non-fatal myocardial infarction, non-fatal stroke, and death from cardiovascular causes), hospitalization for heart failure, and all-cause mortality. Cox proportional-hazards analysis or time-dependent Cox analysis based on Schoenfeld residuals testing was performed. We enrolled 1,418 PA patients (755 in the PA surgery group and 663 in the PA drug group) and matched these with 7,090 EH controls over a median follow-up of 5 years. The risk of incident NOAF was significantly higher in patients with PA (both surgery and drug groups) within the first three years after diagnosis (adjusted hazard ratio, 3.02; p<0.001), whereas there was no statistically significant difference after three years compared with EH (adjusted hazard ratio, 0.50; p=0.053). Patients in the PA drug group had a higher risk of non-fatal stroke over the total follow-up period (adjusted hazard ratio, 1.53; p=0.031), whereas the PA surgery group did not. In contrast, patients with PA showed no statistically significant differences in risk for the other secondary cardiovascular outcomes. In conclusion, this propensity cohort study of adults with PA demonstrated a time-varying risk of NOAF, possibly due to the residual effect of inappropriate aldosterone levels. These findings can provide clinically relevant guidance for monitoring cardiovascular complications, especially NOAF and non-fatal stroke, even after treatment among patients with PA. Acknowledgements: This study was supported by the Collaborative Research Project of the Korean Endocrine Society and the National Health Insurance Sharing Service (NHIS-2019-4-005). We also thank Minheui Yu and Doori Cho, members of the SENTINEL (Severance ENdocrinology daTa scIeNcE pLatform) team, for technical assistance in searching and summarizing the relevant literature (4-2018-1215).
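As a rough illustration of the time-dependent Cox approach described above (testing proportional hazards with Schoenfeld residuals, then estimating period-specific hazard ratios before and after three years), the following Python sketch uses the lifelines package on synthetic data. The column names and the 3-year split are assumptions for illustration only; this is not the authors' analysis code.

```python
# Minimal sketch (not the authors' code): Schoenfeld-residual check of the
# proportional-hazards assumption, then a piecewise hazard ratio split at 3 years.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter
from lifelines.statistics import proportional_hazard_test

rng = np.random.default_rng(0)
n = 2000
pa = rng.integers(0, 2, n)                        # 1 = PA, 0 = essential hypertension
time = rng.exponential(8, n).clip(0.1, 15)        # follow-up in years (synthetic)
event = rng.binomial(1, np.where(pa == 1, 0.15, 0.07))
df = pd.DataFrame({"time": time, "af_event": event, "pa": pa})

cph = CoxPHFitter().fit(df, duration_col="time", event_col="af_event")
# Schoenfeld-residual-based test of the proportional-hazards assumption
proportional_hazard_test(cph, df, time_transform="rank").print_summary()

# Period-specific hazard ratios: censor everyone at 3 years, then analyse follow-up beyond 3 years
early = df.assign(af_event=np.where(df["time"] <= 3, df["af_event"], 0),
                  time=df["time"].clip(upper=3))
late = df[df["time"] > 3].assign(time=lambda d: d["time"] - 3)

print("HR within 3 years:", CoxPHFitter().fit(early, "time", "af_event").hazard_ratios_["pa"])
print("HR after 3 years: ", CoxPHFitter().fit(late, "time", "af_event").hazard_ratios_["pa"])
```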

2021 ◽  
Vol 184 (1) ◽  
pp. 143-151
Author(s):  
Mijin Kim ◽  
Bo Hyun Kim ◽  
Hyungi Lee ◽  
Hyewon Nam ◽  
Sojeong Park ◽  
...  

Objective Little is known about the role of estrogen in thyroid cancer development. We aimed to evaluate the association between hysterectomy or bilateral salpingo-oophorectomy (BSO) and the risk of subsequent thyroid cancer. Design A nationwide cohort study. Methods Data from the Korea National Health Insurance Service between 2002 and 2017 were used. A total of 78 961 and 592 330 women were included in the surgery group and no surgery group, respectively. The surgery group was categorized into two groups according to the extent of surgery: hysterectomy with ovarian conservation (hysterectomy-only) and BSO with or without hysterectomy (BSO). Results During 8 086 396.4 person-years of follow-up, 12 959 women developed thyroid cancer. Women in the hysterectomy-only (adjusted hazard ratio = 1.7, P < 0.001) and BSO (adjusted hazard ratio = 1.4, P < 0.001) groups had increased risk of thyroid cancer compared to those in the no surgery group. In premenopausal women, hysterectomy-only (adjusted hazard ratio = 1.7, P < 0.001) or BSO (adjusted hazard ratio = 1.4, P < 0.001) increased the risk of subsequent thyroid cancer, irrespective of hormone therapy, whereas there was no significant association between hysterectomy-only (P = 0.204) or BSO (P = 0.857) and thyroid cancer development in postmenopausal women who had undergone hormone therapy. Conclusions Our findings do not support the hypotheses that sudden or early gradual decline in estrogen levels is a protective factor in the development of thyroid cancer, or that exogenous estrogen is a risk factor for thyroid cancer.


2021 ◽  
Vol 9 ◽  
Author(s):  
Yen-Chu Huang ◽  
Meng-Che Wu ◽  
Yu-Hsun Wang ◽  
James Cheng-Chung Wei

Background: Asthma is one of the most burdensome childhood disorders. Growing evidence discloses that intestinal dysbiosis may contribute to asthma via the gut-lung axis. Constipation can lead to alteration of the gut microbiota. The clinical impact of constipation on asthma has not been researched. Therefore, we aimed to assess whether pediatric constipation influences the risk of developing asthma in a nationwide population-based cohort study. Methods: We analyzed 10,363 constipated patients and 10,363 individuals without constipation between 1999 and 2013 from Taiwan's National Health Insurance Research Database. Propensity score analysis was used to match age, sex, comorbidities, and medications at a ratio of 1:1. In addition, multiple Cox regression analysis was performed to evaluate the adjusted hazard ratio of asthma. Furthermore, sensitivity tests and a stratified analysis were performed. Results: After adjustment for age, sex, comorbidities, and medications, constipated patients had a 2.36-fold greater risk of asthma compared to those without constipation [adjusted hazard ratio (aHR): 2.36, 95% C.I. 2.04–2.73, p < 0.001]. Furthermore, the severity of constipation was associated with an increased risk of asthma; the adjusted hazard ratios were 2.25, 2.85, and 3.44 for <3, 3–12, and ≥12 laxative prescriptions within 1 year, respectively (p < 0.001). Conclusion: Constipation was correlated with a significantly increased risk of asthma. Pediatricians should be aware of the possibility of asthma in constipated patients. Further research is warranted to investigate the possible pathological mechanisms of this association.
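The 1:1 propensity score matching described in the Methods can be sketched as follows: estimate each child's probability of being in the constipation group from covariates, then pair every constipated patient with the control whose score is closest. The sketch below uses synthetic data, hypothetical column names, and greedy nearest-neighbour matching with replacement and no caliper, so it illustrates the general technique rather than the study's actual procedure.

```python
# Minimal sketch of 1:1 propensity score matching (greedy nearest-neighbour,
# with replacement, no caliper); hypothetical columns, not the study's method.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(1)
n = 5000
df = pd.DataFrame({
    "constipation": rng.integers(0, 2, n),
    "age": rng.integers(1, 18, n),
    "sex": rng.integers(0, 2, n),
    "comorbidity_score": rng.poisson(1.0, n),
})

covariates = ["age", "sex", "comorbidity_score"]
ps_model = LogisticRegression(max_iter=1000).fit(df[covariates], df["constipation"])
df["ps"] = ps_model.predict_proba(df[covariates])[:, 1]

treated = df[df["constipation"] == 1]
controls = df[df["constipation"] == 0]

# For each constipated patient, find the control with the closest propensity score.
nn = NearestNeighbors(n_neighbors=1).fit(controls[["ps"]])
_, idx = nn.kneighbors(treated[["ps"]])
matched_controls = controls.iloc[idx.ravel()]

matched = pd.concat([treated, matched_controls])
print(matched.groupby("constipation")[covariates].mean())   # crude covariate balance check
```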


2019 ◽  
Author(s):  
Nicolai A Lund-Blix ◽  
German Tapia ◽  
Karl Mårild ◽  
Anne Lise Brantsaeter ◽  
Pål R Njølstad ◽  
...  

Abstract Objective To examine the association between maternal and child gluten intake and risk of type 1 diabetes in children. Design Pregnancy cohort. Setting Population-based, nation-wide study in Norway. Participants 86,306 children in The Norwegian Mother and Child Cohort Study born from 1999 through 2009, followed to April 15, 2018. Main outcome measures Clinical type 1 diabetes, ascertained in a nation-wide childhood diabetes registry. Hazard ratios were estimated using Cox regression for the exposures maternal gluten intake up to week 22 of pregnancy and child’s gluten intake when the child was 18 months old. Results During a mean follow-up of 12.3 years (range 0.7-16.0), 346 children (0.4%) developed type 1 diabetes (incidence rate 32.6 per 100,000 person-years). The average gluten intake was 13.6 grams/day for mothers during pregnancy, and 8.8 grams/day for the child at 18 months of age. Maternal gluten intake in mid-pregnancy was not associated with the development of type 1 diabetes in the child (adjusted hazard ratio 1.02 (95% confidence interval 0.73 to 1.43) per 10 grams/day increase in gluten intake). However, the child’s gluten intake at 18 months of age was associated with an increased risk of later developing type 1 diabetes (adjusted hazard ratio 1.46 (95% confidence interval 1.06 to 2.01) per 10 grams/day increase in gluten intake). Conclusions This study suggests that the child’s gluten intake at 18 months of age, and not the maternal intake during pregnancy, could increase the risk of type 1 diabetes in the child. What is already known on this topic A national prospective cohort study from Denmark found that a high maternal gluten intake during pregnancy could increase the risk of type 1 diabetes in the offspring (adjusted hazard ratio 1.31 (95% confidence interval 1.001 to 1.72) per 10 grams/day increase in gluten intake). No studies have investigated the relation between the amount of gluten intake by both the mother during pregnancy and the child in early life and the risk of developing type 1 diabetes in childhood. What this study adds In this prospective population-based pregnancy cohort with 86,306 children, of whom 346 developed type 1 diabetes, we found that the child’s gluten intake at 18 months of age was associated with the risk of type 1 diabetes (adjusted hazard ratio 1.46 (95% confidence interval 1.06 to 2.01) per 10 grams/day increase in gluten intake). This study suggests that the child’s gluten intake at 18 months of age, and not the maternal intake during pregnancy, could increase the child’s risk of type 1 diabetes.
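Both hazard ratios above are reported per 10 grams/day increase in gluten intake. If the Cox model is fitted with intake in grams/day, the per-10-gram ratio follows from scaling the log hazard ratio before exponentiating; a minimal arithmetic sketch (with made-up coefficient and standard error, not values from this study) is:

```python
# Converting a Cox coefficient per 1 g/day into a hazard ratio per 10 g/day.
# beta and se are hypothetical values for illustration only.
import numpy as np

beta_per_gram = 0.038   # log hazard ratio per 1 g/day (made-up)
se_per_gram = 0.016     # standard error (made-up)

hr_per_10g = np.exp(10 * beta_per_gram)
ci_low, ci_high = np.exp(10 * (beta_per_gram + np.array([-1.96, 1.96]) * se_per_gram))
print(f"HR per 10 g/day: {hr_per_10g:.2f} (95% CI {ci_low:.2f} to {ci_high:.2f})")
```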


BMJ ◽  
2018 ◽  
pp. k3547 ◽  
Author(s):  
Julie C Antvorskov ◽  
Thorhallur I Halldorsson ◽  
Knud Josefsen ◽  
Jannet Svensson ◽  
Charlotta Granström ◽  
...  

Abstract Objective To examine the association between prenatal gluten exposure and offspring risk of type 1 diabetes in humans. Design National prospective cohort study. Setting National health information registries in Denmark. Participants Pregnant Danish women enrolled into the Danish National Birth Cohort between January 1996 and October 2002. Main outcome measures Maternal gluten intake, based on maternal consumption of gluten containing foods, was reported in a 360 item food frequency questionnaire at week 25 of pregnancy. Information on type 1 diabetes occurrence in the participants’ children, from 1 January 1996 to 31 May 2016, was obtained through registry linkage to the Danish Registry of Childhood and Adolescent Diabetes. Results The study comprised 101 042 pregnancies in 91 745 women, of whom 70 188 filled out the food frequency questionnaire. After correcting for multiple pregnancies, pregnancies ending in abortions, stillbirths, lack of information regarding the pregnancy, and pregnancies with implausibly high or low energy intake, 67 565 pregnancies (63 529 women) were included. The average gluten intake was 13.0 g/day, ranging from less than 7 g/day to more than 20 g/day. The incidence of type 1 diabetes among children in the cohort was 0.37% (n=247) with a mean follow-up period of 15.6 years (standard deviation 1.4). Risk of type 1 diabetes in offspring increased proportionally with maternal gluten intake during pregnancy (adjusted hazard ratio 1.31 (95% confidence interval 1.001 to 1.72) per 10 g/day increase of gluten). Women with the highest gluten intake versus those with the lowest gluten intake (≥20 v <7 g/day) had double the risk of type 1 diabetes development in their offspring (adjusted hazard ratio 2.00 (95% confidence interval 1.02 to 4.00)). Conclusions High gluten intake by mothers during pregnancy could increase the risk of their children developing type 1 diabetes. However, confirmation of these findings is warranted, preferably in an intervention setting.


Stroke ◽  
2021 ◽  
Author(s):  
Yi-Hsin Chan ◽  
Tze-Fan Chao ◽  
Hsin-Fu Lee ◽  
Shao-Wei Chen ◽  
Pei-Ru Li ◽  
...  

Background and Purpose: Data on clinical outcomes for non-vitamin K antagonist oral anticoagulants (NOACs) and warfarin in patients with atrial fibrillation and cancer are limited, and patients with active cancer were excluded from randomized trials. We investigated the effectiveness and safety of NOACs versus warfarin among patients with atrial fibrillation and cancer. Methods: In this nationwide retrospective cohort study from the Taiwan National Health Insurance Research Database, we identified a total of 6274 and 1681 consecutive patients with atrial fibrillation and cancer taking NOACs and warfarin, respectively, from June 1, 2012, to December 31, 2017. Propensity score stabilized weighting was used to balance covariates across study groups. Results: There were 1031, 1758, 411, and 3074 patients treated with apixaban, dabigatran, edoxaban, and rivaroxaban, respectively. After propensity score stabilized weighting, NOAC use was associated with a lower risk of major adverse cardiovascular events (hazard ratio, 0.63 [95% CI, 0.50–0.80]; P=0.0001), major adverse limb events (hazard ratio, 0.41 [95% CI, 0.24–0.70]; P=0.0010), venous thrombosis (hazard ratio, 0.37 [95% CI, 0.23–0.61]; P<0.0001), and major bleeding (hazard ratio, 0.73 [95% CI, 0.56–0.94]; P=0.0171) compared with warfarin. The outcomes were consistent with either direct thrombin inhibitor (dabigatran) or factor Xa inhibitor (apixaban, edoxaban, and rivaroxaban) use, among patients with stroke history, and among patients with different types of cancer and local, regional, or metastatic stage of cancer (P interaction >0.05). When compared with warfarin, NOAC use was associated with a lower risk of major adverse cardiovascular events and venous thrombosis in patients aged <75 but not in those aged ≥75 years (P interaction <0.05). Conclusions: Thromboprophylaxis with NOACs rather than warfarin should be considered for the majority of the atrial fibrillation population with cancer.
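Propensity score stabilized weighting, used here to balance covariates between NOAC and warfarin users, assigns each treated patient the weight P(treated)/PS and each comparator the weight P(untreated)/(1 - PS), where PS is the estimated propensity score. A minimal Python sketch on synthetic data follows, with hypothetical column names and an illustrative weighted Cox model; it is not the study's code.

```python
# Minimal sketch of stabilized inverse probability of treatment weights (sIPTW),
# followed by a weighted Cox model. Column names are hypothetical.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 4000
df = pd.DataFrame({
    "noac": rng.integers(0, 2, n),                 # 1 = NOAC, 0 = warfarin
    "age": rng.normal(72, 9, n),
    "chads_vasc": rng.integers(0, 9, n),
    "time": rng.exponential(3, n).clip(0.05, 6),   # years of follow-up (synthetic)
    "mace": rng.integers(0, 2, n),
})

covs = ["age", "chads_vasc"]
ps = LogisticRegression(max_iter=1000).fit(df[covs], df["noac"]).predict_proba(df[covs])[:, 1]
p_treat = df["noac"].mean()

# Stabilized weight: marginal probability of the observed treatment over its conditional probability
df["sw"] = np.where(df["noac"] == 1, p_treat / ps, (1 - p_treat) / (1 - ps))

weighted_cox = CoxPHFitter().fit(
    df[["time", "mace", "noac", "sw"]],
    duration_col="time", event_col="mace", weights_col="sw", robust=True,
)
print(weighted_cox.hazard_ratios_["noac"])
```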


2021 ◽  
Vol 25 (71) ◽  
pp. 1-174
Author(s):  
Jonathan Bedford ◽  
Laura Drikite ◽  
Mark Corbett ◽  
James Doidge ◽  
Paloma Ferrando-Vivas ◽  
...  

Background New-onset atrial fibrillation occurs in around 10% of adults treated in an intensive care unit. New-onset atrial fibrillation may lead to cardiovascular instability and thromboembolism, and has been independently associated with increased length of hospital stay and mortality. The long-term consequences are unclear. Current practice guidance is based on patients outside the intensive care unit; however, new-onset atrial fibrillation that develops while in an intensive care unit differs in its causes and the risks and clinical effectiveness of treatments. The lack of evidence on new-onset atrial fibrillation treatment or long-term outcomes in intensive care units means that practice varies. Identifying optimal treatment strategies and defining long-term outcomes are critical to improving care. Objectives In patients treated in an intensive care unit, the objectives were to (1) evaluate existing evidence for the clinical effectiveness and safety of pharmacological and non-pharmacological new-onset atrial fibrillation treatments, (2) compare the use and clinical effectiveness of pharmacological and non-pharmacological new-onset atrial fibrillation treatments, and (3) determine outcomes associated with new-onset atrial fibrillation. Methods We undertook a scoping review that included studies of interventions for treatment or prevention of new-onset atrial fibrillation involving adults in general intensive care units. To investigate the long-term outcomes associated with new-onset atrial fibrillation, we carried out a retrospective cohort study using English national intensive care audit data linked to national hospital episode and outcome data. To analyse the clinical effectiveness of different new-onset atrial fibrillation treatments, we undertook a retrospective cohort study of two large intensive care unit databases in the USA and the UK. Results Existing evidence was generally of low quality, with limited data suggesting that beta-blockers might be more effective than amiodarone for converting new-onset atrial fibrillation to sinus rhythm and for reducing mortality. Using linked audit data, we showed that patients developing new-onset atrial fibrillation have more comorbidities than those who do not. After controlling for these differences, patients with new-onset atrial fibrillation had substantially higher mortality in hospital and during the first 90 days after discharge (adjusted odds ratio 2.32, 95% confidence interval 2.16 to 2.48; adjusted hazard ratio 1.46, 95% confidence interval 1.26 to 1.70, respectively), and higher rates of subsequent hospitalisation with atrial fibrillation, stroke and heart failure (adjusted cause-specific hazard ratio 5.86, 95% confidence interval 5.33 to 6.44; adjusted cause-specific hazard ratio 1.47, 95% confidence interval 1.12 to 1.93; and adjusted cause-specific hazard ratio 1.28, 95% confidence interval 1.14 to 1.44, respectively), than patients who did not have new-onset atrial fibrillation. From intensive care unit data, we found that new-onset atrial fibrillation occurred in 952 out of 8367 (11.4%) UK and 1065 out of 18,559 (5.7%) US intensive care unit patients in our study. The median time to onset of new-onset atrial fibrillation in patients who received treatment was 40 hours, with a median duration of 14.4 hours. The clinical characteristics of patients developing new-onset atrial fibrillation were similar in both databases. 
New-onset atrial fibrillation was associated with significant average reductions in systolic blood pressure of 5 mmHg, despite significant increases in vasoactive medication (vasoactive-inotropic score increase of 2.3; p < 0.001). After adjustment, intravenous beta-blockers were not more effective than amiodarone in achieving rate control (adjusted hazard ratio 1.14, 95% confidence interval 0.91 to 1.44) or rhythm control (adjusted hazard ratio 0.86, 95% confidence interval 0.67 to 1.11). Digoxin therapy was associated with a lower probability of achieving rate control (adjusted hazard ratio 0.52, 95% confidence interval 0.32 to 0.86) and calcium channel blocker therapy was associated with a lower probability of achieving rhythm control (adjusted hazard ratio 0.56, 95% confidence interval 0.39 to 0.79) than amiodarone. Findings were consistent across both the combined and the individual database analyses. Conclusions Existing evidence for new-onset atrial fibrillation management in intensive care unit patients is limited. New-onset atrial fibrillation in these patients is common and is associated with significant short- and long-term complications. Beta-blockers and amiodarone appear to be similarly effective in achieving cardiovascular control, but digoxin and calcium channel blockers appear to be inferior. Future work Our findings suggest that a randomised controlled trial of amiodarone and beta-blockers for management of new-onset atrial fibrillation in critically ill patients should be undertaken. Studies should also be undertaken to provide evidence for or against anticoagulation for patients who develop new-onset atrial fibrillation in intensive care units. Finally, given that readmission with heart failure and thromboembolism increases following an episode of new-onset atrial fibrillation while in an intensive care unit, a prospective cohort study to demonstrate the incidence of atrial fibrillation and/or left ventricular dysfunction at hospital discharge and at 3 months following the development of new-onset atrial fibrillation should be undertaken. Trial registration Current Controlled Trials ISRCTN13252515. Funding This project was funded by the National Institute for Health Research (NIHR) Health Technology Assessment programme and will be published in full in Health Technology Assessment; Vol. 25, No. 71. See the NIHR Journals Library website for further project information.
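The cause-specific hazard ratios reported above (for example, for rehospitalisation with atrial fibrillation) treat competing events such as death as censoring at the time they occur. A minimal sketch of that coding step on synthetic data, with hypothetical column names, is shown below; it is not the study's analysis code.

```python
# Sketch: cause-specific hazard for readmission with AF, censoring at death or
# administrative end of follow-up. Synthetic data, hypothetical column names.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)
n = 3000
df = pd.DataFrame({
    "noaf": rng.integers(0, 2, n),                 # new-onset AF in ICU (exposure)
    "t_readmit_af": rng.exponential(400, n),       # days to AF readmission (synthetic)
    "t_death": rng.exponential(600, n),            # days to death (synthetic)
    "t_admin_censor": np.full(n, 365.0),           # administrative censoring at 1 year
})

# Observed time is the first of: AF readmission, death, administrative censoring.
df["time"] = df[["t_readmit_af", "t_death", "t_admin_censor"]].min(axis=1)
# The event of interest counts only when AF readmission happens first.
df["af_readmit"] = (df["t_readmit_af"] == df["time"]).astype(int)

cs_model = CoxPHFitter().fit(df[["time", "af_readmit", "noaf"]],
                             duration_col="time", event_col="af_readmit")
print(cs_model.hazard_ratios_["noaf"])   # cause-specific hazard ratio for NOAF
```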


Stroke ◽  
2020 ◽  
Vol 51 (4) ◽  
pp. 1085-1093 ◽  
Author(s):  
Leonie H.A. Broersen ◽  
Helena Stengl ◽  
Christian H. Nolte ◽  
Dirk Westermann ◽  
Matthias Endres ◽  
...  

Background and Purpose— Our study aim was to estimate the risk of incident stroke based on levels of hs-cTn (high-sensitivity cardiac troponin), a specific biomarker indicating myocardial injury, in the general population, patients with atrial fibrillation, and patients with previous stroke. Methods— Embase, PubMed, and Web of Science were searched until March 14, 2019, to identify relevant articles. Randomized controlled trials and cohort studies assessing the risk of incident stroke based on hs-cTn were eligible. Pooled adjusted hazard ratios including 95% CI were calculated using a random-effects model due to study heterogeneity, per population, per coding of hs-cTn (categorical/continuous data), per hs-cTn subunit (T or I), for low risk of bias, and for all-cause and ischemic stroke separately. Results— We included 17 articles with 96 702 participants. In studies conducted in the general population (n=12; 77 780 participants), the pooled adjusted hazard ratio for incident stroke was 1.25 (CI, 1.10–1.40) for high versus low hs-cTn (as defined by included studies) during an average follow-up of 1 to 20 years (median 10). When categorical data were used, this increased to 1.58 (CI, 1.26–1.90). The results were robust when accounting for stroke classification (all-cause stroke/ischemic stroke), hs-cTn subunit, risk of bias, and coding of hs-cTn. In patients with atrial fibrillation (4 studies; 18 725 participants), the pooled adjusted hazard ratio for incident stroke was 1.95 (CI, 1.29–2.62) for high versus low hs-cTn. Due to a lack of data (one study, 197 participants), no meta-analysis could be performed in patients with previous stroke. Conclusions— This meta-analysis suggests that hs-cTn can be regarded as a risk marker for incident stroke, with different effect sizes in different subgroups. More research on the association between hs-cTn and incident stroke in high-risk populations is needed, especially in patients with a history of ischemic stroke.
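The random-effects pooling described in the Methods can be illustrated with a DerSimonian-Laird estimator applied to log hazard ratios. The study-level hazard ratios in the sketch below are invented placeholders, not values from the 17 included articles.

```python
# Minimal DerSimonian-Laird random-effects pooling of log hazard ratios.
# The per-study HRs and CIs below are made-up placeholders, not the review's data.
import numpy as np

hr = np.array([1.20, 1.35, 1.10, 1.55, 1.28])
ci_low = np.array([1.00, 1.05, 0.90, 1.15, 1.02])
ci_high = np.array([1.44, 1.74, 1.34, 2.09, 1.61])

y = np.log(hr)                                    # effect sizes on the log scale
se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)
v = se ** 2

w = 1 / v                                         # fixed-effect (inverse-variance) weights
q = np.sum(w * (y - np.sum(w * y) / np.sum(w)) ** 2)
c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (q - (len(y) - 1)) / c)           # between-study variance

w_star = 1 / (v + tau2)                           # random-effects weights
pooled = np.sum(w_star * y) / np.sum(w_star)
pooled_se = np.sqrt(1 / np.sum(w_star))

print(f"Pooled HR: {np.exp(pooled):.2f} "
      f"(95% CI {np.exp(pooled - 1.96 * pooled_se):.2f}"
      f" to {np.exp(pooled + 1.96 * pooled_se):.2f})")
```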


2019 ◽  
Vol 6 ◽  
pp. 2333794X1984592
Author(s):  
Sharon Shem-Tov ◽  
Gabriel Chodick ◽  
Dalia Weitzman ◽  
Gideon Koren

Objective. To evaluate the relationship between attention-deficit hyperactivity disorder (ADHD) and injuries and to verify whether methylphenidate (MPH) is associated with a decreased risk of injuries. Methods. A retrospective cohort study using the computerized database of Maccabi Healthcare Services. The ADHD cohort included all children between 12 and 20 years of age newly diagnosed with ADHD between 2003 and 2013. The comparison cohort was composed of children who were not diagnosed with ADHD. The primary outcome was traumatic injuries. A Cox proportional hazard regression analysis was conducted to estimate ADHD effects on the risk of injuries. We also conducted a nested case-control study to examine how MPH influences this relationship. Results. A total of 59 798 children were included in the cohort study; 28 921 were classified as exposed (ADHD cohort) and 30 877 were unexposed. The incidence of traumatic injuries in the exposed group was significantly higher (adjusted hazard ratio = 1.63 [95% confidence interval = 1.60-1.66]). A similar increased risk was documented for severe injuries (adjusted hazard ratio = 1.72 [1.59-1.86]). MPH use was significantly associated with a 28% lower rate of injury events. Therapy groups were significantly associated with 29% to 40% lower injury rates for medium- or long-acting MPH. The intensity of therapy was significantly associated with a 29% to 33% lower injury rate when the intensity was lower than 0.69 mg/kg/day. Conclusion. Children with ADHD have 60% higher odds of experiencing an injury. Treatment with MPH reduced the risk by up to 28%. The individual and financial costs secondary to injuries underscore the public health significance of this problem. Injury prevention should be considered in the clinical evaluation of MPH risks and benefits, beyond the conventional consideration of enhancing academic achievements.
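The nested case-control analysis mentioned in the Methods relies on incidence-density (risk-set) sampling: for each injury case, controls are drawn from cohort members still under follow-up at the case's event time. A rough sketch of that sampling step on synthetic data, with hypothetical column names and four controls per case, is given below; it is not the study's actual procedure.

```python
# Sketch of incidence-density (risk-set) sampling for a nested case-control design.
# Synthetic data and hypothetical column names; not the study's actual procedure.
import numpy as np
import pandas as pd

rng = np.random.default_rng(4)
n = 5000
cohort = pd.DataFrame({
    "id": np.arange(n),
    "followup": rng.exponential(5, n).clip(0.1, 11),  # years of follow-up (synthetic)
    "injury": rng.binomial(1, 0.05, n),               # 1 = injury at end of follow-up
})

controls_per_case = 4
sampled_sets = []
for _, case in cohort[cohort["injury"] == 1].iterrows():
    # Risk set: subjects still under follow-up at the case's event time, excluding the case.
    risk_set = cohort[(cohort["followup"] >= case["followup"]) & (cohort["id"] != case["id"])]
    controls = risk_set.sample(min(controls_per_case, len(risk_set)), random_state=0)
    sampled_sets.append(pd.DataFrame({
        "set_id": int(case["id"]),
        "id": [int(case["id"]), *controls["id"].astype(int)],
        "is_case": [1] + [0] * len(controls),
    }))

nested_cc = pd.concat(sampled_sets, ignore_index=True)
print(nested_cc.head(10))   # matched sets ready for a conditional (matched-set) analysis
```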


2019 ◽  
Vol 36 (6) ◽  
pp. 685-692 ◽  
Author(s):  
Ayako Ohshima ◽  
Toshihiro Koyama ◽  
Aiko Ogawa ◽  
Yoshito Zamami ◽  
Hiroyoshi Y Tanaka ◽  
...  

Abstract Background Oral anticoagulant use has increased rapidly internationally. Here we look at the risks and benefits, based on Japanese data, of therapy in low-risk non-valvular atrial fibrillation patients. Objectives Using a health insurance claims data set, we assessed: (i) oral anticoagulant usage in Japan, and (ii) the efficacy and safety of dabigatran compared with warfarin in Japanese patients with non-valvular atrial fibrillation aged 18–74 years. Methods We identified 4380 non-valvular atrial fibrillation patients treated with anticoagulants between 1 January 2005 and 28 February 2014, and estimated the adjusted hazard ratio for stroke or systemic embolism, and any hemorrhagic event (Cox proportional hazards regression model with stabilized inverse probability treatment weighting). Results The data included 101 989 anticoagulant prescriptions for 4380 patients; direct oral anticoagulants increased to 40.0% of the total by the end of the study. After applying exclusion criteria, 1536 new non-valvular atrial fibrillation patients were identified, including 1071 treated with warfarin and 465 with dabigatran. Mean ages were 56.11 ± 9.70 years for warfarin and 55.80 ± 9.65 years for dabigatran. The adjusted hazard ratio (95% confidence interval), comparing dabigatran with warfarin, was 0.48 (0.25–0.91) for stroke or systemic embolism and 0.91 (0.60–1.39) for any hemorrhage, including intracranial and gastrointestinal. Conclusions The number of patients prescribed direct oral anticoagulants increased steadily, and the incidence of all-cause bleeding related to dabigatran was similar to that of warfarin in our study population of younger non-valvular atrial fibrillation patients. Dabigatran, compared with warfarin, generally reduced the risk of all-cause stroke and systemic embolism.


2021 ◽  
pp. 002203452110372
Author(s):  
K.S. Ma ◽  
H. Hasturk ◽  
I. Carreras ◽  
A. Dedeoglu ◽  
J.J. Veeravalli ◽  
...  

Dementia and Alzheimer’s disease (AD) are proposed to be comorbid with periodontitis (PD). It is unclear whether PD is associated with dementia and AD independent of confounding factors. We aimed to identify the longitudinal risk of developing PD in a cohort of patients with dementia and AD who did not show any signs of PD at baseline. In this retrospective cohort study, 8,640 patients with dementia without prior PD were recruited, and 8,640 individuals without dementia history were selected as propensity score–matched controls. A Cox proportional hazard model was developed to estimate the risk of developing PD over 10 y. Cumulative probability was derived to assess the time-dependent effect of dementia on PD. Of the 8,640 patients, a sensitivity test was conducted on 606 patients with AD-associated dementia and 606 non-AD propensity score–matched controls to identify the impact of AD-associated dementia on the risk for PD. Subgroup analyses with age stratification were included. Overall, 2,670 patients with dementia developed PD. The relative risk of PD in these patients was significantly higher than in the nondementia group (1.825, 95% CI = 1.715 to 1.942). Cox proportional hazard models showed that patients with dementia were more likely to have PD than individuals without dementia (adjusted hazard ratio = 1.915, 95% CI = 1.766 to 2.077, P < 0.0001, log-rank test P < 0.0001). The risk of PD in patients with dementia was age dependent (P values for all ages <0.0001); younger patients with dementia were more likely to develop PD. The findings persisted for patients with AD: the relative risk (1.531, 95% CI = 1.209 to 1.939) and adjusted hazard ratio (1.667, 95% CI = 1.244 to 2.232; log-rank test P = 0.0004) of PD in patients with AD were significantly higher than in the non-AD cohort. Our findings demonstrated that dementia and AD were associated with a higher risk of PD in an age-dependent manner and independent of systemic confounding factors.

