Which antidepressants are associated with increased risk of developing mania? A retrospective electronic case register cohort study

2016 ◽  
Vol 33 (S1) ◽  
pp. s228-s229
Author(s):  
R. Patel ◽  
P. Reiss ◽  
H. Shetty ◽  
M. Broadbent ◽  
R. Stewart ◽  
...  

Introduction The symptoms of bipolar disorder are sometimes mistaken for unipolar depression and inappropriately treated with antidepressants. This may be associated with an increased risk of developing mania. However, the extent to which this depends on the type of antidepressant prescribed remains unclear. Aims To investigate the association between different classes of antidepressants and subsequent onset of mania/bipolar disorder in a real-world clinical setting. Methods Data on prior antidepressant therapy were extracted from 21,012 adults with unipolar depression receiving care from the South London and Maudsley NHS Foundation Trust (SLaM). Multivariable Cox regression analysis (with age and gender as covariates) was used to investigate the association of antidepressant therapy with risk of developing mania/bipolar disorder. Results In total, 91,110 person-years of follow-up data were analysed (mean follow-up: 4.3 years). The overall incidence rate of mania/bipolar disorder was 10.9 per 1000 person-years. The peak incidence of mania/bipolar disorder was seen in patients aged between 26 and 35 years (12.3 per 1000 person-years). The most frequently prescribed antidepressants were SSRIs (35.5%), mirtazapine (9.4%), venlafaxine (5.6%) and TCAs (4.7%). Prior antidepressant treatment was associated with an increased incidence of mania/bipolar disorder ranging from 13.1 to 19.1 per 1000 person-years. Multivariable analysis indicated a significant association with SSRIs (hazard ratio 1.34, 95% CI 1.18–1.52) and venlafaxine (1.35, 1.07–1.70). Conclusions In people with unipolar depression, antidepressant treatment is associated with an increased risk of subsequent mania/bipolar disorder. These findings highlight the importance of considering risk factors for mania when treating people with depression. Disclosure of interest The authors have not supplied their declaration of competing interest.
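As a rough, hedged illustration of the kind of analysis described in the Methods (not the authors' code), the Python sketch below computes a crude incidence rate per 1,000 person-years and fits a multivariable Cox model with age and gender as covariates using the lifelines library; the file name and all column names (person_years, event, age, gender, ssri, venlafaxine) are assumptions.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical one-row-per-patient extract; every column name here is assumed.
df = pd.read_csv("depression_cohort.csv")

# Crude incidence rate of mania/bipolar disorder per 1,000 person-years.
rate = 1000 * df["event"].sum() / df["person_years"].sum()
print(f"Incidence rate: {rate:.1f} per 1,000 person-years")

# Multivariable Cox regression: antidepressant exposures adjusted for age and
# gender. Binary columns (gender, ssri, venlafaxine) are assumed to be 0/1 coded.
cph = CoxPHFitter()
cph.fit(
    df[["person_years", "event", "age", "gender", "ssri", "venlafaxine"]],
    duration_col="person_years",
    event_col="event",
)
cph.print_summary()  # hazard ratios are reported as exp(coef)
```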

Neurology ◽  
2019 ◽  
Vol 92 (24) ◽  
pp. e2735-e2742 ◽  
Author(s):  
Mao-Hsuan Huang ◽  
Chih-Ming Cheng ◽  
Kai-Lin Huang ◽  
Ju-Wei Hsu ◽  
Ya-Mei Bai ◽  
...  

Objective To evaluate the risk of Parkinson disease (PD) among patients with bipolar disorder (BD). Methods Using the Taiwan National Health Insurance Research Database, we examined 56,340 patients with BD and 225,360 age- and sex-matched controls between 2001 and 2009 and followed them to the end of 2011. Individuals who developed PD during the follow-up period were identified. Results Patients with BD had a higher incidence of PD (0.7% vs 0.1%, p < 0.001) during the follow-up period than the controls. A Cox regression analysis with adjustments for demographic data and medical comorbid conditions revealed that patients with BD were more likely to develop PD (hazard ratio [HR] 6.78, 95% confidence interval [CI] 5.74–8.02) than the control group. Sensitivity analyses after exclusion of the first year (HR 5.82, 95% CI 4.89–6.93) or first 3 years (HR 4.42; 95% CI 3.63–5.37) of observation showed consistent findings. Moreover, a high frequency of psychiatric admission for manic/mixed and depressive episodes was associated with an increased risk of developing PD. Conclusion Patients with BD had a higher incidence of PD during the follow-up period than the control group. Manic/mixed and depressive episodes were associated with an elevated likelihood of developing PD. Further studies are necessary to investigate the underlying pathophysiology between BD and PD.
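A minimal sketch of the sensitivity analysis mentioned above, assuming a one-row-per-subject table with hypothetical column names (followup_years, pd_event, bipolar, age, sex): subjects whose follow-up ends within the exclusion window are dropped, the remaining time at risk is shifted, and the Cox model is refitted.

```python
import pandas as pd
from lifelines import CoxPHFitter

cohort = pd.read_csv("bd_pd_cohort.csv")  # hypothetical extract

def refit_excluding_first_years(df: pd.DataFrame, years: float) -> CoxPHFitter:
    """Refit the Cox model after discarding the first `years` of observation,
    so PD diagnoses made shortly after cohort entry cannot drive the estimate."""
    kept = df[df["followup_years"] > years].copy()
    kept["followup_years"] -= years  # time at risk beyond the exclusion window
    cph = CoxPHFitter()
    cph.fit(
        kept[["followup_years", "pd_event", "bipolar", "age", "sex"]],
        duration_col="followup_years",
        event_col="pd_event",
    )
    return cph

for horizon in (1, 3):
    model = refit_excluding_first_years(cohort, horizon)
    print(f"{horizon}-year exclusion, HR for BD:", model.hazard_ratios_["bipolar"])
```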


2020 ◽  
Vol 8 (1) ◽  
pp. e001325 ◽  
Author(s):  
Ramachandran Rajalakshmi ◽  
Coimbatore Subramanian Shanthi Rani ◽  
Ulagamathesan Venkatesan ◽  
Ranjit Unnikrishnan ◽  
Ranjit Mohan Anjana ◽  
...  

Introduction Previous epidemiological studies have reported on the prevalence of diabetic kidney disease (DKD) and diabetic retinopathy (DR) from India. The aim of this study is to evaluate the effect of DKD on the development of new-onset DR and sight-threatening diabetic retinopathy (STDR) in Asian Indians with type 2 diabetes (T2D). Research design and methods The study was done on anonymized electronic medical record data of people with T2D who had undergone screening for DR and renal work-up as part of routine follow-up at a tertiary care diabetes center in Chennai, South India. The baseline data retrieved included clinical and biochemical parameters including renal profiles (serum creatinine, estimated glomerular filtration rate (eGFR) and albuminuria). Grading of DR was performed using the modified Early Treatment Diabetic Retinopathy Study grading system. STDR was defined as the presence of proliferative diabetic retinopathy (PDR) and/or diabetic macular edema. DKD was defined by the presence of albuminuria (≥30 µg/mg) and/or reduction in eGFR (<60 mL/min/1.73 m2). Cox regression analysis was used to evaluate the hazard ratio (HR) for DR and STDR. Results Data of 19,909 individuals with T2D (mean age 59.6±10.2 years, mean duration of diabetes 11.1±12.1 years, 66.1% male) were analyzed. At baseline, DR was present in 7818 individuals (39.3%), of whom 2249 (11.3%) had STDR. During the mean follow-up period of 3.9±1.9 years, 2140 (17.7%) developed new-onset DR and 980 individuals with non-proliferative DR (NPDR) at baseline progressed to STDR. Higher serum creatinine (HR 1.5, 95% CI 1.3 to 1.7; p<0.0001), eGFR <30 mL/min/1.73 m2 (HR 4.9, 95% CI 2.9 to 8.2; p<0.0001) and presence of macroalbuminuria >300 µg/mg (HR 3.0, 95% CI 2.4 to 3.8; p<0.0001) at baseline were associated with increased risk of progression to STDR. Conclusions DKD at baseline is a risk factor for progression to STDR. Physicians should promptly refer their patients with DKD to ophthalmologists for timely detection and management of STDR.
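The case definitions stated above reduce to simple threshold checks; the sketch below (field names and example values are hypothetical) flags DKD from eGFR and albuminuria and STDR from the retinopathy grade, mirroring the cut-offs in the abstract.

```python
from dataclasses import dataclass

@dataclass
class Visit:
    egfr: float          # mL/min/1.73 m^2
    albuminuria: float   # µg/mg (urinary albumin-to-creatinine ratio)
    dr_grade: str        # e.g. "none", "NPDR", "PDR"
    macular_edema: bool

def has_dkd(v: Visit) -> bool:
    # DKD: albuminuria >= 30 µg/mg and/or eGFR < 60 mL/min/1.73 m^2
    return v.albuminuria >= 30 or v.egfr < 60

def has_stdr(v: Visit) -> bool:
    # STDR: proliferative DR and/or diabetic macular edema
    return v.dr_grade == "PDR" or v.macular_edema

example = Visit(egfr=52.0, albuminuria=340.0, dr_grade="NPDR", macular_edema=False)
print(has_dkd(example), has_stdr(example))  # True False
```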


Author(s):  
Shih-Hsiang Ou ◽  
Chu-Lin Chou ◽  
Chia-Wei Lin ◽  
Wu-Chien Chien ◽  
Te-Chao Fang ◽  
...  

The association between gout and injury remains unclear. This study aimed to investigate the injury risk in patients with gout. Using the Longitudinal Health Insurance Database (LHID) from 2000 to 2010 in Taiwan, patients with gout (group CFG) and those without gout (group C) were enrolled for further analysis. The CFG group was separated into two subgroups (with and without medication) to determine whether the risk of injury was reduced with drug intervention. The follow-up period was defined as the time from the initial diagnosis of gout to the date of injury. A total of 257,442 individuals were enrolled in this study, with 85,814 people in group CFG and 171,628 people in group C. Using Cox regression analysis, group CFG showed a significant increase in the risk of injury. Traffic injuries, poisoning, falls, crushing/cutting/piercing injuries, and suicides were prominent among these injuries. Furthermore, when urate-lowering drugs were used to treat the CFG group, there were no significant differences in the occurrence of injury. Patients with gout had an increased risk of injury overall, and drug intervention did not lower the risk of injury in these patients.


2020 ◽  
Vol 41 (Supplement_2) ◽  
Author(s):  
S Dedic ◽  
N Boskovic ◽  
V Giga ◽  
M Tesic ◽  
S Aleksandric ◽  
...  

Abstract Background Previous studies have shown that left bundle branch block (LBBB), a relatively common electrocardiographic (ECG) abnormality, represents a condition with an often non-benign and sometimes adverse outcome. Purpose The aim of our study was to determine the predictive value of a stress echocardiography test in patients with LBBB. Methods Our study population included 189 patients (88 male, 46.6%, mean age 63.08±9.65) with diagnosed left bundle branch block who underwent stress echocardiography (SECHO) according to the Bruce protocol. Median follow-up of the patients was 56 months (IQR 48–71 months) for the occurrence of cardiovascular death, non-fatal myocardial infarction, and repeat revascularization (coronary artery bypass grafting-CABG or percutaneous coronary intervention-PCI). Results Out of 189 patients, 32 (16.9%) had a positive SECHO test, while 157 (83.1%) had a negative SECHO test. During the follow-up period, 28 patients had a major adverse cardiac event: 1 non-fatal myocardial infarction, 6 heart failure hospitalizations, 5 CABGs, 8 PCIs, and 8 cardiac deaths. Using Cox regression analysis, univariate predictors of adverse cardiac events were diabetes mellitus (HR 4.530 [95% CI 1.355–15.141], p=0.014), PCI (HR 4.288 [95% CI 2.010–9.144], p<0.001) and a positive SECHO test (HR 2.289 [95% CI 1.006–5.207], p=0.048). In the multivariate analysis, only previous PCI remained an independent predictor of adverse events (HR 3.650 [95% CI 1.665–8.003], p=0.001). On the Kaplan-Meier survival curve, patients with negative SECHO had a better outcome than patients with positive SECHO (140/160; 87.5% vs 21/29; 72.4%, p=0.035) and a much longer event-free time (77.4±1.6 months vs 67.1±5.4 months, Log Rank 4.136, p=0.042). Conclusion Patients with LBBB and a negative SECHO test have a good prognosis. Patients with a history of CAD and diabetes mellitus and LBBB are at increased risk for future events and need periodical reassessment. Funding Acknowledgement Type of funding source: None
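A hedged sketch of the Kaplan-Meier and log-rank comparison reported above, written with lifelines and assumed column names (months of event-free follow-up, a MACE flag, and a positive/negative SECHO indicator); it is not the authors' analysis code.

```python
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

df = pd.read_csv("secho_cohort.csv")  # hypothetical extract
pos = df["secho_positive"] == 1

# Event-free survival curves for negative vs positive SECHO.
kmf = KaplanMeierFitter()
ax = None
for label, mask in [("SECHO negative", ~pos), ("SECHO positive", pos)]:
    kmf.fit(df.loc[mask, "months"], df.loc[mask, "mace"], label=label)
    ax = kmf.plot_survival_function(ax=ax)

# Log-rank test for a difference between the two curves.
result = logrank_test(
    df.loc[~pos, "months"], df.loc[pos, "months"],
    event_observed_A=df.loc[~pos, "mace"], event_observed_B=df.loc[pos, "mace"],
)
print(result.p_value)
```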


Cephalalgia ◽  
2013 ◽  
Vol 34 (5) ◽  
pp. 327-335 ◽  
Author(s):  
Knut Hagen ◽  
Eystein Stordal ◽  
Mattias Linde ◽  
Timothy J Steiner ◽  
John-Anker Zwart ◽  
...  

Background Headache has not been established as a risk factor for dementia. The aim of this study was to determine whether any headache was associated with subsequent development of vascular dementia (VaD), Alzheimer’s disease (AD) or other types of dementia. Methods This prospective population-based cohort study used baseline data from the Nord-Trøndelag Health Study (HUNT 2) performed during 1995–1997 and, from the same Norwegian county, a register of cases diagnosed with dementia during 1997–2010. Participants aged ≥20 years who responded to headache questions in HUNT 2 were categorized (headache free; with any headache; with migraine; with nonmigrainous headache). Hazard ratios (HRs) for later inclusion in the dementia register were estimated using Cox regression analysis. Results Of 51,383 participants providing headache data in HUNT 2, 378 appeared in the dementia register during the follow-up period. Compared to those who were headache free, participants with any headache had increased risk of VaD (n = 63) (multivariate-adjusted HR = 2.3, 95% CI 1.4–3.8, p = 0.002) and of mixed dementia (VaD and AD, n = 52) (adjusted HR = 2.0, 95% CI 1.1–3.5, p = 0.018). There was no association between any headache and later development of AD (n = 180). Conclusion In this prospective population-based cohort study, any headache was a risk factor for development of VaD.


2021 ◽  
Vol 21 (1) ◽  
Author(s):  
Po-Yu Jay Chen ◽  
Lei Wan ◽  
Jung-Nien Lai ◽  
Chih Sheng Chen ◽  
Jamie Jiin-Yi Chen ◽  
...  

Abstract Background This study aimed to investigate the risk of Parkinson’s disease (PD) among patients with age-related macular degeneration (AMD) and its association with confounding comorbidities. Methods A population-based retrospective cohort study was conducted using the Longitudinal Health Insurance Database 2000 (LHID2000). We established AMD and non-AMD cohorts from January 1, 2000 to December 31, 2012 to determine the diagnosis of PD. A total of 20,848 patients were enrolled, with 10,424 AMD patients and 10,424 controls matched for age, sex, and index year at a 1:1 ratio. The follow-up period was from the index date of AMD diagnosis to the diagnosis of PD, death, withdrawal from the insurance program, or the end of 2013. Multivariable Cox regression analysis was performed to examine the hazard ratio (HR) and 95% confidence interval (CI) for the risk of PD between the AMD and non-AMD cohorts. Results After adjusting for potential confounders, there was a higher risk of developing PD in the AMD cohort than in the non-AMD cohort (adjusted HR = 1.35, 95% CI = 1.16–1.58). A significant association was observed in both female (aHR = 1.42, 95% CI = 1.13–1.80) and male (aHR = 1.28, 95% CI = 1.05–1.57) patients, in those aged more than 60 years (60–69: aHR = 1.51, 95% CI = 1.09–2.09; 70–79: aHR = 1.30, 95% CI = 1.05–1.60; 80–100: aHR = 1.40, 95% CI = 1.01–1.95), and in those with more than one comorbidity (aHR = 1.40, 95% CI = 1.20–1.64). A significant association between increased risk of PD and AMD was observed among patients with comorbid osteoporosis (aHR = 1.68, 95% CI = 1.22–2.33), diabetes (aHR = 1.41, 95% CI = 1.12–1.78) and hypertension (aHR = 1.36, 95% CI = 1.15–1.62), and among users of statins (aHR = 1.42, 95% CI = 1.19–1.69) and calcium channel blockers (CCBs) (aHR = 1.32, 95% CI = 1.11–1.58). The cumulative incidence of PD was significantly higher over the 12-year follow-up period in the AMD cohort (log-rank test, p < 0.001). Conclusions Patients with AMD may exhibit a higher risk of PD than those without AMD.
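A simplified sketch of 1:1 exact matching on age, sex, and index year, along the lines of the cohort construction above; the file and column names are assumptions, and real claims-database matching typically also handles age bands and missing strata.

```python
import random
import pandas as pd

def match_one_to_one(cases: pd.DataFrame, pool: pd.DataFrame,
                     keys=("age", "sex", "index_year"), seed: int = 42) -> pd.DataFrame:
    """For each case, draw one control from `pool` with identical key values,
    sampling without replacement within each (age, sex, index_year) stratum."""
    rng = random.Random(seed)
    strata = pool.groupby(list(keys)).groups           # key tuple -> row labels
    available = {k: list(v) for k, v in strata.items()}
    picks = []
    for _, case in cases.iterrows():
        candidates = available.get(tuple(case[k] for k in keys), [])
        if candidates:
            picks.append(candidates.pop(rng.randrange(len(candidates))))
    return pool.loc[picks]

amd = pd.read_csv("amd_cases.csv")          # hypothetical extracts
non_amd = pd.read_csv("non_amd_pool.csv")
controls = match_one_to_one(amd, non_amd)   # 1:1 matched non-AMD cohort
```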


2020 ◽  
Author(s):  
Xiaohan You ◽  
Ying Zhou ◽  
Jianna Zhang ◽  
Qiongxiu Zhou ◽  
Yanling Shi ◽  
...  

Abstract Background: Continuous ambulatory peritoneal dialysis (CAPD) patients have a high incidence of stroke and commonly have increased parathyroid hormone levels and vitamin D insufficiency. We sought to investigate the incidence of stroke and the role of parathyroid hormone and vitamin D supplementation in stroke risk among CAPD patients. Methods: This study employed a retrospective design. We enrolled a Chinese cohort of 980 CAPD patients who were routinely followed in our department. The demographic and clinical data were recorded at the time of initial CAPD and during follow-up. The included patients were separated into non-stroke and stroke groups. The effects of parathyroid hormone and vitamin D supplementation on stroke in CAPD patients were evaluated. The primary endpoint was defined as the first occurrence of stroke, and composite endpoint events were defined as death or a switch to hemodialysis during follow-up. Results: A total of 757 eligible CAPD patients with a mean follow-up time of 54.7 (standard deviation, 33) months were included in the study. The median incidence of stroke among our CAPD patients was 18.9 (interquartile range, 15.7–22.1) per 1000 person-years. A significant nonlinear correlation between baseline iPTH and hazard of stroke (p-value of linear association = 0.2 and nonlinear association = 0.002) was observed in our univariate Cox regression analysis, and low baseline iPTH levels (≤150 pg/ml) were associated with an increased cumulative hazard of stroke. Multivariate Cox regression analysis indicated a significant interaction effect between age and iPTH after adjusting for other confounders. Vitamin D supplementation during follow-up was a predictive factor for stroke in our cohort. Conclusions: CAPD patients faced a high risk of stroke, and lower iPTH levels were significantly correlated with an increased risk of stroke. Nevertheless, vitamin D supplementation may reduce the risk of stroke in these patients.
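A hedged sketch, with assumed column names, of one way the low-iPTH association and the age-by-iPTH interaction described above could be examined: baseline iPTH is dichotomized at 150 pg/ml and an explicit product term carries the interaction in a Cox model. This is an illustrative coding, not the authors' specification.

```python
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("capd_cohort.csv")  # hypothetical one-row-per-patient extract

# Dichotomize baseline iPTH at 150 pg/ml and build the age x low-iPTH interaction.
df["low_ipth"] = (df["ipth"] <= 150).astype(int)
df["age_x_low_ipth"] = df["age"] * df["low_ipth"]

cph = CoxPHFitter()
cph.fit(
    df[["followup_months", "stroke", "low_ipth", "age", "age_x_low_ipth",
        "vitamin_d_supplement"]],
    duration_col="followup_months",
    event_col="stroke",
)
cph.print_summary()  # a non-zero interaction term supports effect modification by age
```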


Neurology ◽  
2019 ◽  
Vol 92 (21) ◽  
pp. e2432-e2443 ◽  
Author(s):  
Joan Martí-Fàbregas ◽  
Santiago Medrano-Martorell ◽  
Elisa Merino ◽  
Luis Prats-Sánchez ◽  
Rebeca Marín ◽  
...  

Objective We tested the hypothesis that the risk of intracranial hemorrhage (ICH) in patients with cardioembolic ischemic stroke who are treated with oral anticoagulants (OAs) can be predicted by evaluating surrogate markers of hemorrhage-prone cerebral angiopathies using a baseline MRI. Methods Patients were participants in a multicenter, prospective observational study. They were older than 64 years, had a recent cardioembolic ischemic stroke, and were new users of OAs. They underwent a baseline MRI analysis to evaluate microbleeds, white matter hyperintensities (WMH), and cortical superficial siderosis. We collected demographic variables, clinical characteristics, risk scores, and therapeutic data. The primary endpoint was ICH that occurred during follow-up. We performed bivariate and multivariate Cox regression analyses. Results We recruited 937 patients (aged 77.6 ± 6.5 years; 47.9% were men). Microbleeds were detected in 207 patients (22.5%), moderate/severe white matter hyperintensities in 419 (45.1%), and superficial siderosis in 28 patients (3%). After a mean follow-up of 23.1 ± 6.8 months, 18 patients (1.9%) experienced an ICH. In multivariable analysis, microbleeds (hazard ratio 2.7, 95% confidence interval [CI] 1.1–7, p = 0.034) and moderate/severe white matter hyperintensities (hazard ratio 5.7, 95% CI 1.6–20, p = 0.006) were associated with ICH (C index 0.76, 95% CI 0.66–0.85). The rate of ICH was highest in patients with both microbleeds and moderate/severe WMH (3.76 per 100 patient-years, 95% CI 1.62–7.4). Conclusion Patients taking OAs who have advanced cerebral small vessel disease, evidenced by microbleeds and moderate to severe white matter hyperintensities, had an increased risk of ICH. Our results should help to determine the risk of prescribing OAs for a patient with cardioembolic stroke. ClinicalTrials.gov identifier NCT02238470.
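A minimal sketch (column names assumed) of the discrimination summary reported above: a Cox model fitted on the two imaging markers, with the hazard ratios and Harrell's C index read from the fitted lifelines model.

```python
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("oac_stroke_cohort.csv")  # hypothetical extract

# Binary imaging markers (0/1) as assumed covariates for ICH during follow-up.
cph = CoxPHFitter()
cph.fit(
    df[["followup_months", "ich", "microbleeds", "moderate_severe_wmh"]],
    duration_col="followup_months",
    event_col="ich",
)
print(cph.hazard_ratios_)      # exp(coef) for microbleeds and moderate/severe WMH
print(cph.concordance_index_)  # Harrell's C for the fitted model
```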


2020 ◽  
Vol 4 (15) ◽  
pp. 3639-3647 ◽  
Author(s):  
Zachary Gowanlock ◽  
Anastasiya Lezhanska ◽  
Maeve Conroy ◽  
Mark Crowther ◽  
Maria Tiboni ◽  
...  

Abstract Iron deficiency is a common consequence of bariatric surgery and frequently leads to anemia. Our study reports the incidence and predictors of iron deficiency, iron deficiency anemia (IDA), and IV iron use after bariatric surgery. We conducted a retrospective study of all adult patients who underwent bariatric surgery from January to December 2012 at the regional bariatric surgery center in Hamilton, Ontario, Canada, and were followed for at least 6 months. Time-to-event data were presented as Kaplan-Meier curves. Cox regression analysis was used to identify outcome predictors. A total of 388 patients met the inclusion criteria. Iron deficiency, IDA, and the use of IV iron were reported in 43%, 16%, and 6% of patients, respectively, with a mean follow-up of 31 months. The cumulative incidence of iron deficiency and IDA increased with longer follow-up, and there was a significant increase in IV iron use starting 3 years after surgery. Malabsorptive procedures (hazard ratio [HR], 1.92; 95% confidence interval [CI], 1.20-3.06; P = .006) and low baseline ferritin (HR, 0.96; 95% CI, 0.95-0.97; P < .001) were associated with an increased risk of iron deficiency. Young age (HR, 0.90; 95% CI, 0.82-0.99; P = .028), baseline anemia (HR, 19.6; 95% CI, 7.85-48.9; P < .001), and low baseline ferritin (HR, 0.96; 95% CI, 0.95-0.98; P < .001) were associated with an increased risk of IDA. Our results suggest that IDA is a delayed consequence of bariatric surgery and that preoperative assessment of patient risk may be possible.


2020 ◽  
Vol 9 (24) ◽  
Author(s):  
Maria Lukács Krogager ◽  
Peter Søgaard ◽  
Christian Torp‐Pedersen ◽  
Henrik Bøggild ◽  
Gunnar Gislason ◽  
...  

Background Hyperkalemia can be harmful, but the effect of correcting hyperkalemia is sparsely studied. We used nationwide data to examine hyperkalemia follow‐up in patients with hypertension. Methods and Results We identified 7620 patients with hypertension, who had the first plasma potassium measurement ≥4.7 mmol/L (hyperkalemia) within 100 days of combination antihypertensive therapy initiation. A second potassium was measured 6 to 100 days after the episode of hyperkalemia. All‐cause mortality within 90 days of the second potassium measurement was assessed using Cox regression. Mortality was examined for 8 predefined potassium intervals derived from the second measurement: 2.2 to 2.9 mmol/L (n=37), 3.0 to 3.4 mmol/L (n=184), 3.5 to 3.7 mmol/L (n=325), 3.8 to 4.0 mmol/L (n=791), 4.1 to 4.6 mmol/L (n=3533, reference), 4.7 to 5.0 mmol/L (n=1786), 5.1 to 5.5 mmol/L (n=720), and 5.6 to 7.8 mmol/L (n=244). Ninety‐day mortality in the 8 strata was 37.8%, 21.2%, 14.5%, 9.6%, 6.3%, 6.2%, 10.0%, and 16.4%, respectively. The multivariable analysis showed that patients with concentrations >5.5 mmol/L after an episode of hyperkalemia had increased mortality risk compared with the reference (hazard ratio [HR], 2.27; 95% CI, 1.60–3.20; P <0.001). Potassium intervals 3.5 to 3.7 mmol/L and 3.8 to 4.0 mmol/L were also associated with increased risk of death (HR, 1.71; 95% CI, 1.23–2.37; P <0.001; HR, 1.36; 95% CI, 1.04–1.76; P <0.001, respectively) compared with the reference group. We observed a trend toward increased risk of death within the interval 5.1 to 5.5 mmol/L (HR, 1.29; 95% CI, 0.98–1.69). Potassium concentrations <4.1 mmol/L and >5.0 mmol/L were associated with increased risk of cardiovascular death. Conclusions Overcorrection of hyperkalemia to levels <4.1 mmol/L was frequent and associated with increased all‐cause and cardiovascular mortality. Potassium concentrations >5.5 mmol/L were also associated with an increased all‐cause and cardiovascular mortality.
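A hedged sketch of the interval analysis described above: the second potassium measurement is binned into the eight predefined intervals with pandas.cut, dummy-encoded against the 4.1 to 4.6 mmol/L reference band, and entered into a Cox model for 90-day mortality. The file and column names are assumptions.

```python
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("hyperkalemia_followup.csv")  # hypothetical extract

# Bin the second potassium value into the 8 predefined intervals (left-closed bins).
edges = [2.2, 3.0, 3.5, 3.8, 4.1, 4.7, 5.1, 5.6, 7.9]
labels = ["2.2-2.9", "3.0-3.4", "3.5-3.7", "3.8-4.0",
          "4.1-4.6", "4.7-5.0", "5.1-5.5", "5.6-7.8"]
df["k_band"] = pd.cut(df["potassium_2nd"], bins=edges, labels=labels, right=False)

# Dummy-encode with 4.1-4.6 mmol/L as the reference category.
dummies = pd.get_dummies(df["k_band"], prefix="k").astype(int).drop(columns="k_4.1-4.6")
model_df = pd.concat([df[["days", "death"]], dummies], axis=1)

cph = CoxPHFitter()
cph.fit(model_df, duration_col="days", event_col="death")
cph.print_summary()  # hazard ratios relative to the 4.1-4.6 mmol/L band
```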

