Persistent emotional distress after a first-time myocardial infarction and its association to late cardiovascular and non-cardiovascular mortality

2019 ◽  
Vol 26 (14) ◽  
pp. 1510-1518 ◽  
Author(s):  
Claudia T Lissåker ◽  
Fredrika Norlund ◽  
John Wallert ◽  
Claes Held ◽  
Erik MG Olsson

Background: Patients with symptoms of depression and/or anxiety – emotional distress – after a myocardial infarction (MI) have been shown to have a worse prognosis and increased healthcare costs. However, whether specific subgroups of patients with emotional distress are more vulnerable is less well established. The purpose of this study was to identify the association of different patterns of emotional distress over time with late cardiovascular and non-cardiovascular mortality among first-MI patients aged <75 years in Sweden.

Methods: We utilized data on 57,602 consecutive patients with a first-time MI from the national SWEDEHEART registers. Emotional distress was assessed using the anxiety/depression dimension of the European Quality of Life Five Dimensions questionnaire two and 12 months after the MI, combined into persistent (emotional distress at both time-points), remittent (emotional distress at the first follow-up only), new (emotional distress at the second follow-up only) or no distress. Data on cardiovascular and non-cardiovascular mortality were obtained until the study end-time. We used multiple imputation to create complete datasets and adjusted Cox proportional hazards models to estimate hazard ratios.

Results: Patients with persistent emotional distress were more likely to die from cardiovascular (hazard ratio: 1.46, 95% confidence interval: 1.16, 1.84) and non-cardiovascular causes (hazard ratio: 1.54, 95% confidence interval: 1.30, 1.82) than those with no distress. Those with remittent emotional distress were not statistically significantly more likely to die from any cause than those without emotional distress.

Discussion: Among patients who survive 12 months, persistent, but not remittent, emotional distress was associated with increased cardiovascular and non-cardiovascular mortality. This indicates a need to identify subgroups of individuals with emotional distress who may benefit from further assessment and specific treatment.
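The four distress trajectories above are a simple cross-classification of two binary assessments. A minimal sketch (function and labels are illustrative, not from the paper):

```python
# Hypothetical sketch of the trajectory classification described in the study:
# distress measured at 2 and 12 months is combined into four groups.

def classify_distress(distress_2mo: bool, distress_12mo: bool) -> str:
    """Combine two binary distress assessments into a trajectory group."""
    if distress_2mo and distress_12mo:
        return "persistent"   # distress at both time-points
    if distress_2mo:
        return "remittent"    # distress at the first follow-up only
    if distress_12mo:
        return "new"          # distress at the second follow-up only
    return "no distress"
```

Each patient's group can then enter the Cox model as a categorical covariate with "no distress" as the reference.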

2018 ◽  
Vol 13 (4) ◽  
pp. 628-637 ◽  
Author(s):  
Laura C. Plantinga ◽  
Raymond J. Lynch ◽  
Rachel E. Patzer ◽  
Stephen O. Pastan ◽  
C. Barrett Bowling

Background and objectives: Serious fall injuries in the setting of ESKD may be associated with poor access to kidney transplant. We explored the burden of serious fall injuries among patients on dialysis and patients on the deceased donor waitlist and the associations of these fall injuries with waitlisting and transplantation.

Design, setting, participants, & measurements: Our analytic cohorts for the outcomes of (1) waitlisting and (2) transplantation included United States adults ages 18–80 years old who (1) initiated dialysis (n=183,047) and (2) were waitlisted for the first time (n=37,752) in 2010–2013. Serious fall injuries were determined by diagnostic codes for falls plus injury (fracture, joint dislocation, or head trauma) in inpatient and emergency department claims; the first serious fall injury after cohort entry was included as a time-varying exposure. Follow-up ended at the specified outcome, death, or the last date of follow-up (September 30, 2014). We used multivariable Cox proportional hazards models to determine the independent associations between serious fall injury and waitlisting or transplantation.

Results: Overall, 2-year cumulative incidence of serious fall injury was 6% among patients on incident dialysis; with adjustment, patients who had serious fall injuries were 61% less likely to be waitlisted than patients who did not (hazard ratio, 0.39; 95% confidence interval, 0.35 to 0.44). Among incident waitlisted patients (4% 2-year cumulative incidence), those with serious fall injuries were 29% less likely than their counterparts to be subsequently transplanted (hazard ratio, 0.71; 95% confidence interval, 0.63 to 0.80).

Conclusions: Serious fall injuries among United States patients on dialysis are associated with substantially lower likelihood of waitlisting for and receipt of a kidney transplant.

Podcast: This article contains a podcast at https://www.asn-online.org/media/podcast/CJASN/2018_03_06_CJASNPodcast_18_4_P.mp3
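Modelling the first fall as a time-varying exposure typically means splitting each patient's follow-up at the injury date, so person-time before the fall is analyzed as unexposed and person-time after as exposed. A hedged sketch of such episode splitting (not the authors' code; times are days since cohort entry):

```python
# Illustrative episode splitting for a time-varying exposure in a Cox model:
# each patient contributes (start, stop, exposed) intervals.

def split_follow_up(entry, exit, fall_time=None):
    """Return (start, stop, exposed) episodes for one patient.

    fall_time is None if no serious fall injury occurred during follow-up.
    """
    if fall_time is None or fall_time >= exit:
        return [(entry, exit, 0)]            # never exposed during follow-up
    if fall_time <= entry:
        return [(entry, exit, 1)]            # exposed for the whole interval
    return [(entry, fall_time, 0),           # unexposed person-time
            (fall_time, exit, 1)]            # exposed person-time
```

These intervals would then be fed to a Cox model that accepts start-stop (counting-process) data.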


2021 ◽  
pp. 000486742110096
Author(s):  
Oleguer Plana-Ripoll ◽  
Patsy Di Prinzio ◽  
John J McGrath ◽  
Preben B Mortensen ◽  
Vera A Morgan

Introduction: An association between schizophrenia and urbanicity has long been observed, with studies in many countries, including several from Denmark, reporting that individuals born/raised in densely populated urban settings have an increased risk of developing schizophrenia compared to those born/raised in rural settings. However, these findings have not been replicated in all studies. In particular, a Western Australian study showed a gradient in the opposite direction which disappeared after adjustment for covariates. Given the different findings for Denmark and Western Australia, our aim was to investigate the relationship between schizophrenia and urbanicity in these two regions to determine which factors may be influencing the relationship.

Methods: We used population-based cohorts of children born alive between 1980 and 2001 in Western Australia (N = 428,784) and Denmark (N = 1,357,874). Children were categorised according to the level of urbanicity of their mother’s residence at time of birth and followed up through to 30 June 2015. Linkage to State-based registers provided information on schizophrenia diagnosis and a range of covariates. Rates of being diagnosed with schizophrenia for each category of urbanicity were estimated using Cox proportional hazards models adjusted for covariates.

Results: During follow-up, 1618 (0.4%) children in Western Australia and 11,875 (0.9%) children in Denmark were diagnosed with schizophrenia. In Western Australia, those born in the most remote areas did not experience lower rates of schizophrenia than those born in the most urban areas (hazard ratio = 1.02 [95% confidence interval: 0.81, 1.29]), unlike their Danish counterparts (hazard ratio = 0.62 [95% confidence interval: 0.58, 0.66]). However, when the Western Australian cohort was restricted to children of non-Aboriginal status, results were consistent with Danish findings (hazard ratio = 0.46 [95% confidence interval: 0.29, 0.72]).

Discussion: Our study highlights the potential for disadvantaged subgroups to mask the contribution of urban-related risk factors to risk of schizophrenia and the importance of stratified analysis in such cases.


2021 ◽  
pp. 1-21
Author(s):  
Anne Mette L. Würtz ◽  
Mette D. Hansen ◽  
Anne Tjønneland ◽  
Eric B. Rimm ◽  
Erik B. Schmidt ◽  
...  

ABSTRACT Intake of vegetables is recommended for the prevention of myocardial infarction (MI). However, vegetables make up a heterogeneous group, and subgroups of vegetables may be differentially associated with MI. The aim of this study was to examine replacement of potatoes with other vegetables or subgroups of other vegetables and the risk of MI. Substitutions between subgroups of other vegetables and risk of MI were also investigated. We followed 29,142 women and 26,029 men aged 50-64 years in the Danish Diet, Cancer and Health cohort. Diet was assessed at baseline by using a detailed validated FFQ. Hazard ratios (HR) with 95% CI for the incidence of MI were calculated using Cox proportional hazards regression. During 13.6 years of follow-up, 656 female and 1,694 male cases were identified. Among women, the adjusted HR for MI was 1.02 (95% CI: 0.93, 1.13) per 500 g/week replacement of potatoes with other vegetables. For vegetable subgroups, the HR was 0.93 (95% CI: 0.77, 1.13) for replacement of potatoes with fruiting vegetables and 0.91 (95% CI: 0.77, 1.07) for replacement of potatoes with other root vegetables. A higher intake of cabbage replacing other vegetable subgroups was associated with a statistically non-significant higher risk of MI. A similar pattern of associations was found when intake was expressed in kcal/week. Among men, the pattern of associations was overall similar to that for women. This study supports food-based dietary guidelines recommending consumption of a variety of vegetables from all subgroups.
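Replacement ("substitution") associations like those above are commonly estimated with a leave-one-out Cox model: the replaced food is omitted from the covariates while the other subgroups and total intake are retained, so each remaining subgroup's coefficient is interpretable as substituting that subgroup for the omitted one. A sketch of the covariate setup under that assumption (the paper's exact specification may differ; names are illustrative):

```python
# Hedged sketch of the leave-one-out substitution setup used in nutritional
# epidemiology. Dropping potatoes while keeping total intake means a
# subgroup's hazard ratio reflects replacing potatoes with that subgroup.

def substitution_covariates(intakes, replaced="potatoes",
                            total_name="total_vegetables"):
    """Build Cox-model covariates for a substitution analysis.

    intakes: dict of food-group -> g/week for one participant.
    The replaced food is dropped; its amount enters only via the total.
    """
    covs = {k: v for k, v in intakes.items() if k != replaced}
    covs[total_name] = sum(intakes.values())   # holds total intake constant
    return covs
```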


Neurosurgery ◽  
2015 ◽  
Vol 77 (6) ◽  
pp. 880-887 ◽  
Author(s):  
Eric J. Heyer ◽  
Joanna L. Mergeche ◽  
Shuang Wang ◽  
John G. Gaudet ◽  
E. Sander Connolly

BACKGROUND: Early cognitive dysfunction (eCD) is a subtle form of neurological injury observed in ∼25% of carotid endarterectomy (CEA) patients. Statin use is associated with a lower incidence of eCD in asymptomatic patients having CEA.

OBJECTIVE: To determine whether eCD status is associated with worse long-term survival in patients taking and not taking statins.

METHODS: This is a post hoc analysis of a prospective observational study of 585 CEA patients. Patients were evaluated with a battery of neuropsychometric tests before and after surgery. Survival was compared for patients with and without eCD, stratifying by statin use. At enrollment, 366 patients were on statins and 219 were not. Survival was assessed using Kaplan-Meier methods and multivariable Cox proportional hazards models.

RESULTS: Age ≥75 years (P = .003), diabetes mellitus (P < .001), cardiac disease (P = .02), and statin use (P = .014) were significantly associated with survival univariately by the log-rank test. In Cox proportional hazards models adjusting for these univariate factors, eCD was associated with significantly shorter survival among patients not taking statins (hazard ratio, 1.61; 95% confidence interval, 1.09–2.40; P = .018) but not among patients taking statins (hazard ratio, 0.98; 95% confidence interval, 0.59–1.66; P = .95).

CONCLUSION: eCD is associated with shorter survival in patients not taking statins. This finding validates eCD as an important neurological outcome and suggests that eCD is a surrogate measure for overall health, comorbidity, and vulnerability to neurological insult.
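Survival here was assessed with Kaplan-Meier methods; the product-limit estimator can be sketched in a few lines (an illustration, not the study code):

```python
# Minimal Kaplan-Meier product-limit estimator. At each death time the
# survival estimate is multiplied by (n_at_risk - 1) / n_at_risk; censored
# subjects simply leave the risk set without changing the estimate.

def kaplan_meier(times, events):
    """Return [(t, S(t))] at event times; events[i]=1 if death, 0 if censored."""
    at_risk = len(times)
    surv = 1.0
    curve = []
    # sort by time, with deaths before censorings at tied times (convention)
    for t, e in sorted(zip(times, events), key=lambda te: (te[0], -te[1])):
        if e:                          # death at time t
            surv *= (at_risk - 1) / at_risk
            curve.append((t, surv))
        at_risk -= 1                   # dead or censored: leaves the risk set
    return curve
```

Stratifying by statin use simply means running the estimator separately on each subgroup and comparing the curves (e.g. with a log-rank test).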


2019 ◽  
Vol 14 (6) ◽  
pp. 854-861 ◽  
Author(s):  
Mark E. Molitch ◽  
Xiaoyu Gao ◽  
Ionut Bebu ◽  
Ian H. de Boer ◽  
John Lachin ◽  
...  

Background and objectives: Glomerular hyperfiltration has been considered to be a contributing factor to the development of diabetic kidney disease (DKD). To address this issue, we analyzed GFR follow-up data on participants with type 1 diabetes undergoing 125I-iothalamate clearance on entry into the Diabetes Control and Complications Trial (DCCT)/Epidemiology of Diabetes Interventions and Complications study.

Design, setting, participants, & measurements: This was a cohort study of DCCT participants with type 1 diabetes who underwent an 125I-iothalamate clearance (iGFR) at DCCT baseline. Presence of hyperfiltration was defined as iGFR levels ≥140 ml/min per 1.73 m2, with secondary thresholds of 130 or 150 ml/min per 1.73 m2. Cox proportional hazards models assessed the association between the baseline hyperfiltration status and the subsequent risk of reaching an eGFR <60 ml/min per 1.73 m2.

Results: Of the 446 participants, 106 (24%) had hyperfiltration (iGFR levels ≥140 ml/min per 1.73 m2) at baseline. Over a median follow-up of 28 (interquartile range, 23, 33) years, 53 developed an eGFR <60 ml/min per 1.73 m2. The cumulative incidence of eGFR <60 ml/min per 1.73 m2 at 28 years of follow-up was 11.0% among participants with hyperfiltration at baseline, compared with 12.8% among participants with baseline GFR <140 ml/min per 1.73 m2. Hyperfiltration was not significantly associated with subsequent risk of developing an eGFR <60 ml/min per 1.73 m2 in an unadjusted Cox proportional hazards model (hazard ratio, 0.83; 95% confidence interval, 0.43 to 1.62) nor in an adjusted model (hazard ratio, 0.77; 95% confidence interval, 0.38 to 1.54). Application of alternate thresholds to define hyperfiltration (130 or 150 ml/min per 1.73 m2) showed similar findings.

Conclusions: Early hyperfiltration in patients with type 1 diabetes was not associated with a higher long-term risk of decreased GFR. Although glomerular hypertension may be a mechanism of kidney injury in DKD, higher total GFR does not appear to be a risk factor for advanced DKD.


2020 ◽  
Vol 189 (10) ◽  
pp. 1096-1113 ◽  
Author(s):  
Shawn A Zamani ◽  
Kathleen M McClain ◽  
Barry I Graubard ◽  
Linda M Liao ◽  
Christian C Abnet ◽  
...  

Abstract Recent epidemiologic studies have examined the association of fish consumption with upper gastrointestinal cancer risk, but the associations with n-3 and n-6 polyunsaturated fatty acid (PUFA) subtypes remain unclear. Using the National Institutes of Health–AARP Diet and Health Study (United States, 1995–2011), we prospectively investigated the associations of PUFA subtypes, ratios, and fish with the incidence of head and neck cancer (HNC; n = 2,453), esophageal adenocarcinoma (EA; n = 855), esophageal squamous cell carcinoma (n = 267), and gastric cancer (cardia: n = 603; noncardia: n = 631) among 468,952 participants (median follow-up, 15.5 years). A food frequency questionnaire assessed diet. Multivariable-adjusted hazard ratios were estimated using Cox proportional hazards regression. A Benjamini-Hochberg (BH) procedure was used for false-discovery control. Long-chain n-3 PUFAs were associated with a 20% decreased HNC and EA risk (for HNC, quintile 5 vs. 1 hazard ratio = 0.81, 95% confidence interval: 0.71, 0.92, and BH-adjusted P-trend = 0.001; and for EA, quintile 5 vs. 1 hazard ratio = 0.79, 95% confidence interval: 0.64, 0.98, and BH-adjusted P-trend = 0.1). Similar associations were observed for nonfried fish but only for high intake. Further, the ratio of long-chain n-3:n-6 was associated with a decreased HNC and EA risk. No consistent associations were observed for gastric cancer. Our results indicate that dietary long-chain n-3 PUFA and nonfried fish intake are associated with lower HNC and EA risk.
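The Benjamini-Hochberg step-up procedure used here for false-discovery control can be sketched as follows (an illustrative implementation producing BH-adjusted p-values like the trend P values reported above):

```python
# Benjamini-Hochberg step-up adjustment: sort p-values, scale each by
# m / rank, and enforce monotonicity from the largest rank downward.

def bh_adjust(pvals):
    """Return BH-adjusted p-values in the original order."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])  # indices by ascending p
    adjusted = [0.0] * m
    prev = 1.0
    for rank in range(m, 0, -1):          # step up from the largest p-value
        i = order[rank - 1]
        prev = min(prev, pvals[i] * m / rank)
        adjusted[i] = prev
    return adjusted
```

A hypothesis is rejected at false-discovery rate q when its adjusted p-value is ≤ q.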


2019 ◽  
Vol 14 (7) ◽  
pp. 994-1001 ◽  
Author(s):  
Eli Farhy ◽  
Clarissa Jonas Diamantidis ◽  
Rebecca M. Doerfler ◽  
Wanda J. Fink ◽  
Min Zhan ◽  
...  

Background and objectives: Poor disease recognition may jeopardize the safety of CKD care. We examined safety events and outcomes in patients with CKD piloting a medical-alert accessory intended to improve disease recognition and an observational subcohort from the same population.

Design, setting, participants, & measurements: We recruited 350 patients with stage 2–5 predialysis CKD. The first (pilot) 108 participants were given a medical-alert accessory (bracelet or necklace) indicating the diagnosis of CKD and displaying a website with safe CKD practices. The subsequent (observation) subcohort (n=242) received usual care. All participants underwent annual visits with ascertainment of patient-reported events (class 1) and actionable safety findings (class 2). Secondary outcomes included 50% GFR reduction, ESKD, and death. Cox proportional hazards models assessed the association of the medical-alert accessory with outcomes.

Results: Median follow-up of the pilot and observation subcohorts was 52 (interquartile range, 44–63) and 37 (interquartile range, 27–47) months, respectively. The frequency of class 1 and class 2 safety events reported at annual visits did not differ between the pilot and observation groups, with 108.7 versus 100.6 events per 100 patient-visits (P=0.13) and 38.3 versus 41.2 events per 100 patient-visits (P=0.23), respectively. The medical-alert accessory was associated with lower crude and adjusted rates of ESKD versus the observation group (hazard ratio, 0.42; 95% confidence interval, 0.20 to 0.89; and hazard ratio, 0.38; 95% confidence interval, 0.16 to 0.94, respectively). The association of the medical-alert accessory with the composite endpoint of ESKD or 50% reduction in GFR was variable over time but appeared to show an early benefit (up to 23 months) with its use. There was no significant difference in incidence of hospitalization, death, or a composite of all outcomes between medical-alert accessory users and the observation group.

Conclusions: The medical-alert accessory was not associated with incidence of safety events but was associated with a lower rate of ESKD relative to usual care.


Neurosurgery ◽  
2017 ◽  
Vol 81 (6) ◽  
pp. 935-948 ◽  
Author(s):  
Joan Margaret O’Donnell ◽  
Michael Kerin Morgan ◽  
Gillian Z Heller

Abstract

BACKGROUND: The evidence for the risk of seizures following surgery for brain arteriovenous malformations (bAVM) is limited.

OBJECTIVE: To determine the risk of seizures after discharge from surgery for supratentorial bAVM.

METHODS: A prospectively collected cohort database of 559 supratentorial bAVM patients (excluding patients where surgery was not performed with the primary intention of treating the bAVM) was analyzed. Cox proportional hazards regression models (Cox regression) were generated to assess risk factors, a receiver operating characteristic curve was generated to identify a cut-point for size, and Kaplan–Meier life table curves were created to identify the cumulative freedom from postoperative seizure.

RESULTS: A preoperative history of more than 2 seizures and increasing maximum diameter (size, cm) of bAVM were significantly (P < .01) associated with the development of postoperative seizures and remained significant in the Cox regression (size as continuous variable: P = .01; hazard ratio: 1.2; 95% confidence interval: 1.0-1.3; more than 2 seizures: P = .02; hazard ratio: 2.1; 95% confidence interval: 1.1-3.8). The cumulative risk of first seizure after discharge from hospital following resection surgery for all patients with bAVM was 5.8% and 18% at 12 mo and 7 yr, respectively. The 7-yr risk of developing postoperative seizures ranged from 11% for patients with bAVM ≤4 cm and with 0 to 2 preoperative seizures, to 59% for patients with bAVM >4 cm and with >2 preoperative seizures.

CONCLUSION: The risk of seizures after discharge from hospital following surgery for bAVM increases with the maximum diameter of the bAVM and a patient history of more than 2 preoperative seizures.
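The abstract does not state which criterion selected the size cut-point from the ROC curve; one common choice is the Youden index (sensitivity + specificity - 1), sketched here as a hypothetical illustration:

```python
# Hedged sketch of cut-point selection via the Youden index. This is one
# common ROC criterion, not necessarily the one the authors used.

def youden_cutpoint(sizes, outcomes):
    """Pick the size threshold maximizing sensitivity + specificity - 1.

    sizes: continuous predictor (e.g. bAVM diameter, cm);
    outcomes: 1 if postoperative seizure occurred, else 0.
    """
    pos = [s for s, y in zip(sizes, outcomes) if y]
    neg = [s for s, y in zip(sizes, outcomes) if not y]
    best_cut, best_j = None, -1.0
    for cut in sorted(set(sizes)):
        sens = sum(s > cut for s in pos) / len(pos)    # cases above the cut
        spec = sum(s <= cut for s in neg) / len(neg)   # non-cases at or below
        j = sens + spec - 1
        if j > best_j:
            best_cut, best_j = cut, j
    return best_cut
```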


2015 ◽  
Vol 22 (8) ◽  
pp. 1086-1093 ◽  
Author(s):  
Saeed Akhtar ◽  
Raed Alroughani ◽  
Samar F Ahmed ◽  
Jasem Y Al-Hashel

Background: The frequency of paediatric-onset multiple sclerosis (POMS) and the precise risk of secondary progression of disease are largely unknown in the Middle East. This cross-sectional cohort study assessed the risk and examined prognostic factors for time to onset of secondary progressive multiple sclerosis (SPMS) in a cohort of POMS patients.

Methods: The Kuwait National MS Registry database was used to identify a cohort of POMS cases (diagnosed at age <18 years) from 1994 to 2013. Data were abstracted from patients’ records. A Cox proportional hazards model was used to evaluate the prognostic significance of the variables considered.

Results: Of 808 multiple sclerosis (MS) patients, 127 (15.7%) were POMS cases. The median age (years) at disease onset was 16.0 (range 6.5–17.9). Of 127 POMS cases, 20 (15.8%) developed SPMS. A multivariable Cox proportional hazards model showed that brainstem involvement at MS onset (adjusted hazard ratio 5.71; 95% confidence interval 1.53–21.30; P=0.010) and older age at MS onset (adjusted hazard ratio 1.38; 95% confidence interval 1.01–1.88; P=0.042) were significantly associated with an increased risk of a secondary progressive disease course.

Conclusions: This study showed that POMS patients with brainstem/cerebellar presentation and a relatively higher age at MS onset were predisposed to SPMS and may warrant an aggressive therapeutic approach.


Neurology ◽  
2017 ◽  
Vol 89 (18) ◽  
pp. 1877-1885 ◽  
Author(s):  
Ariela R. Orkaby ◽  
Kelly Cho ◽  
Jean Cormack ◽  
David R. Gagnon ◽  
Jane A. Driver

Objective: To determine whether metformin is associated with a lower incidence of dementia than sulfonylureas.

Methods: This was a retrospective cohort study of US veterans ≥65 years of age with type 2 diabetes who were new users of metformin or a sulfonylurea and had no dementia. Follow-up began after 2 years of therapy. To account for confounding by indication, we developed a propensity score (PS) and used inverse probability of treatment weighting (IPTW) methods. Cox proportional hazards models estimated the hazard ratio (HR) of incident dementia.

Results: We identified 17,200 new users of metformin and 11,440 new users of sulfonylureas. Mean age was 73.5 years and mean HbA1c was 6.8%. Over an average follow-up of 5 years, 4,906 cases of dementia were diagnosed. Due to effect modification by age, all analyses were conducted using a piecewise model for age. The crude hazard ratio (HR) for any dementia in metformin vs sulfonylurea users was 0.67 (95% confidence interval [CI] 0.61–0.73) and 0.78 (95% CI 0.72–0.83) for those <75 years of age and ≥75 years of age, respectively. After PS IPTW adjustment, results remained significant in veterans <75 years of age (HR 0.89; 95% CI 0.79–0.99), but not for those ≥75 years of age (HR 0.96; 95% CI 0.87–1.05). A lower risk of dementia was also seen in the subset of younger veterans who had HbA1c values ≥7% (HR 0.76; 95% CI 0.63–0.91), had good renal function (HR 0.86; 95% CI 0.76–0.97), and were white (HR 0.87; 95% CI 0.77–0.99).

Conclusions: After accounting for confounding by indication, metformin was associated with a lower risk of subsequent dementia than sulfonylurea use in veterans <75 years of age. Further work is needed to identify which patients may benefit from metformin for the prevention of dementia.
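IPTW reweights each subject by the inverse of the probability of the treatment actually received, given the propensity score, so that the weighted treated and untreated groups resemble each other on measured covariates. A minimal sketch of the unstabilized weight (illustrative, not the study code):

```python
# Unstabilized inverse-probability-of-treatment weight. The propensity
# score here is P(treated | covariates), e.g. from a logistic regression.

def iptw_weight(treated: bool, propensity: float) -> float:
    """Weight a subject by 1/P(received treatment actually received)."""
    return 1.0 / propensity if treated else 1.0 / (1.0 - propensity)
```

The weighted sample is then analyzed with a (typically robust-variance) Cox model, as in the study's PS IPTW analysis.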

