Dietary Polyunsaturated Fat Intake in Relation to Head and Neck, Esophageal, and Gastric Cancer Incidence in the National Institutes of Health–AARP Diet and Health Study

2020 ◽  
Vol 189 (10) ◽  
pp. 1096-1113 ◽  
Author(s):  
Shawn A Zamani ◽  
Kathleen M McClain ◽  
Barry I Graubard ◽  
Linda M Liao ◽  
Christian C Abnet ◽  
...  

Abstract Recent epidemiologic studies have examined the association of fish consumption with upper gastrointestinal cancer risk, but the associations with n-3 and n-6 polyunsaturated fatty acid (PUFA) subtypes remain unclear. Using the National Institutes of Health–AARP Diet and Health Study (United States, 1995–2011), we prospectively investigated the associations of PUFA subtypes, ratios, and fish with the incidence of head and neck cancer (HNC; n = 2,453), esophageal adenocarcinoma (EA; n = 855), esophageal squamous cell carcinoma (n = 267), and gastric cancer (cardia: n = 603; noncardia: n = 631) among 468,952 participants (median follow-up, 15.5 years). Diet was assessed with a food frequency questionnaire. Multivariable-adjusted hazard ratios were estimated using Cox proportional hazards regression, and the Benjamini-Hochberg (BH) procedure was used for false-discovery control. Long-chain n-3 PUFAs were associated with an approximately 20% decreased risk of HNC and EA (for HNC, quintile 5 vs. quintile 1, hazard ratio = 0.81, 95% confidence interval: 0.71, 0.92, BH-adjusted P for trend = 0.001; for EA, quintile 5 vs. quintile 1, hazard ratio = 0.79, 95% confidence interval: 0.64, 0.98, BH-adjusted P for trend = 0.1). Similar associations were observed for nonfried fish, although only at high intake. Further, the ratio of long-chain n-3 to n-6 PUFAs was associated with decreased HNC and EA risk. No consistent associations were observed for gastric cancer. Our results indicate that dietary long-chain n-3 PUFA and nonfried fish intake are associated with lower HNC and EA risk.
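To make the multiplicity adjustment concrete, here is a minimal Python sketch of Benjamini-Hochberg false-discovery control over a family of trend tests, using statsmodels. The p-values and exposure names are hypothetical placeholders, not the study's data.

```python
# Benjamini-Hochberg (BH) adjustment of a family of trend p-values.
# Values below are illustrative only, not taken from the study.
from statsmodels.stats.multitest import multipletests

p_trend = {"long-chain n-3": 0.0002, "total n-6": 0.04, "ALA": 0.20, "n-3:n-6 ratio": 0.01}
reject, p_adj, _, _ = multipletests(list(p_trend.values()), alpha=0.05, method="fdr_bh")
for (name, p), q, r in zip(p_trend.items(), p_adj, reject):
    print(f"{name}: raw p = {p:.4f}, BH-adjusted p = {q:.4f}, significant = {r}")
```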

2021 ◽  
pp. 000486742110096
Author(s):  
Oleguer Plana-Ripoll ◽  
Patsy Di Prinzio ◽  
John J McGrath ◽  
Preben B Mortensen ◽  
Vera A Morgan

Introduction: An association between schizophrenia and urbanicity has long been observed, with studies in many countries, including several from Denmark, reporting that individuals born or raised in densely populated urban settings have an increased risk of developing schizophrenia compared to those born or raised in rural settings. However, these findings have not been replicated in all studies. In particular, a Western Australian study showed a gradient in the opposite direction, which disappeared after adjustment for covariates. Given the different findings for Denmark and Western Australia, our aim was to investigate the relationship between schizophrenia and urbanicity in these two regions to determine which factors may be influencing it. Methods: We used population-based cohorts of children born alive between 1980 and 2001 in Western Australia (N = 428,784) and Denmark (N = 1,357,874). Children were categorised according to the level of urbanicity of their mother's residence at time of birth and followed up to 30 June 2015. Linkage to State-based registers provided information on schizophrenia diagnosis and a range of covariates. Rates of being diagnosed with schizophrenia for each category of urbanicity were estimated using Cox proportional hazards models adjusted for covariates. Results: During follow-up, 1618 (0.4%) children in Western Australia and 11,875 (0.9%) children in Denmark were diagnosed with schizophrenia. In Western Australia, those born in the most remote areas did not experience lower rates of schizophrenia than those born in the most urban areas (hazard ratio = 1.02 [95% confidence interval: 0.81, 1.29]), unlike their Danish counterparts (hazard ratio = 0.62 [95% confidence interval: 0.58, 0.66]). However, when the Western Australian cohort was restricted to children of non-Aboriginal status, results were consistent with the Danish findings (hazard ratio = 0.46 [95% confidence interval: 0.29, 0.72]). Discussion: Our study highlights the potential for disadvantaged subgroups to mask the contribution of urban-related risk factors to schizophrenia risk, and the importance of stratified analysis in such cases.
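A minimal sketch, in Python with lifelines, of the kind of covariate-adjusted Cox model with a categorical urbanicity exposure and a subgroup re-fit described here. All column names (time, event, urbanicity, aboriginal) are hypothetical, and the covariate handling is simplified.

```python
import pandas as pd
from lifelines import CoxPHFitter

# df: one row per child, with hypothetical columns
#   time (years from birth), event (1 = schizophrenia diagnosis),
#   urbanicity (categorical), aboriginal (0/1), plus adjustment covariates.
X = pd.get_dummies(df, columns=["urbanicity"], drop_first=True, dtype=float)

cph = CoxPHFitter()
cph.fit(X, duration_col="time", event_col="event")
print(cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]])

# Re-fit within the non-Aboriginal subgroup, dropping the now-constant indicator
sub = X[X["aboriginal"] == 0].drop(columns="aboriginal")
cph_sub = CoxPHFitter().fit(sub, duration_col="time", event_col="event")
```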


Neurosurgery ◽  
2015 ◽  
Vol 77 (6) ◽  
pp. 880-887 ◽  
Author(s):  
Eric J. Heyer ◽  
Joanna L. Mergeche ◽  
Shuang Wang ◽  
John G. Gaudet ◽  
E. Sander Connolly

BACKGROUND: Early cognitive dysfunction (eCD) is a subtle form of neurological injury observed in ∼25% of carotid endarterectomy (CEA) patients. Statin use is associated with a lower incidence of eCD in asymptomatic patients having CEA. OBJECTIVE: To determine whether eCD status is associated with worse long-term survival in patients taking and not taking statins. METHODS: This is a post hoc analysis of a prospective observational study of 585 CEA patients. Patients were evaluated with a battery of neuropsychometric tests before and after surgery. Survival was compared for patients with and without eCD, stratifying by statin use. At enrollment, 366 patients were on statins and 219 were not. Survival was assessed using Kaplan-Meier methods and multivariable Cox proportional hazards models. RESULTS: In univariate log-rank tests, age ≥75 years (P = .003), diabetes mellitus (P < .001), cardiac disease (P = .02), and statin use (P = .014) were significantly associated with survival. In Cox proportional hazards models adjusting for these factors, eCD was significantly associated with shorter survival among patients not taking statins (hazard ratio, 1.61; 95% confidence interval, 1.09–2.40; P = .018), whereas no significant association was observed among patients taking statins (hazard ratio, 0.98; 95% confidence interval, 0.59–1.66; P = .95). CONCLUSION: eCD is associated with shorter survival in patients not taking statins. This finding validates eCD as an important neurological outcome and suggests that eCD is a surrogate measure for overall health, comorbidity, and vulnerability to neurological insult.
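A compact sketch of the Kaplan-Meier comparison stratified by statin use, in Python with lifelines; the column names (years, died, statin) are hypothetical.

```python
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

on_statin = df["statin"] == 1
kmf = KaplanMeierFitter()
for label, mask in [("statin", on_statin), ("no statin", ~on_statin)]:
    kmf.fit(df.loc[mask, "years"], df.loc[mask, "died"], label=label)
    kmf.plot_survival_function()  # overlaid survival curves

# Log-rank test of the survival difference between the two groups
res = logrank_test(df.loc[on_statin, "years"], df.loc[~on_statin, "years"],
                   event_observed_A=df.loc[on_statin, "died"],
                   event_observed_B=df.loc[~on_statin, "died"])
print(f"log-rank p = {res.p_value:.3f}")
```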


2019 ◽  
Vol 26 (14) ◽  
pp. 1510-1518 ◽  
Author(s):  
Claudia T Lissåker ◽  
Fredrika Norlund ◽  
John Wallert ◽  
Claes Held ◽  
Erik MG Olsson

Background Patients with symptoms of depression and/or anxiety – emotional distress – after a myocardial infarction (MI) have been shown to have worse prognosis and increased healthcare costs. However, whether specific subgroups of patients with emotional distress are more vulnerable is less well established. The purpose of this study was to identify the association between different patterns of emotional distress over time and late cardiovascular and non-cardiovascular mortality among first-MI patients aged <75 years in Sweden. Methods We utilized data on 57,602 consecutive patients with a first-time MI from the national SWEDEHEART registers. Emotional distress was assessed using the anxiety/depression dimension of the European Quality of Life Five Dimensions questionnaire two and 12 months after the MI, combined into persistent (emotional distress at both time-points), remittent (emotional distress at the first follow-up only), new (emotional distress at the second follow-up only), or no distress. Data on cardiovascular and non-cardiovascular mortality were obtained through the end of the study. We used multiple imputation to create complete datasets and adjusted Cox proportional hazards models to estimate hazard ratios. Results Patients with persistent emotional distress were more likely to die from cardiovascular (hazard ratio: 1.46, 95% confidence interval: 1.16, 1.84) and non-cardiovascular causes (hazard ratio: 1.54, 95% confidence interval: 1.30, 1.82) than those with no distress. Those with remittent emotional distress were not statistically significantly more likely to die from any cause than those without emotional distress. Discussion Among patients who survive 12 months, persistent, but not remittent, emotional distress was associated with increased cardiovascular and non-cardiovascular mortality. This indicates a need to identify subgroups of individuals with emotional distress who may benefit from further assessment and specific treatment.
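The multiple-imputation-plus-Cox pipeline could look roughly like the following Python sketch, with sklearn's IterativeImputer standing in for whatever imputation model the authors used. All column names are hypothetical, the data frame is assumed all-numeric, and follow-up time and event indicators are assumed complete.

```python
import numpy as np
import pandas as pd
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer
from lifelines import CoxPHFitter

M = 5  # number of imputed datasets
coefs, variances = [], []
for m in range(M):
    imp = IterativeImputer(sample_posterior=True, random_state=m)
    filled = pd.DataFrame(imp.fit_transform(df), columns=df.columns)
    cph = CoxPHFitter().fit(filled, duration_col="years", event_col="cv_death")
    coefs.append(cph.params_["persistent_distress"])
    variances.append(cph.standard_errors_["persistent_distress"] ** 2)

# Pool across imputations with Rubin's rules
qbar = np.mean(coefs)          # pooled log hazard ratio
W = np.mean(variances)         # within-imputation variance
B = np.var(coefs, ddof=1)      # between-imputation variance
T = W + (1 + 1 / M) * B        # total variance
lo, hi = qbar - 1.96 * T**0.5, qbar + 1.96 * T**0.5
print(f"pooled HR = {np.exp(qbar):.2f} (95% CI {np.exp(lo):.2f}, {np.exp(hi):.2f})")
```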


2019 ◽  
Vol 14 (7) ◽  
pp. 994-1001 ◽  
Author(s):  
Eli Farhy ◽  
Clarissa Jonas Diamantidis ◽  
Rebecca M. Doerfler ◽  
Wanda J. Fink ◽  
Min Zhan ◽  
...  

Background and objectives: Poor disease recognition may jeopardize the safety of CKD care. We examined safety events and outcomes in patients with CKD piloting a medical-alert accessory intended to improve disease recognition, and in an observational subcohort from the same population. Design, setting, participants, & measurements: We recruited 350 patients with stage 2–5 predialysis CKD. The first (pilot) 108 participants were given a medical-alert accessory (bracelet or necklace) indicating the diagnosis of CKD and displaying a website with safe CKD practices. The subsequent (observation) subcohort (n=242) received usual care. All participants underwent annual visits with ascertainment of patient-reported events (class 1) and actionable safety findings (class 2). Secondary outcomes included 50% GFR reduction, ESKD, and death. Cox proportional hazards models assessed the association of the medical-alert accessory with outcomes. Results: Median follow-up of the pilot and observation subcohorts was 52 (interquartile range, 44–63) and 37 (interquartile range, 27–47) months, respectively. The frequency of class 1 and class 2 safety events reported at annual visits did not differ between the pilot and observation groups, with 108.7 versus 100.6 events per 100 patient-visits (P=0.13) and 38.3 versus 41.2 events per 100 patient-visits (P=0.23), respectively. The medical-alert accessory was associated with lower crude and adjusted rates of ESKD versus the observation group (hazard ratio, 0.42; 95% confidence interval, 0.20 to 0.89; and hazard ratio, 0.38; 95% confidence interval, 0.16 to 0.94, respectively). The association of the medical-alert accessory with the composite endpoint of ESKD or 50% reduction in GFR was variable over time but appeared to show an early benefit (up to 23 months) with its use. There was no significant difference in the incidence of hospitalization, death, or a composite of all outcomes between medical-alert accessory users and the observational group. Conclusions: The medical-alert accessory was not associated with the incidence of safety events but was associated with a lower rate of ESKD relative to usual care.
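Because the composite-endpoint association is described as varying over time, a proportional-hazards diagnostic is the natural check. A sketch with lifelines follows; the column names (months, composite) are hypothetical.

```python
from lifelines import CoxPHFitter
from lifelines.statistics import proportional_hazard_test

# df: months of follow-up, composite event indicator, accessory-group flag, covariates
cph = CoxPHFitter().fit(df, duration_col="months", event_col="composite")

# Schoenfeld-residual-based test; a small p-value for the accessory term
# suggests its hazard ratio changes over follow-up
results = proportional_hazard_test(cph, df, time_transform="rank")
results.print_summary()
```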


Neurosurgery ◽  
2017 ◽  
Vol 81 (6) ◽  
pp. 935-948 ◽  
Author(s):  
Joan Margaret O’Donnell ◽  
Michael Kerin Morgan ◽  
Gillian Z Heller

Abstract BACKGROUND The evidence for the risk of seizures following surgery for brain arteriovenous malformations (bAVM) is limited. OBJECTIVE To determine the risk of seizures after discharge from surgery for supratentorial bAVM. METHODS A prospectively collected cohort database of 559 supratentorial bAVM patients (excluding patients where surgery was not performed with the primary intention of treating the bAVM) was analyzed. Cox proportional hazards regression models (Cox regression) were generated to assess risk factors, a receiver operating characteristic (ROC) curve was generated to identify a cut-point for size, and Kaplan–Meier life-table curves were created to identify the cumulative freedom from postoperative seizure. RESULTS A preoperative history of more than 2 seizures and increasing maximum diameter (size, cm) of the bAVM were significantly (P < .01) associated with the development of postoperative seizures and remained significant in the Cox regression (size as a continuous variable: P = .01; hazard ratio: 1.2; 95% confidence interval: 1.0-1.3; more than 2 seizures: P = .02; hazard ratio: 2.1; 95% confidence interval: 1.1-3.8). The cumulative risk of a first seizure after discharge from hospital following resection surgery for all patients with bAVM was 5.8% at 12 months and 18% at 7 years. The 7-year risk of developing postoperative seizures ranged from 11% for patients with bAVM ≤4 cm and 0 to 2 preoperative seizures, to 59% for patients with bAVM >4 cm and more than 2 preoperative seizures. CONCLUSION The risk of seizures after discharge from hospital following surgery for bAVM increases with the maximum diameter of the bAVM and a patient history of more than 2 preoperative seizures.
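The size cut-point could be derived along these lines; a sketch with scikit-learn, where size_cm and seizure are hypothetical arrays and Youden's J is assumed as the cut-point criterion (the abstract does not state which criterion was used).

```python
import numpy as np
from sklearn.metrics import roc_curve

# size_cm: maximum bAVM diameter; seizure: 1 if postoperative seizures occurred
fpr, tpr, thresholds = roc_curve(seizure, size_cm)
j = tpr - fpr                    # Youden's J statistic at each threshold
cut = thresholds[np.argmax(j)]
print(f"size cut-point maximizing Youden's J: {cut:.1f} cm")
```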


2015 ◽  
Vol 22 (8) ◽  
pp. 1086-1093 ◽  
Author(s):  
Saeed Akhtar ◽  
Raed Alroughani ◽  
Samar F Ahmed ◽  
Jasem Y Al-Hashel

Background: The frequency of paediatric-onset multiple sclerosis (POMS) and the precise risk of secondary progression of the disease are largely unknown in the Middle East. This cross-sectional cohort study assessed the risk and examined prognostic factors for time to onset of secondary progressive multiple sclerosis (SPMS) in a cohort of POMS patients. Methods: The Kuwait National MS Registry database was used to identify a cohort of POMS cases (diagnosed at age <18 years) from 1994 to 2013. Data were abstracted from patients' records. A Cox proportional hazards model was used to evaluate the prognostic significance of the variables considered. Results: Of 808 multiple sclerosis (MS) patients, 127 (15.7%) were POMS cases. The median age at disease onset was 16.0 years (range 6.5–17.9). Of the 127 POMS cases, 20 (15.8%) developed SPMS. A multivariable Cox proportional hazards model showed that brainstem involvement at MS onset (adjusted hazard ratio 5.71; 95% confidence interval 1.53–21.30; P=0.010) and age at MS onset (adjusted hazard ratio 1.38; 95% confidence interval 1.01–1.88; P=0.042) were significantly associated with an increased risk of a secondary progressive disease course. Conclusions: This study showed that POMS patients with a brainstem/cerebellar presentation and an older age at MS onset were predisposed to SPMS and warrant an aggressive therapeutic approach.
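A minimal lifelines sketch of the kind of multivariable model reported here; the column names (years_to_spms, spms, brainstem, onset_age) are hypothetical.

```python
from lifelines import CoxPHFitter

# df: one row per POMS case, with follow-up time, SPMS event indicator,
# and the candidate prognostic covariates (brainstem, onset_age, ...)
cph = CoxPHFitter()
cph.fit(df, duration_col="years_to_spms", event_col="spms")

# Adjusted hazard ratios with 95% confidence intervals and p-values
print(cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%", "p"]])
```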


2020 ◽  
Vol 189 (10) ◽  
pp. 1163-1172
Author(s):  
Tracy A Becerra-Culqui ◽  
Darios Getahun ◽  
Vicki Chiu ◽  
Lina S Sy ◽  
Hung Fu Tseng

Abstract As prenatal vaccinations become more prevalent, it is important to assess potential safety events. In a retrospective cohort study of Kaiser Permanente Southern California (Pasadena, California) mother-child pairs with birth dates during January 1, 2011–December 31, 2014, we investigated the association between prenatal tetanus, diphtheria, and acellular pertussis (Tdap) vaccination and risk of attention-deficit/hyperactivity disorder (ADHD) in offspring. Information on Tdap vaccination during pregnancy was obtained from electronic medical records. ADHD was defined by International Classification of Diseases codes (Ninth or Tenth Revision) and dispensed ADHD medication after age 3 years. Children were followed to the date of their first ADHD diagnosis, the end of Kaiser Permanente membership, or the end of follow-up (December 31, 2018). In Cox proportional hazards models, we estimated unadjusted and adjusted hazard ratios for the association between maternal Tdap vaccination and ADHD, with inverse probability of treatment weighting (IPTW) used to adjust for confounding. Of 128,756 eligible mother-child pairs, 85,607 were included in the final sample. The ADHD incidence rate was 3.41 per 1,000 person-years among children of Tdap-vaccinated mothers and 3.93 per 1,000 person-years among children of unvaccinated mothers (hazard ratio = 1.01, 95% confidence interval: 0.88, 1.16). The IPTW-adjusted analyses showed no association between prenatal Tdap vaccination and ADHD in offspring (hazard ratio = 1.00, 95% confidence interval: 0.88, 1.14). In this study, prenatal Tdap vaccination was not associated with ADHD risk in offspring, supporting recommendations to vaccinate pregnant women.
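A sketch of IPTW-weighted Cox estimation in Python (statsmodels + lifelines). The confounder list and column names are hypothetical, and the use of stabilized weights is an assumption rather than the study's stated scheme.

```python
import numpy as np
import statsmodels.api as sm
from lifelines import CoxPHFitter

confounders = ["maternal_age", "parity", "comorbidity_score"]  # hypothetical

# Propensity of prenatal Tdap vaccination given baseline covariates
X = sm.add_constant(df[confounders])
ps = sm.Logit(df["tdap"], X).fit(disp=0).predict(X)

# Stabilized inverse-probability-of-treatment weights
p_treat = df["tdap"].mean()
df["iptw"] = np.where(df["tdap"] == 1, p_treat / ps, (1 - p_treat) / (1 - ps))

# Weighted Cox model with robust (sandwich) standard errors
cph = CoxPHFitter()
cph.fit(df[["years", "adhd", "tdap", "iptw"]], duration_col="years",
        event_col="adhd", weights_col="iptw", robust=True)
print(cph.summary.loc["tdap", ["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]])
```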


2021 ◽  
Vol 16 (8) ◽  
pp. 1178-1189
Author(s):  
Michelle R. Denburg ◽  
Yunwen Xu ◽  
Alison G. Abraham ◽  
Josef Coresh ◽  
Jingsha Chen ◽  
...  

Background and objectives: Metabolomics facilitates the discovery of biomarkers and potential therapeutic targets for CKD progression. Design, setting, participants, & measurements: We evaluated an untargeted metabolomics quantification of stored plasma samples from 645 Chronic Kidney Disease in Children (CKiD) participants. Metabolites were standardized and logarithmically transformed. Cox proportional hazards regression examined the association between 825 nondrug metabolites and progression to the composite outcome of KRT or 50% reduction of eGFR, adjusting for age, sex, race, body mass index, hypertension, glomerular versus nonglomerular diagnosis, proteinuria, and baseline eGFR. Stratified analyses were performed within subgroups of glomerular/nonglomerular diagnosis and baseline eGFR. Results: At baseline, 391 participants (61%) were male; the median age was 12 years; the median eGFR was 54 ml/min per 1.73 m2; and 448 (69%) had a nonglomerular diagnosis. Over a median follow-up of 4.8 years, 209 (32%) participants developed the composite outcome. Unique association signals were identified in subgroups of baseline eGFR. Among participants with baseline eGFR ≥60 ml/min per 1.73 m2, two-fold higher levels of seven metabolites were significantly associated with higher hazards of KRT/halving of eGFR: three metabolites involved in purine and pyrimidine metabolism (N6-carbamoylthreonyladenosine: hazard ratio, 16; 95% confidence interval, 4 to 60; 5,6-dihydrouridine: hazard ratio, 17; 95% confidence interval, 5 to 55; pseudouridine: hazard ratio, 39; 95% confidence interval, 8 to 200); two amino acids (C-glycosyltryptophan: hazard ratio, 24; 95% confidence interval, 6 to 95; lanthionine: hazard ratio, 3; 95% confidence interval, 2 to 5); the tricarboxylic acid cycle intermediate 2-methylcitrate/homocitrate (hazard ratio, 4; 95% confidence interval, 2 to 7); and gulonate (hazard ratio, 10; 95% confidence interval, 3 to 29). Among those with baseline eGFR <60 ml/min per 1.73 m2, a higher level of tetrahydrocortisol sulfate was associated with a lower risk of progression (hazard ratio, 0.8; 95% confidence interval, 0.7 to 0.9). Conclusions: Untargeted plasma metabolomic profiling facilitated the discovery of novel metabolite associations with CKD progression in children that were independent of established clinical predictors and highlighted the role of select biologic pathways.
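The per-metabolite screen might look like the sketch below (lifelines + statsmodels). The data frames and column names are hypothetical, covariate adjustment is abbreviated, log2 scaling is used so each hazard ratio is per two-fold higher level, and BH control is an assumed multiplicity correction rather than the study's stated one.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter
from statsmodels.stats.multitest import multipletests

# clinical: years, progressed, plus adjustment covariates; metabolites: one column each
rows = []
for met in metabolites.columns:
    d = clinical.copy()
    d["met"] = np.log2(metabolites[met])  # HR is then per two-fold higher level
    cph = CoxPHFitter().fit(d, duration_col="years", event_col="progressed")
    rows.append((met, np.exp(cph.params_["met"]), cph.summary.loc["met", "p"]))

res = pd.DataFrame(rows, columns=["metabolite", "hr_per_doubling", "p"])
res["p_adj"] = multipletests(res["p"], method="fdr_bh")[1]  # multiplicity control
print(res.sort_values("p_adj").head(10))
```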


2015 ◽  
Vol 95 (12) ◽  
pp. 1660-1667 ◽  
Author(s):  
A. Williams Andrews ◽  
Dongmei Li ◽  
Janet K. Freburger

Background Little is known about the use of rehabilitation in the acute care setting and its impact on hospital readmissions. Objective The objective of this study was to examine the association between the intensity of rehabilitation services received during the acute care stay for stroke and the risk of 30-day and 90-day hospital readmission. Design A retrospective cohort analysis of all acute care hospitals in Arkansas and Florida was conducted. Methods Patients (N=64,065) who were admitted for an incident stroke in 2009 or 2010 were included. Rehabilitation intensity was categorized as none, low, medium-low, medium-high, or high based on the sum and distribution of physical therapy, occupational therapy, and speech therapy charges within each hospital. Cox proportional hazards regression was used to estimate hazard ratios, controlling for demographic characteristics, illness severity, comorbidities, hospital variables, and state. Results Relative to participants who received the lowest-intensity therapy, those who received higher-intensity therapy had a decreased risk of 30-day readmission. The risk was lowest for the highest-intensity group (hazard ratio=0.86; 95% confidence interval=0.79, 0.93). Individuals who received no therapy were at an increased risk of hospital readmission relative to those who received low-intensity therapy (hazard ratio=1.30; 95% confidence interval=1.22, 1.40). The findings were similar, but with smaller effects, for 90-day readmission. Furthermore, patients who received higher-intensity therapy had more comorbidities and greater illness severity relative to those who received lower-intensity therapy. Limitations The results of the study are limited in scope and generalizability. Also, the study may not have adequately accounted for all potentially important covariates. Conclusions Receipt and intensity of rehabilitation therapy during acute stroke care are associated with a decreased risk of hospital readmission.
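One plausible way to build the intensity categories and estimate the readmission model, sketched in Python (pandas + lifelines). The quartile split of summed therapy charges and all column names are assumptions; the paper's categorization within each hospital is more involved.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Sum of therapy charges per stay; zero charges means no therapy received
pos = df["therapy_charges"] > 0
df["intensity"] = pd.qcut(df.loc[pos, "therapy_charges"], 4,
                          labels=["low", "medium-low", "medium-high", "high"])
df["intensity"] = df["intensity"].cat.add_categories("none").fillna("none")

# Cox model for 30-day readmission, with low intensity as the reference category
X = pd.get_dummies(df[["days_to_readmit", "readmit", "intensity"]],
                   columns=["intensity"], dtype=float).drop(columns="intensity_low")
cph = CoxPHFitter().fit(X, duration_col="days_to_readmit", event_col="readmit")
print(cph.summary["exp(coef)"])
```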


2021 ◽  
pp. 107755872110185
Author(s):  
Megan Shepherd-Banigan ◽  
Valerie A. Smith ◽  
Karen M. Stechuchak ◽  
Courtney H. Van Houtven

Support policies for caregivers improve care-recipient access to care, and the effects may generalize to nonhealth services. Using administrative data from the U.S. Department of Veterans Affairs (VA) for veterans <55 years, we assessed the association between enrollment in a VA caregiver support program and veterans' use of vocational assistance services: the post-9/11 GI Bill, VA vocational rehabilitation and employment (VR&E), and supported employment. We applied instrumental variables to Cox proportional hazards models. Caregiver enrollment in the program increased veterans' use of supported employment (hazard ratio = 1.35, 95% confidence interval [1.14, 1.53]), decreased VR&E use (hazard ratio = 0.84, 95% confidence interval [0.76, 0.92]), and had no effect on use of the post-9/11 GI Bill. Caregiver support policies could increase access to some vocational assistance for individuals with disabilities, particularly supported employment, which is integrated into health care. Limited coordination between health and employment sectors and misaligned incentives may have inhibited effects for the post-9/11 GI Bill and VR&E.
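Instrumental-variable estimation in a Cox model is often done via two-stage residual inclusion (2SRI); the abstract does not say which IV estimator was used, so the sketch below is one plausible approach, with an entirely hypothetical instrument and column names.

```python
import statsmodels.api as sm
from lifelines import CoxPHFitter

covars = ["age", "disability_rating"]          # hypothetical covariates
Z = sm.add_constant(df[["instrument"] + covars])

# Stage 1: model caregiver-program enrollment on the instrument and covariates
stage1 = sm.Logit(df["enrolled"], Z).fit(disp=0)
df["resid1"] = df["enrolled"] - stage1.predict(Z)

# Stage 2: Cox model including the first-stage residual to absorb
# unmeasured confounding between enrollment and service use
cols = ["months", "used_service", "enrolled", "resid1"] + covars
cph = CoxPHFitter().fit(df[cols], duration_col="months", event_col="used_service")
print(cph.summary.loc["enrolled", "exp(coef)"])
```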

