Using Additive and Relative Hazards to Quantify Colorectal Survival Inequalities for Patients with a Severe Psychiatric Illness

Author(s): Alyson L Mahar, Laura E Davis, Paul Kurdyak, Timothy P Hanna, Natalie G Coburn, ...

Introduction: Despite recommendations, most studies examining health inequalities fail to report both absolute and relative summary measures. We examine colorectal cancer (CRC) survival for patients with and without severe psychiatric illness (SPI) to demonstrate the use and importance of relative and absolute effects. Objectives and Approach: We conducted a retrospective cohort study of CRC patients diagnosed between 01/04/2007 and 31/12/2012, using linked administrative databases. SPI was defined as a diagnosis of major depression, bipolar disorder, schizophrenia, or another psychotic illness six months to five years preceding the cancer diagnosis, and was categorized as inpatient, outpatient, or none. Associations between SPI history and risk of death were examined using Cox proportional hazards regression to obtain hazard ratios and Aalen's semi-parametric additive hazards regression to obtain absolute differences. Both models controlled for age, sex, primary tumour location, and rurality. Results: The final cohort included 24,507 CRC patients; 482 had an outpatient SPI history and 258 had an inpatient SPI history. Of patients with an inpatient SPI history, 58.1% died, as did 47.1% of patients with an outpatient SPI history. Patients with an outpatient SPI history had a 40% increased risk of death (HR 1.40, 95% CI 1.22-1.59) and patients with an inpatient SPI history had a 91% increased risk of death (HR 1.91, 95% CI 1.63-2.25), relative to patients with no history of mental illness. After controlling for confounders, an outpatient SPI history was associated with an additional 33 deaths per 1,000 person-years, and an inpatient SPI history with an additional 82 deaths per 1,000 person-years. Conclusion/Implications: We demonstrated that reporting both relative and absolute effects is feasible and that calculating risk differences is relatively simple using Aalen models. We encourage future studies examining inequalities with time-to-event data to use this method and report both relative and absolute effect measures.
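
The pairing of relative (Cox) and absolute (Aalen additive) effect estimates described above can be illustrated with the lifelines library. This is a minimal sketch on synthetic data, not the authors' code; the column names (time, death, spi_outpatient, spi_inpatient, age, sex) are assumptions.

```python
# Sketch: relative (Cox) and absolute (Aalen additive) hazards with lifelines.
# Synthetic data; column names are illustrative, not the study's variables.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter, AalenAdditiveFitter

rng = np.random.default_rng(0)
n = 5000
df = pd.DataFrame({
    "spi_outpatient": rng.binomial(1, 0.02, n),
    "spi_inpatient": rng.binomial(1, 0.01, n),
    "age": rng.normal(65, 10, n),
    "sex": rng.binomial(1, 0.5, n),
})
hazard = 0.05 * np.exp(0.34 * df.spi_outpatient + 0.65 * df.spi_inpatient + 0.02 * (df.age - 65))
t_death = rng.exponential(1 / hazard)
t_censor = rng.uniform(0, 10, n)
df["time"] = np.minimum(t_death, t_censor)
df["death"] = (t_death <= t_censor).astype(int)

# Relative scale: hazard ratios from a Cox proportional hazards model.
cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="death")
print(cph.hazard_ratios_)        # e.g. HR for inpatient SPI vs none

# Absolute scale: Aalen's additive hazards model.  The slope of each cumulative
# coefficient approximates extra events per person-year for that covariate
# (e.g. a slope of ~0.033/year would mean ~33 extra deaths per 1,000 person-years).
aaf = AalenAdditiveFitter(coef_penalizer=0.5)
aaf.fit(df, duration_col="time", event_col="death")
print(aaf.cumulative_hazards_.tail())
```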

2021, Vol 39 (15_suppl), pp. 10045-10045
Author(s): AnnaLynn M. Williams, Jeanne S. Mandelblatt, Mingjuan Wang, Kirsten K. Ness, Gregory T. Armstrong, ...

10045 Background: Survivors of childhood cancer have functional limitations and health-related morbidity consistent with an accelerated aging phenotype. We characterized aging using a Deficit Accumulation Index (DAI), which examines the accumulation of multiple aging-related deficits readily available from medical records and self-report. DAIs are used as surrogates of biologic aging and are validated to predict mortality in adult cancer patients. Methods: We included childhood cancer survivors (N = 3,758; mean age 30 [SD 8] years; 22 [9] years post diagnosis; 52% male) and community controls (N = 575; mean age 34 [10] years; 44% male) who completed clinical assessments and questionnaires and who were followed for mortality through December 31, 2018 (mean follow-up 6.1 [3.1] years). Using the initial SJLIFE clinical assessment, a DAI score was generated as the proportion of deficits out of 44 items related to aging, including chronic conditions (e.g. hearing loss, hypertension), psychosocial and physical function, and activities of daily living. The total score ranged from 0 to 1; scores > 0.20 are robust, while moderate and large clinically meaningful differences are 0.02 and 0.06, respectively. Linear regression compared the DAI in survivors and controls with an age*survivor/control interaction and examined treatment associations in survivors. Cox proportional hazards models estimated the risk of death associated with the DAI. All models were adjusted for age, sex, and race. Results: The mean [SD] DAI was 0.17 [0.11] for survivors and 0.10 [0.08] for controls; 32% of survivors had a DAI above the 90th percentile of the control distribution (p < 0.001). After adjustment for covariates, survivors had a statistically and clinically meaningfully higher DAI score than controls (β = 0.072, 95% CI 0.062-0.081; p < 0.001). When plotted against age, the adjusted DAI at the average age of survivors (30 years) was 0.166 (95% CI 0.160-0.171), which corresponded to 60 years of age in controls, suggesting premature aging of 30 years. The mean difference in DAI between survivors and controls increased with age, from 0.06 (95% CI 0.04-0.07) at age 20 to 0.11 (95% CI 0.08-0.13) at age 60, consistent with an accelerated aging phenotype (p = 0.014). Cranial radiation, abdominal radiation, cyclophosphamide, platinum agents, neurosurgery, and amputation were each associated with a higher DAI (all p ≤ 0.001). Among survivors, a 0.06 increase in DAI was associated with a 41% increased risk of all-cause mortality (HR 1.41, 95% CI 1.32-1.50; p < 0.001). Conclusions: Survivors of childhood cancer experience significant age acceleration that is associated with an increased risk of mortality; longitudinal analyses are underway to validate these findings. Given the ease of estimating a DAI, it may be a feasible method to quickly identify survivors for novel and tailored interventions that can improve health and prevent premature mortality.
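
As a rough illustration of how a deficit accumulation index of this kind is constructed and modelled, the sketch below computes a DAI as the proportion of 44 binary deficit items present and fits an age-by-group interaction. It uses synthetic data and assumed variable names (deficit items, age, survivor), not the SJLIFE variables.

```python
# Sketch: deficit accumulation index (DAI) as a proportion of deficits, plus an
# age * group interaction test for accelerated aging.  Synthetic data only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 4000
survivor = rng.binomial(1, 0.85, n)
age = rng.normal(31, 9, n)

# 44 binary deficit items; survivors accumulate deficits at a higher rate.
p_deficit = np.clip(0.05 + 0.002 * (age - 30) + 0.07 * survivor, 0.01, 0.95)
deficits = rng.binomial(1, p_deficit[:, None], size=(n, 44))

df = pd.DataFrame({"survivor": survivor, "age": age})
df["dai"] = deficits.mean(axis=1)   # proportion of the 44 deficits present

# Linear model with an age * survivor interaction: a positive interaction term
# means the survivor-control gap in DAI widens with age.
model = smf.ols("dai ~ age * survivor", data=df).fit()
print(model.params)
print(model.conf_int())
```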


Viruses, 2018, Vol 10 (11), pp. 658
Author(s): Paul Blair, Maryam Keshtkar-Jahromi, Kevin Psoter, Ronald Reisler, Travis Warren, ...

The Angola variant (MARV/Ang) has replaced the Mt. Elgon variant Musoke isolate (MARV/MtE-Mus) as the consensus standard variant for Marburg virus research and is regarded as causing a more aggressive disease phenotype in animal models; however, there is a dearth of published evidence supporting the higher virulence of MARV/Ang. In this retrospective study, we used data pooled from eight separate studies of nonhuman primates experimentally exposed to 1000 pfu of either intramuscular (IM) MARV/Ang or MARV/MtE-Mus between 2012 and 2017 at the United States Army Medical Research Institute of Infectious Diseases (USAMRIID). Multivariable Cox proportional hazards regression was used to evaluate the association of variant type with time to death, the development of anorexia, rash, viremia, and 10 select clinical laboratory values. A total of 47 cynomolgus monkeys were included, of which 18 were exposed to MARV/Ang in three separate studies and 29 to MARV/MtE-Mus in five studies. Following universally fatal Marburg virus exposure, MARV/Ang was associated with an increased risk of death (HR = 22.10; 95% CI: 7.08, 68.93), rash (HR = 5.87; 95% CI: 2.76, 12.51), and loss of appetite (HR = 35.10; 95% CI: 7.60, 162.18) compared with MARV/MtE-Mus. Our data demonstrate increased virulence of MARV/Ang compared with the MARV/MtE-Mus variant in the 1000 pfu IM cynomolgus macaque model.
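
Fitting a separate Cox model per time-to-event endpoint for a single binary exposure, as described above, can be sketched as follows. The data are synthetic and the column names are assumptions, not the USAMRIID datasets.

```python
# Sketch: Cox models for several time-to-event endpoints with a binary exposure
# (virus variant).  Synthetic data; all events observed in this toy example.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 47
variant_ang = rng.binomial(1, 18 / 47, n)   # 1 = MARV/Ang, 0 = MARV/MtE-Mus

def simulate_endpoint(log_hr, base_rate):
    """Exponential event times with a higher hazard for MARV/Ang."""
    rate = base_rate * np.exp(log_hr * variant_ang)
    return rng.exponential(1 / rate)

df = pd.DataFrame({"variant_ang": variant_ang})
for endpoint, log_hr in [("death", 1.5), ("rash", 0.9), ("anorexia", 1.8)]:
    df[f"time_{endpoint}"] = simulate_endpoint(log_hr, base_rate=0.1)
    df[f"event_{endpoint}"] = 1          # every animal reaches the endpoint here

for endpoint in ["death", "rash", "anorexia"]:
    cph = CoxPHFitter()
    cols = ["variant_ang", f"time_{endpoint}", f"event_{endpoint}"]
    cph.fit(df[cols], duration_col=f"time_{endpoint}", event_col=f"event_{endpoint}")
    print(endpoint, "HR:", cph.hazard_ratios_["variant_ang"])
    print(np.exp(cph.confidence_intervals_))   # 95% CI on the hazard-ratio scale
```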


2009, Vol 27 (15_suppl), pp. 1069-1069
Author(s): D. Sartori, M. Bari, G. L. Pappagallo, F. Rosetti, S. Olsen, ...

1069 Background: Ten to 15% of patients (pts) with breast cancer will be diagnosed with central nervous system (CNS) metastases, and autopsy series suggest that up to 30% of pts have evidence of CNS disease at the time of death. The identification of factors that may predispose to CNS metastasis may lead to earlier detection and possibly to improvements in disease management. Methods: Breast cancer pts with CNS metastases were identified within a database of 1,300 breast cancer diagnoses from 1995 to 2007 at the Department of Oncology, Azienda ULSS 13 VE. Pathologic features of tumor samples were examined using standard immunohistochemical assays. Results: Fifty-one pts with CNS metastases were identified. Median age at primary breast cancer diagnosis was 49 years (range, 28-78); median time to CNS metastases was 45 months (range, 3-244). HER2 overexpression was found in tumors from 25 pts (49.0%); 23 pts had tumors lacking overexpression of HER2, estrogen receptors (ER), and progesterone receptors (PgR) (ie, "triple-negative" disease). Overexpression of p53 (at least 20% of tumor cells positive), Ki67 (at least 20%), and BCL2 (at least 30%) was detected in tumors from 16 pts (31.4%), 32 pts (62.7%), and 14 pts (27.5%), respectively. Median survival from CNS involvement was 3.67 months (95% CI 2.05-5.28), with 24.4% and 15.3% of patients estimated to be alive at 12 and 24 months, respectively (Kaplan-Meier product limit method). A Cox proportional hazards analysis found that Ki67 overexpression was the only factor independently associated with a significantly increased risk of death (2.7-fold increase, p=0.028), while triple-negative status was associated with a 1.8-fold increase in the risk of death (p=0.08) (Table). Conclusions: In our series of breast cancer pts with CNS metastases, nearly all had either HER2 overexpression or triple-negative disease. Pts whose tumors had higher proliferative indices, as assessed by Ki67, had the poorest prognosis. [Table: see text]
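
The Kaplan-Meier product limit quantities reported above (median survival and the estimated proportion alive at fixed time points) can be obtained as in the sketch below; the data are synthetic and not the study cohort.

```python
# Sketch: Kaplan-Meier median survival and survival probabilities at fixed
# time points.  Synthetic follow-up data in months.
import numpy as np
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(3)
n = 51
months_to_death = rng.exponential(5.3, n)      # short survival after CNS involvement
months_followup = rng.uniform(1, 36, n)
durations = np.minimum(months_to_death, months_followup)
observed = (months_to_death <= months_followup).astype(int)

kmf = KaplanMeierFitter()
kmf.fit(durations, event_observed=observed, label="CNS metastases")
print("median survival (months):", kmf.median_survival_time_)
print(kmf.survival_function_at_times([12, 24]))   # est. proportion alive at 12 and 24 months
```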


BMJ Open, 2021, Vol 11 (12), pp. e050051
Author(s): Arthur W Wallace, Piera M Cirillo, James C Ryan, Nickilou Y Krigbaum, Anusha Badathala, ...

Objectives: SARS-CoV-2 enters cells using the ACE2 receptor. Medications that affect ACE2 expression or function, such as angiotensin receptor blockers (ARBs), ACE inhibitors (ACE-I) and metformin, have the potential to counter the dysregulation of ACE2 by the virus and protect against viral injury. Here, we describe COVID-19 survival associated with ACE-I, ARB and metformin use. Design: This is a hospital-based observational study of patients with COVID-19 infection using logistic regression with correction for pre-existing conditions and propensity score weighted Cox proportional hazards models to estimate associations between medication use and mortality. Setting: Medical record data from the US Veterans Affairs (VA) were used to identify patients with a reverse transcription PCR diagnosis of COVID-19 infection, to classify patterns of ACE-I, ARB, beta blocker, metformin, famotidine and remdesivir use, and to capture mortality. Participants: 9532 hospitalised patients with COVID-19 infection followed for 60 days were analysed. Outcome measure: Death from any cause within 60 days of COVID-19 diagnosis was examined. Results: Discontinuation of ACE-I was associated with increased risk of death (OR: 1.4; 95% CI 1.2-1.7). Initiating (OR: 0.3; 95% CI 0.2-0.5) or continuing (OR: 0.6; 95% CI 0.5-0.7) ACE-I was associated with reduced risk of death. ARB and metformin associations were similar in direction and magnitude and also statistically significant. Results were unchanged when accounting for pre-existing morbidity and propensity score adjustment. Conclusions: Recent randomised clinical trials support the safety of continuing ACE-I and ARB treatment in patients with COVID-19 where indicated. Our study extends these findings to suggest a possible COVID-19 survival benefit for continuing or initiating ACE-I, ARB and metformin medications. Randomised trials are appropriate to confirm or refute the therapeutic potential of ACE-I, ARBs and metformin.
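
A propensity-score-weighted Cox model of the general kind described in the design can be sketched as below. This is a simplified inverse-probability-of-treatment weighting example on synthetic data with assumed column names; it is not the VA cohort or the authors' analysis pipeline.

```python
# Sketch: propensity scores from logistic regression feeding stabilized IPT
# weights into a weighted Cox model for 60-day mortality.  Synthetic data.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from lifelines import CoxPHFitter

rng = np.random.default_rng(4)
n = 9532
age = rng.normal(68, 12, n)
diabetes = rng.binomial(1, 0.35, n)
p_treat = 1 / (1 + np.exp(-(-1.0 + 0.02 * (age - 68) + 0.5 * diabetes)))
ace_i = rng.binomial(1, p_treat)               # 1 = ACE-I use

# Propensity score: P(ACE-I | covariates), then stabilized IPT weights.
X = np.column_stack([age, diabetes])
ps = LogisticRegression(max_iter=1000).fit(X, ace_i).predict_proba(X)[:, 1]
marginal = ace_i.mean()
weights = np.where(ace_i == 1, marginal / ps, (1 - marginal) / (1 - ps))

# Outcome: death within 60 days, modelled with a weighted Cox fit (robust SEs).
rate = 0.002 * np.exp(-0.4 * ace_i + 0.03 * (age - 68))
t_death = rng.exponential(1 / rate)
df = pd.DataFrame({
    "ace_i": ace_i,
    "time": np.minimum(t_death, 60),
    "death": (t_death <= 60).astype(int),
    "w": weights,
})
cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="death", weights_col="w", robust=True)
print(cph.hazard_ratios_["ace_i"])
```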


2009, Vol 27 (15_suppl), pp. 9625-9625
Author(s): J. A. Berlin, P. J. Bowers, S. Rao, S. Sun, K. Liu, ...

9625 Background: When cancer patients (pts) with chemotherapy-induced anemia (CIA) respond to erythropoiesis-stimulating agents (ESAs), hemoglobin (Hb) typically increases within 4-8 weeks. This exploratory analysis examined whether mortality differs depending on Hb response after 4 or 8 weeks of epoetin alfa (EPO) treatment or depending on transfusion. Methods: Pt-level data were analyzed from 31 randomized studies (7,215 pts) of epoetin alfa vs non-EPO (15 studies) or placebo (16 studies) in pts with CIA. A landmark analysis was used: Hb change was assessed at a fixed time (4 and 8 weeks) and subsequent survival was examined separately for EPO and placebo. Pts were categorized as "Hb increased" (rise >0.5 g/dL), "Hb decreased" (decline >0.5 g/dL), or "Hb stable" (within ±0.5 g/dL) compared with baseline. Hb stable was compared with the other Hb change categories using Cox proportional hazards models, stratified by study and adjusted for potential confounders. Results: The hazard ratio (HR) for Hb decreased versus Hb stable at 4 weeks was 1.44 for EPO (95% CI: 1.04, 1.99), indicating worse survival for pts with a decline in Hb. This association was weaker for placebo (HR: 1.12; 95% CI: 0.74, 1.67). The increased risk with declining Hb in EPO-treated pts was most pronounced in studies that maintained Hb ≥12 g/dL or treated pts for >12-16 weeks (1,876 pts). Patterns were similar using the 8-week landmark. In both EPO-treated and placebo pts, transfusion increased the rate of on-study death approximately 3.5-fold (treating transfusion as a time-dependent variable). Conclusions: These exploratory findings suggest that both decreased Hb after 4 or 8 weeks of EPO treatment and transfusion are associated with increased risk of death. Despite adjustment for other prognostic factors, this association likely reflects the poorer underlying prognosis of pts whose Hb fails to respond. ESAs should be discontinued in the absence of a Hb response. [Table: see text]
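
A landmark analysis of the type described above conditions on being alive and on study at the landmark, classifies the exposure (here, Hb change) at that point, and then models subsequent survival. The sketch below uses synthetic data and assumed column names, with a Cox model stratified by study.

```python
# Sketch of a landmark analysis: classify Hb change at a 4-week landmark, keep
# only patients still on study at the landmark, then fit a study-stratified Cox
# model for survival measured from the landmark.  Synthetic data only.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(5)
n = 7215
df = pd.DataFrame({
    "study_id": rng.integers(1, 32, n),
    "hb_change_wk4": rng.normal(0.2, 1.0, n),    # g/dL change from baseline
    "time_weeks": rng.exponential(40, n),
    "death": rng.binomial(1, 0.6, n),
})

LANDMARK = 4  # weeks
lm = df[df["time_weeks"] > LANDMARK].copy()       # alive and on study at week 4
lm["time_from_landmark"] = lm["time_weeks"] - LANDMARK
lm["hb_decreased"] = (lm["hb_change_wk4"] < -0.5).astype(int)
lm["hb_increased"] = (lm["hb_change_wk4"] > 0.5).astype(int)   # reference: Hb stable

cph = CoxPHFitter()
cph.fit(lm[["hb_decreased", "hb_increased", "time_from_landmark", "death", "study_id"]],
        duration_col="time_from_landmark", event_col="death", strata=["study_id"])
print(cph.hazard_ratios_)     # e.g. HR for "Hb decreased" vs "Hb stable"
```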


Neurology, 2021, doi: 10.1212/WNL.0000000000012483
Author(s): Emily L. Johnson, Gregory L. Krauss, Anna Kucharska-Newton, Alice D. Lam, Rani Sarkis, ...

Objective: To determine the risk of mortality and causes of death in persons with late-onset epilepsy (LOE) compared with those without epilepsy in a community-based sample, adjusting for demographics and comorbid conditions. Methods: This is an analysis of the prospective Atherosclerosis Risk in Communities (ARIC) study, initiated in 1987-1989 among 15,792 mostly black and white men and women in 4 U.S. communities. We used Centers for Medicare & Medicaid Services fee-for-service claims codes to identify cases of incident epilepsy starting at or after age 67. We used Cox proportional hazards analysis to estimate the hazard of mortality associated with LOE, adjusting for demographics and vascular risk factors. We used death certificate data to identify dates and causes of death. Results: Analyses included 9090 participants, of whom 678 developed LOE during a median 11.5 years of follow-up after age 67. Participants who developed LOE were at an increased hazard of mortality compared with those who did not, with an adjusted hazard ratio of 2.39 (95% CI 2.12-2.71). We observed excess mortality due to stroke, dementia, neurologic conditions, and end-stage renal disease in participants with LOE compared with those without. Only 4 deaths (1.1%) were directly attributed to seizure-related causes. Conclusions: Persons who develop LOE are at increased risk of death compared with those without epilepsy, even after adjusting for comorbidities. The majority of this excess mortality is due to stroke and dementia.


2021, Vol 39 (15_suppl), pp. e18577-e18577
Author(s): Christopher Noel, Antoine Eskander, Rinku Sutradhar, Alyson Mahar, Simone Vigod, ...

e18577 Background: Psychological distress is a key construct of patient-centred cancer care. While an increased risk of suicide for cancer patients has been reported, more frequent consequences of distress after a cancer diagnosis, such as non-fatal self-injury (NFSI), remain largely unknown. We examined the risk of NFSI after a cancer diagnosis. Methods: Using linked administrative databases, we identified adults diagnosed with cancer between 2007 and 2019. The cumulative incidence of NFSI, defined as an emergency department presentation for self-injury, was computed accounting for the competing risk of death from all causes. Factors associated with NFSI were assessed using multivariable Fine and Gray models. Results: Of 806,910 included patients, 2,482 had NFSI and 182 died by suicide. The 5-year cumulative incidence of NFSI was 0.27% (95% CI 0.25-0.28%). After adjusting for key confounders, prior severe psychiatric illness, whether requiring inpatient care (subdistribution hazard ratio [sHR] 12.6, 95% CI 10.5-15.2) or outpatient care (sHR 7.5, 95% CI 6.48-8.84), and prior self-injury (sHR 6.6, 95% CI 5.5-8.0) were associated with increased risk of NFSI. Young adults (age 18-39) had the highest NFSI rates relative to individuals >70 (sHR 5.4, 95% CI 4.5-6.5). The magnitude of the association between prior severe psychiatric illness and NFSI was greatest for young adults (interaction term p < 0.01). Certain cancer subsites were also associated with increased risk, including head and neck (sHR 1.52, 95% CI 1.19-1.93). Conclusions: Patients with cancer have a higher incidence of NFSI than of suicide after diagnosis. Younger age, prior severe psychiatric illness, and prior self-injury were independently associated with NFSI. These exposures act synergistically, placing young adults with a prior mental health history at greatest risk of NFSI events. These factors should be used to identify at-risk patients for psychosocial assessment and intervention.
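
Estimating a cumulative incidence that accounts for a competing risk of death, as described above, can be sketched with the Aalen-Johansen estimator in lifelines (Fine and Gray subdistribution hazard regression, as used in the abstract, is more commonly fit with R's cmprsk package). The data and event coding below are synthetic assumptions.

```python
# Sketch: cumulative incidence of non-fatal self-injury (NFSI) with death from
# any cause as a competing risk, via the Aalen-Johansen estimator.
import numpy as np
from lifelines import AalenJohansenFitter

rng = np.random.default_rng(6)
n = 50000
t_nfsi = rng.exponential(400, n)     # rare event of interest
t_death = rng.exponential(15, n)     # competing risk
t_censor = rng.uniform(0, 12, n)     # administrative censoring (years)

time = np.minimum.reduce([t_nfsi, t_death, t_censor])
# Event coding: 0 = censored, 1 = NFSI (event of interest), 2 = death (competing)
event = np.select([t_nfsi == time, t_death == time], [1, 2], default=0)

ajf = AalenJohansenFitter()
ajf.fit(time, event, event_of_interest=1)
cif = ajf.cumulative_density_        # cumulative incidence function of NFSI
print(cif[cif.index <= 5].tail(1))   # e.g. 5-year cumulative incidence
```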


Circulation, 2018, Vol 137 (suppl_1)
Author(s): Matteo Fabbri, Sheila Manemann, Cynthia Boyd, Jennifer Wolff, Alanna Chamberlain, ...

Introduction: Little is known about the characteristics and resources that enable patients with heart failure (HF) to engage in effective self-management. To address this gap in knowledge, we measured personal and health care resources for self-management and examined associations with mortality among patients with HF. Methods: We surveyed 5543 residents of 11 counties in Southeast Minnesota with a first-ever code for HF (International Classification of Diseases, Ninth Revision code 428 or Tenth Revision code I50) between 1/1/2013 and 3/31/2016. Self-management resources were measured with the health care and personal subscales of the Chronic Illness Resources Survey (CIRS), each of which included 3 questions on a 5-point scale. The responses were averaged and participants were categorized as low if the mean score was below the median of the distribution (range 1 to 5). The survey was returned by 2866 participants (response rate 52%), and those with complete data on the main items of interest were retained for analysis (N=2212). Cox proportional hazards regression was used to determine the association between each subscale and mortality. Results: Among the 2212 participants (mean age 72.8 years, 54.1% men), the median health care score was 4 and the median personal score was 3. Those with low health care resources were older and less educated than those with a higher score (p<0.05), while those with low personal resources had fewer comorbidities and lower educational attainment compared with those with a higher score (p<0.05). After a mean (SD) follow-up of 1.3 (0.6) years, 207 deaths occurred. Low levels of both self-management resources were associated with an increased risk of death compared with high levels (Table). Conclusions: Having limited self-management resources is associated with an increased risk of mortality among patients with HF. Thus, interventions aimed at supporting self-management among patients with HF may improve outcomes.
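
The scoring step described above (average three 5-point items, then dichotomize at the sample median) and its link to mortality can be sketched as follows; the data are synthetic and the item names are illustrative, not the CIRS instrument.

```python
# Sketch: average a 3-item subscale (1-5 Likert), dichotomize at the median,
# and relate "low resources" to mortality with a Cox model.  Synthetic data.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(7)
n = 2212
items = rng.integers(1, 6, size=(n, 3))        # three items scored 1-5
score = items.mean(axis=1)

df = pd.DataFrame({
    "low_resources": (score < np.median(score)).astype(int),
    "age": rng.normal(72.8, 10, n),
    "time_years": rng.exponential(5, n).clip(max=3.5),
    "death": rng.binomial(1, 0.1, n),
})
cph = CoxPHFitter()
cph.fit(df, duration_col="time_years", event_col="death")
print(cph.hazard_ratios_["low_resources"])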


Circulation, 2013, Vol 127 (suppl_12)
Author(s): Sheila M McNallan, Yariv Gerber, Susan A Weston, Jill Killian, Shannon M Dunlay, ...

Background: Contemporary data on survival after incident acute coronary syndrome (ACS), including both myocardial infarction (MI) and unstable angina (UA), are limited. Objective: To describe survival after incident ACS, to determine whether it differs by ACS type (MI or UA), and to determine whether it has improved over time. Methods: Olmsted County, MN residents hospitalized between 1/1/2005 and 12/31/2010 were screened for incident ACS. ACS was defined as either MI validated by standard epidemiological criteria or UA validated by the Braunwald classification. Patients were followed for death from any cause. Cox proportional hazards regression was used to determine whether survival differed by ACS type, adjusting for year of diagnosis, age, sex and comorbidities. Results: Among 1,160 incident ACS cases (mean±SD age 66.9±14.8 years, 60% male), 35% were UA and 65% were MI. After a mean (SD) follow-up of 3.7 (2.1) years, 274 deaths occurred. The 3-year Kaplan-Meier survival estimate was 79.6% (95% CI: 76.7%-82.6%) for MI and 84.9% (95% CI: 81.3%-88.6%) for UA (log-rank p=0.011). The association of ACS type with survival differed by age (p=0.056). After adjustment for year of diagnosis, sex and comorbidities, no difference in survival was observed between ACS types among those aged <60 (HR for MI vs. UA: 0.64, 95% CI: 0.29-1.42). By contrast, among patients aged 60-79, those with an MI had twice the risk of death of those with UA (HR: 2.04, 95% CI: 1.24-3.37). Patients aged 80 or older who had an MI had a 40% increased risk of death compared with patients of the same age who had UA (HR: 1.42, 95% CI: 1.02-1.98). There was no difference in survival over time (HR for 2010 vs. 2005: 0.91, 95% CI: 0.61-1.36). Conclusions: Survival did not differ between UA and MI patients younger than 60; however, among patients 60 or older, survival was worse for those with an MI. Survival after ACS did not change over the study period.
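
Comparing Kaplan-Meier survival between two ACS types and testing the difference with a log-rank statistic, as reported above, can be sketched as follows on synthetic data (the group sizes and rates are illustrative, not the Olmsted County cohort).

```python
# Sketch: 3-year Kaplan-Meier survival by ACS type plus a log-rank comparison.
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(8)

def simulate(n, rate):
    """Exponential death times with uniform censoring over ~7 years of follow-up."""
    t_event = rng.exponential(1 / rate, n)
    t_censor = rng.uniform(0, 7, n)
    return np.minimum(t_event, t_censor), (t_event <= t_censor).astype(int)

t_mi, e_mi = simulate(754, 0.075)    # MI: higher mortality rate in this toy example
t_ua, e_ua = simulate(406, 0.050)    # UA

for label, t, e in [("MI", t_mi, e_mi), ("UA", t_ua, e_ua)]:
    kmf = KaplanMeierFitter().fit(t, event_observed=e, label=label)
    print(label, "3-year survival:", kmf.survival_function_at_times(3).iloc[0])

result = logrank_test(t_mi, t_ua, event_observed_A=e_mi, event_observed_B=e_ua)
print("log-rank p-value:", result.p_value)
```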


Circulation, 2008, Vol 118 (suppl_18)
Author(s): Tajinder P Singh, Michael Givertz, Marc Semigran, Fred Costantino, David DeNofrio, ...

Race has been associated with patient survival in heart transplant (HT) recipients. We hypothesized that this association is mediated by socioeconomic (SE) factors. Block groups (average population 1000) are the smallest units of population in the US census database with available SE data. We assessed whether SE position, determined for the block group of patient residence at the time of HT, is associated with graft failure in HT recipients. We used the US census 2000 database to extract 6 SE variables of wealth, income, education and occupation and calculated a previously validated summary SE score for 520 patients who underwent a first HT at one of 4 Boston centers during 1996-2005. Cox proportional hazards modeling was used to compare the risk of graft failure (time to death or re-transplant) in the lowest SE quartile (low SE group, n = 129) with that of the remaining patients (controls, n = 391). Low SE patients were younger (median age 48 yrs versus 52 yrs for controls, P < 0.015) and more likely to be nonwhite (32% versus 9% of controls, P = 0.001). The two SE groups were similar with respect to gender, listing status, cardiac diagnosis, hemodynamic support, year of HT and prevalence of diabetes, smoking and hypertension. Graft failure occurred in 142 HT patients (135 deaths, 7 re-transplants). Early graft failure (within 6 months) was associated with earlier era (before 2001) and pre-HT extracorporeal membrane oxygenation or ventricular assist device, but not with age, cardiac diagnosis, race or low SE position. In patients who survived at least 6 months (conditional survival), nonwhite race (HR 2.0, 95% CI 1.3-3.3, P = 0.004), low SE position (HR 1.6, CI 1.1-2.5, P = 0.03) and pre-HT smoking (HR 1.6, CI 1.0-2.4, P = 0.03) were all associated with subsequent graft failure. In multivariable analysis, nonwhite race (HR 1.9, CI 1.2-3.3) remained a significant predictor of late graft failure after controlling for low SE position (HR 1.3, CI 0.8-2.1, P = 0.24) and pre-HT smoking (HR 1.6, CI 1.1-2.5, P = 0.03). There was no association of early post-HT survival with race or low SE position. Nonwhite race is associated with an increased risk of death or re-transplant beyond 6 months after HT. Low SE position explains only a small fraction of this association.
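
A conditional survival analysis of the kind described above restricts the cohort to recipients whose grafts survived the first 6 months and measures follow-up from that point before fitting the multivariable Cox model. The sketch below uses synthetic data; the column names (nonwhite, low_se, smoking) are assumptions, not the study variables.

```python
# Sketch: conditional survival beyond 6 months with a multivariable Cox model
# for late graft failure (death or re-transplant).  Synthetic data only.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(9)
n = 520
df = pd.DataFrame({
    "nonwhite": rng.binomial(1, 0.15, n),
    "low_se": rng.binomial(1, 0.25, n),    # lowest summary-SE-score quartile
    "smoking": rng.binomial(1, 0.30, n),
})
rate = 0.03 * np.exp(0.65 * df.nonwhite + 0.25 * df.low_se + 0.45 * df.smoking)
t_fail = rng.exponential(1 / rate)
t_censor = rng.uniform(0.5, 10, n)
df["years"] = np.minimum(t_fail, t_censor)
df["graft_failure"] = (t_fail <= t_censor).astype(int)

# Condition on 6-month graft survival and measure time from that landmark.
cond = df[df["years"] > 0.5].copy()
cond["years_from_6mo"] = cond["years"] - 0.5

cph = CoxPHFitter()
cph.fit(cond[["nonwhite", "low_se", "smoking", "years_from_6mo", "graft_failure"]],
        duration_col="years_from_6mo", event_col="graft_failure")
print(cph.hazard_ratios_)
```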

