Abstract 13790: Area Deprivation Index and Cardiac Readmissions: Evaluating Risk-prediction in an Electronic Health Record

Circulation ◽  
2020 ◽  
Vol 142 (Suppl_3) ◽  
Author(s):  
Amber E Johnson ◽  
Jianhui Zhu ◽  
William Garrard ◽  
Floyd W Thoma ◽  
Suresh R Mulukutla ◽  
...  

Introduction: Social determinants of health (SDOH) affect cardiovascular outcomes. Our objective was to examine the association between neighborhood-level SDOH and risk of readmission and mortality for atrial fibrillation (AF), heart failure (HF), and myocardial ischemia (MI). Hypothesis: We hypothesized that neighborhood-level SDOH 1) are associated with readmission and death, and 2) significantly contribute to risk prediction. Methods: We conducted a retrospective analysis from a regional health center to test the predictive ability of the area deprivation index (ADI) on 30-day and one-year readmission and mortality among hospitalized patients. Covariates included age, sex, race, comorbidity, number of medications, length of stay, and insurance. We used Cox proportional hazards and log-rank analyses. To evaluate the discriminative power of the prediction models, we used the C-statistic and net reclassification methods after adding ADI to the model. Results: Our cohort of 31,633 adult patients was followed for 49.88 months (interquartile range 47.99). The cohort was 51.4% male and 90.6% white. Almost half (15,543, 49.1%) lived in neighborhoods in the two highest (worst) ADI quintiles. Patients living in areas with the highest ADI were 28% less likely to be admitted within one year of index AF admission (HR=0.72, 95% CI [0.59, 0.87]). Patients in the highest ADI quintile were 30% more likely to be admitted within one year of index HF admission (HR=1.30, 95% CI [1.11, 1.52]). For MI, patients in the highest ADI quintile had a one-year risk of readmission twice that of those in the lowest quintile (HR=2.08, 95% CI [1.53, 2.81]). As ADI increased, risk of cardiac readmission and all-cause readmission increased at 30 days and one year. Reclassification of readmission risk was significantly improved when ADI was included in the models. Patients in the highest ADI quintile were 46% more likely to die within 30 days than those in the lowest quintile (HR=1.46, 95% CI [1.04, 2.06]) and 26% more likely to die within a year (HR=1.26, 95% CI [1.10, 1.43]). Conclusions: Residence in disadvantaged communities predicts rehospitalization and mortality. ADI can be used to identify vulnerable cardiac patients after discharge.
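The workhorse model in this abstract is a Cox proportional hazards regression with ADI quintiles as the exposure. A minimal sketch of that setup in Python with lifelines (not the authors' code; the DataFrame `df` and all column names are assumptions):

```python
from lifelines import CoxPHFitter

# df: assumed DataFrame with one row per patient, containing
# months_to_event, readmitted (1 = readmitted, 0 = censored),
# adi_quintile (1 = least deprived ... 5 = most deprived), and covariates.
cph = CoxPHFitter()
cph.fit(
    df,
    duration_col="months_to_event",
    event_col="readmitted",
    # Quintile 1 is the reference under default treatment coding;
    # covariates mirror the list in the abstract.
    formula="C(adi_quintile) + age + C(sex) + C(race) + comorbidity_index "
            "+ n_medications + length_of_stay + C(insurance)",
)
print(cph.hazard_ratios_)  # HRs for quintiles 2-5 vs. quintile 1, plus covariates
cph.print_summary()        # coefficients, 95% CIs, and the model's concordance
```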

Author(s):  
Amber E. Johnson ◽  
Jianhui Zhu ◽  
William Garrard ◽  
Floyd W. Thoma ◽  
Suresh Mulukutla ◽  
...  

Background Assessment of the social determinants of post-hospital cardiac care is needed. We examined the association between a neighborhood-level determinant, the area deprivation index (ADI), and readmission risk and mortality for heart failure, myocardial ischemia, and atrial fibrillation, along with its predictive ability. Methods and Results Using a retrospective (January 1, 2011–December 31, 2018) analysis of a large healthcare system, we assessed the predictive ability of ADI on 30-day and 1-year readmission and mortality following hospitalization. Cox proportional hazards models analyzed time to event. Log-rank analyses determined survival. The C-statistic and net reclassification index determined the models' discriminative power. Covariates included age, sex, race, comorbidity, number of medications, length of stay, and insurance. The cohort (n=27 694) had a median follow-up of 46.5 months. There were 14 469 (52.2%) men and 25 219 (91.1%) White patients. Patients in the highest ADI quintile (versus lowest) were more likely to be admitted within 1 year of index heart failure admission (hazard ratio [HR], 1.25; 95% CI, 1.03‒1.51). Patients with myocardial ischemia in the highest ADI quintile were twice as likely to be readmitted at 1 year (HR, 2.04; 95% CI, 1.44‒2.91). Patients with atrial fibrillation living in areas with the highest ADI were less likely to be admitted within 1 year (HR, 0.79; 95% CI, 0.65‒0.95). As ADI increased, risk of readmission increased, and risk reclassification was improved with ADI in the models. Patients in the highest ADI quintile were 25% more likely to die within a year (HR, 1.25; 95% CI, 1.08‒1.44). Conclusions Residence in socioeconomically disadvantaged communities predicts rehospitalization and mortality. Measuring neighborhood deprivation can identify individuals at risk following cardiac hospitalization.
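The two evaluation tools named here, the C-statistic and the net reclassification index (NRI), can be computed from predicted risks with and without ADI. A hedged sketch, assuming arrays `time`, `event` (1 = readmitted), and predicted 1-year risks `risk_base`/`risk_adi`, with illustrative risk bands:

```python
import numpy as np
from lifelines.utils import concordance_index

# C-statistic: concordance between predicted risk and observed time-to-event.
# Higher risk should pair with shorter time, hence the minus sign.
c_base = concordance_index(time, -risk_base, event)
c_adi = concordance_index(time, -risk_adi, event)

# Categorical NRI over assumed risk bands (the cut points are illustrative).
bands = [0.0, 0.1, 0.2, 1.0]
g_base = np.digitize(risk_base, bands)
g_adi = np.digitize(risk_adi, bands)
up, down = g_adi > g_base, g_adi < g_base
event_nri = up[event == 1].mean() - down[event == 1].mean()
nonevent_nri = down[event == 0].mean() - up[event == 0].mean()
nri = event_nri + nonevent_nri
```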


2021 ◽  
Vol 8 ◽  
Author(s):  
David De Ridder ◽  
José Sandoval ◽  
Nicolas Vuilleumier ◽  
Andrew S. Azman ◽  
Silvia Stringhini ◽  
...  

Objective: To investigate the association between socioeconomic deprivation and the persistence of SARS-CoV-2 clusters. Methods: We analyzed 3,355 SARS-CoV-2 positive test results in the state of Geneva (Switzerland) from February 26 to April 30, 2020. We used a spatiotemporal cluster detection algorithm to monitor SARS-CoV-2 transmission dynamics and defined spatial cluster persistence as the time in days from emergence to disappearance. Using spatial cluster persistence as the outcome and a deprivation index based on neighborhood-level census socioeconomic data, we estimated stratified survival functions with the Kaplan-Meier estimator. Cox proportional hazards (PH) regression models adjusted for population density were then used to examine the association between neighborhood socioeconomic deprivation and persistence of SARS-CoV-2 clusters. Results: SARS-CoV-2 clusters persisted significantly longer in socioeconomically disadvantaged neighborhoods. In the Cox PH model, the standardized deprivation index was associated with increased spatial cluster persistence (hazard ratio [HR], 1.43 [95% CI, 1.28–1.59]). The adjusted tercile-specific deprivation index HR was 1.82 [95% CI, 1.56–2.17]. Conclusions: The increased risk of infection among disadvantaged individuals may also be due to the persistence of community transmission. These findings further highlight the need for interventions mitigating inequalities in the risk of SARS-CoV-2 infection and, thus, of serious illness and mortality.
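The survival framing here treats each cluster as a "subject" whose event is disappearance, so longer persistence corresponds to longer survival. A sketch of the stratified Kaplan-Meier comparison (the DataFrame `clusters` and its columns are assumptions, not the study's code):

```python
from lifelines import KaplanMeierFitter
from lifelines.statistics import multivariate_logrank_test

# clusters: assumed DataFrame with persistence_days, disappeared
# (1 if the cluster ended within the study window), and deprivation_tercile.
kmf = KaplanMeierFitter()
for tercile, grp in clusters.groupby("deprivation_tercile"):
    kmf.fit(grp["persistence_days"], grp["disappeared"], label=f"tercile={tercile}")
    kmf.plot_survival_function()

result = multivariate_logrank_test(
    clusters["persistence_days"],
    clusters["deprivation_tercile"],
    clusters["disappeared"],
)
print(result.p_value)  # do persistence curves differ across terciles?
```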


Author(s):  
Shaan Khurshid ◽  
Uri Kartoun ◽  
Jeffrey M. Ashburner ◽  
Ludovic Trinquart ◽  
Anthony Philippakis ◽  
...  

Background - Atrial fibrillation (AF) is associated with increased risks of stroke and heart failure. Electronic health record (EHR)-based AF risk prediction may facilitate efficient deployment of interventions to diagnose or prevent AF altogether. Methods - We externally validated an EHR atrial fibrillation (EHR-AF) score in IBM Explorys Life Sciences, a multi-institutional dataset containing statistically de-identified EHR data for over 21 million individuals ("Explorys Dataset"). We included individuals with complete AF risk data, ≥2 office visits within two years, and no prevalent AF. We compared EHR-AF to existing scores including CHARGE-AF, C₂HEST, and CHA₂DS₂-VASc. We assessed the association between AF risk scores and 5-year incident AF, stroke, and heart failure using Cox proportional hazards modeling, 5-year AF discrimination using c-indices, and calibration of predicted AF risk to observed AF incidence. Results - Of 21,825,853 individuals in the Explorys Dataset, 4,508,180 were included in the analysis (mean age 62.5 years; 56.3% female). AF risk scores were strongly associated with 5-year incident AF (hazard ratio [HR] per standard deviation [SD] increase 1.85 using CHA₂DS₂-VASc to 2.88 using EHR-AF), stroke (1.61 using C₂HEST to 1.92 using CHARGE-AF), and heart failure (1.91 using CHA₂DS₂-VASc to 2.58 using EHR-AF). EHR-AF (c-index 0.808 [95% CI 0.807-0.809]) demonstrated favorable AF discrimination compared to CHARGE-AF (0.806 [0.805-0.807]), C₂HEST (0.683 [0.682-0.684]), and CHA₂DS₂-VASc (0.720 [0.719-0.722]). Of the scores, EHR-AF demonstrated the best calibration to incident AF (calibration slope 1.002 [0.997-1.007]). In subgroup analyses, AF discrimination using EHR-AF was lower in individuals with stroke (c-index 0.696 [0.692-0.700]) and heart failure (0.621 [0.617-0.625]). Conclusions - EHR-AF demonstrates predictive accuracy for incident AF using readily ascertained EHR data. AF risk is associated with incident stroke and heart failure. Use of such risk scores may facilitate decision-support and population health management efforts focused on minimizing AF-related morbidity.
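The comparator scores are published point systems, so they can be recomputed directly from EHR fields. As an illustration, a sketch that tallies CHA₂DS₂-VASc and checks 5-year AF discrimination with a c-index; the DataFrame `df` and its column names are assumptions:

```python
from lifelines.utils import concordance_index

def cha2ds2_vasc(row) -> int:
    """Standard CHA2DS2-VASc point tally from assumed 0/1 EHR flags."""
    pts = 0
    pts += 1 if row["chf"] else 0               # C: congestive heart failure
    pts += 1 if row["hypertension"] else 0      # H: hypertension
    pts += 2 if row["age"] >= 75 else 0         # A2: age >= 75
    pts += 1 if row["diabetes"] else 0          # D: diabetes
    pts += 2 if row["stroke_tia"] else 0        # S2: prior stroke/TIA
    pts += 1 if row["vascular_disease"] else 0  # V: vascular disease
    pts += 1 if 65 <= row["age"] < 75 else 0    # A: age 65-74
    pts += 1 if row["sex"] == "F" else 0        # Sc: sex category (female)
    return pts

df["cha2ds2_vasc"] = df.apply(cha2ds2_vasc, axis=1)
# c-index of the score against observed time to incident AF
# (minus sign: a higher score should mean earlier AF).
c = concordance_index(df["years_to_af"], -df["cha2ds2_vasc"], df["incident_af"])
```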


Author(s):  
Massimiliano Cantinotti ◽  
Raffaele Giordano ◽  
Marco Scalese ◽  
Sabrina Molinaro ◽  
Francesca della Pina ◽  
...  

Abstract The routine use of brain natriuretic peptide (BNP) in pediatric cardiac surgery remains controversial. Our aim was to test whether BNP adds information to predict risk in pediatric cardiac surgery. In all, 587 children undergoing cardiac surgery (median age 6.3 months; 1.2–35.9 months) were prospectively enrolled at a single institution. BNP was measured pre-operatively, on every post-operative day in the intensive care unit, and before discharge. The primary outcome was major complications and a ventilator stay >15 days. A first risk prediction model was fitted using a Cox proportional hazards model with age, body surface area, and Aristotle score as continuous predictors. A second model was built by adding cardiopulmonary bypass time and arterial lactate at the end of the operation to the first model. Then, peak post-operative log-BNP was added to both models. Analyses to test discrimination, calibration, and reclassification were performed. BNP increased after surgery (p<0.001), peaking at a mean of 63.7 h (median 36 h, interquartile range 12–84 h) post-operatively, and decreased thereafter. The hazard ratios (HR) for peak-BNP were highly significant (first model HR=1.40, p=0.006; second model HR=1.44, p=0.008), and the log-likelihood improved with the addition of BNP at 12 h (p=0.006; p=0.009). The addition of peak-BNP significantly improved the area under the ROC curve (first model p<0.001; second model p<0.001). The addition of peak-BNP also resulted in a net gain in reclassification proportion (first model NRI=0.089, p<0.001; second model NRI=0.139, p=0.003). Our data indicate that BNP may improve risk prediction in pediatric cardiac surgery, supporting its routine use in this setting.
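The central question, whether peak BNP adds information beyond the baseline predictors, maps onto a likelihood-ratio test between nested Cox models. A sketch under assumed column names (`time`, `event`, `bsa`, `aristotle_score`, `log_bnp_peak`), not the study's code:

```python
from lifelines import CoxPHFitter
from scipy.stats import chi2

base = CoxPHFitter().fit(df, "time", "event",
                         formula="age + bsa + aristotle_score")
full = CoxPHFitter().fit(df, "time", "event",
                         formula="age + bsa + aristotle_score + log_bnp_peak")

# Likelihood-ratio test: twice the log-likelihood gain, one added parameter.
lr_stat = 2 * (full.log_likelihood_ - base.log_likelihood_)
p_value = chi2.sf(lr_stat, df=1)
print(full.hazard_ratios_["log_bnp_peak"], p_value)
```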


Author(s):  
Chun-Gu Cheng ◽  
Hsin Chu ◽  
Jiunn-Tay Lee ◽  
Wu-Chien Chien ◽  
Chun-An Cheng

(1) Background: Patients with benign prostatic hyperplasia (BPH) report reduced quality of life and sleep. Most BPH patients are treated with alpha-1 adrenergic receptor antagonists, which can improve cerebral blood flow for 1–2 months. Patients with ischemic stroke (IS) can experience impaired cerebral autoregulation for six months. The relationship between BPH and recurrent IS remains unclear. The aim of this study was to determine the risk of one-year recurrent IS conferred by BPH. (2) Methods: We used data from the Taiwanese National Health Insurance Database to identify newly diagnosed IS cases entered from 1 January 2008 to 31 December 2008. Patients were followed until the recurrent IS event or 365 days after the first hospitalization. The risk factors associated with one-year recurrent IS were assessed using Cox proportional hazards regression. (3) Results: Patients with BPH had a higher risk of recurrent IS (12.11% versus 8.15%) (adjusted hazard ratio (HR): 1.352; 95% confidence interval (CI): 1.028–1.78, p = 0.031). Other risk factors included hyperlipidemia (adjusted HR: 1.338; 95% CI: 1.022–1.751, p = 0.034), coronary artery disease (adjusted HR: 1.487; 95% CI: 1.128–1.961, p = 0.005), chronic obstructive pulmonary disease (adjusted HR: 1.499; 95% CI: 1.075–2.091, p = 0.017), and chronic kidney disease (adjusted HR: 1.523; 95% CI: 1.033–2.244, p = 0.033). (4) Conclusion: Patients with BPH who had these risk factors had an increased risk of one-year recurrent IS. The modification of risk factors may prevent recurrent IS.
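Operationally, "one-year recurrent IS" means administratively censoring everyone without a recurrence at 365 days before fitting the Cox model. A sketch with a hypothetical DataFrame `cohort` (all column names are assumptions):

```python
import numpy as np
from lifelines import CoxPHFitter

# days_to_recurrence is NaN when no recurrence was observed.
cohort["event"] = (
    cohort["days_to_recurrence"].notna() & (cohort["days_to_recurrence"] <= 365)
).astype(int)
cohort["time"] = np.where(cohort["event"] == 1, cohort["days_to_recurrence"], 365)

cph = CoxPHFitter().fit(
    cohort[["time", "event", "bph", "hyperlipidemia", "cad", "copd", "ckd"]],
    "time", "event",
)
print(cph.hazard_ratios_["bph"])  # adjusted HR for BPH
```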


Medicina ◽  
2020 ◽  
Vol 56 (5) ◽  
pp. 236 ◽  
Author(s):  
Charat Thongprayoon ◽  
Wisit Cheungpasitporn ◽  
Sorkko Thirunavukkarasu ◽  
Tananchai Petnak ◽  
Api Chewcharat ◽  
...  

Background and Objectives: The optimal range of serum potassium at hospital discharge is unclear. The aim of this study was to assess the relationship between discharge serum potassium levels and one-year mortality in hospitalized patients. Materials and Methods: All adult hospital survivors between 2011 and 2013 at a tertiary referral hospital, who had available admission and discharge serum potassium data, were enrolled. End-stage kidney disease patients were excluded. Discharge serum potassium was defined as the last serum potassium level measured within 48 h prior to hospital discharge and categorized into ≤2.9, 3.0–3.4, 3.5–3.9, 4.0–4.4, 4.5–4.9, 5.0–5.4 and ≥5.5 mEq/L. A Cox proportional hazards analysis was performed to assess the independent association between discharge serum potassium and one-year mortality after hospital discharge, using the discharge potassium range of 4.0–4.4 mEq/L as the reference group. Results: Of 57,874 eligible patients, with a mean discharge serum potassium of 4.1 ± 0.4 mEq/L, the estimated one-year mortality rate after discharge was 13.2%. A U-shaped association was observed between discharge serum potassium and one-year mortality, with the nadir mortality in the discharge serum potassium range of 4.0–4.4 mEq/L. After adjusting for clinical characteristics, including admission serum potassium, both discharge serum potassium ≤3.9 mEq/L and ≥4.5 mEq/L were significantly associated with increased one-year mortality, compared with the discharge serum potassium of 4.0–4.4 mEq/L. Stratified analysis based on admission serum potassium showed similar results, except that there was no increased risk of one-year mortality when discharge serum potassium was ≤3.9 mEq/L in patients with an admission serum potassium of ≥5.0 mEq/L. Conclusion: The association between discharge serum potassium and one-year mortality after hospital discharge had a U-shaped distribution and was independent of admission serum potassium. Favorable survival outcomes occurred when discharge serum potassium was strictly within the range of 4.0–4.4 mEq/L.
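A sketch of the exposure coding this design implies: bin discharge potassium into the seven categories and encode indicators with 4.0–4.4 mEq/L as the reference group. The DataFrame `df` and column names are assumptions, not the authors' code:

```python
import pandas as pd
from lifelines import CoxPHFitter

# Cut points at x.x5 reproduce the abstract's one-decimal bins.
bins = [-float("inf"), 2.95, 3.45, 3.95, 4.45, 4.95, 5.45, float("inf")]
labels = ["<=2.9", "3.0-3.4", "3.5-3.9", "4.0-4.4", "4.5-4.9", "5.0-5.4", ">=5.5"]
df["k_group"] = pd.cut(df["discharge_k"], bins=bins, labels=labels)

# Dummy-encode and drop the 4.0-4.4 column so it serves as the reference.
dummies = pd.get_dummies(df["k_group"], prefix="k").astype(float)
dummies = dummies.drop(columns=["k_4.0-4.4"])

model_df = pd.concat(
    [df[["days_to_death", "died_1y", "admission_k", "age"]], dummies], axis=1
)
cph = CoxPHFitter().fit(model_df, "days_to_death", "died_1y")
cph.print_summary()  # HRs for each potassium bin vs. 4.0-4.4 mEq/L
```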


Cancers ◽  
2021 ◽  
Vol 13 (7) ◽  
pp. 1495
Author(s):  
Tú Nguyen-Dumont ◽  
James G. Dowty ◽  
Robert J. MacInnis ◽  
Jason A. Steen ◽  
Moeen Riaz ◽  
...  

While gene panel sequencing is becoming widely used for cancer risk prediction, its clinical utility with respect to predicting aggressive prostate cancer (PrCa) is limited by our current understanding of the genetic risk factors associated with predisposition to this potentially lethal disease phenotype. This study included 837 men diagnosed with aggressive PrCa and 7261 controls (unaffected men and men who did not meet criteria for aggressive PrCa). Rare germline pathogenic variants (including likely pathogenic variants) were identified by targeted sequencing of 26 known or putative cancer predisposition genes. We found that 85 (10%) men with aggressive PrCa and 265 (4%) controls carried a pathogenic variant (p < 0.0001). Aggressive PrCa odds ratios (ORs) were estimated using unconditional logistic regression. Increased risk of aggressive PrCa (OR (95% confidence interval)) was identified for pathogenic variants in BRCA2 (5.8 (2.7–12.4)), BRCA1 (5.5 (1.8–16.6)), and ATM (3.8 (1.6–9.1)). Our study provides further evidence that rare germline pathogenic variants in these genes are associated with increased risk of this aggressive, clinically relevant subset of PrCa. These rare genetic variants could be incorporated into risk prediction models to improve their precision to identify men at highest risk of aggressive prostate cancer and be used to identify men with newly diagnosed prostate cancer who require urgent treatment.
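The odds ratios above come from unconditional logistic regression of case status on carrier status. A minimal sketch with statsmodels, assuming a DataFrame `df` with 0/1 columns for aggressive-PrCa case status and per-gene pathogenic-variant carrier flags:

```python
import numpy as np
import statsmodels.api as sm

X = sm.add_constant(
    df[["brca2_carrier", "brca1_carrier", "atm_carrier"]].astype(float)
)
fit = sm.Logit(df["aggressive_prca"], X).fit(disp=0)

odds_ratios = np.exp(fit.params)   # OR per gene (the intercept row is not an OR)
conf_int = np.exp(fit.conf_int())  # 95% CIs on the OR scale
print(odds_ratios, conf_int, sep="\n")
```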


2021 ◽  
pp. 000486742110096
Author(s):  
Oleguer Plana-Ripoll ◽  
Patsy Di Prinzio ◽  
John J McGrath ◽  
Preben B Mortensen ◽  
Vera A Morgan

Introduction: An association between schizophrenia and urbanicity has long been observed, with studies in many countries, including several from Denmark, reporting that individuals born/raised in densely populated urban settings have an increased risk of developing schizophrenia compared to those born/raised in rural settings. However, these findings have not been replicated in all studies. In particular, a Western Australian study showed a gradient in the opposite direction which disappeared after adjustment for covariates. Given the different findings for Denmark and Western Australia, our aim was to investigate the relationship between schizophrenia and urbanicity in these two regions to determine which factors may be influencing the relationship. Methods: We used population-based cohorts of children born alive between 1980 and 2001 in Western Australia (N = 428,784) and Denmark (N = 1,357,874). Children were categorised according to the level of urbanicity of their mother's residence at time of birth and followed up through 30 June 2015. Linkage to state-based registers provided information on schizophrenia diagnosis and a range of covariates. Rates of being diagnosed with schizophrenia for each category of urbanicity were estimated using Cox proportional hazards models adjusted for covariates. Results: During follow-up, 1618 (0.4%) children in Western Australia and 11,875 (0.9%) children in Denmark were diagnosed with schizophrenia. In Western Australia, those born in the most remote areas did not experience lower rates of schizophrenia than those born in the most urban areas (hazard ratio = 1.02 [95% confidence interval: 0.81, 1.29]), unlike their Danish counterparts (hazard ratio = 0.62 [95% confidence interval: 0.58, 0.66]). However, when the Western Australian cohort was restricted to children of non-Aboriginal status, results were consistent with Danish findings (hazard ratio = 0.46 [95% confidence interval: 0.29, 0.72]). Discussion: Our study highlights the potential for disadvantaged subgroups to mask the contribution of urban-related risk factors to risk of schizophrenia and the importance of stratified analysis in such cases.
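The stratified analysis the discussion calls for amounts to refitting the same Cox model on a subgroup and comparing hazard ratios. A sketch, with the Western Australian cohort in an assumed DataFrame `wa` and all column names hypothetical:

```python
from lifelines import CoxPHFitter

def urbanicity_hrs(cohort):
    """Fit the adjusted Cox model and return the urbanicity hazard ratios."""
    cph = CoxPHFitter().fit(
        cohort, "years_followed", "schizophrenia_dx",
        formula="C(urbanicity) + C(sex) + birth_year",
    )
    return cph.hazard_ratios_.filter(like="urbanicity")

print(urbanicity_hrs(wa))                         # full cohort
print(urbanicity_hrs(wa[wa["aboriginal"] == 0]))  # non-Aboriginal subgroup
```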


Author(s):  
Yuko Yamaguchi ◽  
Marta Zampino ◽  
Toshiko Tanaka ◽  
Stefania Bandinelli ◽  
Yusuke Osawa ◽  
...  

Abstract Background Anemia is common in older adults and associated with greater morbidity and mortality. The causes of anemia in older adults have not been completely characterized. Although elevated circulating growth and differentiation factor 15 (GDF-15) has been associated with anemia in older adults, it is not known whether elevated GDF-15 predicts the development of anemia. Methods We examined the relationship between plasma GDF-15 concentrations at baseline in 708 non-anemic adults, aged 60 years and older, and incident anemia during 15 years of follow-up among participants in the Invecchiare in Chianti (InCHIANTI) Study. Results During follow-up, 179 (25.3%) participants developed anemia. The proportion of participants who developed anemia from the lowest to highest quartile of plasma GDF-15 was 12.9%, 20.1%, 21.2%, and 45.8%, respectively. Adults in the highest quartile of plasma GDF-15 had an increased risk of developing anemia (hazard ratio 1.15, 95% confidence interval 1.09–1.21, P < .0001) compared to those in the lower three quartiles, in a multivariable Cox proportional hazards model adjusting for age, sex, serum iron, soluble transferrin receptor, ferritin, vitamin B12, congestive heart failure, diabetes mellitus, and cancer. Conclusions Circulating GDF-15 is an independent predictor of the development of anemia in older adults.
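The exposure here is a binary contrast, highest GDF-15 quartile versus the lower three, inside an adjusted Cox model. A sketch under assumed column names (`gdf15`, `years_to_anemia`, `anemia`, and the adjusters):

```python
from lifelines import CoxPHFitter

# Indicator for the top quartile of baseline plasma GDF-15.
df["gdf15_q4"] = (df["gdf15"] >= df["gdf15"].quantile(0.75)).astype(int)

cph = CoxPHFitter().fit(
    df, "years_to_anemia", "anemia",
    formula="gdf15_q4 + age + C(sex) + serum_iron + ferritin + vitamin_b12",
)
print(cph.summary.loc[
    "gdf15_q4", ["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]
])
```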


2021 ◽  
pp. 1-38
Author(s):  
Ala Al Rajabi ◽  
Geraldine Lo Siou ◽  
Alianu K. Akawung ◽  
Kathryn L McDonald ◽  
Tiffany R. Price ◽  
...  

ABSTRACT Current cancer prevention recommendations advise limiting red meat intake to <500 g/week and avoiding consumption of processed meat, but they do not differentiate the source of processed meat. We examined the associations of processed meat derived from red vs. non-red meats with cancer risk in a prospective cohort of 26,218 adults who reported dietary intake using the Canadian Diet History Questionnaire. Incidence of cancer was obtained through data linkage with the Alberta Cancer Registry, with a median (IQR) follow-up of 13.3 (5.1) years. Multivariable Cox proportional hazards regression models were adjusted for covariates and stratified by age and gender. The median (IQR) consumption (g/week) of red meat, processed meat from red meat, and processed meat from non-red meat was 267.9 (269.9), 53.6 (83.3), and 11.9 (31.8), respectively. High intake (4th quartile) of processed meat from red meat was associated with increased risk of gastrointestinal cancer (adjusted hazard ratio (AHR): 1.68; 95% CI: 1.09–2.57) and colorectal cancer (AHR: 1.90; 95% CI: 1.12–3.22) in women. No statistically significant associations were observed for intakes of red meat or processed meat from non-red meat. These results suggest that the carcinogenic effect associated with processed meat intake may be limited to processed meat derived from red meats. The findings provide preliminary evidence toward refining cancer prevention recommendations for red and processed meat intake.
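Stratifying a Cox model by age group and gender, as described, fits a separate baseline hazard per stratum while sharing the meat-intake coefficients. A sketch with lifelines' `strata` argument; the DataFrame `diet` and its columns are assumptions:

```python
from lifelines import CoxPHFitter

# red_processed_q: quartile (1-4) of processed-meat-from-red-meat intake;
# quartile 1 is the default reference level under treatment coding.
cph = CoxPHFitter().fit(
    diet, "years_followed", "gi_cancer",
    formula="C(red_processed_q)",
    strata=["age_group", "gender"],
)
cph.print_summary()  # AHRs for quartiles 2-4 vs. quartile 1
```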

