Readmission Risk Trajectories for Heart Failure Patients Using a Dynamic Prediction Approach (Preprint)

2019 ◽  
Author(s):  
Wei Jiang ◽  
Sauleh Siddiqui ◽  
Diego Martinez ◽  
Stephanie Cabral ◽  
Matthew Toerper ◽  
...  

BACKGROUND Patients hospitalized with heart failure have the highest rate of 30-day readmission of any clinically defined patient population in the United States. Investigation into the predictability of 30-day readmissions can lead to clinical decision-support tools and targeted interventions that help care providers improve individual patient care and reduce readmission risk. OBJECTIVE We developed a dynamic readmission risk prediction model that yields daily predictions for hospitalized heart failure patients, toward identifying risk trajectories over time. We identified clinical predictors associated with different patterns in readmission risk trajectories. METHODS A two-stage predictive modeling approach combining logistic and beta regression was applied to electronic health record (EHR) data accumulated daily to predict 30-day readmission for a cohort of 534 heart failure patient encounters over 2,750 patient-days. Unsupervised clustering was performed on the predictions to uncover time-dependent trends in readmission risk over the patient’s hospital stay. RESULTS Readmission occurred in 107 (20.0%) encounters. The out-of-sample AUC for the two-stage predictive model was 0.73 (±0.08). Dynamic clinical predictors capturing lab results and vital signs had the highest predictive value compared with the demographic, administrative, medication, and procedural data included. Unsupervised clustering identified four risk trajectory groups: decreasing risk (24.5% of encounters), high risk (21.2%), moderate risk (33.1%), and low risk (21.2%). The decreasing-risk group showed a change in average probability of readmission from admission (0.69) to discharge (0.30), while the high-risk (0.75), moderate-risk (0.61), and low-risk (0.39) groups remained consistent over the hospital course. Clinical predictors that discriminated between groups included lab measures (hemoglobin, potassium, sodium), vital signs (diastolic blood pressure), and the number of prior hospitalizations.
CONCLUSIONS Dynamically predicting readmission and quantifying trends over patients’ hospital stay illuminated differing risk trajectory groups. Identifying risk trajectory patterns and distinguishing predictors may shed new light on indicators of readmission and the isolated effects of the index hospitalization.
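The trajectory-clustering stage described above can be sketched in a few lines. The following is a minimal, self-contained illustration, not the authors' logistic/beta-regression pipeline: synthetic daily risk trajectories mimicking the four reported patterns are grouped with a small k-means routine. All data values and function names here are hypothetical.

```python
import random

def dist2(a, b):
    # squared Euclidean distance between two trajectories
    return sum((x - y) ** 2 for x, y in zip(a, b))

def farthest_point_init(X, k):
    # deterministic seeding: start at row 0, then repeatedly add the
    # trajectory farthest from the centroids chosen so far
    centroids = [list(X[0])]
    while len(centroids) < k:
        nxt = max(X, key=lambda p: min(dist2(p, c) for c in centroids))
        centroids.append(list(nxt))
    return centroids

def kmeans(X, k, iters=25):
    # minimal Lloyd's algorithm: assign to nearest centroid, recompute means
    centroids = farthest_point_init(X, k)
    labels = [0] * len(X)
    for _ in range(iters):
        labels = [min(range(k), key=lambda j: dist2(p, centroids[j])) for p in X]
        for j in range(k):
            members = [p for p, lab in zip(X, labels) if lab == j]
            if members:
                centroids[j] = [sum(col) / len(members) for col in zip(*members)]
    return labels

# Synthetic 5-day risk trajectories mimicking the four reported patterns:
# decreasing (0.69 -> 0.30) and flat high (0.75), moderate (0.61), low (0.39).
random.seed(0)
def noisy(base):
    return [b + random.gauss(0, 0.02) for b in base]

decreasing = [0.69, 0.59, 0.50, 0.40, 0.30]
X = ([noisy(decreasing) for _ in range(10)] +
     [noisy([0.75] * 5) for _ in range(10)] +
     [noisy([0.61] * 5) for _ in range(10)] +
     [noisy([0.39] * 5) for _ in range(10)])

labels = kmeans(X, k=4)  # one cluster label per encounter
```

With well-separated synthetic patterns, the four clusters recover the four groups exactly; on real daily predictions the cluster count and separation would need to be validated.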

2020 ◽  
Vol 38 (33) ◽  
pp. 3851-3862 ◽  
Author(s):  
Matthew J. Ehrhardt ◽  
Zachary J. Ward ◽  
Qi Liu ◽  
Aeysha Chaudhry ◽  
Anju Nohria ◽  
...  

PURPOSE Survivors of childhood cancer treated with anthracyclines and/or chest-directed radiation are at increased risk for heart failure (HF). The International Late Effects of Childhood Cancer Guideline Harmonization Group (IGHG) recommends risk-based screening echocardiograms, but evidence supporting its frequency and cost-effectiveness is limited. PATIENTS AND METHODS Using the Childhood Cancer Survivor Study and St Jude Lifetime Cohort, we developed a microsimulation model of the clinical course of HF. We estimated long-term health outcomes and economic impact of screening according to IGHG-defined risk groups (low [doxorubicin-equivalent anthracycline dose of 1-99 mg/m2 and/or radiotherapy < 15 Gy], moderate [100 to < 250 mg/m2 or 15 to < 35 Gy], or high [≥ 250 mg/m2 or ≥ 35 Gy or both ≥ 100 mg/m2 and ≥ 15 Gy]). We compared 1-, 2-, 5-, and 10-year interval-based screening with no screening. Screening performance and treatment effectiveness were estimated based on published studies. Costs and quality-of-life weights were based on national averages and published reports. Outcomes included lifetime HF risk, quality-adjusted life-years (QALYs), lifetime costs, and incremental cost-effectiveness ratios (ICERs). Strategies with ICERs < $100,000 per QALY gained were considered cost-effective. RESULTS Among the IGHG risk groups, cumulative lifetime risks of HF without screening were 36.7% (high risk), 24.7% (moderate risk), and 16.9% (low risk). Routine screening reduced this risk by 4% to 11%, depending on frequency. Screening every 2, 5, and 10 years was cost-effective for high-risk survivors, and every 5 and 10 years for moderate-risk survivors. In contrast, ICERs were > $175,000 per QALY gained for all strategies for low-risk survivors, representing approximately 40% of those for whom screening is currently recommended. 
CONCLUSION Our findings suggest that refinement of the recommended screening strategies for IGHG high- and low-risk survivors is needed, including careful reconsideration of discontinuing screening for asymptomatic left ventricular dysfunction and HF in low-risk survivors.
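The study's cost-effectiveness criterion reduces to a simple ratio. Below is a hedged sketch with illustrative numbers (not taken from the study) showing how an incremental cost-effectiveness ratio is computed and compared against the $100,000-per-QALY willingness-to-pay threshold.

```python
def icer(cost_new, qaly_new, cost_base, qaly_base):
    """Incremental cost-effectiveness ratio, in $ per QALY gained."""
    return (cost_new - cost_base) / (qaly_new - qaly_base)

WTP_THRESHOLD = 100_000  # $ per QALY gained, as in the study

# Illustrative (hypothetical) numbers: a screening strategy that costs
# $8,000 more than no screening and yields 0.10 additional QALYs.
ratio = icer(cost_new=58_000, qaly_new=21.60, cost_base=50_000, qaly_base=21.50)
cost_effective = ratio < WTP_THRESHOLD  # $80,000/QALY < $100,000/QALY -> True
```

In practice, strategies are first ranked by cost and dominated options removed before pairwise ICERs are computed; this sketch shows only the final ratio-versus-threshold comparison.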


2021 ◽  
Vol 21 (1) ◽  
Author(s):  
Sandra Chamat-Hedemand ◽  
Niels Eske Bruun ◽  
Lauge Østergaard ◽  
Magnus Arpi ◽  
Emil Fosbøl ◽  
...  

Abstract Background Infective endocarditis (IE) is diagnosed in 7–8% of streptococcal bloodstream infections (BSIs), yet it is unclear when to perform transthoracic (TTE) and transoesophageal echocardiography (TOE) according to different streptococcal species. The aim of this sub-study was to propose a flowchart for the use of echocardiography in streptococcal BSIs. Methods In a population-based setup, we investigated all patients admitted with streptococcal BSIs and crosslinked data with nationwide registries to identify comorbidities and concomitant hospitalization with IE. Streptococcal species were divided into four groups based on the crude risk of being diagnosed with IE (low-risk < 3%, moderate-risk 3–10%, high-risk 10–30%, and very high-risk > 30%). Based on the number of positive blood culture (BC) bottles and IE risk factors (prosthetic valve, previous IE, native valve disease, and cardiac device), we further stratified cases according to the probability of a concomitant IE diagnosis to create a flowchart suggesting TTE plus TOE (IE > 10%), TTE (IE 3–10%), or “wait & see” (IE < 3%). Results We included 6393 cases with streptococcal BSIs (mean age 68.1 years [SD 16.2], 52.8% men). Patients with BSIs due to low-risk streptococci (S. pneumoniae, S. pyogenes, S. intermedius) are not initially recommended echocardiography unless they have ≥3 positive BC bottles and an IE risk factor. Patients with moderate-risk streptococci (S. agalactiae, S. anginosus, S. constellatus, S. dysgalactiae, S. salivarius, S. thermophilus) are guided to a “wait & see” strategy if they have neither a risk factor nor ≥3 positive BC bottles; a TTE is recommended if they have either ≥3 positive BC bottles or a risk factor, and both TTE and TOE are recommended if they present with both. Patients with high-risk streptococci (S. mitis/oralis, S. parasanguinis, G. adiacens) are directed to a TTE if they have neither a risk factor nor ≥3 positive BC bottles, but to TTE and TOE if they have either ≥3 positive BC bottles or a risk factor.
Patients with very high-risk streptococci (S. gordonii, S. gallolyticus, S. mutans, S. sanguinis) are guided directly to TTE and TOE due to a high baseline IE prevalence. Conclusion In addition to the clinical picture, this flowchart based on streptococcal species, the number of positive blood culture bottles, and IE risk factors can help guide the use of echocardiography in streptococcal bloodstream infections. Since echocardiography results were not available in the registry data, the findings should be confirmed prospectively with the use of systematic echocardiography.
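The species-and-criteria flowchart lends itself to a direct encoding. The sketch below mirrors the decision rules stated in the abstract; the work-up for low-risk species when both criteria are present is not specified there, so the TTE returned for that branch is an assumption, as are the function and argument names.

```python
def echo_recommendation(risk_group, positive_bottles, has_ie_risk_factor):
    """Suggest an echocardiography work-up for a streptococcal BSI.

    risk_group: "low", "moderate", "high", or "very_high" -- the
    species-based crude IE risk (<3%, 3-10%, 10-30%, >30%).
    """
    many_bottles = positive_bottles >= 3
    if risk_group == "very_high":
        # high baseline IE prevalence: straight to full work-up
        return "TTE + TOE"
    if risk_group == "high":
        # either >=3 bottles or a risk factor escalates to TTE + TOE
        return "TTE + TOE" if (many_bottles or has_ie_risk_factor) else "TTE"
    if risk_group == "moderate":
        if many_bottles and has_ie_risk_factor:
            return "TTE + TOE"
        if many_bottles or has_ie_risk_factor:
            return "TTE"
        return "wait & see"
    # low-risk species: no initial echo unless both criteria are present;
    # the work-up in that case (TTE here) is an assumption, not stated above
    if many_bottles and has_ie_risk_factor:
        return "TTE"
    return "wait & see"
```

As the abstract stresses, such a rule set supplements rather than replaces the clinical picture.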


2021 ◽  
Vol 10 (Supplement_1) ◽  
Author(s):  
J Plonka ◽  
J Bugajski ◽  
M Plonka ◽  
A Tycinska ◽  
M Gierlotka

Abstract Funding Acknowledgements Type of funding sources: None. Levosimendan, a calcium sensitizer and potassium channel opener, is appreciated for its effects on systemic and pulmonary hemodynamics and for the relief of symptoms in acute heart failure (AHF). Positive effects of levosimendan on renal function have also been described. The aim of the present analysis was to assess the predictors of the diuresis response to levosimendan administration in high-risk acute heart failure patients. Methods. We analysed 34 consecutive patients admitted with high-risk AHF to one centre and treated in the intensive cardiac care unit. Levosimendan was administered on top of other treatment as a 24-hour infusion of a 12.5 mg total dose, except in 7 patients (1 patient terminated earlier due to intolerance, 5 patients received a 48-hour infusion, and 1 patient a 72-hour infusion). The decision to administer levosimendan was based on clinical status and was left to the attending physician. Diuresis and diuretic dosage before (24 hours) and after levosimendan infusion (48 hours) were taken into account for the present study. Results. The AHF was primarily of cardiac origin in all patients; in 6 (18%) it was due to recent acute myocardial infarction. In-hospital mortality was 24%. Median length of hospitalization was 26 days (range 6 to 107 days). Mean age of the patients was 66 ± 12 years, and 25 (74%) were men. Mean INTERMACS score was 3.4 ± 1.4, with a cold-wet clinical profile present in 13 (38%) of patients. Mean left ventricular ejection fraction (LVEF) was 27 ± 13%, mean NT-proBNP was 17176 ± 12464 pg/ml, and mean eGFR was 48 ± 22 ml/min/1.73m2. At the time of levosimendan administration patients had background treatment with catecholamines (mean number per patient 1.4 ± 1.1, range 0-3) and with diuretics (mean furosemide dosage 167 ± 102 mg/24h, range 20-500). The 48-hour diuresis after levosimendan administration varied from 950 to 11300 ml (mean 4307 ± 2418 ml).
It was significantly lower in patients with a cold-wet profile (2646 ± 1335 vs. 5335 ± 2381 ml in other clinical profiles, p = 0.0002). Additionally, 48-hour diuresis was negatively correlated with age (r = -0.46, p = 0.0062) and the number of background catecholamines (r = -0.47, p = 0.0047), and not significantly with the furosemide dosage (r = -0.28, p = 0.10) (figure). No association with diuresis was found for LVEF, NT-proBNP, or eGFR. In multiple regression analysis (model R2 = 0.63, p = 0.0085), both older age (p = 0.026) and a cold-wet profile (p = 0.0074) were significant predictors of poor diuresis after levosimendan administration. Conclusion. Older age and a cold-wet profile were significant predictors of a poor diuresis response to levosimendan administration in high-risk acute heart failure patients. Concomitant catecholamine use and a high diuretic dosage could also be markers of non-response to levosimendan in terms of diuresis.


2021 ◽  
Vol 21 (1) ◽  
Author(s):  
Yuanyuan Chen ◽  
Dongru Chen ◽  
Huancai Lin

Abstract Background Infiltration and sealing are micro-invasive treatments for arresting proximal non-cavitated caries lesions; however, their efficacies under different conditions remain unknown. This systematic review and meta-analysis aimed to evaluate the caries-arresting effectiveness of infiltration and sealing and to further analyse their efficacies across different dentition types and caries risk levels. Methods Six electronic databases were searched for published literature, and references were searched manually. Split-mouth randomised controlled trials (RCTs) comparing the effectiveness of infiltration/sealing with that of non-invasive treatments in proximal lesions were included. The primary outcome was obtained from radiographic readings. Results In total, 1033 citations were identified, and 17 RCTs (22 articles) were included. Infiltration and sealing reduced the odds of lesion progression (infiltration vs. non-invasive: OR = 0.21, 95% CI 0.15–0.30; sealing vs. placebo: OR = 0.27, 95% CI 0.18–0.42). For both the primary and permanent dentitions, infiltration and sealing were more effective than non-invasive treatments (primary dentition: OR = 0.30, 95% CI 0.20–0.45; permanent dentition: OR = 0.20, 95% CI 0.14–0.28). The overall effects of infiltration and sealing differed significantly from the control effects across caries risk levels (OR = 0.20, 95% CI 0.14–0.28). Except for caries risk at moderate levels (moderate risk: OR = 0.32, 95% CI 0.01–8.27), there were significant differences between micro-invasive and non-invasive treatments (low risk: OR = 0.24, 95% CI 0.08–0.72; low to moderate risk: OR = 0.38, 95% CI 0.18–0.81; moderate to high risk: OR = 0.17, 95% CI 0.10–0.29; and high risk: OR = 0.14, 95% CI 0.07–0.28). 
Except for caries risk at moderate levels (moderate risk: OR = 0.32, 95% CI 0.01–8.27), infiltration was superior (low risk: OR = 0.24, 95% CI 0.08–0.72; low to moderate risk: OR = 0.38, 95% CI 0.18–0.81; moderate to high risk: OR = 0.20, 95% CI 0.10–0.39; and high risk: OR = 0.14, 95% CI 0.05–0.37). Conclusion Infiltration and sealing were more efficacious than non-invasive treatments for halting non-cavitated proximal lesions.


2018 ◽  
Vol 24 (8) ◽  
pp. S129
Author(s):  
Justin D. Roberts ◽  
Amanda Gerberich ◽  
Kathleen Makkar ◽  
Lisa Rathman

Heart ◽  
2015 ◽  
Vol 101 (Suppl 4) ◽  
pp. A15.1-A15
Author(s):  
Sarah Burgess ◽  
Lucy Cornthwaite

2021 ◽  
Author(s):  
Faisal Rahman ◽  
Noam Finkelstein ◽  
Anton Alyakin ◽  
Nisha Gilotra ◽  
Jeff Trost ◽  
...  

Abstract Objective: Despite technological and treatment advancements over the past two decades, cardiogenic shock (CS) mortality has remained between 40-60%. A number of factors can lead to delayed diagnosis of CS, including gradual onset and nonspecific symptoms. Our objective was to develop an algorithm that can continuously monitor heart failure patients and partition them into cohorts at high and low risk of CS. Methods: We retrospectively studied 24,461 patients hospitalized with acute decompensated heart failure, 265 of whom developed CS, in the Johns Hopkins Healthcare system. Our cohort identification approach is based on logistic regression and makes use of vital signs, lab values, and medication administrations recorded during the normal course of care. Results: Our algorithm identified patients at high risk of CS. Patients in the high-risk cohort had a 10.2 times (95% confidence interval 6.1-17.2) higher prevalence of CS than those in the low-risk cohort. Patients who experienced cardiogenic shock while in the high-risk cohort were first deemed high-risk a median of 1.7 days (interquartile range 0.8 to 4.6) before the cardiogenic shock diagnosis was made by their clinical team. Conclusions: This risk model was able to identify patients at higher risk of CS in a time frame that allowed a change in clinical care. Future studies need to evaluate whether high-risk cohort identification can affect outcomes.
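As a rough sketch of the cohort-partition idea (not the authors' model or data), the following thresholds logistic-style risk scores into high- and low-risk cohorts and computes the ratio of outcome prevalence between them. The scores, outcomes, and 0.5 cut-off are all hypothetical.

```python
import math

def sigmoid(z):
    # logistic link: maps a linear score to a probability in (0, 1)
    return 1.0 / (1.0 + math.exp(-z))

def partition(scores, threshold=0.5):
    """Return (high, low) index lists based on predicted risk."""
    high = [i for i, s in enumerate(scores) if s >= threshold]
    low = [i for i, s in enumerate(scores) if s < threshold]
    return high, low

def prevalence_ratio(outcomes, high, low):
    # outcome prevalence in the high-risk cohort relative to the low-risk one
    p_high = sum(outcomes[i] for i in high) / len(high)
    p_low = sum(outcomes[i] for i in low) / len(low)
    return p_high / p_low

# Toy data: linear scores passed through the sigmoid, plus binary CS outcomes.
scores = [sigmoid(z) for z in [-2.0, -1.5, -1.0, 0.2, 0.8, 1.5, 2.0, -0.5]]
outcomes = [0, 1, 0, 0, 1, 0, 1, 0]
high, low = partition(scores)
ratio = prevalence_ratio(outcomes, high, low)  # 0.5 / 0.25 = 2.0
```

Recomputing the score as new vitals, labs, and medications arrive is what makes such a model "continuous"; each recomputation simply re-runs the partition.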


Author(s):  
Nazia N. Shaik ◽  
Swapna M. Jaswanth ◽  
Shashikala Manjunatha

Background: Diabetes is one of the largest global health emergencies of the 21st century. As per the International Diabetes Federation, some 425 million people worldwide are estimated to have diabetes. The prevalence is higher in urban than in rural areas (10.2% vs 6.9%). India had 72.9 million people living with diabetes, of which 57.9% remained undiagnosed, as per the 2017 data. The objective of the present study was to identify subjects at risk of developing diabetes by using the Indian diabetes risk score (IDRS) in the urban field practice area of Rajarajeswari Medical College and Hospital (RRMCH). Methods: A cross-sectional study was conducted using the standard IDRS questionnaire on 150 individuals aged ≥20 years residing in the urban field practice area of RRMCH. Subjects with scores <30, 30-50, and ≥60 were categorized as having low, moderate, and high risk of developing type-2 diabetes, respectively. Results: Of the 150 participants, 36 (24%) were in the high-risk category (IDRS ≥60), the majority, 61 (41%), were in the moderate-risk category (IDRS 30–50), and 53 (35%) were found to be at low risk (<30) for diabetes. A statistically significant association was found between IDRS and gender, literacy status, and body mass index (p<0.00001). Conclusions: It is essential to implement the IDRS, a simple tool for identifying subjects at risk of developing diabetes, so that proper interventions can be carried out at the earliest opportunity to reduce the burden of diabetes.
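The score cut-offs reported above can be captured in a small helper. This is a minimal sketch (function name hypothetical); since IDRS component scores are multiples of 10, the apparent 50-60 gap in the published cut-offs never arises for real totals.

```python
def idrs_category(score):
    """Categorise an Indian Diabetes Risk Score (IDRS) total.

    Published cut-offs: <30 low, 30-50 moderate, >=60 high.
    IDRS totals are multiples of 10, so no score falls in 51-59.
    """
    if score >= 60:
        return "high"
    if score >= 30:
        return "moderate"
    return "low"
```

A screening workflow would compute the total from the four IDRS components (age, waist circumference, physical activity, family history) and then call this helper to direct follow-up testing.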

