A Validated Risk Model for Prediction of Early Readmission in Patients with Hepatic Encephalopathy

2018 ◽  
Vol 17 (5) ◽  
pp. 0-10
Author(s):  
Andrew J. Kruger ◽  
Fasika Aberra ◽  
Sylvester M. Black ◽  
Alice Hinton ◽  
James Hanje ◽  
...  

Introduction and aim. Hepatic encephalopathy (HE) is a common complication in patients with cirrhosis and is associated with an increased healthcare burden. Our aim was to study independent predictors of 30-day readmission and develop a readmission risk model in patients with HE. Secondary aims included studying readmission rates, cost, and the impact of readmission on mortality. Material and methods. We utilized the 2013 Nationwide Readmission Database (NRD) for hospitalized patients with HE. A risk assessment model based on index hospitalization variables for predicting 30-day readmission was developed using multivariate logistic regression and validated with the 2014 NRD. Patients were stratified into Low Risk and High Risk groups. Cox regression models were fit to identify predictors of calendar-year mortality. Results. Of 24,473 cirrhosis patients hospitalized with HE, 32.4% were readmitted within 30 days. Predictors of readmission included presence of ascites (OR: 1.19; 95% CI: 1.06-1.33), receiving paracentesis (OR: 1.43; 95% CI: 1.26-1.62) and acute kidney injury (OR: 1.11; 95% CI: 1.00-1.22). Our validated model stratified patients into Low Risk and High Risk of 30-day readmission (29% and 40%, respectively). The cost of the first readmission was higher than that of the index admission in the 30-day readmission cohort ($14,198 vs. $10,386; p-value < 0.001). Thirty-day readmission was the strongest predictor of calendar-year mortality (HR: 4.03; 95% CI: 3.49-4.65). Conclusions. Nearly one-third of patients with HE were readmitted within 30 days, and early readmission adversely impacted healthcare utilization and calendar-year mortality. With our proposed simple risk assessment model, patients at high risk for early readmission can be identified to potentially avert poor outcomes.
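A minimal sketch of how such a stratification might be applied, using the published odds ratios as logistic-regression coefficients. The baseline log-odds (taken from the Low Risk group's observed 29% rate) and the High Risk cutoff are assumptions; the paper's full model uses additional index-hospitalization variables.

```python
import math

# Assumed baseline: the Low Risk group's observed 29% readmission rate.
BASELINE_LOG_ODDS = math.log(0.29 / 0.71)

# Published odds ratios for the three named predictors.
ODDS_RATIOS = {"ascites": 1.19, "paracentesis": 1.43, "acute_kidney_injury": 1.11}

def readmission_probability(features: dict) -> float:
    """Predicted 30-day readmission probability (illustrative sketch only)."""
    log_odds = BASELINE_LOG_ODDS
    for name, present in features.items():
        if present:
            log_odds += math.log(ODDS_RATIOS[name])
    return 1 / (1 + math.exp(-log_odds))

def risk_group(features: dict, cutoff: float = 0.35) -> str:
    """Dichotomize into Low Risk / High Risk (the 0.35 cutoff is hypothetical)."""
    return "High Risk" if readmission_probability(features) >= cutoff else "Low Risk"
```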

Blood ◽  
2012 ◽  
Vol 120 (21) ◽  
pp. 364-364
Author(s):  
Joshua M Ruch ◽  
Hsou M Hu ◽  
Vinita Bahl ◽  
Suman L. Sood

Abstract 364 Introduction: VTE is a common complication in hospitalized medical patients and the role of pharmacologic anticoagulation prophylaxis is well-established. Patients with active malignancy are at higher risk for VTE during hospitalization. However, VTE prophylaxis is underutilized in these patients due to many real and perceived contraindications to prophylaxis. To aid clinicians in determining VTE risk and guide choice of prophylaxis, our institution adopted the Caprini risk assessment model (Ann Surg, 2010; 251[2]:344–50), based on clinical factors such as age, comorbidities, and recent surgery. Our primary objective was to assess adherence to recommended VTE prophylaxis in hospitalized medical patients with solid tumors, hematological malignancies, and bone marrow transplant (BMT) patients in comparison to general medical (GM) patients, and the impact of recommended prophylaxis use on VTE outcomes. Secondary objectives were to evaluate the distribution of Caprini risk scores and the utility of the Caprini risk assessment model for guiding prophylaxis in this population. Methods: Patients admitted to the hematology/oncology (HO; oncology, malignant hematology, and BMT) and GM inpatient services at the University of Michigan between July 1, 2009 and December 31, 2011 were included in the study. After IRB approval, patient information was extracted from the electronic medical record (EMR). A point-scoring method based on the Caprini risk assessment model was used to calculate VTE risk at admission. A score of 3–4 was high risk and ≥ 5 highest risk for VTE. Type of VTE prophylaxis and VTE rate were determined. Recommended prophylaxis was 5000 units TID SQ heparin, 30–40 mg SQ enoxaparin, or 2.5 mg SQ fondaparinux, ± sequential compression devices (SCDs). Pharmacological prophylaxis administration was verified in the EMR. 
VTE is defined as deep venous thrombosis (DVT) or pulmonary embolism (PE) occurring during hospitalization or within 90 days, confirmed by Doppler, CT or V/Q scan. Adherence was defined as the percentage of patients at high or highest risk for VTE with a length of stay ≥ 2 days who received guideline-recommended prophylaxis within 2 days of admission. Patients with a contraindication to prophylaxis were excluded. A retrospective cohort study was performed. Chi-squared test was used to test differences in proportions and Cochran-Armitage test for trends. Results: 4300 patients were admitted to HO and 18,347 to GM services. Compared to GM patients (86.8%), the rate of adherence to recommended VTE prophylaxis was similar for oncology (87.6%) and hematology (85.4%) patients, and lower for BMT patients (45.6%) (p<0.0001). The overall VTE rate on HO services was 2.77%. Compared with 1.45% in GM, VTE rate was 3.02% in oncology (p=0.070), 2.01% in hematology (p=0.220), and 3.61% for BMT (p=0.001). Over half (51.3%) of VTE in HO patients occurred in patients who did not receive pharmacologic prophylaxis. In HO patients with a VTE, ordered prophylaxis included 16.0% combined pharmacological and SCD, 32.8% pharmacological alone, 32.8% SCD alone, and 18.5% none. Use of combined or pharmacologic prophylaxis alone was non-significantly increased in the non-VTE HO patients. By the Caprini risk assessment model, 33.3% of all patients on HO services were high and 62.2% highest risk, with fewer oncology (p=0.0001) and more BMT (p=0.0003) patients classified as high or highest risk. VTE rate in HO patients rose as Caprini risk score increased: score (n, % with VTE) 0–1 (23, 4.35%); 2 (169, 0.59%); 3–4 (1434, 1.67%); 5–6 (1691, 2.90%); 7–8 (745, 3.76%); and 9 (238, 6.72%), p<0.0001 for trend. Conclusions: Adherence to recommended VTE prophylaxis was high in medical patients with cancer, resulting in low overall rates of VTE during hospitalization and following discharge. 
The majority of patients with VTE did not receive recommended pharmacologic prophylaxis. Most VTE occurred in patients at highest risk (Caprini risk assessment score ≥ 5), with a trend to higher VTE rate as individual score increased. These data suggest that the individual Caprini score may provide more detailed VTE risk assessment and may help inform the need for prophylaxis despite perceived relative contraindications in this high risk cancer population. Further study is needed to understand the barriers to ordering VTE prophylaxis in this population and encourage increased prophylaxis use. Disclosures: No relevant conflicts of interest to declare.
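The risk banding and the reported score-level VTE rates above can be expressed as a small lookup; the band cutoffs (3–4 high, ≥ 5 highest) and the per-bin rates are taken directly from the abstract.

```python
# Observed VTE rates by Caprini score bin, as reported: (bin, n, % with VTE)
VTE_BY_SCORE = [("0-1", 23, 4.35), ("2", 169, 0.59), ("3-4", 1434, 1.67),
                ("5-6", 1691, 2.90), ("7-8", 745, 3.76), ("9", 238, 6.72)]

def caprini_risk_band(score: int) -> str:
    """Band a Caprini score using the study's cutoffs."""
    if score >= 5:
        return "highest"
    if score >= 3:
        return "high"
    return "low/moderate"
```

Note the trend the authors describe: from a score of 2 upward the observed VTE rate rises monotonically with the score bin.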


2005 ◽  
Vol 23 (30) ◽  
pp. 7437-7444 ◽  
Author(s):  
Claudi Oude Nijhuis ◽  
Willem A. Kamps ◽  
Simon M.G. Daenen ◽  
Jourik A. Gietema ◽  
Winette T.A. van der Graaf ◽  
...  

Purpose To investigate the feasibility of withholding antibiotics and early discharge for patients with chemotherapy-induced neutropenia and fever at low risk of bacterial infection by a new risk assessment model. Patients and Methods Outpatients with febrile neutropenia were allocated to one of three groups by a risk assessment model combining objective clinical parameters and plasma interleukin-8 level. Patients with signs of a bacterial infection and/or abnormal vital signs indicating sepsis were considered high risk. Based on their interleukin-8 level, the remaining patients were allocated to low or medium risk for bacterial infection. Medium-risk and high-risk patients received standard antibiotic therapy, whereas low-risk patients did not receive antibiotics and were discharged from hospital after 12 hours of afebrile observation. The primary end point was the feasibility of the treatment protocol. Results Of 196 assessable episodes, 76 (39%) were classified as high risk, 84 (43%) as medium risk, and 36 (18%) as low risk. There were no treatment failures in the low-risk group (95% CI, 0% to 10%). Therefore, the sensitivity of our risk assessment model was 100% (95% CI, 90% to 100%); the specificity, positive, and negative predictive values were 21%, 13%, and 100%, respectively. Median duration of hospitalization was 3 days in the low-risk group versus 7 days in the medium- and high-risk groups (P < .0001). The experimental treatment protocol yielded a saving of €471 (US $572) for every potentially low-risk patient. Conclusion This risk assessment model appears to identify febrile neutropenic patients at low risk for bacterial infection. Antibiotics can be withheld in well-defined neutropenic patients with fever.
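The three-way allocation described above can be sketched as a short triage function. The clinical criteria come from the study design; the interleukin-8 cutoff value below is a placeholder, since the paper derives its own plasma threshold.

```python
def allocate_risk_group(bacterial_signs: bool, abnormal_vitals: bool,
                        il8_pg_per_ml: float, il8_cutoff: float = 60.0) -> str:
    """Three-way triage following the study design: clinical criteria first,
    then plasma interleukin-8. The 60 pg/mL cutoff is hypothetical."""
    if bacterial_signs or abnormal_vitals:
        return "high"
    return "medium" if il8_pg_per_ml >= il8_cutoff else "low"
```

Only "low" patients would be candidates for withholding antibiotics and early discharge under the protocol.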


Blood ◽  
2019 ◽  
Vol 134 (Supplement_1) ◽  
pp. 3385-3385
Author(s):  
Mia Djulbegovic ◽  
Kevin Chen ◽  
Soundari Sureshanand ◽  
Sarwat Chaudhry

Background: Venous thromboembolism (VTE) is a common cause of morbidity and mortality in the United States. Annually, up to 1 in 120 people develop VTE, approximating the incidence of stroke. Given that hospitalization and acute medical illness increase the risk of VTE, hospital-associated VTE represents a preventable cause of morbidity and mortality. Accordingly, accreditation and regulatory agencies endorse inpatient pharmacologic VTE prophylaxis (PPX) as a quality measure. In order to raise rates of PPX prescribing, many health systems have adopted a default approach to electronic ordering, in which clinicians must "opt-out" of PPX prescription. However, this strategy may cause medical overuse and avoidable harms, which has prompted the American Society of Hematology (ASH) to recommend a risk-adapted approach to PPX. One risk model endorsed by ASH is the IMPROVE-VTE risk assessment model, which can identify patients who are at low risk for VTE and therefore may not warrant pharmacologic PPX. We therefore sought to compare the actual practice of PPX prescribing to the guideline-recommended strategy according to the IMPROVE-VTE model in a large, contemporary population of medical inpatients. Methods: In this observational study, we used electronic health record data to identify adult, medical inpatients hospitalized on general medical and subspecialty services at Yale-New Haven Hospital from 1/1/14-12/31/18. We excluded patients who were pregnant, admitted for VTE, taking full dose anticoagulation on admission, admitted for bleeding, or had a platelet count of < 50,000/µL. For each patient, we calculated the IMPROVE-VTE score using the previously validated model weights: 3 points for a prior history of VTE; 2 points for known thrombophilia, lower limb paralysis, or active cancer; 1 point for immobilization, admission to the intensive care unit, or age ≥ 60 years. 
For each component other than age, we used ICD-9 and ICD-10 codes that were billed either prior to or upon admission to determine the presence of these risk factors. In order to simulate the decision to initiate PPX on hospital admission, we calculated each patient's IMPROVE-VTE score at the time of admission. In accordance with the ASH guidelines, we used an IMPROVE-VTE score of <2 to differentiate patients at low risk of hospital-associated VTE from those at high risk. We used inpatient medication order history data to determine receipt of pharmacologic PPX. We used χ2 testing to compare the relative frequency of PPX prescribing on admission between patients at low risk and high risk for VTE. Results: We identified 135,288 medical inpatients during the study period, of whom 99,380 met inclusion criteria. The average age was 63.5 years (standard deviation 18 years); 51% of patients were female; 68% of patients were white. Of all the included patients, 81% received pharmacologic prophylaxis; of these patients, 78% received unfractionated heparin subcutaneously and 22% received low molecular weight heparin subcutaneously. Among all hospitalized patients, 78% had an IMPROVE-VTE score of <2 (32% had a score of 0 and 46% had a score of 1). Among these patients at low risk of hospital-associated VTE, 81% received pharmacologic PPX. Differences in prophylaxis rates between patients at low vs high risk of VTE were statistically significant (p<0.001). Conclusion: In this contemporary cohort of adult, medical inpatients, >80% of patients who were at low risk of hospital-associated VTE received pharmacologic PPX, representing a group in whom PPX may be unnecessary. Using a risk-adapted approach such as the IMPROVE-VTE risk assessment model, rather than default PPX ordering, may reduce medical overuse and avoidable harms. Disclosures Chaudhry: CVS State of CT Clinical Pharmacy Program: Other: Paid Reviewer for CVS State of CT Clinical Pharmacy Program.
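The scoring step described above is simple to encode. The point weights and the < 2 low-risk cutoff are exactly as stated in the abstract; the factor identifiers are illustrative names.

```python
# IMPROVE-VTE point weights as given in the abstract.
IMPROVE_VTE_POINTS = {
    "prior_vte": 3,
    "known_thrombophilia": 2,
    "lower_limb_paralysis": 2,
    "active_cancer": 2,
    "immobilization": 1,
    "icu_admission": 1,
    "age_ge_60": 1,
}

def improve_vte_score(risk_factors: set) -> int:
    """Sum the IMPROVE-VTE points for a patient's risk factors at admission."""
    return sum(IMPROVE_VTE_POINTS[f] for f in risk_factors)

def is_low_risk(risk_factors: set) -> bool:
    """ASH-endorsed cutoff used in the study: score < 2 is low risk."""
    return improve_vte_score(risk_factors) < 2
```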


2021 ◽  
Vol 13 (2) ◽  
pp. 826
Author(s):  
Meiling Zhou ◽  
Xiuli Feng ◽  
Kaikai Liu ◽  
Chi Zhang ◽  
Lijian Xie ◽  
...  

Influenced by climate change, extreme weather events occur frequently and have huge impacts on urban areas, including urban waterlogging. Conducting risk assessments of urban waterlogging is a critical step to diagnose problems, improve infrastructure and achieve sustainable development in the face of extreme weather. This study takes Ningbo, a typical coastal city in the Yangtze River Delta, as an example to conduct a risk assessment of urban waterlogging with high-resolution remote sensing images and high-precision digital elevation models, and to further analyze the spatial distribution characteristics of waterlogging risk. Results indicate that waterlogging risk in the city proper of Ningbo is mainly low risk, accounting for 36.9%. The higher-risk and medium-risk areas have the same proportion, each accounting for 18.7%. They are followed by the lower-risk and high-risk areas, accounting for 15.5% and 9.6%, respectively. In terms of space, waterlogging risk in the city proper of Ningbo is high in the south and low in the north. The high-risk area is mainly located to the west of Jiangdong district and in the middle of Haishu district. The low-risk area is mainly distributed in the north of Jiangbei district. These results are consistent with the historical record of waterlogging in Ningbo, which proves the effectiveness of the risk assessment model and provides an important reference for the government to prevent and mitigate waterlogging. The optimized risk assessment model is also valuable for waterlogging risk assessments in other coastal cities. Based on this model, the waterlogging risk of coastal cities can be quickly assessed in combination with local characteristics, which will help improve a city’s capability to respond to waterlogging disasters and reduce socio-economic losses.


Author(s):  
Otto Huisman ◽  
Ricardo Almandoz ◽  
Thomas Schuster ◽  
Adriana Andrade Caballero ◽  
Leonardo Martinez Forero

Pipeline risk analysis is a common step carried out by operators in their overall Pipeline Integrity Management Process. There is a growing realization among operators of the need to adopt more proactive risk management approaches. This has brought about increased demand for more quantitative models to support risk reduction decision-making. Consequences of failure are a key component of these models where enhanced quantitative approaches can be deployed. Impacts to the environment and upon populations are key issues which both operators and regulatory bodies seek to minimize. Pipeline risk models and High Consequence Area (HCA) analyses play an increasingly important role in this context by allowing operators to identify a range of potential scenarios and the relative impact to receptors based upon the best available data sources. This paper presents the process and results of an HCA analysis project carried out by ROSEN for a major South American state-owned pipeline operator (hereafter referred to as ‘the Client’). This analysis was implemented using automated GIS processing methods and includes HCA analyses for approximately 2354 km of pipeline. The analysis was based on industry standards for both liquid and gas pipelines (i.e. American Petroleum Institute (API) and American Society of Mechanical Engineers (ASME)), but customized for the specific needs of the Client and the South American geographical context. A key use for the results of this analysis is to serve as input for the pipeline risk assessment model jointly developed by ROSEN Integrity Solutions, MACAW Engineering and the Client. The methodology for development of this model is briefly discussed, and operational uses of HCA results are illustrated. 
The benefits of this project include, but are not limited to: identifying areas that could be severely impacted should a pipeline failure occur; assessing the risk profile of credible threats in HCAs; and prioritizing preventative and mitigation measures at HCAs to reduce either the likelihood of failure or the impact of failure upon various receptors.


2011 ◽  
Vol 40 (1) ◽  
pp. 37-45 ◽  
Author(s):  
Heather M. B. MacRitchie ◽  
Christopher Longbottom ◽  
Margaret Robertson ◽  
Zoann Nugent ◽  
Karen Chan ◽  
...  

2016 ◽  
Vol 116 (09) ◽  
pp. 530-536 ◽  
Author(s):  
David J. Rosenberg ◽  
Anne Press ◽  
Joanna Fishbein ◽  
Martin Lesser ◽  
Lauren McCullagh ◽  
...  

Summary The IMPROVE Bleed Risk Assessment Model (RAM) remains the only bleed RAM for hospitalised medical patients, using 11 clinical and laboratory factors. The aim of our study was to externally validate the IMPROVE Bleed RAM. A retrospective chart review was conducted between October 1, 2012 and July 31, 2014. We applied the point scoring system to compute risk scores for each patient in the validation sample. We then dichotomised the patients into those with a score <7 (low risk) vs ≥ 7 (high risk), as outlined in the original study, and compared the rates of any bleed, non-major bleed, and major bleed (MB). Among the 12,082 subjects, there was an overall 2.6 % rate of any bleed within 14 days of admission. There was a 2.12 % rate of any bleed in those patients with a score of < 7 and a 4.68 % rate in those with a score ≥ 7 [Odds Ratio (OR) 2.3 (95 % CI=1.8–2.9), p<0.0001]. MB rates were 1.5 % in the patients with a score of < 7 and 3.2 % in the patients with a score of ≥ 7 [OR 2.2 (95 % CI=1.6–2.9), p<0.0001]. The area under the ROC curve was 0.63 for the validation sample. This study represents the largest external validation of a Bleed RAM in a hospitalised medically ill patient population. A cut-off point score of 7 or above was able to identify a high-risk patient group for MB and any bleed. The IMPROVE Bleed RAM has the potential to allow for more tailored approaches to thromboprophylaxis in medically ill hospitalised patients. Supplementary Material to this article is available online at www.thrombosis-online.com.
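The dichotomized comparison reduces to an odds ratio between two event proportions; plugging in the rates reported in the abstract reproduces the published ORs.

```python
def odds_ratio(p_high: float, p_low: float) -> float:
    """Odds ratio of an event between the high-score and low-score groups."""
    return (p_high / (1 - p_high)) / (p_low / (1 - p_low))

# Reported rates for score >= 7 vs < 7:
or_any_bleed = odds_ratio(0.0468, 0.0212)    # any bleed: rounds to 2.3
or_major_bleed = odds_ratio(0.032, 0.015)    # major bleed: rounds to 2.2
```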


2021 ◽  
Author(s):  
Federico Nichetti ◽  
Francesca Ligorio ◽  
Giulia Montelatici ◽  
Luca Porcu ◽  
Emma Zattarin ◽  
...  

Abstract Background: Hospitalized cancer patients are at increased risk for thromboembolic events (TEs). As untailored thromboprophylaxis is associated with hemorrhagic complications, the definition of a risk assessment model (RAM) in this population is needed. Objectives: INDICATE was an observational study enrolling hospitalized cancer patients, with the primary objective of assessing the Negative Predictive Value (NPV) for TEs during hospitalization and within 45 days from discharge of a low-grade Khorana Score (KS=0). Secondary objectives were to assess the KS Positive Predictive Value (PPV), the impact of TEs on survival, and the development of a new RAM. Materials and Methods: Assuming 7% of TEs in KS=0 patients as an unsatisfactory percentage and 3% as satisfactory, 149 patients were needed to detect the favorable NPV with one-sided α = 0.10 and power = 0.80. Stepwise logistic regression was adopted to identify variables included in a new RAM. Results: Among 535 enrolled patients, 153 (28.6%) had a KS=0. The primary study objective was met: 29 (5.4%) TEs were diagnosed, with 7 (4.6%) cases in the KS=0 group (NPV=95.4%, 95%CI: 90.8-98.1%; one-sided p=0.084). However, the PPV was low (5.7%, 95%CI: 1.9-12.8%); a new RAM based on albumin (OR 0.34, p=0.003), log(LDH) (OR 1.89, p=0.023) and presence of vascular compression (OR 5.32, p<.001) was developed and internally validated. Also, TEs were associated with poorer overall survival (OS) (median, 5.7 vs 24.8 months, p<.001). Conclusion: INDICATE showed that the KS has a good NPV but poor PPV for TEs in hospitalized cancer patients. A new RAM was developed and deserves further assessment in external cohorts.
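The headline NPV follows directly from the reported counts. In the check below, the denominators assume all 535 enrolled patients were evaluable, so the computed PPV can differ from the published 5.7% by rounding.

```python
def npv_ppv(total_n: int, total_events: int,
            low_risk_n: int, low_risk_events: int) -> tuple:
    """NPV of the KS=0 stratum and PPV of the KS>=1 stratum from the counts."""
    npv = (low_risk_n - low_risk_events) / low_risk_n
    ppv = (total_events - low_risk_events) / (total_n - low_risk_n)
    return npv, ppv

# Counts from the INDICATE abstract: 535 enrolled, 29 TEs, 153 with KS=0, 7 TEs in KS=0.
npv, ppv = npv_ppv(535, 29, 153, 7)  # npv ~ 0.954
```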


2021 ◽  
Vol 13 (18) ◽  
pp. 3704
Author(s):  
Pengcheng Zhao ◽  
Fuquan Zhang ◽  
Haifeng Lin ◽  
Shuwen Xu

Fire risk prediction is significant for fire prevention and fire resource allocation, and fire risk maps are effective tools for quantifying regional fire risk. Laoshan National Forest Park has many precious natural resources and tourist attractions, but no fire risk assessment model. This paper aims to construct a forest fire risk map for Nanjing Laoshan National Forest Park. The forest fire risk model is constructed from factors (altitude, aspect, topographic wetness index, slope, distance to roads and populated areas, normalized difference vegetation index, and temperature) that strongly influence the probability of fire in Laoshan. Since the importance of factors differs between study areas, the significance of each factor must be calculated for Laoshan. Once the significance calculation is complete, the fire risk model of Laoshan can be obtained and the fire risk map plotted from the model. The map clarifies the fire risk level of each part of the study area: 16.97% extremely low risk, 48.32% low risk, 17.35% moderate risk, 12.74% high risk and 4.62% extremely high risk. Compared against MODIS fire anomaly point data, the accuracy of the risk map is 76.65%.
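A hedged sketch of the weighted-overlay idea behind such a model. The factor weights and class breaks below are hypothetical placeholders; the paper derives the significance of each factor specifically for Laoshan.

```python
# Hypothetical weights for the factors named in the abstract (sum to 1.0).
WEIGHTS = {"altitude": 0.10, "aspect": 0.05, "twi": 0.10, "slope": 0.15,
           "dist_roads": 0.15, "dist_populated": 0.10, "ndvi": 0.20,
           "temperature": 0.15}

def fire_risk_index(factor_scores: dict) -> float:
    """Weighted sum of per-location factor scores normalized to [0, 1]."""
    return sum(WEIGHTS[k] * v for k, v in factor_scores.items())

def risk_class(index: float) -> str:
    """Five classes matching the paper's legend; equal-interval breaks assumed."""
    for cutoff, label in [(0.2, "extremely low"), (0.4, "low"),
                          (0.6, "moderate"), (0.8, "high")]:
        if index < cutoff:
            return label
    return "extremely high"
```

Applied per pixel over the study area, such an index yields the class proportions reported in the abstract.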


2022 ◽  
Vol 12 ◽  
Author(s):  
Xitao Wang ◽  
Xiaolin Dou ◽  
Xinxin Ren ◽  
Zhuoxian Rong ◽  
Lunquan Sun ◽  
...  

Pancreatic ductal adenocarcinoma (PDAC) is a highly heterogeneous malignancy. Single-cell RNA sequencing (scRNA-seq) technology enables quantitative gene expression measurements that underlie the phenotypic diversity of cells within a tumor. By integrating PDAC scRNA-seq and bulk sequencing data, we aim to extract relevant biological insights into the ductal cell features that lead to different prognoses. Firstly, differentially expressed genes (DEGs) of ductal cells between normal and tumor tissues were identified through scRNA-seq data analysis. The effect of DEGs on PDAC survival was then assessed in the bulk sequencing data. Based on these DEGs (LY6D, EPS8, DDIT4, TNFSF10, RBP4, NPY1R, MYADM, SLC12A2, SPCS3, NBPF15) affecting PDAC survival, a risk score model was developed to classify patients into high-risk and low-risk groups. The results showed that overall survival was significantly longer in the low-risk group (p < 0.05). The model also showed reliable predictive power in different subgroups of patients. The high-risk group had a higher tumor mutational burden (TMB) (p < 0.05), with significantly higher mutation frequencies in KRAS and ADAMTS12 (p < 0.05). Meanwhile, the high-risk group had a higher tumor stemness score (p < 0.05). However, there was no significant difference in immune cell infiltration scores between the two groups. Lastly, drug candidates targeting risk model genes were identified, and seven compounds might act against PDAC through different mechanisms. In conclusion, we have developed a validated survival assessment model, which acted as an independent risk factor for PDAC.
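Such expression-based risk scores are typically a weighted sum over the signature genes, with the cohort split at the median score. A minimal sketch under that assumption; the coefficients would come from the fitted survival model and are treated as inputs here.

```python
# The ten DEGs named in the abstract.
GENES = ["LY6D", "EPS8", "DDIT4", "TNFSF10", "RBP4",
         "NPY1R", "MYADM", "SLC12A2", "SPCS3", "NBPF15"]

def risk_score(expression: dict, coefficients: dict) -> float:
    """Prognostic score as a weighted sum over the ten signature genes."""
    return sum(coefficients[g] * expression[g] for g in GENES)

def split_by_median(scores: list) -> list:
    """Label each patient high/low risk relative to the cohort median score."""
    ordered = sorted(scores)
    n = len(ordered)
    median = ordered[n // 2] if n % 2 else (ordered[n // 2 - 1] + ordered[n // 2]) / 2
    return ["high" if s > median else "low" for s in scores]
```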

