risk criteria
Recently Published Documents


TOTAL DOCUMENTS: 410 (FIVE YEARS: 75)

H-INDEX: 32 (FIVE YEARS: 4)

2022 ◽  
Vol 2022 ◽  
pp. 1-15
Author(s):  
Hao-Yu Wang ◽  
Bo Xu ◽  
Chen-Xi Song ◽  
Chang-Dong Guan ◽  
Li-Hua Xie ◽  
...  

Background. There is a paucity of real-world data regarding the clinical impact of dual antiplatelet therapy (DAPT) interruption (temporary or permanent) among patients at high ischemic risk. The aim of this study was to assess the risk of cardiovascular events after interruption of DAPT in a high-risk PCI population. Methods. This study used data from the Fuwai PCI registry, a large, prospective cohort of consecutive patients who underwent PCI. We assessed 3,931 patients who met at least 1 high ischemic risk criterion for stent-related recurrent ischemic events, as proposed in the 2017 ESC focused update on DAPT, and who were free of major cardiac events in the first 12 months. The primary ischemic endpoint was 30-month major adverse cardiac and cerebrovascular events; the key safety endpoints were BARC class 2, 3, or 5 bleeding and net adverse clinical events. Results. DAPT interruption within 12 months occurred in 1,122 patients (28.5%), mostly because of bleeding events or noncompliance with treatment. A multivariate Cox regression model, propensity score (PS) matching, and inverse probability of treatment weighting (IPTW) based on the propensity score all showed that DAPT interruption significantly increased the risk of the primary ischemic endpoint compared with prolonged DAPT (3.9% vs. 2.2%; Cox-adjusted hazard ratio (HR): 1.840; 95% confidence interval (CI): 1.247 to 2.716; PS matching-HR: 2.049 [1.236–3.399]; IPTW-adjusted HR: 1.843 [1.250–2.717]). This difference was driven mainly by all-cause death (1.8% vs. 0.7%) and MI (1.3% vs. 0.5%). The rate of net adverse clinical events (4.9% vs. 3.2%; Cox-adjusted HR: 1.581 [1.128–2.216]; PS matching-HR: 1.639 [1.075–2.499]; IPTW-adjusted HR: 1.554 [1.110–2.177]) was also higher in patients with DAPT interruption (≤12 months), whereas no significant differences between groups were observed in BARC 2, 3, or 5 bleeding.
These findings were consistent across the stent-driven high ischemic risk subsets with respect to the primary ischemic endpoint, with a greater magnitude of harm among patients with diffuse multivessel diabetic coronary artery disease. Conclusions. In patients undergoing high-risk PCI, interruption of DAPT in the first 12 months occurred infrequently and was associated with a significantly higher adjusted risk of major adverse cardiovascular events and net adverse clinical events. The 2017 ESC stent-driven high ischemic risk criteria may help clinicians select patients for long-term DAPT when the ischemic risk clearly outweighs the bleeding risk.
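The abstract's IPTW adjustment can be illustrated with a minimal sketch: given a propensity score (the modeled probability of DAPT interruption), treated patients are weighted by 1/PS and controls by 1/(1 − PS), optionally stabilized by the marginal treatment probability. This is a generic illustration with hypothetical numbers, not the registry's actual model or data.

```python
# Illustrative sketch of inverse probability of treatment weighting (IPTW)
# from propensity scores. All numbers below are hypothetical, not taken
# from the Fuwai PCI registry.

def iptw_weights(treated, propensity, stabilized=True):
    """Weight = 1/ps for treated, 1/(1-ps) for controls.
    Stabilized weights multiply by the marginal treatment probability,
    which reduces the variance of the weighted estimator."""
    p_treat = sum(treated) / len(treated)  # marginal P(treatment)
    weights = []
    for t, ps in zip(treated, propensity):
        w = 1.0 / ps if t else 1.0 / (1.0 - ps)
        if stabilized:
            w *= p_treat if t else (1.0 - p_treat)
        weights.append(w)
    return weights

# Hypothetical cohort: 1 = DAPT interruption, 0 = prolonged DAPT
treated    = [1, 1, 0, 0, 0]
propensity = [0.4, 0.3, 0.5, 0.2, 0.35]
w = iptw_weights(treated, propensity)
```

Outcomes (e.g., the 30-month MACCE rate) are then compared in the weighted pseudo-population, which balances the measured covariates that feed the propensity model.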


2021 ◽  
Vol 108 (Supplement_9) ◽  
Author(s):  
Wing Ching Li ◽  
Omar Elboraey ◽  
Mohammad Saeed Kilani ◽  
Jeremy Bruce Ward ◽  
Ilayaraja Rajendran

Abstract Background Gallstone-related diseases account for almost one-third of acute surgical admissions, with presentations varying from biliary colic to sepsis. Several studies have evaluated the role of percutaneous cholecystostomy (PC) under radiological guidance (RG) as part of the management of acute cholecystitis. However, limited literature evaluates patient outcomes based on the indication and optimal timing of cholecystostomy. This study was therefore set up to assess the difference in clinical outcome between patients undergoing cholecystostomy with overt sepsis (OS) and with impending sepsis (IS). Methods A retrospective observational study was conducted using a prospective database of patients who underwent PC under RG between 03/2014 and 03/2021. NICE's sepsis risk stratification tool was used to divide patients into OS and IS groups: the OS group comprised patients with ≥1 high-risk criterion, and the IS group comprised patients with ≥2 moderate- to high-risk criteria. The primary outcomes were 30-day mortality and length of stay (LoS); the secondary outcome was post-procedural bile leak (BL). Continuous and categorical variables were analysed using Mann-Whitney U and Chi-squared tests, respectively. A p-value of < 0.05 was considered statistically significant. Results Some 27 patients were included. The median age was 80 (range 61-90). The majority of the patients (77.78%, n = 21) were unfit for surgery, with a Charlson Comorbidity Index ranging from 3 to 12. The median length of hospital stay was 17 days in the OS group and 15 days in the IS group (p = 0.47). There was no significant difference in bile leak (IS 1/20 vs OS 0/7; p = 0.56) or drain accidents (IS 8/20 vs OS 1/7; p = 0.35). Two patients in the IS group underwent an uncomplicated interval cholecystectomy. The 30-day mortality rate was significantly higher in OS (IS 0/20 vs OS 4/7; p = 0.00039).
Conclusions Percutaneous cholecystostomy is generally safe irrespective of patients' co-morbidities and was not associated with significant long-term complications or mortality. Early cholecystostomy, before overt sepsis develops, was associated with a lower 30-day mortality rate and better outcomes. Further clinical studies may be required to determine which specific patient groups would benefit from percutaneous cholecystostomy.
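The mortality comparison above (IS 0/20 vs OS 4/7) is a 2x2 categorical test. As a sketch, a Yates-corrected Pearson chi-squared test on that table can be done in pure Python; the counts come from the abstract, but this particular implementation is ours for illustration and may differ from the software the authors used (so the p-value need not match the reported 0.00039 exactly).

```python
# Illustrative chi-squared test on the 30-day mortality table from the
# abstract: rows IS (0 dead, 20 alive) and OS (4 dead, 3 alive).
import math

def chi2_yates_2x2(a, b, c, d):
    """Yates-corrected chi-squared for a 2x2 table [[a, b], [c, d]].
    Returns (statistic, two-sided p-value); degrees of freedom = 1."""
    n = a + b + c + d
    num = (abs(a * d - b * c) - n / 2) ** 2 * n
    den = (a + b) * (c + d) * (a + c) * (b + d)
    stat = num / den
    # For df = 1 the chi-squared survival function is erfc(sqrt(x / 2))
    p = math.erfc(math.sqrt(stat / 2))
    return stat, p

stat, p = chi2_yates_2x2(0, 20, 4, 3)  # IS row, then OS row
```

With counts this small, an exact test (Fisher's) would also be a defensible choice; either way the difference is significant at the 0.05 level.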


2021 ◽  
Vol 100 (10) ◽  
pp. 1043-1051
Author(s):  
Irina V. May ◽  
Svetlana V. Kleyn ◽  
Ekaterina V. Maksimova ◽  
Stanislav Yu. Balashov ◽  
Mihail Yu. Tsinker

Introduction. The quality of the environment and the health of the population are key factors in the socio-economic development of society. To reduce emissions of pollutants into the atmosphere and improve living conditions in contaminated cities, the federal project "Clean Air" has been developed. Analysing the efficiency and effectiveness of air protection measures against public health risk criteria is a priority task for achieving the project's targets and ensuring a safe living environment. Materials and methods. The initial information for the hygienic assessment and health risk assessment of the city of Bratsk comprised in-situ measurements of atmospheric air quality, collected within the framework of environmental and socio-hygienic monitoring, together with summary calculations of the dispersion of emissions from stationary and mobile sources. Results. Based on the hygienic assessment and health risk assessment, 13 priority impurities were identified for inclusion in the systematic monitoring program. Conclusion. Analysing measures to reduce pollutant emissions against health risk criteria showed that, in general, the directions of effort are adequate to the list of priority risk factors for the health of citizens. For a correct assessment of effectiveness and efficiency, it is advisable to supplement the analysis with data on the specific emission sources targeted by planned measures; to complement the health risk assessment with data on the actual morbidity of the city's population; and to assess and discuss with business entities the results of the health risk assessment, including all identified discrepancies between declared emissions, calculated pollution levels, and the actual sanitary and hygienic situation in the city.


2021 ◽  
Vol 26 (Supplement_1) ◽  
pp. e93-e95
Author(s):  
Charlotte Grandjean-Blanchet ◽  
Stephanie Villeneuve ◽  
Carolyn Beck ◽  
Michaela Cada ◽  
Daniel Rosenfield ◽  
...  

Abstract Primary Subject area Emergency Medicine - Paediatric Background While the management of febrile neutropenia in patients with cancer has clear, evidence-based guidelines, the management of previously healthy, immunocompetent children with a febrile illness and a first episode of neutropenia is less well understood. These patients are often treated similarly, with empiric antibiotics and hospitalization, despite studies demonstrating that this population, if well-appearing with a short history of neutropenia, is at low risk of serious bacterial infections. Less aggressive management should therefore be considered in patients meeting low-risk criteria. Objectives The aim of our quality improvement (QI) study was to decrease the number of unnecessary hospitalizations and empiric antibiotic prescriptions by 50% over a 12-month period for otherwise healthy, well-appearing patients presenting to the emergency department (ED) with a first episode of febrile neutropenia. Design/Methods A team of stakeholders from Hematology, Infectious Disease, Pediatrics, and Emergency Medicine was assembled. A review of the literature, peer institutions, and local practices for managing febrile neutropenia in healthy children was performed. Using the Model for Improvement, a guideline for the management of healthy children with a first episode of febrile neutropenia was developed and refined using PDSA cycles. In January 2020, the guideline was launched for clinical use in the ED. Education, targeted audit and feedback, pathway modifications, and reminders were used to address knowledge gaps and staff turnover. A family of measures was analyzed using run charts and statistical process control (SPC) methods. Results Eighteen months of baseline data identified nineteen low-risk patients, of whom 84% were hospitalized and/or received antibiotics. We also found that many patients were misdiagnosed with neutropenia because bands were excluded from the absolute neutrophil count (ANC).
In the first twelve months of the intervention, sixteen patients met low-risk criteria. Hospitalization and/or antibiotic use in this population decreased to 25%, and all blood cultures were negative. Recognition of true severe febrile neutropenia also improved: forty-one patients had a neutrophil count < 0.5 but an ANC > 0.5, and hospitalization and/or antibiotic use in this population decreased from 52% to 10%. Conclusion Through a multi-faceted, multidisciplinary QI study, we improved resource stewardship and value-based care by reducing unnecessary hospitalizations and antibiotics in low-risk patients with a first episode of febrile neutropenia. Next steps include iterations to the guideline to increase impact, along with sustainability planning. This work can easily be adopted by other pediatric and community sites caring for children.
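The misclassification pitfall flagged in the abstract follows directly from the ANC formula: the absolute neutrophil count must include band forms as well as segmented neutrophils. A minimal sketch with hypothetical values (not patient data) shows how excluding bands can mislabel a patient as severely neutropenic:

```python
# Illustrative sketch: ANC = WBC x (% segmented neutrophils + % bands) / 100.
# All values below are hypothetical examples, not patient data.

def anc(wbc_per_uL, pct_segs, pct_bands):
    """Absolute neutrophil count in cells per microliter."""
    return wbc_per_uL * (pct_segs + pct_bands) / 100.0

wbc = 3000           # white cells per microliter
segs, bands = 12, 8  # percent segmented neutrophils and percent bands

wrong = anc(wbc, segs, 0)      # bands excluded -> 360/uL, looks "< 500"
right = anc(wbc, segs, bands)  # bands included -> 600/uL, not severe
```

Here the bands-excluded count falls below the conventional severe-neutropenia cutoff of 500/µL (0.5 × 10⁹/L, the threshold used in the abstract), while the true ANC does not.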


2021 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Yasamin Tavakoli Haji Abadi ◽  
Soroush Avakh Darestani

Purpose Because the food industry is directly related to the health of humans and society, and because little attention has been paid to the assessment of sustainable supply chain risk management in this area, it qualifies as an important research area. This study aims to develop a framework for assessing sustainable supply chain risk management in the food industry (confectionery and chocolate), with a case study of three generic companies denoted A1–A3. The proposed risk management framework was evaluated in these three manufacturing companies, which were then ranked with the fuzzy Weighted Aggregated Sum Product Assessment (F-WASPAS) method in Excel. Design/methodology/approach The evaluation was carried out using the integrated multi-criteria decision-making methods Best-Worst Method (BWM) and WASPAS. Via an extensive literature review covering the sustainable supply chain, the sustainable food supply chain, and their associated risks, 9 risk criteria and 59 risk sub-criteria were identified. Using expert opinion from the food industry, 8 risk criteria and 39 risk sub-criteria were selected for the final evaluation. The final weights of the main criteria and sub-criteria were obtained using the fuzzy BWM (F-BWM) method via LINGO software. Risk management in the sustainable supply chain plays the role of identifying, analyzing, and providing solutions to control risks. Findings The following criteria in each group gained the most weight: loss of credibility and brand, dangerous and unhealthy working environment, unproductive use of energy, human error, supplier quality, quality risk, product perishability, and security. Among the criteria, the economic risks carry the highest weight, and among the alternatives, A3 obtained the first ranking. Originality/value Modeling of risk for the food supply chain is the unique contribution of this work.
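WASPAS ranks alternatives by a joint criterion Q_i = λ·WSM_i + (1 − λ)·WPM_i, combining a weighted-sum score with a weighted-product score. The sketch below shows the crisp (non-fuzzy) version of this aggregation; the paper itself uses the fuzzy variant (F-WASPAS), and the criterion weights and alternative scores here are hypothetical, not the study's data.

```python
# Minimal crisp WASPAS sketch. The scores and weights are hypothetical;
# the cited study uses fuzzy numbers (F-WASPAS) rather than crisp values.
import math

def waspas(scores, weights, lam=0.5):
    """scores: one list of normalized criterion scores in (0, 1] per
    alternative. Q = lam * weighted sum + (1 - lam) * weighted product."""
    q = []
    for row in scores:
        wsm = sum(w * x for w, x in zip(weights, row))          # WSM part
        wpm = math.prod(x ** w for w, x in zip(weights, row))   # WPM part
        q.append(lam * wsm + (1 - lam) * wpm)
    return q

weights = [0.5, 0.3, 0.2]        # criterion weights (sum to 1)
scores = [[0.90, 0.70, 0.80],    # alternative A1
          [0.60, 0.90, 0.70],    # alternative A2
          [0.95, 0.85, 0.90]]    # alternative A3
q = waspas(scores, weights)
best = max(range(len(q)), key=q.__getitem__)  # index of top-ranked alternative
```

With λ = 0.5 the two aggregation strategies contribute equally; in this toy example A3 dominates on every criterion and so ranks first under any λ.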


2021 ◽  
Vol 12 ◽  
Author(s):  
Andrea López-Cáceres ◽  
María Velasco-Rueda ◽  
Elkin Garcia-Cifuentes ◽  
Ignacio Zarante ◽  
Diana Matallana

Frontotemporal dementia (FTD) is a highly heritable condition. Up to 40% of FTD is familial, and an estimated 15% to 40% is due to single-gene mutations. It has been estimated that G4C2 hexanucleotide repeat expansions in the C9ORF72 gene can explain up to 37.5% of familial cases of FTD, especially in populations of Caucasian origin. The purpose of this paper is to evaluate hereditary risk across the clinical phenotypes of FTD and the frequency of the G4C2 expansion in a Colombian cohort diagnosed with FTD. Methods: A total of 132 FTD patients were diagnosed according to established criteria for the behavioral variant FTD, logopenic variant PPA, non-fluent agrammatic PPA, and semantic variant PPA. Hereditary risk across the clinical phenotypes was classified into four categories that indicate the pathogenic relationship of the mutation: high, medium, low, and apparently sporadic, based on the categories proposed by Wood and collaborators. All subjects were also examined for the C9ORF72 hexanucleotide expansion (defined as >30 repetitions). Results: There were no significant differences in demographic characteristics between the clinical phenotypes of FTD. The most frequent phenotype was bvFTD (62.12%). According to the risk classification, 72 patients (54.4%) met the criteria for sporadic cases; among the familial cases, 23 (17.4%) fulfilled the high-risk criteria, 23 (17.4%) the low-risk criteria, and 14 (10.6%) the medium-risk criteria. The C9ORF72 expansion frequency was 0.76% (1/132). Conclusion: The FTD heritability observed in this study was very similar to results reported in the literature. The C9ORF72 expansion frequency was low. Colombia is a triethnic country with a high frequency of Amerindian genetic markers, which is consistent with the low repeat frequency observed here.
This study provides an initial report of the frequency of hexanucleotide repeat expansions in C9ORF72 in patients with FTD in a Colombian population and paves the way for further study of the possible genetic causes of FTD in Colombia.


2021 ◽  
Author(s):  
Anand Balu Nellippallil ◽  
Parker R. Berthelson ◽  
Luke Peterson ◽  
Raj Prabhu

2021 ◽  
pp. 019459982110203
Author(s):  
Sahaja Acharya ◽  
Rebecca N. Sinard ◽  
Gustavo Rangel ◽  
Jeffrey C. Rastatter ◽  
Anthony Sheyn

Objective Indications for adjuvant radiation in pediatric salivary gland carcinoma rely on high-risk criteria extrapolated from adult data. We sought to determine whether adult-derived high-risk criteria were prognostic in children aged ≤21 years and in young adults aged 22 to 39 years. Study Design Cross-sectional analysis of a hospital-based national registry. Setting Patients were identified from the National Cancer Database between 2004 and 2015. Methods High-risk criteria were defined as adenoid cystic histology, intermediate/high grade, T3/T4, positive margins, and/or lymph node involvement. Exact matching was used to adjust for differences in baseline characteristics between pediatric and young adult patients. Results We identified 215 pediatric patients aged ≤21 years, 317 patients aged 22 to 30 years, and 466 patients aged 31 to 39 years. Within the pediatric cohort, there was no significant difference in overall survival (OS) between low- and high-risk groups (5-year OS, 100% vs 98.5%; P = .29). In contrast, within the young adult cohorts, there was a significant difference in OS between low- and high-risk groups in patients aged 22 to 30 years (5-year OS, 100% vs 96.1%; P = .01) and 31 to 39 years (5-year OS, 100% vs 88.5%; P < .001). When high-risk patients were matched 1:1 on high-risk criteria and race, pediatric patients had better OS than those aged 22 to 30 years (P = .044) and those aged 31 to 39 years (P = .005). Conclusion Children have excellent OS, irrespective of adult-derived high-risk status. These findings underscore the need to understand how age modifies clinicopathologic risk factors.
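The 1:1 exact matching described above pairs each patient in one cohort with a patient in the other cohort who has identical values on the matching variables. A minimal sketch of that idea, using hypothetical records (not National Cancer Database data or the authors' actual procedure):

```python
# Illustrative 1:1 exact matching on high-risk criteria and race.
# Records below are hypothetical, not registry data.
from collections import defaultdict

def exact_match_1to1(group_a, group_b, keys):
    """Pair each record in group_a with an unused group_b record that has
    identical values on all `keys`; unmatched records are dropped."""
    pool = defaultdict(list)
    for rec in group_b:
        pool[tuple(rec[k] for k in keys)].append(rec)
    pairs = []
    for rec in group_a:
        bucket = pool[tuple(rec[k] for k in keys)]
        if bucket:
            pairs.append((rec, bucket.pop()))
    return pairs

peds   = [{"risk": "high", "race": "white"}, {"risk": "high", "race": "black"}]
adults = [{"risk": "high", "race": "white"}, {"risk": "low",  "race": "black"}]
pairs = exact_match_1to1(peds, adults, ("risk", "race"))
```

Because matching is exact, the matched pairs are identical on the chosen covariates by construction, so any residual OS difference cannot be attributed to those covariates.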

