Comparison of endotracheal intubation learning curves with tracheoscopic ventilation tube for simulated difficult intubation between expert and novice anesthesiologists.

2020 ◽  
Author(s):  
Heezoo Kim ◽  
Dong Kyu Lee ◽  
Choong Hun Lee ◽  
Myung-Hoon Gong ◽  
Jung Suk Oh

Abstract Background: The tracheoscopic ventilation tube (TVT) is a specially designed single-lumen endotracheal tube with an integrated camera. It was developed to facilitate endobronchial blocker insertion without bronchoscopy, and its ability to visualize airway anatomy has drawn attention for difficult intubation. To clarify the feasibility of the TVT in difficult intubation, we compared the intubation learning curves of novice and expert anesthesiologists. Methods: We enrolled 182 patients who presented as Cormack-Lehane (CL) grade IIb or III under cervical in-line stabilization and 4 trainees (2 novices, 2 experts) at a single tertiary care teaching university hospital. All trainees performed intubation with the TVT during laryngoscopy. Intubation attempts were limited to two, each within 30 seconds. Before every attempt, trainees visualized an imaginary pathway from the teeth to the vocal cords and shaped the stylet accordingly. Intubation was confirmed by three successive ETCO2 measurements > 30 mmHg. Using CUSUM analysis, the trial continued until every trainee reached an acceptable failure rate. Results: Of the patients, 94.5% were CL grade IIb and 5.5% were grade III. The median number of attempts required to reach acceptable performance (at the 10% acceptable failure rate) was 36. The overall failure rate was 5.5% (95% CI: 2.2-8.8%): 6.9% (95% CI: 2.0-11.8%) for novices and 3.7% (95% CI: 0.0-7.8%; P=0.165, Cohen's h=0.14) for experts. Intubation time was about 3 seconds longer for novices than for experts (mean difference=2.8, 95% CI: 1.3-4.3, P<0.001, Cohen's d=0.57). Conclusions: Intubation with the TVT in CL grades IIb and III was easy to learn and could be an alternative for difficult intubation. It required only a small number of cases to reach acceptable performance and provided a short learning period even for novice anesthesiologists, with failure rates similar to those of experienced anesthesiologists.
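The CUSUM monitoring described in the Methods can be sketched as follows. This is a generic Kestin-style formulation, not the authors' exact implementation: the 10% acceptable failure rate (p0) comes from the abstract, while the unacceptable rate p1 and the type I/II error rates are illustrative assumptions.

```python
import math

def cusum_learning_curve(outcomes, p0=0.10, p1=0.20, alpha=0.10, beta=0.10):
    """CUSUM path for a procedural learning curve (Kestin-style sketch).

    outcomes: sequence of attempt results, 0 = success, 1 = failure.
    p0: acceptable failure rate (10% in the abstract).
    p1, alpha, beta: assumed unacceptable rate and error rates.
    Returns (path, h0, h1): the running CUSUM score after each attempt
    and the lower/upper decision limits. A path that falls below h0
    supports competence at the p0 level.
    """
    P = math.log(p1 / p0)
    Q = math.log((1 - p0) / (1 - p1))
    s = Q / (P + Q)                                # decrement applied on success
    h0 = -math.log((1 - alpha) / beta) / (P + Q)   # lower decision limit
    h1 = math.log((1 - beta) / alpha) / (P + Q)    # upper decision limit
    score, path = 0.0, []
    for failed in outcomes:
        score += (1 - s) if failed else -s
        path.append(score)
    return path, h0, h1
```

An unbroken run of successes drives the score down toward h0, signalling acceptable performance; repeated failures push it up toward h1, which would signal an unacceptable failure rate.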

2021 ◽  
Vol 21 (1) ◽  
Author(s):  
Seven Johannes Sam Aghdassi ◽  
Britta Kohlmorgen ◽  
Christin Schröder ◽  
Luis Alberto Peña Diaz ◽  
Norbert Thoma ◽  
...  

Abstract Background Early detection of clusters of pathogens is crucial for infection prevention and control (IPC) in hospitals. Conventional manual cluster detection is usually restricted to certain areas of the hospital and to multidrug-resistant organisms. Automation can increase the comprehensiveness of cluster surveillance without depleting human resources. We aimed to describe the application of an automated cluster alert system (CLAR) in the routine IPC work of a hospital, and to provide information on the clusters detected and their properties. Methods CLAR was continuously utilized during the year 2019 at Charité university hospital. CLAR analyzed microbiological and patient-related data to calculate a pathogen baseline for every ward. Each day, this baseline was compared to the data of the previous 14 days. If the baseline was exceeded, a cluster alert was generated and sent to the IPC team. From July 2019 onwards, alerts were systematically categorized as relevant or non-relevant at the discretion of the IPC physician in charge. Results In one year, CLAR detected 1,714 clusters. The median number of isolates per cluster was two. The most common cluster pathogens were Enterococcus faecium (n = 326, 19 %), Escherichia coli (n = 274, 16 %) and Enterococcus faecalis (n = 250, 15 %). The majority of clusters (n = 1,360, 79 %) comprised susceptible organisms. Relevance assessment was performed for 906 alerts, of which 317 (35 %) were classified as relevant. Conclusions CLAR demonstrated the capability of detecting small clusters and clusters of susceptible organisms. Future improvements must aim to reduce the number of non-relevant alerts without impeding detection of relevant clusters. Digital solutions hold considerable potential for improving patient care in IPC. Systems such as CLAR could be adapted to other hospitals and healthcare settings and thereby help realize that potential.
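The daily comparison step can be illustrated with a minimal sketch. The function name, the data layout, and the precomputed per-ward baselines are all assumptions for illustration; the abstract does not specify how CLAR derives its baseline.

```python
from collections import Counter
from datetime import date, timedelta

def cluster_alerts(isolates, baselines, today, window_days=14):
    """Hypothetical sketch of a CLAR-like daily cluster check.

    isolates: iterable of (ward, pathogen, isolation_date) tuples.
    baselines: dict mapping (ward, pathogen) -> expected count per
        14-day window (supplied precomputed here for illustration).
    Returns (ward, pathogen, observed) triples whose count in the last
    `window_days` exceeds the baseline -- each would trigger one alert.
    """
    window_start = today - timedelta(days=window_days)
    recent = Counter(
        (ward, pathogen)
        for ward, pathogen, d in isolates
        if window_start < d <= today
    )
    return [
        (ward, pathogen, n)
        for (ward, pathogen), n in recent.items()
        if n > baselines.get((ward, pathogen), 0)
    ]
```

In a CLAR-like workflow, each returned triple would correspond to one alert sent to the IPC team for relevance assessment.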


2018 ◽  
Vol 39 (11) ◽  
pp. 1353-1359 ◽  
Author(s):  
Annie I. Chen ◽  
Warren B. Bilker ◽  
Keith W. Hamilton ◽  
Judith A. O’Donnell ◽  
Irving Nachamkin

Abstract Objective: To describe the pattern of blood culture utilization in an academic university hospital setting. Design: Retrospective cohort study. Setting: A 789-bed tertiary-care university hospital that processes more than 40,000 blood cultures annually. Methods: We analyzed blood cultures collected from adult inpatients at the Hospital of the University of Pennsylvania between July 1, 2014, and June 30, 2015. Descriptive statistics and regression models were used to analyze patterns of blood culture utilization: frequency of blood cultures, use of repeat cultures following a true-positive culture, and number of sets drawn per day. Results: In total, 38,939 blood culture sets were drawn during 126,537 patient days (incidence rate, 307.7 sets per 1,000 patient days). The median number of blood culture sets drawn per hospital encounter was 2 (range, 1–76 sets). The median interval between blood cultures was 2 days (range, 1–71 days). Oncology services and cultures with gram-positive cocci were significantly associated with greater odds of having repeat blood cultures drawn the following day. Emergency services had the highest rate of drawing single blood-culture sets (16.9%), while oncology services had the highest frequency of drawing ≥5 blood culture sets within 24 hours (0.91%). Approximately 10% of encounters had at least 1 true-positive culture, and 89.2% of those encounters had repeat blood cultures drawn. The relative risk of a patient having repeat blood cultures was lower for those in emergency, surgery, and oncology services than for those in general medicine. Conclusions: Ordering practices differed by service and culture results. Analyzing blood culture utilization can contribute to the development of guidelines and benchmarks for appropriate usage.
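The headline incidence rate follows directly from the reported counts; a one-line helper reproduces it:

```python
def incidence_rate(events, patient_days, per=1_000):
    """Events per `per` patient-days."""
    return events / patient_days * per

# Figures from the abstract: 38,939 blood culture sets over 126,537 patient days.
rate = incidence_rate(38_939, 126_537)  # ≈ 307.7 sets per 1,000 patient days
```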


Author(s):  
Gurumayum Sonachand Sharma ◽  
Anupam Gupta ◽  
Meeka Khanna ◽  
Naveen Bangarpet Prakash

Abstract Objective The aim of this study was to observe the effect of post-stroke depression on functional outcomes during inpatient rehabilitation. Patients and Methods This was a prospective observational study conducted in the neurological rehabilitation unit of a tertiary care university hospital from October 2019 to April 2020. Participants were patients with first-ever stroke, male or female, aged ≥18 years, with stroke duration of less than 1 year. All participants were assessed at admission and after 14 sessions of inpatient rehabilitation with the depression subscale of the Hospital Anxiety and Depression Scale (HADS-D) and the Hamilton Depression Rating Scale (HDRS). The stroke outcome measures used were the Barthel Index (BI), Scandinavian Stroke Scale (SSS), and Modified Rankin Scale (MRS). Results A total of 30 participants (18 males) were included, with a median stroke duration of 90 days and a median age of 58 years. Sixteen patients had ischemic and 14 had hemorrhagic stroke. Of these, 57% (n = 17) had symptoms of depression (HADS-D >7). Participants in both groups (with and without depression) showed improvement in all functional outcome measures (BI, SSS, MRS) at discharge compared with admission. The changes in the outcome measures were statistically significant within groups (p < 0.05) but not between groups (p > 0.05). Conclusion Post-stroke depression is common among stroke survivors of less than 1 year duration. With an inpatient rehabilitation program, there was no significant difference in functional outcomes between stroke patients with and without depression.


OTO Open ◽  
2021 ◽  
Vol 5 (1) ◽  
pp. 2473974X2199474
Author(s):  
Mursalin M. Anis ◽  
Jennylee Diaz ◽  
Mausam Patel ◽  
Adam T. Lloyd ◽  
David E. Rosow

Objective Glottic keratosis poses a challenge because a decision to biopsy must weigh the likelihood of dysplasia and cancer against the voice outcome after biopsy. We determined the significance of laryngoscopic findings and agreement among clinicians to identify those specific findings. Study Design Retrospective case-control study. Setting Tertiary care university hospital. Methods Adults with glottic keratosis with preoperative office laryngoscopies were included. Preoperative videostroboscopies were reviewed by a blinded reviewer. Multivariable logistic regression was used to examine the correlation between laryngoscopic appearance of glottic keratosis and presence or absence of high-grade dysplasia or carcinoma on biopsies. Consensus among head and neck cancer surgeons to detect specific laryngoscopic findings was evaluated by presenting representative laryngoscopies to a blinded cohort. Interrater reliability was calculated using Fleiss's κ. Results Sixty glottic keratotic lesions met inclusion criteria. On logistic regression, both erythroplakia and aberrant microvasculature such as vascular speckling were significantly associated with high-grade dysplasia and carcinoma (P = .002 and P = .03, respectively). Interrater reliability among clinicians to identify erythroplakia and aberrant microvasculature was minimal (κ = 0.35 and κ = 0.29, respectively). Interrater reliability improved with the use of virtual chromoendoscopy. Conclusion The presence of erythroplakia and aberrant microvasculature in glottic keratosis is associated with the presence of high-grade dysplasia or carcinoma. Virtual chromoendoscopy can be used to improve reliability for detecting erythroplakia and vascular speckling, and this is a potential area for practice-based learning. Clinicians should identify and consider immediate diagnostic biopsy of suspicious glottic keratosis.
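Fleiss' κ, used here for the interrater analysis, can be computed from a subjects × categories count table. This is the standard textbook formula, not the authors' code, and the example data in the usage comment are invented.

```python
def fleiss_kappa(ratings):
    """Fleiss' kappa for agreement among a fixed number of raters.

    ratings: list of rows, one per subject; entry [i][j] is the number
    of raters who assigned subject i to category j. Every row must sum
    to the same number of raters.
    """
    n_subjects = len(ratings)
    n_raters = sum(ratings[0])
    # Observed agreement, averaged over subjects
    p_i = [(sum(c * c for c in row) - n_raters) / (n_raters * (n_raters - 1))
           for row in ratings]
    p_bar = sum(p_i) / n_subjects
    # Chance agreement from the marginal category proportions
    totals = [sum(col) for col in zip(*ratings)]
    p_e = sum((t / (n_subjects * n_raters)) ** 2 for t in totals)
    return (p_bar - p_e) / (1 - p_e)

# Invented example: 3 raters, 2 categories ("finding present"/"absent").
# Perfect agreement on every subject yields kappa = 1.
kappa = fleiss_kappa([[3, 0], [0, 3], [3, 0]])
```

On the conventional interpretation scale, values around 0.29–0.35, as reported in the abstract, indicate only minimal agreement.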


Viruses ◽  
2021 ◽  
Vol 13 (6) ◽  
pp. 1064
Author(s):  
Gitana Scozzari ◽  
Cristina Costa ◽  
Enrica Migliore ◽  
Maurizio Coggiola ◽  
Giovannino Ciccone ◽  
...  

This observational study evaluated SARS-CoV-2 IgG seroprevalence and related clinical, demographic, and occupational factors among workers at the largest tertiary care University Hospital of Northwestern Italy and the University of Turin after the first pandemic wave of March–April 2020. Overall, about 10,000 individuals were tested; seropositive subjects were retested after 5 months to evaluate antibody waning. Among 8769 hospital workers, seroprevalence was 7.6%, without significant differences related to job profile; among 1185 University workers, it was 3.3%. Self-reporting of suspected COVID-19 symptoms was significantly associated with positivity (odds ratio (OR) 2.07, 95% CI: 1.76–2.44), although 27% of seropositive subjects reported no previous symptoms. On multivariable analysis, contacts at work were associated with a 69% increased risk, and working in a COVID ward with a 24% increased risk; contacts in the household carried the highest risk, more than five-fold (OR 5.31, 95% CI: 4.12–6.85). Compared with never smokers, active smokers were less likely to be seropositive (OR 0.60, 95% CI: 0.48–0.76). After 5 months, 85% of previously positive subjects still tested positive. The frequency of SARS-CoV-2 infection among health care workers was comparable with that observed in surveys performed in Northern Italy and Europe after the first pandemic wave. This study confirms that infection frequently occurred asymptomatically and underlines the importance of household exposure.
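The odds ratio and its 95% CI, the effect measures reported above, can be computed from a 2×2 table with Woolf's log method. The study's ORs come from a multivariable model, so this crude sketch with invented counts only illustrates the measure itself.

```python
import math

def odds_ratio(a, b, c, d):
    """Crude odds ratio for a 2x2 table with Woolf's 95% CI.

    a: exposed & seropositive,   b: exposed & seronegative,
    c: unexposed & seropositive, d: unexposed & seronegative.
    Returns (OR, (lower, upper)).
    """
    or_ = (a * d) / (b * c)
    # Standard error of log(OR) under Woolf's method
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, (lo, hi)
```

A CI that excludes 1 (as for the household-contact OR of 5.31 above) indicates a statistically significant association at the 5% level.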


2015 ◽  
Vol 22 (4) ◽  
pp. 209-214 ◽  
Author(s):  
Chantal Robitaille ◽  
Esther Dajczman ◽  
Andrew M Hirsch ◽  
David Small ◽  
Pierre Ernst ◽  
...  

BACKGROUND: Targeted spirometry screening for chronic obstructive pulmonary disease (COPD) has been studied in primary care and community settings. Limitations regarding availability and quality of testing remain. A targeted spirometry screening program was implemented within a presurgical screening (PSS) clinic to detect undiagnosed airways disease and identify patients with COPD/asthma in need of treatment optimization. OBJECTIVE: The present quality assurance study evaluated airflow obstruction detection rates and examined characteristics of patients identified through the targeted screening program. METHODS: The targeted spirometry screening program was implemented within the PSS clinic of a tertiary care university hospital. Current or ex-smokers with respiratory symptoms and patients with a history of COPD or asthma underwent prebronchodilator spirometry. History of airways disease and smoking status were obtained during the PSS assessment and confirmed through chart reviews. RESULTS: After exclusions, the study sample included 449 current or ex-smokers. Abnormal spirometry results were found in 184 (41%) patients: 73 (16%) had mild, 93 (21%) had moderate and 18 (4%) had severe or very severe airflow obstruction. One hundred eighteen (26%) new cases of airflow obstruction suggestive of COPD were detected. One-half of these new cases had moderate or severe airflow obstruction. Only 34% of patients with abnormal spirometry results had reported a previous diagnosis of COPD. More than one-half of patients with abnormal spirometry results were current smokers. CONCLUSIONS: Undiagnosed airflow obstruction was detected in a significant number of smokers and ex-smokers through a targeted screening program within a PSS clinic. These patients can be referred for early intervention and secondary preventive strategies.
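The mild/moderate/severe split above suggests a GOLD-style grading of spirometry results. The abstract does not state the cutoffs used, so the thresholds below (obstruction at FEV1/FVC < 0.70, then grading by FEV1 % predicted) are an assumption based on the common GOLD scheme, not the study's protocol.

```python
def gold_grade(fev1_fvc_ratio, fev1_pct_predicted):
    """Illustrative GOLD-style grading of airflow obstruction.

    Assumed scheme (not stated in the abstract): obstruction when
    FEV1/FVC < 0.70, then severity graded by FEV1 % predicted.
    """
    if fev1_fvc_ratio >= 0.70:
        return "no obstruction"
    if fev1_pct_predicted >= 80:
        return "mild"
    if fev1_pct_predicted >= 50:
        return "moderate"
    if fev1_pct_predicted >= 30:
        return "severe"
    return "very severe"
```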


2009 ◽  
Vol 30 (2) ◽  
pp. 130-138 ◽  
Author(s):  
Sang Hoon Han ◽  
Bum Sik Chin ◽  
Han Sung Lee ◽  
Su Jin Jeong ◽  
Hee Kyung Choi ◽  
...  

Objective. To describe the incidence of recovery of both vancomycin-resistant enterococci (VRE) and methicillin-resistant Staphylococcus aureus (MRSA) from culture of a single clinical specimen, to describe the clinical characteristics of patients from whom these specimens were recovered, and to identify the risk factors of these patients. Design. A retrospective cohort and case-control study. Setting. A tertiary care university hospital and referral center in Seoul, Korea. Methods. We identified 61 case patients for whom a single clinical specimen yielded both VRE and MRSA on culture, and 122 control patients for whom any clinical specimen yielded only VRE on culture. The control patients were selected by matching 2:1 with the case patients for age, sex, and first date of sampling that led to isolation of VRE or of both VRE and MRSA, among 1,536 VRE-colonized patients from January 1, 2003, through December 31, 2006. To identify patient risk factors for the recovery of both VRE and MRSA in a single clinical specimen, we performed univariate comparisons between the 2 groups and then multivariate logistic regression analysis. Results. The incidence of recovery of both VRE and MRSA from culture of a single clinical specimen was 3.97% (61 of 1,536 VRE-colonized patients) over 4 years. Among the 82 single clinical specimens that yielded both organisms, the most common type was wound specimens (26.8%), followed by lower respiratory tract specimens (18.3%), urine specimens (17.1%), and catheter tips (15.9%). Of the 61 case patients, 14 (23.0%) had 2 or more single clinical specimens that yielded both VRE and MRSA on culture, and the longest interval from the first sampling that yielded both organisms to the last was 174 days. Independent patient risk factors for the presence of both VRE and MRSA in a single clinical specimen were chronic renal disease (odds ratio [OR], 7.00; P = .012), urinary catheterization (OR, 3.36; P = .026), and longer total cumulative duration of hospital stay within the previous year (OR, 1.03; P < .001). Conclusion. We confirmed that the recovery of VRE and MRSA from a single clinical specimen occurs continually. Because prolonged cell-to-cell contact can facilitate transfer of vanA, close observation and surveillance for vancomycin-resistant S. aureus should be continued, especially among patients with risk factors for the recovery of both VRE and MRSA from a single clinical specimen.


2000 ◽  
Vol 21 (12) ◽  
pp. 761-764 ◽  
Author(s):  
Klaus Weist ◽  
Constanze Wendt ◽  
Lyle R. Petersen ◽  
Hans Versmold ◽  
Henning Rüden

Objective: To investigate an outbreak of methicillin-susceptible Staphylococcus aureus (MSSA) infections in a neonatal clinic. Design: Prospective chart review, environmental sampling, and genotyping by two independent methods: pulsed-field gel electrophoresis (PFGE) and randomly amplified polymorphic DNA polymerase chain reaction (RAPD-PCR). A case-control study was performed with 31 controls from the same clinic. Setting: A German 1,350-bed tertiary-care teaching university hospital. Results: There was a significant increase in the incidence of pyodermas with MSSA: 10 neonates in good physical condition with no infection immediately after birth developed pyodermas. A shared spatula and ultrasound gel were the only identified infection sources. The gel contained MSSA and was used for hip-joint sonographies in all neonates. PFGE and RAPD-PCR patterns from 6 neonates and from the gel were indistinguishable and thus represented genetically related clones. The case-control study revealed no significant risk factor with the exception of cesarean section (P = .006). The attack rate by day of hip-joint sonography between April 15 and April 27, 1994, ranged from 11.8% to 40%. Conclusions: Inappropriate hygienic measures in connection with lubricants during routine ultrasound scanning may lead to nosocomial S. aureus infections of the skin. To our knowledge, this source of S. aureus infections has not previously been described.

