SimTube: A National Simulation Training and Research Project

2020, Vol 163 (3), pp. 522-530
Author(s): Gregory J. Wiet, Ellen S. Deutsch, Sonya Malekzadeh, Amanda J. Onwuka, Nathan W. Callender, ...

Objective To test the feasibility and impact of a simulation training program for myringotomy and tube (M&T) placement. Study Design Prospective, randomized, controlled trial. Setting Multi-institutional. Subjects and Methods An M&T simulator was used to assess the impact of simulation training vs no simulation training on the rate of achieving competency. Novice trainees were assessed using posttest simulator Objective Structured Assessment of Technical Skills (OSATS) scores, the OSATS score for initial intraoperative tube insertion, and the number of procedures needed to attain competency. The effect of simulation training was analyzed using χ2 tests, Wilcoxon-Mann-Whitney tests, and Cox proportional hazards regression. Results A total of 101 residents and 105 raters from 65 institutions were enrolled; however, only 63 residents had sufficient data for analysis due to substantial breaches in protocol. There was no difference in simulator pretest scores between the intervention and control groups; however, the intervention group had better OSATS global scores on the simulator (17.4 vs 13.7, P = .0003) and better OSATS task scores on the simulator (4.5 vs 3.6, P = .02). No difference in OSATS scores was observed at the initial live surgery rating (P = .73 and P = .41). OSATS scores were predictive of the rate at which residents achieved competence in performing myringotomy; however, the intervention was not associated with subsequent OSATS scores during live surgeries (P = .44 and P = .91) or with the rate of achieving competence (P = .16). Conclusions A multi-institutional simulation study is feasible. Novices trained using the M&T simulator achieved higher scores on the simulator but not on initial intraoperative OSATS, and they did not reach competency sooner than those not trained on the simulator.
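A minimal sketch of the group comparisons named in this abstract, using SciPy's chi-square and Wilcoxon-Mann-Whitney implementations. The scores and counts below are invented for illustration and are not the study's data.

```python
# Hedged sketch: comparing simulator OSATS scores between intervention and
# control arms with the tests named in the abstract. All numbers are made up.
import numpy as np
from scipy.stats import chi2_contingency, mannwhitneyu

# Hypothetical posttest OSATS global scores on the simulator.
intervention = np.array([18, 17, 19, 16, 17, 18, 15, 19])
control      = np.array([14, 13, 15, 12, 14, 13, 16, 12])

# Wilcoxon-Mann-Whitney for ordinal OSATS scores.
u_stat, p_mwu = mannwhitneyu(intervention, control, alternative="two-sided")
print(f"Mann-Whitney U = {u_stat:.1f}, p = {p_mwu:.4f}")

# Chi-square on a 2x2 table of group x reached-competency; counts invented.
table = np.array([[20, 12],   # intervention: competent / not yet
                  [15, 16]])  # control:      competent / not yet
chi2, p_chi2, dof, _ = chi2_contingency(table)
print(f"chi2({dof}) = {chi2:.2f}, p = {p_chi2:.4f}")
```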

2012, Vol 30 (36), pp. 4493-4500
Author(s): John M. McLaughlin, Roger T. Anderson, Amy K. Ferketich, Eric E. Seiber, Rajesh Balkrishnan, ...

Purpose To determine the impact of longer periods between biopsy-confirmed breast cancer diagnosis and the initiation of treatment (Dx2Tx) on survival. Patients and Methods This study was a noninterventional, retrospective analysis of adult female North Carolina Medicaid enrollees diagnosed with breast cancer from January 1, 2000, through December 31, 2002, in the linked North Carolina Central Cancer Registry–Medicaid Claims database. Follow-up data were available through July 31, 2006. Cox proportional hazards regression models were constructed to evaluate the impact on survival of delaying treatment ≥ 60 days after a confirmed diagnosis of breast cancer. Results The study cohort consisted of 1,786 low-income, adult women with a mean age of 61.6 years. A large proportion of the patients (44.3%) were racial minorities. Median time from biopsy-confirmed diagnosis to treatment initiation was 22 days. Adjusted Cox proportional hazards regression showed that although Dx2Tx length did not affect survival among those diagnosed at early stage, among late-stage patients, intervals between diagnosis and first treatment ≥ 60 days were associated with significantly worse overall survival (hazard ratio [HR], 1.66; 95% CI, 1.00 to 2.77; P = .05) and breast cancer–specific survival (HR, 1.85; 95% CI, 1.04 to 3.27; P = .04). Conclusion One in 10 women waited ≥ 60 days to initiate treatment after a diagnosis of breast cancer. Waiting ≥ 60 days to initiate treatment was associated with a significant 66% and 85% increased risk of overall and breast cancer–related death, respectively, among late-stage patients. Interventions designed to increase the timeliness of receiving breast cancer treatments should target late-stage patients, and clinicians should strive to promptly triage and initiate treatment for patients diagnosed at late stage.
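A hedged sketch of the analysis shape described here: derive the Dx2Tx interval, flag delays of 60 days or more, and fit a Cox model in the late-stage subgroup. The data are simulated, all column names are assumptions rather than the registry-claims schema, and the lifelines package is assumed for the Cox fit.

```python
# Hedged sketch, not the study's code: simulated Dx2Tx delay analysis.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

# From real data the interval would come from two date columns, e.g.:
#   df["dx2tx_days"] = (df.first_tx_date - df.biopsy_date).dt.days
rng = np.random.default_rng(0)
n = 1786
delay_ge_60 = rng.binomial(1, 0.10, n)       # ~1 in 10 waited >= 60 days
late_stage = rng.binomial(1, 0.30, n)
# Exponential survival times; the delay raises hazard only at late stage.
haz = 0.08 * np.exp(0.5 * delay_ge_60 * late_stage)
t = rng.exponential(1 / haz)                 # years to death
event = (t <= 6.5).astype(int)               # follow-up ends mid-2006
time = np.minimum(t, 6.5)

df = pd.DataFrame({"time": time, "event": event,
                   "delay_ge_60": delay_ge_60, "late_stage": late_stage})
# Late-stage subgroup model, mirroring the abstract's stratified analysis.
late = df[df.late_stage == 1]
cph = CoxPHFitter()
cph.fit(late[["time", "event", "delay_ge_60"]],
        duration_col="time", event_col="event")
print(cph.hazard_ratios_["delay_ge_60"])     # ~1.6 in this simulation
```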


2021, Vol 5 (Supplement_1), pp. 701-702
Author(s): Samuel Miller, Lauren Wilson, Melissa Greiner, Jessica Pritchard, Tian Zhang, ...

Abstract Renal dysfunction is a driver of dementia. It is also associated with renal cell carcinoma, possibly the result of the tumor itself or of cancer treatment. This study evaluates metastatic renal cell carcinoma (mRCC) as a risk factor for developing mild cognitive impairment or dementia (MCI/D) as well as the impact of RCC-directed therapies on the development of MCI/D. We identified all patients diagnosed with mRCC in SEER-Medicare from 2007-2015. The main outcome was incident MCI/D within one year of mRCC diagnosis or cohort entry. Exclusion criteria included age <65 at mRCC diagnosis and a diagnosis of MCI/D within the year preceding mRCC diagnosis. Patients with mRCC (n=2,533) were matched to non-cancer controls (n=7,027) on age, sex, race, comorbidities, and year. Cox proportional hazards regression showed that having mRCC (HR 8.52, 95% CI 6.49-11.18, p<0.001) and being older (HR 1.05 per 1-year age increase, 95% CI 1.03-1.07, p<0.001) were predictive of developing MCI/D. A second Cox proportional hazards regression of only patients with mRCC revealed that neither those initiating treatment with oral anticancer agents (OAAs) nor those who underwent nephrectomy were more likely to develop MCI/D. Black patients had a higher risk of dementia compared with white patients (HR 1.92, 95% CI 1.02-3.59, p=0.047). In conclusion, patients with mRCC were more likely to develop MCI/D than those without mRCC. The medical and surgical therapies evaluated were not associated with increased incidence of MCI/D. The increased incidence of MCI/D in older adults with mRCC may be the result of the pathology itself.
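A hedged sketch of the one-year incident-MCI/D setup: censor follow-up at 365 days after cohort entry, then estimate the mRCC hazard ratio with a Cox model. Data are simulated, not SEER-Medicare; lifelines is an assumed tool.

```python
# Hedged sketch: one-year landmark for incident MCI/D, then a Cox model
# for the mRCC exposure. All values are invented for illustration.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 2000
mrcc = rng.binomial(1, 0.25, n)                  # exposure indicator
age = rng.integers(66, 90, n)
# Exponential time to MCI/D; hazard rises with mRCC and with age.
haz = 1e-4 * np.exp(2.0 * mrcc + 0.05 * (age - 66))
t = rng.exponential(1 / haz)

# Administrative censoring at 365 days: the abstract's one-year window.
event = (t <= 365).astype(int)
time = np.minimum(t, 365.0)

df = pd.DataFrame({"time": time, "event": event, "mrcc": mrcc, "age": age})
cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
print(cph.hazard_ratios_)   # HRs for mrcc and age
```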


2021, Vol 50 (Supplement_2), pp. ii14-ii18
Author(s): A Khan, F R Espinoza, T Kneen, A Dafnis, H Allafi, ...

Abstract Introduction The COVID-19 pandemic has had an extensive impact on the frail older population, with significant rates of COVID-related hospital admissions and deaths amongst this vulnerable group. There is little evidence on the prevalence of frailty amongst patients hospitalised with COVID-19, or on the impact of frailty on their survival. Methods Prospective observational study of all consecutive patients admitted to Salford Royal NHS Foundation Trust (SRFT) between 27 February and 28 April 2020 (wave 1) and between 1 October and 10 November 2020 (wave 2) with a diagnosis of COVID-19. The primary endpoint was in-hospital mortality. Patient demographics, co-morbidities, admission-level disease severity (estimated with CRP) and frailty (using the Clinical Frailty Scale: score 1–3 = not frail, score 4–9 = frail) were collected. A Cox proportional hazards regression model was used to assess time to mortality. Results A total of 693 patients (N = 429, wave 1; N = 264, wave 2) were included; 279 (N = 180, 42%, wave 1; N = 104, 38%, wave 2) were female, and the median age was 72 in wave 1 and 73 in wave 2. 318 patients (N = 212, 49%, wave 1; N = 106, 39%, wave 2) were frail at presentation. There was a reduction in mortality in wave 2, adjusted hazard ratio (aHR) = 0.60 (95%CI 0.44–0.81; p = 0.001). There was an association between frailty and mortality, aHR = 1.57 (95%CI 1.09–2.26; p = 0.015). Conclusion Frailty is highly prevalent amongst patients of all ages admitted to SRFT with COVID-19. Higher frailty scores are associated with increased mortality.
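A hedged sketch of the frailty grouping and adjusted Cox model: bin Clinical Frailty Scale (CFS) scores into the abstract's not-frail (1–3) and frail (4–9) categories, then model time to in-hospital death with the pandemic wave as a covariate. All values are simulated, not SRFT data.

```python
# Hedged sketch: CFS binning plus a two-covariate Cox model. Simulated data.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 693
cfs = rng.integers(1, 10, n)                     # CFS scores 1-9
wave2 = rng.binomial(1, 264 / 693, n)
# Bins (0, 3] -> not frail (0) and (3, 9] -> frail (1).
frail = pd.cut(cfs, bins=[0, 3, 9], labels=[0, 1]).astype(int)

# Simulated daily hazard: frailty raises it, wave 2 lowers it.
haz = 0.01 * np.exp(0.45 * frail - 0.5 * wave2)
t = rng.exponential(1 / haz)
event = (t <= 60).astype(int)                    # in-hospital window, days
time = np.minimum(t, 60.0)

df = pd.DataFrame({"time": time, "event": event,
                   "frail": frail, "wave2": wave2})
CoxPHFitter().fit(df, duration_col="time", event_col="event").print_summary()
```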


Antibiotics, 2020, Vol 9 (5), pp. 254
Author(s): Caroline Derrick, P. Brandon Bookstaver, Zhiqiang K. Lu, Christopher M. Bland, S. Travis King, ...

Objectives: There is debate on whether the use of third-generation cephalosporins (3GC) increases the risk of clinical failure in bloodstream infections (BSIs) caused by chromosomally mediated AmpC-producing Enterobacterales (CAE). This study evaluates the impact of definitive 3GC therapy versus other antibiotics on clinical outcomes in BSIs due to Enterobacter, Serratia, or Citrobacter species. Methods: This multicenter, retrospective cohort study evaluated adult hospitalized patients with BSIs secondary to Enterobacter, Serratia, or Citrobacter species from 1 January 2006 to 1 September 2014. Definitive 3GC therapy was compared to definitive therapy with other non-3GC antibiotics. Multivariable Cox proportional hazards regression evaluated the impact of definitive 3GC on overall treatment failure (OTF), a composite of in-hospital mortality, 30-day hospital readmission, or 90-day reinfection. Results: A total of 381 patients from 18 institutions in the southeastern United States were enrolled. Common sources of BSIs were the urinary tract and central venous catheters (78 [20.5%] patients each). Definitive 3GC therapy was utilized in 65 (17.1%) patients. OTF occurred in 22/65 patients (33.8%) in the definitive 3GC group vs. 94/316 (29.8%) in the non-3GC group (p = 0.51). Individual components of OTF were comparable between groups. Risk of OTF was comparable with definitive 3GC therapy vs. definitive non-3GC therapy (aHR 0.93, 95% CI 0.51–1.72) in multivariable Cox proportional hazards regression analysis. Conclusions: These outcomes suggest definitive 3GC therapy does not significantly alter the risk of poor clinical outcomes in the treatment of BSIs secondary to Enterobacter, Serratia, or Citrobacter species compared to other antimicrobial agents.
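A hedged sketch of the composite OTF endpoint: any of in-hospital death, 30-day readmission, or 90-day reinfection counts as failure, and the 3GC effect is then estimated by Cox regression. Synthetic data throughout; the event times in particular are invented purely to make the example self-contained.

```python
# Hedged sketch: building a composite endpoint and estimating the 3GC HR.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)
n = 381
three_gc = rng.binomial(1, 65 / 381, n)
died = rng.binomial(1, 0.10, n)
readmit_30d = rng.binomial(1, 0.15, n)
reinfect_90d = rng.binomial(1, 0.10, n)

df = pd.DataFrame({"three_gc": three_gc, "died": died,
                   "readmit_30d": readmit_30d, "reinfect_90d": reinfect_90d})
# Composite: any component event counts as OTF.
df["otf"] = df[["died", "readmit_30d", "reinfect_90d"]].max(axis=1)

# In real data the duration would be time to the first component event;
# here it is simulated so the Cox fit runs end to end.
df["time"] = np.where(df.otf == 1, rng.uniform(1, 90, n), 90.0)
cph = CoxPHFitter()
cph.fit(df[["time", "otf", "three_gc"]], duration_col="time", event_col="otf")
print(cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]])
```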


2018, Vol 6, pp. 205031211880170
Author(s): Ajinkya M Pawar, Kerry L LaPlante, Tristan T Timbrook, Aisling R Caffrey

Objectives: Varying statin exposures in bacteremic patients have different impacts on mortality. Among patients with adherent statin use before admission, we sought to evaluate the impact of statin continuation on inpatient mortality in bacteremic patients. Methods: A retrospective cohort study was conducted using Optum Clinformatics™ with matched Premier Hospital data (October 2009–March 2013). Patients with a primary diagnosis of bacteremia and 6 months of continuous enrollment prior to admission, who received at least 2 days of antibiotics during the first 3 days of admission, were selected for inclusion. Furthermore, only patients demonstrating adherent statin use, based on 90 days of continuous therapy prior to admission, were included. We then compared patients who continued statin therapy for at least the first 5 days of the admission with those who did not continue it. Results: Simvastatin (53.2%) and atorvastatin (33.8%) were the most commonly used statins among the 633 patients who met our inclusion and exclusion criteria. Propensity score–adjusted Cox proportional hazards regression models demonstrated significantly lower inpatient mortality among those continuing statin therapy compared with those not continuing (n = 232 vs 401; adjusted hazard ratio 0.25, 95% confidence interval 0.08–0.79). Conclusion: Among patients adherent to statin therapy prior to a bacteremia hospitalization, continuing the statin after admission was associated with a 75% lower hazard of inpatient mortality compared with not continuing.
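A hedged sketch of a propensity-score-adjusted Cox model for statin continuation. The abstract says only "propensity score adjusted"; inverse-probability-of-treatment weighting (IPTW) is one common choice and is an assumption here, as are the covariates. All data are simulated.

```python
# Hedged sketch: propensity scores from logistic regression, then an
# IPTW-weighted Cox model. Invented data and covariates.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(4)
n = 633
age = rng.normal(65, 10, n)
comorbid = rng.binomial(1, 0.4, n)
# Treatment (continuing the statin) depends on baseline covariates.
p_tx = 1 / (1 + np.exp(-(-1.5 + 0.02 * age + 0.5 * comorbid)))
continued = rng.binomial(1, p_tx)

haz = 0.005 * np.exp(-1.0 * continued + 0.03 * (age - 65) + 0.3 * comorbid)
t = rng.exponential(1 / haz)
event = (t <= 30).astype(int)            # in-hospital mortality window, days
time = np.minimum(t, 30.0)

X = np.column_stack([age, comorbid])
ps = LogisticRegression().fit(X, continued).predict_proba(X)[:, 1]
iptw = np.where(continued == 1, 1 / ps, 1 / (1 - ps))

df = pd.DataFrame({"time": time, "event": event,
                   "continued": continued, "iptw": iptw})
cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event",
        weights_col="iptw", robust=True)   # robust SEs for non-integer weights
print(cph.hazard_ratios_["continued"])
```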


2017, Vol 61 (12)
Author(s): Natalie A. Finch, Evan J. Zasowski, Kyle P. Murray, Ryan P. Mynatt, Jing J. Zhao, ...

ABSTRACT Evidence suggests that maintenance of vancomycin trough concentrations at between 15 and 20 mg/liter, as currently recommended, is frequently unnecessary to achieve the daily area under the concentration-time curve (AUC24) target of ≥400 mg · h/liter. Many patients with trough concentrations in this range have AUC24 values in excess of the therapeutic threshold and within the exposure range associated with nephrotoxicity. On the basis of this, the Detroit Medical Center switched from trough concentration-guided dosing to AUC-guided dosing to minimize potentially unnecessary vancomycin exposure. The primary objective of this analysis was to assess the impact of this intervention on vancomycin-associated nephrotoxicity in a single-center, retrospective quasi-experiment of hospitalized adult patients receiving intravenous vancomycin from 2014 to 2015. The primary analysis compared the incidence of nephrotoxicity between patients monitored by assessment of the AUC24 and those monitored by assessment of the trough concentration. Multivariable logistic and Cox proportional hazards regression examined the independent association between the monitoring strategy and nephrotoxicity. Secondary analysis compared vancomycin exposures (total daily dose, AUC, and trough concentrations) between monitoring strategies. Overall, 1,280 patients were included in the analysis. After adjusting for severity of illness, comorbidity, duration of vancomycin therapy, and concomitant receipt of nephrotoxins, AUC-guided dosing was independently associated with lower nephrotoxicity by both logistic regression (odds ratio, 0.52; 95% confidence interval [CI], 0.34 to 0.80; P = 0.003) and Cox proportional hazards regression (hazard ratio, 0.53; 95% CI, 0.35 to 0.78; P = 0.002). AUC-guided dosing was associated with lower total daily vancomycin doses, AUC values, and trough concentrations. Vancomycin AUC-guided dosing was associated with reduced nephrotoxicity, which appeared to be a result of reduced vancomycin exposure.
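As a rough illustration of the arithmetic behind AUC-guided monitoring, the sketch below estimates AUC24 from two post-dose vancomycin levels using standard first-order, one-compartment equations. The numbers, sampling times, and method choice are illustrative assumptions, not the Detroit Medical Center protocol; the point it makes is that a trough near 11–12 mg/liter can already correspond to an AUC24 above the 400 mg · h/liter target.

```python
# Hedged sketch: two-level AUC24 estimation (one compartment, steady state).
import math

tau = 12.0           # dosing interval, h
t_inf = 1.0          # infusion duration, h
t1, c1 = 2.0, 30.0   # first level: hours after dose start, mg/L
t2, c2 = 11.5, 12.0  # second level (near trough), mg/L

ke = math.log(c1 / c2) / (t2 - t1)        # first-order elimination rate, 1/h
cmax = c1 * math.exp(ke * (t1 - t_inf))   # back-extrapolate to end of infusion
cmin = c2 * math.exp(-ke * (tau - t2))    # forward-extrapolate to true trough

# Linear trapezoid during the infusion, log-trapezoid during elimination.
auc_tau = t_inf * (cmin + cmax) / 2 + (cmax - cmin) / ke
auc24 = auc_tau * (24 / tau)
print(f"ke={ke:.3f}/h  Cmax~{cmax:.1f}  Cmin~{cmin:.1f}  AUC24~{auc24:.0f} mg*h/L")
# With these example levels the trough is ~11 mg/L yet AUC24 is ~490 mg*h/L,
# already above the >=400 target the abstract describes.
```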


2017, Vol 118 (12), pp. 1052-1060
Author(s): Yung-Feng Yen, Fu-I Tung, Bo-Lung Ho, Yun-Ju Lai

Abstract Evidence regarding the association between BMI and mortality in tuberculosis (TB) patients is limited and inconsistent. We investigated the impact of BMI on TB-specific and non-TB-specific mortality with respect to different timing of death. All Taiwanese adults with TB in Taipei were included in a retrospective cohort study in 2012–2014. Multinomial Cox proportional hazards regression was used to evaluate the associations between BMI, cause-specific mortality, and timing of death. Of 2410 eligible patients, 86·0 % (2061) were successfully treated, and TB-specific and non-TB-specific mortality occurred in 2·2 % (54) and 13·9 % (335), respectively. After controlling for potential confounders, underweight was significantly associated with a higher risk of all-cause mortality (adjusted hazard ratio (AHR) 1·57; 95 % CI 1·26, 1·95), whereas overweight was not. When cause-specific death was considered, underweight was associated with an increased risk of either TB-specific (AHR 1·85; 95 % CI 1·03, 3·33) or non-TB-specific death (AHR 1·52; 95 % CI 1·19, 1·95) during treatment. With joint consideration of cause and timing of death, underweight significantly increased the risk of TB-specific (AHR 2·23; 95 % CI 1·09, 4·59) and non-TB-specific mortality (AHR 1·81; 95 % CI 1·29, 2·55) only within the first 8 weeks of treatment. This study suggests that underweight increases the risk of early death in TB patients during treatment.
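A hedged sketch of cause-specific Cox modeling, one plausible reading of the "multinomial Cox" analysis: fit a separate model per cause of death, censoring the competing cause. Simulated data with no true effect built in; this is not the Taipei cohort.

```python
# Hedged sketch: cause-specific Cox models via censoring the competing cause.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(5)
n = 2410
underweight = rng.binomial(1, 0.2, n)
# cause: 0 = censored/treated, 1 = TB death, 2 = non-TB death
cause = rng.choice([0, 1, 2], size=n, p=[0.84, 0.03, 0.13])
time = rng.uniform(1, 52, n)            # weeks of follow-up

df = pd.DataFrame({"time": time, "cause": cause, "underweight": underweight})
for c, label in [(1, "TB-specific"), (2, "non-TB-specific")]:
    # Only deaths from cause c count as events; everything else is censored.
    d = df.assign(event=(df.cause == c).astype(int))
    cph = CoxPHFitter().fit(d[["time", "event", "underweight"]],
                            duration_col="time", event_col="event")
    print(label, "HR:", round(cph.hazard_ratios_["underweight"], 2))
# No effect is simulated here, so both HRs should hover near 1.
```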


2021, pp. 1-8
Author(s): Chun-Yan Sun, Li-Fang Zhou, Li Song, Li-Juan Lan, Xiao-Wei Han, ...

Objective: Prepump arterial (Pa) pressure indicates the ease or difficulty with which the blood pump can draw blood from the vascular access (VA) during hemodialysis. Some studies have suggested that the absolute value of the ratio of Pa pressure to the extracorporeal blood pump flow (Qb) set on the machine (|Pa/Qb|) can reflect VA dysfunction. This study was conducted to explore the association of |Pa/Qb| with arteriovenous fistula (AVF) dysfunction and to establish a clinical reference range for |Pa/Qb|. Methods: We retrospectively identified adults who underwent hemodialysis at 3 hospitals. Data were acquired from electronic health records. We evaluated the pattern of the association between |Pa/Qb| and AVF dysfunction over 1 year using a Cox proportional hazards regression model with restricted cubic splines. The patients were then grouped based on the results, and hazard ratios were compared across intervals of |Pa/Qb|. Results: A total of 490 patients were analyzed, with a median age of 55 (44, 66) years. There were 85 cases of AVF dysfunction, of which 50 were stenosis and 35 were thrombosis. There was a U-shaped association between |Pa/Qb| and the risk of AVF dysfunction (p for nonlinearity <0.001). |Pa/Qb| values <0.30 and >0.52 increased the risk of AVF dysfunction. Compared with the group with a |Pa/Qb| value between 0.30 and 0.52, the groups with |Pa/Qb| <0.30 and |Pa/Qb| >0.52 had a 4.04-fold (p = 0.002) and 3.41-fold (p < 0.001) greater risk of AVF dysfunction, respectively. Conclusions: The appropriate range of |Pa/Qb| is between 0.30 and 0.52. When |Pa/Qb| is <0.30 or >0.52, the patient's AVF function or Qb setting should be reevaluated to prevent subsequent failure.
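A hedged sketch of a restricted-cubic-spline Cox model for |Pa/Qb|, using patsy's cr() natural-spline basis as a stand-in for whatever spline implementation the authors used; the light ridge penalty is an assumption added to keep the collinear spline columns stable. The data are synthetic and deliberately U-shaped.

```python
# Hedged sketch: Cox PH with a restricted cubic spline on |Pa/Qb|.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter
from patsy import build_design_matrices, dmatrix

rng = np.random.default_rng(6)
n = 490
ratio = rng.uniform(0.1, 0.8, n)                 # |Pa/Qb|
haz = 0.002 * np.exp(8 * (ratio - 0.4) ** 2)     # U-shaped true hazard
t = rng.exponential(1 / haz)
event = (t <= 365).astype(int)
time = np.minimum(t, 365.0)

# Natural cubic spline basis (4 df), no intercept column.
basis = dmatrix("cr(ratio, df=4) - 1", {"ratio": ratio},
                return_type="dataframe")
df = pd.concat([pd.DataFrame({"time": time, "event": event}), basis], axis=1)
cph = CoxPHFitter(penalizer=0.1)   # light ridge keeps the spline fit stable
cph.fit(df, duration_col="time", event_col="event")

# Evaluate the fitted risk over a grid, reusing the training spline knots.
grid = build_design_matrices([basis.design_info],
                             {"ratio": np.linspace(0.15, 0.75, 7)},
                             return_type="dataframe")[0]
print(cph.predict_partial_hazard(grid))   # should dip in the middle
```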


Crisis, 2018, Vol 39 (1), pp. 27-36
Author(s): Kuan-Ying Lee, Chung-Yi Li, Kun-Chia Chang, Tsung-Hsueh Lu, Ying-Yeh Chen

Abstract. Background: We investigated the age at exposure to parental suicide and the risk of subsequent suicide completion in young people. The impact of parental and offspring sex was also examined. Method: Using a cohort study design, we linked Taiwan's Birth Registry (1978–1997) with Taiwan's Death Registry (1985–2009) and identified 40,249 children who had experienced maternal suicide (n = 14,431), paternal suicide (n = 26,887), or the suicide of both parents (n = 281). Each exposed child was matched to 10 children of the same sex and birth year whose parents were still alive, yielding a total of 398,081 children for the non-exposed cohort. A Cox proportional hazards model was used to compare the suicide risk of the exposed and non-exposed groups. Results: Compared with the non-exposed group, offspring who were exposed to parental suicide were 3.91 times (95% confidence interval [CI] = 3.10–4.92) more likely to die by suicide after adjusting for baseline characteristics. The risk of suicide seemed to be lower in older male offspring (HR = 3.94, 95% CI = 2.57–6.06) but higher in older female offspring (HR = 5.30, 95% CI = 3.05–9.22). Stratified analyses based on parental sex revealed patterns similar to the combined analysis. Limitations: As only register-based data were used, we were not able to explore the impact of variables not contained in the data set, such as the role of mental illness. Conclusion: Our findings suggest a prominent elevation in the risk of suicide among offspring who lost their parents to suicide. The risk elevation differed according to the sex of the afflicted offspring as well as their age at exposure.
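A hedged sketch of the 10:1 matched-cohort construction: for each exposed child, sample ten unexposed children of the same sex and birth year, without reuse across matches. Toy data; the identifiers and cohort sizes are invented.

```python
# Hedged sketch: 10:1 exact matching on sex and birth year with pandas.
import numpy as np
import pandas as pd

rng = np.random.default_rng(7)
pool = pd.DataFrame({                  # registry of candidate controls
    "child_id": np.arange(200_000),
    "sex": rng.choice(["M", "F"], 200_000),
    "birth_year": rng.integers(1978, 1998, 200_000),
})
exposed = pd.DataFrame({               # children who lost a parent to suicide
    "child_id": np.arange(1_000_000, 1_000_050),
    "sex": rng.choice(["M", "F"], 50),
    "birth_year": rng.integers(1978, 1998, 50),
})

matched = []
used = set()                           # prevent reusing a control
for _, row in exposed.iterrows():
    candidates = pool[(pool.sex == row.sex)
                      & (pool.birth_year == row.birth_year)
                      & ~pool.child_id.isin(used)]
    take = candidates.sample(n=10, random_state=0)
    used.update(take.child_id)
    matched.append(take.assign(matched_to=row.child_id))

controls = pd.concat(matched, ignore_index=True)
print(len(controls), "controls for", len(exposed), "exposed children")
```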


2020, Vol 41 (Supplement_2)
Author(s): I.D Poveda Pinedo, I Marco Clement, O Gonzalez, I Ponz, A.M Iniesta, ...

Abstract Background Parameters such as peak VO2, the VE/VCO2 slope, and OUES have previously been described as prognostic in heart failure (HF). The aim of this study was to identify further prognostic factors from cardiopulmonary exercise testing (CPET) in HF patients. Methods A retrospective analysis of HF patients who underwent CPET from January to November 2019 in a single centre was performed. The PETCO2 gradient was defined as the difference between final PETCO2 and baseline PETCO2. HF events were defined as decompensated HF requiring hospital admission or IV diuretics, or decompensated HF resulting in death. Results A total of 64 HF patients were assessed by CPET; HF events occurred in 8 (12.5%) patients. Baseline characteristics are shown in table 1. Patients who had HF events had a negative PETCO2 gradient, while patients without events showed a positive PETCO2 gradient (−1.5 [IQR −4.8, 2.3] vs 3 [IQR 1, 5] mmHg; p=0.004). A multivariate Cox proportional hazards regression analysis revealed that the PETCO2 gradient was an independent predictor of HF events (HR 0.74, 95% CI [0.61–0.89]; p=0.002). Kaplan-Meier curves showed a significantly higher incidence of HF events in patients with negative gradients, p=0.002 (figure 1: time to first event, decompensated HF). Conclusion In our study, the PETCO2 gradient was demonstrated to be a prognostic CPET parameter in HF patients. Patients with negative gradients had worse outcomes, with more HF events. Funding Acknowledgement Type of funding source: None
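A hedged sketch of the Kaplan-Meier comparison behind figure 1: stratify patients by the sign of the PETCO2 gradient and run a log-rank test. Follow-up times are simulated, not the study data.

```python
# Hedged sketch: KM curves and log-rank test by PETCO2 gradient sign.
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(8)
n = 64
negative_gradient = rng.binomial(1, 0.3, n)      # 1 = negative gradient
haz = 0.001 * np.exp(1.5 * negative_gradient)    # worse outcomes if negative
t = rng.exponential(1 / haz)
event = (t <= 365).astype(int)
time = np.minimum(t, 365.0)

kmf = KaplanMeierFitter()
for flag, label in [(1, "negative gradient"), (0, "positive gradient")]:
    mask = negative_gradient == flag
    kmf.fit(time[mask], event[mask], label=label)
    print(label, "event-free at 1y:",
          round(float(kmf.survival_function_.iloc[-1, 0]), 2))

res = logrank_test(time[negative_gradient == 1], time[negative_gradient == 0],
                   event_observed_A=event[negative_gradient == 1],
                   event_observed_B=event[negative_gradient == 0])
print("log-rank p =", round(res.p_value, 4))
```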

