Infectious outcomes of fibrin sheath disruption in tunneled dialysis catheters

2022 ◽  
pp. 112972982110706
Author(s):  
Mara Waters ◽  
Ella Huszti ◽  
Maria Erika Ramirez ◽  
Charmaine E. Lok

Background and objectives: Fibrin sheath (FS) formation around tunneled central venous catheters (CVC) increases the risk of catheter-related bloodstream infections due to bacterial adherence to a biofilm. We sought to investigate whether FS disruption (FSD) at the time of CVC removal or exchange affects infectious outcomes in patients with CVC-related infections. Design, setting, participants, and measurements: Retrospective cohort study of 307 maintenance hemodialysis patients aged 18 years or older at a single-center academic hemodialysis program (UHN, Toronto) who developed CVC-related infections requiring CVC removal or exchange between January 2000 and January 2019. Exposure was FSD at the time of CVC removal or exchange. Outcomes were infectious metastatic complications, recurrent infection with the same organism within 1 year, or death due to infection. We created a Markov Multi-State Model (MMSM) to assess patients’ trajectories through time as they transitioned between states. A time-to-event analysis was performed, adjusted for clinically relevant factors. Results: There was no significant relationship between FSD status at the time of CVC removal and either the development of infectious complications in the multivariable model (adjusted HR = 0.71, 95% CI 0.09−5.80, p = 0.76) or mortality from infection (HR = 0.84, 95% CI 0.34−2.11, p = 0.73). Conclusions: FSD at the time of CVC removal was not associated with an increased risk of infectious complications or death due to infection. Further prospective study is needed to determine whether FSD contributes to reducing CVC infection-related complications.
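As a rough illustration of the kind of adjusted time-to-event analysis reported above, the sketch below fits a Cox proportional-hazards model with lifelines. The covariate names, effect sizes, and simulated data are assumptions for illustration only, not the study's actual variables or dataset.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 300

# Simulated cohort; covariate names, effect sizes, and follow-up are invented.
df = pd.DataFrame({
    "fsd":      rng.integers(0, 2, n),          # 1 = fibrin sheath disruption
    "age":      rng.normal(60, 12, n).round(),
    "diabetes": rng.integers(0, 2, n),
})
rate = 0.002 * np.exp(0.1 * df["fsd"] + 0.01 * (df["age"] - 60) + 0.2 * df["diabetes"])
t = rng.exponential(1 / rate)                   # latent time to infectious complication
df["time_days"] = np.minimum(t, 365)            # administrative censoring at 1 year
df["event"] = (t <= 365).astype(int)            # 1 = complication observed

cph = CoxPHFitter()
cph.fit(df, duration_col="time_days", event_col="event")
cph.print_summary()                             # adjusted HRs (exp(coef)) with 95% CIs
```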

Nutrients ◽  
2019 ◽  
Vol 11 (8) ◽  
pp. 1790 ◽  
Author(s):  
Ulla Uusitalo ◽  
Carin Andren Aronsson ◽  
Xiang Liu ◽  
Kalle Kurppa ◽  
Jimin Yang ◽  
...  

Probiotics are linked to positive regulatory effects on the immune system. The aim of the study was to examine the association between exposure to probiotics via dietary supplements or infant formula by the age of 1 year and the development of celiac disease autoimmunity (CDA) and celiac disease among a cohort of 6520 genetically susceptible children. Use of probiotics during the first year of life was reported by 1460 children. Time-to-event analysis was used to examine the associations. Overall exposure to probiotics during the first year of life was not associated with either CDA (n = 1212) (HR 1.15; 95% CI 0.99, 1.35; p = 0.07) or celiac disease (n = 455) (HR 1.11; 95% CI 0.86, 1.43; p = 0.43) after adjusting for known risk factors. Intake of probiotic dietary supplements, however, was associated with a slightly increased risk of CDA (HR 1.18; 95% CI 1.00, 1.40; p = 0.043) compared to children who did not receive probiotics. It was concluded that overall exposure to probiotics during the first year of life was not associated with CDA or celiac disease in children at genetic risk.
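A minimal sketch of an unadjusted version of such a comparison, using Kaplan-Meier curves and a log-rank test from lifelines. The simulated exposure indicator, follow-up times, and 10-year censoring point are illustrative assumptions, not the cohort's actual data.

```python
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(2)
n = 500

# Simulated follow-up: exposure, times to CDA, and 10-year censoring are invented.
exposed = rng.integers(0, 2, n).astype(bool)
t = rng.exponential(scale=np.where(exposed, 9.0, 10.0), size=n)  # years to CDA
event = t <= 10.0                     # True = CDA observed within follow-up
time = np.minimum(t, 10.0)            # right-censor at 10 years

kmf = KaplanMeierFitter()
for label, mask in [("probiotics", exposed), ("no probiotics", ~exposed)]:
    kmf.fit(time[mask], event_observed=event[mask], label=label)
    kmf.plot_survival_function()      # CDA-free survival curves

res = logrank_test(time[exposed], time[~exposed],
                   event_observed_A=event[exposed], event_observed_B=event[~exposed])
print(f"log-rank p = {res.p_value:.3f}")
```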


2012 ◽  
Vol 30 (15_suppl) ◽  
pp. e13038-e13038
Author(s):  
Arijit Ganguli ◽  
Patrick J Reilly ◽  
Saurabh Ray

e13038 Background: Chemotherapy has been associated with an increased risk of fractures. This study examines the real-world incidence of fractures and healthcare resource use (HRU) that may be associated with chemotherapy-associated peripheral neuropathy (CAPN) in cancer patients. Methods: A retrospective analysis utilized a national health insurer claims database (2001–2009) to identify patients ≥18 yrs with a cancer ICD-9 code (140-239) and a chemotherapy drug code (J9xxx). The first chemotherapy date was the "index date." Patients with a record of peripheral neuropathy (PN) in the pre-index period were excluded. Patients with PN post-index were matched to patients without PN post-index (non-PN) based on gender, age, and index date. Both groups were compared for number of fractures, HRU (hospital outpatient (OP), office, and emergency-room [ER] visits), and all-cause costs in their 365-day post-index period. Time to first fracture post-index was compared using Kaplan-Meier time-to-event analysis. Results: Of 34,625 patients meeting the inclusion criteria, 1675 patients (4.3%) formed the PN group and were matched to the non-PN group. At baseline, mean age was 54.9 yrs, 62.5% were female, and there was no difference in the percentage with bone metastasis between the groups (p=0.12). In the PN group, 5.3% (n=87) had a fracture in the 365 days post-index compared to 3.5% (n=58) in the non-PN group (p<0.05). Mean time to fracture from the index date was shorter in the PN group than in the non-PN group (150.9 vs. 153.4 days, p<0.05). Annual mean numbers of OP visits (14.6 vs. 12.0, p<0.0001), ER visits (0.47 vs. 0.30, p<0.001), and office visits (30.4 vs. 23.3, p<0.0001) were higher in the PN group than in the non-PN group. Annual healthcare cost of PN patients was 21% higher than that of non-PN patients ($64,578 vs. $53,221), and CAPN-related cost in the PN group was estimated to be $5,580 annually. Conclusions: CAPN was associated with a higher incidence of fractures, greater HRU, and higher costs.
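The matching step described above could be implemented along the lines of the sketch below. The column names, the ±2-year age tolerance, and the greedy without-replacement strategy are assumptions for illustration rather than the study's actual matching algorithm.

```python
import pandas as pd

# Toy sketch: pair each PN patient with one non-PN patient of the same gender and
# index year and a similar age. All column names and the tolerance are hypothetical.
def greedy_match(pn: pd.DataFrame, non_pn: pd.DataFrame, age_tol: int = 2) -> pd.DataFrame:
    available = non_pn.copy()
    pairs = []
    for _, case in pn.iterrows():
        candidates = available[
            (available["gender"] == case["gender"])
            & (available["index_year"] == case["index_year"])
            & ((available["age"] - case["age"]).abs() <= age_tol)
        ]
        if candidates.empty:
            continue                                    # unmatched cases are dropped
        control = candidates.iloc[0]
        pairs.append({"pn_id": case["patient_id"], "non_pn_id": control["patient_id"]})
        available = available.drop(index=control.name)  # match without replacement
    return pd.DataFrame(pairs)

# Usage (with hypothetical cohort DataFrames):
# matched_pairs = greedy_match(pn_cohort, non_pn_cohort)
```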


2021 ◽  
Vol 39 (3_suppl) ◽  
pp. 83-83
Author(s):  
Yun Man ◽  
Han Yu ◽  
Sarbajit Mukherjee ◽  
Olivia Zalewski

83 Background: Biosimilar products have been shown to have no clinically meaningful differences in safety or effectiveness from the reference product; however, to our knowledge, comparative analyses of adverse events between bevacizumab biosimilars and the reference product are limited. The purpose of this study was to evaluate the incidence of hypertension and proteinuria in patients treated with reference versus biosimilar bevacizumab to potentially streamline clinical management. Methods: A retrospective study was conducted with data from gastrointestinal cancer patients who initiated either reference bevacizumab or biosimilar bevacizumab between January 2019 and July 2020 at Roswell Park Comprehensive Cancer Center. For the primary composite endpoint, Electronic Health Records were searched for the presence or absence of hypertension and proteinuria, and a time-to-event analysis was performed. In patients treated with the bevacizumab biosimilar, demographics, hypertension- and proteinuria-related risk factors, bevacizumab-containing chemotherapy regimen, and bevacizumab dosing were also identified to assess the risk association with hypertension and proteinuria. Results: 75 patients were included: 42 received reference bevacizumab and 33 received biosimilar bevacizumab. Hypertension occurred in 52.4% of the reference bevacizumab group versus 36.4% of the biosimilar group. Median time to hypertension was 84 days (reference) versus 24.5 days (biosimilar) following the first treatment (p = 0.0064). Proteinuria developed in 35.7% of reference bevacizumab patients compared to 30% of biosimilar patients. Median onset of proteinuria was 213 days (reference) versus 53.5 days (biosimilar) (p = 0.0022). In patients with a history of hypertension or underlying renal dysfunction, there was an increased risk of further hypertension and proteinuria, respectively. Conclusions: A higher incidence of hypertension and proteinuria was observed in the reference bevacizumab group, although the difference was not statistically significant. A shorter onset of hypertension and proteinuria was associated with the bevacizumab biosimilar group. Our results need further validation in larger cohorts, especially given the more recent adoption of bevacizumab biosimilar products.
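For the incidence comparison, the reported percentages can be back-calculated into an approximate 2x2 table and tested, for example, with Fisher's exact test. The counts below are rounded from the stated percentages and group sizes and are therefore only illustrative of how such a test might be run, not the study's actual analysis.

```python
from scipy.stats import fisher_exact

# Approximate 2x2 table inferred from 52.4% of 42 reference vs. 36.4% of 33
# biosimilar patients developing hypertension; counts are rounded estimates.
table = [[22, 20],   # reference bevacizumab: hypertension yes / no
         [12, 21]]   # biosimilar bevacizumab: hypertension yes / no

odds_ratio, p_value = fisher_exact(table)
print(f"OR = {odds_ratio:.2f}, p = {p_value:.3f}")
```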


2020 ◽  
Vol 7 (Supplement_1) ◽  
pp. S138-S138
Author(s):  
Mandee Noval ◽  
Emily Heil ◽  
Paula Williams ◽  
J Kristie Johnson ◽  
Kimberly C Claeys

Abstract Background The revised Clinical and Laboratory Standards Institute (CLSI) breakpoints for cefazolin (CFZ) may be difficult to implement with current automated susceptibility testing (AST) platforms, and Enterobacterales may be falsely reported as susceptible to CFZ. The possibility remains that CFZ may then be inappropriately used as definitive therapy. Methods This was a retrospective observational cohort of adult patients with Enterobacterales bloodstream infections (BSI) reported as CFZ susceptible per Vitek®2 (bioMerieux, Durham NC). The primary outcome was the percentage of CFZ-susceptible Enterobacterales isolates using three different susceptibility testing methods: Vitek®2 automated testing, ETEST® (bioMerieux, Durham NC), and disk diffusion. Secondary outcomes included treatment failure, defined as a composite of 30-day all-cause inpatient mortality, 30-day recurrent BSI, 60-day recurrent infection, or infectious complications. Results Of the 195 isolates reported as CFZ susceptible per Vitek®2, 84 (43.1%) were CFZ susceptible using ETEST® vs. 119 (61%) using disk diffusion (Figure 1). Rates of treatment failure were similar in the CFZ and non-CFZ groups (33.3% vs. 38.5%, respectively; p=0.57). Both groups had high rates of ID consult involvement (>60%) and source control (>80%), with the urinary tract being the most commonly reported source. No difference was noted in 30-day all-cause mortality, secondary infectious complications, 30-day readmissions, or 60-day recurrent infections. A subgroup analysis of patients receiving CFZ vs. ceftriaxone suggests treatment failure was significantly less likely to occur in the setting of source control (adjusted OR 0.06; 95% CI, 0.13–0.32) and ID consult. (Figure 1: CFZ Susceptibilities by Testing Method.) Conclusion There was a large discrepancy among testing methods; additional confirmatory CFZ susceptibility testing beyond AST platforms should be considered prior to definitive use of CFZ for systemic Enterobacterales infections. Disclosures All Authors: No reported disclosures
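Tallying percent-susceptible by testing method from a long results table could look like the sketch below; the DataFrame layout, column names, and toy values are entirely hypothetical.

```python
import pandas as pd

# Hypothetical layout: one row per isolate per testing method, with a boolean
# susceptibility call. The values below are invented for illustration.
results = pd.DataFrame({
    "isolate_id":      [1, 1, 1, 2, 2, 2, 3, 3, 3],
    "method":          ["Vitek2", "ETEST", "disk_diffusion"] * 3,
    "cfz_susceptible": [True, False, True, True, True, True, True, False, False],
})

# Percent of isolates called CFZ susceptible by each method.
pct_susceptible = results.groupby("method")["cfz_susceptible"].mean().mul(100).round(1)
print(pct_susceptible)
```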


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Bethany E. Higgins ◽  
Giovanni Montesano ◽  
Alison M. Binns ◽  
David P. Crabb

Abstract In age-related macular degeneration (AMD) research, dark adaptation has been found to be a promising functional measurement. In more severe cases of AMD, dark adaptation cannot always be recorded within the maximum time allowed for the test (~20–30 min). These data are recorded either as censored data points (data capped at the maximum test time) or as an estimated recovery time based on the trend observed within the maximum recording time. Therefore, dark adaptation data can have unusual attributes that may not be handled by standard statistical techniques. Here we show that time-to-event analysis is a more powerful method for the analysis of rod-intercept time data in measuring dark adaptation. For example, at 80% power (α = 0.05) sample sizes were estimated to be 20 and 61 with uncapped (uncensored) and capped (censored) data using a standard t-test; these values improved to 12 and 38 when using the proposed time-to-event analysis. Our method can accommodate both skewed data and censored data points and offers the advantage of significantly reducing sample sizes when planning studies where this functional test is an outcome measure. The latter is important because more efficient trial and study designs mean that newer treatments can be examined more quickly.
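A simulation of the kind of power comparison described above (a t-test on capped rod-intercept times versus a log-rank test that treats capped values as right-censored) might look like the sketch below. The lognormal distributions, effect size, group sizes, and 20-minute cap are illustrative assumptions, not the paper's actual parameters.

```python
import numpy as np
from scipy import stats
from lifelines.statistics import logrank_test

rng = np.random.default_rng(0)
CAP = 20.0                 # assumed maximum test time in minutes
N_SIM, N_PER_ARM = 1000, 30

def simulate_arm(median_rit, n):
    """Skewed (lognormal) rod-intercept times, right-censored at CAP."""
    t = rng.lognormal(mean=np.log(median_rit), sigma=0.5, size=n)
    observed = t <= CAP
    return np.minimum(t, CAP), observed

hits_ttest = hits_logrank = 0
for _ in range(N_SIM):
    t_ctrl, e_ctrl = simulate_arm(10.0, N_PER_ARM)   # faster-recovering group
    t_amd, e_amd = simulate_arm(15.0, N_PER_ARM)     # slower-recovering group
    # t-test on capped values ignores the censoring entirely.
    if stats.ttest_ind(t_ctrl, t_amd).pvalue < 0.05:
        hits_ttest += 1
    # log-rank test treats capped values as right-censored observations.
    if logrank_test(t_ctrl, t_amd,
                    event_observed_A=e_ctrl, event_observed_B=e_amd).p_value < 0.05:
        hits_logrank += 1

print(f"estimated power: t-test {hits_ttest / N_SIM:.2f}, "
      f"log-rank {hits_logrank / N_SIM:.2f}")
```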


2020 ◽  
Vol 7 (Supplement_1) ◽  
pp. S137-S137
Author(s):  
Stephanie Wo ◽  
Yanina Dubrovskaya ◽  
Justin Siegfried ◽  
John Papadopoulos ◽  
Shin-Pung Jen

Abstract Background Viridans group streptococci (VGS) are an infrequent yet significant cause of bloodstream infections, and complicated cases may require prolonged antibiotic therapy. Ceftriaxone (CTX) and penicillin G (PCN G) are both considered first-line options for VGS infections, but comparisons between these agents are limited. We evaluated clinical outcomes among patients treated with CTX and PCN G for complicated VGS bacteremia. Methods This was a single-center, retrospective study of adult patients with ≥1 positive VGS blood culture who were treated with either CTX or PCN G/ampicillin (both included in the PCN G arm) between January 2013 and June 2019. The primary outcome was a composite of safety endpoints, including hospital readmission due to VGS or an adverse event (AE) from therapy, Clostridioides difficile infection, treatment modification or discontinuation due to an antibiotic-related AE, and development of extended-spectrum beta-lactamase resistance. Secondary outcomes included the individual safety endpoints, VGS bacteremia recurrence, hospital readmission, and all-cause mortality. Results Of 328 patients screened for inclusion, 94 met eligibility criteria (CTX n = 64, PCN G n = 34). Median age was 68 years (IQR 56–81) and 68% were male. Study patients did not present with critical illness, as reflected by a median Pitt bacteremia score of 0 in the CTX and 1 in the PCN G arm (P = 0.764). Streptococcus mitis was the most common VGS isolate, and infective endocarditis (IE) was the predominant source of infection. CTX was not significantly associated with an increased risk of the primary outcome (14% vs. 27%; P = 0.139). The driver of the composite outcome was hospital readmission due to VGS bacteremia or therapy complications. Results were similar in the subgroup of patients with IE (12.5% vs. 23.5%). No secondary endpoints differed significantly between groups. On multivariate analysis, source removal was protective against the primary outcome (OR 0.1; 95% CI 0.020–0.6771; P = 0.017). Conclusion Despite potential safety concerns with the prolonged use of CTX in complicated VGS bacteremia, this study did not demonstrate a higher rate of treatment failure, adverse events, or resistance. These findings warrant further exploration. Disclosures All Authors: No reported disclosures
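The multivariate analysis of a binary composite outcome like the one above could be sketched as a logistic regression with statsmodels. The covariate names, simulated data, and effect sizes below are assumptions for illustration only, not the study's model.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 94

# Simulated cohort; variable names and coefficients are invented.
df = pd.DataFrame({
    "ctx":            rng.integers(0, 2, n),   # 1 = ceftriaxone, 0 = penicillin G
    "source_removal": rng.integers(0, 2, n),
    "pitt_score":     rng.poisson(1, n),
})
linpred = -1.0 + 0.3 * df["ctx"] - 1.5 * df["source_removal"] + 0.2 * df["pitt_score"]
df["composite_outcome"] = (rng.random(n) < 1 / (1 + np.exp(-linpred))).astype(int)

X = sm.add_constant(df[["ctx", "source_removal", "pitt_score"]])
fit = sm.Logit(df["composite_outcome"], X).fit(disp=False)
print(np.exp(fit.params))      # odds ratios
print(np.exp(fit.conf_int()))  # 95% confidence intervals for the ORs
```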


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Sameera Senanayake ◽  
Nicholas Graves ◽  
Helen Healy ◽  
Keshwar Baboolal ◽  
Adrian Barnett ◽  
...  

Abstract Background Economic evaluations using decision-analytic models such as Markov models (MM) and discrete-event simulations (DES) add high value to resource-allocation decisions. The choice of modelling method is critical because an inappropriate model yields results that could lead to flawed decision making. The aim of this study was to compare cost-effectiveness when MM and DES were used to model the results of transplanting a lower-quality kidney versus remaining waitlisted for a kidney. Methods Cost-effectiveness was assessed using MM and DES. We used parametric survival models to estimate the time-dependent transition probabilities of the MM and the distribution of time-to-event in the DES. MMs were simulated with 12-month and 6-month cycles, out to five- and 20-year time horizons. Results The DES model output had a close fit to the actual data. Irrespective of the modelling method, the cycle length of the MM, or the time horizon, transplanting a low-quality kidney as compared to remaining waitlisted was the dominant strategy. However, there were discrepancies in costs, effectiveness, and net monetary benefit (NMB) among the different modelling methods. The incremental NMB of the MM with the 6-month cycle length was a closer fit to the incremental NMB of the DES. The gap in the fit of the two cycle lengths to the DES output reduced as the time horizon increased. Conclusion Different modelling methods were unlikely to influence the decision to accept a lower-quality kidney transplant or remain waitlisted on dialysis. Both models produced similar results when time-dependent transition probabilities were used, most notably with shorter cycle lengths and longer time horizons.
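A stripped-down sketch of a cohort Markov model with time-dependent transition probabilities (derived here from a Weibull survival curve, as one possible implementation) is shown below. The states are reduced to alive/dead per strategy, and every parameter value, cost, and utility is invented for illustration; this is not the study's actual model.

```python
import numpy as np

def weibull_surv(t, scale, shape):
    """Weibull survival function S(t)."""
    return np.exp(-((t / scale) ** shape))

def cycle_death_prob(t, dt, scale, shape):
    """Time-dependent probability of dying during cycle [t, t+dt)."""
    s_now, s_next = weibull_surv(t, scale, shape), weibull_surv(t + dt, scale, shape)
    return 1.0 - s_next / s_now if s_now > 0 else 1.0

def run_markov(dt, horizon, scale, shape, cost_per_year, utility, wtp=50_000, disc=0.05):
    """Cohort simulation of a simple alive/dead model; returns cost, QALYs, NMB."""
    alive, cost, qalys = 1.0, 0.0, 0.0
    for i in range(int(horizon / dt)):
        t = i * dt
        d = 1.0 / (1.0 + disc) ** t                 # annual discount factor
        cost += alive * cost_per_year * dt * d
        qalys += alive * utility * dt * d
        alive *= 1.0 - cycle_death_prob(t, dt, scale, shape)
    return cost, qalys, qalys * wtp - cost          # net monetary benefit

# Toy comparison: remaining waitlisted on dialysis vs. a lower-quality transplant.
strategies = {
    "waitlisted on dialysis":   dict(scale=8.0, shape=1.2, cost_per_year=80_000, utility=0.60),
    "lower-quality transplant": dict(scale=12.0, shape=1.1, cost_per_year=30_000, utility=0.75),
}
for cycle in (1.0, 0.5):                            # 12-month and 6-month cycles
    for name, p in strategies.items():
        c, q, nmb = run_markov(cycle, 20, **p)
        print(f"{cycle*12:.0f}-month cycles, {name}: cost={c:,.0f} QALYs={q:.2f} NMB={nmb:,.0f}")
```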


Author(s):  
Jin K. Kim ◽  
Mitchell Shiff ◽  
Michael E. Chua ◽  
Fadi Zu’bi ◽  
Jessica M. Ming ◽  
...  
