The impact of long-term azithromycin on antibiotic resistance in HIV-associated chronic lung disease

2021
pp. 00491-2021
Author(s):
Regina E. Abotsi
Mark P. Nicol
Grace McHugh
Victoria Simms
Andrea M. Rehman
...  

Selection for resistance to azithromycin (AZM) and other antibiotics such as tetracyclines and lincosamides remains a concern with long-term AZM use for treatment of chronic lung disease (CLD). We investigated the impact of 48 weeks of AZM on the carriage and antibiotic resistance of common respiratory bacteria among children with HIV-associated CLD. Nasopharyngeal (NP) swabs and sputa were collected at baseline, 48 weeks, and 72 weeks from participants with HIV-associated CLD randomised to receive weekly AZM or placebo for 48 weeks and followed post-intervention until 72 weeks. The primary outcomes were the prevalence and antibiotic resistance of Streptococcus pneumoniae (SP), Staphylococcus aureus (SA), Haemophilus influenzae (HI), and Moraxella catarrhalis (MC) at these timepoints. Mixed-effects logistic regression and Fisher's exact test were used to compare carriage and resistance, respectively. Of 347 participants (174 AZM, 173 placebo; median age 15 years [IQR 13–18], 49% female), NP carriage was significantly lower in the AZM (n=159) than in the placebo (n=153) arm for SP (18% versus 41%, p<0.001), HI (7% versus 16%, p=0.01), and MC (4% versus 11%, p=0.02). SP resistance to AZM (62% [18/29] versus 13% [8/63], p<0.0001) and to tetracycline (60% [18/29] versus 21% [13/63], p<0.0001) was higher in the AZM arm. Carriage of SA resistant to AZM (91% [31/34] versus 3% [1/31], p<0.0001), tetracycline (35% [12/34] versus 13% [4/31], p=0.05), and clindamycin (79% [27/34] versus 3% [1/31], p<0.0001) was also significantly higher in the AZM arm and persisted at 72 weeks. Similar findings were observed for sputa. The persistence of antibiotic resistance, and its clinical relevance for future infectious episodes requiring treatment, needs further investigation.
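The between-arm resistance contrasts above are 2×2 comparisons of the kind Fisher's exact test handles. As a minimal sketch, the two-sided test can be computed from the hypergeometric distribution with only the standard library, applied here to the reported S. pneumoniae AZM-resistance counts (18/29 in the AZM arm versus 8/63 on placebo); this is an illustration, not the study's analysis code:

```python
# Two-sided Fisher's exact test via the hypergeometric distribution,
# applied to the S. pneumoniae azithromycin-resistance counts from the
# abstract: AZM arm 18/29 resistant vs. placebo arm 8/63 resistant.
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """P-value for the 2x2 table [[a, b], [c, d]] with fixed margins."""
    row1, row2, col1, n = a + b, c + d, a + c, a + b + c + d
    total = comb(n, col1)
    def prob(x):  # hypergeometric probability of x "successes" in row 1
        return comb(row1, x) * comb(row2, col1 - x) / total
    p_obs = prob(a)
    lo, hi = max(0, col1 - row2), min(row1, col1)
    # Two-sided: sum over all tables at most as probable as the observed one.
    return sum(prob(x) for x in range(lo, hi + 1) if prob(x) <= p_obs + 1e-12)

p = fisher_exact_two_sided(18, 29 - 18, 8, 63 - 8)
print(f"p = {p:.2e}")
```

With these counts the p-value falls below the 0.0001 threshold reported in the abstract.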

Author(s):
Lise D. Cloedt
Kenza Benbouzid
Annie Lavoie
Marie-Élaine Metras
Marie-Christine Lavoie
...  

Abstract Delirium is associated with significant negative outcomes, yet it remains underdiagnosed in children. We describe the impact of implementing a pain, agitation, and delirium (PAD) bundle on the rate of delirium detection in a pediatric intensive care unit (PICU). This was a single-center, pre-/post-intervention retrospective and prospective cohort study conducted at a PICU in a quaternary university-affiliated pediatric hospital. All patients consecutively admitted to the PICU in October and November of 2017 and 2018 were included. The purpose of the study was to describe the impact of implementing the PAD bundle; the rate of delirium detection and the utilization of sedatives and analgesics in the pre- and post-implementation phases were measured. A total of 176 and 138 patients were admitted during the pre- and post-implementation phases, respectively. Of them, 7 (4%) and 44 (31.9%) were diagnosed with delirium (p < 0.001). Delirium was diagnosed in the first 48 hours of PICU admission and lasted for a median of 2 days (interquartile range [IQR]: 2–4). Delirium diagnosis was more frequent in patients receiving invasive ventilation (p < 0.001). Compliance with PAD bundle scoring was 79% for the delirium scale, and score results were discussed during medical rounds for 68% of patients in the post-implementation period. The number of patients who received opioids and benzodiazepines and the cumulative doses were not statistically different between the two cohorts. More patients received dexmedetomidine, and the cumulative daily dose was higher in the post-implementation period (p < 0.001). The implementation of a PAD bundle in a PICU was associated with increased recognition of delirium. Further studies are needed to evaluate the impact of this increased diagnostic rate on short- and long-term outcomes.


2020
Author(s):
J D Schwalm
Noah M Ivers
Zachary Bouck
Monica Taljaard
Madhu K Natarajan
...  

BACKGROUND Based on high-quality evidence, guidelines recommend the long-term use of secondary prevention medications post-myocardial infarction (MI) to avoid recurrent cardiovascular events and death. Unfortunately, discontinuation of recommended medications post-MI is common. Observational evidence suggests that prescriptions covering a longer duration at discharge from hospital are associated with greater long-term medication adherence. The following is a proposal for the first interventional study to evaluate the impact of longer prescription duration at discharge post-MI on long-term medication adherence. OBJECTIVE The overarching goal of this study is to reduce morbidity and mortality among post-MI patients through improved long-term cardiac medication adherence. The specific objectives include the following. First, we will assess whether long-term cardiac medication adherence improves among elderly, post-MI patients following the implementation of (1) standardized discharge prescription forms with 90-day prescriptions and 3 repeats for recommended cardiac medication classes, in combination with education, and (2) education alone, compared to (3) usual care. Second, we will assess the cost implications of prolonged initial discharge prescriptions compared with usual care. Third, we will compare clinical outcomes between longer (>60 days) and shorter prescription durations. Fourth, we will collect baseline information to inform a multicenter interventional study. METHODS We will conduct a quasi-experimental, interrupted time series design to evaluate the impact of a multifaceted intervention to implement longer duration prescriptions versus usual care on long-term cardiac medication adherence among post-MI patients. 
Intervention groups and their corresponding settings include: (1) intervention group 1: 1 cardiac center and 1 noncardiac hospital allocated to receive standardized discharge prescription forms supporting the dispensation of 90 days’ worth of cardiac medications with 3 repeats, coupled with education; (2) intervention group 2: 4 sites (including 1 cardiac center) allocated to receive education only; and (3) control group: all remaining hospitals within the province that did not receive an intervention (ie, usual care). Administrative databases will be used to measure all outcomes. Adherence to 4 classes of cardiac medications — statins, beta blockers, angiotensin system inhibitors, and secondary antiplatelets (ie, prasugrel, clopidogrel, or ticagrelor) — will be assessed. RESULTS Enrollment began in September 2017, and results are expected to be analyzed in late 2020. CONCLUSIONS The results have the potential to redefine best practices regarding discharge prescribing policies for patients post-MI. A policy of standardized maximum-duration prescriptions at the time of discharge post-MI is a simple intervention that has the potential to significantly improve long-term medication adherence, thus decreasing cardiac morbidity and mortality. If effective, this low-cost intervention to implement longer duration prescriptions post-MI could be easily scaled. CLINICALTRIAL ClinicalTrials.gov NCT03257579; https://clinicaltrials.gov/ct2/show/NCT03257579 INTERNATIONAL REGISTERED REPORT DERR1-10.2196/18981
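Analytically, an interrupted time series design like the one above is usually handled with segmented regression: a pre-intervention level and trend, plus terms for the change in level and in slope after the interruption. A minimal sketch on synthetic monthly adherence data (all numbers are made up for illustration; the study's outcomes come from administrative databases):

```python
# Segmented regression for an interrupted time series: estimate the
# change in level and slope of a monthly adherence rate after an
# intervention. Data below are synthetic, not from the study.
import numpy as np

months = np.arange(24)                      # 12 months pre, 12 post
post = (months >= 12).astype(float)         # intervention indicator
time_since = np.where(months >= 12, months - 12, 0.0)

# Synthetic truth: baseline 60%, flat pre trend, +8-point jump,
# +0.5 points/month after the intervention.
adherence = 60 + 0.0 * months + 8.0 * post + 0.5 * time_since

# Design matrix: intercept, baseline trend, level change, slope change.
X = np.column_stack([np.ones_like(months, dtype=float),
                     months, post, time_since])
beta, *_ = np.linalg.lstsq(X, adherence, rcond=None)
intercept, trend, level_change, slope_change = beta
print(f"level change = {level_change:.1f}, slope change = {slope_change:.2f}")
```

On this noiseless series the fit recovers the built-in jump and slope change exactly; with real data the same design matrix is fed to a regression with autocorrelation-robust errors.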


Author(s):
Ross M Boyce
Brandon D Hollingsworth
Emma Baguma
Erin Xu
Varun Goel
...  

Abstract Background Malaria epidemics are a well-described phenomenon after extreme precipitation and flooding, which account for nearly half of global disasters over the past two decades. Yet few studies have examined mitigation measures to prevent post-flood malaria epidemics. Methods We conducted an evaluation of a malaria chemoprevention program implemented in response to severe flooding in western Uganda. Children ≤12 years of age from one village were eligible to receive 3 monthly rounds of dihydroartemisinin-piperaquine (DP). Two neighboring villages served as controls. Malaria cases were defined as individuals with a positive rapid diagnostic test result as recorded in health center registers. We performed a difference-in-differences analysis to estimate changes in the incidence and test positivity of malaria between intervention and control villages. Results A total of 554 children received at least one round of chemoprevention, with 75% participating in at least two rounds. Compared to control villages, we estimated a 53.4% reduction (aRR 0.47, 95% CI 0.34–0.62, p<0.01) in malaria incidence and a 30% decrease in the test positivity rate (aRR 0.70, CI 0.50–0.97, p=0.03) in the intervention village in the six months post-intervention. The impact was greatest among children receiving the intervention, but decreased incidence was also observed in older children and adults (aRR 0.57, CI 0.38–0.84, p<0.01). Conclusions Three rounds of chemoprevention with DP delivered under pragmatic conditions reduced the incidence of malaria after severe flooding in western Uganda. These findings provide a proof of concept for the use of malaria chemoprevention to reduce excess disease burden associated with severe flooding.
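On a ratio scale, the difference-in-differences estimator used above compares the intervention village's post/pre incidence ratio with the control villages' post/pre ratio. A toy sketch with hypothetical case counts and person-time (not the study's data):

```python
# Difference-in-differences on a rate-ratio scale: the intervention
# village's post/pre incidence change divided by the control villages'
# change. All counts and person-time below are hypothetical.
def incidence(cases, person_months):
    return cases / person_months

# (cases, person-months): intervention pre/post, control pre/post.
i_pre, i_post = incidence(120, 3000), incidence(60, 3000)
c_pre, c_post = incidence(110, 3000), incidence(115, 3000)

did_rate_ratio = (i_post / i_pre) / (c_post / c_pre)
print(f"DiD rate ratio = {did_rate_ratio:.2f}")  # 0.48 with these counts
```

A ratio below 1 indicates a relative reduction in the intervention village beyond the secular trend seen in the controls; the study's adjusted estimate (aRR 0.47) additionally comes from a regression that controls for covariates.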


2020
Vol 41 (10)
pp. 1162-1168
Author(s):
Shawn E. Hawken
Mary K. Hayden
Karen Lolans
Rachel D. Yelin
Robert A. Weinstein
...  

Abstract Objective: Cohorting patients who are colonized or infected with multidrug-resistant organisms (MDROs) protects uncolonized patients from acquiring MDROs in healthcare settings. The potential for cross transmission within the cohort, and the possibility of colonized patients acquiring secondary isolates with additional antibiotic resistance traits, is often neglected. We searched for evidence of cross transmission of KPC+ Klebsiella pneumoniae (KPC-Kp) colonization among cohorted patients in a long-term acute-care hospital (LTACH), and we evaluated the impact of secondary acquisitions on resistance potential. Design: Genomic epidemiological investigation. Setting: A high-prevalence LTACH during a bundled intervention that included cohorting KPC-Kp–positive patients. Methods: Whole-genome sequencing (WGS) and location data were analyzed to identify potential cases of cross transmission between cohorted patients. Results: Secondary KPC-Kp isolates from 19 of 28 admission-positive patients were more closely related to another patient's isolate than to their own admission isolate. Of these 19 cases, 14 showed strong genomic evidence for cross transmission (<10 single-nucleotide variants [SNVs]), and most of these patients occupied shared cohort floors (12 patients) or rooms (4 patients) at the same time. Of the 14 patients with strong genomic evidence of acquisition, 12 acquired antibiotic resistance genes not found in their primary isolates. Conclusions: Acquisition of secondary KPC-Kp isolates carrying distinct antibiotic resistance genes was detected in nearly half of cohorted patients. These results highlight the importance of healthcare provider adherence to infection prevention protocols within cohort locations, and they indicate the need for future studies to assess whether multiple-strain acquisition increases the risk of adverse patient outcomes.
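The <10-SNV criterion for strong genomic evidence of cross transmission amounts, in the simplest case, to counting mismatched positions between aligned genome sequences. A toy sketch on short mock strings (real WGS analyses operate on core-genome alignments after variant calling):

```python
# Count single-nucleotide variants (SNVs) between two aligned sequences
# and apply a transmission threshold, as in the investigation above.
# The sequences here are short mock strings, not real genomes.
def count_snvs(seq_a: str, seq_b: str) -> int:
    assert len(seq_a) == len(seq_b), "sequences must be aligned"
    return sum(1 for a, b in zip(seq_a, seq_b) if a != b)

def likely_cross_transmission(seq_a: str, seq_b: str, threshold: int = 10) -> bool:
    return count_snvs(seq_a, seq_b) < threshold

patient_1 = "ACGTACGTACGTACGT"
patient_2 = "ACGTACGAACGTACGT"   # differs at one position
print(count_snvs(patient_1, patient_2))                 # 1
print(likely_cross_transmission(patient_1, patient_2))  # True
```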


2017
Vol 4 (suppl_1)
pp. S63-S63
Author(s):
Fabian Andres Romero
Evette Mathews
Ara Flores
Susan Seo

Abstract Background Antibiotic stewardship program (ASP) implementation is paramount across the healthcare spectrum. Nursing homes represent a challenge due to limited resources, the complexity of medical conditions, and less controlled environments. National statistics on ASPs for long-term care facilities (LTCF) are sparse. Methods A pilot ASP was launched in August 2016 at a 270-bed nursing home with a 50-bed chronic ventilator-dependent unit. The program entailed a bundle of interventions including leadership engagement, a tracking and reporting system for intravenous antibiotics, education for caregivers, Infectious Disease (ID) consultant availability, and implementation of nursing protocols. Data were collected from pharmacy and medical records between January 2016 and March 2017, establishing pre-intervention and post-intervention periods. Collected data included days of therapy (DOT), antibiotic costs, resident-days, hospital transfers, and Clostridium difficile infection (CDI) rates. Variables were adjusted to 1,000 resident-days (RD), and findings between periods were compared by Mann–Whitney U test. Results A total of 47,423 resident-days and 1,959 DOT were analyzed for this study. Antibiotic use decreased from 54.5 DOT/1000 RD pre-intervention to 27.6 DOT/1000 RD post-intervention (P = 0.017). Antibiotic costs were reduced from a monthly median of US $17,113 to US $7,073, but the reduction was not statistically significant (P = 0.39). Analysis stratified by individual antibiotic was done for the five most commonly used antibiotics and found a statistically significant reduction in vancomycin use (14.4 vs. 6.5; P = 0.023). Nonsignificant reductions were also found for cefepime/ceftazidime (6.9 vs. 1.3; P = 0.07), ertapenem (6.8 vs. 3.6; P = 0.45), and piperacillin/tazobactam (1.8 vs. 0.6; P = 0.38). Meropenem use increased (1.3 vs. 3.2; P = 0.042). Hospital transfers trended slightly upward (6.73 vs. 7.77; P = 0.065), and there was no change in CDI rates (1.1 vs. 0.94; P = 0.32). 
Conclusion A bundle of standardized interventions tailored for LTCF can achieve successful reduction of antibiotic utilization and costs. Subsequent studies are needed to further determine the impact on clinical outcomes such as transfers to hospitals and CDI in these settings. Disclosures All authors: No reported disclosures.
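The DOT/1000 RD rates above normalize antibiotic use to facility occupancy: total days of therapy divided by resident-days, scaled to 1,000. A minimal sketch of that arithmetic using the study-wide totals reported in the abstract:

```python
# Normalize antibiotic days of therapy (DOT) to 1,000 resident-days (RD),
# the denominator used in the stewardship analysis above.
def dot_per_1000_rd(days_of_therapy: float, resident_days: float) -> float:
    return days_of_therapy / resident_days * 1000

# Totals reported in the abstract: 1,959 DOT over 47,423 resident-days.
rate = dot_per_1000_rd(1959, 47423)
print(f"{rate:.1f} DOT/1000 RD")  # 41.3, between the 54.5 pre and 27.6 post rates
```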


2009
Vol 94 (3)
pp. 761-764
Author(s):
Manivannan Srinivasan
Brian A. Irving
Ketan Dhatariya
Katherine A. Klaus
Stacy J. Hartman
...  

Abstract Context: Levels of dehydroepiandrosterone (DHEA) and its sulfate form (DHEAS) are inversely associated with cardiovascular mortality in men but not women. Very little evidence is available on the impact of DHEA administration on the lipoprotein profile in women, and DHEAS levels are very low or undetectable in hypoadrenal women. Objective: The objective of the study was to determine the impact of DHEA replacement on the lipoprotein profile in hypoadrenal women. Design and Setting: A double-blind, randomized, placebo-controlled, cross-over study was conducted at the Mayo Clinic. Participants: Thirty-three hypoadrenal Caucasian women (mean ± sd; age 50.3 ± 15.2 yr, body mass index 26.6 ± 4.4 kg/m2) took part in the study. Intervention: Study participants were assigned to receive either placebo or 50 mg/d of DHEA for 3 months each. Lipid levels and lipoprotein profile were analyzed using the LipoScience lipoprotein nuclear magnetic resonance system. Main Outcome Measures: Changes in various lipoprotein sizes and levels were measured. Results: Plasma DHEAS levels were higher during the DHEA period than during placebo (<0.3 ± 0.0 vs. 3.5 ± 1.3 nmol/liter, P < 0.001). DHEA replacement significantly reduced total cholesterol (20.0 vs. −22, P = 0.02) and high-density lipoprotein (HDL) levels (2.0 vs. −6.0, P = 0.006) and tended to reduce triglyceride and total low-density lipoprotein levels. Although DHEA replacement had no effect on low-density lipoprotein particle size, it significantly reduced larger HDL particles and, to a modest extent, small HDL particles. Conclusions: Our findings show that oral DHEA administration in hypoadrenal women results in an unfavorable lipoprotein profile. The results warrant long-term studies to determine the impact of DHEA replacement on cardiovascular risk.


Crime Science
2021
Vol 10 (1)
Author(s):
William A. Chernoff

Abstract Objective The opportunity for web camera theft increased globally as institutions of higher education transitioned to remote learning during COVID-19. Given the thousands of cameras currently installed in classrooms, many with little protection, the present study tests the effectiveness of anti-theft signage for preventing camera theft. Methods We examined web camera theft at a southern public university in the United States of America by randomly assigning N = 104 classrooms to receive either anti-theft signage or no signage. Camera theft was analyzed using Blaker's exact test. Results Classrooms not receiving anti-theft signage (control) were 3.42 times more likely to exhibit web camera theft than classrooms receiving anti-theft signage (medium effect size). Conclusions Using classrooms as the unit of analysis presents new opportunities not only for future crime prevention experiments but also for improving campus safety and security. Preventing web camera theft on campus is both fiscally and socially responsible, saving money and ensuring inclusivity for remote learners.
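The reported effect is a risk ratio: the control classrooms' theft risk divided by the signage classrooms' risk, with Blaker's exact test supplying the p-value. A sketch of the risk-ratio arithmetic on hypothetical cell counts (the abstract reports only the ratio of 3.42, not the underlying 2×2 counts):

```python
# Risk ratio for web camera theft: control classrooms vs. classrooms
# with anti-theft signage. The cell counts below are hypothetical; the
# abstract reports only the ratio (3.42) and the use of Blaker's test.
def risk_ratio(events_exposed, n_exposed, events_unexposed, n_unexposed):
    return (events_exposed / n_exposed) / (events_unexposed / n_unexposed)

# Hypothetical 2x2: 13 of 52 control rooms vs. 4 of 52 signage rooms
# experienced a theft.
rr = risk_ratio(13, 52, 4, 52)
print(f"risk ratio = {rr:.2f}")  # 3.25 with these made-up counts
```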


2020
Author(s):
Ravena Melo Ribeiro da Silva
Ana Cláudia de Brito Câmara
Ellen Karla Chaves Vieira Koga
Iza Maria Fraga Lobo
Wellington Barros da Silva

Abstract Background: Antimicrobials are among the most prescribed drugs in ICUs, where their use is approximately 10 times greater than in other wards. Even so, between 30% and 60% of antimicrobial prescriptions in these units are estimated to be unnecessary or inadequate. Surveillance of antimicrobial prescribing is therefore a first and essential step in identifying potential overuse or misuse that could be targeted by antimicrobial stewardship interventions. Methods: This was an observational, analytical, prospective study conducted in two adult intensive care units (ICU 1, surgical; ICU 2, clinical), with 27 beds each. The study period was divided into pre-intervention (January to June 2019) and post-intervention (July to December 2019) phases. Results: Overall, 91.4% and 90.0% of patients received at least one antimicrobial agent in the pre- and post-intervention periods, respectively. The most frequently prescribed antimicrobial classes were carbapenems (PRE = 26.0% vs. POST = 24.9%; p = 0.245), followed by glycopeptides (PRE = 21.0% vs. POST = 18.6%; p = 0.056). Overall, there was a significant reduction in the duration of therapy (PRE = 727 LOT/1000 pd vs. POST = 680 LOT/1000 pd; p = 0.028). The highest rates of antimicrobial use were observed for carbapenems, followed by glycopeptides, with significant reductions in exposure to glycopeptides (PRE = 284 DOT/1000 pd vs. POST = 234 DOT/1000 pd; p = 0.014) and polymyxin B (PRE = 121 DOT/1000 pd vs. POST = 88 DOT/1000 pd; p = 0.029), and significant increases for penicillins (PRE = 25 DOT/1000 pd vs. POST = 45 DOT/1000 pd; p = 0.009) and tigecycline (PRE = 3 DOT/1000 pd vs. POST = 27 DOT/1000 pd; p = 0.046). Conclusions: Overall, the intervention of infectious disease specialists in these intensive care units had a limited impact on the outcomes evaluated, possibly because of the short period analyzed. It is therefore important to monitor the impact of these changes over the long term, allowing a more accurate assessment of the effectiveness of the intervention, together with the implementation of active feedback.
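The LOT and DOT rates above measure different things: days of therapy (DOT) add one for each antimicrobial on each day, so combination therapy is counted multiple times, whereas length of therapy (LOT) counts calendar days on which the patient received at least one antimicrobial. A small sketch contrasting the two on a mock regimen (drugs and days are illustrative only):

```python
# DOT vs. LOT on a mock four-day ICU course: DOT sums drug-days
# (one per drug per day), LOT counts days with any antimicrobial.
regimen = {
    1: {"meropenem", "vancomycin"},   # day -> drugs given (illustrative)
    2: {"meropenem", "vancomycin"},
    3: {"meropenem"},
    4: {"meropenem"},
}

dot = sum(len(drugs) for drugs in regimen.values())   # 2+2+1+1 = 6
lot = sum(1 for drugs in regimen.values() if drugs)   # 4 days on therapy
print(f"DOT = {dot}, LOT = {lot}")
```

Both are then normalized per 1,000 patient-days, giving rates of the kind reported in the PRE/POST comparisons above.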


Author(s):  
Hulya Sahin

Pulmonary rehabilitation (PR) is a comprehensive intervention for chronic lung disease that includes personalized therapies, exercise training, education, and behavioral change to improve patients' physical and psychological status, and it aims to promote behavior that improves health status in the long term. A personalized PR program administered by a multidisciplinary team is now considered a standard, complementary treatment in chronic lung disease. After a PR program, dyspnea in COPD patients decreases and exercise capacity increases; daily life and physical activity increase, functional dependence decreases, and quality of life improves. PR thus offers an excellent opportunity to foster self-management and independence for patients. Studies have shown that, in the absence of a structured maintenance program, the gains achieved begin to decline an average of 6–12 months after a PR program. Causes such as decreased compliance with exercise, disease progression, exacerbations, and comorbidities play a role in this loss of gains, and in advanced age and severe disease the gain in exercise tolerance is lost more rapidly. The methods used to maintain these gains, and the results obtained, vary.

