Analysis of Recurrent Urinary Tract Infection Management in Outpatient Settings Reveals Opportunities for Antibiotic Stewardship

2021 ◽  
Vol 1 (S1) ◽  
pp. s34-s34
Author(s):  
Marissa Valentine-King ◽  
Barbara Trautner ◽  
Roger Zoorob ◽  
George Germanos ◽  
Jason Salemi ◽  
...  

Background: Studies of antibiotic prescribing choice and duration have typically excluded women with recurrent UTI (rUTI), yet the Infectious Diseases Society of America (IDSA) UTI treatment guidelines apply to both recurrent and sporadic cystitis. We sought to better understand prescribing practices among patients with uncomplicated rUTI in terms of drug choice, duration of therapy, and the risk factors for receiving guideline-discordant therapy. Methods: We performed a retrospective database study by extracting electronic health record data from adults seen at academic primary care, internal medicine, or urology practices between November 2016 and December 2018. Inclusion criteria were ≥2 or ≥3 International Classification of Diseases, Tenth Revision (ICD-10) cystitis codes recorded within a 6- or 12-month period, respectively. We excluded patients with ICD-10 codes indicating any structural or functional genitourinary comorbidities, interstitial cystitis, vaginosis, compromised immune systems, or pregnancy in the prior year. Patients were also excluded if they had signs or symptoms of pyelonephritis at presentation. Results: Overall, 232 patients presented for 597 outpatient visits. Most were married (52.2%), non-Hispanic white (62.9%), and female (92.2%), with a median age of 58 years (IQR, 41–68). Only 21% of visits with an antibiotic prescribed for treatment involved a first-line agent prescribed for the recommended duration. In terms of antibiotic choice, first-line agents were prescribed in 58.4% of visits, primarily nitrofurantoin (37.8%) and trimethoprim-sulfamethoxazole (TMP-SMX) (20.3%). Guideline-discordant choices of fluoroquinolones (28.8%) and β-lactams (11.2%) were the second and third most commonly prescribed drug categories, respectively. Multinomial logistic regression identified age (OR, 1.02; 95% CI, 1.002–1.04) and having a telephone visit (OR, 3.17; 95% CI, 1.54–6.52) as independent risk factors for receiving a β-lactam. Duration exceeded the 3-day guideline recommendation in 87.6% of fluoroquinolone and 73% of TMP-SMX prescriptions, and 61% of nitrofurantoin prescriptions exceeded the recommended 5-day duration. Multiple logistic regression analysis revealed that seeking care at a urology clinic (OR, 2.81; 95% CI, 1.59–5.17) was an independent risk factor for therapy duration exceeding guideline recommendations. Conclusions: This retrospective study revealed shortcomings in prescribing practices in both the type and the duration of therapy for rUTI. Recurrent as well as sporadic UTI is an important target for outpatient antibiotic stewardship interventions. Funding: This investigator-initiated research study was funded by Rebiotix Inc, a Ferring Company. Disclosures: None
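
The multinomial model described above can be illustrated with a minimal sketch in Python using statsmodels. This is not the authors' code; the file name and column names (drug_class, age, telephone_visit) are hypothetical placeholders for visit-level data.

```python
# Hypothetical sketch: multinomial logistic regression relating visit-level
# covariates to the antibiotic class prescribed. Names are assumptions.
import numpy as np
import pandas as pd
import statsmodels.api as sm

visits = pd.read_csv("ruti_visits.csv")  # hypothetical visit-level extract

# Outcome: prescribed drug class, e.g. "first_line", "fluoroquinolone", "beta_lactam"
y = visits["drug_class"]
X = sm.add_constant(visits[["age", "telephone_visit"]].astype(float))

model = sm.MNLogit(y, X).fit()
print(model.summary())

# Exponentiated coefficients give odds ratios relative to the reference class
print(np.exp(model.params))
```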

2021 ◽  
Vol 29 (3) ◽  
pp. 230949902110609
Author(s):  
Hidetomi Terai ◽  
Yusuke Hori ◽  
Shinji Takahashi ◽  
Koji Tamai ◽  
Masayoshi Iwamae ◽  
...  

Background: The coronavirus disease 2019 (COVID-19) pandemic has affected people in various ways, including restricting their mobility and depriving them of exercise opportunities. Such circumstances can trigger locomotor deterioration and impairment, known as locomotive syndrome. The purpose of this study was to investigate the incidence of locomotive syndrome during the pandemic and to identify its risk factors. Methods: This was a multicenter questionnaire survey performed between 1 November 2020 and 31 December 2020 in Japan. Patients who visited the orthopedics clinic were asked to complete a questionnaire about their symptoms, exercise habits, and locomotor function at two time points, namely, pre-pandemic and post-second wave (current). The incidence of locomotive syndrome during the COVID-19 pandemic was investigated. Additionally, multiple logistic regression analysis was used to identify the risk factors for developing locomotive syndrome during the pandemic. Results: A total of 2829 patients were enrolled in this study (average age: 61.1 ± 17.1 years; 1532 women). The prevalence of locomotive syndrome was 30% pre-pandemic, which increased significantly to 50% intra-pandemic. Among patients with no pre-pandemic symptoms of locomotive syndrome, 30% developed it in the wake of the pandemic. In the logistic regression analysis, older age, deteriorated or newly occurring symptoms of musculoskeletal disorders, complaints about the spine or hip/knee joints, and no or decreased exercise habits were independent risk factors for developing locomotive syndrome. Conclusions: The prevalence of locomotive syndrome in patients with musculoskeletal disorders has increased during the COVID-19 pandemic. In addition to age, locomotor symptoms, especially spine or hip/knee joint complaints, and exercise habits were associated with the development of locomotive syndrome. Although infection control is a priority, treating musculoskeletal disorders and maintaining exercise habits are also essential during a pandemic such as COVID-19.
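
A minimal sketch of the kind of risk-factor regression described above, assuming a hypothetical survey table with 0/1 indicator columns; it is illustrative only, not the study's actual analysis code.

```python
# Hypothetical sketch: logistic regression for new-onset locomotive syndrome
# during the pandemic, with odds ratios and 95% CIs. Names are assumptions.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

survey = pd.read_csv("locomotive_survey.csv")  # hypothetical questionnaire data

model = smf.logit(
    "new_locomotive_syndrome ~ age + worsened_symptoms"
    " + spine_or_joint_complaint + decreased_exercise",
    data=survey,
).fit()

summary = pd.DataFrame({
    "odds_ratio": np.exp(model.params),
    "ci_lower": np.exp(model.conf_int()[0]),
    "ci_upper": np.exp(model.conf_int()[1]),
})
print(summary)
```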


2021 ◽  
Vol 22 (Supplement_1) ◽  
Author(s):  
T Heseltine ◽  
SW Murray ◽  
RL Jones ◽  
M Fisher ◽  
B Ruzsics

Abstract Funding Acknowledgements: Type of funding sources: None. On behalf of the Liverpool Multiparametric Imaging Collaboration. Background: Coronary artery calcium (CAC) scoring is a well-established technique for stratifying an individual's cardiovascular disease (CVD) risk, and several large registries have incorporated CAC scoring into CVD risk prediction models to enhance accuracy. Hepatosteatosis (HS) has been shown to be an independent predictor of CVD events and can be measured on non-contrast computed tomography (CT). We sought to undertake a contemporary, comprehensive assessment of the influence of HS on CAC score alongside traditional CVD risk factors. In patients with HS, routine CAC screening to evaluate CVD risk may enhance opportunities for earlier primary prevention strategies. Methods: We performed a retrospective, observational analysis at a high-volume cardiac CT centre analysing consecutive CT coronary angiography (CTCA) studies. All patients referred for investigation of chest pain over a 28-month period (June 2014 to November 2016) were included. Patients with established CVD were excluded. The cardiac findings were reported by a cardiologist and retrospectively analysed by two independent radiologists for the presence of HS. Those with a CAC score of zero and those with a CAC score greater than zero were compared for demographics and cardiac risk factors. A multivariate analysis was performed to adjust for established risk factors, and a binomial logistic regression model was developed to assess the association between the presence of HS and increasing strata of CAC. Results: In total, 1499 patients were referred for CTCA without prior evidence of CVD. The assessment of HS was completed in 1195 (79.7%), and a CAC score was performed in 1103 of these (92.3%). There were 466 patients with CVD and 637 without CVD on CTCA. The prevalence of HS was significantly higher in those with CVD than in those without (51.3% versus 39.9%, p = 0.007). Male sex (50.7% versus 36.1%, p < 0.001), age (59.4 ± 13.7 versus 48.1 ± 13.6 years, p < 0.001), and diabetes (12.4% versus 6.9%, p = 0.04) were also significantly higher in the CAC > 0 group than in the group with a CAC score of zero. HS was associated with increasing strata of CAC score compared with a CAC of zero (CAC score 1-100: OR 1.47, p = 0.01; CAC score 101-400: OR 1.68, p = 0.02; CAC score >400: OR 1.42, p = 0.14); the association was non-significant in the highest stratum. Conclusion: We found significant associations between increasing age, male sex, diabetes, and HS and the presence of CAC. HS was also associated with a more severe phenotype of CVD based on the multinomial logistic regression model. The attenuated association in the highest CAC stratum (CAC score >400) likely reflects the low number of patients within this group and a type II error. Based on these findings, it may be appropriate to offer routine CVD risk stratification to all patients diagnosed with HS.
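
As a rough illustration of the prevalence comparison reported above, the sketch below runs a two-proportion test on counts reconstructed from the reported percentages. It is not the study's analysis; the reconstructed counts (and therefore the p-value) are approximate.

```python
# Hypothetical sketch: HS prevalence in patients with vs. without CVD on CTCA,
# using group sizes and percentages quoted in the abstract (approximate counts).
from statsmodels.stats.proportion import proportions_ztest

n_cvd, n_no_cvd = 466, 637
hs_cvd = round(0.513 * n_cvd)        # ~239 patients with HS among those with CVD
hs_no_cvd = round(0.399 * n_no_cvd)  # ~254 patients with HS among those without CVD

stat, p_value = proportions_ztest([hs_cvd, hs_no_cvd], [n_cvd, n_no_cvd])
print(f"z = {stat:.2f}, p = {p_value:.4g}")
```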


2019 ◽  
Vol 6 (Supplement_2) ◽  
pp. S692-S692
Author(s):  
Jon P Furuno ◽  
Brie N Noble ◽  
Vicki Nordby ◽  
Bo Weber ◽  
Jessina C McGregor ◽  
...  

Abstract Background: Nursing homes (NHs) are required by the Centers for Medicare and Medicaid Services to maintain antimicrobial stewardship programs. Hospital-initiated antibiotics may pose a barrier to optimizing antibiotic prescribing in this setting. Our objective was to characterize hospital-initiated antibiotic prescriptions among NH residents. Methods: We collected electronic health record data on antibiotic prescribing events within 60 days of residents' admission to 17 for-profit NHs in Oregon, California, and Nevada between January 1 and December 31, 2017. We characterized the antibiotics prescribed, administration route, and the proportion initiated in a hospital setting. Results: Over the one-year study period, there were 4350 antibiotic prescribing events among 1633 NH residents. Mean (standard deviation) age was 77 (12) years, and 58% of residents were female. Approximately 45% (1,973/4,350) of antibiotics prescribed within 60 days of NH admission were hospital-initiated. The most frequently prescribed hospital-initiated antibiotics were cephalosporins (27%; 1st gen: 54%, 2nd gen: 6%, 3rd gen: 34%, 4th gen: 5%, 5th gen: 1%), fluoroquinolones (20%), and penicillins (14%; natural penicillins: 4%, semisynthetic penicillins: 3%, aminopenicillins: 57%, β-lactam/β-lactamase inhibitors: 21%, and antipseudomonal penicillins: 15%). Additionally, 24% of antibiotics were parenteral, and the median (interquartile range) duration of therapy was 6 (3–10) days. Over 15% of residents with hospital-initiated antibiotics were readmitted to the hospital within 30 days. Conclusion: Approximately 45% of antibiotic prescribing in a multistate sample of NHs was hospital-initiated, of which roughly 40% was broad-spectrum. Interventions specifically targeting antibiotic prescribing during and following the transition from hospitals to NHs are needed. Disclosures: All authors: No reported disclosures.
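
A short sketch of how the descriptive summaries above might be computed from a prescription-level table; the file name and columns (initiated_in_hospital, antibiotic_class, duration_days) are hypothetical, not the authors' dataset.

```python
# Hypothetical sketch: summarizing hospital-initiated antibiotic prescriptions.
import pandas as pd

rx = pd.read_csv("nh_antibiotic_prescriptions.csv")  # one row per prescribing event

hosp = rx[rx["initiated_in_hospital"].astype(bool)]

# Proportion of all prescribing events that were hospital-initiated
print(f"Hospital-initiated: {len(hosp) / len(rx):.1%}")

# Antibiotic class breakdown among hospital-initiated prescriptions
print(hosp["antibiotic_class"].value_counts(normalize=True))

# Median and IQR of therapy duration (days)
print(hosp["duration_days"].median(),
      hosp["duration_days"].quantile([0.25, 0.75]).tolist())
```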


2020 ◽  
Vol 20 (4) ◽  
pp. 1646-54
Author(s):  
Peter Thomas Cartledge ◽  
Fidel Shofel Ruzibuka ◽  
Florent Rutagarama ◽  
Samuel Rutare ◽  
Tanya Rogo

Introduction: There is limited published data on antibiotic use in neonatal units in resource-poor settings. Objectives: This study sought to describe antibiotic prescribing practices in three neonatology units in Kigali, Rwanda. Methods: A multi-center, cross-sectional study conducted in two tertiary hospitals and one urban district hospital in Kigali, Rwanda. Participants were neonates admitted to the neonatology unit who received a course of antibiotics during their admission. Data collected included risk factors for neonatal sepsis, clinical signs and symptoms, investigations for neonatal sepsis, antibiotics prescribed, and the number of deaths in the included cohort. Results: 126 neonates were enrolled, 42 from each site. Prematurity (38%), followed by rupture of membranes for more than 18 hours (25%), were the main risk factors for neonatal sepsis. Ampicillin and gentamicin (85%) were the most commonly used first-line antibiotics for suspected neonatal sepsis. Most neonates (87%) did not receive a second-line antibiotic; cefotaxime (11%) was the most commonly used second-line antibiotic. The median duration of antibiotic use was four days among all surviving neonates (n=113). In neonates with a negative blood culture and normal C-reactive protein (CRP), the median duration of antibiotics was 3.5 days, and for neonates with positive blood cultures the median duration was 11 days. Thirteen infants (10%) died across the three sites, with no significant difference between the sites. Conclusion: The median antibiotic duration for neonates with normal laboratory results exceeded the duration mandated by the national neonatal protocol. We recommend the development of antibiotic stewardship programs in neonatal units in Rwanda to prevent the adverse effects of inappropriate or excessive antibiotic use. Keywords (MeSH): Antimicrobial stewardship; anti-bacterial agents; neonatal sepsis; sepsis; infant mortality; neonatal intensive care units; Africa; Rwanda.
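
A minimal sketch of the subgroup comparison described above (median antibiotic duration by sepsis work-up result), assuming hypothetical column names; it is not the study's code.

```python
# Hypothetical sketch: median antibiotic duration by culture/CRP subgroup.
import pandas as pd

neonates = pd.read_csv("neonatal_antibiotics.csv")  # one row per neonate

def workup_group(row):
    """Label each neonate by blood culture and CRP result."""
    if row["blood_culture_positive"]:
        return "culture_positive"
    if not row["crp_elevated"]:
        return "culture_negative_normal_crp"
    return "other"

neonates["workup"] = neonates.apply(workup_group, axis=1)
print(neonates.groupby("workup")["antibiotic_days"].median())
```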


2021 ◽  
Author(s):  
Yatao Jia ◽  
Hongwei Zhao ◽  
Yun Hao ◽  
Jiang Zhu ◽  
Yingyi Li ◽  
...  

Abstract Background: To determine independent predictors of inguinal lymph node (ILN) metastasis in patients with penile cancer. Patients and methods: We retrospectively analyzed all patients with penile cancer undergoing surgery at our medical center over ten years (N=157). Using univariate and multivariate logistic regression models, we assessed associations between ILN metastasis and the following factors: age, medical history, phimosis, onset time, number and maximum diameter of involved ILNs, pathological T stage, degree of tumor differentiation and/or cornification, lymphatic vascular infiltration (LVI), and nerve infiltration. Interaction and stratified analyses were then used to assess age, phimosis, onset time, number of ILNs, cornification, and nerve infiltration. Results: Ultimately, 110 patients were included. Multiple logistic regression analysis showed that the following factors were significantly correlated with ILN metastasis: maximum diameter of enlarged ILNs, T stage, pathological differentiation, and LVI. Among patients with a maximum ILN diameter ≥1.5 cm, 50% (19/38) had lymph node metastasis (HR = 2.3, 95% CI: 1.0–5.1), whereas only 30.6% (22/72) of patients with a maximum ILN diameter <1.5 cm did. Among 44 patients with stage Ta/T1 disease, 10 (22.7%) showed ILN metastases, while 31 of 66 (47.0%) patients with stage T2 disease did (HR = 3.0, 95% CI: 1.3–7.1). Among 40 patients with highly differentiated penile cancer, eight (20%) showed ILN metastasis, while 33 of 70 (47.1%) patients with low-to-middle differentiation did (HR = 3.6, 95% CI: 1.4–8.8). In the LVI-free group, the rate of lymph node metastasis was 33.3% (32/96), whereas it was 64.3% (9/14) in the LVI group (HR = 3.6, 95% CI: 1.1–11.6). Conclusion: Our single-center results suggest that maximum ILN diameter, pathological T stage, pathological differentiation, and LVI were independent risk factors for ILN metastases.
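
As an illustration, the 2x2 counts reported above for LVI can be turned into a crude odds ratio; this is a sketch of the unadjusted calculation, not the authors' multivariable model (note the crude estimate works out to 3.6, matching the adjusted value reported).

```python
# Hypothetical sketch: crude odds ratio for inguinal lymph node metastasis by
# LVI status, built from the counts quoted in the abstract.
import numpy as np
from statsmodels.stats.contingency_tables import Table2x2

#                 metastasis   no metastasis
# LVI present          9             5        (9/14 = 64.3%)
# LVI absent          32            64        (32/96 = 33.3%)
table = Table2x2(np.array([[9, 5], [32, 64]]))

print(f"OR = {table.oddsratio:.2f}")          # (9*64)/(5*32) = 3.6
print("95% CI:", table.oddsratio_confint())
```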


2020 ◽  
Author(s):  
Bo You ◽  
Zi Chen Yang ◽  
Yu Long Zhang ◽  
Yu Chen ◽  
Yun Long Shi ◽  
...  

Abstract Background: Acute kidney injury (AKI) is a morbid complication and the main cause of multiple organ failure and death in severely burned patients. The objective of this study was to explore the epidemiological characteristics, risk factors, and impact of early and late AKI. Methods: This retrospective study was performed with prospectively collected data of severely burned patients from the Institute of Burn Research in Southwest Hospital during 2011-2017. AKI was diagnosed according to the Kidney Disease: Improving Global Outcomes (KDIGO) criteria (2012) and was classified as early or late depending on onset time (within the first 3 days vs. more than 3 days post burn). The baseline characteristics, clinical data, and outcomes of the three groups (early AKI, late AKI, and non-AKI) were compared using logistic regression analysis, and mortality predictors of patients with AKI were assessed. Results: A total of 637 patients were included in the analysis. The incidence of AKI was 36.9% (early AKI 29.4%, late AKI 10.0%). The mortality of patients with AKI was 32.3% (early AKI 25.7%, late AKI 56.3%), and that of patients without AKI was 2.5%. AKI was independently associated with markedly increased mortality in severely burned patients [early AKI, OR = 12.98 (6.08-27.72); late AKI, OR = 34.02 (15.69-73.75)]. Multiple logistic regression analysis revealed that age, gender, total burn surface area (TBSA), percentage of TBSA with full-thickness burns, chronic comorbidities (hypertension and/or diabetes), early postburn hypovolemic shock, and tracheotomy were independent risk factors for both early and late AKI, whereas sepsis was a risk factor for late AKI only. Decompression escharotomy was a protective factor for both. Conclusions: AKI remains prevalent and is associated with high mortality in severely burned patients. Compared with early AKI, late AKI occurs less often but is more severe and carries a worse prognosis, making it a devastating complication and a sign of poor prognosis in severe burns.
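
A minimal sketch of how the early/late AKI grouping described above might be coded from patient-level data and tabulated against mortality; the file and column names (aki, aki_onset_day, died) are hypothetical assumptions.

```python
# Hypothetical sketch: label AKI as early (onset within 3 days post burn) or late
# (after day 3), then compute crude mortality by group.
import pandas as pd

burns = pd.read_csv("burn_cohort.csv")  # one row per patient

def aki_group(row):
    """Classify each patient as 'non_aki', 'early_aki', or 'late_aki'."""
    if not row["aki"]:
        return "non_aki"
    return "early_aki" if row["aki_onset_day"] <= 3 else "late_aki"

burns["aki_group"] = burns.apply(aki_group, axis=1)

# Crude mortality by AKI group
print(burns.groupby("aki_group")["died"].mean())
```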


2021 ◽  
Vol 8 ◽  
Author(s):  
Xiya Lu ◽  
Zhijing Wang ◽  
Liu Yang ◽  
Changqing Yang ◽  
Meiyi Song

Background and Objectives: Liver cirrhosis is known to be associated with atrial arrhythmia. However, the risk factors for atrial arrhythmia in patients with liver cirrhosis remain unclear. This retrospective study aimed to investigate those risk factors. Methods: We collected data from 135 patients with liver cirrhosis who were admitted to the Department of Gastroenterology at Shanghai Tongji Hospital and examined the recorded clinical information to identify risk factors for atrial arrhythmia. Multiple logistic regression analysis was used to screen for factors differentiating liver cirrhosis patients with atrial arrhythmia from those without. Results: Seven factors differed significantly between the groups with and without atrial arrhythmia: age, white blood cell count (WBC), albumin (ALB), serum Na+, B-type natriuretic peptide (BNP), ascites, and Child-Pugh score. Multivariate logistic regression analysis suggested that age (β = 0.094, OR = 1.098, 95% CI 1.039–1.161, P = 0.001) and ascites (β = 1.354, OR = 3.874, 95% CI 1.202–12.483, P = 0.023) were significantly associated with atrial arrhythmia. Conclusion: In the present study, age and ascites were identified as risk factors for atrial arrhythmia in patients with liver cirrhosis.
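
As a quick arithmetic check, a logistic-regression odds ratio is the exponential of its coefficient, OR = exp(β), which is consistent with the estimates reported above.

```python
# Verifying OR = exp(beta) for the coefficients quoted in the abstract.
import math

print(math.exp(0.094))  # ~1.099, matching the reported OR of 1.098 for age
print(math.exp(1.354))  # ~3.873, matching the reported OR of 3.874 for ascites
```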


Perfusion ◽  
2009 ◽  
Vol 24 (3) ◽  
pp. 173-178 ◽  
Author(s):  
Guowei Zhang ◽  
Naishi Wu ◽  
Hongyu Liu ◽  
Hang Lv ◽  
Zhifa Yao ◽  
...  

Background: Gastrointestinal complications (GIC) after cardiopulmonary bypass (CPB) surgery are rare but extremely dangerous. Identifying risks for GIC may help in planning appropriate perioperative management strategies. The aim of the present study was to analyze perioperative factors associated with GIC in patients undergoing CPB surgery. Methods: We retrospectively analysed 206 patients who developed GIC after CPB surgery from 2000 to 2007 and compared them with 206 matched control patients (matched for surgery, temperature, hemodilution, and date). Univariate analysis and multiple logistic regression analysis were performed on 12 risk factors. Results: Sex and type of cardioplegia perfusate did not significantly influence GIC after CPB surgery. Multiple logistic regression revealed that CPB time, preoperative serum creatinine (PSC) ≥ 179 mg/dL, emergency surgery, perfusion pressure ≤ 40 mmHg, low cardiac output syndrome (LCOS), age ≥ 61 years, mechanical ventilation ≥ 96 h, and New York Heart Association (NYHA) class III or IV were predictors of the occurrence of GIC after CPB surgery, whereas higher perfusion pressure and aprotinin administration were protective factors. Conclusion: Gastrointestinal complications after CPB surgery can be predicted in the presence of the above risk factors. This study suggests that GIC may be reduced by maintaining a higher perfusion pressure and shortening the time on CPB and mechanical ventilation.
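
The two-step analysis described above (univariate screening followed by a multivariable model) can be sketched as below. This is illustrative only: the column names are hypothetical 0/1 indicators, and the sketch does not account for the matched case-control design.

```python
# Hypothetical sketch: univariate screening of candidate risk factors, then a
# multivariable logistic regression for GIC after CPB surgery.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

cases = pd.read_csv("cpb_gic_cases_controls.csv")  # cases and matched controls

candidates = ["cpb_time", "emergency_surgery", "low_cardiac_output",
              "age_ge_61", "ventilation_ge_96h", "nyha_class_3_4"]

# Step 1: univariate screening
for var in candidates:
    fit = smf.logit(f"gic ~ {var}", data=cases).fit(disp=False)
    print(var, "p =", round(fit.pvalues[var], 4))

# Step 2: multivariable model including the candidate factors
multi = smf.logit("gic ~ " + " + ".join(candidates), data=cases).fit(disp=False)
print(np.exp(multi.params))  # adjusted odds ratios
```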


2020 ◽  
Vol 35 (Supplement_3) ◽  
Author(s):  
Yang Xu ◽  
Marie Evans ◽  
Franz Peter Barany ◽  
Glen James ◽  
Katarina Hedman ◽  
...  

Abstract Background and Aims: Attaining the narrow hemoglobin range (10-12 g/dL) recommended by current ERBP renal anemia guidelines may be difficult, and whether doing so leads to better outcomes is not well known. This study aimed to identify patient and clinical factors associated with difficulties in maintaining hemoglobin target ranges in routine non-dialysis nephrology care. We also evaluated whether adherence to ERBP hemoglobin recommendations during pre-dialysis care predicted early post-dialysis outcomes. Method: Observational study from the Swedish Renal Registry including all patients with non-dialysis-dependent CKD stages 3b-5 developing renal anemia or initiating treatment (iron, ESA, or both) between 2012 and 2016. Through multinomial logistic regression with clustered variance, we identified clinical conditions associated with serum hemoglobin values outside the ERBP-recommended range (<10 and >12 g/dL) throughout all recorded patient visits until death, dialysis, or end of follow-up. For those who initiated dialysis, we calculated the proportion of patient-time in which hemoglobin was maintained within range (time in range [TIR]). We then explored associations between TIR and the subsequent one-year risk of death or MACE (a composite of death caused by CVD and non-fatal MI, stroke, and heart failure) with Cox proportional hazards regression. Results: A total of 8106 patients with CKD 3b-5 developed incident anemia in Sweden during 2012-2016, contributing 37,422 nephrology visits during a median 2 years of follow-up. In the multinomial logistic regression, being a man and having received iron or higher ESA doses were associated with hemoglobin values outside the target range. Patients with CKD 3b and 4, an ongoing transplant, a history of CVD, or higher serum calcium and albumin levels had higher odds of maintaining hemoglobin values above range. Conversely, recent bleeding or transfusions, nephrosclerosis, inflammation (CRP > 5 mg/dL), and higher phosphate levels increased the odds of having hemoglobin values below range. A total of 2435 patients initiated maintenance dialysis during the study period. Of those, 327 died and 701 developed MACE during the subsequent year. Their median TIR during the pre-dialysis period was 44% (IQR: 34-50). On a continuous scale (FIGURE), we observed worse outcomes for patients with poor adherence to the guideline recommendation (low percentage TIR), although the association was judged weak. On a categorical scale, patients who spent more than 40% of their pre-dialysis time in range had lower hazards of death (0.57, 95% CI 0.41-0.80) and MACE (0.67, 95% CI 0.54-0.84) compared with those with <44% TIR. Conclusion: This nationwide study reports that greater adherence to ERBP anemia guidelines during pre-dialysis care, using existing conventional therapeutic approaches, is associated with better post-dialysis outcomes. Whether active interaction by healthcare practitioners affected the observed relationship needs to be further explored.
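
A minimal sketch of the TIR-plus-survival workflow described above, using visit-level hemoglobin values as a simple approximation of patient-time in range and a Cox model via lifelines; the file names, column names, and 44% cut-off placement are assumptions, not the registry's actual analysis.

```python
# Hypothetical sketch: per-patient hemoglobin time in range (10-12 g/dL) from
# visit-level data, then association with post-dialysis mortality (Cox model).
import pandas as pd
from lifelines import CoxPHFitter

visits = pd.read_csv("hb_visits.csv")            # patient_id, hb_g_dl per pre-dialysis visit
outcomes = pd.read_csv("dialysis_outcomes.csv")  # patient_id, followup_years, death (0/1)

# Visit-level approximation of time in range
visits["in_range"] = visits["hb_g_dl"].between(10, 12)
tir = (visits.groupby("patient_id")["in_range"].mean()
             .rename("tir").reset_index())

df = outcomes.merge(tir, on="patient_id")
df["tir_ge_44"] = (df["tir"] >= 0.44).astype(int)  # dichotomize at the reported median

cph = CoxPHFitter()
cph.fit(df[["followup_years", "death", "tir_ge_44"]],
        duration_col="followup_years", event_col="death")
cph.print_summary()
```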


2020 ◽  
pp. 088626051989842
Author(s):  
Jane C. Daquin ◽  
Leah E. Daigle

Historically, criminologists have examined offending and victimization in the community as separate outcomes. Recently, however, researchers have begun to explore the shared commonalities of being an offender and a victim. The victim–offender overlap literature shows that victimization and offending are not distinct outcomes but rather share numerous risk factors. A close examination of the victim–offender overlap has not been undertaken within the prison literature, so it remains unclear whether there are commonalities among prisoners who offend while incarcerated and those who experience victimization. The focus of the current study is to (a) identify the proportion of prisoners who were victims only, offenders only, victim–offenders, or neither victims nor offenders and (b) identify the factors that predict membership in these four categories. The current study used the 2004 Survey of Inmates in State and Federal Correctional Facilities with multinomial logistic regression analyses to examine which factors are associated with membership in the victim-only, offender-only, or victim–offender groups in prison. Findings show that although the victim–offender overlap exists among prisoners, the majority of prisoners were neither victims nor offenders; victim–offenders and victims only comprised only a small proportion of the sample. Findings also indicate that there are few unique factors across the groups. Results of the study have implications for policy and future research.
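
A sketch of how the four-category outcome described above could be constructed from two binary indicators and modeled; the data file, indicator names, and covariates are hypothetical, and this is not the authors' code.

```python
# Hypothetical sketch: build the victim-offender overlap categories, then fit a
# multinomial logistic regression of group membership on covariates.
import pandas as pd
import statsmodels.api as sm

inmates = pd.read_csv("inmate_survey.csv")  # victimized, offended are 0/1 indicators

def overlap_group(row):
    """Assign each respondent to one of the four victim-offender categories."""
    if row["victimized"] and row["offended"]:
        return "victim_offender"
    if row["victimized"]:
        return "victim_only"
    if row["offended"]:
        return "offender_only"
    return "neither"

inmates["group"] = inmates.apply(overlap_group, axis=1)
print(inmates["group"].value_counts(normalize=True))

# Multinomial model: one coefficient set per category relative to the reference level
X = sm.add_constant(inmates[["age", "sentence_length", "prior_victimization"]].astype(float))
model = sm.MNLogit(inmates["group"], X).fit()
print(model.summary())
```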

