Effect of Increasing Corn Silage Inclusion in Finishing Diets with or Without Tylosin on Performance and Liver Abscesses

2020 ◽  
Vol 98 (Supplement_3) ◽  
pp. 31-32
Author(s):  
Hannah C Wilson ◽  
Bradley M Boyd ◽  
Levi J McPhillips ◽  
Andrea K Watson ◽  
James C MacDonald ◽  
...  

Abstract Elevating corn silage inclusion in finishing diets has been investigated, and results suggest that feeding more silage in farming and feeding operations improves profitability despite decreased average daily gain (ADG) and feed efficiency (G:F). Feeding more silage may also decrease liver abscesses and the need for antibiotic control of abscesses. A finishing study was conducted to assess the impact of silage inclusion in finishing diets on the incidence of liver abscesses in beef cattle. A total of 640 steers (BW = 334 ± 25 kg) were utilized in a 2 × 2 factorial treatment design with two levels of corn silage (15 and 45% of diet DM), with or without tylosin for control of abscesses. This study utilized 32 pens of cattle with 20 steers per pen and 8 pens per treatment. There was an interaction between silage and tylosin inclusion for liver abscesses (P = 0.05) and a tendency for an interaction for performance (P = 0.10), but not for carcass traits (P ≥ 0.20). Cattle fed 15% corn silage had the greatest incidence of liver abscesses (34.5%) compared to other treatments (P = 0.05), and the abscess rate decreased to 19% when tylosin was fed. Feeding 45% silage was also effective at lowering liver abscess rates, which were 12.4% regardless of whether an antibiotic was fed. Feeding corn silage at 45% of diet DM was as effective as feeding an antibiotic to cattle on 85% concentrate diets. Feeding corn silage at greater inclusions decreased ADG (P ≤ 0.01) but increased final body weight when cattle were fed to an equal fatness. However, feeding corn silage at 45% was more economical than feeding 15% corn silage, especially at higher corn prices, provided shrink is well managed. Feeding elevated concentrations of corn silage may be an economically viable method to control liver abscesses without antibiotic use.
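For readers who want to see how the reported silage × tylosin interaction is typically tested, the following is a minimal sketch assuming pen-level liver abscess incidence is summarized as one observation per pen; the data file and column names are hypothetical, not the authors' code.

```python
# Hypothetical sketch of a 2 x 2 factorial interaction test (silage x tylosin)
# on pen-level liver abscess incidence; not the authors' analysis code.
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

pens = pd.read_csv("pen_summary.csv")  # assumed: 32 rows, one per pen
# assumed columns: silage ("15" or "45"), tylosin ("yes" or "no"),
# abscess_pct (percentage of livers abscessed in the pen)

model = ols("abscess_pct ~ C(silage) * C(tylosin)", data=pens).fit()
print(sm.stats.anova_lm(model, typ=2))  # interaction row corresponds to the P = 0.05 reported above
```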

2017 ◽  
Vol 1 (3) ◽  
pp. 367-381 ◽  
Author(s):  
D. B. Burken ◽  
B. L. Nuttelman ◽  
J. L. Gramkow ◽  
A. L. McGee ◽  
K. M. Sudbeck ◽  
...  

Abstract Corn plants were sampled over 2 consecutive years to assess the effects of corn hybrid maturity class, plant population, and harvest time on whole corn plant quality and yield in Nebraska. A finishing experiment evaluated the substitution of corn grain with corn silage in diets containing corn modified distillers grains with solubles (MDGS). The first 2 harvest dates were at the mid- and late-silage harvest times, whereas the final harvest was at the grain harvest stage of plant maturity. Whole plant yields increased as harvest time progressed (yr 1 quadratic, P < 0.01; yr 2 linear, P < 0.01). However, differences in TDN concentration across harvest times were minimal in both years, because grain percentage increased while residue NDF in-situ disappearance decreased as harvest was delayed. In the finishing experiment, as corn silage inclusion increased from 15 to 55% (DM basis) by replacing dry-rolled and high-moisture corn grain in diets containing 40% MDGS, DMI, ADG, and G:F decreased linearly (P ≤ 0.01), with steers on the 15% corn silage treatment being 1.5%, 5.0%, and 7.7% more efficient than steers on the 30, 45, and 55% corn silage treatments, respectively. Calculated dietary NEm and NEg decreased linearly as corn silage inclusion increased, indicating that net energy values were greater for corn grain than for corn silage. In addition, dressing percentage decreased linearly (P < 0.01) as silage inclusion increased, suggesting greater gut fill at higher silage inclusions. Cattle fed more than 15% corn silage in finishing diets based on corn grain will gain more slowly, be slightly less efficient, and likely require more days on feed to reach similar carcass fatness and size. When 30% silage was fed with 65% MDGS, DMI and ADG were decreased (P < 0.01) compared to feeding 30% silage with 40% MDGS, suggesting some benefit to including a proportion of corn grain in the diet. Conversely, when 45% silage was fed with 40% MDGS, ADG and G:F were greater (P < 0.04) than when 45% silage was fed with grain alone, implying a greater energy value for MDGS than for corn grain. Substituting corn silage for corn grain in finishing diets decreased ADG and G:F, which would increase days on feed to an equal carcass weight; however, in this experiment, increasing corn silage levels with MDGS present reduced carcass fat thickness without significantly decreasing marbling score.
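As a rough illustration of the linear and quadratic dose responses described above (not the authors' SAS contrasts), one could regress pen-level G:F on silage inclusion level; the data frame and column names below are hypothetical.

```python
# Hypothetical sketch: linear and quadratic response of G:F to corn silage
# inclusion (15, 30, 45, 55% of diet DM), fitted on pen means.
import pandas as pd
import statsmodels.formula.api as smf

pens = pd.read_csv("silage_pens.csv")  # assumed columns: silage_pct, gain_to_feed

fit = smf.ols("gain_to_feed ~ silage_pct + I(silage_pct ** 2)", data=pens).fit()
print(fit.summary())  # a significant negative silage_pct coefficient indicates a linear decrease in G:F
```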


2019 ◽  
Vol 4 (1) ◽  
pp. 129-140 ◽  
Author(s):  
Jordan A Johnson ◽  
Brittney D Sutherland ◽  
John J McKinnon ◽  
Tim A McAllister ◽  
Gregory B Penner

Abstract The objective of this study was to evaluate the effects of silage source, cereal grain source, and their interaction on growth performance, digestibility, and carcass characteristics of finishing beef cattle. Using a completely randomized design within an 89-d finishing study, 288 steers were randomly assigned to 1 of 24 pens (12 steers/pen) with an average steer body weight (BW) within a pen of 464 ± 1.7 kg (mean ± SD). Diets were arranged in a 2 × 3 factorial with corn silage (CS) or barley silage (BS) included at 8% (dry matter [DM] basis). Within each silage source, diets contained dry-rolled barley grain (BG; 86% of DM), dry-rolled corn grain (CG; 85% of DM), or an equal blend of BG and CG (BCG; 85% of DM). Total tract digestibility of nutrients was estimated from fecal samples using near-infrared spectroscopy. Data were analyzed with pen as the experimental unit using the MIXED procedure of SAS with the fixed effects of silage, grain, and their two-way interaction. Carcass and fecal kernel data were analyzed with the GLIMMIX procedure using the same model. No interactions were detected between silage and grain source. Feeding CG increased (P < 0.01) DM intake by 0.8 and 0.6 kg/d relative to BG and BCG, respectively. Gain-to-feed ratio was greater (P = 0.04) for BG (0.172 kg/kg) than for CG (0.162 kg/kg) but did not differ from BCG (0.165 kg/kg). Furthermore, average daily gain (2.07 kg/d) and final body weight did not differ among treatments (P ≥ 0.25). Hot carcass weight (HCW) was 6.2 kg greater (372.2 vs. 366.0 kg; P < 0.01) and dressing percentage was 0.57 percentage units greater (59.53 vs. 58.96%; P = 0.04) for steers fed CS than for those fed BS. There was no effect of dietary treatment on the severity of liver abscesses (P ≥ 0.20), with 72.0% of carcasses having clear livers, 24.4% with minor liver abscesses, and 3.6% with severe liver abscesses. Digestibility of DM, organic matter, crude protein, neutral detergent fiber, and starch was greater for BG (P < 0.01) than for CG or BCG. As expected, grain source affected the appearance of grain kernels in the feces (P ≤ 0.04). Feeding CS increased the appearance of fractured corn kernels in the feces (P = 0.04), while feeding BS increased fiber appearance in the feces (P = 0.02). Current results indicate that, when grains were dry rolled, feeding BG improved performance and digestibility compared with CG and BCG. Even at a low inclusion level (8% of DM), CS improved carcass characteristics relative to BS.
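The dressing percentage result above follows directly from its definition, dressing % = hot carcass weight / live body weight × 100; a small worked example using the reported carcass means (and assuming the usual shrunk live weight basis) is shown below as a consistency check, not as study data.

```python
# Dressing percentage = HCW / live BW x 100; back-calculating live BW from the
# reported carcass means is a quick consistency check (illustrative only).
def live_bw_from_hcw(hcw_kg: float, dressing_pct: float) -> float:
    """Back-calculate live body weight (kg) from HCW and dressing percentage."""
    return hcw_kg / (dressing_pct / 100.0)

print(round(live_bw_from_hcw(372.2, 59.53), 1))  # corn silage: ~625.2 kg live BW
print(round(live_bw_from_hcw(366.0, 58.96), 1))  # barley silage: ~620.8 kg live BW
```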


2020 ◽  
Vol 10 (2) ◽  
pp. 25-41
Author(s):  
Thejaswini Karanth ◽  
Someswar Deb ◽  
Lal Ruatpuii Zadeng ◽  
Rajeswari Ramasamy ◽  
Teena Nazeem ◽  
...  

Objective: to assess the impact of pharmacist-assisted counselling in improving parental knowledge, attitude, and practice (KAP) toward antibiotic use in children. A prospective, educational interventional study was conducted in 200 subjects from randomly chosen communities in Bangalore. The investigators conducted door-to-door visits. Primary demographic data for parents and their children were collected using a standard case report form (CRF), and baseline KAP toward antibiotic use in children was obtained from parents using a validated questionnaire. When both parents were present, only one answered the questionnaire. Pharmacist-assisted, parent-centred interventional counselling was provided with the help of patient information leaflets (PILs). Follow-up and post-interventional KAP assessment were done two months after the baseline measurement. Changes in parental KAP toward antibiotic use in children were assessed by statistically comparing pretest and posttest responses. Parental knowledge of antibiotic use in children was medium to good in the baseline KAP assessment; however, for the majority of participating parents, the attitude and practice domains were not satisfactory. A statistically significant improvement was seen in parental KAP toward antibiotic use in children after the pharmacist-assisted interventional counselling. Thus, the intervention produced excellent improvement in the knowledge domain, whereas the improvements in attitude and practice were good and medium, respectively.


2020 ◽  
Vol 7 (Supplement_1) ◽  
pp. S116-S116
Author(s):  
Julia Sessa ◽  
Helen Jacoby ◽  
Bruce Blain ◽  
Lisa Avery

Abstract Background Measuring antimicrobial consumption data is a foundation of antimicrobial stewardship programs. There are data supporting antimicrobial scorecard utilization to improve antibiotic use in the outpatient setting, but there is a lack of data on the impact of an antimicrobial scorecard for hospitalists. Our objective was to improve antibiotic prescribing among the hospitalist service through the development of an antimicrobial scorecard. Methods The study was conducted in a 451-bed teaching hospital among 22 full-time hospitalists. The antimicrobial scorecard for 2019 was distributed in two phases. In October 2019, baseline antibiotic prescribing data (January – September 2019) were distributed. In January 2020, a second scorecard was distributed (October – December 2019) to assess the impact of the scorecard. The scorecard, distributed via e-mail to physicians, included: antibiotic days of therapy/1,000 patient care days (corrected for attending census), route of antibiotic prescribing (% intravenous (IV) vs. % oral (PO)), and percentage of patients prescribed piperacillin-tazobactam (PT) for greater than 3 days. Hospitalists received their data in rank order among their peers. Along with the antimicrobial scorecard, recommendations from the antimicrobial stewardship team were included to help hospitalists improve their antibiotic prescribing for these initiatives. Hospitalist demographics (years of practice and gender) were collected. Descriptive statistics were used to analyze pre and post data. Results Sixteen (16) of 22 (73%) hospitalists improved their antibiotic prescribing from pre- to post-scorecard (χ²(1) = 3.68, p = 0.055). The median antibiotic days of therapy/1,000 patient care days decreased from 661 pre-scorecard to 618 post-scorecard (p = 0.043). The median PT use greater than 3 days also decreased significantly, from 18% pre-scorecard to 11% post-scorecard (p = 0.0025). There was no change in the percentage of IV antibiotic prescribing and no correlation between years of experience or gender and antibiotic prescribing. Conclusion Providing antimicrobial scorecards to our hospitalist service resulted in a significant decrease in antibiotic days of therapy/1,000 patient care days and in PT prescribing beyond 3 days. Disclosures All Authors: No reported disclosures
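The scorecard's headline metric, antibiotic days of therapy per 1,000 patient care days, is a simple census normalization; the sketch below illustrates the calculation with made-up numbers (not study data).

```python
# Days of therapy (DOT) per 1,000 patient care days: the census-normalized
# antibiotic use metric reported on the scorecard. Inputs are illustrative.
def dot_per_1000_pd(days_of_therapy: float, patient_care_days: float) -> float:
    """Antibiotic days of therapy normalized per 1,000 patient care days."""
    return days_of_therapy * 1000.0 / patient_care_days

# e.g. a hospitalist with 198 antibiotic DOT over 300 attending patient care days
print(dot_per_1000_pd(198, 300))  # 660.0, comparable to the 661 pre-scorecard median
```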


2020 ◽  
Vol 7 (Supplement_1) ◽  
pp. S97-S97
Author(s):  
Christina M Kaul ◽  
Eric Molina ◽  
Donna Armellino ◽  
Mary Ellen Schilling ◽  
Mark Jarrett

Abstract Background Overutilization of antibiotics remains an issue in the inpatient setting. Moreover, many protocols geared toward curbing improper antibiotic use rely heavily on resource- and personnel-intensive interventions, so the potential for using the EMR to facilitate antibiotic stewardship remains largely unexplored. Methods We implemented a novel change to the ordering of certain antibiotics in our EMR: ceftriaxone, daptomycin, ertapenem, imipenem, meropenem, and piperacillin-tazobactam. When ordering one of these antibiotics, providers had to note a usage indication, which assigned a usage duration per our Antibiotic Stewardship Committee guidelines. Pre-intervention, manual discontinuation was required if a provider did not enter a duration. The intervention was enacted in August 2019 in 13 hospitals. Data were collected from January 2018 to February 2020. Antibiotic usage was reported monthly as a rate per 1,000 patient days. Monthly pre- and post-intervention rates were averaged, respectively. Paired samples t-tests were used to compare pre- and post-intervention rates per unit type per hospital. A p-value of less than 0.05 was considered significant. Units with minimal usage, defined by a pre- or post-intervention mean of 0, were excluded from analysis. [Figures: examples of ordering an antibiotic prior to and after the intervention] Results Ertapenem showed a statistically significant decrease in utilization in seven units at three hospitals. Piperacillin-tazobactam showed a decrease in utilization in 19 units at eight hospitals. Daptomycin showed a decrease in utilization in one unit. Significant decreases in the utilization of ceftriaxone, imipenem, and meropenem were not noted. [Figure: example of a statistically significant decrease in piperacillin-tazobactam utilization on a medical-surgical unit] Conclusion Our study showed a statistically significant decrease in the use of ertapenem, piperacillin-tazobactam, and daptomycin using a simple built-in EMR prompt that curtails provider error. This should allow for ease of integration, as the protocol does not require a host of resources for maintenance. Of note is the decreased utilization of piperacillin-tazobactam and ertapenem across multiple hospitals, most notably on the medical and surgical wards. Thus, use of the EMR without personnel-intensive protocols is a viable method for augmenting antibiotic stewardship in health systems. Disclosures All Authors: No reported disclosures
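A minimal sketch of the paired pre/post comparison described in the methods follows, assuming each unit within a hospital contributes one averaged pre-intervention and one averaged post-intervention rate; the numbers are illustrative, not study data.

```python
# Hypothetical paired samples t-test of averaged monthly antibiotic rates
# (per 1,000 patient days) before vs. after the EMR ordering change.
from scipy import stats

# one pre/post pair per unit in a single hospital (illustrative values)
pre_means = [112.0, 98.5, 121.3, 87.8, 109.9]
post_means = [96.4, 90.1, 101.7, 84.3, 88.6]

t_stat, p_value = stats.ttest_rel(pre_means, post_means)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")  # p < 0.05 would indicate a significant decrease
```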


2020 ◽  
Vol 7 (Supplement_1) ◽  
pp. S86-S86
Author(s):  
Ann F Chou ◽  
Yue Zhang ◽  
Makoto M Jones ◽  
Christopher J Graber ◽  
Matthew B Goetz ◽  
...  

Abstract Background About 30–50% of inpatient antimicrobial therapy is sub-optimal. Health care facilities have utilized various antimicrobial stewardship (AS) strategies to optimize appropriate antimicrobial use, improve health outcomes, and promote patient safety. However, little evidence exists to assess relationships between AS strategies and antimicrobial use. This study examined the impact of changes in AS strategies on antimicrobial use over time. Methods This study used data from the Veterans Affairs (VA) Healthcare Analysis & Informatics Group (HAIG) AS survey, administered at 130 VA facilities in 2012 and 2015, and antimicrobial utilization data from the VA Corporate Data Warehouse. Four AS strategies were examined: having an AS team, a feedback mechanism on antimicrobial use, infectious diseases (ID) attending physicians, and clinical pharmacists on wards. Change in each AS strategy was computed by taking the difference in the presence of that strategy in a facility between 2012 and 2015. The outcome was the difference between antimicrobial use per 1000 patient days in 2012–2013 and 2015–2016. Employing multiple regression analysis, changes in antimicrobial use were estimated as a function of changes in AS strategies, controlling for ID human resources and organizational complexity. Results Of the 4 strategies, only change in the availability of an AS team had an impact on antimicrobial use. Compared to facilities with no AS team at either time point, antibiotic use decreased by 63.9 uses per 1000 patient days in facilities that did not have an AS team in 2012 but implemented one by 2015 (p = 0.0183). Facilities that had an AS team at both time points decreased use by 62.2 uses per 1000 patient days (p = 0.0324). Conclusion The findings showed that AS teams reduced inpatient antibiotic use over time. While changes in having feedback on antimicrobial use and clinical pharmacists on wards were also associated with reduced antimicrobial use between 2012 and 2015, the differences were not statistically significant; these strategies may already be part of a comprehensive AS program and employed by AS teams. In further development of stewardship programs within healthcare organizations, the association between AS teams and antibiotic use should inform program design and implementation. Disclosures All Authors: No reported disclosures
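A minimal sketch of the change-on-change regression described above follows; it is not the authors' model, and the facility-level file, variable names, and coding of the change categories are assumptions.

```python
# Hypothetical sketch: change in antimicrobial use (2015-16 minus 2012-13,
# per 1,000 patient days) regressed on changes in AS strategies, with
# facility-level controls. Not the authors' code; names are assumed.
import pandas as pd
import statsmodels.formula.api as smf

fac = pd.read_csv("facility_changes.csv")  # assumed: one row per VA facility
# assumed columns: delta_use, plus categorical change indicators such as
# as_team_change / feedback_change / id_attending_change / pharmacist_change
# (e.g. "never", "adopted", "always"), and controls id_staffing, complexity

fit = smf.ols(
    "delta_use ~ C(as_team_change, Treatment('never')) + C(feedback_change) "
    "+ C(id_attending_change) + C(pharmacist_change) + id_staffing + complexity",
    data=fac,
).fit()
print(fit.summary())  # the 'adopted' AS-team coefficient would correspond to ~-63.9 uses/1,000 patient days
```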


Livestock ◽  
2021 ◽  
Vol 26 (4) ◽  
pp. 176-179
Author(s):  
Chris Lloyd

The Responsible Use of Medicines in Agriculture Alliance (RUMA) was established to promote the highest standards of food safety, animal health and animal welfare in the British livestock industry. Its current focus is to deliver on the Government objective of identifying sector-specific targets for the reduction, refinement or replacement of antibiotics in animal agriculture. The creation and roll-out of sector-specific targets in 2017 through the RUMA Targets Task Force has helped focus activity across the UK livestock sectors to achieve a 50% reduction in antibiotic use since 2014. This has been realised principally through voluntary multi-sector collaboration, cross-sector initiatives, codes of practice, industry body support and farm assurance schemes. This article provides an overview of RUMA's work to date, offering insight into the methods used to create the targets, why they are so important, the impact they are having, and how ongoing support and robust data are vital to achieving the latest set of targets.


2020 ◽  
Vol 7 (Supplement_1) ◽  
pp. S87-S87
Author(s):  
Ebbing Lautenbach ◽  
Keith W Hamilton ◽  
Robert Grundmeier ◽  
Melinda M Neuhauser ◽  
Lauri Hicks ◽  
...  

Abstract Background Although most antibiotic use occurs in outpatients, antibiotic stewardship programs (ASPs) have primarily focused on inpatients. A major challenge for outpatient ASPs is the lack of accurate and accessible electronic data to target interventions. We developed and validated an electronic algorithm to identify inappropriate antibiotic use for adult outpatients with acute pharyngitis. Methods In the University of Pennsylvania Health System, we used ICD-10 diagnostic codes to identify patient encounters for acute pharyngitis at outpatient practices between 3/15/17 and 3/14/18. Exclusion criteria included immunocompromising conditions, comorbidities, and concurrent infections that might require antibiotic use. We randomly selected 300 eligible subjects. Inappropriate antibiotic use determined by chart review served as the basis for assessment of the electronic algorithm, which was constructed using only data in the electronic health record (EHR). Criteria for appropriate prescribing, choice of antibiotic, and duration were positive streptococcal testing, use of penicillin/amoxicillin (absent β-lactam allergy), and a maximum duration of 10 days of therapy, respectively. Results Of the 300 subjects, the median age was 42, 75% were female, 64% were seen by internal medicine (vs. family medicine), and 69% were seen by a physician (vs. an advanced practice provider). On chart review, 127 (42%) subjects received an antibiotic, of whom 29 had a positive streptococcal test and 4 had another appropriate indication. Thus, 74% (94/127) of patients received antibiotics inappropriately. Of the 29 patients for whom prescribing was appropriate, 27 (93%) received an appropriate antibiotic. Finally, of the 29 patients who were appropriately treated, 29 (100%) received the correct duration. Test characteristics of the EHR algorithm (compared to chart review) are noted in the table. Conclusion Inappropriate antibiotic prescribing for acute pharyngitis is common. An electronic algorithm for identifying inappropriate prescribing, antibiotic choice, and duration is highly accurate. This algorithm could be used to efficiently assess prescribing among practices and individual clinicians. The impact of interventions based on this algorithm should be tested in future work. [Table: test characteristics of the electronic algorithm for inappropriate prescribing, agent, and duration] Disclosures All Authors: No reported disclosures
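To make the rule structure concrete, a minimal sketch of the decision logic stated in the methods is given below; it is an illustration of the published criteria only, not the validated algorithm, and the encounter fields are hypothetical EHR extracts.

```python
# Hypothetical rule logic mirroring the stated criteria: prescribing is
# appropriate only with a positive streptococcal test; the agent is appropriate
# if it is penicillin/amoxicillin (or a beta-lactam allergy is documented);
# duration is appropriate at <= 10 days. Not the validated algorithm.
from dataclasses import dataclass
from typing import Optional

@dataclass
class PharyngitisEncounter:
    antibiotic: Optional[str]   # e.g. "amoxicillin"; None if no antibiotic ordered
    strep_test_positive: bool
    beta_lactam_allergy: bool
    duration_days: int = 0

def assess(enc: PharyngitisEncounter) -> dict:
    """Classify an encounter against the prescribing, agent, and duration criteria."""
    if enc.antibiotic is None:
        return {"prescribed": False}
    appropriate_prescribing = enc.strep_test_positive
    appropriate_agent = (
        enc.antibiotic in ("penicillin", "amoxicillin") or enc.beta_lactam_allergy
    )
    appropriate_duration = enc.duration_days <= 10
    return {
        "prescribed": True,
        "appropriate_prescribing": appropriate_prescribing,
        "appropriate_agent": appropriate_prescribing and appropriate_agent,
        "appropriate_duration": appropriate_prescribing and appropriate_duration,
    }

print(assess(PharyngitisEncounter("azithromycin", False, False, 5)))
# -> flagged as inappropriate prescribing (no positive streptococcal test)
```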


Author(s):  
Elad Keren ◽  
Abraham Borer ◽  
Lior Nesher ◽  
Tali Shafat ◽  
Rivka Yosipovich ◽  
...  

Abstract Objective: To determine whether a multifaceted approach effectively influenced antibiotic use in an orthopedics department. Design: Retrospective cohort study comparing the readmission rate and antibiotic use before and after an intervention. Setting: A 1,000-bed, tertiary-care, university hospital. Patients: Adult patients admitted to the orthopedics department between January 2015 and December 2018. Methods: During the preintervention period (2015–2016), 1 general orthopedic department was in operation. In the postintervention period (2017–2018), 2 separate departments were created: one designated for elective “clean” surgeries and another that included a “complicated wound” unit. A multifaceted strategy including infection prevention measures and introducing antibiotic stewardship practices was implemented. Admission rates, hand hygiene practice compliance, surgical site infections, and antibiotic treatment before versus after the intervention were analyzed. Results: The number of admissions and hospitalization days in the 2 periods did not change. Seven-day readmissions per annual quarter decreased significantly from the preintervention period (median, 7 days; interquartile range [IQR], 6–9) to the postintervention period (median, 4 days; IQR, 2–7; P = .038). Hand hygiene compliance increased and surgical site infections decreased in the postintervention period. Although total antibiotic use was not reduced, there was a significant change in the breakdown of the different antibiotic classes used before and after the intervention: increased use of narrow-spectrum β-lactams (P < .001) and decreased use of β-lactamase inhibitors (P < .001), third-generation cephalosporins (P = .044), and clindamycin (P < .001). Conclusions: Restructuring the orthopedics department facilitated better infection prevention measures accompanied by antibiotic stewardship implementation, resulting in a decreased use of broad-spectrum antibiotics and a significant reduction in readmission rates.


2021 ◽  
Vol 99 (Supplement_1) ◽  
pp. 139-140
Author(s):  
Frédéric A Vangroenweghe

Abstract Post-weaning Escherichia coli diarrhea (PWD) remains a major cause of economic losses for the pig industry. PWD, caused by enterotoxigenic E. coli (ETEC), typically provokes mild to severe watery diarrhea 5–10 days after weaning. Recently, an oral live bivalent E. coli F4/F18 vaccine (Coliprotec® F4/F18; Elanco) was approved on the European market, which reduces the impact of PWD provoked by F4-ETEC and F18-ETEC. The objective was to compare technical results and antibiotic use following E. coli F4/F18 vaccination with the previous standard therapeutic approach under field conditions. A 1600-sow farm (weaning at 26 days) with diagnosed problems of PWD due to F18-ETEC was selected. Piglets were vaccinated at 21 days of age with the oral live bivalent E. coli F4/F18 vaccine. At weaning, no standard group medication (ZnO and antibiotics) was applied for prevention of PWD. Several performance parameters were collected: treatment incidence (TI100), mortality and days in nursery. Statistical analysis was performed using JMP 14.0 (comparison of means). Oral E. coli F4/F18 vaccination significantly reduced TI100 (from 7 ± 2 days to 0 ± 1 days; P < 0.05). Mortality rate remained stable (2.05% in the control group vs. 1.96% in the vaccinated group; P < 0.05). Days in nursery (40 ± 3 days) remained at the same level compared to pre-vaccination. The results show that live E. coli F4/F18 vaccination against PWD led to similar technical performance parameters and mortality, in combination with a significant reduction in medication use. In conclusion, control of PWD through oral vaccination is a successful option to protect piglets from the negative clinical outcomes of F18-ETEC infection during the post-weaning period.

