Association between antimicrobial drug class selection for treatment and retreatment of bovine respiratory disease and health, performance, and carcass quality outcomes in feedlot cattle

2020, Vol 98 (4)
Author(s): Johann F Coetzee, Natalia Cernicchiaro, Pritam K Sidhu, Michael D Kleinhenz

Abstract Treatment and control of bovine respiratory disease (BRD) is predicated on the use of two categories of antimicrobials: bacteriostatic drugs that inhibit bacterial growth and replication (STATIC), and bactericidal drugs that kill bacteria in in vitro culture systems (CIDAL). Recently, we reported that initial BRD treatment with a STATIC antimicrobial followed by retreatment with a CIDAL antimicrobial was associated with a higher frequency of multidrug-resistant bacteria isolated from field cases of BRD submitted to a veterinary diagnostic laboratory. The present study tested the hypothesis that calves administered the same class of antimicrobial for the first and second BRD treatments (i.e., CIDAL-CIDAL or STATIC-STATIC) would have improved health and performance outcomes at the feedlot compared to calves that received a different antimicrobial class for retreatment (i.e., STATIC-CIDAL or CIDAL-STATIC). The association between antimicrobial treatments and health, performance, and carcass quality outcomes was determined by a retrospective analysis of 4,252 BRD treatment records from a commercial feedlot operation collected from 2001 to 2005. Data were compared using generalized linear mixed statistical models that included sex, season, and arrival weight as covariates. The mean (±SE) probability of a BRD case requiring four or more treatments rather than three was greater in calves that received STATIC-CIDAL (73.58 ± 2.38%) or STATIC-STATIC (71.32 ± 2.52%) first and second antimicrobial treatments than in calves receiving CIDAL-CIDAL (50.35 ± 3.46%) first and second treatments (P < 0.001). Calves receiving CIDAL-CIDAL first and second treatments also had a greater average daily gain (1.11 ± 0.03 kg/d) than calves receiving STATIC-CIDAL (0.95 ± 0.03 kg/d) or STATIC-STATIC (0.84 ± 0.02 kg/d) treatments (P < 0.001).
Furthermore, CIDAL-CIDAL-treated calves had a higher probability of a Choice quality grade at slaughter (36.44 ± 4.80%) than STATIC-CIDAL calves (28.09 ± 3.88%) (P = 0.037). There was no effect of antimicrobial treatment combination on BRD mortality (P = 0.855) or yield grade (P = 0.240). These observations suggest that antimicrobial pharmacodynamics should be considered when selecting drugs for retreatment of BRD, with implications for developing BRD treatment protocols that address both post-treatment production and antimicrobial stewardship concerns.
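The abstract's key comparison, the probability that a case needed four or more treatments among cases treated at least three times, broken out by first/second antimicrobial class, can be sketched as a simple aggregation over treatment records. The record layout and function name below are hypothetical illustrations, not the study's actual data schema or analysis (which used generalized linear mixed models):

```python
from collections import Counter

def group_retreat_probability(records):
    """For each first/second class sequence (e.g. 'STATIC-CIDAL'), return the
    proportion of cases requiring >= 4 treatments, among cases treated >= 3
    times -- mirroring the comparison reported in the abstract.

    records: iterable of (first_class, second_class, n_treatments) tuples,
    where the classes are 'STATIC' or 'CIDAL'. Hypothetical layout.
    """
    totals, four_plus = Counter(), Counter()
    for first, second, n in records:
        if n < 3:  # the abstract's comparison conditions on >= 3 treatments
            continue
        group = f"{first}-{second}"
        totals[group] += 1
        if n >= 4:
            four_plus[group] += 1
    return {g: four_plus[g] / totals[g] for g in totals}

# Tiny illustrative input; real data were 4,252 feedlot treatment records.
example = [("CIDAL", "CIDAL", 3), ("CIDAL", "CIDAL", 4),
           ("STATIC", "CIDAL", 4), ("STATIC", "CIDAL", 3)]
probs = group_retreat_probability(example)
```

A full reanalysis would additionally adjust for sex, season, and arrival weight, as the study's mixed models did; this sketch shows only the raw proportion being modeled.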

2021, Vol 99 (Supplement_3), pp. 323-323
Author(s): Haley Yeatter, Beth B Kegley, Reagan N Cauble, Jana Reynolds, Ben P Shoulders, et al.

Abstract Citrus pulp is a source of flavonoids, which have been found to have antioxidant properties. The objective of this experiment was therefore to investigate the effects of feeding dried citrus pulp on the performance of newly received calves. Crossbred beef heifers (n = 254, initial body weight = 248 ± 5.9 kg) were obtained on 3 dates (block, 8 pens/block). Treatments were: 1) a corn- and distillers' grains-based receiving supplement (control), or 2) a receiving supplement that contained 20% dried citrus pulp (replacing a portion of the corn). Upon arrival from regional livestock markets, cattle had access to hay and water and rested overnight; they were processed the next day, when they received an identification tag, were vaccinated with a clostridial vaccine and a 5-way modified-live bovine respiratory viral vaccine, and were dewormed, weighed, branded, and ear notched for detection of persistent infection with bovine viral diarrhea virus. Each truckload was assigned randomly to pens, resulting in 8 pens with 9 to 12 heifers/pen, and pens were assigned randomly to 1 of the 2 treatments. Cattle were offered bermudagrass hay and water for ad libitum intake and up to 1.8 kg/day of their assigned receiving supplement. Overall average daily gain for the 42-day receiving period was greater (P < 0.01) for calves fed the supplement that included dried citrus pulp (1.01 kg/day) than for calves fed the control supplement (0.90 kg/day). However, the percentage of calves treated for clinical bovine respiratory disease was higher (P < 0.05) for calves fed the citrus pulp-containing supplement (14% morbidity) than for control calves (7% morbidity). Supplementation with dried citrus pulp improved growth performance but did not reduce the incidence of clinical bovine respiratory disease during the receiving period; clinical morbidity was low for both treatments, however.


2020
Author(s): C. Blakebrough-Hall, P. Hick, L.A. González

Abstract Background: Bovine respiratory disease (BRD) is the most significant disease affecting feedlot cattle. Indicators of BRD often used in feedlots, such as visual signs, rectal temperature, computer-assisted lung auscultation (CALA) score, the number of BRD treatments, presence of viral pathogens, viral seroconversion, and lung damage at slaughter, vary in their ability to predict an animal's BRD outcome, and no studies have been published determining how a combination of these indicators may define BRD outcome groups. The objectives of the current study were 1) to identify BRD outcome groups from indicators collected during the feeding phase and at slaughter using latent class analysis, and 2) to determine the importance of these indicators for predicting disease outcome. Animals with BRD (n = 127) were identified by visual signs and removed from production pens for further examination. Control animals displaying no visual signs of BRD (n = 143) were also removed and examined. Blood samples, nasal swab samples, and clinical measurements were collected. Lung and pleural lesions indicative of BRD were scored at slaughter. Latent class analysis was applied to identify possible outcome groups. Results: Three latent classes were identified in the best-fitting model, categorized as non-BRD, mild BRD, and severe BRD. Animals in the mild BRD group had a higher probability of visual signs of BRD than animals with severe BRD. Animals in the severe BRD group were more likely to require more than one treatment for BRD and to have a rectal temperature ≥ 40 °C, ≥ 10% total lung consolidation, and severe pleural lesions at slaughter. Animals in the severe BRD group were also more likely to be naïve at feedlot entry and first BRD pull and to have a positive nasal swab result for some BRD viruses. Lower overall average daily gain (ADG) was also associated with severe BRD (P < 0.001).
Conclusions: These results demonstrate that there are important indicators of BRD severity. Using this information to predict an animal's BRD outcome would greatly enhance treatment efficacy and aid in better management of animals at risk of severe BRD.
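Latent class analysis assigns each animal a posterior probability of class membership from its observed indicators, under the standard assumption that indicators are conditionally independent given the latent class. A minimal sketch of that posterior computation follows; the priors, conditional probabilities, and indicator choices are illustrative placeholders, not the study's fitted model:

```python
def posterior_class(indicators, priors, cond_probs):
    """Posterior P(class | binary indicators) under the LCA
    conditional-independence assumption.

    indicators:      list of 0/1 observations for one animal
    priors[k]:       prior probability of latent class k
    cond_probs[k][j]: P(indicator j = 1 | class k)
    """
    scores = []
    for prior, probs in zip(priors, cond_probs):
        likelihood = prior
        for x, p in zip(indicators, probs):
            likelihood *= p if x else (1.0 - p)
        scores.append(likelihood)
    total = sum(scores)
    return [s / total for s in scores]

# Toy 3-class example (non-BRD, mild, severe) with two indicators:
# [rectal temp >= 40 C, >= 10% lung consolidation]. Values are made up.
priors = [0.5, 0.3, 0.2]
cond = [[0.05, 0.05],   # non-BRD
        [0.40, 0.10],   # mild
        [0.80, 0.70]]   # severe
post = posterior_class([1, 1], priors, cond)  # both indicators positive
```

In a full LCA, these class-conditional probabilities and priors would themselves be estimated (typically by EM), and model fit compared across candidate numbers of classes, as the study did in selecting three.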


2019, Vol 97 (Supplement_3), pp. 196-197
Author(s): Autumn T Pickett, Jase Ball, Elizabeth Kegley, Ken Blue, Jacob A Hagenmaier, et al.

Abstract Crossbred male beef calves (n = 259; bulls = 134, steers = 125; body weight = 250 ± 3.4 kg), approximately 6 months of age and considered high-risk for developing bovine respiratory disease, arrived on 3 dates (block) and were stratified by arrival castrate status and weight to be evenly distributed across pens (8 pens/block; 9 to 12 calves/pen). Pens were randomly assigned to 1 of 2 treatments: 1) Nuplura PH (administration of a Mannheimia haemolytica leukotoxoid at processing) or 2) Control (no M. haemolytica leukotoxoid). All cattle received tilmicosin on d 0 with a 5-d post-metaphylactic interval. Body weights were recorded on d -1, 0, 14, 28, 41, and 42. Blood was collected on d -1, 14, 28, and 42, and sera were harvested to determine serum neutralization titers for bovine virus diarrhea (BVD) type I and anti-M. haemolytica leukotoxin antibodies. Calves were observed daily for signs of morbidity. Body weight and average daily gain were not affected (P ≥ 0.26) by treatment. The percentage of calves administered 1, 2, or 3 antibiotic treatments for clinical bovine respiratory disease did not differ (P ≥ 0.35). There was a tendency for mortality to be greater for Control than for Nuplura PH (1.6 vs. 0.0%; P = 0.10). Calves administered Nuplura PH had a greater antibody response against M. haemolytica leukotoxin on d 14, 28, and 42 than Control calves (P < 0.01). There was no treatment × day interaction for antibody titers against BVD (P = 0.98). In this small-pen study, the M. haemolytica leukotoxoid had no effect on growth performance or morbidity over the 42 d following receiving, but it tended to reduce mortality and did not interfere with the antibody response to BVD vaccination in high-risk, newly received calves treated metaphylactically with tilmicosin on arrival.


2020, Vol 4 (2), pp. 1091-1102
Author(s): Elliott J Dennis, Ted C Schroeder, David G Renter

Abstract This study's objective was to estimate net returns and return risk for antimicrobial metaphylaxis options to manage bovine respiratory disease (BRD) in high health-risk feedlot cattle. The effectiveness of antimicrobials for metaphylaxis varies by cattle population, but how that differing effectiveness translates into net return profitability for heterogeneous cattle populations is less well understood. Net returns and return risk were assessed using a net return simulation model adapted to allow for heterogeneity in high health-risk cattle placement characteristics and in antimicrobial choice to control BRD. The model incorporated how antimicrobials modify BRD health and performance outcomes, with those outcomes calibrated from published literature and proprietary feedlot data. The proprietary data came from 10 Midwestern feedlots representing nearly 6 million animals and 50,000 cohorts. Twelve placement-by-metaphylaxis combinations were assessed: 600 or 800 lb high health-risk steers placed in winter (Oct–Mar) or summer (Apr–Sept), managed with one of three health programs: "no metaphylaxis," an "Upper Tier" antimicrobial, or a "Lower Tier" antimicrobial. Net return distributions were compared between "no metaphylaxis" and a specific antimicrobial tier within specific cattle populations. The expected incremental net return of administering an "Upper Tier" ("Lower Tier") antimicrobial for metaphylaxis, compared to "no metaphylaxis," was $122.55 per head ($65.72) for 600 lb and $148.65 per head ($79.65) for 800 lb winter placements of high health-risk steers. The incremental expected net return and the risk mitigated by metaphylaxis varied by placement weight, season, and antimicrobial choice. The probability that net returns would decline by at least $50 per head was significantly reduced (from approximately 40% to 4%) when any antimicrobial was used on high health-risk steers.
Both tiers of antimicrobials used for metaphylaxis increased expected net returns and decreased net return variability relative to no metaphylaxis; feedlots thus faced less uncertainty and realized greater profit on high health-risk pens of steers when metaphylaxis was used. This occurred because the improvement in cattle health and performance outcomes from using any antimicrobial was sufficiently large to cover the added initial and subsequent antimicrobial costs. These results aid in assessing metaphylaxis strategies for high health-risk cattle.
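The abstract's risk metric, the probability that net returns fall by at least $50 per head with and without metaphylaxis, can be illustrated with a toy Monte Carlo draw from assumed net-return distributions. All means and standard deviations below are made-up placeholders, not the study's calibrated values; the point is only the shape of the comparison (higher mean, lower spread with metaphylaxis):

```python
import random

def simulate(mean, sd, n=20000, seed=7):
    """Draw n net-return outcomes ($/head) from an assumed normal distribution."""
    rng = random.Random(seed)
    return [rng.gauss(mean, sd) for _ in range(n)]

def prob_loss_at_least(draws, threshold=-50.0):
    """Share of simulated net returns at or below the loss threshold ($/head)."""
    return sum(x <= threshold for x in draws) / len(draws)

# Hypothetical distributions: no metaphylaxis vs. one antimicrobial tier.
no_meta = simulate(mean=0.0, sd=120.0)
with_meta = simulate(mean=120.0, sd=60.0, seed=8)
p_no = prob_loss_at_least(no_meta)
p_with = prob_loss_at_least(with_meta)
```

With these placeholder parameters the simulated loss probability drops sharply under metaphylaxis, qualitatively matching the abstract's reduction from roughly 40% to 4%; the study's actual model used calibrated, non-toy distributions and cost structures.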


Author(s): J A Cortes, S Hendrick, E Janzen, E A Pajor, K Orsel

Abstract Digital dermatitis has emerged in North American feedlots, although its production and economic impacts are not fully understood. The objectives of this study were to: 1) estimate the economic impact of a single case of digital dermatitis (DD), foot rot (FR), and bovine respiratory disease (BRD) in feedlot cattle; and 2) determine the impact of DD on average daily gain (ADG). Feedlot cattle health and production records were available from 2 feedlots over a 3-year interval. The dataset consisted of 77,115 animal records, with 19.3% (14,900) diagnosed with a disease. Diseased animals were categorized into 5 groups: DD, FR, BRD, other diseases (OT), and 2 or more diseases (TM), with treatment cumulative incidences of 6.0, 59.1, 10.7, 12.7, and 11.5%, respectively. Foot rot had the highest cumulative incidence in both heifers and steers (58.8 and 59.6%, respectively), and 48.1% of all diagnosed cases among fall-placed cattle were FR. Digital dermatitis affected the partial budget in 5 of the 8 groups of cattle, with the greatest impact seen in grass yearling heifers (GYH) and grass yearling steers (GYS): −$98 and −$96 CAD, respectively, relative to their healthy counterparts. Healthy cattle had a significantly higher ADG than DD cattle in 5 of 8 categories, with the difference ranging from 0.11 kg/d in winter-placed heifers to 0.17 kg/d in fall-placed steers. The economic analysis concluded that, on an individual-animal basis, BRD was the most impactful of the analyzed diseases, with DD second, underscoring the importance of controlling and mitigating this foot condition. Identifying the differential effects of diseases on the partial budget and on ADG across types of cattle stratified by sex enables feedlot producers to focus control and mitigation strategies on specific groups.
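The per-case economic impact described above combines lost gain (the ADG gap times days on feed, valued at a live price) with treatment cost. A minimal partial-budget sketch follows; the days on feed, price per kg, and treatment cost are hypothetical, and the study's actual partial budget included more line items:

```python
def partial_budget_impact(adg_healthy, adg_diseased, days_on_feed,
                          price_per_kg, treatment_cost):
    """Illustrative CAD impact of one disease case vs. a healthy penmate
    (negative = loss). Only lost live-weight gain and treatment cost are
    modeled; a full partial budget would include more items.
    """
    lost_gain_kg = (adg_healthy - adg_diseased) * days_on_feed
    return -(lost_gain_kg * price_per_kg + treatment_cost)

# Example: the 0.17 kg/d ADG gap reported for fall-placed steers, over an
# assumed 150 d on feed, $2.80/kg live price, and $25 treatment cost.
impact = partial_budget_impact(1.50, 1.33, 150, 2.80, 25.0)
```

This kind of per-group calculation is what lets the analysis rank diseases (BRD first, DD second) and target mitigation at the cattle categories where the gap is largest.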

