PSI-5 Effects of rest period prior to processing on anthelmintic response during the receiving period in feedlot heifers

2020 · Vol 98 (Supplement_3) · pp. 225-226
Author(s):  
Nicole Stafford ◽  
A J Tarpoff ◽  
Miles Theurer ◽  
Tom Jones ◽  
Cassandra K Jones

Abstract A major stressor affecting feedlot cattle performance is transportation. Knowing when to process cattle and how much rest to provide prior to processing may improve cattle health and response to processing procedures such as anthelmintic administration. The goal of this project was to determine the impact of time of rest prior to initial processing on parasite prevalence during the receiving period. Eighty mixed-breed heifers (250 ± 4.2 kg BW) were purchased at live auction in Oklahoma City, OK and transported to the Kansas State University Beef Cattle Research Center in Manhattan. Heifers were allotted in a completely randomized design to one of four treatments and processed at 0, 6, 24, or 48 hours after arrival. At processing (d 0), fecal samples were collected, and cattle were injected subcutaneously with 1.0 mL/50 kg BW moxidectin (Cydectin®, Bayer Animal Health, Shawnee Mission, KS) and dosed orally with 1.0 mL/50 kg BW oxfendazole (Synanthic®, Boehringer Ingelheim, St. Joseph, MO). Fecal samples were collected again on d 14 and analyzed by the Kansas State University Veterinary Diagnostic Laboratory using qualitative and quantitative fecal flotation. Time of processing did not impact (P > 0.05) any measured response criteria. On d 0 there was a high prevalence of fecal parasites, which was significantly reduced by d 14 (94.5% vs. 23.1% of cattle with detected fecal parasites on d 0 vs. d 14, respectively). Semi-quantitative density was highest on d 0 for strongyle and Eimeria parasites and was significantly reduced (P < 0.05) by d 14 (315 vs. 2 and 155 vs. 6.5 eggs/g of feces on d 0 vs. d 14 for strongyles and Eimeria, respectively). In summary, time of rest prior to processing had no detected impact on anthelmintic response, but the dual injectable/oral protocol used in this experiment was highly effective at reducing parasite levels within 2 weeks of administration.
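The 1.0 mL/50 kg BW dosing rate described above is simple arithmetic, and the short Python sketch below illustrates it. The body weights in the example are hypothetical, not values from the study.

```python
# Illustrative dose-volume calculation for a 1.0 mL per 50 kg BW rate.
# Example body weights are hypothetical.

def dose_volume_ml(body_weight_kg: float, ml_per_50kg: float = 1.0) -> float:
    """Return the dose volume (mL) for a given body weight at a mL/50 kg rate."""
    return ml_per_50kg * body_weight_kg / 50.0

if __name__ == "__main__":
    for bw in (230.0, 250.0, 270.0):  # hypothetical heifer body weights, kg
        vol = dose_volume_ml(bw)
        print(f"{bw:.0f} kg BW -> {vol:.1f} mL moxidectin SC and {vol:.1f} mL oxfendazole orally")
```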

2020 · Vol 98 (Supplement_3) · pp. 152-152
Author(s):  
Zachary Buessing ◽  
A J Tarpoff ◽  
Miles Theurer ◽  
Tom Jones ◽  
Cassandra K Jones

Abstract Cattle feeders work to decrease the severity of transport stress. The objective of this experiment was to determine how time of rest prior to processing impacts subsequent performance of feedlot heifers during the receiving period. Eighty mixed-breed heifers (250 ± 4.2 kg BW) were purchased at live auction in Oklahoma City, OK and transported to Kansas State University in Manhattan. Heifers were allotted in a completely randomized design to one of four treatments and processed at 0, 6, 24, or 48 hours after arrival. After all cattle were processed, they were placed in individual pens, where DMI and refusals were recorded daily and health outcomes were evaluated twice daily. Cattle were weighed individually on d 0, 14, and 35. Data were analyzed using the GLIMMIX procedure of SAS (v. 9.4, Cary, NC). Time of processing did not impact (P > 0.10) heifer body weight or ADG. Overall, there was an inverse linear relationship between DMI and time of rest (P = 0.027) from d 0 to 14, and the same pattern was observed from d 0 to 35 (P = 0.027). Time of rest prior to processing impacted (P = 0.038) the proportion of heifers that reached a target DMI of 2.5% of BW by 14 days after arrival (25, 60, 53, and 24% of cattle with 0, 6, 24, or 48 hours of rest, respectively). While G:F and morbidity did not differ among treatments (P > 0.10), mortality increased linearly (P = 0.026) with increasing time of rest. This study suggests that allowing feedlot heifers to rest for more than 24 hours after arrival before processing may negatively affect subsequent DMI without substantially altering body weight or ADG in calves fed in individual pens; additional research in traditional group-housed feedlot pens and environments is warranted to further evaluate these effects.
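The 2.5%-of-BW intake target can be checked with a few lines of code. The sketch below, using entirely hypothetical intake records, flags whether each heifer's d 14 DMI reaches 2.5% of its body weight and summarizes the proportion by rest treatment; only the 2.5% threshold comes from the abstract.

```python
# Minimal sketch of the 2.5%-of-BW intake target: flag whether each heifer's
# DMI reaches 2.5% of its body weight by d 14 and summarize by rest treatment.
# All records are hypothetical.

from collections import defaultdict

TARGET_FRACTION = 0.025  # 2.5% of body weight

records = [
    # (rest_hours, body_weight_kg, dmi_kg_per_day_on_d14) -- made-up values
    (0, 248, 5.8), (0, 252, 6.5),
    (6, 250, 6.6), (6, 246, 6.4),
    (24, 255, 6.7), (24, 249, 5.9),
    (48, 251, 5.7), (48, 247, 6.0),
]

reached = defaultdict(list)
for rest_h, bw, dmi in records:
    reached[rest_h].append(dmi >= TARGET_FRACTION * bw)

for rest_h in sorted(reached):
    flags = reached[rest_h]
    print(f"{rest_h:>2} h rest: {100 * sum(flags) / len(flags):.0f}% of heifers at target")
```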


2020 · Vol 98 (Supplement_3) · pp. 224-225
Author(s):  
Macie Reeb ◽  
A J Tarpoff ◽  
Miles Theurer ◽  
Tom Jones ◽  
Cassandra K Jones

Abstract It is common for feedyards to rest cattle after transport and prior to administration of receiving vaccinations in an attempt to minimize transport stress and optimize vaccine efficacy and subsequent growth and performance. This study tested how time of rest after feedlot heifer arrival impacts vaccine titer and blood metabolites indicative of cattle health. Eighty mixed-breed heifers (250 ± 4.2 kg BW) were purchased at live auction in Oklahoma City, OK and transported to Kansas State University in Manhattan. Heifers were allotted in a completely randomized design to one of four treatments and processed at 0, 6, 24, or 48 hours after arrival. At the time of processing (d 0) and again on d 35, serum samples were collected and analyzed for infectious bovine rhinotracheitis (IBR) titer, a large animal chemistry panel, and a hepatic profile by the Kansas State Veterinary Diagnostic Laboratory. There were time × treatment interactions (P < 0.02) for serum IBR titer, glucose, urea nitrogen, and bicarbonate. Specifically, heifers had greater (P < 0.05) IBR titers on d 35 than on d 0, as expected because they were vaccinated upon arrival. On d 0, cattle held for 48 hours prior to processing had greater (P < 0.05) glucose than those held for 24 hours (108 vs. 68 mg/dL, respectively), with no differences among processing treatments by d 35. Urea nitrogen levels in cattle held for 6, 24, or 48 h prior to processing were greater (P < 0.05) than in those held for 0 hours on d 0 or for any length of time by d 35. Finally, cattle held for 0 or 48 h prior to processing had lower (P < 0.05) bicarbonate levels on d 0 than those held for 6 or 24 hours by d 35. These data suggest that while initial rest may help normalize blood metabolites, cattle should be processed within 24 hours of arrival; however, most blood metabolite levels normalized within 35 days.


2012 · Vol 32 (5) · pp. 419-423
Author(s):  
Luis E. Fazzio ◽  
Nicolas Yacachury ◽  
Walter R. Galvan ◽  
Elias Peruzzo ◽  
Ricardo O. Sánchez ◽  
...  

The aim was to evaluate, over 75 days, the production impact of the residual burden of ivermectin (IVM)-resistant parasites in naturally infected feedlot calves. The herds came from tick-infested cattle-breeding areas where the systematic use of IVM for tick control selects for gastrointestinal parasites resistant to this drug. The investigation was carried out in two commercial feedlots in Buenos Aires province. On day 0, two groups of 35 animals each were formed in feedlot A and in feedlot B; in each feedlot, one group received IVM 1% and the other received ricobendazole (RBZ) 10%. Fecal samples were taken from each animal on days 0, 22, 54, and 75 post-treatment (PT), and body weight was recorded. Fecal samples were processed for individual counts of eggs per gram (EPG), and a pooled fecal culture was carried out at each sampling to identify the parasite genera present. The fecal egg count reduction test (FECR) was calculated on day 22 PT. The study used a completely randomized block design, with commercial feedlot and sex as blocking variables; data were analyzed with a mixed model in the SAS statistical program. The mean FECR on day 22 was 28.4% in the IVM group and 94.2% in the RBZ group. From that date on, significant differences in EPG persisted until day 54, and EPG counts were equal only near the end of the trial, on day 75 (p = 0.16). In both commercial feedlots, especially in the IVM group, Cooperia spp. was the most prevalent parasite in the fecal cultures. A significant difference in weight (P < 0.01) on post-treatment day 75 was found between the RBZ and IVM groups (246 vs. 238 kg, respectively), which represents a difference of 8.3% in weight gain. The results demonstrate the production importance of anthelmintic treatment failure in commercial feedlots and emphasize the need for post-treatment monitoring to evaluate the efficacy of the antiparasitic drugs administered.
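The fecal egg count reduction test mentioned above is, in one common arithmetic-mean formulation, FECR% = 100 × (1 − mean EPG post-treatment / mean EPG pre-treatment). The abstract does not state which FECR variant the authors used, and the EPG values in the sketch below are hypothetical.

```python
# Illustrative fecal egg count reduction (FECR) calculation using the
# arithmetic-mean pre/post formulation. All EPG values are hypothetical.

def fecr_percent(epg_pre: list, epg_post: list) -> float:
    """Percent reduction in mean eggs per gram from pre- to post-treatment."""
    mean_pre = sum(epg_pre) / len(epg_pre)
    mean_post = sum(epg_post) / len(epg_post)
    return 100.0 * (1.0 - mean_post / mean_pre)

if __name__ == "__main__":
    ivm_pre, ivm_post = [420, 380, 510, 300], [310, 250, 360, 240]   # hypothetical
    rbz_pre, rbz_post = [400, 450, 390, 360], [20, 30, 10, 35]       # hypothetical
    print(f"IVM group FECR on d 22: {fecr_percent(ivm_pre, ivm_post):.1f}%")
    print(f"RBZ group FECR on d 22: {fecr_percent(rbz_pre, rbz_post):.1f}%")
```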


2019 · Vol 97 (Supplement_3) · pp. 141-141
Author(s):  
Raghavendra Amachawadi ◽  
Xiaorong Shi ◽  
LeighAnn George ◽  
Miles Theurer ◽  
Twig Marston ◽  
...  

Abstract Shiga toxin-producing E. coli (STEC) belonging to serogroups O26, O45, O103, O111, O121, O145, and O157, collectively called the 'top-7', are major foodborne pathogens. Cattle are a major reservoir: STEC colonize the hindgut and are shed in the feces, which is a major source of food contamination. Our objective was to evaluate the impact of a proprietary yeast-based synbiotic product (prebiotic and probiotic; Alltech, Inc., Nicholasville, KY) on fecal shedding of top-7 STEC in feedlot cattle. Twenty existing pens, housing 40–112 steers per pen with an estimated 60 to 90 days to slaughter, were randomly assigned to a control group or a treatment group that received 22 g of the synbiotic product per steer per day, as a top dress, in a finishing diet. Twenty pen-floor fecal samples were collected from each pen on days 0, 21, 42, and 54. Fecal samples were enriched and subjected to a multiplex PCR assay targeting serogroup-specific genes for the top-7 STEC and three major virulence genes: stx1 (Shiga toxin 1), stx2 (Shiga toxin 2), and eae (intimin). Bivariate descriptive statistics for the major serogroups and virulence genes were assessed prior to multivariable analysis using mixed effects logistic regression. The overall prevalence of the top-7 serogroups was 44.5% for O26, 41.3% for O157, 15.1% for O103, 13.7% for O45, 7.8% for O121, and 0.6% for O111. The overall prevalences of stx1, stx2, and eae were 43.9%, 70.8%, and 49%, respectively. E. coli O26, O157, and O45 had a significant treatment × sampling day interaction (P < 0.0001). On d 42, fecal samples from the treated group had lower prevalence (P < 0.01) of O26, O103, and O45 compared with the control group. In conclusion, in-feed administration of the synbiotic product appears to reduce fecal shedding of certain top-7 STEC serogroups in feedlot cattle.
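Converting per-sample multiplex PCR calls into the serogroup prevalences reported above is a straightforward tally. The sketch below illustrates it with hypothetical sample results; only the serogroup names come from the text.

```python
# Minimal sketch of turning per-sample multiplex PCR serogroup calls into
# prevalence estimates. The sample results are hypothetical.

from collections import Counter

TOP7 = ("O26", "O45", "O103", "O111", "O121", "O145", "O157")

# Each entry is the set of serogroup-specific genes detected in one enriched
# pen-floor fecal sample (hypothetical data).
samples = [
    {"O26", "O157"}, {"O26"}, set(), {"O103", "O45"},
    {"O157"}, {"O26", "O121"}, set(), {"O157", "O26"},
]

counts = Counter(sg for s in samples for sg in s)
n = len(samples)
for sg in TOP7:
    print(f"{sg}: {100 * counts[sg] / n:.1f}% of samples positive")
```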


2020 · Vol 98 (Supplement_3) · pp. 221-221
Author(s):  
Yuzhi Li ◽  
Alexander Hernandez ◽  
Rick Carr ◽  
Shelby Dukes ◽  
Maria Lou ◽  
...  

Abstract Swine parasites present challenges for organic pig farmers and can compromise animal health because the use of synthetic anthelmintics is not allowed. The objective of this study was to investigate the prevalence and fecal egg counts (FEC) of three intestinal parasites (Ascaris, Trichuris, and Oesophagostomum) on organic pig farms in the United States. Nine farms across 4 states were investigated. Pigs on all farms were raised in non-confinement facilities with access to the outdoors or pasture (except for one farm that housed sows in a hoop barn), and no synthetic anthelmintics were used from birth to market weight for growing/finishing pigs or from the third trimester of gestation for sows. Herd size varied from 12 to 416 (median = 50) pigs. Four to 16 fecal samples were collected from each pen or pasture. A total of 186 samples were analyzed for FEC using the concentrated McMaster technique to yield eggs per gram (epg) of feces. Pigs were categorized as breeders (gestating sows of all parities and boars), growing pigs (2 to 5 months old), or finishing pigs (5 months old to market weight). Results indicate that 56%, 89%, and 44% of farms were infected with Oesophagostomum, Ascaris, and Trichuris, respectively. Overall, breeders on infected farms had higher (P = 0.01) FEC of Oesophagostomum (1,115 epg ± 1,647 SD) than growing pigs (60 epg ± 9.5 SD) and finishing pigs (237 epg ± 234 SD). Growing and finishing pigs had higher (P < 0.001) FEC of Ascaris (1,733 epg ± 1,208 SD for growing pigs; 1,162 epg ± 630 SD for finishing pigs) than breeders (5 epg ± 2.5 SD). Trichuris FEC was relatively low (< 80 epg for pigs in all production stages) compared with the other parasites. These results suggest that swine parasite infection is common on organic/alternative farms and that strategies to control parasites need to be developed.
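McMaster-type egg counts convert a raw chamber count to eggs per gram by a protocol-specific multiplication factor. The sketch below uses a factor of 50, which is common for the standard technique; the actual factor for the concentrated variant used here, and the counts shown, are assumptions rather than details from the study.

```python
# Illustrative McMaster-style egg count: eggs counted in the chamber grid(s)
# are multiplied by a detection-limit factor to give eggs per gram (epg).
# The factor of 50 and the raw counts below are assumptions, not the
# authors' protocol.

def mcmaster_epg(eggs_counted: int, multiplication_factor: int = 50) -> int:
    """Convert a raw chamber egg count to eggs per gram of feces."""
    return eggs_counted * multiplication_factor

if __name__ == "__main__":
    for label, raw in (("Ascaris", 24), ("Oesophagostomum", 9), ("Trichuris", 1)):
        print(f"{label}: {mcmaster_epg(raw)} epg")
```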


2019
Author(s):  
Johann F. Coetzee ◽  
Drew R. Magstadt ◽  
Lendie Follett ◽  
Pritam K. Sidhu ◽  
Adlai M. Schuler ◽  
...  

Abstract Although 90% of BRD relapses are reported to receive retreatment with a different class of antimicrobial, studies examining the impact of antimicrobial selection (i.e., bactericidal or bacteriostatic) on retreatment outcomes and the emergence of antimicrobial resistance (AMR) are deficient in the published literature. A survey was conducted to determine the association between antimicrobial class selection for retreatment of BRD relapses and the antimicrobial susceptibility of Mannheimia haemolytica, Pasteurella multocida, and Histophilus somni. Pathogens were isolated from samples submitted to the Iowa State University Veterinary Diagnostic Laboratory from January 2013 to December 2015. A total of 781 isolates with corresponding animal case histories, including treatment protocols, were included in the analysis. Original susceptibility testing of these isolates for ceftiofur, danofloxacin, enrofloxacin, florfenicol, oxytetracycline, spectinomycin, tilmicosin, and tulathromycin was performed using Clinical and Laboratory Standards Institute guidelines. Data were analyzed using a Bayesian approach to evaluate whether retreatment with antimicrobials of different mechanistic classes (bactericidal or bacteriostatic) increased the probability of isolating resistant BRD pathogens from calves. The calculated posterior distribution suggests that an increased number of treatments is associated with a greater probability of isolates resistant to at least one antimicrobial. In addition, the frequency of resistant M. haemolytica isolates was greater with retreatment using antimicrobials of different mechanistic classes than with retreatment using the same class. Specifically, treatment protocols using a bacteriostatic drug first followed by retreatment with a bactericidal drug were associated with a higher frequency of resistant BRD pathogen isolation. This effect was more pronounced with specific treatment combinations; tulathromycin (bacteriostatic) followed by ceftiofur (bactericidal) was associated with the highest probability of resistant isolates among all antimicrobial combinations. These findings suggest that the selection of antimicrobial mechanistic class for retreatment of BRD should be considered as part of an antimicrobial stewardship program.
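As a rough, much-simplified stand-in for the Bayesian analysis described above (not the authors' model), the sketch below computes a Beta-Binomial posterior for the probability that an isolate is resistant to at least one antimicrobial, stratified by number of prior treatments. The counts and the flat Beta(1, 1) prior are hypothetical.

```python
# Simplified Beta-Binomial posterior for P(isolate resistant to >= 1
# antimicrobial) by number of prior treatments. Counts and prior are
# hypothetical; this is not the authors' model.

import random

random.seed(0)

def posterior_summary(resistant, total, a0=1.0, b0=1.0, draws=20000):
    """Posterior mean and 95% credible interval for a Beta-Binomial model."""
    a, b = a0 + resistant, b0 + (total - resistant)
    samples = sorted(random.betavariate(a, b) for _ in range(draws))
    lo, hi = samples[int(0.025 * draws)], samples[int(0.975 * draws)]
    return a / (a + b), lo, hi

# Hypothetical counts of resistant isolates by number of prior treatments.
for n_treatments, (res, tot) in {1: (40, 200), 2: (55, 150), 3: (45, 90)}.items():
    mean, lo, hi = posterior_summary(res, tot)
    print(f"{n_treatments} treatment(s): P(resistant) ~ {mean:.2f} (95% CrI {lo:.2f}-{hi:.2f})")
```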


2021 · Vol 99 (Supplement_3) · pp. 163-164
Author(s):  
Maria B Niehues ◽  
Alexandre Perdigão ◽  
Victor Valério de Carvalho ◽  
Tiago S Acedo ◽  
Guilherme S F M Vasconcellos ◽  
...  

Abstract Our objective was to evaluate the effects of feed additives on ruminal pH of finishing cattle fed a 90%-concentrate diet. Twenty-four 18-mo Angus-Nellore crossbred bulls (initial BW 456 ± 6.5 kg) were allocated in a completely randomized design to three treatments with eight replicates each: 1) Control (MON), sodium monensin (26 mg/kg DM; Rumensin, Elanco Animal Health); 2) Crina® Rumistar™ (CR), a blend of essential oils (90 mg/kg DM) plus exogenous α-amylase (560 mg/kg DM); and 3) CR + HyD® (CRD), CR plus 25-hydroxy-vitamin-D3 at 1 mg/animal/d (DSM Nutritional Products). Ruminal pH and temperature were monitored individually for 98 days using a wireless bolus (SmaXtec Animal Care, Austria). Data were analyzed using the MIXED procedure of SAS, and means were compared by Tukey test at P < 0.05. During the adaptation period (i.e., the first 14 days), bulls fed CR and CRD had greater mean rumen pH (6.40 and 6.36 vs. 6.16; P < 0.01) and minimum pH (5.89 and 5.87 vs. 5.57; P < 0.01) than bulls fed MON. In addition, rumen pH of bulls fed CR spent less time below 6.0 than that of bulls fed MON (256.07 vs. 452.62 min/d; P = 0.05). Over the total period, bulls fed MON had lower mean (6.22 vs. 6.51 and 6.42; P < 0.01) and minimum rumen pH (5.60 vs. 5.92 and 5.85; P < 0.01) than bulls fed CR and CRD. Additionally, feeding MON increased the time rumen pH spent below 6.0 (390.79 min/day; P < 0.01) and the area below pH 6.0 (81.52 min × pH units/day; P < 0.01). Moreover, monensin increased the time pH spent below 5.8 (161.10 vs. 121.13 and 122.56 min/day; P = 0.02) compared with CR and CRD, and increased ruminal temperature (39.60 vs. 39.51 and 39.5 °C; P < 0.01). We conclude that feeding Crina® Rumistar™ and Crina® Rumistar™ + HyD® increased the rumen pH of bulls.
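The time-below-threshold and area-below-threshold summaries reported above can be computed directly from the bolus pH log. The sketch below assumes a 10-minute logging interval and uses made-up readings; only the pH 6.0 and 5.8 thresholds come from the abstract.

```python
# Illustrative summary of rumen-bolus pH logs: mean and minimum pH, minutes
# below a threshold, and the "area below threshold" (pH units x minutes).
# Readings and the 10-minute logging interval are hypothetical.

INTERVAL_MIN = 10  # assumed logging interval of the bolus, minutes

def summarize(ph_readings, threshold=6.0, interval_min=INTERVAL_MIN):
    below = [threshold - ph for ph in ph_readings if ph < threshold]
    return {
        "mean_pH": sum(ph_readings) / len(ph_readings),
        "min_pH": min(ph_readings),
        "min_below": len(below) * interval_min,          # minutes below threshold
        "area_below": sum(below) * interval_min,         # pH units x minutes
    }

if __name__ == "__main__":
    day_of_readings = [6.4, 6.3, 6.1, 5.9, 5.7, 5.8, 6.0, 6.2, 6.3, 6.5, 6.4, 6.2]
    for thr in (6.0, 5.8):
        s = summarize(day_of_readings, threshold=thr)
        print(f"pH < {thr}: {s['min_below']} min, area {s['area_below']:.1f} pH x min "
              f"(mean {s['mean_pH']:.2f}, min {s['min_pH']:.2f})")
```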


2021 · Vol 99 (Supplement_3) · pp. 313-313
Author(s):  
Xandra Christine A Meneses ◽  
Rachel M Park ◽  
Emily Ridge ◽  
Courtney L Daigle

Abstract Every organism has evolved patterned responses to its temporal and physical surroundings. Rhythmicity is a central regulator of life and a sentinel for animal health and metabolism; thus chronic stress and disease can disrupt behavioral patterns. Feedlot cattle may exhibit irregularities in circadian rhythms due to social, environmental, and nutritional stressors and may benefit from behavior-based management strategies. This study characterized the hourly behavioral patterns of feedlot cattle with and without environmental enrichment, established behavioral expectations for animal managers, and proposed practical interventions. Fifty-four crossbred steers were shipped to the Texas A&M AgriLife Feedlot in Bushland, Texas, blocked by weight, and assigned to one of six pens (n = 9 steers/pen), half of which had a cattle brush and half of which did not. The frequency of headbutting, mounting, bar licking, tongue rolling, allogrooming, and brush use was decoded from video recordings of cattle from 08:00 h to 17:30 h on d -2, -1, 0, 1, 2, 4, 8, 16, 32, and 64 relative to brush implementation. The impact of time (hour), treatment, and their interaction on cattle behavior was evaluated using PROC MIXED in SAS. Brush use (P < 0.0001), allogrooming (P < 0.0001), and mounting (P < 0.0001) were performed at lower frequencies during the early hours of the day and at higher frequencies in the afternoon. Both tongue rolling (P < 0.0001) and bar licking (P < 0.0349) occurred most often during daylight hours, in accordance with a diurnal pattern. Major periods of headbutting (P < 0.0001) were observed in the morning and afternoon. Behavioral expectations were characterized so that stockpeople can observe the prevalence of each behavior during morning, midday, and evening and implement best management practices. Proposed interventions include medical treatment, modified pen surface or bunk management, altered stocking density, and/or introduction of environmental enrichment.
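The hourly summaries that underlie these patterns amount to tallying decoded events by hour of day. The sketch below does this for hypothetical events; only the behavior names and the 08:00–17:30 observation window come from the abstract.

```python
# Minimal sketch of summarizing decoded behavior events by hour of day so that
# stockpeople can see when each behavior peaks. The events are hypothetical.

from collections import defaultdict

# (hour_of_day, behavior) pairs decoded from video -- hypothetical data
events = [
    (8, "headbutt"), (9, "headbutt"), (9, "brush_use"), (10, "tongue_rolling"),
    (13, "brush_use"), (14, "brush_use"), (14, "allogrooming"), (15, "mounting"),
    (16, "brush_use"), (16, "allogrooming"), (17, "headbutt"),
]

by_behavior = defaultdict(lambda: defaultdict(int))
for hour, behavior in events:
    by_behavior[behavior][hour] += 1

for behavior in sorted(by_behavior):
    hourly = by_behavior[behavior]
    peak = max(hourly, key=hourly.get)
    print(f"{behavior}: {sum(hourly.values())} events, peak hour {peak:02d}:00")
```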


2003 · Vol 15 (4) · pp. 387-389
Author(s):  
Michael J. Yaeger ◽  
Andrew Holtcamp ◽  
Julie A. Jarvinen

This report describes an outbreak of coccidiosis in a boar stud. A live, untreated, adult boar with a history of diarrhea was submitted to the Iowa State University Veterinary Diagnostic Laboratory, Ames, IA. For a 3-month period, approximately 40% of the boars in this stud had developed gray to brown diarrhea that lasted 1–3 days. Affected boars did not lose condition, and antibiotic therapy did not appear to affect the clinical course of the disease. At necropsy, the distal ileum was palpably thickened and covered by a thick, yellow-green, fibrinous exudate. Microscopic changes in the ileum consisted of an erosive enteritis associated with the presence of numerous coccidia within mid to superficial villus enterocytes. The mucosa was covered by a fibrinous exudate admixed with numerous nonsporulated coccidian oocysts. A light growth of Salmonella enterica serovar Derby was isolated from the small intestine of this animal, but laboratory tests were negative for Lawsonia and Brachyspira spp. Individual or paired fecal samples were obtained from 6 additional boars experiencing similar clinical signs. Numerous Eimeria spinosa oocysts were identified in these samples. Neither Salmonella nor Brachyspira spp. were cultured from submitted fecal samples. Necropsy of a live boar and examination of feces from 6 additional animals confirmed that the mild, sporadic, transient diarrhea in this boar stud was due to coccidiosis.


2018 · Vol 4 (2) · pp. 205511691881324
Author(s):  
Jessica Meekins ◽  
Ada G Cino-Ozuna

Case series summary: A 5-month-old male intact domestic shorthair (DSH) cat (cat 1), a 1-year-old male neutered DSH cat (cat 2) and a 1.5-year-old female spayed DSH cat (cat 3) were submitted for gross necropsy after acute death, with the clinical suspicion of cytauxzoonosis. All three cats displayed signs of rapidly progressive clinical deterioration, including lethargy, anorexia, and hyper- or hypothermia. Cat 1 was euthanized owing to the grave prognosis for survival, whereas cats 2 and 3 were found dead 1–4 days after the onset of clinical signs. Remains were submitted to the Kansas State University Veterinary Diagnostic Laboratory for gross necropsy. In all three cats, general examination findings included icterus of the mucous membranes, multifocal pulmonary parenchymal hemorrhages, and splenic reddening and enlargement. Histologic examination revealed macrophages laden with protozoal schizonts diffusely distributed within blood vessels and vascular spaces of all affected organs, including the blood vessels of the uveal tract. The ciliary body within the anterior uveal tract was most affected. Relevance and novel information: This is the first description of cytauxzoonosis affecting the eyes of infected cats. This report confirms involvement of ocular blood vessels similar to the classic lesions of the lungs, spleen and liver. In cats presenting with a history and clinical findings suggestive of cytauxzoonosis, complete ophthalmic examination is indicated to confirm or rule out ocular involvement.

