Serotyping and Antibiotic Resistance Profiling of Salmonella in Feedlot and Nonfeedlot Beef Cattle

2002 ◽  
Vol 65 (11) ◽  
pp. 1694-1699 ◽  
Author(s):  
JOHN C. BEACH ◽  
ELSA A. MURANO ◽  
GARY R. ACUFF

As part of a larger study to assess risk factors associated with hide and carcass contamination of beef cattle during transport to slaughter, a total of 281 salmonellae were isolated from 1,050 rectal, hide, carcass, and environmental samples. For feedlot cattle, salmonellae were recovered from 4.0% of rectal samples, 37.5% of hide samples, 19.0% of carcass samples, and 47.4% of environmental samples. For nonfeedlot cattle, salmonellae were recovered from 10.9% of rectal samples, 37.5% of hide samples, 54.2% of carcass samples, and 50.0% of environmental samples. Overall, the five serotypes most commonly associated with feedlot cattle and their environment were Salmonella Anatum (18.3% of the isolates), Salmonella Kentucky (17.5%), Salmonella Montevideo (9.2%), Salmonella Senftenberg (8.3%), and Salmonella Mbandaka (7.5%). The five serotypes most commonly associated with nonfeedlot cattle and their environment were Salmonella Kentucky (35.4%), Salmonella Montevideo (21.7%), Salmonella Cerro (7.5%), Salmonella Anatum (6.8%), and Salmonella Mbandaka (5.0%). Antimicrobial susceptibility testing revealed that 21.7% of the isolates associated with feedlot cattle were resistant to tetracycline, compared with 11.2% of the isolates associated with nonfeedlot cattle. None of the isolates from feedlot cattle were resistant to any of the other antimicrobial agents tested, whereas 6.2% of nonfeedlot cattle isolates were resistant to more than four of the antimicrobial agents tested.

2003 ◽  
Vol 131 (3) ◽  
pp. 1187-1203 ◽  
Author(s):  
T. HALD ◽  
A. WINGSTRAND ◽  
M. SWANENBURG ◽  
A. von ALTROCK ◽  
B.-M. THORBERG

This study was part of an international research project entitled SALINPORK (FAIR CT-950400) initiated in 1996. The objectives were to investigate the occurrence of Salmonella in pig slaughterhouses and to identify risk factors associated with the contamination of pig carcasses. Data were collected from 12 slaughterhouses in five European countries. Isolates were characterized by serotyping, phage typing and antimicrobial susceptibility. In one country, no Salmonella was found. Salmonella was isolated from 5·3% of 3485 samples of pork and from 13·8% of 3573 environmental samples from the seven slaughterhouses in the four remaining countries. The statistical analyses (multi-level logistic regression) indicated that the prevalence was significantly higher during the warmer months and that environmental contamination increased over the course of the slaughter day. The polishing (OR 3·74, 95% CI 1·43–9·78) and pluck removal (OR 3·63, 95% CI 1·66–7·96) processes were found to contribute significantly to total carcass contamination, the latter especially if the scalding water was also contaminated. To reduce carcass contamination, and consequently the prevalence of Salmonella in pork, it is recommended to maintain sufficiently high scalding-water temperatures (62 °C) and to clean and disinfect the polishing equipment at least once a day.
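The odds ratios above come from a multi-level logistic regression, which cannot be reproduced from the abstract alone, but the underlying odds-ratio arithmetic with a Wald 95% confidence interval can be sketched from a 2×2 exposure table. The counts below are hypothetical, chosen only to illustrate the calculation; they are not the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Wald odds ratio and confidence interval from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts for illustration only
or_, lo, hi = odds_ratio_ci(20, 80, 10, 150)
```

With these toy counts the point estimate is 3.75; an interval that excludes 1 (as both reported intervals do) indicates a statistically significant association at the 5% level.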


2010 ◽  
Vol 7 (7) ◽  
pp. 825-833 ◽  
Author(s):  
Alice L. Green ◽  
David A. Dargatz ◽  
Bruce A. Wagner ◽  
Paula J. Fedorka-Cray ◽  
Scott R. Ladely ◽  
...  

1984 ◽  
Vol 5 (8) ◽  
pp. 390-394 ◽  
Author(s):  
James W. Buehler ◽  
Robert J. Finton ◽  
Richard A. Goodman ◽  
Keewhan Choi ◽  
John C. Hierholzer ◽  
...  

Abstract
In Fall 1981, an outbreak of acute infectious conjunctivitis with keratitis (EKC) occurred in patients who had visited a private ophthalmology clinic just prior to onset of illness. Among an estimated 2,200 patient visits to the office from August 10 to October 15, 1981 for problems unrelated to infectious conjunctivitis, 39 (1.8%) persons subsequently developed EKC. The median incubation period was 6.5 days (range, 1 to 14 days). A case-control study was done to identify risk factors associated with contracting EKC; patients with EKC were more likely than control patients to have been examined by either of two of the four ophthalmologists at the clinic and to have undergone procedures such as tonometry or foreign body removal. Adenovirus was isolated from conjunctival swabs from four of five persons with conjunctivitis; three isolates were type 8 and one was type 7. Recognition of the problem and improved handwashing practices were associated with termination of the outbreak. This outbreak illustrates the potential for transmission of adenovirus infection during the provision of eye care. Infection control practitioners should be familiar with measures for the prevention of such infections among ophthalmology patients.


2005 ◽  
Vol 71 (7) ◽  
pp. 3872-3881 ◽  
Author(s):  
G. D. Inglis ◽  
T. A. McAllister ◽  
H. W. Busz ◽  
L. J. Yanke ◽  
D. W. Morck ◽  
...  

ABSTRACT
The influence of antimicrobial agents on the development of antimicrobial resistance (AMR) in Campylobacter isolates recovered from 300 beef cattle maintained in an experimental feedlot was monitored over a 315-day period (11 sample times). Groups of calves were assigned to one of the following antimicrobial treatments: chlortetracycline and sulfamethazine (CS), chlortetracycline alone (Ct), virginiamycin, monensin, tylosin phosphate, and no antimicrobial agent (i.e., control treatment). In total, 3,283 fecal samples were processed for campylobacters over the course of the experiment. Of the 2,052 bacterial isolates recovered, 92% were Campylobacter (1,518 were Campylobacter hyointestinalis and 380 were C. jejuni). None of the antimicrobial treatments decreased the isolation frequency of C. jejuni relative to the control treatment. In contrast, C. hyointestinalis was isolated less frequently from animals treated with CS and, to a lesser extent, from animals treated with Ct. The majority (≥94%) of C. jejuni isolates were sensitive to ampicillin, erythromycin, and ciprofloxacin, but more isolates with resistance to tetracycline were recovered from animals fed Ct. All of the 1,500 isolates of C. hyointestinalis examined were sensitive to ciprofloxacin. In contrast, 11%, 10%, and 1% of these isolates were resistant to tetracycline, erythromycin, and ampicillin, respectively. The number of animals from which C. hyointestinalis isolates with resistance to erythromycin and tetracycline were recovered differed among the antimicrobial treatments. Only Ct administration increased the carriage rates of erythromycin-resistant isolates of C. hyointestinalis, and the inclusion of CS in the diet increased the number of animals from which tetracycline-resistant isolates were recovered. The majority of C. hyointestinalis isolates with resistance to tetracycline were obtained from cohorts within a single pen, and most of these isolates were recovered from cattle during feeding of a forage-based diet as opposed to a grain-based diet. The findings of this study show that the subtherapeutic administration of tetracycline, alone and in combination with sulfamethazine, to feedlot cattle can select for the carriage of resistant strains of Campylobacter species. Considering the widespread use of in-feed antimicrobial agents and the high frequency of beef cattle that shed campylobacters, the development of AMR should be monitored as part of an ongoing surveillance program.


2007 ◽  
Vol 29 (3) ◽  
pp. 207-212 ◽  
Author(s):  
Ana Paula Werneck de Castro ◽  
Helio Elkis

OBJECTIVE: The purpose of this study was to evaluate the rehospitalization rates of patients discharged from the Institute of Psychiatry of the Hospital das Clínicas of the Universidade de São Paulo Medical School while being treated with haloperidol, risperidone or clozapine. METHOD: This is a naturalistic study designed to monitor rehospitalization rates for patients discharged on haloperidol (n = 43), risperidone (n = 22) or clozapine (n = 31). Time to readmission over the course of three years was measured by the product-limit (Kaplan-Meier) method. Risk factors associated with rehospitalization were examined. RESULTS: At 36 months, 74% of the haloperidol-treated patients, 59% of the risperidone-treated patients and 84% of the clozapine-treated patients remained in the community. The haloperidol group showed a higher proportion of women, a later age of onset and a shorter length of illness than the other groups, whereas the opposite was observed in the clozapine group. CONCLUSIONS: This study suggests that the rehospitalization rate of patients taking clozapine is lower than the rates for patients treated with haloperidol or risperidone. However, confounding variables such as gender distribution and age of onset represent limitations that should be taken into account in the interpretation of the results.
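The product-limit (Kaplan-Meier) method used above can be sketched in a few lines of code. This is an illustrative implementation on toy data, not the study's records; the `times` and `events` values are hypothetical:

```python
from itertools import groupby

def kaplan_meier(times, events):
    """Product-limit (Kaplan-Meier) survival estimates.
    times: time to readmission or censoring (e.g. months);
    events: 1 = readmitted, 0 = censored."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0
    curve = []  # (time, S(t)) at each time where an event occurred
    for t, group in groupby(data, key=lambda pair: pair[0]):
        group = list(group)
        deaths = sum(e for _, e in group)  # events at this time point
        if deaths:
            surv *= 1 - deaths / n_at_risk
            curve.append((t, surv))
        n_at_risk -= len(group)  # events and censorings both leave the risk set
    return curve

# Toy data: months to readmission (event = 1) or censoring (event = 0)
curve = kaplan_meier([5, 10, 10, 20, 30], [1, 1, 0, 1, 0])
```

By convention, observations censored at a given time are still counted in the risk set for events occurring at that same time, which is what the code above does.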


2021 ◽  
Vol 9 (7) ◽  
pp. 1535
Author(s):  
Shahar Rotem ◽  
Ida Steinberger-Levy ◽  
Ofir Israeli ◽  
Eran Zahavy ◽  
Ronit Aloni-Grinstein

A bioterror event using an infectious bacterium may lead to catastrophic outcomes involving morbidity and mortality as well as social and psychological stress. Moreover, a bioterror event using a bacterial agent engineered for antibiotic resistance may raise additional concerns. Thus, preparedness is essential both to preclude and control the dissemination of the bacterial agent and to appropriately and promptly treat potentially exposed individuals or patients. Rates of morbidity, death, and social anxiety can be drastically reduced if rapid delivery of antimicrobial agents for post-exposure prophylaxis and treatment is initiated as soon as possible. The availability of rapid antibiotic susceptibility tests that can guide targeted antibiotic treatment is therefore mandatory; however, such tests are still at the development stage. In this review, we describe recently published rapid antibiotic susceptibility tests for bioterror bacterial agents and discuss their application to clinical and environmental samples.


Pathogens ◽  
2021 ◽  
Vol 10 (12) ◽  
pp. 1553
Author(s):  
Yi-Chen Chen ◽  
Wen-Yu Chin ◽  
Chao-Chin Chang ◽  
Shih-Te Chuang ◽  
Wei-Li Hsu

Bovine leukaemia virus (BLV), which is classified as a Deltaretrovirus, is the aetiologic agent of enzootic bovine leukosis (EBL), a chronic lymphoproliferative disorder with a worldwide distribution. EBL is widespread in dairy herds and causes a direct economic impact due to reduced milk production and the early culling of BLV-infected cattle. The BLV infection status in Taiwan remained largely unknown until a recent survey revealed a high prevalence of BLV in dairy cows. The present study further investigated BLV infections in beef cattle. Surprisingly, the prevalence of BLV proviral DNA was as low as 11.8% (23/195), significantly lower than that noted in dairy cows, 42.5% (102/240) (p < 0.001). Factors associated with BLV infection were subsequently investigated. Because of differences in herd management, the analysis of risk factors for BLV infection was conducted independently for the two sectors. Several factors associated with BLV infection were identified. Age was significantly associated with BLV infection status in dairy cows (p < 0.001) but not in beef cattle: a high prevalence of BLV was observed in cattle >15.5 months old (57.8%) compared with those ≤15.5 months old (11.4%). Moreover, after stratification analysis based on the critical age of 15.5 months, as determined by the receiver operating characteristic (ROC) curve, a significantly higher BLV prevalence was demonstrated in lactating dairy cows, cattle undergoing bull breeding, heifers at older ages, and those undergoing routine rectal palpation. Given the high prevalence of BLV in Taiwan, the development of an effective control program based on the identified risk factors is important for interrupting the routes of BLV transmission within herds.
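The dairy-versus-beef prevalence comparison above (102/240 vs. 23/195, p < 0.001) can be checked with a standard two-proportion z-test. The abstract does not state which test the authors used, so this is only a plausibility sketch:

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Two-sample z-test for equality of proportions (pooled variance,
    no continuity correction). Returns (z, two-sided p-value), with the
    p-value from the normal approximation via the complementary error
    function: P(|Z| > z) = erfc(|z| / sqrt(2))."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_two_sided = math.erfc(abs(z) / math.sqrt(2))
    return z, p_two_sided

# Prevalence comparison from the abstract: 23/195 beef vs. 102/240 dairy
z, p = two_proportion_z(23, 195, 102, 240)
```

With these counts the test statistic is large in magnitude and the two-sided p-value is far below 0.001, consistent with the reported significance.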


2021 ◽  
Vol 14 (1) ◽  
Author(s):  
Paula Andreia Fabris Giudice ◽  
Susana Angélica Zevallos Lescano ◽  
William Henry Roldan Gonzáles ◽  
Rogério Giuffrida ◽  
Fernanda Nobre Bandeira ◽  
...  

Abstract
Background: Toxocariasis, caused by nematode species of the genus Toxocara, has been described as one of the most prevalent zoonotic helminthiases worldwide. Human transmission may occur by ingesting Toxocara spp. larvae from raw or undercooked meat or organs; however, no comprehensive serosurvey has been conducted to date investigating the role of cattle as paratenic hosts. The aim of the study reported here was to assess the prevalence of anti-Toxocara spp. antibodies and associated risk factors in bovines from two slaughterhouses located in Presidente Prudente, southeastern Brazil.
Methods: Blood samples were collected and tested by indirect enzyme-linked immunosorbent assay (ELISA). Cattle farmers voluntarily responded to an epidemiologic questionnaire.
Results: Overall, 213 of the 553 (38.5%) bovine samples were assessed as seropositive for anti-Toxocara spp. antibodies by indirect ELISA. Multivariate analysis revealed that the source of beef cattle and the presence of dogs or cats at the farm were associated with seropositivity. The use of feedlot systems was associated with a lower likelihood of seropositivity.
Conclusions: These results indicate a high level of anti-Toxocara seropositivity in slaughterhouse cattle, with potentially contaminated meat posing an infection risk to humans. In addition, the presence of dogs and cats where the slaughtered beef cattle were raised was statistically associated with bovine seropositivity, probably due to the overlapping environment at the farm and the lack of pet deworming. The use of feedlot systems was a protective factor, likely due to the absence of dog and cat contact, elevated feeding troughs that avoid contact with contaminated soil or grass, and the younger age at slaughter of feedlot cattle. In summary, bovines may be used as environmental sentinels of Toxocara spp. contamination, and the high seropositivity of slaughterhouse cattle may indicate a potential risk of human toxocariasis through the ingestion of raw or undercooked contaminated meat.
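The seroprevalence reported in the Results (213/553 = 38.5%) carries sampling uncertainty that the abstract does not quote. A Wilson score interval, one common choice for binomial proportions (the abstract does not say which method, if any, the authors used), can be computed as follows:

```python
import math

def wilson_ci(x, n, z=1.96):
    """Wilson score confidence interval for a binomial proportion.
    x = number of positives, n = sample size, z = normal quantile (1.96 for 95%)."""
    p = x / n
    denom = 1 + z ** 2 / n
    centre = (p + z ** 2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2))
    return p, centre - half, centre + half

# Seroprevalence from the abstract: 213 positives out of 553 sampled bovines
p, lo, hi = wilson_ci(213, 553)
```

For a sample this large the Wilson interval is narrow (roughly ±4 percentage points around the point estimate), so the "high seroprevalence" conclusion is not sensitive to sampling error.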


PEDIATRICS ◽  
1984 ◽  
Vol 73 (5) ◽  
pp. 626-630
Author(s):  
R. G. Bhagat ◽  
H. Chai

Posterior subcapsular cataracts (PSCC) are known complications of systemic steroid therapy. Previous studies have not clearly identified the asthmatic children at risk for the development of PSCC. The possible factors associated with systemic steroid administration that may influence the development of PSCC in asthmatic children were evaluated: (1) duration of administered steroids, (2) dose, (3) number of steroid "bursts" in the preceding year, (4) degree of retardation of bone age, and (5) mode of administration (daily or alternate day) of steroids. Of the 40 asthmatic children requiring steroids, seven (17.5%) had PSCC. All of the children with PSCC had been receiving steroids on a daily or alternate-day basis for at least 2 years and all had markedly delayed bone age. Only one child with the concurrent occurrence of these two factors did not develop PSCC. None of the other risk factors considered could distinguish the group of patients with PSCC. It is concluded that asthmatic children receiving steroids for 2 years or longer and having markedly delayed bone age are at a greater risk for the development of PSCC.

