70 Occurrence of Mycotoxins in 2020 US Corn Silage and Dairy Total Mixed Rations

2021 ◽  
Vol 99 (Supplement_3) ◽  
pp. 37-38
Author(s):  
Paige N Gott ◽  
Erin Schwandt ◽  
Shelby M Ramirez ◽  
Ursula Hofstetter ◽  
Raj Murugesan

Abstract Mycotoxins are fungal metabolites that have been identified in many feed ingredients, and ruminants are at increased risk of exposure due to the complex nature of their diets. Despite varying degrees of natural detoxification in the rumen, cattle health, productivity, and reproduction can be compromised by mycotoxins. This study investigated mycotoxin occurrence and contamination levels in 2020 U.S. corn silage and dairy total mixed ration (TMR) samples. Samples were screened via LC-MS/MS for six major mycotoxin groups: aflatoxins, type A trichothecenes (A-Trich), type B trichothecenes (B-Trich), fumonisins (FUM), zearalenone (ZEN), and ochratoxin A. Information collected with each submission included state of origin and whether clinical health or performance concerns were present. Contamination levels were analyzed using the GLIMMIX procedure (SAS 9.4, Cary, NC) to investigate the interaction of clinical concern (yes/no) and harvest year. The interaction was not statistically significant for any of the toxin groups, so main effects of harvest year are presented in Table 1. Type B trichothecenes were identified most frequently in 2020 corn silage. Although not often detected, A-Trich occurrence increased ten percentage points from the 2019 crop year. Among 236 TMR samples analyzed, B-Trich were detected in a high percentage of samples. Mean contamination levels (parts per billion, ppb) are presented on a dry matter basis and, within each data set, were similar in 2020 and 2019 for the respective toxin groups. Preliminary survey results indicate B-Trich occur frequently in both U.S. corn silage and TMR samples. Despite less frequent detection, other mycotoxin groups do occur, including ZEN, FUM, and A-Trich, so the potential risk from toxin interactions due to co-contamination should be considered.

2020 ◽  
Vol 98 (Supplement_4) ◽  
pp. 35-36
Author(s):  
Paige N Gott ◽  
Erin F Schwandt ◽  
Shelby M Ramirez ◽  
Erika G Hendel ◽  
G Raj Murugesan ◽  
...  

Abstract Mycotoxins are secondary fungal metabolites that contaminate a variety of feedstuffs and are detrimental to animal health and productivity. The risk of broad-spectrum mycotoxin exposure is elevated for ruminants due to the complexity of their diets. This study investigated the occurrence and contamination levels of mycotoxins in forage-based feeds including corn silage, haylage (including various cuttings of multiple forage sources), and straw samples from across the US and Canada. Samples were screened via LC-MS/MS for the six major mycotoxin groups: aflatoxins, type A trichothecenes, type B trichothecenes (B-Trich), fumonisins (FUM), zearalenone (ZEN), and ochratoxin A. Samples submitted for clinical health or performance concerns were excluded from the data set. B-Trich occurred most frequently (95% positive) among 92 corn silage samples harvested in 2019, followed by ZEN (34%) and FUM (9%). Mean contamination levels (parts per billion, ppb) on a dry matter basis were 2,788 ppb, 456 ppb, and 194 ppb, respectively. Only 4% of silage samples were below the limit of detection (LOD) for all mycotoxins evaluated. Among 20 haylage samples from the 2019 crop year, B-Trich were the most frequently detected, with 50% of samples positive at a mean concentration of 3,222 ppb. Twenty-one 2019 straw samples were submitted for analysis, with 100% testing positive for some level of B-Trich (mean 2,001 ppb) and 81% ZEN positive with an average of 640 ppb. Seventy-one percent of straw samples were positive for both B-Trich and ZEN. Preliminary mycotoxin survey results from the US and Canada suggest B-Trich and ZEN are the most frequently occurring major mycotoxins in 2019 forage-based feed samples. Based on the frequency and levels of mycotoxin contamination identified in the current data set, continued analysis of forage-based feeds is warranted, as mycotoxins pose a potential risk to the health, performance, and reproductive success of ruminants.
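Occurrence statistics of this kind (percent of samples positive, mean level among positives on a dry matter basis) follow a simple censoring convention that can be sketched in a few lines. The LOD and the ppb values below are invented for illustration and are not the survey's data:

```python
# Summarizing a screened batch the way the survey reports it: percent
# of samples at or above the limit of detection (LOD), and the mean
# level of the positives. All values here are hypothetical.
LOD = 50.0  # ppb, assumed detection limit for this sketch

samples_ppb = [2600, 3100, 0, 1900, 2750, 40, 3500, 2400, 0, 3050]
positives = [v for v in samples_ppb if v >= LOD]

pct_positive = 100 * len(positives) / len(samples_ppb)
mean_positive = sum(positives) / len(positives)
print(f"{pct_positive:.0f}% positive, mean {mean_positive:.0f} ppb")
```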


2019 ◽  
Vol 97 (Supplement_3) ◽  
pp. 126-127
Author(s):  
Paige N Gott ◽  
Erika G Hendel ◽  
Shelby M Curry ◽  
Ursula Hofstetter ◽  
G Raj Murugesan

Abstract Mycotoxins are harmful secondary fungal metabolites that are detrimental to animal health and productivity. This study investigated occurrence and contamination levels of mycotoxins in the 2018 US corn harvest, including corn grain, corn silage, and corn by-product feed ingredients (distillers dried grains, gluten feed, etc.). Corn and corn silage samples marked as 2018 harvest (from feed mills, livestock producers, and integrators), along with corn DDGS and other by-products received from mid-August 2018 through January 2019, were screened via LC-MS/MS for the presence of six major mycotoxin groups: aflatoxins, type A trichothecenes, type B trichothecenes (B-Trich), fumonisins (FUM), zearalenone (ZEN), and ochratoxin A. Parameters of the main toxins found were compared to the two prior harvest years using the Kruskal-Wallis test (Prism 7, GraphPad, La Jolla, CA) and are presented in Table 1. Mean toxin count per sample in corn grain increased in 2018 versus 2017, returning to over two toxins per sample as in 2016. Mean B-Trich level (ppb) was elevated in 2018 relative to 2017. Although ZEN contamination did not differ from prior years, prevalence increased to 45% from 25% in 2017. Co-contamination in corn by-product feeds was steady, averaging nearly three toxins per sample. Mean B-Trich was higher than in the 2017 crop, while FUM contamination was similar to 2017. ZEN levels in 2018 by-products were the highest seen in the past three years. Corn silage increased in mean toxin count per sample and in contamination levels of B-Trich and ZEN in 2018 versus 2017. Prevalence of B-Trich remained constant from year to year, while both ZEN and FUM prevalence increased from 2017 to 2018. Preliminary results of the 2018 corn-based feed ingredients survey suggest mycotoxin occurrence and contamination levels are approaching those observed in the challenging 2016 crop, with continued concerns for co-contamination.
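The Kruskal-Wallis test named in the methods compares toxin levels across harvest years by ranks. A minimal, stdlib-only sketch of the H statistic (no tie correction), with invented ppb values standing in for the survey data:

```python
# Kruskal-Wallis H statistic computed by hand (stdlib only).
# The ppb values below are illustrative, not the survey's actual data.

def kruskal_h(*groups):
    """H statistic for k independent samples (assumes no tied values)."""
    pooled = sorted(v for g in groups for v in g)
    rank = {v: i + 1 for i, v in enumerate(pooled)}
    n = len(pooled)
    h = 0.0
    for g in groups:
        r_mean = sum(rank[v] for v in g) / len(g)
        h += len(g) * (r_mean - (n + 1) / 2) ** 2
    return 12 / (n * (n + 1)) * h

levels_2016 = [1800, 2100, 2400, 2600, 3000]
levels_2017 = [800, 900, 1000, 1100, 1300]
levels_2018 = [1900, 2000, 2300, 2500, 2700]

h = kruskal_h(levels_2016, levels_2017, levels_2018)
print(f"H = {h:.2f}")  # compare against chi-square with k - 1 = 2 df
```

In practice a package (here, GraphPad Prism; in Python, `scipy.stats.kruskal`) also applies a tie correction and returns the chi-square p-value.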


2019 ◽  
Vol 11 (1) ◽  
Author(s):  
Christina N. Lessov-Schlaggar ◽  
Olga L. del Rosario ◽  
John C. Morris ◽  
Beau M. Ances ◽  
Bradley L. Schlaggar ◽  
...  

Abstract Background Adults with Down syndrome (DS) are at increased risk for Alzheimer disease dementia, and there is a pressing need for assessment instruments that differentiate chronic cognitive impairment, acute neuropsychiatric symptomatology, and dementia in this population of patients. Methods We adapted a widely used instrument, the Clinical Dementia Rating (CDR) Scale, a component of the Uniform Data Set used by all federally funded Alzheimer Disease Centers, for use in adults with DS, and tested the instrument among 34 DS patients recruited from the community. The participants were assessed using two versions of the modified CDR: a caregiver questionnaire and an in-person interview involving both the caregiver and the DS adult. Assessment also included the Dementia Scale for Down Syndrome (DSDS) and Raven's Progressive Matrices to estimate IQ. Results Both the modified questionnaire and the interview instrument captured a range of cognitive impairments, a majority of which were found to be chronic when accounting for premorbid function. Two individuals in the sample were strongly suspected to have early dementia, both of whom had elevated scores on the modified CDR instruments. Among individuals rated as having no dementia based on the DSDS, about half showed subthreshold impairments on the modified CDR instruments; there was substantial agreement between the caregiver questionnaire screening and the in-person interview of caregivers and DS adults. Conclusions The modified questionnaire and interview instruments capture a range of impairment in DS adults, including subthreshold symptomatology, and provide complementary information relevant to the ascertainment of dementia in DS. Decline was seen across all cognitive domains and was generally positively related to age and negatively related to IQ. Most importantly, adjusting instrument scores for chronic, premorbid impairment drastically shifted the distribution toward lower (no impairment) scores.


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Richard Johnston ◽  
Xiaohan Yan ◽  
Tatiana M. Anderson ◽  
Edwin A. Mitchell

Abstract The effect of altitude on the risk of sudden infant death syndrome (SIDS) has been reported previously, but with conflicting findings. We aimed to examine whether the risk of sudden unexpected infant death (SUID) varies with altitude in the United States. Data from the Centers for Disease Control and Prevention (CDC) Cohort Linked Birth/Infant Death Data Set for births between 2005 and 2010 were examined. County of birth was used to estimate altitude. Logistic regression and a generalized additive model (GAM) were used, adjusting for year; mother's race, Hispanic origin, marital status, age, education, and smoking; father's age and race; number of prenatal visits; plurality; live birth order; and infant's sex, birthweight, and gestation. There were 25,305,778 live births over the 6-year study period. The total number of deaths from SUID in this period was 23,673 (rate = 0.94/1,000 live births). In the logistic regression model there was a small, but statistically significant, increased risk of SUID associated with birth at > 8000 feet compared with < 6000 feet (aOR = 1.93; 95% CI 1.00–3.71). The GAM showed a similar increased risk over 8000 feet, but this was not statistically significant. Only 9,245 (0.037%) mothers gave birth at > 8000 feet during the study period, and 10 deaths (0.042%) were attributed to SUID. The number of SUID deaths at this altitude in the United States is thus very small (10 deaths in 6 years).
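The headline estimate (aOR = 1.93; 95% CI 1.00–3.71) can be checked by moving between the odds-ratio and log-odds scales. This sketch back-derives the coefficient and standard error from the published (rounded) interval; it is approximate and does not use the CDC microdata:

```python
import math

# A Wald-type 95% CI is symmetric on the log-odds scale, so the SE can
# be recovered from the CI width. Values are the published, rounded
# figures, which is why the round trip is only approximate.
beta = math.log(1.93)  # log-odds for the > 8000 ft indicator
se = (math.log(3.71) - math.log(1.00)) / (2 * 1.96)

aor = math.exp(beta)
ci_lo = math.exp(beta - 1.96 * se)
ci_hi = math.exp(beta + 1.96 * se)
print(f"aOR = {aor:.2f} (95% CI {ci_lo:.2f}-{ci_hi:.2f})")
```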


2019 ◽  
Vol 3 (Supplement_1) ◽  
pp. S641-S641
Author(s):  
Shanna L Burke

Abstract Little is known about how resting heart rate moderates the relationship between neuropsychiatric symptoms (NPS) and cognitive status. This study examined the relative risk of NPS across increasingly severe cognitive statuses and the extent to which resting heart rate moderates this relationship. A secondary analysis of the National Alzheimer's Coordinating Center Uniform Data Set was undertaken, using observations from participants with normal cognition at baseline (n = 13,470). The relative risk of diagnosis with a more severe cognitive status at a future visit was examined using log-binomial regression for each neuropsychiatric symptom. The moderating effect of resting heart rate among those later diagnosed with mild cognitive impairment (MCI) or Alzheimer's disease (AD) was assessed. Delusions, hallucinations, agitation, depression, anxiety, elation, apathy, disinhibition, irritability, motor disturbance, nighttime behaviors, and appetite disturbance were all significantly associated (p < .001) with an increased risk of AD and a reduced risk of MCI. Resting heart rate increased the risk of AD but reduced the relative risk of MCI. Depression significantly interacted with resting heart rate to increase the relative risk of MCI (RR: 1.07 (95% CI: 1.00-1.01), p < .001), but not AD. Neuropsychiatric symptoms increase the relative risk of AD but not MCI, which may mean that the deleterious effect of NPS is delayed until later and more severe stages of the disease course. Resting heart rate increases the relative risk of MCI among those with depression. Practitioners considering early intervention in neuropsychiatric symptomatology may weigh the downstream benefits of treatment given the long-term effects of NPS.
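The abstract's effect measure is the relative risk from a log-binomial model. As an unadjusted analogue, the crude RR and its Katz-method CI can be computed from a 2×2 table; the counts below are hypothetical and only illustrate the arithmetic:

```python
import math

# Crude relative risk from a 2x2 table (stdlib only). The adjusted
# log-binomial model in the abstract generalizes this to covariates.
# Counts are hypothetical, chosen purely to illustrate the computation.
exposed_cases, exposed_n = 120, 1000      # e.g. symptom present at baseline
unexposed_cases, unexposed_n = 80, 1000   # symptom absent

rr = (exposed_cases / exposed_n) / (unexposed_cases / unexposed_n)

# 95% CI on the log scale (Katz method)
se = math.sqrt(1 / exposed_cases - 1 / exposed_n
               + 1 / unexposed_cases - 1 / unexposed_n)
ci_lo = math.exp(math.log(rr) - 1.96 * se)
ci_hi = math.exp(math.log(rr) + 1.96 * se)
print(f"RR = {rr:.2f} (95% CI {ci_lo:.2f}-{ci_hi:.2f})")
```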


2021 ◽  
pp. 1-11
Author(s):  
Zach Pennington ◽  
Jeff Ehresman ◽  
Andrew Schilling ◽  
James Feghali ◽  
Andrew M. Hersh ◽  
...  

OBJECTIVE Patients with spine tumors are at increased risk for both hemorrhage and venous thromboembolism (VTE). Tranexamic acid (TXA) has been advanced as a potential intervention to reduce intraoperative blood loss in this surgical population, but many fear it is associated with increased VTE risk due to the hypercoagulability noted in malignancy. In this study, the authors aimed to 1) develop a clinical calculator for postoperative VTE risk in the population with spine tumors, and 2) investigate the association of intraoperative TXA use and postoperative VTE. METHODS A retrospective data set from a comprehensive cancer center was reviewed for adult patients treated for vertebral column tumors. Data were collected on surgery performed, patient demographics and medical comorbidities, VTE prophylaxis measures, and TXA use. TXA use was classified as high-dose (≥ 20 mg/kg) or low-dose (< 20 mg/kg). The primary study outcome was VTE occurrence prior to discharge. Secondary outcomes were deep venous thrombosis (DVT) or pulmonary embolism (PE). Multivariable logistic regression was used to identify independent risk factors for VTE and the resultant model was deployed as a web-based calculator. RESULTS Three hundred fifty patients were included. The mean patient age was 57 years, 53% of patients were male, and 67% of surgeries were performed for spinal metastases. TXA use was not associated with increased VTE (14.3% vs 10.1%, p = 0.37). After multivariable analysis, VTE was independently predicted by lower serum albumin (odds ratio [OR] 0.42 per g/dl, 95% confidence interval [CI] 0.23–0.79, p = 0.007), larger mean corpuscular volume (OR 0.91 per fl, 95% CI 0.84–0.99, p = 0.035), and history of prior VTE (OR 2.60, 95% CI 1.53–4.40, p < 0.001). Longer surgery duration approached significance and was included in the final model. 
Although TXA was not independently associated with the primary outcome of VTE, high-dose TXA use was associated with increased odds of both DVT and PE. The VTE model showed a fair fit of the data with an area under the curve of 0.77. CONCLUSIONS In the present cohort of patients treated for vertebral column tumors, TXA was not associated with increased VTE risk, although high-dose TXA (≥ 20 mg/kg) was associated with increased odds of DVT or PE. Additionally, the web-based clinical calculator of VTE risk presented here may prove useful in counseling patients preoperatively about their individualized VTE risk.
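The "web-based calculator" described in the RESULTS is, at bottom, a logistic model evaluated on one patient's covariates. This sketch shows the mechanics using the three published odds ratios; the intercept and the example covariate values are hypothetical, since the abstract does not report them (and surgery duration, included in the final model, is omitted here):

```python
import math

# Per-unit ORs from the abstract: albumin 0.42 per g/dl, MCV 0.91 per
# fl, prior VTE 2.60. The intercept below is a PLACEHOLDER, not a
# value from the paper, so the printed probability is illustrative only.
COEFS = {
    "albumin_g_dl": math.log(0.42),
    "mcv_fl": math.log(0.91),
    "prior_vte": math.log(2.60),
}
INTERCEPT = 10.0  # hypothetical

def vte_risk(albumin_g_dl, mcv_fl, prior_vte):
    """Predicted VTE probability from the logistic model."""
    z = (INTERCEPT
         + COEFS["albumin_g_dl"] * albumin_g_dl
         + COEFS["mcv_fl"] * mcv_fl
         + COEFS["prior_vte"] * prior_vte)
    return 1 / (1 + math.exp(-z))

print(f"risk = {vte_risk(3.5, 90, 1):.3f}")
```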


2018 ◽  
Vol 57 (1) ◽  
Author(s):  
Shannon Katiyo ◽  
Berit Muller-Pebody ◽  
Mehdi Minaji ◽  
David Powell ◽  
Alan P. Johnson ◽  
...  

ABSTRACT Nontyphoidal Salmonella (NTS) bacteremia causes hospitalization and high morbidity and mortality. We linked Gastrointestinal Bacteria Reference Unit (GBRU) data to the Hospital Episode Statistics (HES) data set to study the trends and outcomes of NTS bacteremias in England between 2004 and 2015. All confirmed NTS isolates from blood in England submitted to GBRU between 1 January 2004 and 31 December 2015 were deterministically linked to HES records. Adjusted odds ratios (AOR), proportions, and confidence intervals (CI) were calculated to describe differences in age, sex, antibiotic resistance patterns, and serotypes over time. Males, neonates, and adults above 65 years were more likely to have NTS bacteremia (AOR, 1.54 [95% CI, 1.46 to 1.67]; 2.57 [95% CI, 1.43 to 4.60]; and 3.56 [95% CI, 3.25 to 3.90], respectively). Proportions of bacteremia increased from 1.41% in 2004 to 2.67% in 2015. Thirty-four percent of all blood isolates were resistant to a first-line antibiotic, and 1,397 (56%) blood isolates were linked to an HES record. Of the patients with NTS bacteremia, 969 (69%) had a cardiovascular condition, and 155 (12%) patients died, of whom 120 (77%) were aged 65 years or older. NTS bacteremia mainly affects older people with comorbidities, placing them at increased risk of prolonged hospital stay and death. Resistance of invasive NTS to first-line antimicrobial agents appeared to be stable in England, but the emergence of resistance to last-resort antibiotics, such as colistin, requires careful monitoring.


2020 ◽  
Vol 32 (6) ◽  
pp. 891-899 ◽  
Author(s):  
Jonathan J. Rasouli ◽  
Brooke T. Kennamer ◽  
Frank M. Moore ◽  
Alfred Steinberger ◽  
Kevin C. Yao ◽  
...  

OBJECTIVE The C7 vertebral body is morphometrically unique; it represents the transition from the subaxial cervical spine to the upper thoracic spine. It has larger pedicles but relatively small lateral masses compared to other levels of the subaxial cervical spine. Although the biomechanical properties of C7 pedicle screws are superior to those of lateral mass screws, they are rarely placed due to increased risk of neurological injury. Although pedicle screw stimulation has been shown to be safe and effective in determining satisfactory screw placement in the thoracolumbar spine, few studies have determined its utility in the cervical spine. Thus, the purpose of this study was to determine the feasibility, clinical reliability, and threshold characteristics of intraoperative evoked electromyographic (EMG) stimulation in determining satisfactory pedicle screw placement at C7. METHODS The authors retrospectively reviewed a prospectively collected data set. All adult patients who underwent posterior cervical decompression and fusion with placement of C7 pedicle screws at the authors' institution between January 2015 and March 2019 were identified. Demographic, clinical, neurophysiological, operative, and radiographic data were gathered. All patients underwent postoperative CT scanning, and the position of the C7 pedicle screws was compared to the intraoperative neurophysiological data. RESULTS Fifty-one consecutive C7 pedicle screws were stimulated and recorded intraoperatively in 25 consecutive patients. Based on EMG findings, 1 patient underwent intraoperative repositioning of a C7 pedicle screw and 1 underwent removal of a C7 pedicle screw. CT scans demonstrated ideal placement of the C7 pedicle screw in 40 of 43 instances in which EMG stimulation thresholds were > 15 mA. In the remaining 3 cases the trajectories were suboptimal but safe. When the screw stimulation thresholds were between 11 and 15 mA, 5 of 6 screws were suboptimal but safe, and 1 was potentially dangerous. In instances in which the screw stimulated at thresholds ≤ 10 mA, all trajectories were potentially dangerous, with neural compression. CONCLUSIONS Ideal C7 pedicle screw position strongly correlated with EMG stimulation thresholds > 15 mA. In instances in which the screw stimulates at values between 11 and 15 mA, exploration of the screw trajectory is recommended. Screws with thresholds ≤ 10 mA should always be explored, and possibly repositioned or removed. In conjunction with other techniques, EMG threshold testing is a useful and safe modality for determining appropriate C7 pedicle screw placement.
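The stimulation-threshold rule in the CONCLUSIONS reduces to a three-way triage that can be stated as a tiny function. The cut-offs come from the reported results; the function name and return labels are ours:

```python
# Decision rule from the abstract: > 15 mA -> accept; 11-15 mA ->
# explore the trajectory; <= 10 mA -> explore and consider
# repositioning or removal.
def c7_screw_action(threshold_ma: float) -> str:
    if threshold_ma > 15:
        return "accept"   # ideal placement in 40/43 such screws
    if threshold_ma > 10:
        return "explore"  # 5/6 suboptimal but safe at 11-15 mA
    return "explore; consider repositioning or removal"

print(c7_screw_action(18))
```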


2021 ◽  
Vol 1 (S1) ◽  
pp. s41-s42
Author(s):  
Swapnil Lanjewar ◽  
Ashley Kates ◽  
Lauren Watson ◽  
Nasia Safdar

Background: Up to 30% of patients with Clostridioides difficile infection (CDI) develop recurrent infection, which is associated with a 33% increased risk of mortality at 180 days. The gut microbiome plays a key role in initial and recurrent episodes of CDI. We examined the clinical characteristics and gut microbial diversity in patients with recurrent (rCDI) versus nonrecurrent CDI at a tertiary-care academic medical center. Methods: Stool samples were collected from 113 patients diagnosed with CDI between 2018 and 2019. Clinical and demographic data were extracted from the electronic medical record (Table 1), and 16S rRNA sequencing of the V4 region was carried out on the Illumina MiSeq using 2×250 paired-end reads. Sequences were binned into operational taxonomic units (OTUs) using mothur and were classified to the genus level whenever possible using the Ribosomal Database Project data set version 16. α diversity was calculated using the Shannon diversity index, and β diversity using the Bray-Curtis dissimilarity matrix. Differential abundance testing was done using DESeq to assess taxonomic differences between groups. A P value of .05 was used to assess significance. Results: In total, 55 patients had rCDI (prior positive C. difficile polymerase chain reaction in the last 7–365 days) and 58 had nonrecurrent CDI (Table 1). Patients with rCDI had a higher frequency of organ transplant and comorbidity. No differences in α or β diversity were observed between groups. Also, 4 OTUs were more abundant in those with rCDI: Ruminococcus (n = 2), Odoribacter, and Lactobacillus. Patients with rCDI had microbiomes with greater proportions of Bacteroidetes (27% of OTUs) compared to the nonrecurrent group (18%), as well as fewer OTUs belonging to the Firmicutes phylum (56% vs 59%). Among the rCDI patients, those experiencing 2 or more recurrences had greater abundances of Bacteroides and Ruminococcus compared to those with only 1 recurrence, while those experiencing only 1 recurrence had significantly greater abundances of Akkermansia, Ruminococcus, Streptococcus, Roseburia, Clostridium IV, and Collinsella (Table 2). Conclusions: Patients with rCDI had a more impaired microbiome than those with initial CDI. Ruminococcus OTUs have previously been implicated as a risk factor for recurrence and treatment failure, and they were significantly more abundant in those with rCDI and among those with multiple recurrences. The greatest differences in the microbiome were observed between those with 1 recurrence and those with multiple recurrences. Interventions for gut microbiome restoration should focus particularly on those with recurrent CDI. Funding: No. Disclosures: None.
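The α-diversity measure named in the methods, the Shannon diversity index H' = −Σ pᵢ ln pᵢ over OTU relative abundances, is simple to compute. The OTU read counts below are made up for illustration:

```python
import math

# Shannon diversity index over OTU relative abundances. Higher H'
# means a more even, more diverse community; counts are hypothetical.
def shannon(counts):
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

sample = [40, 30, 20, 10]  # reads per OTU in one stool sample
print(f"H' = {shannon(sample):.3f}")
```

A perfectly even community of k OTUs gives the maximum H' = ln k, which is a handy sanity check.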


Author(s):  
Marie Prášilová ◽  
Tomáš Hlavsa

Vysočina is the region with the highest number of the smallest villages in the Czech Republic. The current state of infrastructure, the living conditions of inhabitants, and the development prospects of the communities were examined by means of a questionnaire survey covering all villages and towns of the Vysočina Region in 2007. Survey results were tested for representativeness, statistically grouped, and arranged in contingency tables. For those aspects where the community representatives perceived a degree of deterioration, a detailed statistical analysis was carried out. For the groups of villages with up to 199 and up to 499 inhabitants, significance tests were carried out first, and the degree of dependence was measured by the Cramér coefficient. Statistical significance was the criterion for deeper analysis. Sign sketches for the 0.1%, 1%, and 5% significance levels were prepared for all the contingency tables. The answers of the conjunctural survey were reduced to a binary statistical variable, and the association between village size and the problem areas of community development was studied further. Probabilities of the separate variants were stated, and the risks and chances of the smallest villages being threatened, as compared with the larger ones, were evaluated. The research results are presented in the form of risk probabilities, both relative and absolute, using less commonly applied risk measures for two-way contingency tables. The computations are discussed and offer a new perspective on perceived increased risk and improved chances by means of qualitative statistical attributes. The solution is not only of practical importance; it also offers a generally applicable methodological guide for detailed analyses in empirical research on qualitative phenomena.
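The degree of dependence in the two-way contingency tables was measured by the Cramér coefficient, which follows directly from the χ² statistic. A stdlib-only sketch, with a hypothetical table of village size class versus a reported problem area:

```python
import math

# Cramér's V = sqrt(chi2 / (n * (min(r, c) - 1))) for an r x c table.
# The table is invented: rows are village size classes, columns are
# "problem reported" yes/no.
def cramers_v(table):
    n = sum(sum(row) for row in table)
    row_tot = [sum(row) for row in table]
    col_tot = [sum(col) for col in zip(*table)]
    chi2 = sum((table[i][j] - row_tot[i] * col_tot[j] / n) ** 2
               / (row_tot[i] * col_tot[j] / n)
               for i in range(len(table)) for j in range(len(col_tot)))
    k = min(len(table), len(col_tot)) - 1
    return math.sqrt(chi2 / (n * k))

table = [[30, 10],   # villages up to 199 inhabitants: yes / no
         [20, 40]]   # villages 200-499 inhabitants: yes / no
print(f"V = {cramers_v(table):.3f}")
```

V ranges from 0 (independence) to 1 (complete association), which makes it comparable across tables of different sizes.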

