How does food supply in rubbish dumps affect the breeding success and offspring mortality of the Cattle Egret Bubulcus ibis?

2021 ◽  
pp. 175815592110660
Author(s):  
Rachida Gherbi-Salmi ◽  
Abdelkrim Si Bachir ◽  
Cherif Ghazi ◽  
Salah Eddine Doumandji

The objective of this study was to evaluate the effect of food supply in garbage dumps on the reproductive fitness and offspring losses of the Cattle Egret Bubulcus ibis. A total of 236 nests were monitored during two distinct 2-year periods: 146 nests during a period without food supply in dumps (1998–1999) and 90 during a period with food supply in dumps (2007–2008). The study was carried out in the colony of El-Kseur in the Lower Soummam Valley (northeast Algeria). Over the entire study period, mean clutch size, mean number of hatched chicks, productivity, and breeding success varied significantly between years (Kruskal–Wallis test: p < .05). The mean losses of eggs, chicks, and total offspring also varied significantly (Chi2 test: p < .0001). Clutch size and the number of hatched chicks per nest were highest during the period with food supply in garbage dumps (3.46 ± 0.86 and 2.85 ± 1.11, respectively), compared with the period when Cattle Egrets fed in natural or agricultural habitats (3.04 ± 0.87; 2.54 ± 1.03). However, productivity and breeding success were higher during the period without food supply (2.11 ± 1.16 fledglings/nest and 0.70 ± 0.35, respectively) than during the period with food supply (1.14 ± 0.91; 0.35 ± 0.30). While egg losses were substantially similar between the two study periods, chick mortality (59.9%) and total offspring losses (36.7%) were higher during the period with food supply. Generalized linear mixed model (GLMM) analysis indicated a strong negative effect of food supply in dumps on productivity and a positive effect on chick and total offspring losses (p < .001). Feeding in garbage dumps also had a significant negative effect on breeding success (linear mixed model, LMM: p = .01). However, no significant effects of food supply in dumps were noted on mean clutch size, the mean number of hatched chicks per nest, or egg losses (GLMM, p > .05).
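As a rough sanity check on the figures above, breeding success can be read as fledglings produced per egg laid; dividing mean productivity by mean clutch size approximately reproduces the reported values (the paper's per-nest averaging will not match a ratio of means exactly). A minimal sketch in Python (function and variable names are illustrative, not from the paper):

```python
def breeding_success(fledglings_per_nest: float, clutch_size: float) -> float:
    """Fledglings produced per egg laid (illustrative definition)."""
    return fledglings_per_nest / clutch_size

# Period without food supply in dumps: 2.11 fledglings/nest, clutch size 3.04
no_dump = breeding_success(2.11, 3.04)  # ~0.69, close to the reported 0.70
# Period with food supply in dumps: 1.14 fledglings/nest, clutch size 3.46
dump = breeding_success(1.14, 3.46)     # ~0.33, close to the reported 0.35
print(round(no_dump, 2), round(dump, 2))
```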

2021 ◽  
Vol 50 (4) ◽  
pp. E7
Author(s):  
Arvid Frostell ◽  
Maryam Haghighi ◽  
Jiri Bartek ◽  
Ulrika Sandvik ◽  
Bengt Gustavsson ◽  
...  

OBJECTIVE Isolated nonsyndromic sagittal synostosis (SS) is the most common form of craniosynostosis in children, accounting for approximately 60% of all craniosynostoses. The typical cranial measurement used to define and follow SS is the cephalic index (CI). Several surgical techniques have been suggested, but agreement on type and timing of surgery is lacking. This study aimed to evaluate the authors’ institutional experience of surgically treating SS using a modified subtotal cranial vault remodeling technique in a population-based cohort. Special attention was directed toward the effect of patient age at time of surgery on long-term CI outcome. METHODS A retrospective analysis was conducted on all patients with isolated nonsyndromic SS who were surgically treated from 2003 to 2011. Data from electronic medical records were gathered. Eighty-two patients with SS were identified, 77 fulfilled inclusion criteria, and 72 had sufficient follow-up data and were included. CI during follow-up after surgery was investigated with ANOVA and a linear mixed model. RESULTS In total, 72 patients were analyzed, consisting of 16 females (22%) and 56 males (78%). The mean ± SD age at surgery was 4.1 ± 3.1 months. Blood transfusions were received by 81% of patients (26% intraoperatively, 64% postoperatively, 9% both). The mean ± SD time in the pediatric ICU was 1.1 ± 0.25 days, and the mean ± SD total hospital length of stay was 4.6 ± 2.0 days. No patient required reoperation. The mean ± SD CI increased from 69 ± 3 to 87 ± 5 for patients who underwent surgery before 45 days of age. Surgery resulted in a larger increase in CI for patients who underwent surgery at a younger age compared with older patients (p < 0.05, Tukey’s HSD test). 
In the comparison of patients who underwent surgery before 45 days of age with patients who underwent surgery at 45–90, 90–180, and more than 180 days of age, the linear mixed model estimated a long-term loss of CI of 3.0, 5.5, and 7.4 points, respectively. CONCLUSIONS The modified subtotal cranial vault remodeling technique used in this study significantly improved CI in patients with SS. The best results were achieved when surgery was performed early in life.
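The cephalic index reported above is the standard ratio of maximal head width to maximal head length, expressed as a percentage. A minimal sketch (the example measurements are illustrative, not from the study):

```python
def cephalic_index(head_width_mm: float, head_length_mm: float) -> float:
    """Cephalic index: (maximal head width / maximal head length) * 100."""
    return head_width_mm / head_length_mm * 100

# A long, narrow (scaphocephalic) head gives a low CI:
print(round(cephalic_index(120, 174)))  # 69, like the preoperative mean above
```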


2021 ◽  
Vol 30 (1) ◽  
pp. 29-34
Author(s):  
Hector Nava-Trujillo ◽  
Robert Valeris-Chacin ◽  
Adriana Morgado-Osorio ◽  
Javier Hernández ◽  
Janeth Caamaño ◽  
...  

This study aimed to determine the effect of parity and season of calving on the probability of water buffalo cows becoming pregnant before 90 days postpartum. A retrospective analysis of the reproductive records of 1,465 water buffaloes with 3,181 pregnancies was carried out. Buffaloes were grouped by parity into one, two, or three and more calvings. Season of calving took two values: long photoperiod (March–August) and short photoperiod (September–February). A generalized linear mixed model with random intercepts was fitted to estimate the log odds of becoming pregnant ≤90 days postpartum, and predicted probabilities were calculated from the mixed-effects logistic regression model. The overall probability of pregnancy ≤90 days postpartum was 0.3645, and it was lower in primiparous buffaloes (0.2717) than in those with two calvings (0.3863) or three or more calvings (0.5166). The odds of pregnancy ≤90 days postpartum increased by a factor of 1.77 with each additional parity. The probability of becoming pregnant ≤90 days postpartum was higher in water buffaloes calving during the short photoperiod season (0.4239 vs. 0.2474, P < 0.001), and water buffaloes calving during the long photoperiod season had only 0.2645 times the odds of becoming pregnant compared with those calving during the short photoperiod season. The negative effect of the long photoperiod was observed regardless of parity. In conclusion, primiparity and the long photoperiod impair water buffalo cows' reproductive performance, decreasing the probability of pregnancy during the first 90 days postpartum.
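Results like these move back and forth between probabilities, odds, and odds ratios from the logistic model. A minimal conversion sketch (values from the abstract; the helper names are illustrative, and the final figure is arithmetic only, not a reported result):

```python
def prob_to_odds(p: float) -> float:
    """Convert a probability to odds."""
    return p / (1 - p)

def odds_to_prob(odds: float) -> float:
    """Convert odds back to a probability."""
    return odds / (1 + odds)

# Overall probability of pregnancy <=90 days postpartum:
p_overall = 0.3645
odds_overall = prob_to_odds(p_overall)  # ~0.574
assert abs(odds_to_prob(odds_overall) - p_overall) < 1e-12

# An odds ratio of 1.77 per parity scales the odds, not the probability:
odds_next_parity = odds_overall * 1.77
print(round(odds_to_prob(odds_next_parity), 3))  # ~0.504, illustrative only
```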


2017 ◽  
Vol 35 (1) ◽  
pp. 5-11 ◽  
Author(s):  
Alexander K Leung ◽  
Gaurav Puri ◽  
Bingshu E Chen ◽  
Zhenxian Gong ◽  
Eddie Chan ◽  
...  

Objectives We created Physician Navigators in our ED to help improve emergency physician (EP) productivity. We aimed to quantify the effect of Physician Navigators on measures of EP productivity: patients seen per hour (Pt/hr) and turn-around time (TAT) to discharge. Secondary objectives included examining their impact on measures of ED throughput for non-resuscitative patients: ED length of stay (LOS), door-to-physician time and left-without-being-seen (LWBS) rates. Methods In this retrospective study, 6845 clinical shifts worked by 20 EPs at a community ED in Newmarket, Canada from 1 January 2012 to 31 March 2015 were evaluated. Using a clustered design, we compared productivity measures between shifts with and without Physician Navigators, by physician. We used a linear mixed model to examine mean changes in Pt/hr and TAT to discharge for EPs who employed Physician Navigators. For secondary objectives, autoregressive modelling was performed to compare ED throughput metrics before and after the implementation of Physician Navigators for non-resuscitative patients. Results Patient volumes increased by 20 patients per day (p<0.001). Mean Pt/hr increased by 1.07 patients per hour (0.98 to 1.16, p<0.001). The mean TAT to discharge decreased by 10.6 min (−13.2 to −8.0, p<0.001). After implementation of the Physician Navigator programme, overall mean LOS for non-resuscitative patients decreased by 2.6 min (p=0.007), and mean door-to-physician time decreased by 7.4 min (p<0.001). LWBS rates decreased from 1.13% to 0.63% of daily patient volume (p<0.001). Conclusion Despite an ED volume increase, the use of a Physician Navigator was associated with significant improvements in EP productivity and significant reductions in ED throughput times.


2015 ◽  
Vol 2015 ◽  
pp. 1-5 ◽  
Author(s):  
Alireza Mirshahi ◽  
Peter Raak ◽  
Katharina Ponto ◽  
Bernhard Stoffelns ◽  
Katrin Lorenz ◽  
...  

Purpose. To report one-year results of phacoemulsification combined with deep sclerectomy and goniosynechiolysis ab interno for chronic glaucoma associated with peripheral anterior synechiae (PAS). Methods. We retrospectively analyzed the medical charts of 16 patients (20 eyes) treated by one-site combined phacoemulsification and deep sclerectomy with goniosynechiolysis ab interno. PAS were transected by a spatula introduced into the anterior chamber through a paracentesis. To account for the correlation between right and left eyes, a linear mixed model with an unstructured covariance structure was calculated. Results. The mean preoperative intraocular pressure (IOP) was 20.3 ± 5.2 mmHg with 2.4 ± 1.0 medications. One year postoperatively, the mean IOP was 15.3 ± 3.3 mmHg (P = 0.004, paired t-test) with 0.6 ± 1.0 medications. A postoperative IOP of ≤21 mmHg without medication was achieved in 17 of 19 eyes (89.5%) and in 12/19 eyes (63.2%) at 3 and 12 months after surgery, respectively. In the remaining eyes (10.5% at 3 months and 36.8% at 12 months), additional medication led to an IOP ≤21 mmHg or the target pressure. No case required further glaucoma surgery. In one eye, conversion of the surgery to trabeculectomy was necessary due to rupture of Descemet’s window. Conclusions. With goniosynechiolysis ab interno, effective and safe nonpenetrating glaucoma surgery is possible in the presence of PAS.


2013 ◽  
Vol 59 (3) ◽  
pp. 527-535 ◽  
Author(s):  
Charlotte CM Schaap ◽  
Jan CM Hendriks ◽  
Guus AM Kortman ◽  
Siem M Klaver ◽  
Joyce JC Kroot ◽  
...  

BACKGROUND The iron-regulating hormone hepcidin is a promising biomarker in the diagnosis of iron disorders. Concentrations of hepcidin have been shown to increase during the day in individuals who are following a regular diet. It is currently unknown whether these increases are determined by an innate rhythm or by other factors. We aimed to assess the effect of dietary iron on hepcidin concentrations during the day. METHODS Within a 7-day interval, 32 volunteers received an iron-deficient diet on 1 day and the same diet supplemented with 65 mg ferrous fumarate at 0815 and 1145 on another day. Blood was drawn to assess ferritin, hepcidin-25, and transferrin saturation (TS) throughout both days at 4 time points between 0800 (fasted) and 1600. A linear mixed model for repeated data was used to analyze the effect of iron intake on TS and hepcidin concentrations. RESULTS Baseline values of hepcidin at 0800 correlated significantly with ferritin (r = 0.61). During the day of an iron-deficient diet the mean TS was similar both in men and in women, whereas hepcidin increased. During the day with iron supplementation the mean TS was significantly higher both in men and in women, and the mean hepcidin was moderately but significantly higher in women (1.0 nmol/L, 95% CI, 0.2–1.8) but not in men (0.0 nmol/L, 95% CI, −0.8 to 0.8). CONCLUSIONS Our data demonstrate that ferritin sets the basal hepcidin concentrations and suggest that innate diurnal rhythm rather than dietary iron mediates the daily hepcidin variations. These findings will be useful for optimizing sampling protocols and will facilitate the interpretation of hepcidin as an iron biomarker.
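Transferrin saturation, one of the endpoints above, is conventionally computed as serum iron divided by total iron-binding capacity, expressed as a percentage. A minimal sketch (the example values are illustrative, not from the study):

```python
def transferrin_saturation(serum_iron: float, tibc: float) -> float:
    """TS (%) = serum iron / total iron-binding capacity * 100.

    Both inputs must be in the same units (e.g. ug/dL or umol/L).
    """
    return serum_iron / tibc * 100

print(round(transferrin_saturation(100, 300), 1))  # 33.3
```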


2021 ◽  
Author(s):  
Jørn Henrik Vold ◽  
Fatemeh Chalabianloo ◽  
Christer F. Aas ◽  
Else-Marie Løberg ◽  
Kjell Arne Johansson ◽  
...  

Abstract Background Continuous use of amphetamines, alcohol, benzodiazepines, cannabis, cocaine, or opioids contributes to health impairments, increased morbidity, and overdose deaths among patients with substance use disorders (SUDs). This study evaluates the impact of inpatient detoxification, specialized opioid agonist therapy (OAT), and low-threshold municipality care on substance use over time. Methods We used data from a cohort of SUD patients in Norway, drawn from health assessments of self-reported substance use and sociodemographic and clinical factors. A total of 881 substance use measurements, including type and amount of substances, were assessed from 708 SUD patients in 2016–2020. Substance use for individual and total substances was calculated, creating a substance use severity index (SUSI) ranging from zero (no use) to one (daily use). We defined baseline as the first substance use measurement when the measurements were listed chronologically, and time as years from baseline. We used a linear mixed model to analyze associations between the SUSI and inpatient detoxification and specialized OAT compared with low-threshold municipality care, as well as factors such as injecting substance use, gender, and age, presented with coefficients and 95% confidence intervals (CI). Results Neither inpatient detoxification (mean SUSI change: 0.01; 95% CI −0.03 to 0.04) nor specialized OAT (0.03; −0.09 to 0.14) was associated with changes in substance use over time compared with low-threshold municipality care. Patients over 60 years of age (mean SUSI difference: −0.06; −0.13 to 0.00) had a lower SUSI than those under 30 years of age, while patients who injected substances had a higher SUSI at baseline than those who did not (0.18; 0.15 to 0.20).
At baseline, the mean SUSI for the individual substances was 0.50 (standard deviation (SD): 0.38) for cannabis, 0.40 (0.37) for benzodiazepines, 0.33 (0.34) for amphetamines and cocaine, 0.31 (0.29) for alcohol, and 0.22 (0.31) for opioids; the mean SUSI across all substances was 0.35 (0.20). Conclusion The present study demonstrates that neither inpatient detoxification nor specialized OAT, compared with low-threshold municipality care, was associated with changes in substance use over time. Future research needs to evaluate the impact of multiple healthcare interventions on substance use and healthy survival in this patient group.
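The abstract does not give the exact construction of the SUSI, but an index of this kind can be sketched as the mean of per-substance use frequencies, each scaled from 0 (no use) to 1 (daily use). A hypothetical illustration only; the scaling, recall period, and substance list are assumptions, not the paper's definition:

```python
def susi(use_days, period_days=30):
    """Hypothetical severity index: mean fraction of days each substance
    was used over the recall period (0 = no use, 1 = daily use)."""
    freqs = [min(days, period_days) / period_days for days in use_days.values()]
    return sum(freqs) / len(freqs)

# A patient using cannabis daily and alcohol on 6 of the last 30 days:
print(round(susi({"cannabis": 30, "alcohol": 6, "opioids": 0}), 2))  # 0.4
```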


2020 ◽  
Vol 98 (Supplement_3) ◽  
pp. 211-211
Author(s):  
Jae-Cheol Jang ◽  
Zhikai Zeng ◽  
Pedro E Urriola ◽  
Gerald C Shurson

Abstract The objective of this study was to conduct a meta-analysis to quantitatively summarize the growth responses of broilers fed cDDGS and the efficacy of various types of dietary enzyme supplementation. A total of 12 publications with 69 observations were included in the database. Individual observations were analyzed using a multivariable linear mixed model. The mean differences (MD) in BWG, FI, and gain efficiency (G/F) were calculated by comparing the enzyme response in corn–soybean meal (CSB) or CSB+cDDGS based diets with the control, expressed as a percentage (MD = (enzyme − control)/control × 100%). Type of exogenous enzyme (xylanase; protease; carbohydrases; cocktail = proteases + carbohydrases) and feeding phase (starter = d 0 to 21; finisher = d 21 to 42 or 49; overall = d 0 to 42 or more) were included as fixed effects. Dietary enzyme inclusion significantly improved BWG (3.19%, P < 0.01) and G/F (5.69%, P < 0.01) in broilers fed the cDDGS diet. However, no significant enzyme responses on growth performance were observed in broilers fed the CSB diet. Broilers fed the cDDGS diet had increased (P < 0.01) BWG with the addition of protease (3.32%) and cocktail (3.27%), whereas addition of xylanase (3.56%) and carbohydrases (1.90%) improved (P < 0.01) G/F. Broilers fed the cDDGS diet with enzyme supplementation showed greater improvement in BWG (3.71%, P < 0.01) and G/F (3.78%, P < 0.01) in the finisher phase than in the starter phase. Likewise, broilers fed the CSB diet with enzyme supplementation had increased BWG (9.40%, P < 0.01) and G/F (3.11%, P < 0.01) in the finisher phase. In conclusion, supplementation of xylanase and carbohydrases in the cDDGS diet improved G/F, and the enzyme response was maximized when enzymes were fed in the finisher phase rather than the starter phase.
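The mean-difference formula above is a simple relative change. As a sketch (the example figures are illustrative, not from the meta-analysis):

```python
def mean_difference_pct(enzyme: float, control: float) -> float:
    """MD = (enzyme - control) / control * 100%."""
    return (enzyme - control) / control * 100

# A control BWG of 1000 g vs 1032 g with enzyme gives a 3.2% response:
print(round(mean_difference_pct(1032, 1000), 2))  # 3.2
```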


Author(s):  
Bruce Walsh ◽  
Michael Lynch

The selection intensity, the mean change in a trait within a generation expressed in phenotypic standard deviations, provides an important metric for comparing the strength of selection across designs. Further, under truncation selection (only individuals above some threshold leave offspring), the selection intensity is a function of the fraction saved, and hence the breeder's equation is often expressed in terms of the selection intensity. An important special case of truncation selection is a threshold trait, wherein an individual only expresses a particular phenotype when its underlying liability value exceeds some threshold. This chapter examines selection on such traits, and generalizes this binary-trait setting (with binomial residuals) to other classes of discrete traits, wherein some underlying linear model (generating the threshold) is then transformed via a generalized linear mixed model into an observed trait value.
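Under truncation selection on a normally distributed trait, the selection intensity has the closed form i = φ(z_p)/p, where p is the fraction saved and z_p is the truncation point satisfying 1 − Φ(z_p) = p. A minimal sketch using bisection to invert the normal CDF (a standard result, not code from the chapter):

```python
import math

def std_normal_cdf(x: float) -> float:
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def truncation_point(p: float) -> float:
    """Find z with 1 - Phi(z) = p by bisection."""
    lo, hi = -10.0, 10.0
    for _ in range(100):
        mid = (lo + hi) / 2.0
        if 1.0 - std_normal_cdf(mid) > p:
            lo = mid  # tail still too large: threshold must move right
        else:
            hi = mid
    return (lo + hi) / 2.0

def selection_intensity(p: float) -> float:
    """i = phi(z_p) / p for a fraction p saved under truncation selection."""
    z = truncation_point(p)
    phi = math.exp(-z * z / 2.0) / math.sqrt(2.0 * math.pi)
    return phi / p

# Saving the top 20% gives the classic tabulated value i ~ 1.40:
print(round(selection_intensity(0.20), 2))  # 1.4
```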


2020 ◽  
Vol 32 (2) ◽  
pp. 192
Author(s):  
R. G. Droher ◽  
F. Morotti ◽  
A. Guidugli Lindquist ◽  
A. Fonseca Zangirolamo ◽  
M. Marcondes Seneda

The antral follicle count (AFC) has been identified as one of the main factors influencing the efficiency of reproductive biotechniques. AFC is thus considered a fertility marker, with high repeatability in the same animal. The objective of this study was to evaluate AFC in cows at AI (nonpregnant) and at two moments of gestation (30 and 60 days). For this, 35 pregnant cows, aged 23 to 99 months, confined in a freestall system, kept under the same environmental conditions (24°46'28”S, 49°56'42”W) and the same feeding conditions, were selected. The same technician performed AI on the females after detection of natural oestrus or oestrus induced with 25 mg of dinoprost (Lutalyse) administered intramuscularly (IM). We established AFC (antral follicles ≥3 mm in diameter) using a convex intravaginal transducer at the time of AI and at 30 and 60 days of pregnancy. Females were distributed into groups of low (≤18 follicles, n=11), intermediate (≥24 and ≤28 follicles, n=11), or high AFC (≥30 follicles, n=13). Data were analysed using a generalized linear mixed model (Minitab version 18.1), with significance declared at P ≤ 0.05. In the low AFC group, the mean number of antral follicles increased as gestation advanced, from 14.82±1.36 at the time of AI to 23.45±2.31 and 35.18±3.17 at 30 and 60 days of pregnancy, respectively (P<0.0001). Similarly, the intermediate group presented an increase in mean AFC, from 24.64±0.76 at the time of AI and 29.27±3.32 at 30 days to 43.36±5.91 at 60 days of gestation (P=0.001). There was no difference in the high AFC group between moments: 41.92±3.50 at the time of AI, 38.31±3.96 at 30 days, and 50.92±5.49 at 60 days of pregnancy. Although mean AFCs differed (P<0.0001) between the low, intermediate, and high AFC groups at the time of AI and at 30 days of gestation, there were no differences in AFC between the three groups at 60 days of gestation (P=0.329).
In conclusion, low-AFC females presented an increase in the number of antral follicles at 30 and 60 days of gestation relative to AI. This finding supports the strategy of using this category more efficiently for ovum pickup at the beginning of gestation.
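The grouping thresholds above can be expressed as a small classifier; note that the abstract leaves counts of 19–23 and 29 follicles outside all three groups, so the sketch returns None for those (the function name is illustrative):

```python
def afc_group(follicle_count):
    """Classify a cow by antral follicle count using the study's cutoffs."""
    if follicle_count <= 18:
        return "low"
    if 24 <= follicle_count <= 28:
        return "intermediate"
    if follicle_count >= 30:
        return "high"
    return None  # counts of 19-23 or 29 fall between the defined groups

print(afc_group(15), afc_group(26), afc_group(42), afc_group(20))
# low intermediate high None
```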

