PSXIV-22 Selection for low fecal egg count estimated breeding value correlates with greater circulating antibody in sheep

2021 ◽  
Vol 99 (Supplement_3) ◽  
pp. 494-494
Author(s):  
Desirae Smith ◽  
Kelsey Bentley ◽  
Scott A Bowdridge

Abstract Sheep selected for resistance to gastrointestinal parasites have been shown to have greater survivability to weaning. Data from Katahdin sheep indicate that selection based on post-weaning fecal egg count estimated breeding values (PWFEC EBV) may further improve generalized immunity. However, no data exist to confirm that this increased circulating antibody occurs in breeds genetically unrelated to Katahdins. In the fall of 2020, post-weaning blood and fecal samples were collected from Shropshire sheep (n = 42) and Polypay sheep (n = 91). Blood samples were analyzed for total immunoglobulin-G (IgG) using ELISA, and fecal egg counts (FEC) were determined via a modified McMaster's method. Shropshire sheep were sorted into low (PWFEC EBV < 0) and high (PWFEC EBV > 0) groups. Polypay sheep were sorted into three groups by PWFEC EBV: A (< -50), B (-50 to +50) and C (> +50). In the Shropshire group, individuals in the low group had a numerically greater average IgG concentration (87.9 µg/mL) than those in the high group (62.4 µg/mL) (P > 0.05). In the Polypay group, sheep in PWFEC EBV group A had a numerically higher IgG concentration (86.2 µg/mL) than sheep in group B (71.2 µg/mL) and group C (53.1 µg/mL) (P > 0.05). Although the differences in neither breed were significant, the trend observed across breeds indicates that sheep with a lower PWFEC EBV have numerically greater circulating antibody.
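As a minimal sketch of the grouping scheme described above: the Polypay flock was binned by PWFEC EBV and mean IgG compared across bins. The code below uses the stated thresholds, but the animal records are hypothetical, for illustration only (the abstract reports only group means).

```python
# Group sheep by post-weaning fecal egg count EBV (PWFEC EBV) using the
# thresholds from the abstract: A (< -50), B (-50 to +50), C (> +50),
# then average IgG concentration (µg/mL) within each group.
# All records below are hypothetical.

def pwfec_group(ebv):
    """Assign a sheep to EBV group A, B, or C by the stated cutoffs."""
    if ebv < -50:
        return "A"
    elif ebv <= 50:
        return "B"
    return "C"

def mean_igg_by_group(records):
    """Mean IgG concentration within each EBV group.

    records: iterable of (pwfec_ebv, igg_ug_per_ml) pairs.
    """
    by_group = {}
    for ebv, igg in records:
        by_group.setdefault(pwfec_group(ebv), []).append(igg)
    return {g: sum(v) / len(v) for g, v in by_group.items()}

# Hypothetical (EBV, IgG) pairs:
flock = [(-80, 90.0), (-60, 82.0), (0, 70.0), (20, 73.0), (75, 55.0)]
print(mean_igg_by_group(flock))
```

A real analysis would of course test these group means formally (the abstract reports P > 0.05); this sketch only reproduces the binning arithmetic.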

2019 ◽  
Vol 6 (6) ◽  
pp. 1931
Author(s):  
Nimesh B. Thakkar ◽  
Pranav Patel ◽  
Gautam Sonagra

Background: The present study evaluated the use of electrocautery to incise the skin, assessing its advantages and disadvantages compared with the scalpel. The results of electrocautery use on skin wounds were then assessed to formulate criteria for proper case selection for this procedure. Methods: A total of 100 patients were included in this study. Fifty patients underwent incision in electrocautery monopolar mode (group A) and were compared with 50 scalpel-incision patients (group B). The study was conducted from 01 January 2016 to 30 September 2017. Variables examined were complications such as pain, lack of apposition, skin infection at the site of incision, sinus formation and induration. The method was also evaluated with respect to the following parameters: days of hospitalization, cosmetic result, rate of infection, wound apposition and requirement of secondary suturing. Results: Our results favour electrocautery with respect to haemostasis. However, the infection rate and complications were higher with electrocautery, and the number of dressings required and the hospital stay were also greater in patients undergoing skin incision with electrocautery. Conclusions: Where the overlying skin is healthy, with no compromise of vascularity, no oedema and little fat, electrocautery can still be recommended for skin incision for a better cosmetic result, shorter healing time, fewer complications and more rapid surgery.


2020 ◽  
Vol 98 (Supplement_2) ◽  
pp. 71-71
Author(s):  
Andrew R Weaver ◽  
Joan M Burke ◽  
Jim Morgan ◽  
Donald L Wright ◽  
Scott P Greiner ◽  
...  

Abstract Selection for reduced fecal egg count (FEC) is an important management tool to combat anthelmintic resistance in worm populations. To understand the consequences of selection for parasite resistance, a divergent mating scheme was established whereby Katahdin rams with high (HiFEC; n = 4) or low (LoFEC; n = 4) fecal egg count estimated breeding values (EBV) were randomly mated to ewes at the Southwest Virginia AREC. Mid-March-born lambs (n = 199) were managed as one group until approximately 120 days of age (weaning: June 4). Beginning at 45 days of age, BW and FAMACHA scores were collected weekly. Deworming occurred as necessary (FAMACHA ≥ 3) and fecal egg counts were taken biweekly. Statistical analysis was performed using SAS with the fixed effect of sire EBV type. Fecal egg count was significantly higher in HiFEC-sired lambs one week prior to weaning (665 vs. 427 eggs/g, P < 0.05) and again at three (3398 vs. 2175 eggs/g, P < 0.01) and five (3596 vs. 2209 eggs/g, P < 0.01) weeks post-weaning. These FEC differences corresponded to differences in lamb weaning and post-weaning FEC EBV (43% vs. -43%, P < 0.05 and 82% vs. -66%, P < 0.01, respectively). Lambs sired by HiFEC rams required more anthelmintic treatment than LoFEC-selected lambs (76% vs. 61%, P < 0.05). Among HiFEC-sired lambs, the weaning and post-weaning FEC EBV were greater in lambs that died than in those that survived to 120 d (37% vs. 83%, P < 0.01 and 73% vs. 138%, P < 0.01, survivors vs. mortalities, respectively); however, this phenomenon did not occur in LoFEC-sired lambs. Taken together, these data indicate that LoFEC-sired lambs have reduced parasite burden and are more likely to survive to weaning. Thus, sire selection for low FEC EBV can have indirect effects on lamb survival and general immunity.


2019 ◽  
Vol 7 (1) ◽  
pp. 14
Author(s):  
Prodip Kumar Halder ◽  
Biplob Kumar Sarker ◽  
Md. Shah Alam ◽  
Jannatun Nime ◽  
Md. Tareq Mussa ◽  
...  

Background: Parasitic diseases constitute 60-70% of diseases affecting animals and have serious economic implications for livestock enterprises through direct and indirect production losses. Indiscriminate use of anthelmintic drugs has made the situation even more precarious. A similar problem was encountered in goats from Holidhani, Jhenidah, where goats with complaints of intermittent diarrhea and loss of body condition were reported despite routine deworming. Objective: To determine the efficacy of the conventional anthelmintics in use and to compare them with some unexploited antiparasitic drugs. Methods: Sixty-five goats were divided into five groups. Group A goats were kept as the control; group B (subgroups I, II, III), group C (IV, V, VI), group D (VII, VIII, IX) and group E (X, XI, XII) goats were treated with levamisole, albendazole, fenbendazole and ivermectin, respectively. All treated and control goats were housed for 21 days after the first treatment. Fecal samples were collected and egg counts performed on the 1st, 7th, 14th and 21st day using the McMaster counting method. Results: Among the three doses tested for each of levamisole, albendazole, fenbendazole and ivermectin, doses of 7.5, 7.5, 5.0 and 0.2 mg/kg body weight, respectively, were found to be the most effective against gastrointestinal nematodes in goats, with maximum fecal egg count reductions of 95.38, 97.13, 98.08 and 99.16 percent, respectively. Conclusion: The study revealed low efficacy of levamisole, and ivermectin proved a better drug than albendazole and fenbendazole for controlling gastrointestinal nematodes in goats.
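The percent reductions reported above follow the standard fecal egg count reduction (FECR) arithmetic applied to McMaster counts. A minimal sketch follows; the 50 eggs/g-per-egg multiplication factor is a common McMaster convention and is an assumption here, as the abstract does not state the dilution used, and the counts are hypothetical.

```python
# Fecal egg count reduction (FECR) after anthelmintic treatment.
# EPG (eggs per gram of feces) from a McMaster chamber is the number of
# eggs counted times a dilution-dependent factor; 50 eggs/g per egg
# counted is a common convention and is assumed here.

MCMASTER_FACTOR = 50  # eggs/g per egg counted (assumption)

def epg(eggs_counted, factor=MCMASTER_FACTOR):
    """Convert a McMaster chamber count to eggs per gram of feces."""
    return eggs_counted * factor

def fecr_percent(pre_epg, post_epg):
    """Percent reduction in fecal egg count: 100 * (1 - post/pre)."""
    return 100.0 * (1.0 - post_epg / pre_epg)

# Hypothetical counts: 40 eggs before treatment, 1 egg after.
pre, post = epg(40), epg(1)
print(round(fecr_percent(pre, post), 2))  # 97.5
```

A reduction of roughly 95% or more is conventionally read as the drug remaining effective, which is the scale on which the 95.38-99.16% figures above are compared.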


2018 ◽  
Author(s):  
Rosanna Coates-Brown ◽  
Josephine Moran ◽  
Pisut Pongchaikul ◽  
Alistair Darby ◽  
Malcolm J. Horsburgh

Abstract The bacterial genus Staphylococcus comprises diverse species, most of which are described as colonizers of human and animal skin. A relational analysis of the features that discriminate its species and contribute to niche adaptation and survival remains to be fully described. In this study, an interspecies, whole-genome comparative analysis of 21 Staphylococcus species was performed based on their orthologues. Three well-defined multi-species groups were identified: group A (including aureus/epidermidis), group B (including saprophyticus/xylosus) and group C (including pseudintermedius/delphini). The machine learning algorithm Random Forest was applied to identify the variable orthologues that drive formation of the Staphylococcus species groups A-C. Orthologues driving staphylococcal infrageneric diversity comprised regulatory, metabolic and antimicrobial resistance proteins. Notably, the BraSR (NsaRS) two-component system (TCS) and its associated BraDE transporters, which regulate antimicrobial resistance, distinguish group A Staphylococcus species from others in the genus that lack the BraSR TCS. Divergence of the BraSR and GraSR antimicrobial peptide survival TCS and their associated transporters was observed across the staphylococci, likely reflecting niche-specific evolution of these TCS/transporters and their specificities for AMPs. Experimental evolution, with selection for resistance to the lantibiotic nisin, revealed multiple routes to resistance and differences in the selection outcomes of the BraSR-positive species S. hominis and S. aureus. Selection supported a role for GraSR in the nisin survival responses of the BraSR-negative group B species S. saprophyticus.
Our study reveals diversification of antimicrobial-sensing TCS across the staphylococci and hints at differential relationships between GraSR and BraSR in those species positive for both TCS. Importance: The genus Staphylococcus includes species that are commensals and opportunist pathogens of humans and animals. Identifying the features that discriminate species of staphylococci is relevant to understanding niche selection and the structure of their microbiomes. Moreover, the determinants that structure the community are relevant for strategies to modify the frequency of individual species associated with dysbiosis and disease. In this study, we identify orthologous proteins that discriminate the genomes of staphylococci, in particular the restriction of a major antimicrobial survival system, BraSR (NsaRS), to a group of staphylococci dominated by species that can colonize human skin. The diversity of antimicrobial-sensing loci was revealed by comparative analysis, and experimental evolution with selection for nisin resistance identified the potential for variation in antimicrobial sensing in BraSR-encoding staphylococci. This study provides insights into staphylococcal species diversity.


2020 ◽  
Vol 98 (Supplement_2) ◽  
pp. 70-70
Author(s):  
Kelsey Bentley ◽  
Andrew R Weaver ◽  
Joan M Burke ◽  
Jim Morgan ◽  
Lee Wright ◽  
...  

Abstract Over the past two years of randomly mating high (Hi) or low (Lo) fecal egg count (FEC) EBV Katahdin rams to Katahdin ewes, reduced FEC in lambs has been the hallmark trait observable at weaning. Upon additional analysis, death loss in lambs also segregated with sire FEC EBV: HiFEC-sired lambs had a death loss of 29.9% in 2018 and 14.5% in 2019, whereas LoFEC-sired lambs had a death loss of 10% in both years. The increased death loss in 2018 may have been due to an outbreak of Clostridium perfringens type A, and as a result lambs in 2019 were vaccinated against Clostridium type A. Regardless, this had no impact on death loss in LoFEC-sired lambs. Therefore, it can be hypothesized that sire FEC EBV indirectly selects for enhanced, generalized immunity. To initially test this hypothesis, serum was collected from HiFEC- and LoFEC-sired lambs weekly, prior to and after typical clostridium toxoid vaccination and boosting. Lamb serum was pooled by week and within sire, with 4 sires per FEC EBV group. Serum was analyzed for total immunoglobulin-G (IgG) using absorbance at 450 nm as the metric. Data were analyzed using the general linear model of SAS with fixed effects of sire EBV type and week. A comparison of means was conducted using the LS means procedure with Bonferroni adjustment. Absorbance of serum from LoFEC-sired lambs was higher across all time points than that of serum from HiFEC-sired lambs (1.66 vs. 1.41 ± 0.04; P < 0.0001), meaning that LoFEC-sired lambs had higher circulating IgG than lambs sired by HiFEC rams. Taken together, these data provide preliminary evidence of segregation of lamb generalized immunity by sire FEC EBV.


Author(s):  
Taber A. Ba-Omar ◽  
Philip F. Prentis

We have recently carried out a study of spermiogenic differentiation in two geographically isolated populations of Aphanius dispar (a freshwater teleost), with a view to ascertaining variation at the ultrastructural level. The sampling areas were the Jebel Al Akhdar in the north (Group A) and the Dhofar region in the south (Group B). Specimens from each group were collected and the testes removed, fixed in Karnovsky solution, post-fixed in OsO4, stained en bloc with uranyl acetate and then routinely processed into Agar 100 resin; semithin and ultrathin sections were prepared for study.


VASA ◽  
2015 ◽  
Vol 44 (3) ◽  
pp. 0220-0228 ◽  
Author(s):  
Marion Vircoulon ◽  
Carine Boulon ◽  
Ileana Desormais ◽  
Philippe Lacroix ◽  
Victor Aboyans ◽  
...  

Background: We compared one-year amputation and survival rates in patients fulfilling the 1991 European consensus (EC) definition of critical limb ischaemia (CLI) with those classified as CLI by the TASC II definition but not the EC definition. Patients and methods: Patients were selected from the COPART cohort of hospitalized patients with peripheral occlusive arterial disease suffering from lower extremity rest pain or ulcer who completed one-year follow-up. Ankle and toe systolic pressures and transcutaneous oxygen pressure were measured. The patients were classified into two groups: those who could benefit from revascularization and those who could not (medical group). Within these groups, patients were separated into those who had CLI according to the European consensus definition (EC + TASC II: group A if revascularization, group C if medical treatment) and those who had no CLI by the European definition but had CLI according to the TASC II definition (TASC: group B if revascularization, group D if medical treatment). Results: 471 patients were included in the study (236 in the surgical group, 235 in the medical group). There was no difference according to the CLI definition in survival or cardiovascular event-free survival. However, major amputations were more frequent in group A than in group B (25 vs 12 %, p = 0.046) and in group C than in group D (38 vs 20 %, p = 0.004). Conclusions: Major amputation is twice as frequent in patients with CLI according to the historical European consensus definition as in those classified by the TASC II definition but not the EC definition. Caution is required when comparing results of recent series with historical controls. The TASC II definition of CLI is too broad for comparing patients in clinical trials, so we suggest separating these patients into two different stages: permanent ischaemia (TASC II but not EC definition) and critical ischaemia (TASC II and EC definition).


VASA ◽  
2015 ◽  
Vol 44 (6) ◽  
pp. 451-457 ◽  
Author(s):  
Vincenzo Gasbarro ◽  
Luca Traina ◽  
Francesco Mascoli ◽  
Vincenzo Coscia ◽  
Gianluca Buffone ◽  
...  

Abstract. Background: Absorbable sutures are not generally accepted by most vascular surgeons for fear of breakage of the suture line and the risk of aneurysm formation, except in paediatric surgery or in cases of infection. The aim of this study is to provide evidence of the safety and efficacy of absorbable suture materials in carotid surgery. Patients and methods: Over an 11-year period, 1126 patients (659 male [58.5 %], 467 female [41.5 %], median age 72) underwent carotid endarterectomy for carotid stenosis by either the conventional technique with primary closure (cCEA) or the eversion technique (eCEA). Patients were randomised into two groups according to the type of suture material used: in group A, absorbable suture material (polyglycolic acid) was used, and in group B, non-absorbable suture material (polypropylene). The primary end-point was to compare severe restenosis and aneurysm formation rates between the two groups. For statistical analysis, only cases with a minimum follow-up of 12 months were considered. Results: A total of 868 surgical procedures were considered for data analysis. Median follow-up was 6 years (range 1-10 years). The rate of postoperative complications was lower in group A for both cCEA and eCEA procedures: 3.5 % and 2.0 % for group A, respectively, versus 11.8 % and 12.9 % for group B. Conclusions: In carotid surgery, the use of absorbable suture material appears safe and effective, with a lower overall complication rate than non-absorbable materials.


Phlebologie ◽  
2009 ◽  
Vol 38 (04) ◽  
pp. 157-163 ◽  
Author(s):  
A. Franek ◽  
L. Brzezinska-Wcislo ◽  
E. Blaszczak ◽  
A. Polak ◽  
J. Taradaj

Summary: A prospective randomized clinical trial was undertaken to compare medical compression stockings with two-layer short-stretch bandaging in the management of venous leg ulcers. Study endpoints were the number of completely healed wounds and the clinical parameters predicting the outcome. Patients, methods: Eighty patients with venous leg ulcers were included in this study and allocated into two comparative groups. Group A consisted of 40 patients (25 women, 15 men) treated with compression stockings (25–32 mmHg) and drug therapy. Group B consisted of 40 patients (22 women, 18 men) treated with short-stretch bandages (30–40 mmHg) and drug therapy administered identically as in group A. Results: Within two months, 15/40 (37.50%) patients in group A and 5/40 (12.50%) in group B were healed completely (p = 0.01). For patients with isolated superficial reflux, the healing rates at two months were 45.45% (10/22 healed) in group A and 18.18% (4/22 healed) in group B (p = 0.01). For patients with superficial plus deep reflux, the healing rates were 27.77% (5/18 healed) in group A and 5.55% (1/18 healed) in group B (p = 0.002). Comparison of the relative changes in total surface area (61.55% in group A vs. 23.66% in group B), length (41.67% vs. 27.99%), width (46.16% vs. 29.33%) and volume (82.03% vs. 40.01%) demonstrated a difference (p = 0.002 in all comparisons) in favour of group A. Conclusion: Medical compression stockings are an extremely useful therapy for enhancing venous leg ulcer healing, both in patients with isolated superficial reflux and in those with superficial plus deep reflux. Bandages are less effective, especially in patients with superficial plus deep reflux, where their efficacy was dramatically lower than that of the stockings.
These findings require confirmation in other randomized clinical trials with long term results.


1989 ◽  
Vol 61 (01) ◽  
pp. 140-143 ◽  
Author(s):  
Yoshitaka Mori ◽  
Hideo Wada ◽  
Yutaka Nagano ◽  
Katsumi Deguch ◽  
Toru Kita ◽  
...  

Summary: Blood coagulation in a strain of rabbits designated Watanabe heritable hyperlipidemic (WHHL) rabbits was examined. The activities of vitamin K-dependent clotting factors, contact factors and clotting factor VIII (F VIII), and the fibrinogen level, were significantly higher in WHHL rabbits than in normolipidemic rabbits in all age groups. Values for vitamin K-dependent clotting factors were already higher at 2 months of age. Contact factor and fibrinogen levels increased with age after 5 to 8 months. F VIII increased between 5 and 8 months and then decreased. At 2 months of age, WHHL rabbits were divided into two groups: group A was fed standard rabbit chow and group B standard rabbit chow containing 1% probucol. Probucol prevented the progression of atherosclerosis in group B in the absence of a significant reduction in plasma cholesterol level. F VIII and fibrinogen levels were significantly lower at all ages in group B (P < 0.05), and these differences in clotting factors between the two groups were most obvious at 8 months (P < 0.02). We conclude that vitamin K-dependent clotting factors may increase with hyperlipidemia and that increases in F VIII and fibrinogen may be closely related to the progression of thromboatherosclerosis.

