Hematological and Serum Biochemical Changes and Their Prognostic Value in Horses Spontaneously Poisoned by Crotalaria spectabilis

2022 ◽  
Vol 8 ◽  
Author(s):  
Antonio Carlos Lopes Câmara ◽  
Verônica Lourença de Sousa Argenta ◽  
Daniella Dianese Alves de Moraes ◽  
Eduardo Ferreira Fonseca ◽  
Tayná Cardim Moraes Fino ◽  
...  

Determining the prognosis of poisoning by plants containing pyrrolizidine alkaloids is usually challenging. This study aimed to identify important prognostic parameters that can determine the severity of spontaneous poisoning by Crotalaria spectabilis in horses. Blood samples from 42 horses spontaneously poisoned by oats contaminated with C. spectabilis seeds were evaluated. Complete blood counts (CBC) and serum biochemical tests [urea, creatinine, total protein, albumin, total and direct bilirubin concentrations, aspartate aminotransferase (AST), γ-glutamyl transferase (GGT), and creatine kinase (CK) activities] were performed. Horses were followed up for 12 months to determine the long-term survival rate; after 12 months, they were divided into two groups: survivors (n = 30) and non-survivors (n = 12). Horses spontaneously poisoned with C. spectabilis had higher levels of urea, globulin, bilirubin (total, direct, and indirect), AST, GGT, and CK than the reference values. Non-survivor horses showed significantly higher (p < 0.05) values of hemoglobin, GGT, and direct bilirubin than the survivor horses. Horses with serum GGT activity higher than 95 U/l had 14.0 times the risk of death compared to animals showing activities equal to or lower than this value, whereas horses with serum direct bilirubin concentration higher than 0.6 mg/dl (10.26 μmol/L) had 5.78 times the risk of death compared to the others. In summary, serum GGT activity and direct bilirubin concentration may be useful prognostic indicators for assessing the severity of C. spectabilis poisoning in horses.
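The threshold-based risk figures above (e.g. 14.0 times the risk of death for GGT > 95 U/l) are relative risks: ratios of the death proportion above a cut-off to the death proportion at or below it. A minimal sketch of that computation, using hypothetical counts for illustration only (the abstract does not give the raw 2×2 table):

```python
def relative_risk(exposed_deaths, exposed_total, unexposed_deaths, unexposed_total):
    """Relative risk = risk in the exposed group / risk in the unexposed group."""
    risk_exposed = exposed_deaths / exposed_total
    risk_unexposed = unexposed_deaths / unexposed_total
    return risk_exposed / risk_unexposed

# Hypothetical counts, NOT the study's data:
# suppose 10 of 14 horses above the GGT cut-off died, and 2 of 28 at or below it died.
rr = relative_risk(10, 14, 2, 28)
print(round(rr, 1))  # (10/14) / (2/28) = 10.0
```

The same function applied to the study's actual counts would reproduce its reported ratios; here the inputs are invented to show the arithmetic.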

Author(s):  
Jacob C Jentzer ◽  
Benedikt Schrage ◽  
David R Holmes ◽  
Salim Dabboura ◽  
Nandan S Anavekar ◽  
...  

Abstract Aims Cardiogenic shock (CS) is associated with poor outcomes in older patients, but it remains unclear if this is due to higher shock severity. We sought to determine the associations between age and shock severity on mortality among patients with CS. Methods and results Patients with a diagnosis of CS from Mayo Clinic (2007–15) and University Clinic Hamburg (2009–17) were subdivided by age. Shock severity was graded using the Society for Cardiovascular Angiography and Intervention (SCAI) shock stages. Predictors of 30-day survival were determined using Cox proportional-hazards analysis. We included 1749 patients (934 from Mayo Clinic and 815 from University Clinic Hamburg), with a mean age of 67.6 ± 14.6 years, including 33.6% females. Acute coronary syndrome was the cause of CS in 54.0%. The distribution of SCAI shock stages was: B, 24.1%; C, 28.0%; D, 33.2%; and E, 14.8%. Older patients had similar overall shock severity, more co-morbidities, worse kidney function, and decreased use of mechanical circulatory support compared to younger patients. Overall 30-day survival was 53.3% and progressively decreased as age or SCAI shock stage increased, with a clear gradient towards lower 30-day survival as a function of increasing age and SCAI shock stage. Progressively older age groups had incrementally lower adjusted 30-day survival than patients aged <50 years. Conclusion Older patients with CS have lower short-term survival, despite similar shock severity, with a high risk of death in older patients with more severe shock. Further research is needed to determine the optimal treatment strategies for older CS patients.
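The two-way survival gradient described above (lower 30-day survival with both increasing age and increasing SCAI stage) amounts to stratifying patients by the two factors and computing the survival proportion per stratum. A toy sketch of that tabulation, with invented records rather than the registry data:

```python
from collections import defaultdict

def survival_by_stratum(records):
    """30-day survival proportion per (age group, SCAI stage) stratum.

    records: iterable of (age_group, scai_stage, survived) tuples,
    where survived is 1 if the patient was alive at 30 days, else 0.
    """
    totals = defaultdict(int)
    alive = defaultdict(int)
    for age_group, stage, survived in records:
        key = (age_group, stage)
        totals[key] += 1
        alive[key] += survived
    return {key: alive[key] / totals[key] for key in totals}

# Invented toy records for illustration, NOT the Mayo/Hamburg data:
records = [
    ("<50", "C", 1), ("<50", "C", 1), ("<50", "E", 1), ("<50", "E", 0),
    (">=80", "C", 1), (">=80", "C", 0), (">=80", "E", 0), (">=80", "E", 0),
]
rates = survival_by_stratum(records)
print(rates[("<50", "C")], rates[(">=80", "E")])  # 1.0 0.0
```

Plotted as a grid, such stratum rates give exactly the kind of age-by-stage survival gradient the study reports.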


Author(s):  
Matteo Innocenti ◽  
Francesco Muratori ◽  
Giacomo Mazzei ◽  
Davide Guido ◽  
Filippo Frenos ◽  
...  

Abstract Introduction Burch–Schneider-like antiprotrusio cages (B-SlAC) remain helpful implants to bridge severe periacetabular bone losses. The purpose of this study was to evaluate outcomes and estimate both cage failure and complication risks in a series of B-SlAC implanted in revision of failed total hip arthroplasties (THA) or after resection of periacetabular primary or secondary bone malignancies. Risk factors raising the chance of dislocation and infection were checked. Materials and methods We evaluated 73 patients who received a B-SlAC from January 2008 to January 2018: Group A, 40 oncological cases (22 primary tumors; 18 metastases); Group B, 33 failed THAs. We compared Kaplan–Meier estimates of the risk of failure and complication with the cumulative incidence function, taking into account the competing risk of death. A Cox proportional hazards model was used to identify possible predictors of instability and infection. The Harris hip score (HHS) was used to record clinical outcomes. Results Median follow-up was 80 months (24–137). Average final HHS was 61 (28–92), with no difference between the two groups (p > 0.05). The probabilities of failure and complication were, respectively, 57% and 26% lower in the oncologic group than in the rTHA group (p = 0.176, risk 0.43; p = 0.52, risk 0.74). An extended ileo-femoral approach and proximal femur replacement (p = 0.02, risk ratio = 3.2; p = 0.04, rr = 2.1) were two significant independent predictors of dislocation, while belonging to Group B (p = 0.04, rr = 2.6) was a significant predictor of infection. Conclusion Burch–Schneider-like antiprotrusio cages are a classical non-biological acetabular reconstruction method that surgeons should bear in mind when facing gross periacetabular bone loss, independently of its cause. However, dislocation and infection rates are high.
Whenever possible, we suggest preserving the proximal femur in revision THA and using a less-invasive postero-lateral approach to reduce dislocation rates in non-oncologic cases.


2020 ◽  
Vol 56 (4) ◽  
pp. 376-387
Author(s):  
Lívia Maia Pascoal ◽  
Marcos Venícios de Oliveira Lopes ◽  
Viviane Martins da Silva ◽  
Daniel Bruno Resende Chaves ◽  
Beatriz Amorim Beltrão ◽  
...  

2021 ◽  
Vol 12 (9) ◽  
pp. 79-83
Author(s):  
Noorin Zaidi ◽  
Rasha Zia Usmani ◽  
Kshama Tiwari ◽  
Sumaiya Irfan ◽  
Syed Riaz Mehdi

Background: There is a need to differentiate megaloblastic anemia from mixed deficiency anemia, as the two require different management protocols. As more information about them was acquired, tests such as serum vitamin estimation and the Schilling test were found to have limitations. Hence there is a need to search for newer diagnostic candidates to differentiate between megaloblastic anemia and mixed deficiency anemia. Aims and Objectives: The current study was undertaken to assess the usefulness of serum lactate dehydrogenase (LDH) in differentiating megaloblastic anemia from mixed deficiency anemia. Materials and Methods: 100 patients were included in the study. Blood smears were stained and analysed, and complete blood counts were performed. Bone marrow examination was done where needed. Biochemical tests were performed to estimate vitamin B12, folate, and LDH. Results: Of the 100 cases, 51 were diagnosed as megaloblastic anemia and 49 as mixed deficiency anemia. LDH levels were significantly higher in cases of megaloblastic anemia than in mixed deficiency anemia. Conclusion: Serum LDH levels can be used to differentiate megaloblastic anemia from mixed deficiency anemia.


2004 ◽  
Vol 22 (4) ◽  
pp. 640-647 ◽  
Author(s):  
Gunar K. Zagars ◽  
Matthew T. Ballo ◽  
Andrew K. Lee ◽  
Sara S. Strom

Purpose To determine the incidence of potentially treatment-related mortality in long-term survivors of testicular seminoma treated by orchiectomy and radiation therapy (XRT). Patients and Methods From all 477 men with stage I or II testicular seminoma treated at The University of Texas M.D. Anderson Cancer Center (Houston, TX) with postorchiectomy megavoltage XRT between 1951 and 1999, 453 never sustained relapse of their disease. Long-term survival for these 453 men was evaluated with the person-years method to determine the standardized mortality ratio (SMR). SMRs were calculated for all causes of death, cardiac deaths, and cancer deaths using standard US data for males. Results After a median follow-up of 13.3 years, the 10-, 20-, 30-, and 40-year actuarial survival rates were 93%, 79%, 59%, and 26%, respectively. The all-cause SMR over the entire observation interval was 1.59 (99% CI, 1.21 to 2.04). The SMR was not excessive for the first 15 years of follow-up: SMR, 1.30 (95% CI, 0.93 to 1.77); but beyond 15 years the SMR was 1.85 (99% CI, 1.30 to 2.55). The overall cardiac-specific SMR was 1.61 (95% CI, 1.21 to 2.24). The cardiac SMR was significantly elevated only beyond 15 years (P < .01). The overall cancer-specific SMR was 1.91 (99% CI, 1.14 to 2.98). The cancer SMR was also significant only after 15 years of follow-up (P < .01). An increased mortality was evident in patients treated with and without mediastinal XRT. Conclusion Long-term survivors of seminoma treated with postorchiectomy XRT are at significant excess risk of death as a result of cardiac disease or second cancer. Management strategies that minimize these risks but maintain the excellent hitherto observed cure rates need to be actively pursued.
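The person-years method behind the SMRs above compares the deaths actually observed in the cohort to the number expected if the cohort had died at reference (here, standard US male) rates: SMR = observed / expected, with expected deaths summed over age strata as person-years × stratum rate. A small illustration with hypothetical person-years and reference rates, not the study's data:

```python
def expected_deaths(person_years_by_age, ref_rate_by_age):
    """Expected deaths = sum over age strata of person-years x reference mortality rate."""
    return sum(py * ref_rate_by_age[age] for age, py in person_years_by_age.items())

def smr(observed, person_years_by_age, ref_rate_by_age):
    """Standardized mortality ratio: observed deaths / expected deaths."""
    return observed / expected_deaths(person_years_by_age, ref_rate_by_age)

# Hypothetical cohort follow-up and reference death rates (per person-year):
py = {"40-49": 1000.0, "50-59": 800.0}
ref = {"40-49": 0.004, "50-59": 0.010}

# With 19 observed deaths and 4 + 8 = 12 expected, SMR is about 1.58,
# i.e. a 58% mortality excess relative to the reference population.
print(round(smr(19, py, ref), 2))
```

An SMR confidence interval excluding 1.0, as in the abstract's 1.59 (99% CI 1.21 to 2.04), indicates a statistically significant mortality excess.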


2019 ◽  
Vol 32 (S11) ◽  
pp. 41-46 ◽  
Author(s):  
C. A. Easton‐Jones ◽  
D. D. Cissell ◽  
F. C. Mohr ◽  
M. Chigerwe ◽  
N. Pusterla

2020 ◽  
Vol 20 (1) ◽  
Author(s):  
Tomasz Dziodzio ◽  
Robert Öllinger ◽  
Wenzel Schöning ◽  
Antonia Rothkäppel ◽  
Radoslav Nikolov ◽  
...  

Abstract Background The MELD score and its derivatives are used to objectify and grade the risk of liver-related death in patients with liver cirrhosis. We recently proposed a new predictive model that combines serum creatinine levels and maximum liver function capacity (LiMAx®), namely the CreLiMAx risk score. In this validation study we aimed to reproduce its diagnostic accuracy in patients with end-stage liver disease. Methods Liver function of 113 patients with liver cirrhosis was prospectively investigated. The primary end-point of the study was liver-related death within 12 months of follow-up. Results Alcoholic liver disease was the main cause of liver disease (n = 51; 45%). Within 12 months of follow-up, 11 patients (9.7%) underwent liver transplantation and 17 (15.1%) died (13 deaths related to liver disease, two not). Measures of diagnostic accuracy were comparable for MELD, MELD-Na, and the CreLiMAx risk score in predicting short- and medium-term mortality risk in the overall cohort. AUROCs for liver-related risk of death were: MELD [6 months 0.89 (95% CI 0.80–0.98), p < 0.001; 12 months 0.89 (95% CI 0.81–0.96), p < 0.001]; MELD-Na [6 months 0.93 (95% CI 0.85–1.00), p < 0.001; 12 months 0.89 (95% CI 0.80–0.98), p < 0.001]; CPS [6 months 0.91 (95% CI 0.85–0.97), p < 0.01; 12 months 0.88 (95% CI 0.80–0.96), p < 0.001]; and CreLiMAx score [6 months 0.80 (95% CI 0.67–0.96), p < 0.01; 12 months 0.79 (95% CI 0.64–0.94), p = 0.001]. In a subgroup analysis of patients with Child-Pugh Class B cirrhosis, the CreLiMAx risk score remained the only parameter differing significantly between non-survivors and survivors. Furthermore, in these patients the proposed score had good predictive performance. Conclusion The CreLiMAx risk score appears to be a competitive and valid tool for estimating not only short- but also medium-term survival of patients with end-stage liver disease.
Particularly in patients with Child-Pugh Class B cirrhosis, the new score showed a good ability to identify patients not at risk of death.
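The AUROC values reported above measure discrimination: the probability that the score ranks a randomly chosen non-survivor above a randomly chosen survivor (0.5 = chance, 1.0 = perfect). A self-contained sketch of the rank-based computation on toy data, not the study's scores:

```python
def auroc(scores, labels):
    """AUROC = probability that a randomly chosen positive case (label 1)
    scores higher than a randomly chosen negative case (label 0); ties count 1/2."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy risk scores: a higher score should predict death (label 1).
scores = [0.9, 0.8, 0.7, 0.4, 0.3, 0.2]
labels = [1,   1,   0,   1,   0,   0]
print(auroc(scores, labels))  # 8 of 9 positive-negative pairs ranked correctly
```

This brute-force pairwise form is O(n²) but makes the definition explicit; in practice a sorted-rank (Mann–Whitney U) implementation is used for large cohorts.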


2019 ◽  
Vol 54 (5) ◽  
pp. 1900096 ◽  
Author(s):  
Arnaud Roussel ◽  
Edouard Sage ◽  
Gilbert Massard ◽  
Pascal-Alexandre Thomas ◽  
Yves Castier ◽  
...  

Introduction Since July 2007, the French high emergency lung transplantation (HELT) allocation procedure has prioritised available lung grafts to waiting patients with an imminent risk of death. The relative impacts of donor, recipient and matching on the outcome following HELT remain unknown. We aimed to decipher these relative impacts in an exhaustive administrative database. Methods All lung transplantations performed in France were prospectively registered in an administrative database. We retrospectively reviewed the procedures performed between July 2007 and December 2015, and analysed the impact of donor, recipient and matching on overall survival after the HELT procedure by fitting marginal Cox models. Results During the study period, 2335 patients underwent lung transplantation in 11 French centres. After exclusion of patients with chronic obstructive pulmonary disease/emphysema, 1544 patients were included: 503 HELT and 1041 standard lung transplantation allocations. HELT was associated with a hazard ratio for death of 1.41 (95% CI 1.22–1.64; p < 0.0001) in univariate analysis, decreasing to 1.32 (95% CI 1.10–1.60) after inclusion of recipient characteristics in a multivariate model. A donor score computed to predict long-term survival differed significantly between the HELT and standard lung transplantation groups (p = 0.014). However, adding donor characteristics to recipient characteristics in the multivariate model did not change the hazard ratio associated with HELT. Conclusions This exhaustive French national study suggests that HELT is associated with an adverse outcome compared with regular allocation. This adverse outcome is mainly related to the severity status of the recipients rather than to donor or matching characteristics.


2017 ◽  
Vol 66 (03) ◽  
pp. 240-247 ◽  
Author(s):  
Hsin-Ling Lee ◽  
Yu-Jen Yang ◽  
Chung-Dann Kan

Background The aim of this study was to compare outcomes and identify factors related to increased mortality after open surgical and endovascular aortic repair (EVAR) of primary mycotic aortic aneurysms complicated by aortoenteric fistula (AEF) or aortobronchial fistula (ABF). Methods Patients with primary mycotic aortic aneurysms complicated by an AEF or ABF treated by open surgery or endovascular repair between January 1993 and January 2014 were retrospectively reviewed. Outcomes were compared between the open surgery and endovascular groups, and a Cox proportional hazards model was used to determine factors associated with mortality. Results A total of 29 patients were included: 14 received open surgery and 15 received endovascular repair. Positive initial bacterial blood culture results included Salmonella spp., oxacillin-resistant Staphylococcus aureus, and Klebsiella pneumoniae. Mortality within 1 month of surgery was higher in the open surgery group than in the endovascular group (43% vs. 7%, p = 0.035). Shock, additional surgery to repair gastrointestinal (GI) or airway pathology, and aneurysm rupture were associated with a higher risk of death. Compared with patients without resection surgery, the adjusted hazard ratio of death within 4 years in patients with resection for GI/bronchial disease was 0.25. Survival within 6 months was better in the endovascular group (p = 0.016). Conclusion The results of this study showed that EVAR/thoracic EVAR (TEVAR) is feasible for the management of infected aortic aneurysms complicated by an AEF or ABF and results in good short-term outcomes. However, EVAR/TEVAR did not benefit long-term survival compared with open surgery.


2001 ◽  
Vol 19 (10) ◽  
pp. 2665-2673 ◽  
Author(s):  
Shinsaku Imashuku ◽  
Kikuko Kuriyama ◽  
Tomoko Teramura ◽  
Eiichi Ishii ◽  
Naoko Kinugawa ◽  
...  

PURPOSE: We sought to identify the clinical variables most critical to successful treatment of Epstein-Barr virus (EBV)–associated hemophagocytic lymphohistiocytosis (HLH). PATIENTS AND METHODS: Among the factors tested were age at diagnosis (< 2 years or ≥ 2 years), time from diagnosis to initiation of treatment with or without etoposide-containing regimens, timing of cyclosporin A (CSA) administration during induction therapy, and the presence or absence of etoposide. RESULTS: By Kaplan-Meier analysis, the overall survival rate for the entire cohort of 47 patients, most of whom had moderately severe to severe disease, was 78.3% ± 6.7% (SE) at 4 years. The probability of long-term survival was significantly higher when etoposide treatment was begun less than 4 weeks from diagnosis (90.2% ± 6.9% v 56.5% ± 12.6% for patients receiving this agent later or not at all; P < .01, log-rank test). Multivariate analysis with the Cox proportional hazards model demonstrated the independent prognostic significance of a short interval from EBV-HLH diagnosis to etoposide administration (relative risk of death for patients lacking this feature, 14.1; 95% confidence interval, 1.16 to 166.7; P = .04). None of the competing variables analyzed had significant predictive strength in the Cox model. However, concomitant use of CSA with etoposide in a subset of patients appears to have prevented serious complications from neutropenia during the first year of treatment. CONCLUSION: We conclude that early administration of etoposide, preferably with CSA, is the treatment of choice for patients with EBV-HLH.
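The Kaplan-Meier survival rates quoted above come from the product-limit estimator: at each observed death time the running survival probability is multiplied by the fraction of at-risk patients who survived that time, and censored patients simply leave the risk set. A compact sketch on an invented toy cohort (not the study's 47 patients):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier (product-limit) survival estimate.

    times: follow-up time for each patient.
    events: 1 = death observed, 0 = censored at that time.
    Returns a list of (time, survival probability) at each death time.
    """
    n = len(times)
    order = sorted(range(n), key=lambda i: times[i])
    at_risk, surv, curve, i = n, 1.0, [], 0
    while i < n:
        t = times[order[i]]
        deaths = leaving = 0
        while i < n and times[order[i]] == t:  # group ties at the same time
            deaths += events[order[i]]
            leaving += 1
            i += 1
        if deaths:  # survival only drops at death times, not censoring times
            surv *= (at_risk - deaths) / at_risk
            curve.append((t, surv))
        at_risk -= leaving
    return curve

# Toy cohort (months of follow-up, event flags), illustration only:
times = [2, 3, 3, 5, 8, 12]
events = [1, 1, 0, 1, 0, 0]
print(kaplan_meier(times, events))  # drops at t = 2, 3 and 5
```

The log-rank test and Cox model mentioned in the abstract then compare such curves between treatment groups.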

