Iatrogenic hyperadrenocorticism in 28 dogs

1999 ◽  
Vol 35 (3) ◽  
pp. 200-207 ◽  
Author(s):  
HP Huang ◽  
HL Yang ◽  
SL Liang ◽  
YH Lien ◽  
KY Chen

Twenty-eight dogs with iatrogenic hyperadrenocorticism were studied. The most common clinical signs were cutaneous lesions (27/28), polydipsia (21/28), polyuria (19/28), and lethargy (16/28). The predominant findings on the biochemical profile were elevated alkaline phosphatase (ALP, 15/28) and alanine aminotransferase (ALT, 14/28); hypercholesterolemia (14/28); elevated aspartate aminotransferase (AST, 12/28); and elevated triglycerides (12/18). Baseline cortisol levels of all 28 dogs were at the lower end of the reference range and showed a suppressed or absent response to adrenocorticotropic hormone (ACTH) stimulation. The mean time to initial improvement of clinical signs after corticosteroid withdrawal was six weeks, with a mean of 12 weeks to complete remission.

2006 ◽  
Vol 33 (3) ◽  
pp. 181 ◽  
Author(s):  
Amber L. Hooke ◽  
Lee Allen ◽  
Luke K.-P. Leung

Sodium cyanide is potentially a more humane poison for controlling wild dogs than sodium fluoroacetate (1080). This study quantified the clinical signs and duration of cyanide toxicosis delivered by the M-44 ejector. The device delivered a nominal 0.88 g of sodium cyanide, which caused the animal to lose the menace reflex in a mean of 43 s; the animal was assumed to have undergone cerebral hypoxia after the last visible breath. The mean time to cerebral hypoxia was 156 s for a vertical pull and 434 s for a side pull; the difference was possibly because some cyanide may be lost in a side pull. There were three distinct phases of cyanide toxicosis: the initial phase was characterised by head shaking, panting and salivation; the immobilisation phase by incontinence, ataxia and loss of the righting reflex; and the cerebral hypoxia phase by a tetanic seizure. Clinical signs exhibited in more than one phase of cyanide toxicosis included retching, agonal breathing, vocalisation, vomiting, altered levels of ocular reflex, leg paddling, tonic muscular spasms, respiratory distress and muscle fasciculations of the muzzle.


2021 ◽  
Vol 8 ◽  
Author(s):  
Elizabeth A. J. Cook ◽  
Tatjana Sitt ◽  
E. Jane Poole ◽  
Gideon Ndambuki ◽  
Stephen Mwaura ◽  
...  

Corridor disease (CD) is a fatal condition of cattle caused by buffalo-derived Theileria parva. Unlike the related condition, East Coast fever, which results from infection with cattle-derived T. parva, CD has not been extensively studied. We describe in detail the clinical and laboratory findings in cattle naturally infected with buffalo-derived T. parva. Forty-six cattle were exposed to buffalo-derived T. parva under field conditions at the Ol Pejeta Conservancy, Kenya, between 2013 and 2018. The first signs of disease observed in all animals were nasal discharge (mean onset 9 days post-exposure), enlarged lymph nodes (10 days post-exposure), and pyrexia (13.7 days post-exposure). Coughing and labored breathing were observed in more than 50% of animals (14 days post-exposure). Less commonly observed signs, corneal edema (22%) and diarrhea (11%), appeared later in the disease progression (19 days post-exposure). All infections were considered clinically severe, and 42 animals succumbed to infection. The mean time to death across all studies was 18.4 days. The mean time from onset of clinical signs to death was 9 days, and from pyrexia to death 4.8 days, indicating a relatively short duration of clinical illness. There were significant relationships between days to death and both days to first pyrexia (chi2 = 4.00, p = 0.046) and days to peak temperature (chi2 = 25.81, p = 0.001); animals with earlier-onset pyrexia died sooner. These clinical indicators may be useful for assessing the severity of disease in the future. All infections were confirmed by the presence of macroschizonts in lymph node biopsies (mean time to parasitosis 11 days). Piroplasms were detected in the blood of two animals (4%), and 20 animals (43%) seroconverted. In this study, we demonstrate a successful approach to an experimental field study of CD in cattle. We also describe the clinical progression of CD in naturally infected cattle, including the onset and severity of clinical signs and pathology. Laboratory diagnoses based on examination of blood samples are unreliable, and alternatives may not be available to cattle keepers. The rapid development of CD requires recognition of the clinical signs, which may be useful for early diagnosis of the disease and effective intervention for affected animals.


2019 ◽  
Vol 9 (3) ◽  
pp. 298-304
Author(s):  
Leslie A Enane ◽  
Kaede V Sullivan ◽  
Evangelos Spyridakis ◽  
Kristen A Feemster

Abstract Background Children who develop malaria after returning to a setting in which the disease is not endemic are at high risk for critical delays in diagnosis and initiation of antimalarial therapy. We assessed the clinical impact of the implementation of malaria rapid diagnostic testing (RDT) on the management of children with malaria at an urban US children’s hospital that serves a large immigrant population. Methods This was a retrospective cohort study of all children diagnosed with laboratory-confirmed malaria at the Children’s Hospital of Philadelphia (CHOP) between 2000 and 2014. RDT using a US Food and Drug Administration–approved immunochromatographic assay was introduced at CHOP on August 1, 2007. We compared clinical management and outcomes of patients with malaria diagnosed before and after RDT introduction. Results We analyzed 82 pediatric malaria cases (32 before and 50 after RDT implementation). The majority of these patients had traveled to West Africa (91.5%) and were infected with Plasmodium falciparum (80.5%). The mean time to a positive result decreased from 10.4 to 0.9 hours (P < .001) after the introduction of RDT for patients with P falciparum. The mean time to antimalarial therapy decreased from 13.1 to 6.9 hours (P = .023) in hospitalized patients. We found no significant reduction in the mean number of clinical signs of severe malaria between 0 and 48 hours of hospitalization and no difference in the need for exchange transfusion, time to resolution of parasitemia, or length of hospital stay. Conclusions Implementation of RDT for malaria was associated with shorter times to malaria diagnosis and initiation of antimalarial therapy. These results support the use of RDT in the optimal management of patients with malaria who present in settings in which the disease is not endemic.


2011 ◽  
Vol 5 (1) ◽  
pp. 77-83 ◽  
Author(s):  
Ngamjit Kasetsuwan ◽  
Pinnita Tanthuvanit ◽  
Usanee Reinprayoon

Abstract Background: Bacterial keratitis is a major devastating ocular condition that quickly deteriorates the patient’s vision. Vigorous and prompt treatment of bacterial keratitis with broad-spectrum antibiotic eye-drops is preferred. Objective: Evaluate the efficacy and safety of 0.5% Levofloxacin for the treatment of suspected and culture-proven cases of infectious bacterial keratitis in comparison to fortified Cefazolin and Amikacin ophthalmic solution. Materials and methods: Seventy-one eyes from 69 patients suspected of having infectious bacterial keratitis were enrolled in the study. The patients were randomized into two arms, 0.5% Levofloxacin eye drops (34 eyes) or fortified Cefazolin and Amikacin (37 eyes). Sixty-eight eyes were included in the efficacy analysis. During treatment, on days 2, 7, 14, and 21, the patient’s symptoms and signs were scored from grade 0-3 (absent to severe). Results: At the end of the treatment, 61 out of 71 eyes completely healed. The resolution of the keratitis was not significantly different between the groups. There were no significant differences in the mean time for the ulcer to heal or for the symptoms and clinical signs to disappear between the two groups. No serious adverse events or side effects from the disease were found. The patients’ compliance was 80% based on the self-reported diaries. Conclusion: The efficacy and safety of 0.5% topical Levofloxacin were comparable to fortified Cefazolin and Amikacin for the treatment of mild-to-moderate bacterial keratitis. Topical Levofloxacin also offers advantages in availability and patient compliance when used as monotherapy for the treatment of infectious bacterial keratitis.


2019 ◽  
Vol 46 (1) ◽  
pp. 89 ◽  
Author(s):  
Paul D. Meek ◽  
Stuart C. Brown ◽  
Jason Wishart ◽  
Heath Milne ◽  
Paul Aylett ◽  
...  

Context Wildlife and pest managers and stakeholders should constantly aim to improve animal-welfare outcomes when foot-hold trapping pest animals. To minimise stress and trauma to trapped animals, traps should be checked at least once every 24 h, normally as soon after sunrise as possible. If distance, time, environmental or geographical constraints prevent this, toxins such as strychnine can be fitted to trap jaws to induce euthanasia. However, strychnine is considered to have undesirable animal-welfare outcomes because animals are conscious while clinical signs of intoxication are present. A toxin considered more humane, para-aminopropiophenone (PAPP), is available to induce euthanasia in trapped animals but is untested for presentation and efficacy. Aim We tested the efficacy of two types of lethal trap device (LTDs), each using a paste formulation of PAPP as the active toxin, to replace the use of strychnine on foot-hold jaw traps. Methods Elastomer LTDs and PAPP-cloths were fitted to jaw traps set to capture wild dogs (Canis familiaris). Camera-trap data were used to record animal behaviours after capture and to determine the efficacy of both modalities. Key results Every trapped wild dog (n=117) gnawed at the elastomer LTDs or PAPP-cloth attached to the trap jaws that restrained them; only one dog failed to liberate the toxin. Of the dogs caught in the main trial (n=56), mortality rates of 84% (elastomer LTDs) and 87% (PAPP-cloths) were recorded. The mean time from trapping to death was 64 min for elastomer LTDs and 68 min for PAPP-cloths. Conclusions Elastomer LTDs and PAPP-cloths combined caused the mortality of 85% of captured dogs. This efficacy could be improved by adopting the recommendations discussed in the present study for deploying PAPP-based LTDs. Implications PAPP-based LTDs offer an alternative to the use of strychnine and improve welfare outcomes for trapped predators, especially where traps are not checked within the recommended 24-h period.


2012 ◽  
Vol 48 (6) ◽  
pp. 417-423 ◽  
Author(s):  
Jennifer L. Frankot ◽  
Ellen N. Behrend ◽  
Peter Sebestyen ◽  
Barbara E. Powers

A 10 yr old bichon frise presented with a 3 mo history of polyuria, polydipsia, and hind limb weakness. Serum biochemistry revealed persistent hypokalemia. A left adrenal gland mass with right adrenal atrophy was detected ultrasonographically. Basal serum cortisol concentration was at the low end of normal (30 nmol/L; reference range, 30–140 nmol/L) and adrenocorticotropic hormone (ACTH)-stimulated cortisol concentration was low (199 nmol/L; reference range, 220–470 nmol/L). Basal serum 17-α-OH progesterone concentration was also low (0.03 ng/mL; reference range, 0.06–0.30 ng/mL), but the aldosterone concentration 2 hr after the ACTH stimulation was elevated (> 3,000 pmol/L; reference range, 197–2,103 pmol/L). A left adrenalectomy and nephrectomy were performed. Histopathology revealed an adrenocortical zona glomerulosa carcinoma. Surgical excision was considered incomplete; however, clinical signs resolved. Two years later, basal and ACTH-stimulated aldosterone concentrations were elevated. Computed tomography demonstrated a mass effect in the liver. The left lateral and left medial hepatic lobes were removed. Histopathology confirmed metastatic endocrine carcinoma. The patient was stable 1,353 days postsurgically (when this report was prepared). This is the first case report of a metastatic adrenal carcinoma that was successfully managed surgically for > 3 yr.


1996 ◽  
Vol 75 (05) ◽  
pp. 731-733 ◽  
Author(s):  
V Cazaux ◽  
B Gauthier ◽  
A Elias ◽  
D Lefebvre ◽  
J Tredez ◽  
...  

Summary Due to large inter-individual variations, the dose of vitamin K antagonist required to achieve the desired hypocoagulability is hardly predictable for a given patient, and the time needed to reach therapeutic equilibrium may be excessively long. This work reports a simple method for predicting the daily maintenance dose of fluindione after the third intake. In a first step, 37 patients received 20 mg of fluindione once a day, at 6 p.m., for 3 consecutive days. On the morning of the 4th day an INR was performed. During the following days the dose was adjusted to target an INR between 2 and 3. There was a good correlation (r = 0.83, p < 0.001) between the INR performed on the morning of day 4 and the daily maintenance dose determined later by successive approximations. This allowed us to construct a decisional algorithm to predict the effective maintenance dose of fluindione from the INR performed on day 4. The usefulness and safety of this approach were tested in a second prospective study on 46 patients receiving fluindione according to the same initial scheme. The predicted dose was compared to the effective dose soon after equilibrium was reached, then 30 and 90 days after. To within 5 mg (one quarter of a tablet), the predicted dose was the effective dose in 98%, 86% and 81% of patients at the three time points, respectively. The mean time needed to reach therapeutic equilibrium was reduced from 13 days in the first study to 6 days in the second. No hemorrhagic complication occurred. Thus the strategy formerly developed to predict the daily maintenance dose of warfarin from the prothrombin time ratio or the thrombotest performed 3 days after starting treatment may also be applied to fluindione and the INR measurement.
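The decisional algorithm itself is not reproduced in this summary, but its general shape can be sketched: a monotone mapping from the day-4 INR to a daily dose, rounded to quarter-tablet (5 mg) steps. The function and coefficients below are invented for illustration only; they are not the published rule.

```python
# Hypothetical sketch of an INR-to-dose decision rule of the kind the study
# describes. The linear coefficients (intercept=35, slope=-8) are invented;
# only the overall structure (higher day-4 INR -> lower maintenance dose,
# rounded to 5 mg quarter-tablet steps) reflects the abstract.

def predict_maintenance_dose(day4_inr, intercept=35.0, slope=-8.0):
    """Map a day-4 INR to a predicted daily fluindione dose in mg.

    A higher INR after the three fixed 20 mg loading doses implies greater
    sensitivity to the drug, hence a lower maintenance dose.
    """
    raw = intercept + slope * day4_inr
    # Round to the nearest 5 mg (one quarter of a 20 mg tablet), the
    # precision used when comparing predicted and effective doses.
    quarter_tablets = round(raw / 5)
    return max(5, quarter_tablets * 5)
```

With these illustrative coefficients, a patient with a day-4 INR of 2.0 would be predicted to need 20 mg/day, while a more sensitive patient with an INR of 3.0 would be predicted to need 10 mg/day; the real rule was derived empirically from the 37-patient calibration cohort.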


Author(s):  
P. R. Chavelikar ◽  
G. C. Mandali ◽  
Neha Rao

Ruminal acidosis is an important clinical emergency in small ruminants. In this study, eight healthy farm goats and 24 goats presented at the TVCC of the College of Veterinary Science and A.H., Anand, with clinical signs of ruminal acidosis and a rumen liquor pH below 6 were examined for alterations in ruminal fluid and serum biochemical parameters. Among the rumen fluid parameters evaluated, the mean rumen fluid pH decreased significantly (4.71±0.11 vs. 6.90±0.10), while sediment activity time (46.67±1.20 vs. 24.50±0.78 min) and methylene blue reduction time (29.50±0.73 vs. 10.03±0.27 min) increased significantly in acidotic goats. The normally greenish colour, aromatic odour and viscous consistency of rumen fluid in healthy goats changed to a milky grey/creamy colour, sour/pungent odour and watery consistency in acidotic goats. Rumen protozoal activity fell to nil in acidotic goats compared with healthy goats. Among the serum biochemical constituents, the mean values of glucose (92.43±1.37 vs. 74.13±1.83 mg/dl), BUN (26.49±0.47 vs. 22.63±1.19 mg/dl), serum creatinine (1.01±0.02 vs. 0.83±0.02 mg/dl), albumin (3.22±0.03 vs. 3.05±0.05 g/dl), ALT (56.75±1.55 vs. 27.88±1.14 IU/L) and AST (93.25±1.82 vs. 54.00±1.75 IU/L) increased significantly, while serum calcium (9.09±0.14 vs. 10.29±0.08 mg/dl) decreased significantly in acidotic goats. The mean value of alkaline phosphatase (IU/L) in acidotic goats increased non-significantly from the base values of healthy goats.


Author(s):  
P. R. Chavelikar ◽  
G. C. Mandali ◽  
D. M. Patel

Ruminal acidosis is one of the most important clinical emergencies in sheep and goats, resulting in a high mortality rate. In the present study, eight healthy farm goats and 24 goats presented to the TVCC of the college with clinical signs of ruminal acidosis (anorexia, tympany, increased pulse and respiratory rate, reduced body temperature, doughy rumen, enteritis, oliguria, grinding of teeth, purulent nasal discharge, muscle twitching, arched back, dehydration and recumbency) and a rumen liquor pH below 6 were examined for haematological alterations using an automated haematology analyser. Among the haematological parameters evaluated in acidotic goats, the mean values of Hb (12.21±0.17 vs. 10.86±0.15 g/dl), TEC (14.28±0.16 vs. 12.04±0.36 ×10⁶/μl), TLC (13.43±0.11 vs. 11.11±0.27 ×10³/μl), PCV (36.91±0.53 vs. 29.88±0.55%), neutrophils (64.54±0.93 vs. 28.13±0.92%), MCV (23.38±0.37 vs. 19.38±1.34 fl) and MCH (7.03±0.08 vs. 6.31±0.25 pg) were significantly increased, while the mean values of lymphocytes (28.00±0.82 vs. 65.38±0.80%) and MCHC (24.55±0.26 vs. 34.88±0.97 g/dl) were significantly decreased from the base values of healthy goats. It was concluded that ruminal acidosis induced by accidental heavy ingestion of readily fermentable carbohydrate-rich grains and food waste significantly altered the haematological profile, concurrent with clinical manifestations, in goats, and hence the profile can be used to assess the severity of the disease.
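Assuming (as is common for tables of this form) that the ± figures are standard errors of the mean, the significance of a group difference can be checked approximately from the summary values alone. The sketch below applies a Welch-style t statistic to the haemoglobin figures reported above; the assumption that the ± values are standard errors is ours, not stated in the abstract.

```python
import math

# Approximate two-sample t statistic recovered from published summary
# figures, under the assumption that each "mean ± x" reports a standard
# error of the mean (SEM). With SEMs, t = (m1 - m2) / sqrt(se1^2 + se2^2).

def t_from_summary(mean1, se1, mean2, se2):
    """Welch-style t statistic from two group means and their SEMs."""
    return (mean1 - mean2) / math.sqrt(se1**2 + se2**2)

# Haemoglobin, acidotic (12.21±0.17) vs. healthy (10.86±0.15) goats:
t_hb = t_from_summary(12.21, 0.17, 10.86, 0.15)
# A t statistic of roughly 6 on these group sizes (n=24 vs. n=8) is
# consistent with the significant increase the authors report.
```

The same check can be repeated for any of the other parameters in the table; it is a plausibility check on the reported significance, not a reconstruction of the authors' actual analysis.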


2021 ◽  
pp. 107815522110160
Author(s):  
Bernadatte Zimbwa ◽  
Peter J Gilbar ◽  
Mark R Davis ◽  
Srinivas Kondalsamy-Chennakesavan

Purpose To retrospectively determine the rate of death occurring within 14 and 30 days of systemic anticancer therapy (SACT), compare this against a previous audit, and benchmark results against other cancer centres; secondly, to determine whether the introduction of immune checkpoint inhibitors (ICI), not available at the time of the initial audit, affected mortality rates. Method All adult solid tumour and haematology patients receiving SACT at an Australian Regional Cancer Centre (RCC) between January 2016 and July 2020 were included. Results Over a 55-month period, 1709 patients received SACT. The proportions of patients dying within 14 and 30 days of SACT were 3.3% and 7.0% respectively, slightly higher than in our previous study (1.89% and 5.6%). Mean time to death was 15.5 days. Males accounted for 63.9% of patients, and the mean age was 66.8 years. Of the 119 patients who died within 30 days of SACT, 46.2% had started a new line of treatment during that time. Of 98 patients receiving ICI, 22.5% died within 30 days of commencement. Disease progression was the most common cause of death (79%), and the most common place of death was the RCC (38.7%). Conclusion The rate of death observed in our re-audit compares favourably with our previous audit and remains at the lower end of that seen in published studies in Australia and internationally. Cases of patients dying within 30 days of SACT should be regularly reviewed to maintain awareness of this benchmark of quality assurance and to provide a feedback process for clinicians.
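As a minimal sketch of how such a quality-assurance benchmark is computed (using hypothetical patient records, not study data): the metric is the fraction of all treated patients whose date of death falls within 14 or 30 days of their last SACT dose.

```python
from datetime import date

# Hypothetical cohort records: (date of last SACT dose, date of death or
# None if the patient was alive at audit). These records are illustrative
# only and do not come from the study.

def mortality_within(records, days):
    """Fraction of all treated patients who died within `days` of last SACT."""
    deaths = sum(
        1 for last_sact, died in records
        if died is not None and (died - last_sact).days <= days
    )
    return deaths / len(records)

cohort = [
    (date(2020, 1, 10), date(2020, 1, 20)),   # died 10 days after last SACT
    (date(2020, 2, 1),  date(2020, 2, 25)),   # died 24 days after last SACT
    (date(2020, 3, 5),  None),                # alive at audit
    (date(2020, 4, 1),  date(2020, 6, 1)),    # died 61 days after last SACT
]

# mortality_within(cohort, 14) -> 0.25; mortality_within(cohort, 30) -> 0.5
```

Note that the denominator is all patients who received SACT in the audit window, not only those who died, which is what makes the figure comparable across audits and centres.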

