Plasma Adrenocorticotropin (ACTH) Values and Cortisol Response to 250 and 1 μg ACTH Stimulation in Patients with Hyperthyroidism before and after Carbimazole Therapy: Case-Control Comparative Study

2007, Vol 92 (5), pp. 1693-1696
Author(s): Sunil Kumar Mishra, Nandita Gupta, Ravinder Goswami

Abstract Context: Although the production and metabolic clearance rates of cortisol are increased in the thyrotoxic state, the net effect on adrenocortical reserves is not clear. Objective: We assessed circulating ACTH levels, cortisol-binding globulin (CBG), and adrenocortical reserves in hyperthyroid patients (before and after carbimazole therapy) and healthy controls. Design and Setting: This was a case-control investigative study in a tertiary care setting. Patients and Methods: Plasma ACTH and free cortisol index (FCI; serum cortisol/CBG) were measured in 49 consecutive patients with hyperthyroidism and 50 controls. ACTH1–24 stimulation tests (250 and 1 μg) were carried out in the first 29 patients and 15 controls. A peak FCI less than the mean − 3 SD of healthy controls was considered subnormal. ACTH1–24 stimulation tests were repeated in 24 patients in the euthyroid state. Results: Mean basal plasma ACTH and FCI were higher, and CBG was lower, in thyrotoxic patients compared with controls. Peak cortisol was less than 18 μg/dl in 10 of 29 and 14 of 29 patients on 250 and 1 μg ACTH1–24 stimulation, respectively. Peak FCI was subnormal in only three of 27 (11.1%) and two of 21 (7.4%) on 250 and 1 μg ACTH1–24 stimulation, respectively. The mean plasma ACTH, basal FCI, and subnormal peak FCI (in two of the three patients) normalized after euthyroidism was achieved. Plasma ACTH and FCI did not correlate with the severity of thyrotoxicosis. Conclusions: Up to 11% of thyrotoxic patients have a subnormal peak FCI on ACTH1–24 stimulation. These changes occur despite high basal plasma ACTH and FCI. Because of the variation in CBG, FCI rather than total cortisol should be used to interpret cortisol values in thyrotoxicosis.
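The free cortisol index used in this abstract is a simple ratio, and the subnormality criterion is a mean − 3 SD cutoff over healthy-control peaks. A minimal sketch (illustrative numbers only, not study data; function names are my own):

```python
# Sketch (not the authors' code): free cortisol index (FCI) as
# serum cortisol / CBG, and a peak flagged subnormal when below
# (mean - 3*SD) of the healthy-control peak FCIs, as the abstract describes.
from statistics import mean, stdev

def free_cortisol_index(serum_cortisol, cbg):
    """FCI = serum cortisol / cortisol-binding globulin."""
    return serum_cortisol / cbg

def subnormal_cutoff(control_peak_fcis):
    """Cutoff = mean - 3 SD of healthy-control peak FCI values."""
    return mean(control_peak_fcis) - 3 * stdev(control_peak_fcis)

# Illustrative values only (not study data):
controls = [1.2, 1.0, 1.1, 0.9, 1.3]
cutoff = subnormal_cutoff(controls)
patient_peak = free_cortisol_index(18.0, 30.0)  # cortisol 18 ug/dl, CBG 30
print(patient_peak < cutoff)
```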

2010, Vol 162 (1), pp. 43-48
Author(s): Massimo Scacchi, Leila Danesi, Agnese Cattaneo, Elena Valassi, Francesca Pecori Giraldi, ...

Objective: We previously described in young thalassaemic patients an altered cortisol and ACTH responsiveness suggesting an impaired adrenocortical reserve. Owing to iron overload, a worsening of adrenal function should be expected in adult patients. Design: In 124 adults with β-thalassaemia, urinary free cortisol (UFC) and plasma ACTH levels were determined and compared with those measured in 150 controls. In 45 patients, cortisol was measured in response to: i) tetracosactide 1 μg as an i.v. bolus (low-dose test, LDT) and ii) tetracosactide 250 μg infused i.v. over 8 h (high-dose test, HDT). Results: UFC and serum cortisol were within the reference range in all patients. Conversely, basal plasma ACTH values were above the upper limit of the normal range in 19 patients. There were no statistically significant differences in the mean values of UFC, basal serum cortisol and plasma ACTH between patients and controls. A subnormal cortisol response to the LDT was registered in 18 out of 56 patients. Three of these patients also displayed a subnormal response to the HDT, together with elevated baseline plasma ACTH levels. In the LDT, a positive correlation was found between basal and peak cortisol values (P<0.0001). The latter were negatively correlated with basal ACTH values in both the LDT (P<0.0001) and the HDT (P<0.0001). Conclusions: Adult thalassaemic patients often present a subtle impairment of adrenocortical function. This may become clinically relevant in the case of major stressful events. Thus, we recommend an assessment of adrenocortical function in all adult thalassaemic patients.


CJEM, 2020, Vol 22 (S1), pp. S114-S115
Author(s): A. Albina, F. Kegel, F. Dankoff, G. Clark

Background: Emergency department (ED) overcrowding is associated with a broad spectrum of poor medical outcomes, including medical errors, mortality, higher rates of leaving without being seen, and reduced patient and physician satisfaction. The largest contributor to overcrowding is access block – the inability of admitted patients to access in-patient beds from the ED. One component to addressing access block involves streamlining the decision process to rapidly determine which hospital service will admit the patient. Aim Statement: As of Sep 2011, admission algorithms at our institution were supported and formalised. The pancreatitis algorithm clarified whether general surgery or internal medicine would admit ED patients with pancreatitis. We hypothesize that this prior uncertainty delayed the admission decision and prolonged ED length of stay (LOS) for patients with pancreatitis. Our project evaluates whether implementing a pancreatitis admission algorithm at our institution reduced ED time to disposition (TTD) and LOS. Measures & Design: A retrospective review was conducted in a tertiary care academic hospital in Montreal for all adult ED patients diagnosed with pancreatitis from Apr 2010 to Mar 2014. The data was used to plot separate run charts for ED TTD and LOS. Serial measurements of each outcome were used to monitor change and evaluate for special cause variation. The mean ED LOS and TTD before and after algorithm implementation were also compared using the Student's t test. Evaluation/Results: Over four years, a total of 365 ED patients were diagnosed with pancreatitis and 287 (79%) were admitted. The mean ED LOS for patients with pancreatitis decreased following the implementation of an admission algorithm (1616 vs. 1418 mins, p = 0.05). The mean ED TTD was also reduced (1171 vs. 899 mins, p = 0.0006). A non-random signal of change was suggested by a shift above the median prior to algorithm implementation and one below the median following. 
Discussion/Impact: This project demonstrates that, in a busy tertiary care academic hospital, an admission algorithm helped reduce ED TTD and LOS for patients with pancreatitis. This is especially valuable when considering the potential applicability of such algorithms to other disease processes, such as gastrointestinal bleeding and congestive heart failure. Future studies demonstrating this external applicability, and examining the impact of such decision algorithms on physician decision fatigue and within non-academic institutions, appear warranted.
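The "non-random signal of change" in the run charts can be operationalized; a common convention, assumed here, treats six or more consecutive points on one side of the median as a shift. A sketch (not the authors' analysis code):

```python
# Sketch of a run-chart "shift" rule: >= run_len consecutive points on one
# side of the median signal non-random change. Points exactly on the median
# are conventionally skipped (they neither extend nor break a run).
from statistics import median

def shift_signal(values, run_len=6):
    med = median(values)
    run, side = 0, 0
    for v in values:
        if v == med:
            continue  # on-median points are ignored
        s = 1 if v > med else -1
        run = run + 1 if s == side else 1
        side = s
        if run >= run_len:
            return True
    return False

# Six high ED-LOS points followed by six low ones: a shift is signalled.
print(shift_signal([1616, 1620, 1610, 1625, 1615, 1618, 899, 905, 890, 910, 900, 895]))
```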


2020, pp. 112067212097533
Author(s): Merve Beyza Yildiz, Elvin Yildiz, Sevcan Balci, Buse Rahime Hasirci Bayir, Yılmaz Çetinkaya

Purpose: To evaluate pupil size, accommodation, and ocular higher-order aberrations (HOAs) in patients with migraine during migraine attacks and to compare them with the interictal period and with healthy controls. Methods: This prospective, case-control study included 48 eyes of 24 patients with migraine and 48 eyes of 24 age- and sex-matched healthy controls. Measurements were performed using a Hartmann-Shack aberrometer. Accommodative responses to accommodative stimuli ranging from 0 to 5 diopters (D) in increments of 0.5 D were recorded. Spherical, coma, and trefoil aberrations, and the root mean square (RMS) of total HOAs were assessed. Patients with migraine were measured twice: during the interictal phase and during a migraine attack. Results: The mean pupil size decreased significantly during migraine attacks (5.85 ± 0.19 mm) compared with the interictal phase (6.05 ± 0.19 mm) in the patients with migraine (p = 0.012). There was a significant increase in the accommodative response to accommodative stimuli of 1.5 to 5 D during migraine attacks. No significant change was observed in HOAs during migraine attacks. In addition, no ictal or interictal measurements differed significantly from the controls. Comparing symptomatic and non-symptomatic sides in the 17 migraine patients with unilateral headache, no significant difference was found in any of the measurements in either the ictal or the interictal period. Conclusion: Our results suggest the presence of a subtle oculosympathetic hypofunction in patients with migraine during the ictal period compared with the interictal period. The accommodation status of the eye seems to be affected by this autonomic dysfunction.


2020, Vol 20 (1)
Author(s): Najlaa M. Alamoudi, Farah A. Alsadat, Azza A. El-Housseiny, Osama M. Felemban, Amani A. Al Tuwirqi, ...

Abstract Background: Celiac disease (CD) is an immune-related enteropathy triggered by gluten ingestion in susceptible individuals. Oral manifestations of CD have been frequently described, although reports on dental maturity (DM) are scant. Thus, the aim of this study was to assess the prevalence of delayed DM in CD patients and to test for possible predictors. Methods: This is a case-control study of children with CD and healthy controls conducted between 2017 and 2020. A panoramic radiograph and a comprehensive oral examination were performed for each participant. Dental age (DA) was measured according to Demirjian's method, and DM was calculated by subtracting the chronological age (CA) from the DA. Statistical analysis was performed to compare DM between CD patients and controls, and a multivariate analysis was used to look for predictors of DM. Results: Two hundred and eight participants (104 children with CD and 104 healthy controls) were included. The mean age was 10.67 ± 2.40 years for CD patients and 10.69 ± 2.37 years for healthy controls (P = 0.971). CD patients had a higher prevalence of delayed DM than controls (62.5% vs. 3%, respectively). They also had a greater delay in DM than controls (−7.94 ± 10.94 vs. 6.99 ± 8.77, P < 0.001). A multivariate analysis identified age between 6 and 7 years (β ± SE = 16.21 ± 2.58, P < 0.001) as the only predictor of DM. Conclusions: CD patients had a greater prevalence of delayed DM than controls. Young age was the only predictor of DM identified.
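The dental-maturity score described in the methods (DM = DA − CA, with negative values indicating delay) can be sketched directly; the ages below are illustrative, not study data:

```python
# Sketch of the DM calculation from the abstract: dental maturity is
# Demirjian dental age minus chronological age; negative DM = delayed.
def dental_maturity(dental_age_yrs, chronological_age_yrs):
    """DM = DA - CA (years)."""
    return dental_age_yrs - chronological_age_yrs

def is_delayed(dm):
    return dm < 0

# A child whose dental age lags chronological age by ~0.7 years:
print(round(dental_maturity(9.5, 10.2), 2))  # -> -0.7 (delayed)
```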


2021, Vol 71 (6), pp. 1993-96
Author(s): Marrium Shafi, Muhammad Akmal Khan, Yaseen Lodhi, Asma Aftab, Muhammad Haroon Sarfraz

Objective: To determine the mean change in central macular thickness after cataract surgery and to compare this change between non-diabetics and diabetics without diabetic retinopathy. Study design: Case-control study. Study settings and duration: The study was carried out at the Ophthalmology department, POF Hospital, Wah Cantt, over 6 months (April 2019 to September 2019). Material and methods: A sample size of 60 patients was calculated using OpenEpi software, with non-probability consecutive sampling. Patients were divided into two groups: cases (diabetic) and controls (non-diabetic). All patients underwent phacoemulsification, and central macular thickness was measured by optical coherence tomography before surgery and 4 weeks after. Data analysis was done with SPSS version 20; a post-stratification t test was applied, and a p value ≤ 0.05 was considered significant. Results: A total of 60 patients were included. Mean age was 65.31 ± 7.63 years; there were 35 (58.3%) male and 25 (41.7%) female patients. Central macular thickness increased significantly after phacoemulsification in both cases (223.10 ± 15.86 vs. 227.27 ± 17.90, p < 0.001) and controls (221.20 ± 12.16 vs. 226.29 ± 16.79, p = 0.001). However, no significant difference was found between the groups (p = 0.486). Conclusion: Central macular thickness increased after uncomplicated phacoemulsification in both diabetics and non-diabetics without retinopathy over a 4-week follow-up, but the thickness did not differ between the two groups.
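The before/after comparison described here is a paired t test on the same eyes pre- and post-surgery. A hand-rolled sketch with made-up thickness values (not the study's data):

```python
# Sketch of a paired t statistic: t = mean(d) / (SD(d) / sqrt(n)) over the
# per-patient pre/post differences in central macular thickness.
from math import sqrt
from statistics import mean, stdev

def paired_t(before, after):
    diffs = [a - b for a, b in zip(after, before)]
    n = len(diffs)
    return mean(diffs) / (stdev(diffs) / sqrt(n))

# Illustrative thickness values (micrometres), not the study's measurements:
pre  = [220, 225, 218, 230, 222]
post = [226, 229, 223, 234, 228]
print(round(paired_t(pre, post), 2))  # -> 11.18
```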


Author(s): Saad M. Al-Qahtani, Henry Baffoe-Bonnie, Aiman El-Saed, Majid Alshamrani, Abdullah Algwizani, ...

Abstract Background Most septic patients managed by critical care response teams (CCRT) are prescribed antimicrobials. Nevertheless, data evaluating their appropriateness are lacking both locally and internationally. The objective was to assess antimicrobial use among septic and non-septic patients managed by CCRT. Setting Case-control design was used to compare septic (cases) and non-septic (controls) CCRT patients at tertiary care setting. The frequency of antimicrobial use was assessed before and after CCRT activation. The appropriateness of antimicrobial use was assessed at day four post-CCRT, based on standard recommendations, clinical assessment, and culture results. Main results A total of 157 cases and 158 controls were included. The average age was 61.1 ± 20.4 years, and 54.6% were males, with minor differences between groups. The use of any antimicrobial was 100.0% in cases and 87.3% in controls (p < 0.001). The use of meropenem (68.2% versus 34.8%, p < 0.001) and vancomycin (56.7% versus 25.9%, p < 0.001) were markedly higher in cases than controls. The overall appropriateness was significantly lower in cases than controls (50.7% versus 59.6%, p = 0.047). Individual appropriateness was lowest with meropenem (16.7%) and imipenem (25.0%), and highest with piperacillin/tazobactam (87.1%) and colistin (78.3%). Only 48.5% of antimicrobials prescribed by CCRT were de-escalated by a primary team within four days. Individual appropriateness and de-escalations were not different between groups. Conclusions Empiric use and inadequate de-escalation of broad-spectrum antimicrobials were major causes for inappropriate antimicrobial use in CCRT patients. Our findings highlight the necessity of urgent implementation of an antimicrobial stewardship program, including training and auditing of antimicrobial prescriptions.


1986, Vol 112 (3), pp. 329-335
Author(s): Fawzi Bakiri, Anne M. Riondel, Moulai Benmiloud, Michel B. Vallotton

Abstract. To appreciate the aldosterone secretion status in panhypopituitarism, the steroid response to stimulation was studied in a homogeneous group of 20 female patients presenting with global hypopituitarism. Specific effects of glucocorticoid and thyroid hormone deficiencies were also assessed by studying the same patients before and after cortisol (F) and cortisol plus thyroid hormone (F + T) substitution. The patients were submitted to two stimulation tests before and after each treatment: the orthostasis test (O-T) and the furosemide test (Furo-T). The results obtained in the three situations were compared, each patient serving as her own control. Comparison was also established with the results obtained in healthy women serving as a control group. Basal plasma aldosterone levels in the untreated patients were not significantly different from those of the control group (5.43 ± 0.51 vs 7.16 ± 0.80 ng/100 ml, mean ± SEM). They were significantly lower after F (3.91 ± 0.42) and F + T substitution (3.31 ± 0.23) than those of untreated patients and controls. The response to both stimulations was blunted in the untreated patients (O-T: 14.10 ± 2.81; Furo-T: 9.78 ± 1.35) as compared to the control group (O-T: 26.46 ± 4.67; Furo-T: 23.96 ± 3.30). F treatment did not improve the response to either test (O-T: 11.42 ± 2.55; Furo-T: 10.32 ± 1.23). F + T treatment normalized the orthostasis response (20.83 ± 3.59) and increased the response to furosemide, which remained, however, lower (15.28 ± 1.83) than in the control group. These results are in favour of a minor role of the pituitary in the regulation of aldosterone secretion. They emphasize the role of thyroid hormones, which may act partly directly and partly through their effect on renin secretion.


2020, Vol 9 (1), pp. PM01-PM04
Author(s): Shailendra Kumar Jain

Background: This report analyses the outcomes of a case-control investigation nested in a prospective cohort study of household contacts (HHCs) of TB patients. These data were further pooled with other available prospective studies of vitamin D status and TB risk for an individual-participant data (IPD) analysis. Subjects and Methods: Twenty-eight newly diagnosed pulmonary tuberculosis patients (male to female ratio 18:10) and 28 healthy controls (male to female ratio 16:12) were selected according to inclusion and exclusion criteria through non-probability purposive sampling. Results: The mean age of the tuberculosis cases was 38.8 ± 7.5 years, whereas the mean age of the controls was 36 ± 5.04 years. Remarkable differences were observed between the patients with tuberculosis and the controls; the differences in RBC counts, hemoglobin, and platelet counts were highly significant. Low mean hemoglobin values were found in the majority of study subjects overall, and particularly in patients with pulmonary tuberculosis. Conclusion: The study found that low serum 25-(OH)D levels were associated with an increased risk of future progression to TB disease in a dose-dependent manner.


1981, Vol 89 (3), pp. 443-449
Author(s): M. Manin, P. Delost

Cortisol metabolism was studied in conscious adult male guinea-pigs subjected to a neurotrophic stress (immobilization and stimulation by light for 3 h). The disappearance curves of tracer quantities of [3H]cortisol were represented by a two-pool model. In stressed animals, there was a marked increase in the mean plasma level of cortisol (184% of control value; P < 0.001) and in the metabolic clearance rate (MCR; 117% of control value; 0.001 < P < 0.01). This rise in the MCR of plasma cortisol resulted from an increase in the mean total apparent volume of distribution (49%, P < 0.001). The lack of significant differences in the slopes of the second exponential phase of the disappearance curves indicated that the stress did not significantly alter the half-life of cortisol. The mean binding capacity of transcortin for cortisol (ST) was significantly higher in the animals which had been subjected to the neurotrophic stress than in the control guinea-pigs (0.02 < P < 0.05). However, ST values remained very low and accounted for the very high levels of free cortisol found after the stress. The results suggest that the raised concentrations of unbound cortisol found in the plasma of conscious adult male guinea-pigs in response to neurotrophic stress reflect a hypersecretion of corticosteroid.
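The two-pool model above implies a biexponential disappearance curve, C(t) = A·e^(−a·t) + B·e^(−b·t); given fitted constants, the terminal half-life and the MCR follow directly. A sketch with illustrative constants (not the paper's values):

```python
# Sketch under the two-pool model: for C(t) = A*exp(-a*t) + B*exp(-b*t),
# the terminal (second-phase) half-life is ln(2)/b, and the metabolic
# clearance rate is dose / AUC, where AUC = A/a + B/b.
# A, B, a, b below are illustrative, not fitted study values.
from math import log

def terminal_half_life(b):
    """Half-life of the slow exponential phase: ln(2)/b."""
    return log(2) / b

def mcr(dose, A, a, B, b):
    """MCR = dose / area under the biexponential curve."""
    auc = A / a + B / b
    return dose / auc

print(round(terminal_half_life(0.0139), 1))  # minutes -> 49.9
```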


1981, Vol 240 (2), pp. E131-E135
Author(s): M. E. Thompson, G. A. Hedge

Systemic indomethacin (Ind) administration decreased prostaglandin F (PGF) content of the rat adrenal to less than 1.4 pg/mg. This was less than 5% of the adrenal PGF content in the gelatin-treated (Gel) control group (34 pg/mg). Basal plasma corticosterone levels were increased by the Ind treatment. Since the calculated metabolic clearance rate for corticosterone was unchanged, this increase was attributed to an enhanced adrenal secretion rate that was secondary to elevated plasma ACTH concentration. Ether exposure in the presence of Ind did not stimulate a normal rise in plasma corticosterone or adrenal corticosteroidogenesis. Adrenal responsiveness to exogenous ACTH was reduced after Ind treatment. There was a normal rise in plasma ACTH levels following ether exposure confirming the adrenal as the site of inhibition. Systemic Ind treatment thus appears to have two sites of action in altering plasma corticosterone levels: 1) a direct effect on the adrenal, inhibiting normal secretion in response to acute elevations of plasma ACTH, and 2) an action at the pituitary or hypothalamic level, eliciting an increase in basal ACTH secretion.

