The Use of Prednisolone versus Dual-Release Hydrocortisone in the Treatment of Hypoadrenalism

2021
Author(s):  
Sirazum Choudhury ◽  
Tricia Tan ◽  
Katharine Lazarus ◽  
Karim Meeran

The introduction of adrenocortical extract in 1930 improved life expectancy to between two and five years, with further increases seen after the introduction of cortisone acetate in 1948. Most patients are now treated with synthetic hydrocortisone, and incremental advances have been made through optimisation of daily dosing and the introduction of multi-dose regimens. Today there remains a significant mortality gap between individuals with treated hypoadrenalism and the general population. It is unclear whether this gap results from glucocorticoid over-replacement, under-replacement, or loss of the circadian and ultradian rhythm of cortisol secretion, with the detrimental risk of excess glucocorticoid later in the day. The way forward involves replacement of the diurnal cortisol rhythm with better glucocorticoid replacement regimens. The steroid profiles produced by prednisolone and dual-release hydrocortisone (Plenadren) are both smoother than those of standard oral multidose regimens of hydrocortisone and cortisone acetate. The individualisation of prednisolone doses and the lower bioavailability of Plenadren offer reductions in total steroid exposure. Although there is emerging evidence that both treatments offer better cardiometabolic outcomes than standard glucocorticoid replacement regimens, there is a paucity of evidence on very low-dose prednisolone (2-4 mg daily) compared with the larger doses (~7.5 mg) historically used. Data from upcoming clinical studies on prednisolone will therefore be of key importance in informing future practice.

2012
Vol 30 (24)
pp. 2995-3001
Author(s):  
Malin Hultcrantz ◽  
Sigurdur Yngvi Kristinsson ◽  
Therese M.-L. Andersson ◽  
Ola Landgren ◽  
Sandra Eloranta ◽  
...  

Purpose: Reported survival in patients with myeloproliferative neoplasms (MPNs) shows great variation. Patients with primary myelofibrosis (PMF) have substantially reduced life expectancy, whereas patients with polycythemia vera (PV) and essential thrombocythemia (ET) have moderately reduced survival in most, but not all, studies. We conducted a large population-based study to establish patterns of survival in more than 9,000 patients with MPNs. Patients and Methods: We identified 9,384 patients with MPNs (from the Swedish Cancer Register) diagnosed from 1973 to 2008 (divided into four calendar periods) with follow-up to 2009. Relative survival ratios (RSRs) and excess mortality rate ratios were computed as measures of survival. Results: Patient survival was considerably lower in all MPN subtypes compared with expected survival in the general population, reflected in 10-year RSRs of 0.64 (95% CI, 0.62 to 0.67) in patients with PV, 0.68 (95% CI, 0.64 to 0.71) in those with ET, and 0.21 (95% CI, 0.18 to 0.25) in those with PMF. Excess mortality was observed in patients with any MPN subtype during all four calendar periods (P < .001). Survival improved significantly over time (P < .001); however, the improvement was less pronounced after the year 2000 and was confined to patients with PV and ET. Conclusion: We found patients with any MPN subtype to have significantly reduced life expectancy compared with the general population. The improvement over time is most likely explained by better overall clinical management of patients with MPN. The decreased life expectancy even in the most recent calendar period emphasizes the need for new treatment options for these patients.
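The relative survival ratio (RSR) used in this study is the cohort's observed survival divided by the survival expected in a matched general population. A minimal sketch of the arithmetic, using illustrative numbers rather than the study's underlying data:

```python
def relative_survival_ratio(observed_survival, expected_survival):
    """Relative survival ratio (RSR): observed survival proportion in
    the patient cohort divided by the expected survival proportion of
    an age/sex/period-matched general population."""
    return observed_survival / expected_survival

# Illustrative values only: 55% observed 10-year survival against 86%
# expected survival gives an RSR of ~0.64, matching the figure reported
# for PV (any pair with the same ratio would do).
print(round(relative_survival_ratio(0.55, 0.86), 2))  # 0.64
```

An RSR of 1.0 would mean the cohort dies at the same rate as the general population; values below 1.0 quantify the excess mortality attributable to the disease.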


Sexual Health
2011
Vol 8 (4)
pp. 485
Author(s):  
Claire Naftalin ◽  
Bavithra Nathan ◽  
Lisa Hamzah ◽  
Frank A. Post

Acute renal failure and chronic kidney disease are more common in HIV-infected patients compared with the general population. Several studies have shown age to be a risk factor for HIV-associated kidney disease. The improved life expectancy of HIV-infected patients as a result of widespread use of antiretroviral therapy has resulted in progressive aging of HIV cohorts in the developed world, and an increased burden of cardiovascular and kidney disease. Consequently, HIV care increasingly needs to incorporate strategies to detect and manage these non-infectious co-morbidities.


1989
Vol 29 (2)
pp. 95-102
Author(s):  
Kyriakos S. Markides

Increased survival by blacks and Hispanics is causing a widening of the sex imbalance of the elderly population much like we have observed in the general population. These demographic trends point toward greater widowhood among minority women and continuing high rates of poverty. In addition, we can expect increased rates of disability in minority elderly women, increased dependency, worsening intergenerational relationships, and higher rates of institutionalization.


2016
Vol 47 (1)
pp. 1-10
Author(s):  
Gordon W. Fuller ◽  
Jeanine Ransom ◽  
Jay Mandrekar ◽  
Allen W. Brown

Background: Long-term mortality may be increased following traumatic brain injury (TBI); however, the degree to which survival could be reduced is unknown. We aimed to model life expectancy following post-acute TBI to provide predictions of longevity and to quantify differences in survivorship with the general population. Methods: A population-based retrospective cohort study using data from the Rochester Epidemiology Project (REP) was performed. A random sample of patients from Olmsted County, Minnesota with a confirmed TBI between 1987 and 2000 was identified and vital status determined in 2013. Parametric survival modelling was then used to develop a model to predict life expectancy following TBI conditional on age at injury. Survivorship following TBI was also compared with the general population and with age- and gender-matched non-head-injured REP controls. Results: Seven hundred and sixty-nine patients were included in complete case analyses. The median follow-up time was 16.1 years (interquartile range 9.0-20.4), with 120 deaths occurring in the cohort during the study period. Survival after acute TBI was well represented by a Gompertz distribution. Victims of TBI surviving for at least 6 months post-injury demonstrated a much higher ongoing mortality rate than the US general population and non-TBI controls (hazard ratio 1.47, 95% CI 1.15-1.87). US general population cohort life table data were used to update the Gompertz model's shape and scale parameters to account for cohort effects and to allow prediction of life expectancy in contemporary TBI. Conclusions: Survivors of TBI have decreased life expectancy compared with the general population. This may be secondary to the head injury itself or result from patient characteristics associated with both the propensity for TBI and increased early mortality.
Post-TBI life expectancy estimates may be useful to guide prognosis, in public health planning, for actuarial applications, and in the extrapolation of outcomes for TBI economic models.
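The study's approach of fitting a Gompertz distribution and deriving life expectancy from it can be sketched as follows. The parameters below are hypothetical placeholders, not the study's fitted values; only the hazard ratio of 1.47 is taken from the abstract, and scaling the baseline hazard by it is a simplifying proportional-hazards assumption.

```python
import math

def gompertz_survival(t, a, b):
    """Gompertz survival function S(t) = exp(-(a/b) * (exp(b*t) - 1)),
    where `a` is the baseline hazard and `b` the rate of hazard growth."""
    return math.exp(-(a / b) * (math.exp(b * t) - 1.0))

def life_expectancy(a, b, horizon=120.0, step=0.01):
    """Life expectancy as the area under the survival curve,
    approximated with the trapezoidal rule up to `horizon` years."""
    total, t = 0.0, 0.0
    while t < horizon:
        total += step * (gompertz_survival(t, a, b)
                         + gompertz_survival(t + step, a, b)) / 2.0
        t += step
    return total

# Hypothetical baseline parameters; multiplying the baseline hazard by
# the reported post-TBI hazard ratio of 1.47 shows how excess mortality
# translates into years of expectancy lost under this assumption.
le_general = life_expectancy(a=0.0002, b=0.1)
le_post_tbi = life_expectancy(a=0.0002 * 1.47, b=0.1)
print(round(le_general, 1), round(le_post_tbi, 1))
```

In practice the shape and scale parameters would be fitted to cohort data and, as the abstract notes, updated against population life tables to handle cohort effects.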


Author(s):  
A. V. Nikulin ◽  
I. V. Pashkov ◽  
Y. S. Yakunin

According to the International Agency for Research on Cancer, there were an estimated 19,292,789 new cancer cases in various localizations and 9,958,133 cancer deaths worldwide in 2020. These frightening figures clearly show that malignancy among the population is a pressing matter. The risk of post-transplant malignancy in solid organ recipients is 2–6 times higher than in the general population. Given the steadily increasing number of solid organ transplants worldwide and the gradual increase in life expectancy among organ recipients, studying the risk factors and mechanisms of post-transplant malignancy becomes a crucial task.


Author(s):  
Natalie Glaser ◽  
Michael Persson ◽  
Anders Franco‐Cereceda ◽  
Ulrik Sartipy

Background Prior studies showed that life expectancy in patients who underwent surgical aortic valve replacement (AVR) was lower than in the general population. Explanations for this shorter life expectancy are unknown. The aim of this nationwide, observational cohort study was to investigate cause‐specific mortality following surgical AVR. Methods and Results We included 33 018 patients who underwent primary surgical AVR in Sweden between 1997 and 2018, with or without coronary artery bypass grafting. The SWEDEHEART (Swedish Web‐System for Enhancement and Development of Evidence‐Based Care in Heart Disease Evaluated According to Recommended Therapies) register and other national health‐data registers were used to obtain and characterize the study cohort and to identify causes of death, categorized as cardiovascular mortality, cancer mortality, or other causes of death. The relative risks for cause‐specific mortality in patients who underwent AVR compared with the general population are presented as standardized mortality ratios. During a mean follow‐up period of 7.3 years (maximum 22.0 years), 14 237 (43%) patients died. The cumulative incidence of death from cardiovascular, cancer‐related, or other causes was 23.5%, 8.3%, and 11.6%, respectively, at 10 years, and 42.8%, 12.8%, and 23.8%, respectively, at 20 years. Standardized mortality ratios for cardiovascular, cancer‐related, and other causes of death were 1.79 (95% CI, 1.75–1.83), 1.00 (95% CI, 0.97–1.04), and 1.08 (95% CI, 1.05–1.12), respectively. Conclusions We found that life expectancy following AVR was lower than in the general population. Lower survival after AVR was explained by an increased relative risk of cardiovascular death. Future studies should focus on the role of earlier surgery in patients with asymptomatic aortic stenosis and on optimizing treatment and follow‐up after AVR. Registration URL: https://www.clinicaltrials.gov ; Unique identifier: NCT02276950.
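The standardized mortality ratios this study reports are observed deaths divided by the deaths expected under general-population rates. A minimal sketch with illustrative counts (the abstract gives the ratios, not the underlying numbers):

```python
def standardized_mortality_ratio(observed_deaths, expected_deaths):
    """SMR: deaths observed in the cohort divided by the deaths expected
    if the cohort had experienced the general population's
    cause-specific mortality rates."""
    return observed_deaths / expected_deaths

# Illustrative counts only: 1790 observed vs. 1000 expected
# cardiovascular deaths reproduce the reported SMR of 1.79; an SMR of
# 1.00, as seen here for cancer deaths, means no excess over the
# general population.
print(round(standardized_mortality_ratio(1790, 1000), 2))  # 1.79
```

The expected-death denominator is normally built by applying age-, sex-, and period-specific population rates to the cohort's person-time.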


2013
Vol 5 (1)
pp. e2013050
Author(s):  
Elihu Estey

Although “less intense” therapies are finding more use in AML, the principal problem in AML remains lack of efficacy rather than toxicity. Hence, less intense therapies are of little use unless they are more effective, as well as less toxic, than standard therapies. Assignment of patients to less intense therapies should be based on other factors in addition to age. Azacitidine and decitabine, the most commonly used less intense therapies in AML, very probably produce better OS than best “supportive care” or “low-dose” ara-C. However, the improvement is relatively small when compared with expected life expectancy in the absence of disease. Accordingly, while azacitidine or decitabine should be considered the standards against which newer therapies are compared, investigation of potentially more effective therapies needs to continue. Better means for evaluating the large number of these therapies (and their combinations) are also needed.


Blood
1997
Vol 89 (7)
pp. 2319-2327
Author(s):  
Yves Najean ◽  
Jean-Didier Rain

Abstract Despite myelosuppression, polycythemic (PV) patients greater than 65 years of age have a high risk of vascular complications, and the leukemic risk exceeds 15% after 12 years. Is the addition of low-dose maintenance treatment with hydroxyurea (HU) after radiophosphorus (32P) myelosuppression able to decrease these complications? Since the end of 1979, 461 patients were randomized to receive (or not) low-dose HU (5 to 10 mg/kg/d), after the first 32P-induced remission, and were observed until death or June 1996. Maintenance treatment very significantly prolonged the duration of 32P-induced remissions and reduced the annual mean dose received to one-third. However, despite this maintenance, 25% of the patients had an excessive platelet count and the rate of serious vascular complications was not decreased, except in the most severe cases with short-term relapse of polycythemia. Furthermore, the leukemia rate was significantly increased beyond 8 years and a significant excess of carcinomas was also observed. The continuous use of HU did not decrease the risk of progression to myelofibrosis (incidence of 20% after 15 years). Life expectancy was shorter (a median of 9.3 years v 10.9 years with 32P alone), except in the most severe cases (initial 32P-induced remission lasting <2 years) in which maintenance treatment moderately prolonged the survival by reducing the vascular risk. In most cases of PV, in which the duration of the first 32P-induced remission exceeded 2 years, the introduction of HU maintenance did not reduce the vascular risk. Although it considerably decreased the mean dose of 32P received, HU maintenance therapy significantly increased the leukemia and cancer risks and reduced the mean life expectancy by 15%. However, in cases with more rapid recurrence, the introduction of maintenance treatment reduced the vascular risks and moderately prolonged survival. The use of HU as a maintenance therapy is therefore only justified in this situation.
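The ~15% reduction in life expectancy quoted in this abstract is consistent with the survival medians it reports (9.3 vs. 10.9 years); a quick arithmetic check:

```python
# Medians reported in the abstract: 9.3 years with HU maintenance after
# 32P versus 10.9 years with 32P alone.
reduction = (10.9 - 9.3) / 10.9
print(f"{100 * reduction:.1f}%")  # 14.7%
```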


2019
Vol 6 (6)
pp. 347-355
Author(s):  
Talip E Eroglu ◽  
Grimur H Mohr ◽  
Marieke T Blom ◽  
Arie O Verkerk ◽  
Patrick C Souverein ◽  
...  

Abstract Aims Various drugs increase the risk of out-of-hospital cardiac arrest (OHCA) in the general population by impacting cardiac ion channels, thereby causing ventricular tachycardia/fibrillation (VT/VF). Dihydropyridines block L-type calcium channels, but their association with OHCA risk is unknown. We aimed to study whether nifedipine and/or amlodipine, often-used dihydropyridines, are associated with increased OHCA risk, and how these drugs affect cardiac electrophysiology. Methods and results We conducted a case–control study with VT/VF-documented OHCA cases with presumed cardiac cause from ongoing population-based OHCA registries in the Netherlands and Denmark, and age/sex/index date-matched non-OHCA controls (Netherlands: PHARMO Database Network; Denmark: Danish Civil Registration System). We included 2503 OHCA cases and 10 543 non-OHCA controls in the Netherlands, and 8101 OHCA cases and 40 505 non-OHCA controls in Denmark. To examine drug effects on cardiac electrophysiology, we performed single-cell patch-clamp studies in human induced pluripotent stem cell-derived cardiomyocytes. Use of high-dose nifedipine (≥60 mg/day), but not low-dose nifedipine (<60 mg/day) or amlodipine (any dose), was associated with higher OHCA risk than non-use of dihydropyridines [Netherlands: adjusted odds ratio (ORadj) 1.45 (95% confidence interval 1.02–2.07); Denmark: 1.96 (1.18–3.25)] or use of amlodipine [Netherlands: 2.31 (1.54–3.47); Denmark: 2.20 (1.32–3.67)]. Out-of-hospital cardiac arrest risk of (high-dose) nifedipine use was not further increased in patients using nitrates or with a history of ischaemic heart disease. Nifedipine and amlodipine blocked L-type calcium channels at similar concentrations but, at clinically used concentrations, nifedipine caused more L-type calcium current block, resulting in more action potential shortening.
Conclusion High-dose nifedipine, but not low-dose nifedipine or any-dose amlodipine, is associated with increased OHCA risk in the general population. Careful titration of nifedipine dose should be considered.
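For context, the odds ratios reported in a case–control design like this compare exposure odds between cases and controls. The study itself uses adjusted estimates from matched analyses, so the raw 2×2 computation below, with hypothetical counts, is only a sketch:

```python
def odds_ratio(exposed_cases, unexposed_cases,
               exposed_controls, unexposed_controls):
    """Unadjusted odds ratio from a 2x2 case-control table: the odds of
    exposure among cases divided by the odds among controls."""
    return ((exposed_cases / unexposed_cases)
            / (exposed_controls / unexposed_controls))

# Hypothetical counts chosen for illustration only; they are not the
# study's data and carry no adjustment for confounders.
print(round(odds_ratio(30, 2473, 84, 10459), 2))
```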

