adjusted hazard ratio · Recently Published Documents

TOTAL DOCUMENTS: 211 (FIVE YEARS: 150)
H-INDEX: 17 (FIVE YEARS: 7)

2022 · Vol 11 (2) · pp. 342
Author(s): Sejoong Ahn, Jonghak Park, Juhyun Song, Jooyeong Kim, Hanjin Cho, et al.

Detecting sepsis patients who are at high risk of mechanical ventilation is important in the emergency department (ED). The respiratory rate oxygenation (ROX) index is the ratio of tissue oxygen saturation/fraction of inspired oxygen to the respiratory rate. This study aimed to investigate whether the ROX index could predict mechanical ventilator use in sepsis patients in an ED. This retrospective observational study included sepsis patients with a quick sequential organ failure assessment (qSOFA) score ≥ 2 who presented to the ED between September 2019 and April 2020. The ROX and ROX-heart rate (HR) indices were significantly lower in patients who received mechanical ventilation within 24 h than in those who did not (4.0 [3.2–5.4] vs. 10.0 [5.9–15.2], p < 0.001 and 3.9 [2.7–5.8] vs. 10.1 [5.4–16.3], p < 0.001, respectively). The areas under the receiver operating characteristic (ROC) curves of the ROX and ROX-HR indices were 0.854 and 0.816 (both p < 0.001). The ROX and ROX-HR indices were independently associated with mechanical ventilator use within 24 h (adjusted hazard ratio = 0.78, 95% CI: 0.68–0.90, p < 0.001 and adjusted hazard ratio = 0.87, 95% CI: 0.79–0.96, p = 0.004, respectively). The 28-day mortality was higher in the low ROX and low ROX-HR groups. The ROX and ROX-HR indices were associated with mechanical ventilator use within 24 h in qSOFA ≥ 2 patients in the ED.
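The index definitions above reduce to simple arithmetic. A minimal sketch, not taken from the study's code; the ROX-HR scaling by 100 follows the convention used elsewhere in the literature rather than anything stated in this abstract:

```python
def rox_index(spo2: float, fio2: float, resp_rate: float) -> float:
    """ROX index: (oxygen saturation / FiO2) divided by respiratory rate."""
    return (spo2 / fio2) / resp_rate

def rox_hr_index(spo2: float, fio2: float, resp_rate: float, heart_rate: float) -> float:
    """ROX-HR: ROX divided by heart rate, scaled by 100 (literature convention, an assumption here)."""
    return rox_index(spo2, fio2, resp_rate) * 100 / heart_rate

# Illustrative patient: SpO2 92% on FiO2 0.60, RR 28/min, HR 110/min
rox = rox_index(92, 0.60, 28)             # about 5.48
rox_hr = rox_hr_index(92, 0.60, 28, 110)  # about 4.98
```

Lower values of both indices marked the group that went on to mechanical ventilation in the study.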


BMJ · 2021 · pp. e068665
Author(s): Anders Husby, Jørgen Vinsløv Hansen, Emil Fosbøl, Emilia Myrup Thiesson, Morten Madsen, et al.

Abstract
Objective: To investigate the association between SARS-CoV-2 vaccination and myocarditis or myopericarditis.
Design: Population based cohort study.
Setting: Denmark.
Participants: 4 931 775 individuals aged 12 years or older, followed from 1 October 2020 to 5 October 2021.
Main outcome measures: The primary outcome, myocarditis or myopericarditis, was defined as a combination of a hospital diagnosis of myocarditis or pericarditis, increased troponin levels, and a hospital stay lasting more than 24 hours. Follow-up time before vaccination was compared with follow-up time 0-28 days from the day of vaccination for both first and second doses, using Cox proportional hazards regression with age as an underlying timescale to estimate hazard ratios adjusted for sex, comorbidities, and other potential confounders.
Results: During follow-up, 269 participants developed myocarditis or myopericarditis, of whom 108 (40%) were 12-39 years old and 196 (73%) were male. Of 3 482 295 individuals vaccinated with BNT162b2 (Pfizer-BioNTech), 48 developed myocarditis or myopericarditis within 28 days from the vaccination date compared with unvaccinated individuals (adjusted hazard ratio 1.34 (95% confidence interval 0.90 to 2.00); absolute rate 1.4 per 100 000 vaccinated individuals within 28 days of vaccination (95% confidence interval 1.0 to 1.8)). Adjusted hazard ratios among female participants only and male participants only were 3.73 (1.82 to 7.65) and 0.82 (0.50 to 1.34), respectively, with corresponding absolute rates of 1.3 (0.8 to 1.9) and 1.5 (1.0 to 2.2) per 100 000 vaccinated individuals within 28 days of vaccination, respectively. The adjusted hazard ratio among 12-39 year olds was 1.48 (0.74 to 2.98) and the absolute rate was 1.6 (1.0 to 2.6) per 100 000 vaccinated individuals within 28 days of vaccination. Among 498 814 individuals vaccinated with mRNA-1273 (Moderna), 21 developed myocarditis or myopericarditis within 28 days from the vaccination date (adjusted hazard ratio 3.92 (2.30 to 6.68); absolute rate 4.2 per 100 000 vaccinated individuals within 28 days of vaccination (2.6 to 6.4)). Adjusted hazard ratios among women only and men only were 6.33 (2.11 to 18.96) and 3.22 (1.75 to 5.93), respectively, with corresponding absolute rates of 2.0 (0.7 to 4.8) and 6.3 (3.6 to 10.2) per 100 000 vaccinated individuals within 28 days of vaccination, respectively. The adjusted hazard ratio among 12-39 year olds was 5.24 (2.47 to 11.12) and the absolute rate was 5.7 (3.3 to 9.3) per 100 000 vaccinated individuals within 28 days of vaccination.
Conclusions: Vaccination with mRNA-1273 was associated with a significantly increased risk of myocarditis or myopericarditis in the Danish population, primarily driven by an increased risk among individuals aged 12-39 years, while BNT162b2 vaccination was only associated with a significantly increased risk among women. However, the absolute rate of myocarditis or myopericarditis after SARS-CoV-2 mRNA vaccination was low, even in younger age groups. The benefits of SARS-CoV-2 mRNA vaccination should be taken into account when interpreting these findings. Larger multinational studies are needed to further investigate the risks of myocarditis or myopericarditis after vaccination within smaller subgroups.
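The absolute rates quoted in the abstract can be reproduced directly from the reported case counts and denominators; a quick arithmetic check:

```python
def rate_per_100k(cases: int, population: int) -> float:
    """Absolute event rate per 100 000 individuals within the 28-day window."""
    return cases / population * 100_000

# BNT162b2: 48 cases among 3 482 295 vaccinated individuals
print(round(rate_per_100k(48, 3_482_295), 1))  # 1.4
# mRNA-1273: 21 cases among 498 814 vaccinated individuals
print(round(rate_per_100k(21, 498_814), 1))    # 4.2
```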


2021 · Vol 22 (1)
Author(s): Hui-Chun Chang, Chung-Han Ho, Shu-Chen Kung, Wan-Lin Chen, Ching-Min Wang, et al.

Abstract
Background: Driving pressure (∆P) is an important factor that predicts mortality in acute respiratory distress syndrome (ARDS). We tested the hypothesis that serial changes in daily ΔP, rather than Day 1 ΔP, would better predict outcomes of patients with ARDS.
Methods: This retrospective cohort study enrolled patients admitted to five intensive care units (ICUs) at a medical center in Taiwan between March 2009 and January 2018 who met the criteria for ARDS and received the lung-protective ventilation strategy. ∆P was recorded daily for 3 consecutive days after the diagnosis of ARDS, and its correlation with 60-day survival was analyzed.
Results: A total of 224 patients were enrolled in the final analysis. The overall ICU and 60-day survival rates were 52.7% and 47.3%, respectively. ∆P on Days 1, 2, and 3 was significantly lower in the survival group than in the nonsurvival group (13.8 ± 3.4 vs. 14.8 ± 3.7, p = 0.0322; 14 ± 3.2 vs. 15 ± 3.5, p = 0.0194; 13.6 ± 3.2 vs. 15.1 ± 3.4, p = 0.0014, respectively). The patients were divided into four groups according to the daily changes in ∆P, namely, the low ∆P group (Day 1 ∆P < 14 cmH2O and Day 3 ∆P < 14 cmH2O), decrement group (Day 1 ∆P ≥ 14 cmH2O and Day 3 ∆P < 14 cmH2O), high ∆P group (Day 1 ∆P ≥ 14 cmH2O and Day 3 ∆P ≥ 14 cmH2O), and increment group (Day 1 ∆P < 14 cmH2O and Day 3 ∆P ≥ 14 cmH2O). The 60-day survival significantly differed among the four groups (log-rank test, p = 0.0271). Compared with the low ΔP group, patients in the decrement group did not have lower 60-day survival (adjusted hazard ratio 0.72; 95% confidence interval [CI] 0.31–1.68; p = 0.4448), while patients in the increment group had significantly lower 60-day survival (adjusted hazard ratio 1.96; 95% CI 1.11–3.44; p = 0.0198).
Conclusions: Daily ∆P remains an important predictor of survival in patients with ARDS. Serial changes in daily ΔP might be more informative than a single Day 1 ΔP value in predicting survival of patients with ARDS.
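The four-group assignment described above is just a pair of threshold tests; a sketch using the 14 cmH2O cutoff from the abstract (ΔP itself is conventionally computed as plateau pressure minus PEEP, though the abstract does not restate this):

```python
def classify_dp(day1_dp: float, day3_dp: float, cutoff: float = 14.0) -> str:
    """Assign a patient to one of the study's four driving-pressure groups
    based on Day 1 and Day 3 values (cutoff 14 cmH2O, per the abstract)."""
    if day1_dp < cutoff and day3_dp < cutoff:
        return "low"
    if day1_dp >= cutoff and day3_dp < cutoff:
        return "decrement"
    if day1_dp >= cutoff and day3_dp >= cutoff:
        return "high"
    return "increment"  # Day 1 < cutoff, Day 3 >= cutoff

print(classify_dp(12.0, 16.5))  # increment
```

In the study, the increment group (rising ΔP) carried the worst 60-day survival, which is the motivation for tracking ΔP serially rather than only on Day 1.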


Stroke · 2021
Author(s): Chien-Heng Lin, Jung-Nien Lai, Inn-Chi Lee, I-Ching Chou, Wei-De Lin, et al.

Background and Purpose: Previous epidemiological investigations examining the association between Kawasaki disease (KD) and cerebrovascular disease have had conflicting results. We analyzed the association between KD and cerebrovascular disease by conducting a population-based retrospective cohort study designed to investigate the hypothesis that KD could be a risk factor for subsequent cerebrovascular disease. Methods: From the National Health Insurance Research Database of Taiwan, the data of children (aged 0–18 years old) with KD (n=8467) were collected. Starting with the first year of study observation (referred to as the baseline year), data were collected for each child with KD, and 4 non-KD patients matched for sex, urbanization level of residence, and parental occupation were randomly selected to form the non-KD cohort (n=33 868) for our analysis. For the period from January 1, 2000, to December 31, 2012, we calculated the follow-up person-years for each patient, defined as the time from the index date to the diagnosis of cerebrovascular disease, death, or the end of 2012. Furthermore, we compared the incidence, the incidence rate ratio, and the 95% CI of cerebrovascular disease between the KD and non-KD cohorts. Results: The overall cerebrovascular disease incidence rate was significantly higher, by 3.19-fold, in the KD cohort than in the non-KD cohort (14.73 versus 4.62 per 100 000 person-years), and the overall risk of cerebrovascular disease remained higher in the KD cohort after adjustment (adjusted hazard ratio, 3.16 [95% CI, 1.46–6.85]). Furthermore, children aged <5 years showed a significantly higher risk of subsequent cerebrovascular disease in the KD cohort (adjusted hazard ratio, 3.14 [95% CI, 1.43–6.92]). Conclusions: This nationwide retrospective cohort study shows that KD may increase the risk of subsequent cerebrovascular disease, especially in patients with KD aged <5 years old.
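The 3.19-fold figure is simply the ratio of the two reported incidence rates; a quick check of the arithmetic:

```python
def incidence_rate(events: int, person_years: float) -> float:
    """Incidence per 100 000 person-years."""
    return events / person_years * 100_000

# Rates reported in the abstract (per 100 000 person-years)
kd_rate, non_kd_rate = 14.73, 4.62
irr = kd_rate / non_kd_rate  # crude incidence rate ratio
print(round(irr, 2))  # 3.19
```

The adjusted hazard ratio (3.16) is close to this crude ratio, suggesting the matching variables explained little of the excess risk.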


2021 · Vol 11
Author(s): Haorui Zhang, Bocen Chen, Zixiu Zou, Jian Feng, Yutao Li, et al.

Background: The 5-year survival rate of patients with lung cancer in China is less than 20%, and predicting their prognosis is challenging. We investigated the association between a common non-synonymous single nucleotide polymorphism (SNP), rs7214723, in the Ca2+/calmodulin-dependent protein kinase kinase 1 (CAMKK1) gene and the prognosis of patients with lung cancer.
Methods: Genomic DNA was extracted from the blood samples of 839 patients with lung cancer, recruited from Changhai Hospital (n = 536) and Taizhou Institute of Health Sciences (n = 352), and genotyped using the SNPscan technique. The association between patient prognosis and the genotypic data for CAMKK1 was analyzed using a multivariate Cox proportional hazards model adjusted for multiple potential confounders. The CRISPR/Cas9 gene-editing system was used to introduce point mutations at CAMKK1 rs7214723 in A549 and NCI-H358 cells. Subsequently, cell proliferation and migration ability were assessed with the Cell Counting Kit-8 and scratch assays. The Annexin V-FITC apoptosis detection kit was used to detect cell apoptosis.
Results: The CAMKK1 rs7214723 recessive CC genotype conferred significantly better overall survival (CC vs. TT + TC: adjusted hazard ratio = 0.78, 95% confidence interval [CI], 0.61-1.00, P = 0.049) than the TT + TC genotypes. Stratified analysis showed that the CAMKK1 rs7214723 CC genotype and recessive CC genotype conferred a significantly decreased risk of death in patients who were male, had a smoking history, or had stage III + IV cancer, compared with the TT and TT + TC genotypes. Relative to the TT + TC genotypes, the rs7214723 recessive CC genotype was also associated with a decreased risk of death in patients aged < 60 years (CC vs. TT + TC: adjusted hazard ratio = 0.59, 95% CI, 0.37-0.93, P = 0.024) and patients with squamous cell carcinoma (CC vs. TT + TC: adjusted hazard ratio = 0.65, 95% CI, 0.44-0.98, P = 0.038). Remarkably, CRISPR/Cas9-guided single nucleotide editing demonstrated that the CAMKK1 rs7214723 T > C mutation significantly inhibits cell proliferation and migration and promotes cell apoptosis.
Conclusions: CAMKK1 SNP rs7214723 may be a significant prognostic factor for the risk of death among patients with lung cancer.
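The recessive comparison (CC vs. TT + TC) corresponds to a binary coding of the genotype before it enters the Cox model; a hypothetical sketch of that coding, not the study's own analysis code:

```python
def recessive_code(genotype: str) -> int:
    """Recessive coding for rs7214723: CC -> 1, TC/CT/TT -> 0 (CC vs. TT + TC)."""
    if genotype not in {"CC", "TC", "CT", "TT"}:
        raise ValueError(f"unexpected genotype: {genotype}")
    return 1 if genotype == "CC" else 0

print([recessive_code(g) for g in ["TT", "TC", "CC"]])  # [0, 0, 1]
```

Under this coding, the reported adjusted hazard ratio of 0.78 is the multiplicative effect of carrying two C alleles versus any other genotype.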


2021 · Vol 20 (1)
Author(s): Anne Marie Ladehoff Thomsen, Cecilia Høst Ramlau-Hansen, Jörg Schullehner, Ninna Hinchely Ebdrup, Zeyan Liew, et al.

Abstract
Background: Nitrosatable drugs commonly prescribed during pregnancy can react with nitrite to form N-nitroso compounds, which have been associated with an increased risk of stillbirth. Whether maternal residential drinking water nitrate modifies this association is unknown. We investigated whether household drinking water nitrate was associated with stillbirth, and whether it modified the association between nitrosatable prescription drug intake and the risk of stillbirth.
Methods: We conducted an individual-level register- and population-based cohort study of 652,810 women with a first recorded singleton pregnancy in the Danish Medical Birth Registry between 1997 and 2017. Nitrosatable drug exposure was identified through the Danish National Patient Registry and defined as a first redeemed prescription of a nitrosatable drug in the first 22 weeks of pregnancy. The reference group was women with no redeemed prescription of a nitrosatable drug in this period. The average individual drinking water nitrate concentration (mg/L) was calculated for the same period. We categorized nitrosatable drugs as secondary amines, tertiary amines, and amides. Cox hazard regression was used to estimate crude and adjusted hazard ratios with 95% confidence intervals for stillbirth, stratified into five categories of nitrate concentration: ≤1 mg/L, >1-≤2 mg/L, >2-≤5 mg/L, >5-≤25 mg/L, and >25 mg/L.
Results: Drinking water nitrate exposure in the population was not associated with the risk of stillbirth. Among 100,244 women who had a nitrosatable prescription drug redeemed within the first 22 weeks of pregnancy, 418 (0.42%) had a stillbirth, compared with 1,993 stillbirths (0.36%) among 552,566 referent women. Women with any nitrosatable prescription drug intake and >1-≤2 mg/L nitrate concentration had an increased risk of stillbirth [adjusted hazard ratio 1.55 (95% confidence interval, 1.15–2.09)] compared with referent women. In the stratified analyses, the highest risk of stillbirth was found among women with secondary amine intake and >25 mg/L nitrate concentrations [adjusted hazard ratio 3.11 (95% CI, 1.08–8.94)].
Conclusions: The association between nitrosatable prescription drug intake and the risk of stillbirth may depend on the level of nitrate in household drinking water. Evaluations of the effect of nitrosatable drug intake on perinatal outcomes might consider nitrate exposure from drinking water.
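The five exposure categories form a simple half-open binning of the average nitrate concentration; a sketch using the abstract's boundaries (the string labels are illustrative, not the study's own coding):

```python
def nitrate_category(mg_per_l: float) -> str:
    """Map an average drinking-water nitrate concentration (mg/L) to the
    study's five exposure categories."""
    if mg_per_l <= 1:
        return "<=1"
    if mg_per_l <= 2:
        return ">1-<=2"
    if mg_per_l <= 5:
        return ">2-<=5"
    if mg_per_l <= 25:
        return ">5-<=25"
    return ">25"

print(nitrate_category(1.5))  # >1-<=2
```

Note that 25 mg/L is half the EU drinking-water limit of 50 mg/L, so the top category captures relatively high residential exposure.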


Blood · 2021 · Vol 138 (Supplement 1) · pp. 364-364
Author(s): Sujith Samarasinghe, Ajay Vora, Nicholas John Goulden, Grace Antony, Anthony V. Moorman

Abstract
The UKALL 2003 trial aimed to safely reduce treatment intensity in low-risk patients and intensify therapy in high-risk patients based on minimal residual disease (MRD) stratification. The MRD high risk patients were randomly assigned to standard (Regimen A/B) or augmented (Regimen C) post remission therapy, whilst the MRD low risk patients were randomly allocated to receive one or a standard two delayed intensifications (DI). In the original analysis, the five-year event free survival in the MRD high risk patients was superior in the augmented group, whilst in the MRD low risk group there was no significant difference between one or two delayed intensifications. As late relapses may influence these results, particularly in the low-risk patients, we analysed ten-year outcomes for patients in the trial overall and by the randomisations. There were a total of 3113 eligible patients for analysis. The median follow up time was 9.4 years. In the overall trial population, 10-year relapse risk was 10.7% (95% CI 9.6-11.92%), with a 10-year event free survival (EFS) of 84.8% (95% CI 83.5-86.1%) and overall survival (OS) of 89.6% (95% CI 88.4-90.6%). On univariate and multivariate Cox analysis, there was a higher risk of relapse with male gender, increasing age, increasing white cell count, MRD (high vs low), NCI risk group (high vs standard) and immunophenotype (T vs B cell). All except gender were also significant on univariate and multivariate analysis for event free and overall survival. Cytogenetic high risk patients treated on regimen C had a lower 10-year relapse risk (22.1% (95% CI 15.1-31.6)) than those who remained on regimen A/B (52.4% (95% CI 28.9-80.1), p=0.016), although the OS rates were not significantly different (75.3% (95% CI 65.8-82.5) vs 66.7% (95% CI 37.5-84.6), p=0.3). The ten-year cumulative incidence of second tumours was 1.16% (95% CI 0.74-1.82).
521 MRD low risk patients were randomised (260 assigned to one delayed intensification and 261 to two delayed intensifications). The 10-year EFS was 91.7% (95% CI 85.7-94.0) with one course of delayed intensification vs 93.7% (95% CI 90.0-96.1) with two delayed intensifications (adjusted hazard ratio 0.73 (95% CI 0.38-1.40), p=0.3). The 10-year overall survival was 97.1% (95% CI 94.0-98.6) with one delayed intensification and 97.6% (95% CI 94.7-98.9) with two delayed intensifications (adjusted hazard ratio 0.69 (95% CI 0.24-1.99), p=0.5). 533 MRD high risk patients were randomised (266 assigned standard therapy and 267 assigned to augmented therapy). The 10-year EFS was 82.1% (95% CI 76.9-86.2) with standard therapy vs 87.1% (95% CI 82.4-90.6) with augmented therapy (adjusted hazard ratio 0.68 (95% CI 0.44-1.06), p=0.09). The 10-year OS was 87.9% (95% CI 83.2-91.4) with standard therapy vs 90.7% (95% CI 86.4-93.7) with augmented therapy (adjusted hazard ratio 0.74 (95% CI 0.44-1.27), p=0.3). The loss of significance in EFS between 5 and 10 years was due to additional relapses in the augmented arm since the original publication. Nevertheless, there remained a benefit for augmented therapy in reducing marrow relapses: the cumulative incidence of marrow relapse was 10.4% (95% CI 7.2-14.9) in the standard arm vs 5.9% (95% CI 3.6-9.6) in the augmented arm (adjusted hazard ratio 0.55 (0.28-1.03), p=0.06). The long term outcome of UKALL 2003 confirms that low risk patients can safely de-escalate therapy and that intensified therapy benefits high risk patients, especially those with high-risk cytogenetics.
Disclosures: Samarasinghe: AMGEN, JAZZ: Honoraria.


2021 · Vol 25 (71) · pp. 1-174
Author(s): Jonathan Bedford, Laura Drikite, Mark Corbett, James Doidge, Paloma Ferrando-Vivas, et al.

Background: New-onset atrial fibrillation occurs in around 10% of adults treated in an intensive care unit. New-onset atrial fibrillation may lead to cardiovascular instability and thromboembolism, and has been independently associated with increased length of hospital stay and mortality. The long-term consequences are unclear. Current practice guidance is based on patients outside the intensive care unit; however, new-onset atrial fibrillation that develops while in an intensive care unit differs in its causes and the risks and clinical effectiveness of treatments. The lack of evidence on new-onset atrial fibrillation treatment or long-term outcomes in intensive care units means that practice varies. Identifying optimal treatment strategies and defining long-term outcomes are critical to improving care.
Objectives: In patients treated in an intensive care unit, the objectives were to (1) evaluate existing evidence for the clinical effectiveness and safety of pharmacological and non-pharmacological new-onset atrial fibrillation treatments, (2) compare the use and clinical effectiveness of pharmacological and non-pharmacological new-onset atrial fibrillation treatments, and (3) determine outcomes associated with new-onset atrial fibrillation.
Methods: We undertook a scoping review that included studies of interventions for treatment or prevention of new-onset atrial fibrillation involving adults in general intensive care units. To investigate the long-term outcomes associated with new-onset atrial fibrillation, we carried out a retrospective cohort study using English national intensive care audit data linked to national hospital episode and outcome data. To analyse the clinical effectiveness of different new-onset atrial fibrillation treatments, we undertook a retrospective cohort study of two large intensive care unit databases in the USA and the UK.
Results: Existing evidence was generally of low quality, with limited data suggesting that beta-blockers might be more effective than amiodarone for converting new-onset atrial fibrillation to sinus rhythm and for reducing mortality. Using linked audit data, we showed that patients developing new-onset atrial fibrillation have more comorbidities than those who do not. After controlling for these differences, patients with new-onset atrial fibrillation had substantially higher mortality in hospital and during the first 90 days after discharge (adjusted odds ratio 2.32, 95% confidence interval 2.16 to 2.48; adjusted hazard ratio 1.46, 95% confidence interval 1.26 to 1.70, respectively), and higher rates of subsequent hospitalisation with atrial fibrillation, stroke and heart failure (adjusted cause-specific hazard ratio 5.86, 95% confidence interval 5.33 to 6.44; adjusted cause-specific hazard ratio 1.47, 95% confidence interval 1.12 to 1.93; and adjusted cause-specific hazard ratio 1.28, 95% confidence interval 1.14 to 1.44, respectively), than patients who did not have new-onset atrial fibrillation. From intensive care unit data, we found that new-onset atrial fibrillation occurred in 952 out of 8367 (11.4%) UK and 1065 out of 18,559 (5.7%) US intensive care unit patients in our study. The median time to onset of new-onset atrial fibrillation in patients who received treatment was 40 hours, with a median duration of 14.4 hours. The clinical characteristics of patients developing new-onset atrial fibrillation were similar in both databases. New-onset atrial fibrillation was associated with significant average reductions in systolic blood pressure of 5 mmHg, despite significant increases in vasoactive medication (vasoactive-inotropic score increase of 2.3; p < 0.001). After adjustment, intravenous beta-blockers were not more effective than amiodarone in achieving rate control (adjusted hazard ratio 1.14, 95% confidence interval 0.91 to 1.44) or rhythm control (adjusted hazard ratio 0.86, 95% confidence interval 0.67 to 1.11). Digoxin therapy was associated with a lower probability of achieving rate control (adjusted hazard ratio 0.52, 95% confidence interval 0.32 to 0.86) and calcium channel blocker therapy was associated with a lower probability of achieving rhythm control (adjusted hazard ratio 0.56, 95% confidence interval 0.39 to 0.79) than amiodarone. Findings were consistent across both the combined and the individual database analyses.
Conclusions: Existing evidence for new-onset atrial fibrillation management in intensive care unit patients is limited. New-onset atrial fibrillation in these patients is common and is associated with significant short- and long-term complications. Beta-blockers and amiodarone appear to be similarly effective in achieving cardiovascular control, but digoxin and calcium channel blockers appear to be inferior.
Future work: Our findings suggest that a randomised controlled trial of amiodarone and beta-blockers for management of new-onset atrial fibrillation in critically ill patients should be undertaken. Studies should also be undertaken to provide evidence for or against anticoagulation for patients who develop new-onset atrial fibrillation in intensive care units. Finally, given that readmission with heart failure and thromboembolism increases following an episode of new-onset atrial fibrillation while in an intensive care unit, a prospective cohort study to demonstrate the incidence of atrial fibrillation and/or left ventricular dysfunction at hospital discharge and at 3 months following the development of new-onset atrial fibrillation should be undertaken.
Trial registration: Current Controlled Trials ISRCTN13252515.
Funding: This project was funded by the National Institute for Health Research (NIHR) Health Technology Assessment programme and will be published in full in Health Technology Assessment; Vol. 25, No. 71. See the NIHR Journals Library website for further project information.


Author(s): Simon Correa, Xavier E. Guerra-Torres, Sushrut S. Waikar, Finnian R. Mc Causland

Magnesium is involved in the regulation of blood pressure (BP). Abnormalities in serum magnesium are common in chronic kidney disease (CKD), yet its association with the development of hypertension and CKD progression in patients with CKD is unclear. We analyzed data from 3866 participants from the CRIC Study (Chronic Renal Insufficiency Cohort). Linear regression assessed the association of serum magnesium with baseline systolic BP (SBP) and diastolic BP (DBP). Logistic regression explored the association of serum magnesium with various definitions of hypertension. Cox proportional hazards models assessed the risk of incident hypertension and CKD progression. Mean serum magnesium was 2.0 mEq/L (±0.3 mEq/L). Higher magnesium was associated with lower SBP (−3.4 mm Hg [95% CI, −5.8 to −1.0 per 1 mEq/L]) and lower DBP (−2.9 mm Hg [95% CI, −4.3 to −1.5 per 1 mEq/L]). Higher magnesium was associated with a lower risk of American Heart Association–defined hypertension (SBP≥130 mm Hg or DBP≥80 mm Hg) at baseline (adjusted hazard ratio, 0.65 [95% CI, 0.49–0.86 per 1 mEq/L]) and a lower risk of suboptimally controlled BP (SBP≥120 mm Hg or DBP≥80 mm Hg; adjusted odds ratio, 0.58 [95% CI, 0.43–0.78 per 1 mEq/L]). In time-to-event analyses, higher baseline serum magnesium was associated with a nominally lower risk of incident CRIC-defined hypertension (adjusted hazard ratio, 0.77 [95% CI, 0.46–1.31 per 1 mEq/L]). Higher magnesium was associated with a significantly lower risk of CKD progression (adjusted hazard ratio, 0.68 [95% CI, 0.54–0.86 per 1 mEq/L]). In patients with CKD, higher serum magnesium is associated with lower SBP and DBP, and with a lower risk of hypertension and CKD progression. In patients with CKD, whether magnesium supplementation could optimize BP control and prevent disease progression deserves further investigation.
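The per-1 mEq/L regression coefficients can be read as a linear point-estimate shift in blood pressure; a hypothetical illustration using only the adjusted coefficients reported above (point estimates only, ignoring the confidence intervals):

```python
def predicted_bp_shift(delta_mg_meq_l: float) -> tuple:
    """Point-estimate change in (SBP, DBP) in mm Hg for a given difference in
    serum magnesium (mEq/L), using the abstract's adjusted coefficients."""
    return (-3.4 * delta_mg_meq_l, -2.9 * delta_mg_meq_l)

# A 0.5 mEq/L higher magnesium corresponds to roughly 1.7 / 1.45 mm Hg lower SBP / DBP
print(predicted_bp_shift(0.5))
```

This is purely an interpretation aid for the reported linear association, not a causal prediction.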


2021 · Vol 16 (1)
Author(s): Laurence Weinberg, Bobby Ou Yang, Luka Cosic, Sarah Klink, Peter Le, et al.

Abstract
Background: The outcomes of nonagenarian patients undergoing orthopaedic surgery are not well understood. We investigated the 30-day mortality after surgical treatment of unilateral hip fracture. The relationship between postoperative complications and mortality was evaluated.
Methods: We performed a single-centre retrospective cohort study of nonagenarian patients undergoing hip fracture surgery over a 6-year period. Postoperative complications were graded according to the Clavien–Dindo classification. Correlation analyses were performed to evaluate the relationship between mortality and pre-specified mortality risk predictors. Survival analyses were assessed using Cox proportional hazards regression modelling.
Results: The study included 537 patients. The 30-day mortality rate was 7.4%. The mortality rate over a median follow-up period of 30 months was 18.2%. Postoperative complications were observed in 459 (85.5%) patients. Both the number and severity of complications were related to mortality (p < 0.001). Compared to patients who survived, deceased patients were more frail (p = 0.034), were at higher ASA risk (p = 0.010) and were more likely to have preoperative congestive heart failure (p < 0.001). The adjusted hazard ratio for mortality according to the number of complications was 1.3 (95% CI 1.1, 1.5; p = 0.003). Up to 21 days from admission, any increase in complication severity was associated with significantly greater mortality [adjusted hazard ratio: 3.0 (95% CI 2.4, 3.6; p < 0.001)].
Conclusion: In a nonagenarian cohort of patients undergoing hip fracture surgery, 30-day mortality was 7.4%, but 30-month mortality rates approached one in five patients. Postoperative complications were independently associated with higher mortality, particularly when occurring early.

