Underweight increases the risk of early death in tuberculosis patients

2017, Vol 118 (12), pp. 1052-1060
Author(s): Yung-Feng Yen, Fu-I Tung, Bo-Lung Ho, Yun-Ju Lai

Abstract: Evidence regarding the association between BMI and mortality in tuberculosis (TB) patients is limited and inconsistent. We investigated the impact of BMI on TB-specific and non-TB-specific mortality with respect to the timing of death. All Taiwanese adults with TB in Taipei were included in a retrospective cohort study in 2012–2014. Multinomial Cox proportional hazards regression was used to evaluate the associations between BMI, cause-specific mortality and timing of death. Of 2410 eligible patients, 86·0 % (2061) were successfully treated, and TB-specific and non-TB-specific mortality occurred in 2·2 % (54) and 13·9 % (335), respectively. After controlling for potential confounders, underweight was significantly associated with a higher risk of all-cause mortality (adjusted hazard ratio (AHR) 1·57; 95 % CI 1·26, 1·95), whereas overweight was not. When cause-specific death was considered, underweight was associated with an increased risk of both TB-specific (AHR 1·85; 95 % CI 1·03, 3·33) and non-TB-specific death (AHR 1·52; 95 % CI 1·19, 1·95) during treatment. With joint consideration of cause and timing of death, underweight significantly increased the risk of TB-specific (AHR 2·23; 95 % CI 1·09, 4·59) and non-TB-specific mortality (AHR 1·81; 95 % CI 1·29, 2·55) only within the first 8 weeks of treatment. This study suggests that underweight increases the risk of early death in TB patients during treatment.
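All of the studies collected here rest on the same estimator. As a hedged sketch of what a Cox fit computes — toy data and a single hypothetical underweight indicator, not the study's actual covariates — the reported hazard ratio is the exponential of the coefficient that maximizes the partial likelihood:

```python
import math

# Toy survival data, illustrative only.
# Each tuple: (follow-up time, death indicator, underweight indicator)
data = [(2, 1, 1), (4, 1, 1), (8, 0, 1), (3, 1, 0), (6, 1, 0), (9, 0, 0)]

def neg_log_partial_likelihood(beta):
    # Product over death times of exp(beta*x_i) / sum over the risk set,
    # taken on the log scale (no tied event times in this toy data).
    total = 0.0
    for t_i, died, x_i in data:
        if not died:
            continue
        risk_set = [x for (t, _, x) in data if t >= t_i]
        total += beta * x_i - math.log(sum(math.exp(beta * x) for x in risk_set))
    return -total

# A coarse grid search stands in for the Newton-Raphson step real software uses.
beta_hat = min((neg_log_partial_likelihood(b / 100.0), b / 100.0)
               for b in range(-300, 301))[1]
hazard_ratio = math.exp(beta_hat)  # >1: higher hazard of death when underweight
print(round(beta_hat, 2), round(hazard_ratio, 2))
```

On this toy data the underweight group dies earlier on average, so the fitted hazard ratio comes out above 1, mirroring the direction (not the magnitude) of the study's AHR.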

2021, Vol 10 (7), pp. 1514
Author(s): Hilde Espnes, Jocasta Ball, Maja-Lisa Løchen, Tom Wilsgaard, Inger Njølstad, et al.

The aim of this study was to explore sex-specific associations between systolic blood pressure (SBP), hypertension, and the risk of incident atrial fibrillation (AF) subtypes, including paroxysmal, persistent, and permanent AF, in a general population. A total of 13,137 women and 11,667 men who participated in the fourth survey of the Tromsø Study (1994–1995) were followed up for incident AF until the end of 2016. Cox proportional hazards regression analysis was conducted using fractional polynomials for SBP to provide sex- and AF-subtype-specific hazard ratios (HRs) for SBP. An SBP of 120 mmHg was used as the reference. Models were adjusted for other cardiovascular risk factors. Over a mean follow-up of 17.6 ± 6.6 years, incident AF occurred in 914 (7.0%) women (501 with paroxysmal/persistent AF and 413 with permanent AF) and 1104 (9.5%) men (606 with paroxysmal/persistent AF and 498 with permanent AF). In women, an SBP of 180 mmHg was associated with an HR of 2.10 (95% confidence interval [CI] 1.60–2.76) for paroxysmal/persistent AF and an HR of 1.80 (95% CI 1.33–2.44) for permanent AF. In men, an SBP of 180 mmHg was associated with an HR of 1.90 (95% CI 1.46–2.46) for paroxysmal/persistent AF, while there was no association with the risk of permanent AF. In conclusion, increasing SBP was associated with an increased risk of both paroxysmal/persistent AF and permanent AF in women, but only paroxysmal/persistent AF in men. Our findings highlight the importance of sex-specific risk stratification and optimizing blood pressure management for the prevention of AF subtypes in clinical practice.
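As a sketch of how a fractional-polynomial SBP term yields hazard ratios against a 120 mmHg reference, assuming a hypothetical single log-term coefficient (FP1 with power 0 is the natural-log transform; the value below is not an estimate from the Tromsø Study):

```python
import math

# Hypothetical log-HR coefficient on ln(SBP / 100); illustrative only.
BETA_LOG_SBP = 1.6

def hr_vs_reference(sbp_mmhg, ref_mmhg=120.0):
    # The HR versus the reference depends only on the difference of the
    # transformed SBP values, so the baseline hazard cancels out.
    diff = math.log(sbp_mmhg / 100.0) - math.log(ref_mmhg / 100.0)
    return math.exp(BETA_LOG_SBP * diff)

for sbp in (120, 140, 160, 180):
    print(sbp, round(hr_vs_reference(sbp), 2))
```

By construction the curve equals 1.0 at the 120 mmHg reference and rises smoothly with SBP, which is the shape the study's sex- and subtype-specific HR estimates trace out.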


2018, Vol 31 (2), pp. 322-342
Author(s): Shanna L. Burke, Tianyan Hu, Christine E. Spadola, Aaron Burgess, Tan Li, et al.

Objective: This study explored two research questions: (a) Does sleep medication neutralize or provide a protective effect against the hazard of Alzheimer’s disease (AD)? (b) Do apolipoprotein E (APOE) e4 carriers reporting a sleep disturbance experience an increased risk of AD? Method: This study is a secondary analysis of the National Alzheimer’s Coordinating Center’s Uniform Data Set (n = 6,782) using Cox proportional hazards regression. Results: Sleep disturbance was significantly associated with eventual AD development. Among the subset of participants taking general sleep medications, no relationship between sleep disturbance and eventual AD was observed. Among individuals not taking sleep medications, the increased hazard between the two variables remained. Among APOE e4 carriers, the association between sleep disturbance and AD remained significant, except among those taking zolpidem. Discussion: Our findings support the emerging link between sleep disturbance and AD. They also suggest a continued need to elucidate the mechanisms that offer protection against AD development.


2020, Vol 105 (9), pp. 3005-3014
Author(s): Brittany R Lapin, Kevin M Pantalone, Alex Milinovich, Shannon Morrison, Andrew Schuster, et al.

Abstract. Purpose: Type 2 diabetes–related polyneuropathy (DPN) is associated with increased vascular events and mortality, but determinants and outcomes of pain in DPN are poorly understood. We sought to examine the effect of neuropathic pain on vascular events and mortality in patients with no DPN, DPN with pain (DPN+P), and DPN without pain (DPN-P). Methods: A retrospective cohort study was conducted within a large health system of adult patients with type 2 diabetes from January 1, 2009 through December 31, 2016. Using an electronic algorithm, patients were classified as no DPN, DPN+P, or DPN-P. Primary outcomes included the number of vascular events and time to mortality. Independent associations with DPN+P were evaluated using multivariable negative binomial and Cox proportional hazards regression models, adjusting for demographics, socioeconomic characteristics, and comorbidities. Results: Of 43 945 patients with type 2 diabetes (age 64.6 ± 14.0 years; 52.1% female), 13 910 (31.7%) had DPN: 9104 DPN+P (65.4%) vs 4806 DPN-P (34.6%). Vascular events occurred in 4538 (15.1%) of patients with no DPN, 2401 (26.4%) with DPN+P, and 1006 (20.9%) with DPN-P. After adjustment, DPN+P remained a significant predictor of the number of vascular events (incidence rate ratio [IRR] = 1.55; 95% CI, 1.29-1.85), whereas no DPN was protective (IRR = 0.70; 95% CI, 0.60-0.82), as compared with DPN-P. Compared with DPN-P, DPN+P was also a significant predictor of mortality (hazard ratio = 1.42; 95% CI, 1.25-1.61). Conclusions: Our study found a significant association between pain in DPN and an increased risk of vascular events and mortality. This observation warrants longitudinal study of the risk factors and natural history of pain in DPN.
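As a quick arithmetic check on the abstract's figures, the crude (unadjusted) risk ratio of vascular events can be computed directly from the reported counts; the adjusted negative-binomial IRR of 1.55 additionally controls for demographics, socioeconomic characteristics, and comorbidities:

```python
# Crude risk ratio of vascular events, DPN+P vs DPN-P, from the counts above.
events_pain, n_pain = 2401, 9104        # DPN+P: 26.4% with vascular events
events_nopain, n_nopain = 1006, 4806    # DPN-P: 20.9% with vascular events

risk_pain = events_pain / n_pain
risk_nopain = events_nopain / n_nopain
crude_rr = risk_pain / risk_nopain
print(round(crude_rr, 2))  # crude ratio, below the adjusted IRR of 1.55
```

That the adjusted estimate exceeds the crude one indicates the covariate adjustment strengthened, rather than explained away, the association with pain.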


2014, Vol 58 (12), pp. 7468-7474
Author(s): W. Picard, F. Bazin, B. Clouzeau, H.-N. Bui, M. Soulat, et al.

Abstract: To assess the risk of acute kidney injury (AKI) attributable to aminoglycosides (AGs) in patients with severe sepsis or septic shock, we performed a retrospective cohort study in one medical intensive care unit (ICU) in France. Patients admitted for severe sepsis/septic shock between November 2008 and January 2010 were eligible. A propensity score for AG administration was built using day 1 demographic and clinical characteristics. Patients still in the ICU on day 3 were included; patients with renal failure before day 3 or with endocarditis were excluded. The time window for assessment of renal risk was day 3 to day 15, defined according to the RIFLE (risk, injury, failure, loss, and end-stage renal disease) classification. The AKI risk was assessed by means of a propensity-adjusted Cox proportional hazards regression analysis. Of 317 consecutive patients, 198 received AGs. The SAPS II (simplified acute physiology score II) score and a nosocomial origin of infection favored the use of AGs, whereas preexisting renal insufficiency and a neurological site of infection decreased the propensity for AG treatment. One hundred three patients with renal failure before day 3 were excluded. AGs were given once daily over 2.6 ± 1.1 days. AKI occurred in 16.3% of patients in a median time of 6 (interquartile range, 5 to 10) days. After adjustment for the clinical course and exposure to other nephrotoxic agents between day 1 and day 3, a propensity-adjusted Cox proportional hazards regression analysis showed no increased risk of AKI in patients receiving AGs (adjusted relative risk = 0.75 [0.32 to 1.76]). In conclusion, in critically ill septic patients presenting without early renal failure, aminoglycoside therapy for less than 3 days was not associated with an increased risk of AKI.
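A propensity score like the one built here is typically a logistic regression of treatment receipt on baseline covariates. A minimal gradient-ascent sketch on toy rows — the two covariates and all values below are hypothetical stand-ins for the study's day-1 variables, not its actual model:

```python
import math

# Each row: (scaled SAPS II score, nosocomial infection flag, received AGs)
rows = [(0.9, 1, 1), (0.6, 1, 1), (0.7, 0, 1),
        (0.7, 1, 0), (0.3, 0, 0), (0.2, 0, 0)]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Full-batch gradient ascent on the logistic log-likelihood.
w = [0.0, 0.0, 0.0]  # intercept, SAPS II, nosocomial
for _ in range(5000):
    grad = [0.0, 0.0, 0.0]
    for saps, noso, ag in rows:
        resid = ag - sigmoid(w[0] + w[1] * saps + w[2] * noso)
        grad[0] += resid
        grad[1] += resid * saps
        grad[2] += resid * noso
    w = [wi + 0.1 * gi for wi, gi in zip(w, grad)]

# Fitted probability of receiving AGs; downstream this score (not shown here)
# enters the Cox model as the adjustment term.
propensity = [sigmoid(w[0] + w[1] * saps + w[2] * noso) for saps, noso, _ in rows]
print([round(p, 2) for p in propensity])
```

Patients who actually received AGs end up with higher fitted scores on average, which is exactly the confounding-by-indication pattern the propensity adjustment is meant to remove.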


2019, Vol 37 (15_suppl), pp. 10051-10051
Author(s): Danielle Novetsky Friedman, Pamela J Goodman, Wendy Leisenring, Lisa Diller, Susan Lerner Cohn, et al.

Background: Infants with neuroblastoma typically have low-risk disease with excellent survival. Therapy has been de-intensified over time to minimize late effects; however, the impact on survivors’ risk of late mortality, subsequent malignant neoplasms (SMNs), and chronic health conditions (CHCs) is unclear. Methods: We evaluated late mortality, SMNs, and CHCs (graded according to CTCAE v4.03), overall and by diagnosis era, among 990 5-year neuroblastoma survivors diagnosed at < 1 year of age between 1970 and 1999. Cumulative mortality, standardized mortality ratios (SMRs), and standardized incidence ratios (SIRs) of SMNs were estimated using the National Death Index and SEER rates, respectively. Cox proportional hazards models estimated hazard ratios (HRs) and 95% confidence intervals (CIs) for CHCs, compared with 5,051 CCSS siblings. Results: Among survivors (48% female; median attained age: 24 years, range 6-46), treatment with surgery alone increased across the 1970s, 1980s, and 1990s (21.5%, 35.3%, and 41.1%, respectively), whereas treatment with combination surgery + radiation (22.5%, 5.3%, 0.3%) and surgery + radiation + chemotherapy (28.7%, 14.7%, 9.3%) decreased. The 20-year cumulative mortality was 2.3% (95% CI, 1.4-3.8), primarily due to SMNs (SMR for SMN = 10.0; 95% CI, 4.5-22.3). The 20-year cumulative incidence of SMNs was 1.2% (95% CI, 0.3-3.2), 2.5% (95% CI, 1.3-4.4), and zero for those diagnosed in the 1970s, 1980s, and 1990s, respectively. The SIR was highest for renal SMNs (SIR 12.5; 95% CI, 1.7-89.4). Compared with siblings, survivors were at increased risk of grade 1-5 CHCs (HR 2.1; 95% CI, 1.9-2.3), with similar HRs across eras (1970s: HR 1.9, 95% CI 1.6-2.2; 1980s: HR 2.2, 95% CI 1.9-2.6; 1990s: HR 2.0, 95% CI 1.7-2.4). The HR of severe, disabling, life-threatening, or fatal CHCs (grades 3-5) decreased in more recent eras (1970s: HR 4.7, 95% CI 3.4-6.6; 1980s: HR 4.4, 95% CI 3.2-6.2; 1990s: HR 2.9, 95% CI 2.0-4.3).
Conclusions: Survivors of infant neuroblastoma remain at increased risk of late mortality, SMNs, and CHCs many years after diagnosis. However, the risk of grade 3-5 CHCs has declined in more recent eras, likely reflecting de-intensification of therapy.


2012, Vol 30 (36), pp. 4493-4500
Author(s): John M. McLaughlin, Roger T. Anderson, Amy K. Ferketich, Eric E. Seiber, Rajesh Balkrishnan, et al.

Purpose: To determine the impact of longer periods between biopsy-confirmed breast cancer diagnosis and the initiation of treatment (Dx2Tx) on survival. Patients and Methods: This study was a noninterventional, retrospective analysis of adult female North Carolina Medicaid enrollees diagnosed with breast cancer from January 1, 2000, through December 31, 2002, in the linked North Carolina Central Cancer Registry–Medicaid Claims database. Follow-up data were available through July 31, 2006. Cox proportional hazards regression models were constructed to evaluate the impact on survival of delaying treatment ≥ 60 days after a confirmed diagnosis of breast cancer. Results: The study cohort consisted of 1,786 low-income adult women with a mean age of 61.6 years. A large proportion of the patients (44.3%) were racial minorities. Median time from biopsy-confirmed diagnosis to treatment initiation was 22 days. Adjusted Cox proportional hazards regression showed that although Dx2Tx length did not affect survival among those diagnosed at an early stage, among late-stage patients, intervals between diagnosis and first treatment ≥ 60 days were associated with significantly worse overall survival (hazard ratio [HR], 1.66; 95% CI, 1.00 to 2.77; P = .05) and breast cancer–specific survival (HR, 1.85; 95% CI, 1.04 to 3.27; P = .04). Conclusion: One in 10 women waited ≥ 60 days to initiate treatment after a diagnosis of breast cancer. Waiting ≥ 60 days was associated with a significant 66% and 85% increased risk of overall and breast cancer–related death, respectively, among late-stage patients. Interventions designed to increase the timeliness of breast cancer treatment should target late-stage patients, and clinicians should strive to promptly triage and initiate treatment for patients diagnosed at a late stage.


2020
Author(s): Hayley Martin, Kelly Thevenet-Morrison, Ann Dozier

Abstract. Background: It is well established that mothers with above-normal pre-pregnancy BMI are at increased risk of breastfeeding cessation; however, the impact of pregnancy weight gain (PWG) is less well defined. Excess PWG may alter the hormonal preparation of breast tissue for lactation, increase the risk of complications that negatively impact breastfeeding (e.g., Cesarean section, gestational diabetes), and may make an effective latch more difficult to achieve. Methods: Our objective was to determine the impact of PWG and pre-pregnancy BMI on the risk of breastfeeding cessation using the Institute of Medicine’s 2009 recommendations. Cox proportional hazards models were used to estimate the risk of cessation of exclusive breastfeeding and cessation of any breastfeeding among women who initiated exclusive and any breastfeeding, respectively, in a cross-sectional sample of survey respondents from a New York county (N = 1207). PWG category was interacted with pre-pregnancy BMI (3 levels of pre-pregnancy BMI, 3 levels of PWG). Confounders of the relationship of interest were evaluated using directed acyclic graphs and bivariate analyses; variables not on the proposed causal pathway and associated with both the exposure and the outcome were included in multivariate models. Results: After adjustment, women of normal and obese pre-pregnancy BMI with greater-than-recommended PWG had 1.39 (1.03-1.86) and 1.48 (1.06-2.07) times the risk of any-breastfeeding cessation within the first 3 months postpartum, respectively, compared with women of normal pre-pregnancy BMI who gained within the PWG recommendations. Overweight women with greater-than-recommended PWG were at increased risk of cessation, although not significantly (aHR [95% CI]: 1.29 [0.95-1.75]). No significant relationship was observed for exclusive-breastfeeding cessation. Conclusions: Pre-pregnancy BMI and PWG may be modifiable risk factors for early breastfeeding cessation. The mechanism behind this risk should be investigated in additional studies aimed at understanding the physiological, social, logistical (e.g., positioning), and other issues that may lead to early breastfeeding cessation.
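The 3 × 3 interaction described in the Methods amounts to one indicator column per non-reference BMI/PWG cell. A small sketch of that coding — the category labels and reference cell are assumptions based on the abstract:

```python
from itertools import product

BMI_LEVELS = ("normal", "overweight", "obese")
PWG_LEVELS = ("below", "recommended", "above")
REFERENCE = ("normal", "recommended")  # normal BMI, within-recommendation PWG

# One indicator column per non-reference cell: 8 columns for a 3x3 interaction.
CELLS = [cell for cell in product(BMI_LEVELS, PWG_LEVELS) if cell != REFERENCE]

def interaction_row(bmi, pwg):
    # The Cox model then estimates one hazard ratio per indicator column,
    # each interpreted relative to the reference cell.
    return [1 if (bmi, pwg) == cell else 0 for cell in CELLS]

print(interaction_row("obese", "above"))         # exactly one 1
print(interaction_row("normal", "recommended"))  # all zeros: the reference
```

This fully saturated coding is what lets the abstract report a separate aHR for each BMI/PWG combination (e.g., obese + greater-than-recommended PWG) rather than assuming the two exposures act additively on the log hazard.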


2021, Vol 5 (Supplement_1), pp. 701-702
Author(s): Samuel Miller, Lauren Wilson, Melissa Greiner, Jessica Pritchard, Tian Zhang, et al.

Abstract: Renal dysfunction is a driver of dementia. It is also associated with renal cell carcinoma, possibly as a result of the tumor itself or of cancer treatment. This study evaluates metastatic renal cell carcinoma (mRCC) as a risk factor for developing mild cognitive impairment or dementia (MCI/D), as well as the impact of RCC-directed therapies on the development of MCI/D. We identified all patients diagnosed with mRCC in SEER-Medicare from 2007-2015. The main outcome was incident MCI/D within one year of mRCC diagnosis or cohort entry. Exclusion criteria included age <65 at mRCC diagnosis and a diagnosis of MCI/D within the year preceding mRCC diagnosis. Patients with mRCC (n=2,533) were matched to non-cancer controls (n=7,027) on age, sex, race, comorbidities, and year. Cox proportional hazards regression showed that having mRCC (HR 8.52, 95% CI 6.49-11.18, p<0.001) and being older (HR 1.05 per 1-year age increase, 95% CI 1.03-1.07, p<0.001) were predictive of developing MCI/D. A second Cox proportional hazards regression of only patients with mRCC revealed that neither those initiating treatment with oral anticancer agents (OAAs) nor those who underwent nephrectomy were more likely to develop MCI/D. Black patients had a higher risk of dementia compared with white patients (HR 1.92, 95% CI 1.02-3.59, p=0.047). In conclusion, patients with mRCC were more likely to develop MCI/D than those without mRCC. The medical and surgical therapies evaluated were not associated with an increased incidence of MCI/D. The increased incidence of MCI/D in older adults with mRCC may be the result of the pathology itself.
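The matching step described above can be sketched as exact matching on the stated factors. The records and the choice to match on single years of age are toy assumptions for illustration, not the study's actual procedure:

```python
from collections import defaultdict

# Toy records: pair each mRCC case with non-cancer controls sharing
# the same age, sex, and diagnosis year.
cases = [{"id": 1, "age": 70, "sex": "F", "year": 2010},
         {"id": 2, "age": 78, "sex": "M", "year": 2012}]
controls = [{"id": 10, "age": 70, "sex": "F", "year": 2010},
            {"id": 11, "age": 70, "sex": "F", "year": 2010},
            {"id": 12, "age": 78, "sex": "M", "year": 2012},
            {"id": 13, "age": 66, "sex": "M", "year": 2012}]

def match_key(record):
    return (record["age"], record["sex"], record["year"])

# Index controls by matching key, then link each case to every eligible control.
pool = defaultdict(list)
for control in controls:
    pool[match_key(control)].append(control["id"])

matches = {case["id"]: pool.get(match_key(case), []) for case in cases}
print(matches)
```

Control 13 goes unmatched because no case shares its key; a real pipeline would also cap the number of controls per case (the study's roughly 2.8:1 ratio suggests variable matching).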


2020, Vol 163 (3), pp. 522-530
Author(s): Gregory J. Wiet, Ellen S. Deutsch, Sonya Malekzadeh, Amanda J. Onwuka, Nathan W. Callender, et al.

Objective: To test the feasibility and impact of a simulation training program for myringotomy and tube (M&T) placement. Study Design: Prospective, randomized, controlled. Setting: Multi-institutional. Subjects and Methods: An M&T simulator was used to assess the impact of simulation training vs no simulation training on the rate of achieving competency. Novice trainees were assessed using posttest simulator Objective Structured Assessment of Technical Skills (OSATS) scores, the OSATS score for initial intraoperative tube insertion, and the number of procedures needed to obtain competency. The effect of simulation training was analyzed using χ2 tests, Wilcoxon-Mann-Whitney tests, and Cox proportional hazards regression. Results: A total of 101 residents and 105 raters from 65 institutions were enrolled; however, only 63 residents had sufficient data to be analyzed, owing to substantial breaches in protocol. There was no difference in simulator pretest scores between the intervention and control groups; however, the intervention group had better OSATS global scores on the simulator (17.4 vs 13.7, P = .0003) and OSATS task scores on the simulator (4.5 vs 3.6, P = .02). No difference in OSATS scores was observed during the initial live-surgery rating (P = .73 and P = .41). OSATS scores were predictive of the rate at which residents achieved competence in performing myringotomy; however, the intervention was not associated with subsequent OSATS scores during live surgeries (P = .44 and P = .91) or with the rate of achieving competence (P = .16). Conclusions: A multi-institutional simulation study is feasible. Novices trained using the M&T simulator achieved higher scores on the simulator but not on initial intraoperative OSATS, and they did not reach competency sooner than those not trained on the simulator.

