Risk factors for disease progression in low-teens normal-tension glaucoma

2019 ◽  
Vol 104 (1) ◽  
pp. 81-86 ◽  
Author(s):  
Sung Uk Baek ◽  
Ahnul Ha ◽  
Dai Woo Kim ◽  
Jin Wook Jeoung ◽  
Ki Ho Park ◽  
...  

Background/Aims: To investigate the risk factors for disease progression of normal-tension glaucoma (NTG) with pretreatment intraocular pressure (IOP) in the low teens. Methods: One hundred and two (102) eyes of 102 patients with NTG and pretreatment IOP ≤12 mm Hg who had been followed up for more than 60 months were retrospectively enrolled. Patients were divided into progressor and non-progressor groups according to visual field (VF) progression, as correlated with change of the optic disc or retinal nerve fibre layer defect. Baseline demographic and clinical characteristics, including diurnal IOP and 24-hour blood pressure (BP), were compared between the two groups. The Cox proportional hazards model was used to identify the risk factors for disease progression. Results: Thirty-six patients (35.3%) were classified as progressors and 66 (64.7%) as non-progressors. Between the two groups, no significant differences were found in follow-up period (8.7±3.4 vs 7.7±3.2 years; p=0.138), baseline VF mean deviation (−4.50±5.65 vs −3.56±4.30 dB; p=0.348) or pretreatment IOP (11.34±1.21 vs 11.17±1.06 mm Hg; p=0.121). The multivariate Cox proportional hazards model indicated that greater diurnal IOP at baseline (HR=1.609; p=0.004), greater fluctuation of diastolic BP (DBP; HR=1.058; p=0.002) and presence of optic disc haemorrhage (DH) during follow-up (HR=3.664; p=0.001) were risk factors for glaucoma progression. Conclusion: Among low-teens NTG eyes, 35.3% showed glaucoma progression during an average 8.7 years of follow-up. Fluctuation of DBP and diurnal IOP, as well as DH, were significantly associated with a greater probability of disease progression.
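As a rough illustration of the kind of multivariable Cox proportional hazards analysis referred to above, the sketch below uses the Python lifelines library on a toy data frame; the column names, values, and the treatment of disc haemorrhage as a fixed baseline covariate are simplifying assumptions for illustration, not the study's actual data or model.

```python
# Minimal sketch of a multivariable Cox proportional hazards fit (lifelines).
# Hypothetical columns: time to progression, progression flag, candidate risk factors.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "years_to_event":   [3.2, 8.7, 5.1, 7.7, 2.4, 9.3, 6.0, 4.4],
    "progressed":       [1,   0,   1,   0,   1,   0,   0,   1],
    "diurnal_iop":      [13.1, 11.0, 12.5, 10.8, 13.6, 11.2, 10.9, 12.8],  # mm Hg
    "dbp_fluctuation":  [18.0, 9.5, 15.2, 8.1, 21.4, 10.3, 9.0, 16.7],     # mm Hg
    "disc_haemorrhage": [1,   0,   1,   0,   1,   0,   0,   1],            # simplified to a fixed covariate
})

cph = CoxPHFitter()
cph.fit(df, duration_col="years_to_event", event_col="progressed")
cph.print_summary()  # hazard ratios (exp(coef)), 95% CIs and p-values per covariate
```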

2021 ◽  
Vol 21 (1) ◽  
Author(s):  
Maryam Farhadian ◽  
Sahar Dehdar Karsidani ◽  
Azadeh Mozayanimonfared ◽  
Hossein Mahjub

Abstract Background Due to the limited number of studies with long-term follow-up of patients undergoing Percutaneous Coronary Intervention (PCI), we investigated the occurrence of Major Adverse Cardiac and Cerebrovascular Events (MACCE) during 10 years of follow-up after coronary angioplasty using Random Survival Forest (RSF) and Cox proportional hazards models. Methods The current retrospective cohort study was performed on 220 patients (69 women and 151 men) undergoing coronary angioplasty from March 2009 to March 2012 at the Farshchian Medical Center in Hamadan, Iran. Survival time (in months), the response variable, was measured from the date of angioplasty to the main endpoint or the end of the follow-up period (September 2019). To identify the factors influencing the occurrence of MACCE, the performance of the Cox and RSF models was compared in terms of the C index, the Integrated Brier Score (IBS) and the prediction error. Results Ninety-six patients (43.7%) experienced MACCE by the end of the follow-up period, and the median survival time was estimated to be 98 months. Survival decreased from 99% during the first year to 39% at 10 years of follow-up. By applying the Cox model, the predictors were identified as follows: age (HR = 1.03, 95% CI 1.01–1.05), diabetes (HR = 2.17, 95% CI 1.29–3.66), smoking (HR = 2.41, 95% CI 1.46–3.98), and stent length (HR = 1.74, 95% CI 1.11–2.75). The predictive performance was slightly better with the RSF model (IBS of 0.124 vs. 0.135, C index of 0.648 vs. 0.626 and out-of-bag error rate of 0.352 vs. 0.374 for RSF). In addition to age, diabetes, smoking, and stent length, RSF also identified coronary artery disease (acute or chronic) and hyperlipidemia among the most important variables. Conclusion Machine-learning prediction models such as RSF showed better performance than the Cox proportional hazards model for the prediction of MACCE during long-term follow-up after PCI.
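The Cox-versus-RSF comparison described above can be prototyped along the following lines with scikit-survival; the feature set, the simulated data, the train/test split, and the use of the concordance index as the single comparison metric are all illustrative assumptions (the study also reports the integrated Brier score and out-of-bag error).

```python
# Sketch: comparing Cox PH and Random Survival Forest on the same (simulated) data.
import numpy as np
from sklearn.model_selection import train_test_split
from sksurv.util import Surv
from sksurv.linear_model import CoxPHSurvivalAnalysis
from sksurv.ensemble import RandomSurvivalForest

rng = np.random.default_rng(0)
n = 220
X = np.column_stack([
    rng.normal(60, 10, n),      # age
    rng.integers(0, 2, n),      # diabetes
    rng.integers(0, 2, n),      # smoking
    rng.normal(20, 5, n),       # stent length (mm)
])
time = rng.exponential(80, n)               # months to MACCE or censoring
event = rng.integers(0, 2, n).astype(bool)  # MACCE indicator
y = Surv.from_arrays(event=event, time=time)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

cox = CoxPHSurvivalAnalysis().fit(X_tr, y_tr)
rsf = RandomSurvivalForest(n_estimators=500, random_state=0).fit(X_tr, y_tr)

# .score() returns Harrell's concordance index (C index) on held-out data.
print("Cox C index:", round(cox.score(X_te, y_te), 3))
print("RSF C index:", round(rsf.score(X_te, y_te), 3))
```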


Author(s):  
Yuko Yamaguchi ◽  
Marta Zampino ◽  
Toshiko Tanaka ◽  
Stefania Bandinelli ◽  
Yusuke Osawa ◽  
...  

Abstract Background Anemia is common in older adults and associated with greater morbidity and mortality. The causes of anemia in older adults have not been completely characterized. Although elevated circulating growth and differentiation factor 15 (GDF-15) has been associated with anemia in older adults, it is not known whether elevated GDF-15 predicts the development of anemia. Methods We examined the relationship between plasma GDF-15 concentrations at baseline in 708 non-anemic adults, aged 60 years and older, and incident anemia during 15 years of follow-up among participants in the Invecchiare in Chianti (InCHIANTI) Study. Results During follow-up, 179 (25.3%) participants developed anemia. The proportion of participants who developed anemia, from the lowest to the highest quartile of plasma GDF-15, was 12.9%, 20.1%, 21.2%, and 45.8%, respectively. Adults in the highest quartile of plasma GDF-15 had an increased risk of developing anemia (hazard ratio 1.15, 95% confidence interval 1.09–1.21, P<.0001) compared to those in the lower three quartiles, in a multivariable Cox proportional hazards model adjusting for age, sex, serum iron, soluble transferrin receptor, ferritin, vitamin B12, congestive heart failure, diabetes mellitus, and cancer. Conclusions Circulating GDF-15 is an independent predictor of the development of anemia in older adults.
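A brief sketch of the quartile-based comparison above (highest GDF-15 quartile versus the lower three) in an adjusted Cox model with lifelines; the simulated data frame, the column names, and the reduced adjustment set are assumptions for illustration only.

```python
# Sketch: highest-quartile exposure indicator in an adjusted Cox model (toy data).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 708
df = pd.DataFrame({
    "gdf15":           rng.lognormal(7, 0.5, n),   # baseline GDF-15 (pg/mL), illustrative
    "age":             rng.normal(72, 7, n),
    "female":          rng.integers(0, 2, n),
    "ferritin":        rng.lognormal(4, 0.8, n),
    "years":           rng.uniform(1, 15, n),      # follow-up time
    "incident_anemia": rng.integers(0, 2, n),
})

# Indicator for the highest quartile of baseline GDF-15 versus the lower three.
df["gdf15_q4"] = (df["gdf15"] >= df["gdf15"].quantile(0.75)).astype(int)

cph = CoxPHFitter()
cph.fit(df.drop(columns="gdf15"), duration_col="years", event_col="incident_anemia")
print(cph.summary.loc["gdf15_q4", ["exp(coef)", "p"]])  # adjusted HR for the top quartile
```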


2020 ◽  
Vol 41 (Supplement_2) ◽  
Author(s):  
A Fujino ◽  
H Ogawa ◽  
S Ikeda ◽  
K Doi ◽  
Y Hamatani ◽  
...  

Abstract Background Atrial fibrillation (AF) commonly progresses from paroxysmal to sustained type in the natural course of the disease, and we previously demonstrated that the progression of AF was associated with an increased risk of clinical adverse events. There are some patients, though less frequently, who regress from sustained to paroxysmal AF, but the clinical impact of the regression of AF remains unknown. Purpose We sought to investigate whether regression from sustained to paroxysmal AF is associated with better clinical outcomes. Methods Using the dataset of the Fushimi AF Registry, patients who were diagnosed with sustained (persistent or permanent) AF at baseline were studied. Conversion of sustained AF to paroxysmal AF during follow-up was defined as regression of AF. Major adverse cardiac events (MACE) were defined as the composite of cardiac death, stroke, and hospitalization for heart failure (HF). Event rates were compared between the patients with and without regression of AF. In patients with sustained AF at baseline, predictors of MACE were identified using a Cox proportional hazards model. Results Among 2,253 patients who were diagnosed with sustained AF at baseline, regression of AF was observed in 9.0% (202/2,253; 2.0 per 100 patient-years) during a median follow-up of 4.0 years. Of these, 24.3% (49/202; 4.6 per 100 patient-years) eventually reverted to sustained AF during follow-up. The proportion of asymptomatic patients was lower in patients with regression of AF than in those without (49.0% vs 69.5%; p<0.01). The percentage of beta-blocker use at baseline was similar between the two groups (37.2% vs 33.8%; p=0.34). The proportion of patients who underwent catheter ablation or electrical cardioversion during follow-up was higher in patients with regression of AF (catheter ablation: 15.8% vs 5.5%, p<0.01; cardioversion: 4.0% vs 1.4%, p<0.01). The rate of MACE was significantly lower in patients with regression of AF than in patients who maintained sustained AF (3.7 vs 6.2 per 100 patient-years; log-rank p<0.01). The figure shows the Kaplan-Meier curves for MACE, cardiac death, hospitalization for heart failure, and stroke. In patients with sustained AF at baseline, a multivariable Cox proportional hazards model demonstrated that regression of AF was an independent predictor of lower MACE (adjusted hazard ratio [HR]: 0.50, 95% confidence interval [CI]: 0.28 to 0.88, p=0.02), stroke (HR: 0.51, 95% CI: 0.30 to 0.88, p=0.02), and hospitalization for HF (HR: 0.50, 95% CI: 0.29 to 0.85, p=0.01). Conclusion Regression from sustained to paroxysmal AF was associated with a lower incidence of adverse cardiac events. Funding Acknowledgement Type of funding source: None
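The group comparison of event rates above is the kind of analysis usually shown with Kaplan-Meier curves and a log-rank test; the sketch below does this with lifelines on toy data, where the group labels, times, and event flags are illustrative assumptions.

```python
# Sketch: Kaplan-Meier curves and a log-rank test comparing two groups
# ("regression" marks patients whose AF regressed to paroxysmal; toy data).
import pandas as pd
import matplotlib.pyplot as plt
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

df = pd.DataFrame({
    "years":      [1.2, 4.0, 2.5, 3.8, 0.9, 4.0, 2.2, 3.1],
    "mace":       [0,   0,   1,   0,   1,   0,   1,   1],
    "regression": [1,   1,   1,   1,   0,   0,   0,   0],
})

fig, ax = plt.subplots()
kmf = KaplanMeierFitter()
for label, grp in df.groupby("regression"):
    kmf.fit(grp["years"], event_observed=grp["mace"], label=f"regression={label}")
    kmf.plot_survival_function(ax=ax)  # one survival curve per group

with_reg, without_reg = df[df.regression == 1], df[df.regression == 0]
res = logrank_test(with_reg["years"], without_reg["years"],
                   event_observed_A=with_reg["mace"],
                   event_observed_B=without_reg["mace"])
print("log-rank p =", res.p_value)
```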


2019 ◽  
Vol 2019 ◽  
pp. 1-11 ◽  
Author(s):  
Zhang Haiyu ◽  
Pei Xiaofeng ◽  
Mo Xiangqiong ◽  
Qiu Junlan ◽  
Zheng Xiaobin ◽  
...  

Purpose. The morbidity of esophageal adenocarcinoma (EAC) has increased significantly in Western countries. We aimed to identify trends in incidence and survival of patients with EAC over the past 30 years and then analyzed potential risk factors, including race, sex, age, and socioeconomic status (SES). Methods. All data were collected from the Surveillance, Epidemiology, and End Results (SEER) database. Kaplan–Meier analysis and the Cox proportional hazards model were used to compare survival between variables, including sex, race, age, and SES, and to evaluate the association of these factors with prognosis. Results. A total of 16,474 patients with EAC were identified from 1984 to 2013 in the United States. Overall incidence increased every 10 years, from 1.8 to 3.1 to 3.9 per 100,000. Overall survival gradually improved (p<0.0001), which was most evident in male patients (hazard ratio (HR) = 1.111; 95% confidence interval (CI) 1.07–1.15); however, the 5-year survival rate remained low (20.1%). The Cox proportional hazards model identified old age, black ethnicity, and medium/high poverty as risk factors for EAC (HR = 1.018, 95% CI 1.017–1.019; HR = 1.240, 95% CI 1.151–1.336; and HR = 1.000, 95% CI 1.000–1.000, respectively). Conclusions. The incidence of EAC in the United States increased over time. A survival advantage was observed in white patients and in patients in the low-poverty group. Sex was an independent prognostic factor for EAC, but this finding has to be confirmed by further research.


2015 ◽  
Vol 35 (2) ◽  
pp. 199-205 ◽  
Author(s):  
Fan Zhang ◽  
Hong Liu ◽  
Xiaoli Gong ◽  
Fuyou Liu ◽  
Youming Peng ◽  
...  

Objective: The intent of this study was to evaluate the clinical outcome and risk factors affecting mortality of continuous ambulatory peritoneal dialysis (CAPD) patients in a single peritoneal dialysis (PD) center over a period of 10 years. Patients and methods: We retrospectively analyzed patients on PD from June 2001 to June 2011. The clinical and biochemical data were collected from the medical records. Clinical variables included gender, age at the start of PD, smoking status, body mass index (BMI), cause of end-stage renal disease (ESRD), presence of diabetes mellitus and blood pressure. Biochemical variables included hemoglobin, urine volume, residual renal function (RRF), serum albumin, blood urea nitrogen (BUN), creatinine, total cholesterol, triglyceride, comorbidities, and outcomes. Survival curves were generated by the Kaplan-Meier method. Univariate and multivariate analyses to identify mortality risk factors were performed using the Cox proportional hazards regression model. Results: A total of 421 patients were enrolled, 269 of whom were male (63.9%). The mean age at the start of PD was 57.9 ± 14.8 years. Chronic glomerulonephritis was the most common cause of ESRD (39.4%). Patient survival estimated by the Kaplan-Meier method was 92.5%, 80.2%, 74.4%, and 55.7% at 1, 3, 5, and 10 years, respectively. In the Cox proportional hazards model analysis, patient survival was associated with age (hazard ratio [HR]: 1.641 [1.027 – 2.622], p = 0.038), cardiovascular disease (HR: 1.731 [1.08 – 2.774], p = 0.023), and hypertriglyceridemia (HR: 1.782 [1.11 – 2.858], p = 0.017). Technique survival estimated by the Kaplan-Meier method was 86.7%, 68.8%, 55.7%, and 37.4% at 1, 3, 5, and 10 years, respectively. In the Cox proportional hazards model analysis, age (HR: 1.672 [1.176 – 2.377], p = 0.004) and hypertriglyceridemia (HR: 1.511 [1.050 – 2.174], p = 0.026) predicted technique failure. Conclusion: The PD patients in our center exhibited comparable or even superior patient survival and technique survival rates compared with reports from other centers in China and other countries.
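Reading Kaplan-Meier survival estimates at fixed time points (1, 3, 5 and 10 years), as reported above, can be sketched with lifelines as follows; the follow-up times and event flags are toy values, not the study's data.

```python
# Sketch: Kaplan-Meier estimates read off at 1, 3, 5 and 10 years
# (toy follow-up data in months; illustrative only).
from lifelines import KaplanMeierFitter

follow_up_months = [14, 30, 68, 95, 120, 120, 47, 120, 83, 26]
died             = [1,  0,  1,  0,  0,   1,   1,  0,   0,  1]

kmf = KaplanMeierFitter()
kmf.fit(follow_up_months, event_observed=died)

# Survival probabilities at 1, 3, 5 and 10 years of peritoneal dialysis.
print(kmf.survival_function_at_times([12, 36, 60, 120]))
```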


2017 ◽  
Vol 35 (15_suppl) ◽  
pp. 8014-8014
Author(s):  
Arjun Lakshman ◽  
Muhamad Alhaj Moustafa ◽  
S. Vincent Rajkumar ◽  
Angela Dispenzieri ◽  
Morie A. Gertz ◽  
...  

8014 Background: t(11;14) is a standard-risk cytogenetic marker in MM. Methods: We reviewed 366 patients with MM who had t(11;14) by FISH and 732 age- and period-matched controls without t(11;14), seen at our institution from 2004 to 2014; outcomes were analyzed using time to first progression or death (PFS1) and overall survival (OS). Results: In the t(11;14) group at diagnosis, the median age was 63.7 yr (range, 22.1-95.4) and 64.5% of patients were male. Eighty-nine (24.3%) patients were above 70 yr of age at diagnosis. 33.8%, 40.3% and 25.9% of patients had ISS stage I, II and III disease, respectively. 13% of patients had elevated LDH. Monosomy 17 or del(17p) was identified in 10.6% of patients. The median follow-up period was 56.9 months (m) (95% CI: 54.6-62.2), and 209 (57.1%) patients were alive at last follow-up. Among patients receiving proteasome inhibitor (PI)-based, immunomodulator (IMiD)-based, PI+IMiD-based or other agent-based induction therapy, 71.2%, 70.3%, 90.4% and 37.5% of patients, respectively, attained ≥PR as best response to induction (p < 0.01). During their course, 223 (60.9%) patients underwent stem cell transplant. Median PFS1 and OS were 23.1 (CI: 20.8-27.9) and 78.6 (CI: 66.7-105.9) m, respectively. Among the controls, high-risk cytogenetics (HRC) was present in 142 (19.4%), and the median OS was 83.8 m (CI: 70.8-97.0), comparable to the t(11;14) group (p = 0.8). For all 1,098 patients, using a Cox proportional hazards model with age > 70 years, induction therapy (novel agent-based vs others), cytogenetics [HRC vs t(11;14) without HRC vs no HRC or t(11;14)], and ISS stage III vs II/I as predictors, age > 70 years [HR 2.2 (CI: 1.8-2.8); p < 0.01], ISS III vs ISS II/I [HR 1.4 (CI: 1.1-1.8); p < 0.01] and HRC [HR 2.1 (CI: 1.6-2.8) vs no HRC or t(11;14), p < 0.01; HR 1.9 (CI: 1.4-2.6) vs t(11;14) without HRC, p < 0.01] were associated with reduced OS. The risk of reduced OS did not differ between t(11;14) without HRC and those without t(11;14) or HRC [HR 1.1 (CI: 0.9-1.4), p = 0.4]. Conclusions: Our study characterizes the outcomes of a large cohort of MM patients with t(11;14) at diagnosis. Advanced age, HRC and advanced stage at diagnosis were associated with worse OS in our cohort. t(11;14) MM without HRC does not differ in outcome compared with non-t(11;14) MM without HRC.


2016 ◽  
Author(s):  
Michael S. Lauer

Abstract To inform the retirement of NIH-owned chimpanzees, we analyzed the outcomes of 764 NIH-owned chimpanzees that were located, at various points in time, in at least one of 4 specific locations. All chimpanzees considered were alive and at least 10 years of age on January 1, 2005; transfers to a federal sanctuary began a few months later. During a median follow-up of just over 7 years, there were 314 deaths. In a Cox proportional hazards model that accounted for age, sex, and location (treated as a time-dependent covariate), age and sex were strong predictors of mortality, but location was only marginally predictive. Among 273 chimpanzees who were transferred to the federal sanctuary, we found no materially increased risk of mortality in the first 30 days after arrival. During a median follow-up at the sanctuary of 3.5 years, age was strongly predictive of mortality, but other variables (sex, season of arrival, and ambient temperature on the day of arrival) were not. We confirmed our regression findings using random survival forests. In summary, in a large cohort of captive chimpanzees, we find no evidence of materially important associations of location of residence or recent transfer with premature mortality.
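Treating location as a time-dependent covariate, as described above, requires a long-format dataset with one row per interval of residence; a minimal sketch with lifelines' CoxTimeVaryingFitter follows, in which the identifiers, intervals, and covariates are entirely hypothetical and the covariate set is reduced for simplicity.

```python
# Sketch: Cox regression with a time-dependent covariate in long format
# (one row per interval of residence; all values are hypothetical toy data).
import pandas as pd
from lifelines import CoxTimeVaryingFitter

long_df = pd.DataFrame({
    "id":        [1,   1,   2,   3,   3,   4,   5,   5,   6],
    "start":     [0.0, 2.5, 0.0, 0.0, 4.0, 0.0, 0.0, 5.0, 0.0],
    "stop":      [2.5, 7.1, 6.0, 4.0, 7.3, 3.0, 5.0, 8.0, 6.5],
    "male":      [1,   1,   0,   0,   0,   1,   0,   0,   1],
    "sanctuary": [0,   1,   0,   0,   1,   0,   0,   1,   0],  # 1 after transfer to the sanctuary
    "died":      [0,   1,   0,   0,   0,   1,   0,   1,   0],  # event closing the interval
})

ctv = CoxTimeVaryingFitter()
ctv.fit(long_df, id_col="id", start_col="start", stop_col="stop", event_col="died")
ctv.print_summary()  # hazard ratios for sex and (time-dependent) location
```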


2021 ◽  
Author(s):  
Kenichiro Asano ◽  
Yoji Yamashita ◽  
Takahiro Ono ◽  
Manabu Natsumeda ◽  
Takaaki Beppu ◽  
...  

Abstract Introduction The number of elderly patients with primary central nervous system malignant lymphoma (EL-PCNSL) has been increasing. However, because of their poor pre-treatment Karnofsky Performance Status (KPS) and many comorbidities, it is possible that sufficient treatment has not been provided. We therefore conducted a retrospective cohort study in the Tohoku Brain Tumor Study Group to evaluate the real-world status of EL-PCNSL and the risk factors associated with a poor prognosis. Methods Patients aged ≥ 71 years with PCNSL were enrolled from 8 centers. Univariate analysis was performed with the log-rank test. A Cox proportional hazards model was used for multivariate analysis. Results Three of the total 142 cases received best supportive care (BSC) from the beginning. Treatment was given to 30 cases without a pathological diagnosis, 3 cases with a cerebrospinal fluid diagnosis, and 100 cases with a diagnosis of CD20-positive DLBCL. In total, 133 cases (median age 76 years) were included. The median pre-treatment KPS was 50%. There were 117 (88.0%) patients with 213 pre-treatment comorbidities (1.8 comorbidities per patient). PFS and OS were 16 months and 24 months, respectively. Risk factors associated with poor prognosis in the Cox proportional hazards model were pre-treatment cardiovascular and central nervous system comorbidities, post-treatment pneumonia and other infections, and the absence of radiation or chemotherapy. Conclusions EL-PCNSL was actively treated, and BSC was given in only a few cases. Pre-treatment comorbidities and post-treatment complications would influence the prognosis. Radiation and chemotherapy were found to be effective, but no conclusions could be drawn regarding the content of chemotherapy or whether additional radiation therapy should be used.


2005 ◽  
Vol 32 (3) ◽  
pp. 302-328 ◽  
Author(s):  
Mike Stoolmiller ◽  
Elaine A. Blechman

How well does substance use predict adolescent recidivism? When the Cox proportional hazards model was applied to officially recorded first rearrests of 505 juvenile offenders, a best-fitting complex multivariate model indicated that: (a) a parent report that the youth “often” uses substances more than doubles first-rearrest risk, (b) averaged youth and parent substance-use reports predict recidivism better than a single source, (c) parent or youth denial of youth substance use predicts recidivism, (d) age at first arrest does not predict recidivism, (e) non-White/non-Asian youths have a 79% higher recidivism risk than peers, (f) parent-reported delinquency predicts recidivism with declining accuracy, and (g) substance use robustly predicts recidivism regardless of prior reported delinquency, gender, ethnicity, age, follow-up time, or data source. Findings are related to host-provocation theory.


Cancers ◽  
2021 ◽  
Vol 13 (7) ◽  
pp. 1542
Author(s):  
Kenji Imai ◽  
Koji Takai ◽  
Takao Miwa ◽  
Toshihide Maeda ◽  
Tatsunori Hanai ◽  
...  

We investigated the factors affecting recurrence-free survival in patients with non-B non-C hepatocellular carcinoma (HCC) who received curative treatment. Decision-tree analysis was performed on 72 curative cases of non-B non-C HCC to extract the risk factors for recurrence. The reliability of the extracted risk factors was evaluated using the Kaplan–Meier method and the Cox proportional hazards model. The decision-tree analysis extracted three factors: the visceral adipose tissue (VAT) index (VATI; <71 vs ≥71 cm²/m²), defined as the cross-sectional area of VAT on the computed tomography image at the umbilical level normalized by the square of the height; fasting immunoreactive insulin (FIRI; <5.5 vs ≥5.5 µU/mL); and alpha-fetoprotein (AFP; <11 vs ≥11 ng/mL). The Cox proportional hazards model showed that VATI (hazard ratio (HR): 2.556, 95% confidence interval (CI): 1.191–5.486, p = 0.016), FIRI (HR: 3.149, 95% CI: 1.156–8.575, p = 0.025), and AFP (HR: 3.362, 95% CI: 1.550–7.288, p = 0.002) were all independent risk factors for HCC recurrence. Non-B non-C HCC patients with a high VATI (≥71 cm²/m²), or with high FIRI (≥5.5 µU/mL) and AFP (≥11 ng/mL) when VATI was <71 cm²/m², are prone to recurrence after curative treatment.
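The decision-tree step above, in which a shallow tree proposes cut-off values that are then checked with Kaplan-Meier and Cox analyses, could be prototyped as below with scikit-learn; the variables, simulated values, and tree depth are assumptions for illustration, and the printed thresholds merely stand in for cut-offs such as VATI ≥71 cm²/m².

```python
# Sketch: shallow decision tree to extract candidate risk-factor cut-offs
# (simulated values; illustrative only).
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(2)
n = 72
X = np.column_stack([
    rng.normal(70, 20, n),    # VATI, cm^2/m^2
    rng.normal(6, 2, n),      # FIRI, uU/mL
    rng.lognormal(2, 1, n),   # AFP, ng/mL
])
recurred = rng.integers(0, 2, n)  # recurrence status during follow-up

tree = DecisionTreeClassifier(max_depth=2, min_samples_leaf=10, random_state=0)
tree.fit(X, recurred)

# The learned split thresholds play the role of the cut-off values
# (e.g. VATI >= 71, FIRI >= 5.5, AFP >= 11 in the study above).
print(export_text(tree, feature_names=["VATI", "FIRI", "AFP"]))
```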

