Association of autonomic symptoms with disease progression and survival in progressive supranuclear palsy

2018 ◽  
Vol 90 (5) ◽  
pp. 555-561 ◽  
Author(s):  
Marcos C B Oliveira ◽  
Helen Ling ◽  
Andrew J Lees ◽  
Janice L Holton ◽  
Eduardo De Pablo-Fernandez ◽  
...  

Background Development of autonomic failure is associated with a more rapid disease course and shorter survival in patients with Parkinson’s disease and multiple system atrophy. However, autonomic symptoms have not been specifically assessed as a prognostic factor in progressive supranuclear palsy (PSP). We evaluated whether the development of autonomic symptoms is associated with disease progression and survival in PSP. Methods A retrospective review of clinical data from consecutive patients with autopsy-confirmed PSP from the Queen Square Brain Bank between January 2012 and November 2016 was performed. Times from disease onset to four autonomic symptoms (constipation, urinary symptoms, erectile dysfunction and orthostatic hypotension) were noted. Times from diagnosis to five disease milestones and survival were calculated to assess disease progression, and their risk was estimated through a Cox proportional hazards model. Results A total of 103 patients with PSP were included. Urinary symptoms and constipation were present in 81% and 71% of cases, respectively. Early development of constipation and urinary symptoms was associated with a higher risk of reaching the first disease milestone (respectively, HR: 0.88; 95% CI 0.83 to 0.92; p<0.001; and HR: 0.80; 95% CI 0.75 to 0.86; p<0.001) and with shorter survival (respectively, HR: 0.73; 95% CI 0.64 to 0.84; p<0.001; and HR: 0.88; 95% CI 0.80 to 0.96; p=0.004). On multivariate analysis, Richardson syndrome phenotype was the other variable independently associated with shorter survival. Conclusions Earlier urinary symptoms and constipation are associated with more rapid disease progression and reduced survival in patients with PSP.
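Several of the studies collected here report hazard ratios from a Cox proportional hazards model. As a rough, hypothetical sketch of what such a fit involves, the following fits a one-covariate Cox model by Newton's method on the Breslow partial likelihood; the patient data (survival times and years to symptom onset) are invented for illustration and are not taken from any of the papers above.

```python
# Hedged sketch: one-covariate Cox proportional hazards fit via Newton's
# method on the Breslow partial likelihood. All data below are invented.
import math

def cox_fit(times, events, x, iters=25):
    """Return the coefficient beta; exp(beta) is the hazard ratio."""
    n = len(times)
    beta = 0.0
    for _ in range(iters):
        score = info = 0.0
        for i in range(n):
            if not events[i]:
                continue
            # risk set: subjects still under observation at times[i]
            risk = [j for j in range(n) if times[j] >= times[i]]
            w = [math.exp(beta * x[j]) for j in risk]
            s0 = sum(w)
            s1 = sum(wj * x[j] for wj, j in zip(w, risk))
            s2 = sum(wj * x[j] ** 2 for wj, j in zip(w, risk))
            score += x[i] - s1 / s0              # gradient of log-likelihood
            info += s2 / s0 - (s1 / s0) ** 2     # observed information
        beta += score / info                     # Newton update
    return beta

# Invented covariate: years from disease onset to first autonomic symptom.
# In this toy data, a longer lag goes with longer survival, so the fitted
# hazard ratio per extra year comes out below 1.
surv_years = [2, 3, 4, 5, 6, 7, 8, 9]
event      = [1, 1, 1, 1, 1, 1, 1, 1]
onset_lag  = [1, 3, 2, 4, 3, 6, 5, 7]
beta = cox_fit(surv_years, event, onset_lag)
print("hazard ratio per year:", round(math.exp(beta), 2))
```

This is only a didactic toy; real analyses (as in the papers above) handle censoring, ties, and multiple covariates with established implementations.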

2019 ◽  
Vol 104 (1) ◽  
pp. 81-86 ◽  
Author(s):  
Sung Uk Baek ◽  
Ahnul Ha ◽  
Dai Woo Kim ◽  
Jin Wook Jeoung ◽  
Ki Ho Park ◽  
...  

Background/Aims To investigate the risk factors for disease progression of normal-tension glaucoma (NTG) with pretreatment intraocular pressure (IOP) in the low teens. Methods One hundred and two (102) eyes of 102 patients with NTG with pretreatment IOP ≤12 mm Hg who had been followed up for more than 60 months were retrospectively enrolled. Patients were divided into progressor and non-progressor groups according to visual field (VF) progression as correlated with change of optic disc or retinal nerve fibre layer defect. Baseline demographic and clinical characteristics, including diurnal IOP and 24-hour blood pressure (BP), were compared between the two groups. The Cox proportional hazards model was used to identify the risk factors for disease progression. Results Thirty-six patients (35.3%) were classified as progressors and 66 (64.7%) as non-progressors. Between the two groups, no significant differences were found in the follow-up period (8.7±3.4 vs 7.7±3.2 years; p=0.138), baseline VF mean deviation (−4.50±5.65 vs −3.56±4.30 dB; p=0.348) or pretreatment IOP (11.34±1.21 vs 11.17±1.06 mm Hg; p=0.121). The multivariate Cox proportional hazards model indicated that greater diurnal IOP at baseline (HR=1.609; p=0.004), greater fluctuation of diastolic BP (DBP; HR=1.058; p=0.002) and presence of optic disc haemorrhage (DH) during follow-up (HR=3.664; p=0.001) were risk factors for glaucoma progression. Conclusion Among the low-teens NTG eyes, 35.3% showed glaucoma progression during an average 8.7 years of follow-up. Fluctuation of DBP and diurnal IOP, as well as DH, were significantly associated with a greater probability of disease progression.


PLoS ONE ◽  
2021 ◽  
Vol 16 (12) ◽  
pp. e0260730
Author(s):  
Fumie Kinoshita ◽  
Isao Yokota ◽  
Hiroki Mieno ◽  
Mayumi Ueta ◽  
John Bush ◽  
...  

This study aimed to clarify the etiologic factors predicting acute ocular progression in SJS/TEN and to identify patients who require immediate and intensive ophthalmological treatment. We previously conducted two Japanese surveys of SJS/TEN (covering cases arising in 2005–2007 and 2008–2010) and obtained the medical records, including detailed dermatological and ophthalmological findings, of 230 patients. Acute ocular severity was evaluated as none, mild, severe, or very severe. A multi-state model assuming a Markov process based on the Cox proportional hazards model was used to elucidate the specific factors affecting acute ocular progression. Of the 230 patients, 97 were mild at initial presentation, and 23 of these (24%) worsened to severe/very severe. Acute ocular progression developed within 3 weeks of disease onset. Exposure to nonsteroidal anti-inflammatory drugs (NSAIDs) and younger patient age were found to be statistically significant for the progression of ocular severity from mild to severe/very severe [hazard ratio (HR) 3.83; 95% confidence interval (CI) 1.48 to 9.91] and from none to severe/very severe [HR 0.98; 95% CI 0.97 to 0.99], respectively. The acute ocular severity score at the worst condition was significantly correlated with ocular sequelae. Thus, our detailed findings on acute ocular progression revealed that in 24% of SJS/TEN cases with ocular involvement, ocular severity progresses even after intensive treatment is initiated, and that younger patients with a history of NSAID exposure require very strict attention to their ophthalmological appearance.


2020 ◽  
Author(s):  
Agnes Martine Nielsen ◽  
Rikke Linnemann Nielsen ◽  
Louise Donnelly ◽  
Kaixin Zhou ◽  
Anders Dahl ◽  
...  

Abstract Background: In recent years, a variety of new machine learning methods, e.g. random forests or neural networks, have been employed in the prediction of disease progression, but how do they compare to, and are they direct substitutes for, more traditional statistical methods like the Cox proportional hazards model? In this paper, we compare three of the most commonly used approaches to modelling prediction of disease progression. We consider a type 2 diabetes case from a cohort-based population in Tayside, UK. In this study, the time until a patient goes onto insulin treatment is of interest, in particular discriminating between slow and fast progression. This means that we are interested both in the raw time-to-insulin prediction and in a dichotomized outcome that turns the prediction into a classification. Methods: Three different methods for prediction are considered: a Cox proportional hazards model, a random forest for survival data and a neural network on the dichotomized outcome. Performance is evaluated using survival performance measures (concordance indices and the integrated Brier score) and using the accuracy, sensitivity, specificity, and Matthews correlation coefficient for the corresponding classification problems. Results: We found no improvement when using the conditional inference forest over the Cox model. The neural network outperformed the conditional inference forest in the classification problem. We discuss the limitations of the three approaches, where each excels in terms of prediction performance and interpretation, and how they handle data imbalance.
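The classification metrics named in this abstract (accuracy, sensitivity, specificity, Matthews correlation coefficient) all derive from a 2×2 confusion matrix. A minimal sketch, with invented fast/slow-progression labels rather than the study's data:

```python
# Hypothetical illustration of the classification metrics named above,
# computed from confusion-matrix counts. The labels are invented.
import math

def classification_metrics(y_true, y_pred):
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    accuracy = (tp + tn) / len(y_true)
    sensitivity = tp / (tp + fn)   # true-positive rate
    specificity = tn / (tn + fp)   # true-negative rate
    # Matthews correlation coefficient: balanced even under class imbalance
    mcc = (tp * tn - fp * fn) / math.sqrt(
        (tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return accuracy, sensitivity, specificity, mcc

# Invented labels: 1 = fast progression to insulin, 0 = slow progression.
y_true = [1, 1, 1, 0, 0, 0, 1, 0]
y_pred = [1, 1, 0, 0, 0, 1, 1, 0]
accuracy, sensitivity, specificity, mcc = classification_metrics(y_true, y_pred)
print(accuracy, sensitivity, specificity, mcc)
```

MCC is the measure the authors highlight for imbalanced data, since unlike accuracy it only rewards predictions that do well on both classes.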


2015 ◽  
Vol 2015 ◽  
pp. 1-6 ◽  
Author(s):  
Gustavo Costa Fernandes ◽  
Mariana Peixoto Socal ◽  
Artur Francisco Schumacher Schuh ◽  
Carlos R. M. Rieder

Background. Prognosis of PD is variable. Most studies show higher mortality rates in PD patients compared to the general population. Clinical and epidemiologic factors predicting mortality are poorly understood. Methods. Clinical and epidemiologic features including patient history and physical, functional, and cognitive scores were collected from a hospital-based cohort of PD patients using standardized protocols and clinical scales. Data on comorbidities and mortality were collected on follow-up. Results. During a mean follow-up of 4.71 years (range 1–10), 43 (20.9%) of the 206 patients died. Those who died had a higher mean age at disease onset than those still alive at the last follow-up (67.7 years versus 56.3 years; p<0.01). In the univariate analysis, age at baseline was associated with decreased survival. In the adjusted Cox proportional hazards model, age at disease onset and race/ethnicity were predictors of mortality. Conclusions. Late age at disease onset and advanced chronological age are associated with decreased survival. Comorbidities and PD characteristics were not associated with decreased survival in our sample. Race/ethnicity was found in our study to be associated with an increased hazard of mortality. Our findings indicate the importance of studying survival among different populations of PD patients.


2021 ◽  
Vol 21 (1) ◽  
Author(s):  
Maryam Farhadian ◽  
Sahar Dehdar Karsidani ◽  
Azadeh Mozayanimonfared ◽  
Hossein Mahjub

Abstract Background Due to the limited number of studies with long-term follow-up of patients undergoing Percutaneous Coronary Intervention (PCI), we investigated the occurrence of Major Adverse Cardiac and Cerebrovascular Events (MACCE) during 10 years of follow-up after coronary angioplasty using Random Survival Forest (RSF) and Cox proportional hazards models. Methods The current retrospective cohort study was performed on 220 patients (69 women and 151 men) undergoing coronary angioplasty from March 2009 to March 2012 in Farchshian Medical Center in Hamadan city, Iran. Survival time (months) as the response variable was measured from the date of angioplasty to the main endpoint or the end of the follow-up period (September 2019). To identify the factors influencing the occurrence of MACCE, the performance of the Cox and RSF models was investigated in terms of the C index, Integrated Brier Score (IBS) and prediction error criteria. Results Ninety-six patients (43.7%) experienced MACCE by the end of the follow-up period, and the median survival time was estimated to be 98 months. Survival decreased from 99% during the first year to 39% at 10 years of follow-up. The Cox model identified the following predictors: age (HR = 1.03, 95% CI 1.01–1.05), diabetes (HR = 2.17, 95% CI 1.29–3.66), smoking (HR = 2.41, 95% CI 1.46–3.98), and stent length (HR = 1.74, 95% CI 1.11–2.75). The predictive performance was slightly better with the RSF model (IBS of 0.124 vs. 0.135, C index of 0.648 vs. 0.626 and out-of-bag error rate of 0.352 vs. 0.374 for RSF). In addition to age, diabetes, smoking, and stent length, RSF also included coronary artery disease (acute or chronic) and hyperlipidemia among the most important variables. Conclusion Machine-learning prediction models such as RSF showed better performance than the Cox proportional hazards model for the prediction of MACCE during long-term follow-up after PCI.
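The C index reported above (0.648 vs. 0.626) is Harrell's concordance index: the fraction of comparable patient pairs in which the model assigns the higher risk to the patient who fails first. A small sketch with invented data, not the study's:

```python
# A sketch of Harrell's concordance index (C index), the main survival
# performance measure reported above. Times, events and risk scores are
# invented for illustration.

def concordance_index(times, events, risk):
    """Fraction of comparable pairs in which the higher predicted risk
    corresponds to the earlier observed event; tied scores count half."""
    concordant = tied = comparable = 0
    n = len(times)
    for i in range(n):
        for j in range(n):
            # (i, j) is comparable if i's event is observed (not censored)
            # and occurs strictly before j's time
            if events[i] and times[i] < times[j]:
                comparable += 1
                if risk[i] > risk[j]:
                    concordant += 1
                elif risk[i] == risk[j]:
                    tied += 1
    return (concordant + 0.5 * tied) / comparable

# Invented data: four patients, all events observed.
times  = [2, 4, 6, 8]      # months to MACCE
events = [1, 1, 1, 1]      # 1 = event observed
risk   = [0.9, 0.7, 0.7, 0.1]
c_index = concordance_index(times, events, risk)
print(round(c_index, 3))
```

A C index of 0.5 means the risk scores order patients no better than chance, and 1.0 means perfect ordering, which is why the modest gap between 0.648 and 0.626 is described as only "slightly better".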


Author(s):  
Yuko Yamaguchi ◽  
Marta Zampino ◽  
Toshiko Tanaka ◽  
Stefania Bandinelli ◽  
Yusuke Osawa ◽  
...  

Abstract Background Anemia is common in older adults and associated with greater morbidity and mortality. The causes of anemia in older adults have not been completely characterized. Although elevated circulating growth and differentiation factor 15 (GDF-15) has been associated with anemia in older adults, it is not known whether elevated GDF-15 predicts the development of anemia. Methods We examined the relationship between plasma GDF-15 concentrations at baseline in 708 non-anemic adults, aged 60 years and older, and incident anemia during 15 years of follow-up among participants in the Invecchiare in Chianti (InCHIANTI) Study. Results During follow-up, 179 (25.3%) participants developed anemia. The proportion of participants who developed anemia from the lowest to highest quartile of plasma GDF-15 was 12.9%, 20.1%, 21.2%, and 45.8%, respectively. Adults in the highest quartile of plasma GDF-15 had an increased risk of developing anemia (Hazard Ratio 1.15, 95% Confidence Interval 1.09, 1.21, P<.0001) compared to those in the lower three quartiles in a multivariable Cox proportional hazards model adjusting for age, sex, serum iron, soluble transferrin receptor, ferritin, vitamin B12, congestive heart failure, diabetes mellitus, and cancer. Conclusions Circulating GDF-15 is an independent predictor of the development of anemia in older adults.


2020 ◽  
Vol 4 (Supplement_1) ◽  
pp. 161-161
Author(s):  
Jane Banaszak-Holl ◽  
Xiaoping Lin ◽  
Jing Xie ◽  
Stephanie Ward ◽  
Henry Brodaty ◽  
...  

Abstract Research Aims: This study seeks to understand whether those with dementia experience a higher risk of death, using data from the ASPREE (ASPirin in Reducing Events in the Elderly) clinical trial. Methods: ASPREE was a primary intervention trial of low-dose aspirin among healthy older people. The Australian cohort included 16,703 dementia-free participants aged 70 years and over at enrolment. Participants were triggered for dementia adjudication if cognitive test results were poorer than expected, a dementia diagnosis or memory problems were self-reported, or dementia medications were detected. Incident dementia was adjudicated by an international adjudication committee using the Diagnostic and Statistical Manual of Mental Disorders (DSM-IV) criteria and the results of a neuropsychological battery and functional measures with medical record substantiation. Statistical analyses used a Cox proportional hazards model. Results: As previously reported, 1052 participants (5.5%) died during a median of 4.7 years of follow-up and 964 participants had a dementia trigger, of whom 575 (60%) were adjudicated as having dementia. Preliminary analyses have shown that the mortality rate was higher among participants with a dementia trigger, regardless of dementia adjudication outcome, than those without (15% vs 5%, χ² = 205, p < .001). Conclusion: This study will provide important analyses of differences in the hazard ratio for mortality and causes of death among people with and without cognitive impairment, with important implications for service planning.


Risks ◽  
2021 ◽  
Vol 9 (7) ◽  
pp. 121
Author(s):  
Beata Bieszk-Stolorz ◽  
Krzysztof Dmytrów

The aim of our research was to compare the intensity of decline and then increase in the value of basic stock indices during the SARS-CoV-2 coronavirus pandemic in 2020. The survival analysis methods used to assess the risk of decline and chance of rise of the indices were: Kaplan–Meier estimator, logit model, and the Cox proportional hazards model. We observed the highest intensity of decline in the European stock exchanges, followed by the American and Asian plus Australian ones (after the fourth and eighth week since the peak). The highest risk of decline was in America, then in Europe, followed by Asia and Australia. The lowest risk was in Africa. The intensity of increase was the highest in the fourth and eleventh week since the minimal value had been reached. The highest odds of increase were in the American stock exchanges, followed by the European and Asian (including Australia and Oceania), and the lowest in the African ones. The odds and intensity of increase in the stock exchange indices varied from continent to continent. The increase was faster than the initial decline.
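The first of the three methods this study applies, the Kaplan–Meier (product-limit) estimator, can be sketched in a few lines. The weekly "decline" data below are invented, not taken from the stock-index study:

```python
# A minimal product-limit (Kaplan-Meier) estimator, the first survival
# analysis method listed above. The times and censoring flags are invented.

def kaplan_meier(times, events):
    """Return [(t, S(t))] at each time with at least one observed event."""
    n_at_risk = len(times)
    s = 1.0
    curve = []
    for t in sorted(set(times)):
        deaths = sum(1 for tt, e in zip(times, events) if tt == t and e == 1)
        leaving = sum(1 for tt in times if tt == t)   # events + censorings
        if deaths:
            s *= 1 - deaths / n_at_risk   # product-limit update
            curve.append((t, s))
        n_at_risk -= leaving
    return curve

# Invented data: weeks until an index "declines"; event 0 = censored
# (e.g. the index never reached the decline threshold before week 5).
times  = [1, 2, 2, 3, 4, 5]
events = [1, 1, 0, 1, 0, 1]
curve = kaplan_meier(times, events)
print(curve)
```

Censored observations leave the risk set without triggering a drop in the curve, which is what distinguishes this estimator from a simple empirical distribution of event times.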


BMC Nutrition ◽  
2021 ◽  
Vol 7 (1) ◽  
Author(s):  
Akiko Nakanishi ◽  
Erika Homma ◽  
Tsukasa Osaki ◽  
Ri Sho ◽  
Masayoshi Souri ◽  
...  

Abstract Background Dairy products are known as health-promoting foods. This study prospectively examined the association between milk and yogurt intake and mortality in a community-based population. Methods The study population comprised 14,264 subjects aged 40–74 years who participated in an annual health checkup. The frequency of yogurt and milk intake was categorized as none (< 1/month), low (< 1/week), moderate (1–6/week), and high (> 1/day). The association between yogurt and milk intake and total, cardiovascular, and cancer-related mortality was determined using the Cox proportional hazards model. Results During the follow-up period, there were 265 total deaths, 40 cardiovascular deaths and 90 cancer-related deaths. Kaplan–Meier analysis showed that total mortality in the high/moderate/low yogurt intake and moderate/low milk intake groups was lower than in the none groups (log-rank, P < 0.01). In the multivariate Cox proportional hazards analysis adjusted for possible confounders, the hazard ratio (HR) for total mortality significantly decreased in the high/moderate yogurt intake groups (HR: 0.62, 95% confidence interval [CI]: 0.42–0.91 for high intake; HR: 0.70, 95% CI: 0.49–0.99 for moderate intake) and the moderate milk intake group (HR: 0.67, 95% CI: 0.46–0.97) compared with the no-intake groups. A similar association was observed for cancer-related mortality, but not for cardiovascular mortality. Conclusions Our study showed that yogurt and milk intake was independently associated with decreased total and cancer-related mortality in the Japanese population.


2020 ◽  
Vol 41 (Supplement_2) ◽  
Author(s):  
A Fujino ◽  
H Ogawa ◽  
S Ikeda ◽  
K Doi ◽  
Y Hamatani ◽  
...  

Abstract Background Atrial fibrillation (AF) commonly progresses from paroxysmal type to sustained type in the natural course of the disease, and we previously demonstrated that the progression of AF was associated with an increased risk of clinical adverse events. There are some patients, though less frequently, who regress from sustained to paroxysmal AF, but the clinical impact of the regression of AF remains unknown. Purpose We sought to investigate whether regression from sustained to paroxysmal AF is associated with better clinical outcomes. Methods Using the dataset of the Fushimi AF Registry, patients who were diagnosed as sustained (persistent or permanent) AF at baseline were studied. Conversion of sustained AF to paroxysmal AF during follow-up was defined as regression of AF. Major adverse cardiac events (MACE) were defined as the composite of cardiac death, stroke, and hospitalization for heart failure (HF). Event rates were compared between the patients with and without regression of AF. In patients with sustained AF at baseline, predictors of MACE were identified using a Cox proportional hazards model. Results Among 2,253 patients who were diagnosed as sustained AF at baseline, regression of AF was observed in 9.0% (202/2,253, 2.0 per 100 patient-years) during a median follow-up of 4.0 years. Of these, 24.3% (49/202, 4.6 per 100 patient-years) eventually relapsed to sustained AF during follow-up. The proportion of asymptomatic patients was lower in patients with regression of AF than in those without (with vs without regression; 49.0% vs 69.5%, p<0.01). The percentage of beta-blocker use at baseline was similar between the two groups (37.2% vs 33.8%, p=0.34). The prevalence of patients who underwent catheter ablation or electrical cardioversion during follow-up was higher in patients with regression of AF (catheter ablation: 15.8% vs 5.5%, p<0.01; cardioversion: 4.0% vs 1.4%, p<0.01).
The rate of MACE was significantly lower in patients with regression of AF as compared with patients who maintained sustained AF (3.7 vs 6.2 per 100 patient-years, log-rank p<0.01). The figure shows the Kaplan-Meier curves for MACE, cardiac death, hospitalization for heart failure, and stroke. In patients with sustained AF at baseline, a multivariable Cox proportional hazards model demonstrated that regression of AF was an independent predictor of lower MACE (adjusted hazard ratio [HR]: 0.50, 95% confidence interval [CI]: 0.28 to 0.88, p=0.02), stroke (HR: 0.51, 95% CI: 0.30 to 0.88, p=0.02), and hospitalization for HF (HR: 0.50, 95% CI: 0.29 to 0.85, p=0.01). Conclusion Regression from sustained to paroxysmal AF was associated with a lower incidence of adverse cardiac events. Funding Acknowledgement Type of funding source: None

