Predicting the development of normal tension glaucoma and related risk factors in normal tension glaucoma suspects

2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Hae-Young Lopilly Park ◽  
Da Young Shin ◽  
Soo Ji Jeon ◽  
Yong-Chan Kim ◽  
Younhea Jung ◽  
...  

Abstract This study investigated the predicted risk factors for the development of normal-tension glaucoma (NTG) in NTG suspects. A total of 684 eyes of 379 NTG suspects who were followed up for at least 5 years were included in the study. NTG suspects were those having (1) intraocular pressure within the normal range, (2) a suspicious optic disc (neuroretinal rim thinning) or enlarged cup-to-disc ratio (≥ 0.6), but without definite localized retinal nerve fiber layer (RNFL) defects on red-free disc/fundus photographs, and (3) a normal visual field (VF). Demographic, systemic, and ocular characteristics were determined at the time of the first visit via detailed history-taking and examination of past medical records. Various ocular parameters were assessed using spectral-domain optical coherence tomography and Heidelberg retinal tomography. Conversion to NTG was defined either by the presence of a new localized RNFL defect at the superotemporal or inferotemporal region on red-free disc/fundus photographs, or by the presence of a glaucomatous VF defect on pattern standard deviation plots on two consecutive tests. Hazard ratios were calculated with the Cox proportional hazards model. In total, 86 (12.6%) of the 684 eyes converted to NTG during the follow-up period of 69.39 ± 7.77 months. Significant (P < 0.05, Cox regression) risk factors included medication for systemic hypertension, longer axial length, worse baseline VF parameters, thinner baseline peripapillary RNFL, greater disc torsion, and lamina cribrosa (LC) thickness < 180.5 μm (using a cut-off value obtained by regression analysis). Significant (P < 0.05, Cox regression) risk factors in the non-myopic NTG suspects included medication for systemic hypertension and an LC thinner than the cut-off value. Significant (P < 0.05, Cox regression) risk factors in the myopic NTG suspects included greater disc torsion. The results indicated that 12.6% of NTG suspects converted to NTG during the 5–6-year follow-up period.
NTG suspects taking medication for systemic hypertension, those with optic disc torsion in the inferotemporal direction, and those with a thinner LC of the optic nerve head at baseline were at greater risk of NTG conversion. The relevant baseline risk factors differed between myopic and non-myopic NTG suspects.
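Hazard ratios like those above come from the Cox proportional hazards model. As a purely illustrative sketch (not the authors' code), the hazard ratio for a single binary risk factor — e.g. LC thickness below the 180.5 μm cut-off versus above it — can be estimated by maximising the Cox partial likelihood with Newton's method; the function name and synthetic data are hypothetical:

```python
from math import exp

def cox_hr_binary(times, events, group):
    """Estimate the hazard ratio for a 0/1 covariate by maximising the
    Cox partial likelihood (Breslow handling of ties) with Newton's method."""
    n = len(times)
    order = sorted(range(n), key=lambda i: times[i])
    b = 0.0
    for _ in range(50):
        grad, hess = 0.0, 0.0
        for i in order:
            if not events[i]:
                continue
            # risk set: everyone still under observation at this event time
            risk = [j for j in range(n) if times[j] >= times[i]]
            s0 = sum(exp(b * group[j]) for j in risk)
            s1 = sum(group[j] * exp(b * group[j]) for j in risk)
            p = s1 / s0
            grad += group[i] - p
            hess += p * (1 - p)  # second-derivative term for a 0/1 covariate
        if hess == 0:
            break
        step = grad / hess
        b += step
        if abs(step) < 1e-10:
            break
    return exp(b)  # hazard ratio for group 1 vs group 0
```

With real data, ties are usually handled with the Breslow or Efron correction and standard errors come from the inverse Hessian; a dedicated survival package is preferable in practice.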

2021 ◽  
Vol 14 (10) ◽  
pp. 1553-1559
Author(s):  
Susanne Hopf ◽  
Irene Schmidtmann ◽  
Norbert Pfeiffer ◽  
Esther Maria Hoffmann ◽  
...  

AIM: To investigate short- and long-term intraocular pressure (IOP) fluctuations and further ocular and demographic parameters as predictors for normal tension glaucoma (NTG) progression. METHODS: This retrospective, longitudinal cohort study included 137 eyes of 75 patients with NTG, defined by a glaucomatous optic disc or visual field defect with normal IOP (<21 mm Hg), independently of the therapy regimen. IOP fluctuation, mean, and maximum were inspected, with a mean follow-up of 38mo [standard deviation (SD) 18mo]. Inclusion criteria were the performance of a minimum of two 48-hour IOP profiles including perimetry, Heidelberg retina tomograph (HRT) imaging, and optic disc photographs. The impact of IOP parameters, myopia, sex, cup-to-disc ratio, and visual field results on progression of NTG was analyzed using Cox regression models. A sub-group analysis with results from optical coherence tomography (OCT) was performed. RESULTS: IOP fluctuations, average, and maximum were not risk factors for progression in NTG patients, although maximum IOP at the initial IOP profile was higher in eyes with progression than in eyes without progression (P=0.054). In total, 46/137 (33.5%) eyes progressed over the follow-up period. Overall progression (at least three progression confirmations) occurred in 28/137 eyes (20.4%). Most progressions were detected by perimetry (36/46). Long-term mean IOP over all pressure profiles was 12.8 mm Hg (SD 1.3 mm Hg); IOP fluctuation was 1.4 mm Hg (SD 0.8 mm Hg). The five-year progression-free rate was 58.2% (SD 6.5%). CONCLUSION: Short- and long-term IOP fluctuations do not result in progression of NTG. As functional changes are the most likely to occur, NTG should be monitored with visual field testing more often than with other devices.


Author(s):  
Simo S. A. Miettinen ◽  
Hannu J. A. Miettinen ◽  
Jussi Jalkanen ◽  
Antti Joukainen ◽  
Heikki Kröger

Abstract Introduction This retrospective study investigated the long-term follow-up results of medial opening wedge high tibial osteotomy (MOWHTO) with a pre-contoured non-locking steel plate implant (Puddu plate = PP) used for the treatment of medial knee osteoarthrosis (OA). Materials and methods Seventy consecutive MOWHTOs (66 patients) were performed between 01.01.2004 and 31.12.2008, with a mean follow-up time of 11.4 (SD 4.5; range 1.2–16.1) years. Kaplan–Meier survival analysis was used to evaluate the cumulative survival of the implant in terms of age (< 50 years old and ≥ 50 years old) and gender. Adverse events were studied, and Cox regression analysis was used to evaluate risk factors [age, gender, body mass index (BMI), preoperative mechanical axis, severity of OA, use of bone grafting or substitution, and undercorrection of the mechanical axis from varus to valgus] for revisions. Results The estimates for cumulative survival with no need for TKA after MOWHTO were 86% at 5 years, 67% at 10 years and 58% at 16.1 years (SE 0.6, 95% CI 11.1–13.5). A total of 33/70 (47%) adverse events occurred, and 38/70 (54%) knees required some revision surgery during the follow-up. Cox regression did not show any statistically significant risk factors for revision. Conclusions The PP yields feasible MOWHTO results, with a cumulative survival of 67% at 10 years with no need for conversion to TKA. However, many adverse events occurred, and the revision rate for any reason was high. Neither age nor gender showed a statistically significant difference in survival.
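The cumulative implant survival figures above are Kaplan–Meier (product-limit) estimates. A minimal sketch of the estimator in plain Python, assuming `events` marks revision (1) versus censoring (0); illustrative only, not the study's code:

```python
from itertools import groupby

def kaplan_meier(times, events):
    """Kaplan-Meier product-limit estimator.
    Returns [(t, S(t))] at each distinct event time."""
    data = sorted(zip(times, events))
    at_risk = len(data)
    s = 1.0
    curve = []
    for t, grp in groupby(data, key=lambda p: p[0]):
        grp = list(grp)
        d = sum(e for _, e in grp)        # events at time t
        if d:
            s *= (at_risk - d) / at_risk  # product-limit step
            curve.append((t, s))
        at_risk -= len(grp)               # events and censored leave the risk set
    return curve
```

Censored observations never reduce S(t) directly; they only shrink the risk set for later event times, which is what distinguishes this from a naive event-count fraction.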


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Li Tan ◽  
Yi Tang ◽  
Gaiqin Pei ◽  
Zhengxia Zhong ◽  
Jiaxing Tan ◽  
...  

Abstract It has been reported that histopathologic lesions are risk factors for the progression of IgA nephropathy (IgAN). The aim of this study was to investigate the relationships between mesangial deposition of C1q and renal outcomes in IgAN. 1071 patients with primary IgAN diagnosed by renal biopsy were enrolled at multiple study centers from January 2013 to January 2017. Patients were divided into two groups: C1q-positive and C1q-negative. Using a 1:4 propensity score matching (PSM) method based on age, gender, and treatment modality to minimize confounding factors, 580 matched (out of 926) C1q-negative patients were compared with 145 C1q-positive patients to evaluate the severity of baseline clinicopathological features and renal outcome. Kaplan–Meier and Cox proportional hazards analyses were performed to determine whether mesangial C1q deposition is associated with renal outcomes in IgAN. During the follow-up period (41.89 ± 22.85 months), 54 (9.31%) patients in the C1q-negative group and 23 (15.86%) patients in the C1q-positive group reached the endpoint (50% decline of eGFR and/or ESRD or death), respectively (p = 0.01), in the matched cohort. Significantly more patients in the C1q-negative group achieved complete or partial remission during the follow-up period (P = 0.003), both before and after PSM. The 3-, 5- and 7-year renal survival rates in C1q-positive patients were significantly lower than in C1q-negative patients in both the unmatched and matched cohorts (all p < 0.05). Furthermore, multivariate Cox regression analysis showed that independent risk factors influencing renal survival included Scr, urinary protein, T1-T2 lesion and C1q deposition. Mesangial C1q deposition is a predictor of poor renal survival in IgA nephropathy. Trial registration: TCTR, TCTR20140515001. Registered May 15, 2014, http://www.clinicaltrials.in.th/index.php?tp=regtrials&menu=trialsearch&smenu=fulltext&task=search&task2=view1&id=1074.
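The 1:4 PSM step can be sketched as greedy nearest-neighbour matching without replacement on precomputed propensity scores (here assumed to come from, e.g., a logistic regression of C1q status on age, gender and treatment modality). The caliper value and function name are illustrative assumptions, not details from the study:

```python
def match_1_to_k(treated, controls, k=4, caliper=0.05):
    """Greedy 1:k nearest-neighbour matching without replacement.
    `treated` and `controls` map subject id -> propensity score;
    returns {treated_id: [matched control ids]}."""
    pool = dict(controls)                 # controls still available
    matches = {}
    # match treated units with the most extreme scores first,
    # since they are the hardest to find neighbours for
    for tid, ps in sorted(treated.items(), key=lambda p: p[1], reverse=True):
        nearest = sorted(pool, key=lambda cid: abs(pool[cid] - ps))[:k]
        picks = [c for c in nearest if abs(pool[c] - ps) <= caliper]
        for c in picks:
            del pool[c]                   # each control is used at most once
        matches[tid] = picks
    return matches
```

Matching quality is normally checked afterwards with standardised mean differences of the covariates between the matched groups.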


Stroke ◽  
2016 ◽  
Vol 47 (suppl_1) ◽  
Author(s):  
Marco M Ferrario ◽  
Giovanni Veronesi ◽  
Kari Kuulasmaa ◽  
Martin Bobak ◽  
Lloyd E Chambless ◽  
...  

Introduction and aim: There are limited comparative data on social inequalities in stroke morbidity across Europe. We aimed to assess the magnitude of educational class inequalities in stroke mortality, incidence and 1-year case-fatality in European populations. Methods: The MORGAM study comprised 45 cohorts from Finland, Denmark, Sweden, Northern Ireland, Scotland, France, Germany, Italy, Lithuania, Poland and Russia, mostly recruited in the mid-1980s to early 1990s. Baseline data collection and follow-up (median 12 years) for fatal and non-fatal strokes adhered to MONICA-like procedures. Stroke mortality was defined according to the underlying cause of death (ICD-IX codes 430-438 or ICD-X I60-I69). We derived 3 educational classes from population-, sex- and birth year-specific tertiles of years of schooling. We estimated the age-adjusted difference in event rates, and the age- and risk factor-adjusted hazard ratios (HRs), between the bottom and the top of the educational class distribution from sex- and population-specific Poisson and Cox regression models, respectively. The association between 1-year case-fatality and education was estimated through logistic models adjusted for risk factors. Results: Among the 91,563 CVD-free participants aged 35-74 at baseline, 1037 stroke deaths and 3902 incident strokes occurred during follow-up. Low education accounted for 26 additional stroke deaths per 100,000 person-years in men (95%CI: 9 to 42), and 19 (7 to 32) in women. In both sexes, inequalities in fatal stroke rates were larger in the Eastern European and Nordic populations. The age-adjusted pooled HRs of first stroke, fatal or non-fatal, for the least educated men and women were 1.52 (95%CI: 1.29-1.78) and 1.51 (1.25-1.81), respectively, consistently across populations. Adjustment for smoking, blood pressure, HDL-cholesterol and diabetes attenuated the pooled HRs to 1.34 (95%CI: 1.14-1.57) in men and 1.29 (1.07-1.55) in women. A significant association between low education and increased 1-year case-fatality was observed in Northern Sweden only. Conclusions: Social inequalities in stroke incidence are widespread in most European populations, and less than half of the gap is explained by major risk factors.
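The educational classes above are defined from tertiles of years of schooling within each population-, sex- and birth year-specific stratum. A hedged sketch for a single stratum (the cut-point convention at tertile boundaries is an assumption; the abstract does not specify it):

```python
def tertile_class(years, stratum_years):
    """Assign an educational class (0 = low, 1 = mid, 2 = high) from the
    tertiles of years of schooling within one population/sex/birth-year
    stratum. Boundary values fall into the upper class by convention."""
    ranked = sorted(stratum_years)
    n = len(ranked)
    t1 = ranked[n // 3]        # first tertile cut point
    t2 = ranked[(2 * n) // 3]  # second tertile cut point
    if years < t1:
        return 0
    if years < t2:
        return 1
    return 2
```

Deriving the cut points within each stratum, rather than globally, is what makes the classes comparable across countries with different schooling systems.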


2017 ◽  
Vol 48 (6) ◽  
pp. 974-982 ◽  
Author(s):  
A. R. Sutin ◽  
Y. Stephan ◽  
A. Terracciano

Background Multiple studies have found Conscientiousness to be protective against dementia. The purpose of this study is to identify which specific aspects, or facets, of Conscientiousness are most protective against cognitive impairment and whether these associations are moderated by demographic factors and/or genetic risk. Methods Health and Retirement Study participants were selected for analysis if they completed the facets of Conscientiousness measure, scored in the range of normal cognitive functioning at the baseline personality assessment, and had at least one follow-up assessment of cognition over the up to 6-year follow-up (N = 11 181). Cox regression was used to test for risk of incident dementia and risk of incident cognitive impairment not dementia (CIND). Results Over the follow-up, 278 participants developed dementia and 2186 participants developed CIND. The facet of responsibility had the strongest and most consistent association with dementia risk: every standard deviation increase in this facet was associated with a nearly 35% decreased risk of dementia; self-control and industriousness were also protective. Associations were generally similar when controlling for clinical, behavioral, and genetic risk factors. These three facets were also independent predictors of decreased risk of CIND. Conclusions The present research indicates that individuals who see themselves as responsible, able to control their behavior, and hard workers are less likely to develop CIND or dementia and that these associations persist after accounting for some common clinical, behavioral, and genetic risk factors.
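A hazard ratio "per standard deviation increase", as reported for the responsibility facet, presupposes that the covariate is standardised before fitting. A minimal sketch of that standardisation (illustrative; the study used its own analysis pipeline):

```python
from math import sqrt

def zscore(values):
    """Standardise a covariate so a Cox coefficient reads 'per SD increase'."""
    n = len(values)
    mean = sum(values) / n
    sd = sqrt(sum((v - mean) ** 2 for v in values) / (n - 1))  # sample SD
    return [(v - mean) / sd for v in values]
```

On the standardised scale, exp(beta) is the hazard ratio per SD, so a roughly 35% decreased risk corresponds to an HR of about 0.65.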


BMJ Open ◽  
2018 ◽  
Vol 8 (11) ◽  
pp. e022987 ◽  
Author(s):  
Yu-Yen Chen ◽  
Yun-Ju Lai ◽  
Yung-Feng Yen ◽  
Ying-Cheng Shen ◽  
Chun-Yuan Wang ◽  
...  

Objectives To investigate a possible association between normal tension glaucoma (NTG) and an increased risk of developing Alzheimer’s disease (AD). Design Retrospective cohort study. Setting The NTG group and the comparison group were retrieved from the whole population of the Taiwan National Health Insurance Research Database from 1 January 2001 to 31 December 2013. Participants A total of 15 317 subjects with NTG were enrolled in the NTG group, and 61 268 age-matched and gender-matched subjects without glaucoma were enrolled in the comparison group. Primary and secondary outcome measures Kaplan-Meier curves were generated to compare the cumulative hazard of AD between the two groups. A multivariable Cox regression analysis was used to estimate the adjusted hazard ratios (HRs) of AD, adjusted for diabetes, hypertension, hyperlipidaemia, coronary artery disease and stroke. Furthermore, risk factors for developing AD among the NTG group were investigated. Results The mean age of the cohort was 62.1±12.5 years. Patients with NTG had significantly higher proportions of diabetes, hypertension, hyperlipidaemia, coronary artery disease and stroke than the comparisons. Patients with NTG had a significantly higher cumulative hazard for AD than the comparisons (p<0.0001). In the multivariable Cox regression after adjustment for confounders, the NTG group had a significantly higher risk of AD (adjusted HR 1.52; 95% CI 1.41 to 1.63). Moreover, in the NTG group, when we compared the effects of different types of glaucoma eye drops, none of the eye drops used were significant risk factors or protective factors for AD. Conclusions People with NTG are at a significantly greater risk of developing AD compared with individuals without glaucoma. Among patients with NTG, none of the glaucoma eye drops used significantly changed the risk of subsequent AD.


2010 ◽  
Vol 30 (4) ◽  
pp. 440-447 ◽  
Author(s):  
Jie Dong ◽  
Yuan Chen

Objective We studied whether improper bag exchange predicts the first peritonitis episode in continuous ambulatory peritoneal dialysis (CAPD) patients. Patients and Methods Our single-center prospective observational study of 130 incident urban CAPD patients who started peritoneal dialysis (PD) between March 2005 and August 2008 aimed to determine the relationship between bag exchange procedures examined at the 6th month of PD and risk for a first peritonitis episode. All patients were followed until a first peritonitis episode, censoring, or the end of the study. Results These 130 patients experienced 22 first peritonitis episodes during the 14-month follow-up. During the bag exchange evaluation, 51.5% of patients washed their hands improperly, 46.2% failed to check the expiration date or bag leakage, and 11.5% forgot to wear a face mask and cap. Patients experiencing peritonitis were more likely to forget to wear a face mask and cap. In the multivariate Cox regression model, not wearing a face mask and cap [hazard ratio (HR): 7.26; 95% confidence interval (CI): 2.6 to 20.1; p < 0.001] and having anemia (HR: 0.96; 95% CI: 0.94 to 0.99; p = 0.005) were independent risk factors for a first episode of peritonitis. Conclusions Not wearing a face mask and cap and having anemia were independent risk factors for peritonitis. A further randomized controlled study is needed to verify the correlation between improper bag exchange technique and peritonitis in PD patients.


Blood ◽  
2019 ◽  
Vol 134 (Supplement_1) ◽  
pp. 4253-4253
Author(s):  
Hanne Rozema ◽  
Robby Kibbelaar ◽  
Nic Veeger ◽  
Mels Hoogendoorn ◽  
Eric van Roon

The majority of patients with myelodysplastic syndromes (MDS) require regular red blood cell (RBC) transfusions. Alloimmunization (AI) against blood products is an adverse event, causing time-consuming RBC compatibility testing. The reported incidence of AI in MDS patients varies greatly. Even though different studies on AI in MDS patients have been performed, there are still knowledge gaps. Current literature has not yet fully identified the risk factors and dynamics of AI in individual patients, nor has the influence of disease modifying treatment (DMT) been explored. Therefore, we performed this study to evaluate the effect of DMT on AI. An observational, population-based study, using the HemoBase registry, was performed including all newly diagnosed MDS patients between 2005 and 2017 in Friesland, a province of the Netherlands. All available information about treatment and transfusions, including transfusion dates, types, and treatment regimens, was collected from the electronic health records and laboratory systems. Follow-up occurred through March 2019. For our patient cohort, blood products were matched for ABO and RhD, and transfused per the 'type and screen' policy (i.e. electronic matching of blood group phenotype between patient and donor). After a positive antibody screening, antibody identification and Rh/K phenotyping were performed, and subsequent blood products were (cross)matched accordingly. The observation period was counted from first transfusion until last transfusion or first AI event. Univariate analyses and cumulative frequency distributions were performed to study possible risk factors and dynamics of AI. DMT was defined as hypomethylating agents, lenalidomide, chemotherapy and monoclonal antibodies. The effect of DMT as a temporary risk period on the risk of AI was estimated with incidence rates, relative risks (RR) and hazard ratios (HR) using a Cox regression analysis. Follow-up was limited to 24 months for the Cox regression analysis to avoid possible bias from survival differences. Statistical analyses were performed using IBM SPSS 24 and SAS 9.4. Out of 292 MDS patients, 236 patients received transfusions and were included in this study, covering 463 years of follow-up. AI occurred in 24 patients (10%). AI occurred mostly at the beginning of the observation period: eighteen patients (75%) were alloimmunized by the time they had received 20 units of RBCs, whereas 22 patients (92%) showed AI within 45 units of RBCs (Figure 1). We found no significant risk factors for AI in MDS patients at baseline. DMT was given to 67 patients (28%) during the observation period. Patients on DMT received more RBC transfusions than patients who did not receive DMT (median of 33 (range: 3-154) and 11 (range: 0-322) RBC units, respectively; p<0.001). Four AI events (6%) occurred in patients on DMT and 20 AI events (12%) occurred in patients not on DMT. Cox regression analysis of the first 24 months of follow-up showed an HR of 0.30 (95% CI: 0.07-1.31; p=0.11). The incidence rates per 100 person-years were 3.19 and 5.92, respectively. The corresponding RR was 0.54 (95% CI: 0.16-1.48; p=0.26). Based on our results, we conclude that the incidence of AI in an unselected, real-world MDS population receiving RBC transfusions is 10% and that it predominantly occurred at the beginning of follow-up. Risk factors for AI at baseline could not be identified. Our data showed that patients on DMT received significantly more RBC transfusions but were less susceptible to AI. Therefore, extensive matching of blood products may not be necessary for patients on DMT. Larger studies are needed to confirm the protective effect of DMT on AI. Disclosures Rozema: Celgene: Other: Financial support for visiting MDS Foundation conference.
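The incidence rates and RR quoted above follow directly from event counts and person-years. A sketch of the computation, where the person-year split (≈125 on DMT vs ≈338 off DMT) is inferred from the reported rates rather than stated in the abstract, and the log-scale confidence interval is a standard approximation that need not reproduce the paper's exact interval:

```python
from math import exp, log, sqrt

def incidence_rate(events, person_years, per=100):
    """Incidence rate per `per` person-years."""
    return events / person_years * per

def relative_risk(e1, py1, e0, py0):
    """Rate ratio with an approximate 95% CI on the log scale."""
    rr = (e1 / py1) / (e0 / py0)
    se = sqrt(1 / e1 + 1 / e0)  # SE of the log rate ratio
    lo = exp(log(rr) - 1.96 * se)
    hi = exp(log(rr) + 1.96 * se)
    return rr, lo, hi
```

With the small event counts here (4 vs 20), the interval is wide and crosses 1, which matches the abstract's non-significant p=0.26.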


Circulation ◽  
2020 ◽  
Vol 141 (Suppl_1) ◽  
Author(s):  
Pooja Subedi ◽  
Huaizhen Qin ◽  
Shelley A Cole ◽  
Maria T Plaza ◽  
Arce Domingo-Relloso ◽  
...  

Background: Biological aging assessed by both leukocyte telomere length (LTL) and DNA methylation (DNAm) has been associated with CVD and its risk factors. Moreover, LTL is epigenetically regulated. We hypothesized that LTL-associated epigenetic changes are associated with risk of CVD in the community. Objective: To test whether LTL-associated loci are associated with incident CVD, independent of standard risk factors in multi-ethnic cohorts. Method: We evaluated 3,628 participants with complete LTL and DNAm data in four prospective cohorts, including 1,531 American Indians from the Strong Heart Study (SHS, mean age 56, 60% women), 821 non-Hispanic Whites (NHW) from the Framingham Heart Study (FHS, mean age 60, 51% women), 471 NHW, 150 Hispanics, and 162 African Americans from the Multi-ethnic Study of Atherosclerosis (MESA, mean age 70, 55% women), and 471 NHW and 342 African Americans from the Women’s Health Initiative (WHI, mean age 65, all women). LTL was quantified by qPCR (SHS, MESA) or Southern blot (FHS, WHI). DNAm was assayed by Illumina EPIC (SHS) or 450K (FHS, MESA and WHI) arrays. We imputed 450K to 850K using random forest algorithms. CVD events included fatal and nonfatal MI, CHD, heart failure, stroke, peripheral artery diseases, and cardiovascular deaths. Cohort-specific EWAS was conducted to identify CpGs associated with LTL, adjusting for age, sex, race/ethnicity, smoking, alcohol, BMI, site, cell proportion, and batch. Multiple testing was Bonferroni-corrected (genome-wide P < 2.4 × 10⁻⁷). Results across studies were combined by random-effects meta-analysis. To examine whether LTL-associated epigenetic loci are associated with CVD risk, we used a weighted methylation score to predict incident CVD by Cox regression, adjusting for age, sex, site, smoking, alcohol, BMI, glucose, SBP, LDL-C, and total cholesterol.
Results: We ascertained 2,001 CVD events, including 986 in the SHS (average follow-up 15.2 years), 208 in FHS (average follow-up 7.7 years), 74 in MESA (average follow-up 4.9 years), and 733 in WHI (average follow-up 12.2 years). Meta-EWAS identified 22 CpGs (mapped to 17 unique genes) associated with LTL. Of these, 19 loci (15 negatively and 4 positively associated with LTL) had consistent directionality of association across the four cohorts. The most significant genes harboring altered CpG sites included TLL2 (cg10549018, P = 2.42 × 10⁻¹²) and TPST1 (cg10691866, P = 8.6 × 10⁻¹⁰). A higher composite methylation score, which reflects longer LTL (P < 0.0001), was significantly associated with a reduced risk of CVD in the SHS (HR=0.16, 95% CI: 0.07–0.37), FHS (HR=0.08, 95% CI: 0.01–0.40), and WHI (HR=0.32, 95% CI: 0.13–0.82), but not MESA (HR=0.47, 95% CI: 0.04–5.09). Conclusion: Altered DNA methylation at 19 CpG loci was significantly associated with LTL. Their combined effects may predict a reduced risk of CVD. The observed associations warrant further investigation.
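The composite methylation score is, in essence, a weighted sum of beta-values at the LTL-associated CpGs, with the sign of each weight encoding the direction of association. A toy sketch (the weights below are hypothetical; only the CpG names come from the abstract):

```python
def methylation_score(beta_values, weights):
    """Weighted methylation score: sum of weight * beta-value over the
    LTL-associated CpGs. A CpG negatively associated with LTL carries a
    negative weight, so a higher score tracks longer telomeres."""
    return sum(w * beta_values[cpg] for cpg, w in weights.items())
```

In the study's Cox model, this single score then stands in for the 19 individual CpGs as the predictor of incident CVD.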


2019 ◽  
Vol 104 (1) ◽  
pp. 81-86 ◽  
Author(s):  
Sung Uk Baek ◽  
Ahnul Ha ◽  
Dai Woo Kim ◽  
Jin Wook Jeoung ◽  
Ki Ho Park ◽  
...  

Background/Aims To investigate the risk factors for disease progression of normal-tension glaucoma (NTG) with pretreatment intraocular pressure (IOP) in the low teens. Methods One hundred and two (102) eyes of 102 patients with NTG with pretreatment IOP ≤ 12 mm Hg who had been followed up for more than 60 months were retrospectively enrolled. Patients were divided into progressor and non-progressor groups according to visual field (VF) progression as correlated with change of the optic disc or retinal nerve fibre layer defect. Baseline demographic and clinical characteristics, including diurnal IOP and 24-hour blood pressure (BP), were compared between the two groups. The Cox proportional hazards model was used to identify the risk factors for disease progression. Results Thirty-six patients (35.3%) were classified as progressors and 66 (64.7%) as non-progressors. Between the two groups, no significant differences were found in the follow-up periods (8.7±3.4 vs 7.7±3.2 years; p=0.138), baseline VF mean deviation (−4.50±5.65 vs −3.56±4.30 dB; p=0.348) or pretreatment IOP (11.34±1.21 vs 11.17±1.06 mm Hg; p=0.121). The multivariate Cox proportional hazards model indicated that greater diurnal IOP at baseline (HR=1.609; p=0.004), greater fluctuation of diastolic BP (DBP; HR=1.058; p=0.002) and the presence of optic disc haemorrhage during follow-up (DH; HR=3.664; p=0.001) were risk factors for glaucoma progression. Conclusion Among the low-teens NTG eyes, 35.3% showed glaucoma progression during the average 8.7 years of follow-up. Fluctuation of DBP and diurnal IOP, as well as DH, were significantly associated with a greater probability of disease progression.

