Progressive change in peripapillary atrophy in myopic glaucomatous eyes

2018 ◽  
Vol 102 (11) ◽  
pp. 1527-1532 ◽  
Author(s):  
Min Kyung Song ◽  
Kyung Rim Sung ◽  
Joong Won Shin ◽  
Junki Kwon ◽  
Ji Yun Lee ◽  
...  

Aim To evaluate the progressive change in peripapillary atrophy (PPA) according to its shape and to explore the relationship between PPA progression and glaucoma worsening in myopic eyes. Methods A total of 159 eyes of 159 patients with myopic (axial length (AXL) >24 mm) glaucoma (mean follow-up 4.4 years; 35 eyes with minimal PPA, 40 eyes with concentric-type PPA (>270° around the optic disc) and 84 eyes with eccentric-type PPA (<270°)) were included. Sequential stereoscopic colour optic disc photographs were evaluated to qualitatively determine PPA progression. Factors associated with PPA progression were explored by Cox proportional hazard modelling in each PPA group. Results Patients with concentric PPA were older than patients with eccentric PPA (54.1±11.7 vs 44.1±11.7 years; P<0.001), and AXL was longer in the eccentric group than in the other groups (25.54±1.68 vs 25.28±1.53 vs 26.41±1.29 mm; P<0.001). Twenty-six eyes (65%) in the concentric group and 36 eyes (42.9%) in the eccentric group showed PPA progression. Older age (hazard ratio (HR) 1.059, P=0.008), worse baseline visual field mean deviation (HR 0.857, P=0.009) and greater baseline PPA area (HR 1.000, P=0.012) were associated with PPA progression in the concentric type. Glaucoma progression (HR 3.690, P=0.002) and longer AXL (HR 1.521, P=0.002) were associated with PPA progression in the eccentric type. Conclusions The relationship between glaucoma worsening and PPA progression was strongest in myopic glaucomatous eyes with eccentric-type PPA.
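The group-wise Cox proportional hazard modelling used in this and several of the following abstracts can be illustrated with a minimal, self-contained sketch: a one-covariate Cox partial-likelihood fit solved by Newton's method. This is a generic illustration on synthetic data, not the authors' code; the function name `cox_hr_binary` is hypothetical.

```python
import math

def cox_hr_binary(times, events, x, iters=50):
    """Estimate the hazard ratio for one binary covariate by maximising
    the Cox partial likelihood with Newton's method (no tie handling)."""
    n = len(times)
    beta = 0.0
    for _ in range(iters):
        grad, hess = 0.0, 0.0
        for i in range(n):
            if not events[i]:
                continue  # censored subjects contribute via risk sets only
            risk = [j for j in range(n) if times[j] >= times[i]]
            s0 = sum(math.exp(beta * x[j]) for j in risk)
            s1 = sum(x[j] * math.exp(beta * x[j]) for j in risk)
            s2 = sum(x[j] ** 2 * math.exp(beta * x[j]) for j in risk)
            grad += x[i] - s1 / s0
            hess -= s2 / s0 - (s1 / s0) ** 2
        if abs(grad) < 1e-10:
            break  # converged
        beta -= grad / hess
    return math.exp(beta)  # hazard ratio for x = 1 vs x = 0

# Synthetic example: subjects with x = 1 tend to fail earlier, so HR > 1.
hr = cox_hr_binary([2, 3, 4, 6, 1, 5, 7, 8], [1] * 8, [1, 1, 1, 1, 0, 0, 0, 0])
```

In practice one would use an established survival-analysis library rather than a hand-rolled fit; the sketch only shows what the hazard ratios reported in these abstracts estimate.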

2021 ◽  
pp. ASN.2020081156
Author(s):  
Alexander J. Kula ◽  
David K. Prince ◽  
Joseph T. Flynn ◽  
Nisha Bansal

Background BP is an important modifiable risk factor for cardiovascular events and CKD progression in middle-aged or older adults with CKD. However, studies describing the relationship between BP and outcomes in young adults with CKD are limited. Methods In an observational study, we focused on 317 young adults (aged 21–40 years) with mild to moderate CKD enrolled in the Chronic Renal Insufficiency Cohort (CRIC) Study. Exposures included baseline systolic BP evaluated continuously (per 10 mm Hg increase) and in categories (<120, 120–129, and ≥130 mm Hg). Primary outcomes included cardiovascular events (heart failure, myocardial infarction, stroke, or all-cause death) and CKD progression (50% decline of eGFR or ESKD). We used Cox proportional hazard models to test associations between baseline systolic BP and cardiovascular events and CKD progression. Results Cardiovascular events occurred in 52 participants and CKD progression in 161, during median follow-up times of 11.3 years and 4.1 years, respectively. Among those with baseline systolic BP ≥130 mm Hg, 3%/yr developed heart failure, 20%/yr had CKD progression, and 2%/yr died. In fully adjusted models, baseline systolic BP ≥130 mm Hg (versus systolic BP <120 mm Hg) was significantly associated with cardiovascular events or death (hazard ratio [HR], 2.13; 95% confidence interval [95% CI], 1.05 to 4.32) and with CKD progression (HR, 1.68; 95% CI, 1.10 to 2.58). Conclusions Among young adults with CKD, higher systolic BP is significantly associated with a greater risk of cardiovascular events and CKD progression. Trials of BP management are needed to test targets and treatment strategies specifically in young adults with CKD.


2019 ◽  
Vol 221 (10) ◽  
pp. 1607-1611
Author(s):  
T Zhang ◽  
I B Wilson ◽  
B Youn ◽  
Y Lee ◽  
T I Shireman

Abstract Background This study was conducted to examine patient characteristics associated with antiretroviral therapy (ART) reinitiation in Medicaid enrollees. Methods This is a retrospective cohort study that uses Cox proportional hazard regression to examine the association between person-level characteristics and time from ART discontinuation to subsequent reinitiation within 18 months. Results There were 45 409 patients who discontinued ART, and 44% failed to reinitiate. More outpatient visits (3+ vs 0 outpatient visits: adjusted hazard ratio [adjHR], 1.56; 99% confidence interval [CI], 1.45–1.67) and hospitalization (adjHR, 1.18; 99% CI, 1.16–1.20) during follow-up were associated with reinitiation. Conclusions Failure to reinitiate ART within 18 months was common in this sample. Care engagement was associated with greater ART reinitiation.


2021 ◽  
Vol 22 (1) ◽  
Author(s):  
Yuqi Yang ◽  
Jingjing Da ◽  
Yi Jiang ◽  
Jing Yuan ◽  
Yan Zha

Abstract Background Serum parathyroid hormone (PTH) levels have been reported to be associated with infectious mortality in peritoneal dialysis (PD) patients. Peritonitis is the most common and a fatal infectious complication, resulting in technique failure, hospital admission and mortality. Whether PTH is associated with peritonitis episodes remains unclear. Methods Using Cox proportional hazard regression analyses, we examined the association between PTH levels and peritonitis incidence in a 7-year cohort of 270 incident PD patients maintained on dialysis between January 2012 and December 2018. Patients were categorized into three groups by serum PTH level as follows: low-PTH group, PTH < 150 pg/mL; middle-PTH group, PTH 150–300 pg/mL; high-PTH group, PTH > 300 pg/mL. Results During a median follow-up of 29.5 (interquartile range 16–49) months, the incidence rate of peritonitis was 0.10 episodes per patient-year. Gram-positive organisms were the most common causative microorganisms (36.2%), and a higher percentage of Gram-negative organisms was noted in patients with low PTH levels. Low PTH levels were associated with older age, higher eGFR, higher hemoglobin and calcium levels, and lower phosphate and alkaline phosphatase levels. After multivariate adjustment, low PTH level was identified as an independent risk factor for peritonitis episodes (hazard ratio 1.643, 95% confidence interval 1.014–2.663, P = 0.044). Conclusions Low PTH levels are independently associated with peritonitis in incident PD patients.
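The PTH strata and the crude incidence rate quoted above can be written down directly. A small sketch (function names are illustrative, not from the study):

```python
def pth_group(pth_pg_ml):
    """Assign the study's PTH strata: <150, 150-300, >300 pg/mL."""
    if pth_pg_ml < 150:
        return "low"
    if pth_pg_ml <= 300:
        return "middle"
    return "high"

def incidence_rate(episode_counts, follow_up_years):
    """Crude incidence: total episodes divided by total patient-years."""
    return sum(episode_counts) / sum(follow_up_years)

# Hypothetical toy data: 3 episodes over 10 patient-years -> 0.3 per patient-year.
rate = incidence_rate([1, 0, 2], [2.0, 3.0, 5.0])
```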


BMC Neurology ◽  
2021 ◽  
Vol 21 (1) ◽  
Author(s):  
George Umemoto ◽  
Shinsuke Fujioka ◽  
Hajime Arahata ◽  
Nobutaka Sakae ◽  
Naokazu Sasagasako ◽  
...  

Abstract Background Swallowing dysfunction is a major cause of adverse events and an indicator of shorter survival among patients with neuromuscular disorders (NMD). It is critical to assess swallowing function during disease progression; however, few tools can easily evaluate swallowing function without videofluoroscopic or videoendoscopic examination. Here, we evaluated the longitudinal changes in tongue thickness (TT) and maximum tongue pressure (MTP) among patients with amyotrophic lateral sclerosis (ALS), myotonic dystrophy type 1 (DM1), and Duchenne muscular dystrophy (DMD). Methods Between 2010 and 2020, TT and MTP were measured in 21 ALS, 30 DM1, and 14 DMD patients (mean ages of 66.9, 44.5, and 21.4 years, respectively) at intervals of more than half a year. TT was measured by ultrasonography as the distance from the mylohyoid muscle raphe to the tongue dorsum, and MTP was determined by measuring the maximum compression on a small balloon when the tongue was pressed against the palate. We then examined the relationship between these measures and patient background and swallowing function. Results Mean follow-up periods were 24.0 months in the ALS group, 47.2 months in the DM1 group, and 61.1 months in the DMD group. The DMD group demonstrated a larger first TT than the other groups, while the DM1 group had a lower first MTP than the ALS group. The ALS group showed a greater average monthly reduction in mean TT than the DM1 group and greater monthly reductions in mean body weight (BW) and MTP than the other groups. Significant differences between the first and last BW, TT, and MTP measures were found only in the ALS group. Conclusions This study suggests that ALS is associated with more rapid degeneration of tongue function over several years compared with DMD and DM1.


Author(s):  
Jeffrey F Scherrer ◽  
Joanne Salas ◽  
Timothy L Wiemken ◽  
Christine Jacobs ◽  
John E Morley ◽  
...  

Abstract Background Adult vaccinations may reduce the risk for dementia. However, it has not been established whether tetanus, diphtheria, and pertussis (Tdap) vaccination is associated with incident dementia. Methods Hypotheses were tested in a Veterans Health Affairs (VHA) cohort and replicated in a MarketScan medical claims cohort. Patients were ≥65 years of age and free of dementia for 2 years prior to the index date. Patients either had or did not have a Tdap vaccination by the start of either of two index periods (2011 or 2012). Follow-up continued through 2018. Controls had no Tdap vaccination for the duration of follow-up. Confounding was controlled using entropy balancing. Competing risk (VHA) and Cox proportional hazard (MarketScan) models estimated the association between Tdap vaccination and incident dementia in all patients and in age sub-groups (65–69, 70–74, ≥75 years of age). Results VHA patients were, on average, 75.6 (SD ±7.5) years of age; 4% were female and 91.2% were white. MarketScan patients were, on average, 69.8 (SD ±5.6) years of age, and 65.4% were female. After controlling for confounding, patients with Tdap vaccination, compared with those without, had a significantly lower risk for dementia in both cohorts (VHA: HR = 0.58, 95% CI: 0.54–0.63; MarketScan: HR = 0.58, 95% CI: 0.48–0.70). Conclusions Tdap vaccination was associated with a 42% lower dementia risk in two cohorts with different clinical and sociodemographic characteristics. Several vaccine types are linked to decreased dementia risk, suggesting that these associations are due to nonspecific effects on inflammation rather than vaccine-induced pathogen-specific protective effects.
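Entropy balancing, used here to control confounding, reweights the comparison group so its covariate moments match the treated group. Below is a minimal one-covariate sketch that matches the first moment only, using exponential tilting solved by bisection; the data and function name are hypothetical, and real applications balance many covariates and higher moments.

```python
import math

def entropy_balance(x_treated, x_control, iters=200):
    """Find control weights w_j proportional to exp(l * x_j) such that
    the weighted control mean of x equals the treated mean."""
    target = sum(x_treated) / len(x_treated)

    def weighted_mean(l):
        w = [math.exp(l * xc) for xc in x_control]
        return sum(wi * xc for wi, xc in zip(w, x_control)) / sum(w)

    lo, hi = -50.0, 50.0  # weighted_mean is increasing in l, so bisect
    for _ in range(iters):
        mid = (lo + hi) / 2.0
        if weighted_mean(mid) < target:
            lo = mid
        else:
            hi = mid
    w = [math.exp(lo * xc) for xc in x_control]
    z = sum(w)
    return [wi / z for wi in w]  # weights sum to 1

# Toy data: treated mean of x is 2/3; reweight controls to match it.
weights = entropy_balance([1, 1, 0], [0, 0, 1, 1])
```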


2020 ◽  
pp. 140349482096065
Author(s):  
Hanna Rinne ◽  
Mikko Laaksonen

Aims: Most high-mortality occupations are manual occupations. We examined to what extent the high mortality of such occupations could be explained by education, income, unemployment or industry, and whether these effects differed among manual occupations. Methods: We used longitudinal individual-level register-based data, the study population consisting of employees aged 30–64 at the end of the year 2000, with a follow-up period of 2001–2015. We used Cox proportional hazard regression models in 31 male and 11 female occupations with high mortality. Results: There were considerable differences between manual occupations in how much adjusting for education, income, unemployment and industry explained the excess mortality. The variation was especially large among men: controlling for these variables explained over 50% of the excess mortality in 23 occupations. However, in some occupations the excess mortality even increased relative to unadjusted mortality. Among women, these variables explained a varying proportion of the excess mortality in every occupation. After adjustment for all variables, mortality was no longer statistically significantly higher than average in 14 occupations among men and 2 occupations among women. Conclusions: The high mortality in manual occupations was mainly explained by education, income, unemployment and industry. However, the degree of explanation varied widely between occupations, and considerable variation in mortality remained between manual occupations after controlling for these variables. More research is needed on other determinants of mortality in specific high-risk occupations.
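The share of excess mortality "explained" by adjustment is conventionally quantified by comparing unadjusted and adjusted hazard ratios. The abstract does not state its exact formula, so the following is an assumption based on the common excess-risk attenuation measure:

```python
def pct_excess_explained(hr_unadjusted, hr_adjusted):
    """Percentage of the excess hazard removed by adjustment, computed
    as 100 * (HR_unadj - HR_adj) / (HR_unadj - 1)."""
    return 100.0 * (hr_unadjusted - hr_adjusted) / (hr_unadjusted - 1.0)

# E.g. an occupation whose HR of 2.0 attenuates to 1.5 after adjustment:
explained = pct_excess_explained(2.0, 1.5)
```

Note that this measure can exceed 100% or go negative, which matches the abstract's observation that adjustment sometimes increased excess mortality.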


Author(s):  
ZhaoHong Han

At the recent CLTA-S2 conference, a spirited debate occurred between critics of second language acquisition (SLA) research and researchers who embraced it. Fascinating as it was, neither camp appeared to have convinced the other, but, more important, the debate left much of the audience flummoxed. In this paper, I intend to provide a follow-up, attempting to clarify a) the relationship between research and teaching in the context of Chinese as a second language (CSL), b) misunderstandings on the part of critics over research findings, and c) potential pitfalls in interpreting the SLA literature. My goal is to encourage, as well as contribute to, further communication between the two camps, for the ultimate good of CSL instruction and learning.


Rheumatology ◽  
2018 ◽  
Vol 58 (4) ◽  
pp. 650-655 ◽  
Author(s):  
Alexander Oldroyd ◽  
Jamie C Sergeant ◽  
Paul New ◽  
Neil J McHugh ◽  
Zoe Betteridge ◽  
...  

Abstract Objectives To characterize the 10-year relationship between anti-transcriptional intermediary factor 1 antibody (anti-TIF1-Ab) positivity and cancer onset in a large UK-based adult DM cohort. Methods Data from anti-TIF1-Ab-positive/-negative adults with verified diagnoses of DM from the UK Myositis Network register were analysed. Each patient was followed up until they developed cancer. Kaplan–Meier methods and Cox proportional hazard modelling were employed to estimate the cumulative cancer incidence. Results Data from 263 DM cases were analysed, with a total of 3252 person-years and a median 11 years of follow-up; 55 (21%) DM cases were anti-TIF1-Ab positive. After 10 years of follow-up, a higher proportion of anti-TIF1-Ab-positive cases developed cancer compared with anti-TIF1-Ab-negative cases: 38% vs 15% [hazard ratio 3.4 (95% CI 2.2, 5.4)]. All the detected malignancy cases in the anti-TIF1-Ab-positive cohort occurred between 3 years prior to and 2.5 years after DM onset. No cancer cases were detected within the following 7.5 years in this group, whereas cancers were detected during this period in the anti-TIF1-Ab-negative cases. Ovarian cancer was more common in the anti-TIF1-Ab-positive vs -negative cohort: 19% vs 2%, respectively (P < 0.05). No anti-TIF1-Ab-positive case <39 years of age developed cancer, compared with 21 (53%) of those ≥39 years of age. Conclusion Anti-TIF1-Ab-positive-associated malignancy occurs exclusively within the 3-year period on either side of DM onset, the risk being highest in those ≥39 years of age. Cancer types differ according to anti-TIF1-Ab status, and this may warrant specific cancer screening approaches.
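The cumulative incidence figures above rest on Kaplan–Meier estimation, which can be written in a few lines. A generic sketch on synthetic follow-up data, not the registry data:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier estimate of the survival function.

    times  -- follow-up time for each subject
    events -- 1 if the event was observed, 0 if censored
    Returns (time, S(t)) pairs at each observed event time.
    """
    s = 1.0
    curve = []
    for t in sorted(set(times)):
        at_risk = sum(1 for u in times if u >= t)
        deaths = sum(1 for u, e in zip(times, events) if u == t and e)
        if deaths:
            s *= 1.0 - deaths / at_risk
            curve.append((t, s))
    return curve

# Four subjects, all with observed events: survival steps down at each time.
curve = kaplan_meier([1, 2, 3, 4], [1, 1, 1, 1])
```

The cumulative incidence quoted in the abstract is then simply 1 − S(t) at the chosen horizon.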


2020 ◽  
Author(s):  
Gian Maria Campedelli ◽  
Enzo Yaksic

Relying on a sample of 1,394 US-based multiple homicide offenders (MHOs), we study the duration of the careers of this extremely violent category of offenders through Kaplan-Meier estimation and Cox proportional hazard regression. We characterize these careers in terms of length and provide an inferential analysis of the correlates of career duration. The models indicate that females, MHOs employing multiple methods, younger MHOs, and MHOs who acted in more than one US state have higher odds of longer careers. Conversely, those offending with a partner and those targeting victims from a single sexual group have a higher probability of shorter careers.


2021 ◽  
Vol 8 ◽  
Author(s):  
Dana Bielopolski ◽  
Ruth Rahamimov ◽  
Boris Zingerman ◽  
Avry Chagnac ◽  
Limor Azulay-Gitter ◽  
...  

Background: Microalbuminuria is a well-characterized marker of kidney malfunction, both in diabetic and non-diabetic populations, and is used as a prognostic marker for cardiovascular morbidity and mortality. A few studies have implied that it has the same value in kidney transplanted patients, but this information relies on spot or dipstick urine protein evaluations rather than the gold standard of timed urine collection. Methods: We revisited a cohort of 286 kidney transplanted patients several years after they completed a meticulously timed urine collection and assessed the prevalence of major adverse cardiovascular events (MACE) in relation to albuminuria. Results: During a median follow-up of 8.3 years (IQR 6.4–9.1), 144 outcome events occurred in 101 patients. By Kaplan-Meier analysis, microalbuminuria was associated with an increased rate of CV outcome or death (p = 0.03), and this remained significant after stratification according to propensity score quartiles (p = 0.048). Time-dependent Cox proportional hazard analysis showed an independent association between microalbuminuria and CV outcomes 2 years after microalbuminuria detection (HR 1.83, 95% CI 1.07–2.96). Conclusions: Two years after microalbuminuria was documented in kidney transplanted patients, their CVD risk was increased. There is a need for primary prevention strategies in this population, and future studies should address this topic.

