Longitudinal associations of subclinical hearing loss with cognitive decline

Author(s):  
Alexandria L Irace ◽  
Nicole M Armstrong ◽  
Jennifer A Deal ◽  
Alexander Chern ◽  
Luigi Ferrucci ◽  
...  

Abstract Background Several studies have demonstrated that age-related hearing loss is associated with cognitive decline. We investigated whether subclinical hearing loss (SCHL), or imperfect hearing traditionally categorized as normal (pure-tone average ≤25 dB), may be similarly linked to cognitive decline and risk of incident mild cognitive impairment (MCI)/dementia. Methods Participants from the Baltimore Longitudinal Study of Aging were cognitively normal adults ≥50 years old with cognitive assessments from 1991-2019 and a pure-tone average ≤25 dB measured between 1991 and 1994 (n=263). The exposure was hearing based on the better-ear pure-tone average. Outcomes were test scores in various cognitive domains. Multivariable linear mixed-effects models were used to estimate the association between hearing and change in cognition over time, adjusting for age, sex, education, vascular burden, and race. Kaplan-Meier survival curves and Cox proportional hazards models characterized associations between hearing and incident MCI/dementia diagnosed according to predefined criteria. Results Of 263 participants, 145 (55.1%) were female; mean age was 68.3 years (standard deviation [SD]=8.9). Follow-up ranged up to 27.7 years (mean=11.7 years). After adjusting for multiple comparisons, each 10-dB increase in hearing loss was associated with a change of -0.02 SD per year (95% confidence interval [CI]: -0.03, -0.01) in Letter Fluency. No significant relationships were observed between hearing and incident MCI/dementia. Conclusions A relationship between SCHL and cognitive decline was observed for the Letter Fluency test. Further studies are necessary to determine where in the spectrum of hearing loss an observable relationship between hearing and cognitive decline begins.
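Below is a minimal, illustrative sketch of the kind of linear mixed-effects model described in this abstract, with a hearing × time interaction capturing the annual change in a cognitive z-score per 10 dB of hearing loss. The file name, column names, and covariate coding are hypothetical placeholders, not the BLSA's actual variables.

```python
# Hedged sketch of a linear mixed-effects model for hearing and cognitive change.
# Column names (z_score, pta_better_ear, years, age0, sex, educ, vascular, race,
# participant_id) and the file name are assumptions for illustration.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("blsa_long.csv")          # one row per participant-visit (hypothetical)
df["pta10"] = df["pta_better_ear"] / 10.0  # express hearing per 10-dB increment

# Random intercept and slope for time within participant; the pta10:years term
# estimates the additional annual change in the cognitive z-score per 10 dB of
# hearing loss (the quantity reported above as -0.02 SD/year for Letter Fluency).
model = smf.mixedlm(
    "z_score ~ pta10 * years + age0 + C(sex) + educ + vascular + C(race)",
    data=df,
    groups=df["participant_id"],
    re_formula="~years",
)
result = model.fit()
print(result.summary())
```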

2020 ◽  
Vol 4 (Supplement_1) ◽  
pp. 214-215
Author(s):  
Rahul Sharma ◽  
Anil Lalwani ◽  
Justin Golub

Abstract The progression and asymmetry of age-related hearing loss have not been well characterized in those 80 years of age and older because public datasets mask upper extremes of age to protect anonymity. We aimed to model the progression and asymmetry of hearing loss in the older old using a representative, national database. This was a cross-sectional, multicentered US epidemiologic analysis using the National Health and Nutrition Examination Survey (NHANES) 2005-2006, 2009-2010, and 2011-2012 cycles. Subjects included non-institutionalized, civilian adults 80 years and older (n=621). Federal security clearance was granted to access publicly restricted age data. Outcome measures included pure-tone air-conduction thresholds and the 4-frequency pure-tone average (PTA). The 621 subjects were 80 years old or older (mean=84.2 years, range=80-104 years), representing 10,600,197 Americans. Hearing loss exhibited constant acceleration across the adult lifespan at a rate of 0.0052 dB/year² (95% CI = 0.0049, 0.0055). Compounded over a lifetime, the velocity of hearing loss would increase five-fold, from 0.2 dB loss/year at age 20 to 1 dB loss/year at age 100. This model predicted mean PTA to within 2 dB for most ages between 20 and 100 years. There was no change in the asymmetry of hearing loss with increasing age over 80 years (linear regression coefficient of asymmetry over age=0.07; 95% CI=-0.01, 0.24). In conclusion, hearing loss steadily and predictably accelerates across the adult lifespan to at least age 100, becoming near-universal. These population-level statistics will guide treatment and policy recommendations for hearing health in the older old.
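As a worked check of the figures above, the sketch below assumes the reported 0.0052 dB/year² is the quadratic (age²) coefficient of a model PTA(age) = b0 + b1·age + b2·age², so that the velocity of hearing loss is b1 + 2·b2·age; under that assumption the velocity rises from the stated 0.2 dB/year at age 20 to roughly 1 dB/year at age 100. This interpretation is an assumption, not something stated explicitly in the abstract.

```python
# Back-of-envelope check of the reported progression figures, assuming the
# 0.0052 dB/year^2 value is the quadratic (age^2) coefficient of
# PTA(age) = b0 + b1*age + b2*age^2, so PTA'(age) = b1 + 2*b2*age.
b2 = 0.0052                 # reported quadratic coefficient (dB/year^2)
v20 = 0.2                   # reported velocity of hearing loss at age 20 (dB/year)
b1 = v20 - 2 * b2 * 20      # solve for the linear coefficient from the age-20 velocity

def velocity(age):
    """Annual rate of hearing loss (dB/year) at a given age, under the assumed model."""
    return b1 + 2 * b2 * age

for age in (20, 60, 80, 100):
    print(f"age {age}: {velocity(age):.2f} dB/year")
# Prints roughly 0.20, 0.62, 0.82, and 1.03 dB/year -- the five-fold increase
# from age 20 to age 100 described in the abstract.
```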


2020 ◽  
Vol 41 (Supplement_2) ◽  
Author(s):  
S Kochav ◽  
R.C Chen ◽  
J.M.D Dizon ◽  
J.A.R Reiffel

Abstract Background Theoretical concern exists regarding AV block (AVB) with class I antiarrhythmic drugs (AADs) when bundle branch block (BBB) is present. Whether this is substantiated in real-world populations is unknown. Purpose To determine the relationship between type of AAD and incidence of AVB in patients with preexisting BBB. Methods We retrospectively studied all patients with BBB who received class I and III AADs between 1997–2019 to compare the incidence of AVB. We defined index time as the first exposure to either drug class and excluded patients with prior AVB or exposure to both classes. The time-at-risk window ended at the first outcome occurrence or when patients were no longer observed in the database. We estimated hazard ratios for incident AVB using Cox proportional hazards models with propensity score stratification, adjusting for over 32,000 covariates from the electronic health record. Kaplan-Meier methods were used to determine treatment effects over time. Results Of 40,120 individuals with BBB, 148 were exposed to a class I AAD and 2,401 to a class III AAD. Over nearly 4,200 person-years of follow-up, there were 22 and 620 outcome events in the class I and class III cohorts, respectively (Figure). In adjusted analyses, AVB risk was markedly lower in patients exposed to class I AADs compared with class III (HR 0.48 [95% CI 0.30–0.75]). Conclusion Among patients with BBB, exposure to class III AADs was strongly associated with a greater risk of incident AVB. This likely reflects differences in the natural history of patients receiving class I vs class III AADs rather than adverse class III effects; however, the lack of acutely worse outcomes with class I AADs suggests that they may be safer in BBB than suspected. Funding Acknowledgement Type of funding source: None
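The sketch below illustrates, in greatly simplified form, the analytic pattern described here: estimate a propensity score for class I (vs class III) exposure, stratify on it, and fit a Cox model for incident AV block within strata. The actual study used large-scale regularized propensity scores over more than 32,000 covariates; the file name, covariate list, and column names below are hypothetical.

```python
# Hedged, simplified sketch of propensity-score stratification followed by a
# stratified Cox model. All names are illustrative assumptions.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from lifelines import CoxPHFitter

df = pd.read_csv("bbb_cohort.csv")                         # one row per patient (hypothetical)
covariates = ["age", "male", "ckd", "hf", "beta_blocker"]  # illustrative covariate subset

# 1. Propensity of receiving a class I (vs class III) antiarrhythmic.
ps_model = LogisticRegression(max_iter=1000).fit(df[covariates], df["class_i"])
df["ps"] = ps_model.predict_proba(df[covariates])[:, 1]

# 2. Stratify on propensity-score quintiles.
df["ps_stratum"] = pd.qcut(df["ps"], q=5, labels=False)

# 3. Cox model for time to AV block, stratified on the propensity-score quintile.
cph = CoxPHFitter()
cph.fit(
    df[["time_to_avb", "avb_event", "class_i", "ps_stratum"]],
    duration_col="time_to_avb",
    event_col="avb_event",
    strata=["ps_stratum"],
)
cph.print_summary()  # hazard ratio for class I vs class III exposure
```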


2017 ◽  
Vol 117 (06) ◽  
pp. 1072-1082 ◽  
Author(s):  
Xiaoyan Li ◽  
Steve Deitelzweig ◽  
Allison Keshishian ◽  
Melissa Hamilton ◽  
Ruslan Horblyuk ◽  
...  

Summary The ARISTOTLE trial showed a risk reduction of stroke/systemic embolism (SE) and major bleeding in non-valvular atrial fibrillation (NVAF) patients treated with apixaban compared to warfarin. This retrospective study used four large US claims databases (MarketScan, PharMetrics, Optum, and Humana) of NVAF patients newly initiating apixaban or warfarin from January 1, 2013 to September 30, 2015. After 1:1 warfarin-apixaban propensity score matching (PSM) within each database, the resulting patient records were pooled. Kaplan-Meier curves and Cox proportional hazards models were used to estimate the cumulative incidence and hazard ratios (HRs) of stroke/SE and major bleeding (identified using the first listed diagnosis of inpatient claims) within one year of therapy initiation. The study included a total of 76,940 patients (38,470 warfarin and 38,470 apixaban). Among the 38,470 matched pairs, 14,563 were from MarketScan, 7,683 from PharMetrics, 7,894 from Optum, and 8,330 from Humana. Baseline characteristics were balanced between the two cohorts, with a mean (standard deviation [SD]) age of 71 (12) years and a mean (SD) CHA2DS2-VASc score of 3.2 (1.7). Apixaban initiators had a significantly lower risk of stroke/SE (HR: 0.67, 95% CI: 0.59–0.76) and major bleeding (HR: 0.60, 95% CI: 0.54–0.65) than warfarin initiators. Risks of the different types of stroke/SE and major bleeding – including ischaemic stroke, haemorrhagic stroke, SE, intracranial haemorrhage, gastrointestinal bleeding, and other major bleeding – were all significantly lower for apixaban compared to warfarin treatment. Subgroup analyses (apixaban dosage, age strata, CHA2DS2-VASc or HAS-BLED score strata, or dataset source) all showed consistently lower risks of stroke/SE and major bleeding associated with apixaban compared to warfarin treatment. This is the largest "real-world" study on apixaban effectiveness and safety to date, showing that apixaban initiation was associated with significant risk reductions in stroke/SE and major bleeding compared to warfarin initiation after PSM. These benefits were consistent across various high-risk subgroups and both the standard- and low-dose apixaban regimens. Note: The review process for this manuscript was fully handled by Christian Weber, Editor in Chief. Supplementary Material to this article is available online at www.thrombosis-online.com.
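A minimal sketch of the matching step described above: 1:1 propensity-score matching of apixaban and warfarin initiators within each database, followed by pooling of the matched records. The greedy nearest-neighbour matcher, file names, and covariates are illustrative assumptions; the study's exact matching specification (caliper, replacement, covariate set) is not reproduced here.

```python
# Hedged sketch of 1:1 propensity-score matching within each claims database,
# then pooling the matched samples. All names are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

def match_one_to_one(df, covariates, treat_col="apixaban"):
    """Return a 1:1 greedy propensity-score-matched sample (illustrative only)."""
    lr = LogisticRegression(max_iter=1000).fit(df[covariates], df[treat_col])
    df = df.assign(ps=lr.predict_proba(df[covariates])[:, 1])
    treated = df[df[treat_col] == 1]
    control = df[df[treat_col] == 0]
    # Greedy nearest-neighbour match on the propensity score (no caliper;
    # the study's actual matching rules may differ).
    nn = NearestNeighbors(n_neighbors=1).fit(control[["ps"]])
    _, idx = nn.kneighbors(treated[["ps"]])
    return pd.concat([treated, control.iloc[idx.ravel()]])

# Match within each database separately, then pool the matched records.
covs = ["age", "female", "chads_vasc", "has_bled"]   # illustrative covariates only
pooled = pd.concat(
    match_one_to_one(pd.read_csv(f"{db}.csv"), covs)
    for db in ["marketscan", "pharmetrics", "optum", "humana"]
)
```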


Neurology ◽  
2019 ◽  
Vol 93 (24) ◽  
pp. e2247-e2256 ◽  
Author(s):  
Miguel Arce Rentería ◽  
Jet M.J. Vonk ◽  
Gloria Felix ◽  
Justina F. Avila ◽  
Laura B. Zahodne ◽  
...  

Objective To investigate whether illiteracy was associated with greater risk of prevalent and incident dementia and more rapid cognitive decline among older adults with low education. Methods Analyses included 983 adults (≥65 years old, ≤4 years of schooling) who participated in a longitudinal community aging study. Literacy was self-reported ("Did you ever learn to read or write?"). Neuropsychological measures of memory, language, and visuospatial abilities were administered at baseline and at follow-up visits (median [range] follow-up 3.49 [0–23] years). At each visit, functional, cognitive, and medical data were reviewed and a dementia diagnosis was made using standard criteria. Logistic regression and Cox proportional hazards models evaluated the association of literacy with prevalent and incident dementia, respectively, while latent growth curve models evaluated the effect of literacy on cognitive trajectories, adjusting for relevant demographic and medical covariates. Results Illiterate participants were almost 3 times as likely to have dementia at baseline as literate participants. Among those who did not have dementia at baseline, illiterate participants were twice as likely to develop dementia. While illiterate participants showed worse memory, language, and visuospatial functioning at baseline than literate participants, literacy was not associated with rate of cognitive decline. Conclusion We found that illiteracy was independently associated with higher risk of prevalent and incident dementia, but not with a more rapid rate of cognitive decline. The independent effect of illiteracy on dementia risk may operate through a lower range of cognitive function, which is closer to diagnostic thresholds for dementia than the range of literate participants.


2006 ◽  
Vol 24 (18_suppl) ◽  
pp. 560-560 ◽  
Author(s):  
D. A. Patt ◽  
Z. Duan ◽  
G. Hortobagyi ◽  
S. H. Giordano

560 Background: Adjuvant chemotherapy for breast cancer is associated with the development of secondary AML, but this risk in an older population has not previously been quantified. Methods: We queried the Surveillance, Epidemiology, and End Results-Medicare (SEER-Medicare) database for women diagnosed with nonmetastatic breast cancer from 1992–1999. We compared the risk of AML in patients with and without adjuvant chemotherapy (C), and by differing C regimens. The primary endpoint was a claim with an inpatient or outpatient diagnosis of AML (ICD-9 codes 205–208). Risk of AML was estimated using the Kaplan-Meier method. Cox proportional hazards models were used to determine factors independently associated with AML. Results: 36,904 patients were included in this observational study, 4,572 who had received adjuvant C and 32,332 who had not. The median patient age was 75.3 years (range, 66.0–103.3). The median follow-up was 63 months (range, 13–132). Patients who received C were significantly younger, had more advanced stage disease, and had lower comorbidity scores (p<0.001). The unadjusted risk of developing AML at 10 years after any adjuvant C for breast cancer was 1.6%, versus 1.1% for women who had not received C. The adjusted HR for AML with adjuvant C was 1.72 (1.16–2.54) compared to women who did not receive C. The HR for radiation was 1.21 (0.86–1.70). The HR increased with age, but this was not statistically significant (p>0.05). An analysis was performed among women who received C. Compared to other C regimens, anthracycline-based therapy (A) conveyed a significantly higher hazard of AML (HR 2.17 [1.08–4.38]), while patients who received A plus taxanes (T) did not have a significant increase in risk (HR 1.29 [0.44–3.82]), nor did patients who received T with some other C (HR 1.50 [0.34–6.67]). GCSF use was another significant independent predictor of AML (HR 2.21 [1.14–4.25]). In addition, increasing A dose was associated with higher risk of AML (p<0.05). Conclusions: There is a small but real increase in AML after adjuvant chemotherapy for breast cancer in older women. The risk appears to be highest with A-based regimens, most of which also contained cyclophosphamide, and may be dose-dependent. Taxanes do not appear to increase risk. The role of GCSF should be further explored. No significant financial relationships to disclose.


2015 ◽  
Vol 33 (3_suppl) ◽  
pp. 453-453
Author(s):  
Kelly Elizabeth Orwat ◽  
Samuel Lewis Cooper ◽  
Michael Ashenafi ◽  
M. Bret Anderson ◽  
Marcelo Guimaraes ◽  
...  

453 Background: Systemic therapies for unresectable liver malignancies may provide a survival benefit, but eventually prove intolerable or ineffective. TARE provides an additional liver-directed treatment option to improve local control for these patients, but there are limited data on patient factors associated with survival. Methods: All patients who received TARE at the Medical University of South Carolina from March 2006 through May 2014 were included in this analysis of overall survival (OS) and toxicity. Kaplan-Meier estimates of OS from the date of first procedure are reported. Potential prognostic factors for OS were evaluated using log-rank tests and Cox proportional hazards models. Results: In 114 patients who received TARE at our institution, median follow-up was 6.4 months [range 0-86], with the following tumor histology: colorectal (CR) n=55, hepatocellular (HC) n=20, cholangiocarcinoma (CC) n=16, neuroendocrine (NE) n=12, breast (BR) n=6, other n=5. At least 1 line of systemic therapy prior to TARE was noted in 79% of patients. Median OS was 6.6 months and 1-year OS was 30.7%. The percentage of patients who died within 3 months of TARE was 46.2% for patients with albumin < 3 but only 20.3% for patients with albumin ≥ 3. Grade ≥ 2 toxicity was observed in 22 patients (19.3%), including 9 (7.9%) with Grade 3 and 1 (0.9%) with Grade 4 toxicity. A single patient with a pre-existing pulmonary arteriovenous malformation experienced Grade 3 pneumonitis that resolved with steroids. No deaths were attributed to radiation-induced liver disease. Conclusions: TARE is a relatively safe and effective treatment for unresectable intrahepatic malignancies. NE or BR histology and better hepatic synthetic function were associated with significantly better survival. Our data suggest that patients with albumin below 3 may not benefit from TARE.


2021 ◽  
Vol 8 ◽  
Author(s):  
Bin Zhou ◽  
Xuerong Sun ◽  
Na Yu ◽  
Shuang Zhao ◽  
Keping Chen ◽  
...  

Background: The results of studies on the obesity paradox in all-cause mortality are inconsistent in patients with an implantable cardioverter-defibrillator (ICD), and relevant studies with large sample sizes are lacking for Chinese populations. This study aimed to investigate whether the obesity paradox in all-cause mortality is present among the Chinese population with an ICD. Methods: We conducted a retrospective analysis of multicenter data from the Study of Home Monitoring System Safety and Efficacy in Cardiac Implantable Electronic Device-implanted Patients (SUMMIT) registry in China. The outcome was all-cause mortality. Kaplan-Meier curves, Cox proportional hazards models, and smooth curve fitting were used to investigate the association between body mass index (BMI) and all-cause mortality. Results: After applying the inclusion and exclusion criteria, 970 patients with an ICD were enrolled. Over a median follow-up of 5 years (interquartile range, 4.1–6.0 years), all-cause mortality occurred in 213 (22.0%) patients. According to the Kaplan-Meier curves and multivariate Cox proportional hazards models, BMI had no significant impact on all-cause mortality, whether modeled as a continuous variable or as a categorical variable under various BMI categorization criteria. The fully adjusted smoothed curve fit showed a linear relationship between BMI and all-cause mortality (p = 0.14 for the non-linearity test), with no statistically significant association between BMI and all-cause mortality (per 1 kg/m² increase in BMI, hazard ratio [HR] 0.97, 95% CI 0.93–1.02, p = 0.2644). Conclusions: The obesity paradox in all-cause mortality was absent in Chinese patients with an ICD. Prospective studies are needed to further explore this phenomenon.


2021 ◽  
pp. 1-14 ◽  
Author(s):  
Olga Mitelman ◽  
Hoda Z. Abdel-Hamid ◽  
Barry J. Byrne ◽  
Anne M. Connolly ◽  
Peter Heydemann ◽  
...  

Background: Studies 4658-201/202 (201/202) evaluated treatment effects of eteplirsen over 4 years in patients with Duchenne muscular dystrophy and confirmed exon 51-amenable genetic mutations. Chart-review Study 4658-405 (405) further followed these patients while they received eteplirsen during usual clinical care. Objective: To compare long-term clinical outcomes of eteplirsen-treated patients from Studies 201/202/405 with those of external controls. Methods: Median total follow-up time was approximately 6 years of eteplirsen treatment. Outcomes included loss of ambulation (LOA) and percent-predicted forced vital capacity (FVC%p). Time to LOA was compared between eteplirsen-treated patients and standard of care (SOC) external controls and was measured from eteplirsen initiation in 201/202 or, in the SOC group, from the first study visit. Comparisons were conducted using univariate Kaplan-Meier analyses and log-rank tests, and multivariate Cox proportional hazards models with regression adjustment for baseline characteristics. Annual change in FVC%p was compared between eteplirsen-treated patients and natural history study patients using linear mixed models with repeated measures. Results: Data were included from all 12 patients in Studies 201/202 and the 10 patients with available data from Study 405. Median age at LOA was 15.16 years. Eteplirsen-treated patients had a statistically significantly longer median time to LOA, by 2.09 years (5.09 vs. 3.00 years, p < 0.01), and significantly attenuated rates of pulmonary decline compared with natural history patients (FVC%p change: –3.3 vs. –6.0 percentage points annually, p < 0.0001). Conclusions: Study 405 highlights the functional benefits of eteplirsen on ambulatory and pulmonary function outcomes over up to 7 years of follow-up in comparison with external controls.


PLoS ONE ◽  
2021 ◽  
Vol 16 (1) ◽  
pp. e0242467
Author(s):  
Yu-Chen Yeh ◽  
Joseph C. Cappelleri ◽  
Xiaocong L. Marston ◽  
Ahmed Shelbaya

Objective To examine pregabalin dose titration and its impact on treatment adherence and duration in patients with neuropathic pain (NeP). Methods The MarketScan database (2009–2014) was used to extract a cohort of incident adult pregabalin users with NeP who had at least 12 months of follow-up data. Any dose augmentation within 45 days following the first pregabalin claim was defined as dose titration. Adherence (measured by the medication possession ratio [MPR]) and persistence (measured as the duration of continuous treatment) were compared between the cohorts with and without dose titration. Logistic regressions and Cox proportional hazards models were used to identify factors associated with adherence (MPR ≥ 0.8) and predictors of time to discontinuation. Results Among the 5,186 patients in the analysis, only 18% had dose titration. Patients who had dose titration were approximately 2.6 times as likely to be adherent (MPR ≥ 0.8) as those who did not (odds ratio = 2.59, P < 0.001). Kaplan-Meier analysis showed that the time to discontinuation or switch was significantly longer among patients who had dose titration (4.99 vs. 4.04 months, P = 0.009). Conclusions Dose titration was associated with improved treatment adherence and persistence among NeP patients receiving pregabalin. The findings provide evidence to increase physician awareness of the dose recommendations in the prescribing information and to educate patients on the importance of titration and adherence.
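The sketch below shows one way the two operational definitions used in this study could be computed from claims data: dose titration as any dose augmentation within 45 days of the first pregabalin claim, and adherence as a medication possession ratio (MPR) of at least 0.8 over 12 months of follow-up. The column names and the exact MPR formula are assumptions for illustration, not the study's code.

```python
# Hedged sketch of the titration flag and MPR-based adherence definition.
# Columns (patient_id, fill_date, days_supply, daily_dose) are hypothetical.
import pandas as pd

claims = pd.read_csv("pregabalin_claims.csv", parse_dates=["fill_date"])

def summarize_patient(g, window_days=45, followup_days=365):
    g = g.sort_values("fill_date")
    first = g.iloc[0]
    # Dose titration: any claim within 45 days of the index claim at a higher daily dose.
    early = g[g["fill_date"] <= first["fill_date"] + pd.Timedelta(days=window_days)]
    titrated = (early["daily_dose"] > first["daily_dose"]).any()
    # MPR: total days of drug supplied during follow-up divided by follow-up length, capped at 1.
    mpr = min(g["days_supply"].sum() / followup_days, 1.0)
    return pd.Series({"titrated": titrated, "mpr": mpr, "adherent": mpr >= 0.8})

summary = claims.groupby("patient_id").apply(summarize_patient)
# Share of adherent patients among those with and without dose titration.
print(summary["adherent"].groupby(summary["titrated"]).mean())
```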


2021 ◽  
Author(s):  
Elke Wynberg ◽  
Hugo van Willigen ◽  
Maartje Dijkstra ◽  
Anders Boyd ◽  
Neeltje A. Kootstra ◽  
...  

Background: Few longitudinal data on COVID-19 symptoms across the full spectrum of disease severity are available. We evaluated symptom onset, severity, and recovery up to nine months after illness onset. Methods: The RECoVERED Study is a prospective cohort study based in Amsterdam, the Netherlands. Participants aged >18 years were recruited following SARS-CoV-2 diagnosis via the local Public Health Service and from hospitals. Standardised symptom questionnaires were completed at recruitment, at one week and one month after recruitment, and monthly thereafter. Clinical severity was defined according to WHO criteria. Kaplan-Meier methods were used to compare time from illness onset to symptom recovery by clinical severity. We examined determinants of time to recovery using multivariable Cox proportional hazards models. Results: Between 11 May 2020 and 31 January 2021, 301 COVID-19 patients (167 [55%] male) were recruited, of whom 99/301 (32.9%) had mild, 140/301 (46.5%) moderate, 30/301 (10.0%) severe, and 32/301 (10.6%) critical disease. The proportion of participants reporting at least one persistent symptom at 12 weeks after illness onset was greater in those with severe/critical disease (81.7% [95% CI 68.7-89.7%]) than in those with mild or moderate disease (33.0% [95% CI 23.0-43.3%] and 63.8% [95% CI 54.8-71.5%], respectively). At nine months after illness onset, almost half of all participants (42.1% [95% CI 35.6-48.5%]) continued to report ≥1 symptom. Recovery was slower in participants with BMI ≥30 kg/m² (HR 0.51 [95% CI 0.30-0.87]) than in those with BMI <25 kg/m², after adjusting for age, sex, and number of comorbidities. Conclusions: COVID-19 symptoms persisted up to nine months after illness onset in a substantial proportion of participants, even in those with mild disease. Obesity was the most important determinant of time to recovery from symptoms.
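A minimal sketch of the time-to-recovery analysis outlined above: Kaplan-Meier curves of time from illness onset to symptom resolution by WHO severity group, a log-rank comparison, and a Cox model for determinants of recovery (a hazard ratio below 1 indicating slower recovery). File and column names, including the BMI indicator, are hypothetical.

```python
# Hedged sketch; columns (weeks_to_recovery, recovered, who_severity, bmi_ge_30,
# age, female, n_comorbid) and the file name are assumptions for illustration.
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter
from lifelines.statistics import multivariate_logrank_test

df = pd.read_csv("recovered_cohort.csv")   # one row per participant (hypothetical)

# Kaplan-Meier time to symptom recovery, stratified by WHO clinical severity.
kmf = KaplanMeierFitter()
for severity, grp in df.groupby("who_severity"):
    kmf.fit(grp["weeks_to_recovery"], event_observed=grp["recovered"], label=str(severity))
    print(severity, "median weeks to recovery:", kmf.median_survival_time_)

# Overall comparison of the recovery curves across severity groups.
print(multivariate_logrank_test(df["weeks_to_recovery"],
                                df["who_severity"],
                                df["recovered"]).p_value)

# Determinants of recovery (HR < 1 means slower recovery), adjusted as in the abstract.
cph = CoxPHFitter()
cph.fit(df[["weeks_to_recovery", "recovered", "bmi_ge_30", "age", "female", "n_comorbid"]],
        duration_col="weeks_to_recovery", event_col="recovered")
cph.print_summary()
```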

