Benefits of Nonlinear Frequency Compression in Adult Hearing Aid Users

2015 · Vol 26 (10) · pp. 838–855
Author(s): Melissa Kokx-Ryan, Julie Cohen, Mary T. Cord, Therese C. Walden, Matthew J. Makashay, et al.

Background: Frequency-lowering (FL) algorithms are an alternative method of providing access to high-frequency speech cues. There is currently a lack of independent research addressing: (1) what functional, measurable benefits FL provides; (2) which, if any, FL algorithm provides the maximum benefit; (3) how to clinically program algorithms; and (4) how to verify algorithm settings. Purpose: Two experiments were included in this study. The purpose of Experiment 1 was to (1) determine if a commercially available nonlinear frequency compression (NLFC) algorithm provides benefit as measured by improved speech recognition in noise when fit and verified using standard clinical procedures; and (2) evaluate the impact of acclimatization. The purpose of Experiment 2 was to (1) evaluate the benefit of using enhanced verification procedures to systematically determine the optimal application of a prototype NLFC algorithm; and (2) determine if the optimized prototype NLFC settings provide benefit as measured by improved speech recognition in quiet and in noise. Research Design: A single-blind, within-participant, repeated-measures design in which participants served as their own controls. Study Sample: Experiment 1 included 26 participants with a mean age of 68.3 yr, and Experiment 2 included 37 participants with a mean age of 68.8 yr. Participants were recruited from the Audiology and Speech Pathology Center at Walter Reed National Military Medical Center in Bethesda, MD. Intervention: Participants in Experiment 1 wore bilateral commercially available hearing aids fit using standard clinical procedures and clinician expertise. Participants in Experiment 2 wore a single prototype hearing aid for which FL settings were systematically examined to determine the optimum application. In each experiment, FL-On versus FL-Off settings were examined in a variety of listening situations to determine benefit and possible implications. Data Collection and Analysis: In Experiment 1, speech recognition measures using the QuickSIN and Modified Rhyme Test stimuli were obtained at the initial bilateral fitting and 3–5 weeks later during a follow-up visit. In Experiment 2, the Modified Rhyme Test, an /s/–/ʃ/ consonant discrimination task, and a dual-task cognitive-load speech recognition measure were conducted. Participants in Experiment 2 received four different systematic hearing aid programs during an initial visit, and speech recognition data were collected over 2–3 follow-up sessions. Results: Some adults with hearing loss obtained small-to-moderate benefits from implementation of FL, while others maintained performance without detriment in both experiments. There was no significant difference among the FL-On settings systematically obtained in Experiment 2. There was a modest but significant age effect in both experiments indicating that older listeners (>65 yr) might benefit more on average from FL than younger listeners. In addition, FL produced reliable improvements in the intelligibility of the phonemes /ŋ/ and /b/ for both age groups, and of /ð/ for older listeners, in both experiments. Conclusions: Although the optimum settings, application, and benefits of FL remain unclear at this time, there does not seem to be degradation in listener performance when FL is activated. The benefits of FL should be explored in older adult (>65 yr) listeners, as they tended to benefit more from FL applications.
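
For orientation, the sketch below illustrates how an NLFC algorithm of the kind evaluated here relocates high-frequency cues: frequencies above a cutoff are compressed toward it, commonly on a logarithmic axis. The cutoff and compression ratio shown are illustrative assumptions, not the settings used in either experiment.

```python
# Minimal sketch of a nonlinear frequency compression (NLFC) input-output
# map, assuming log-axis compression above a cutoff; the cutoff (fc) and
# compression ratio (cr) are illustrative, not the study's settings.

def nlfc_map(f_in: float, fc: float = 2000.0, cr: float = 2.0) -> float:
    """Return the output frequency (Hz) for an input frequency (Hz)."""
    if f_in <= fc:
        return f_in                        # below cutoff: unchanged
    return fc * (f_in / fc) ** (1.0 / cr)  # above cutoff: compressed

# A 6 kHz fricative cue lands near 3.5 kHz with these example settings.
print(round(nlfc_map(6000.0)))  # -> 3464
```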

2010 · Vol 21 (10) · pp. 618–628
Author(s): Jace Wolfe, Andrew John, Erin Schafer, Myriel Nyffeler, Michael Boretzki, et al.

Background: Previous research has indicated that children with moderate hearing loss experience difficulty with recognition of high-frequency speech sounds, such as fricatives and affricates. Conventional behind-the-ear (BTE) amplification typically does not provide ample output in the high frequencies (4000 Hz and beyond) to ensure optimal audibility for these sounds. Purpose: To evaluate nonlinear frequency compression (NLFC) as a means to improve speech recognition for children with moderate to moderately severe hearing loss. Research Design: Within-subject crossover design with repeated measures across test conditions. Study Sample: Fifteen children, aged 5–13 yr, with moderate to moderately severe high-frequency sensorineural hearing loss were fitted with Phonak Nios micro-sized BTE hearing aids. These children were previous users of digital hearing aids and communicated via spoken language. Their speech and language abilities were age-appropriate. Data Collection and Analysis: Aided thresholds and speech recognition in quiet and in noise were assessed after 6 wk of use with NLFC and 6 wk of use without NLFC. Participants were randomly assigned to counterbalanced groups so that eight participants began the first 6-wk trial with NLFC enabled and the other seven started with NLFC disabled; the provision of NLFC was then switched for the second 6-wk trial. Speech recognition in quiet was assessed via word recognition with the University of Western Ontario (UWO) Plural Test and recognition of vowel-consonant-vowel nonsense syllables with the Phonak Logatome test. Speech recognition in noise was assessed by evaluating the signal-to-noise ratio in dB required for 50% correct performance on the Bamford-Kowal-Bench Speech-in-Noise (BKB-SIN) test, an adaptive test of speech perception in a multitalker babble background. Results: Aided thresholds for high-frequency stimuli were significantly better when NLFC was enabled, and use of NLFC resulted in significantly better speech recognition in quiet on the UWO Plural Test and for the phonemes /d/ and /s/ on the Phonak Logatome test. There was no statistically significant difference in performance on the BKB-SIN test between the NLFC-enabled and NLFC-disabled conditions. Conclusions: These results indicate that NLFC improves audibility for, and recognition of, high-frequency speech sounds for children with moderate to moderately severe hearing loss in quiet listening situations.
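
As an aside on the noise measure: the BKB-SIN estimates the SNR at which a listener scores 50% correct. The sketch below shows a generic way to estimate that point from a measured psychometric function by linear interpolation; it is not the BKB-SIN's published scoring rule, and the data are invented.

```python
import numpy as np

# Estimate SNR-50 from invented percent-correct scores at fixed SNRs
# via linear interpolation (a generic method, not BKB-SIN scoring).
snr = np.array([-6, -3, 0, 3, 6, 9])      # presentation SNRs, dB
pct = np.array([10, 22, 41, 63, 82, 94])  # % key words correct (increasing)

snr50 = np.interp(50.0, pct, snr)         # x-coordinates must be increasing
print(f"SNR-50 ≈ {snr50:.1f} dB")         # lower values = better in noise
```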


2019 · Vol 30 (02) · pp. 103–114
Author(s): Patrick N. Plyler, Brittney Tardy, Mark Hedrick

Background: Nonlinear frequency compression (NLFC) and digital noise reduction (DNR) are hearing aid features often used simultaneously in the adult population with hearing loss. Although each feature has been studied extensively in isolation, the effects of using them in combination are unclear. Purpose: The effects of NLFC and DNR on word recognition and satisfaction ratings in noise in adult hearing aid users were evaluated. Research Design: A repeated measures design was used. Study Sample: Two females and 13 males between the ages of 55 and 83 yr who were experienced hearing aid users participated. Thirteen were experienced with NLFC, and all were experienced with DNR. Each participant was fit with Phonak Bolero Q90-P hearing instruments using their specific audiometric data and the Desired Sensation Level v5.0 (adult) fitting strategy. Fittings were verified with probe microphone measurements using speech at 65 dB sound pressure level (SPL). NLFC verification was performed using the Protocol for the Provision of Amplification, Version 2014.01. Data Collection and Analysis: All testing was conducted in a double-walled sound booth. Four hearing aid conditions were used for all testing: Baseline (NLFC off, DNR off), NLFC only, DNR only, and Combination (NLFC on, DNR on). A modified version of Pascoe's High-Frequency Word List was presented at 65 dB SPL with speech-spectrum noise at 6-dB signal-to-noise ratio (SNR) and 1-dB SNR for each hearing aid condition. Listener satisfaction ratings were obtained after each listening condition in terms of word comfort, word clarity, and average satisfaction. Two-way repeated measures analyses of variance were conducted to assess listener performance, with pairwise comparisons completed for significant main effects. Results: Word recognition results indicated a significant SNR effect only (6 dB SNR > 1 dB SNR). Satisfaction ratings indicated significant SNR and hearing aid condition effects for clarity, comfort, and average satisfaction. Clarity ratings were significantly higher for DNR and Combination than for NLFC. Comfort ratings were significantly higher for DNR than for NLFC. Average satisfaction was significantly higher for DNR and Combination than for NLFC, and average ratings were significantly higher for Combination than for Baseline. Conclusions: Activating NLFC or DNR in isolation or in combination did not significantly impact word recognition in noise. Activating NLFC in isolation reduced satisfaction ratings relative to the DNR or Combination conditions, and the isolated use of DNR significantly improved all satisfaction ratings compared with the isolated use of NLFC. These findings suggest NLFC should not be used in isolation and should be coupled with DNR for best results. Future research should include a field trial, as the lack of one was a limitation of this study.
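
A minimal sketch of the two-way repeated-measures ANOVA described above (within-subject factors: hearing aid condition and SNR), using the statsmodels package; the subjects, scores, and balanced layout are invented for illustration only.

```python
import itertools
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(0)
conditions = ["Baseline", "NLFC", "DNR", "Combination"]
snrs = [6, 1]  # dB SNR; each listener hears every condition at every SNR

rows = [
    {"subject": s, "condition": c, "snr": snr,
     # invented scores: the easier SNR scores higher on average
     "pct_correct": 60 + 10 * (snr == 6) + rng.normal(0, 5)}
    for s, c, snr in itertools.product(range(1, 16), conditions, snrs)
]
df = pd.DataFrame(rows)

res = AnovaRM(df, depvar="pct_correct", subject="subject",
              within=["condition", "snr"]).fit()
print(res)  # F tests for condition, snr, and their interaction
```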


2013 · Vol 24 (02) · pp. 105–120
Author(s): Ann E. Perreau, Ruth A. Bentler, Richard S. Tyler

Background: Frequency-lowering signal processing in hearing aids has re-emerged as an option to improve audibility of the high frequencies by expanding the input bandwidth. Few studies have investigated the usefulness of the scheme as an option for bimodal users (i.e., combined use of a cochlear implant and a contralateral hearing aid); this study posed that question. Purpose: The purposes of this study were (1) to determine if frequency compression was a better bimodal option than conventional amplification and (2) to determine the impact of a frequency-compression hearing aid on speech recognition abilities. Research Design: There were two separate experiments in this study. The first experiment investigated the contribution of a frequency-compression hearing aid to contralateral cochlear implant (CI) performance for localization and speech perception in noise. The second experiment assessed monaural consonant and vowel perception in quiet using the frequency-compression and conventional hearing aids without the use of a contralateral CI or hearing aid. Study Sample: Ten subjects fitted with a cochlear implant and hearing aid participated in the first experiment. Seventeen adult subjects with a cochlear implant and hearing aid or two hearing aids participated in the second experiment. To be included, subjects had to have a history of postlingual deafness and a moderate or moderate-to-severe hearing loss, and had to have no previous experience with this type of frequency-lowering hearing aid. Data Collection and Analysis: In the first experiment, performance using the frequency-compression and conventional hearing aids was assessed on tests of sound localization, speech perception in a background of noise, and two self-report questionnaires. In the second experiment, consonant and vowel perception in quiet was assessed monaurally for the two conditions. In both experiments, subjects alternated daily between a frequency-compression and a conventional hearing aid for 2 mo. The parameters of frequency compression were set individually for each subject, and audibility was measured for the frequency-compression and conventional hearing aid programs by comparing estimations of the Speech Intelligibility Index (SII) using a modified algorithm (Bentler et al., 2011). In both experiments, the outcome measures were administered following the hearing aid fitting to assess performance at baseline and after 2 mo of use. Results: For this group of subjects, the results revealed no significant difference between the frequency-compression and conventional hearing aids on tests of localization and consonant recognition. Spondee-in-noise and vowel perception scores were significantly higher with the conventional hearing aid than with the frequency-compression hearing aid after 2 mo of use. Conclusions: These results suggest that, for the subjects in this study, frequency compression is not a better bimodal option than conventional amplification. In addition, speech perception may be negatively influenced by frequency compression because formant frequencies are too severely compressed and can no longer be distinguished.
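
For readers unfamiliar with the audibility metric mentioned above: an SII-style index is, at heart, a band-importance-weighted sum of per-band audibility. The weights and audibility values below are illustrative only and do not reproduce the modified algorithm of Bentler et al. (2011).

```python
import numpy as np

# Illustrative SII-style calculation: importance-weighted audibility.
importance = np.array([0.10, 0.15, 0.20, 0.25, 0.20, 0.10])  # sums to 1.0
aud_conventional = np.array([1.0, 1.0, 0.9, 0.6, 0.2, 0.0])  # 0..1 per band
aud_compressed   = np.array([1.0, 1.0, 0.9, 0.7, 0.5, 0.3])  # lowered cues

sii_conv = float(importance @ aud_conventional)
sii_comp = float(importance @ aud_compressed)
print(f"SII, conventional: {sii_conv:.2f}; frequency compression: {sii_comp:.2f}")
```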


2020 · Vol 7 (Supplement_1) · p. S144
Author(s): Azza Elamin, Faisal Khan, Ali Abunayla, Rajasekhar Jagarlamudi, Aditee Dash

Abstract Background: As opposed to Staphylococcus aureus bacteremia, there are no guidelines recommending repeat blood cultures in Gram-negative bacilli bacteremia (GNB). Several studies have questioned the utility of follow-up blood cultures (FUBCs) in GNB, but the impact of this practice on clinical outcomes is not fully understood. Our aim was to study the practice of obtaining FUBCs in GNB at our institution and to assess its impact on clinical outcomes. Methods: We conducted a retrospective, single-center study of adult patients (≥18 years of age) admitted with GNB between January 2017 and December 2018, aiming to compare clinical outcomes in those with and without FUBCs. Data collected included demographics, comorbidities, presumed source of bacteremia, and need for intensive care unit (ICU) admission. The presence of fever, hypotension/shock, and the white blood cell (WBC) count on the day of FUBC were recorded. The primary objective was to compare 30-day mortality between the two groups; secondary objectives were to compare 30-day readmission rate, hospital length of stay (LOS), and duration of antibiotic treatment. Mean and standard deviation were used for continuous variables; frequency and proportion were used for categorical variables. A P value < 0.05 was defined as statistically significant. Results: 482 patients were included, of whom 321 (67%) had FUBCs; 96% of FUBCs were negative and 2.8% showed persistent bacteremia. There was no significant difference in 30-day mortality between those with and without FUBCs (2.9% and 2.7%, respectively) or in 30-day readmission rate (21.4% and 23.4%, respectively). In patients with FUBCs compared with those without, hospital LOS was longer (7 vs 5 days, P < 0.001) and mean duration of antibiotic treatment was longer (14 vs 11 days, P < 0.001). A higher proportion of patients with FUBCs needed ICU care than those without (41.4% vs 25.5%, P < 0.001). [Tables: microbiology of the index blood culture, outcomes, and FUBC characteristics in those with and without FUBCs.] Conclusion: Obtaining FUBCs in GNB had no impact on 30-day mortality or 30-day readmission rate and was associated with longer LOS and antibiotic duration. Our findings suggest that FUBCs in GNB are low yield and may not be recommended in all patients. Prospective studies are needed to further examine the utility of this practice in GNB. Disclosures: All authors: no reported disclosures.
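
The binary outcome comparisons above (e.g., 30-day mortality by FUBC status) are the kind of 2×2 analysis sketched below with SciPy; the counts are illustrative placeholders, not the study's data.

```python
from scipy import stats

# Fisher's exact test on an illustrative 2x2 table of
# died / survived by FUBC status (placeholder counts).
table = [[9, 312],   # FUBC group:    died, survived
         [4, 157]]   # no-FUBC group: died, survived

odds_ratio, p = stats.fisher_exact(table)
print(f"OR = {odds_ratio:.2f}, p = {p:.3f}")  # large p -> no mortality difference
```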


Author(s):  
D. Kiessling ◽  
C. Rennings ◽  
M. Hild ◽  
A. Lappas ◽  
T. S. Dietlein ◽  
...  

Abstract Purpose: To determine the impact of failed ab-interno trabeculectomy on the postoperative outcome of subsequent XEN45 gel stent (Allergan, CA, USA) implantation in pseudophakic eyes. Methods: In this retrospective single-center study, we included 60 pseudophakic eyes from 60 participants who underwent XEN45 gel stent implantation. Thirty eyes each underwent primary stent implantation (control group) or had previously undergone a failed ab-interno trabeculectomy (trabectome group). The groups were matched at a 1:1 ratio based on the following criteria: preoperative and maximum intraocular pressure (IOP), preoperative medication score, cup/disk ratio, follow-up time, best-corrected visual acuity at baseline, age, and the proportion of patients classified as primary open-angle glaucoma or exfoliation glaucoma. We defined a successful surgery by the following three scores: an IOP reduction > 20% with IOP at the longest follow-up < 21 mmHg (Score A) or < 18 mmHg (Score B), or an IOP reduction ≥ 40% with IOP ≤ 15 mmHg (Score C). One open conjunctival revision was allowed under all scores, and a repeat surgery was considered a failure. Results: Over an average follow-up period of 22 ± 12 months, we observed a mean IOP reduction of 38%, from 23.5 ± 5.2 mmHg to 14.5 ± 5.0 mmHg. Comparative analyses between the groups did not reveal a significant difference in postoperative IOP, postoperative medication score, side effects, revision rate, repeat surgery rate, or success rate. Conclusions: Trabectome is a viable first-line procedure for medically uncontrolled glaucoma before filtering ab-interno microstent surgery is considered.
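
The three success criteria translate directly into code; the sketch below classifies an eye under Scores A, B, and C from its preoperative and final IOP (the allowed revision and the repeat-surgery-equals-failure rule are not modeled).

```python
def surgery_success(iop_pre: float, iop_post: float) -> dict:
    """Classify an outcome under the abstract's Scores A, B, and C.

    Does not model the allowed open conjunctival revision or the
    repeat-surgery-equals-failure rule.
    """
    reduction = (iop_pre - iop_post) / iop_pre  # fractional IOP reduction
    return {
        "A": reduction > 0.20 and iop_post < 21.0,
        "B": reduction > 0.20 and iop_post < 18.0,
        "C": reduction >= 0.40 and iop_post <= 15.0,
    }

# The cohort's mean course (23.5 -> 14.5 mmHg, a 38% reduction)
# satisfies Scores A and B but narrowly misses Score C.
print(surgery_success(23.5, 14.5))  # {'A': True, 'B': True, 'C': False}
```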


2021 · pp. 1–9
Author(s): Leonard Naymagon, Douglas Tremblay, John Mascarenhas

Data supporting the use of etoposide-based therapy in hemophagocytic lymphohistiocytosis (HLH) arise largely from pediatric studies. There is a lack of comparable data among adult patients with secondary HLH. We conducted a retrospective study to assess the impact of etoposide-based therapy on outcomes in adult secondary HLH. The primary outcome was overall survival. The log-rank test was used to compare Kaplan-Meier distributions of time-to-event outcomes. Multivariable Cox proportional hazards modeling was used to estimate adjusted hazard ratios (HRs) with 95% confidence intervals (CIs). Ninety adults with secondary HLH seen between January 1, 2009, and January 6, 2020, were included. Forty-two patients (47%) received etoposide-based therapy, while 48 (53%) received treatment only for their inciting proinflammatory condition. Thirty-three patients in the etoposide group (72%) and 32 in the no-etoposide group (67%) died during follow-up. Median survival in the etoposide and no-etoposide groups was 1.04 and 1.39 months, respectively. There was no significant difference in survival between the etoposide and no-etoposide groups (log-rank p = 0.4146). On multivariable analysis, there was no association between treatment with etoposide and survival (HR for death with etoposide = 1.067, 95% CI: 0.633–1.799, p = 0.8084). Use of etoposide-based therapy was not associated with improvement in outcomes in this large cohort of adult secondary HLH patients.
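
A minimal sketch of this survival workflow (Kaplan-Meier estimate, log-rank comparison, Cox model) using the lifelines package; the toy records are invented, and the study's actual Cox model adjusted for additional covariates.

```python
import pandas as pd
from lifelines import CoxPHFitter, KaplanMeierFitter
from lifelines.statistics import logrank_test

# Invented toy data: survival time (months), death indicator, treatment.
df = pd.DataFrame({
    "months":    [0.5, 1.0, 1.2, 2.0, 3.5, 6.0, 9.0, 12.0],
    "died":      [1,   1,   1,   0,   1,   0,   1,   0],
    "etoposide": [1,   1,   0,   0,   1,   0,   1,   0],
})

grp = df["etoposide"] == 1
lr = logrank_test(df.loc[grp, "months"], df.loc[~grp, "months"],
                  event_observed_A=df.loc[grp, "died"],
                  event_observed_B=df.loc[~grp, "died"])
print(f"log-rank p = {lr.p_value:.4f}")

km = KaplanMeierFitter().fit(df.loc[grp, "months"], df.loc[grp, "died"])
print(f"median survival, etoposide group: {km.median_survival_time_} months")

# Univariable Cox model here; the study's model was multivariable.
cph = CoxPHFitter().fit(df, duration_col="months", event_col="died")
print(cph.summary[["exp(coef)", "p"]])  # exp(coef) = hazard ratio
```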


2019 · Vol 2019 · pp. 1–6
Author(s): Joanna Wojtasik-Bakalarz, Zoltan Ruzsa, Tomasz Rakowski, Andreas Nyerges, Krzysztof Bartuś, et al.

The most relevant comorbidities in patients with peripheral artery disease (PAD) are coronary artery disease (CAD) and diabetes mellitus (DM); however, long-term follow-up data on patients with chronic total occlusion (CTO) are scarce. The aim of the study was to assess the impact of CAD and DM on long-term outcomes after retrograde recanalization of superficial femoral artery (SFA) CTO. Eighty-six patients with PAD, a diagnosed CTO in the femoropopliteal region, and at least one unsuccessful attempt at antegrade recanalization were enrolled at 2 clinical centers. Mean follow-up across all patients was 47.5 months (±40 months). Patients were divided into two groups depending on the presence of CAD (CAD group: n=45 vs. non-CAD group: n=41) and of DM (DM group: n=50 vs. non-DM group: n=36). In long-term follow-up, major adverse peripheral events (MAPE) occurred in 66.6% of patients with CAD vs. 36.5% of patients without CAD, and in 50% of patients with DM vs. 55% of non-DM subjects. There were no statistically significant differences in peripheral endpoints between the groups. However, there was a statistically significant difference in all-cause mortality: the DM group had 6 deaths (12%) (P = 0.038). To conclude, patients with coexisting CTO and DM are at higher risk of death in long-term follow-up after retrograde recanalization.
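
The between-group event comparisons above are 2×2 chi-square tests of the sort sketched below; the counts are invented placeholders chosen only to show the mechanics, not reconstructions of the study's data.

```python
from scipy.stats import chi2_contingency

# Chi-square test on an invented 2x2 table: MAPE yes/no by group.
table = [[22, 23],   # group 1: events, no events
         [20, 21]]   # group 2: events, no events

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, dof = {dof}, p = {p:.3f}")
```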


2021 · Vol 80 (Suppl 1) · pp. 876–877
Author(s): W. Zhu, T. De Silva, L. Eades, S. Morton, S. Ayoub, et al.

Background: Telemedicine was widely utilised to complement face-to-face (F2F) care in 2020 during the COVID-19 pandemic, but the impact of this on patient care is poorly understood. Objectives: To investigate the impact of telemedicine during COVID-19 on outpatient rheumatology services. Methods: We retrospectively audited patient electronic medical records from rheumatology outpatient clinics in an urban tertiary rheumatology centre between April-May 2020 (telemedicine cohort) and April-May 2019 (comparator cohort). Differences in age, sex, primary diagnosis, medications, and proportion of new/review appointments were assessed using Mann-Whitney U and chi-square tests. Univariate analysis was used to estimate associations between telemedicine usage and the ability to assign a diagnosis in patients without a prior rheumatological diagnosis, the frequency of changes to immunosuppression, subsequent F2F review, planned admissions or procedures, follow-up phone calls, and time to next appointment. Results: 3,040 outpatient appointments were audited: 1,443 from 2019 and 1,597 from 2020. There was no statistically significant difference in age, sex, proportion of new/review appointments, or frequency of immunosuppression use between the cohorts. Inflammatory arthritis (IA) was a more common diagnosis in the 2020 cohort (35.1% vs 31%, p=0.024). 96.7% (n=1,444) of patients seen in the 2020 cohort were reviewed via telemedicine. In patients without an existing rheumatological diagnosis, the odds of making a diagnosis at the appointment were significantly lower in 2020 (28.6% vs 57.4%; OR 0.30 [95% CI 0.16-0.53]; p<0.001). Clinicians were also less likely to change immunosuppressive therapy in 2020 (22.6% vs 27.4%; OR 0.78 [95% CI 0.65-0.92]; p=0.004). This was mostly driven by less de-escalation of therapy (10% vs 12.6%; OR 0.75 [95% CI 0.59-0.95]; p=0.019), as there was no statistically significant difference in the escalation or switching of immunosuppressive therapies. There was no significant difference in the frequency of follow-up phone calls; however, patients seen in 2020 required earlier follow-up appointments (p<0.001). There was also no difference in unplanned rheumatological presentations, but there were significantly fewer planned admissions and procedures in 2020 (1% vs 2.6%, p=0.002). Appointment non-attendance fell from 10.9% in 2019 to 6.5% in 2020 (OR 0.57 [95% CI 0.44-0.74]; p<0.001); however, the odds of discharging a patient from care were significantly lower in 2020 (3.9% vs 6%; OR 0.64 [95% CI 0.46-0.89]; p=0.008), although this difference was no longer significant when patients who failed to attend were excluded. Amongst patients seen via telemedicine in 2020, a subsequent F2F appointment was required in 9.4%. The predictors of needing a F2F review were being a new patient (OR 6.28 [95% CI 4.10-9.64]; p<0.001), not having a prior rheumatological diagnosis (OR 18.43 [95% CI 2.35-144.63]; p=0.006), or having a diagnosis of IA (OR 2.85 [95% CI 1.40-5.80]; p=0.004) or connective tissue disease (OR 3.22 [95% CI 1.11-9.32]; p=0.031). Conclusion: Most patients in the 2020 cohort were seen via telemedicine. Telemedicine use during the COVID-19 pandemic was associated with reduced clinic non-attendance, but also with diagnostic delay, reduced likelihood of changing existing immunosuppressive therapy, earlier requirement for review, and lower likelihood of discharge. While the effects of telemedicine cannot be differentiated from changes in practice related to other aspects of the pandemic, these findings suggest that telemedicine may have a negative impact on the timeliness of management of rheumatology patients. Disclosure of Interests: None declared.
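
The odds ratios reported throughout this audit can be re-derived from 2×2 counts with Woolf's method, as sketched below. The non-attendance counts used here are approximate reconstructions from the reported rates (10.9% of 1,443 in 2019; 6.5% of 1,597 in 2020), and they reproduce the published OR of 0.57 [0.44-0.74].

```python
import math

def odds_ratio_ci(a: int, b: int, c: int, d: int, z: float = 1.96):
    """OR and 95% CI for the 2x2 table [[a, b], [c, d]] (Woolf method)."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    return or_, math.exp(math.log(or_) - z * se), math.exp(math.log(or_) + z * se)

# Non-attendance in 2020 vs 2019; counts reconstructed from reported rates.
or_, lo, hi = odds_ratio_ci(104, 1493, 157, 1286)
print(f"OR = {or_:.2f} [95% CI {lo:.2f}-{hi:.2f}]")  # ~0.57 [0.44-0.74]
```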

