Effects of contextual cues on speech recognition in simulated electric-acoustic stimulation

2015, Vol. 137(5), pp. 2846-2857. Author(s): Ying-Yee Kong, Gail Donaldson, Ala Somarowthu

2013, Vol. 17(1), pp. 3-26. Author(s): Paola V. Incerti, Teresa Y. C. Ching, Robert Cowan

2021, Vol. 32(08), pp. 521-527. Author(s): Yang-Soo Yoon, George Whitaker, Yune S. Lee

Abstract

Background: Cochlear implant technology allows acoustic and electric stimulation to be combined across ears (bimodal) or within the same ear (electric-acoustic stimulation [EAS]). The mechanisms used to integrate speech acoustics may differ between bimodal and EAS hearing, and the configuration of hearing loss may be an important factor in that integration. It is therefore important to differentiate the effects of different configurations of hearing loss on bimodal or EAS benefit in speech perception (the difference in performance between combined acoustic and electric stimulation and the better stimulation alone).

Purpose: Using acoustic simulation, we determined how consonant recognition was affected by different configurations of hearing loss in bimodal and EAS hearing.

Research Design: A mixed design was used with one between-subject variable (simulated bimodal group vs. simulated EAS group) and one within-subject variable (acoustic stimulation alone, electric stimulation alone, and combined acoustic and electric stimulation).

Study Sample: Twenty adult subjects with normal hearing (10 per group) were recruited.

Data Collection and Analysis: Consonant perception was measured unilaterally or bilaterally in quiet. For the acoustic stimulation, four configurations of hearing loss were simulated by band-pass filtering consonants with a fixed lower cutoff frequency of 100 Hz and upper cutoff frequencies of 250, 500, 750, and 1,000 Hz. For the electric stimulation, an eight-channel noise vocoder was used to generate a typical spectral mismatch, with fixed input (200-7,000 Hz) and output (1,000-7,000 Hz) frequency ranges. The effects of simulated hearing loss on consonant recognition were compared between the two groups.

Results: Significant bimodal and EAS benefits occurred regardless of the configuration of hearing loss and hearing technology (bimodal vs. EAS). Place information was transmitted better in EAS hearing than in bimodal hearing.

Conclusion: These results suggest that the configuration of hearing loss is not a significant factor in integrating consonant information between acoustic and electric stimulation. They also suggest that the mechanisms used to integrate consonant information may be similar in bimodal and EAS hearing.
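The processing chain described in this abstract (a low-frequency band-pass filter for the simulated acoustic signal and a spectrally shifted eight-channel noise vocoder for the simulated electric signal) can be illustrated with a short Python sketch. This is not the study's code: only the cutoff frequencies (100 Hz lower edge; 250, 500, 750, or 1,000 Hz upper edge) and the analysis/output ranges (200-7,000 Hz and 1,000-7,000 Hz) come from the abstract, while the filter orders, geometric channel spacing, 160 Hz envelope cutoff, and all function names are illustrative assumptions.

```python
# Minimal sketch (not the authors' implementation) of the two simulated listening
# modes: band-pass-filtered "residual acoustic" hearing and a spectrally shifted
# eight-channel noise-vocoded "electric" signal. Filter orders, channel spacing,
# and the 160 Hz envelope cutoff are assumptions, not taken from the study.
import numpy as np
from scipy.signal import butter, filtfilt


def simulate_residual_acoustic(x, fs, upper_cutoff_hz):
    """Band-pass 100 Hz .. upper_cutoff_hz (250, 500, 750, or 1,000 Hz in the abstract)."""
    b, a = butter(4, [100.0, upper_cutoff_hz], btype="bandpass", fs=fs)
    return filtfilt(b, a, x)


def noise_vocoder(x, fs, n_channels=8,
                  analysis_range=(200.0, 7000.0), output_range=(1000.0, 7000.0)):
    """Noise vocoder with a fixed analysis-to-output frequency mismatch."""
    ana_edges = np.geomspace(analysis_range[0], analysis_range[1], n_channels + 1)
    out_edges = np.geomspace(output_range[0], output_range[1], n_channels + 1)
    b_env, a_env = butter(2, 160.0, btype="low", fs=fs)  # envelope smoothing filter
    rng = np.random.default_rng(0)
    y = np.zeros(len(x))
    for ch in range(n_channels):
        # Temporal envelope of this analysis band (band-pass, rectify, low-pass).
        b, a = butter(4, ana_edges[ch:ch + 2], btype="bandpass", fs=fs)
        env = np.maximum(filtfilt(b_env, a_env, np.abs(filtfilt(b, a, x))), 0.0)
        # The envelope modulates noise confined to the shifted output band.
        b, a = butter(4, out_edges[ch:ch + 2], btype="bandpass", fs=fs)
        y += filtfilt(b, a, rng.standard_normal(len(x))) * env
    return y
```

With a sampling rate above 14 kHz (e.g., 16 kHz), pairing simulate_residual_acoustic(x, 16000, 500) in one ear with noise_vocoder(x, 16000) in the other would approximate one of the bimodal conditions, while mixing the two outputs in the same ear approximates the EAS condition.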


2017, Vol. 3(2), pp. 119-122. Author(s): Wouter J. van Drunen, Sarra Kacha Lachheb, Anatoly Glukhovskoy, Jens Twiefel, Marc C. Wurz, ...

Abstract

For patients with profound hearing loss or deafness who still have usable residual hearing in the low-frequency range, combining a hearing aid with a cochlear implant (electric-acoustic stimulation, EAS) yields the best quality of hearing perception. To optimize EAS, ongoing research focuses on integrating both stimuli in a single implant device. In this study, the performance of piezoelectric actuators, in particular dual-actuator stimulation, was investigated in a scaled, uncoiled test rig.


2020, Vol. 47(2), pp. 198-202. Author(s): Kazuya Saito, Takeshi Fujita, Yasuhiro Osaki, Hajime Koyama, Ko Shiraishi, ...

2018, Vol. 39(3), pp. 299-305. Author(s): Harold C. Pillsbury, Margaret T. Dillon, Craig A. Buchman, Hinrich Staecker, Sandra M. Prentiss, ...
