Early Auditory Experience Induces Frequency-Specific, Adaptive Plasticity in the Forebrain Gaze Fields of the Barn Owl

2001 · Vol 85 (5) · pp. 2184-2194
Author(s): Greg L. Miller, Eric I. Knudsen

Binaural acoustic cues such as interaural time and level differences (ITDs and ILDs) are used by many species to determine the locations of sound sources. The relationship between cue values and locations in space is frequency dependent and varies from individual to individual. In the current study, we tested the capacity of neurons in the forebrain localization pathway of the barn owl to adjust their tuning for binaural cues in a frequency-dependent manner in response to auditory experience. Auditory experience was altered by raising young owls with a passive acoustic filtering device that caused frequency-dependent changes in ITD and ILD. Extracellular recordings were made in normal and device-reared owls to characterize frequency-specific ITD and ILD tuning in the auditory archistriatum (AAr), an output structure of the forebrain localization pathway. In device-reared owls, individual sites in the AAr exhibited highly abnormal, frequency-dependent variations in ITD tuning, and across the population of sampled sites, there were frequency-dependent shifts in the representation of ITD. These changes were in a direction that compensated for the acoustic effects of the device on ITD and therefore tended to restore a normal representation of auditory space. Although ILD tuning was degraded relative to normal at many sites in the AAr of device-reared owls, the representation of frequency-specific ILDs across the population of sampled sites was shifted in the adaptive direction. These results demonstrate that early auditory experience shapes the representation of binaural cues in the forebrain localization pathway in an adaptive, frequency-dependent manner.
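
As background for the cue geometry discussed above, the sketch below shows a standard spherical-head (Woodworth) approximation of how ITD grows with source azimuth. It is a generic textbook model with an assumed head radius, not the owl-specific, frequency-dependent mapping measured in the paper (the barn owl's facial ruff and ear asymmetry make its actual cue-to-location relationship more complex).

```python
import numpy as np

def woodworth_itd(azimuth_deg, head_radius_m=0.025, speed_of_sound=343.0):
    """Spherical-head (Woodworth) ITD approximation for a far-field source.

    Generic illustration only: head_radius_m is an assumed placeholder, and
    real ITD-to-location mappings (as in the owl) are frequency dependent
    and vary from individual to individual.
    """
    theta = np.radians(azimuth_deg)
    return (head_radius_m / speed_of_sound) * (theta + np.sin(theta))  # seconds

# ITD increases monotonically with azimuth in this simple model.
for az in (0, 20, 40, 60, 80):
    print(f"azimuth {az:2d} deg -> ITD {woodworth_itd(az) * 1e6:5.1f} microseconds")
```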

1999 · Vol 82 (5) · pp. 2197-2209
Author(s): Joshua I. Gold, Eric I. Knudsen

Bimodal, auditory-visual neurons in the optic tectum of the barn owl are sharply tuned for sound source location. The auditory receptive fields (RFs) of these neurons are restricted in space primarily as a consequence of their tuning for interaural time differences and interaural level differences across broad ranges of frequencies. In this study, we examined the extent to which frequency-specific features of early auditory experience shape the auditory spatial tuning of these neurons. We manipulated auditory experience by implanting in one ear canal an acoustic filtering device that altered the timing and level of sound reaching the eardrum in a frequency-dependent fashion. We assessed the auditory spatial tuning at individual tectal sites in normal owls and in owls raised with the filtering device. At each site, we measured a family of auditory RFs using broadband sound and narrowband sounds with different center frequencies both with and without the device in place. In normal owls, the narrowband RFs for a given site all included a common region of space that corresponded with the broadband RF and aligned with the site's visual RF. Acute insertion of the filtering device in normal owls shifted the locations of the narrowband RFs away from the visual RF, the magnitude and direction of the shifts depending on the frequency of the stimulus. In contrast, in owls that were raised wearing the device, narrowband and broadband RFs were aligned with visual RFs so long as the device was in the ear but not after it was removed, indicating that auditory spatial tuning had been adaptively altered by experience with the device. The frequency tuning of tectal neurons in device-reared owls was also altered from normal. The results demonstrate that experience during development adaptively modifies the representation of auditory space in the barn owl's optic tectum in a frequency-dependent manner.
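
The frequency-dependent shifts described above can be pictured as the acoustic ITD in each band being offset by the device's band-specific delay. The sketch below uses made-up delay values purely for illustration; the real device also altered sound levels, and the measured shifts depended on the owl and the frequency band.

```python
# A minimal sketch, assuming a hypothetical filtering device that delays sound
# in one ear by a different amount in each frequency band (values invented).
device_delay_us = {2000: +30.0, 4000: -20.0, 6000: +50.0, 8000: -40.0}

def effective_itd_us(source_itd_us, center_frequency_hz):
    """ITD reaching the eardrums with the device in one ear: the acoustic ITD
    plus the device's frequency-dependent delay for that band."""
    return source_itd_us + device_delay_us[center_frequency_hz]

# The same source location yields different effective ITDs in each band, so
# narrowband receptive fields shift by different amounts and directions.
for f_hz in sorted(device_delay_us):
    print(f"{f_hz} Hz: source ITD 40 us -> effective ITD {effective_itd_us(40.0, f_hz):.0f} us")
```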


1996 · Vol 85 (2) · pp. 393-402
Author(s): Charles A. Napolitano, Pekka M. J. Raatikainen, Jeffrey R. Martens, Donn M. Dennis

Background: Supraventricular tachydysrhythmias such as atrial fibrillation frequently complicate the perioperative period. Two electrophysiologic factors critical to the pathogenesis of supraventricular tachydysrhythmias are (1) atrial wavelength, the product of atrial conduction velocity (CV) and effective refractory period (ERP), and (2) atrioventricular nodal conduction. Modulation of these factors by drugs has important clinical ramifications. The authors studied the effects of propofol, thiopental, and ketamine on atrial wavelength and atrioventricular nodal function in guinea pig isolated atrial trabeculae and hearts, respectively. Methods: Electrocardiogram recordings in superfused atrial tissue were obtained using hanging microelectrodes. A stimulating and two recording electrodes were placed on a single atrial trabecula, and the interelectrode distance was measured. Atrial ERP determinations were made using a premature stimulus protocol. The time (t) required for a propagated impulse to traverse the interelectrode distance (d) was measured, and conduction velocity was calculated as d/t. Langendorff-perfused guinea pig hearts were instrumented for low atrial pacing (cycle length = 300 ms) and for measurements of the stimulus-to-His bundle interval, an index of atrioventricular nodal conduction. To investigate the frequency-dependent behavior of the atrioventricular node, computer-based measurements were made of Wenckebach cycle length (WCL) and atrioventricular nodal ERP. Results: Thiopental significantly prolonged atrial ERP in a concentration-dependent manner, whereas propofol and ketamine had no significant effect on atrial refractoriness. In contrast, ketamine caused a dose-dependent decrease in atrial CV, but propofol and thiopental had no significant effect on CV. Therefore, thiopental, ketamine, and propofol caused an increase, a decrease, and no change, respectively, in atrial wavelength. All anesthetics caused a concentration-dependent prolongation of the stimulus-to-His bundle interval, atrioventricular nodal ERP, and WCL. However, on an equimolar basis, significant differences in potency were found. The drug concentrations that caused a 20% increase in ERP (ERP20) were 14 ± 2 µM for propofol, 26 ± 3 µM for thiopental, and 62 ± 11 µM for ketamine; the concentrations that caused a 20% increase in WCL (WCL20) were 17 ± 2 µM, 50 ± 1 µM, and 123 ± 19 µM, respectively (mean ± SEM). Therefore, the rank order of potency for frequency-dependent atrioventricular nodal effects is propofol > thiopental > ketamine. Conclusion: The authors' results indicate that propofol would be most effective at filtering atrial impulses during supraventricular tachydysrhythmias, whereas thiopental would be most effective at preventing atrial reentrant dysrhythmias. In contrast, ketamine may be most likely to promote atrial reentry while having minimal effect on atrioventricular nodal conduction.
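
Since the abstract defines atrial wavelength as the product of conduction velocity and effective refractory period, with CV computed as d/t, a minimal numerical sketch of those two relations follows. The parameter values are hypothetical and are not taken from the study.

```python
def conduction_velocity(distance_mm, time_ms):
    """CV = d / t, as described for the trabecula measurements above (mm/ms)."""
    return distance_mm / time_ms

def atrial_wavelength(cv_mm_per_ms, erp_ms):
    """Atrial wavelength = conduction velocity x effective refractory period (mm)."""
    return cv_mm_per_ms * erp_ms

# Hypothetical values for illustration: slowing conduction (as reported for
# ketamine) shortens the wavelength, which favors atrial reentry.
baseline = atrial_wavelength(conduction_velocity(5.0, 10.0), 150.0)  # 0.5 mm/ms * 150 ms = 75 mm
slowed   = atrial_wavelength(conduction_velocity(5.0, 12.5), 150.0)  # 0.4 mm/ms * 150 ms = 60 mm
print(baseline, slowed)
```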


2018 · Vol 369 · pp. 79-89
Author(s): Ann Clock Eddins, Erol J. Ozmeral, David A. Eddins

2014 · Vol 25 (09) · pp. 791-803
Author(s): Evelyne Carette, Tim Van den Bogaert, Mark Laureyns, Jan Wouters

Background: Several studies have demonstrated negative effects of directional microphone configurations on left-right and front-back (FB) sound localization. New processing schemes, such as frequency-dependent directionality and front focus with wireless ear-to-ear communication in recent, commercial hearing aids may preserve the binaural cues necessary for left-right localization and may introduce useful spectral cues necessary for FB disambiguation. Purpose: In this study, two hearing aids with different processing schemes, which were both designed to preserve the ability to localize sounds in the horizontal plane (left-right and FB), were compared. Research Design: We compared horizontal (left-right and FB) sound localization performance of hearing aid users fitted with two types of behind-the-ear (BTE) devices. The first type of BTE device had four different programs that provided (1) no directionality, (2–3) symmetric frequency-dependent directionality, and (4) an asymmetric configuration. The second pair of BTE devices was evaluated in its omnidirectional setting. This setting automatically activates a soft forward-oriented directional scheme that mimics the pinna effect. Also, wireless communication between the hearing aids was present in this configuration (5). A broadband stimulus was used as a target signal. The directional hearing abilities of the listeners were also evaluated without hearing aids as a reference. Study Sample: A total of 12 listeners with moderate to severe hearing loss participated in this study. All were experienced hearing-aid users. As a reference, 11 listeners with normal hearing participated. Data Collection and Analysis: The participants were positioned in a 13-speaker array (left-right, –90°/+90°) or 7-speaker array (FB, 0–180°) and were asked to report the number of the loudspeaker located the closest to where the sound was perceived. The root mean square error was calculated for the left-right experiment, and the percentage of FB errors was used as a FB performance measure. Results were analyzed with repeated-measures analysis of variance. Results: For the left-right localization task, no significant differences could be proven between the unaided condition and both partial directional schemes and the omnidirectional scheme. The soft forward-oriented system and the asymmetric system did show a detrimental effect compared with the unaided condition. On average, localization was worst when users used the asymmetric condition. Analysis of the results of the FB experiment showed good performance, similar to unaided, with both the partial directional systems and the asymmetric configuration. Significantly worse performance was found with the omnidirectional and the omnidirectional soft forward-oriented BTE systems compared with the other hearing-aid systems. Conclusions: Bilaterally fitted partial directional systems preserve (part of) the binaural cues necessary for left-right localization and introduce, preserve, or enhance useful spectral cues that allow FB disambiguation. Omnidirectional systems, although good for left-right localization, do not provide the user with enough spectral information for an optimal FB localization performance.
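
The two outcome measures described above (root-mean-square left-right error and percentage of front-back errors) are straightforward to compute. The sketch below is a minimal illustration, assuming a front-back trial is scored as a confusion whenever the response and its target fall on opposite sides of 90° on the 0-180° axis; the trial data are hypothetical.

```python
import numpy as np

def rms_error_deg(perceived_deg, presented_deg):
    """Root-mean-square left-right localization error, in degrees."""
    p = np.asarray(perceived_deg, dtype=float)
    t = np.asarray(presented_deg, dtype=float)
    return float(np.sqrt(np.mean((p - t) ** 2)))

def front_back_error_rate(perceived_deg, presented_deg):
    """Fraction of trials in which a frontal target (< 90 deg on the 0-180 deg
    axis) was reported as rear (> 90 deg), or vice versa."""
    p = np.asarray(perceived_deg, dtype=float)
    t = np.asarray(presented_deg, dtype=float)
    return float(np.mean((p < 90) != (t < 90)))

# Hypothetical responses for illustration only.
print(rms_error_deg([-75, -30, 0, 45, 90], [-90, -30, 0, 30, 90]))    # ~9.5 deg
print(front_back_error_rate([20, 160, 150, 30], [20, 20, 150, 150]))  # 0.5
```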


2018 · Vol 115 (16) · pp. 4264-4269
Author(s): Daria Genzel, Michael Schutte, W. Owen Brimijoin, Paul R. MacNeilage, Lutz Wiegrebe

Distance is important: From an ecological perspective, knowledge about the distance to either prey or predator is vital. However, the distance of an unknown sound source is particularly difficult to assess, especially in anechoic environments. In vision, changes in perspective resulting from observer motion produce a reliable, consistent, and unambiguous impression of depth known as motion parallax. Here we demonstrate with formal psychophysics that humans can exploit auditory motion parallax, i.e., the change in the dynamic binaural cues elicited by self-motion, to assess the relative depths of two sound sources. Our data show that sensitivity to relative depth is best when subjects move actively; performance deteriorates when subjects are moved by a motion platform or when the sound sources themselves move. This is true even though the dynamic binaural cues elicited by these three types of motion are identical. Our data demonstrate a perceptual strategy to segregate intermittent sound sources in depth and highlight the tight interaction between self-motion and binaural processing that allows assessment of the spatial layout of complex acoustic scenes.
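
The geometric core of auditory motion parallax is that, for a given lateral self-motion, a nearer source sweeps through azimuth faster than a farther one. The sketch below illustrates that relation with a small-angle approximation (azimuth rate ≈ v / d for a source roughly straight ahead); the speeds and distances are hypothetical, and this is not the paper's psychophysical model.

```python
import numpy as np

def azimuth_rate_deg_per_s(lateral_speed_m_s, distance_m):
    """Small-angle approximation: a listener translating sideways at speed v
    sees a source roughly straight ahead change azimuth at about v / d rad/s,
    so nearer sources sweep faster -- the geometric basis of motion parallax."""
    return np.degrees(lateral_speed_m_s / distance_m)

# Same self-motion, two hypothetical source distances.
for d in (1.0, 4.0):
    print(f"distance {d:.1f} m -> azimuth rate {azimuth_rate_deg_per_s(0.2, d):.1f} deg/s")
```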


2014 · Vol 11 (90) · pp. 20130857
Author(s): Robert Malkin, Thomas R. McDonagh, Natasha Mhatre, Thomas S. Scott, Daniel Robert

Animal ears are exquisitely adapted to capture sound energy and perform signal analysis. Studying the ear of the locust, we show how frequency signal analysis can be performed solely by using the structural features of the tympanum. Incident sound waves generate mechanical vibrational waves that travel across the tympanum. These waves shoal in a tsunami-like fashion, resulting in energy localization that focuses vibrations onto the mechanosensory neurons in a frequency-dependent manner. Using finite element analysis, we demonstrate that two mechanical properties of the locust tympanum, distributed thickness and tension, are necessary and sufficient to generate frequency-dependent energy localization.
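
The shoaling analogy follows from how transverse wave speed on a membrane depends on tension and areal mass, c = sqrt(T / (rho * h)): regions that are thicker, or under lower tension, carry waves more slowly, and incoming energy piles up there. The sketch below evaluates that relation with assumed, illustrative parameter values; it is not the finite element model used in the paper.

```python
import numpy as np

def membrane_wave_speed(tension_n_per_m, density_kg_m3, thickness_m):
    """Transverse wave speed on a membrane: c = sqrt(T / (rho * h)).
    Thicker or less tense regions propagate waves more slowly."""
    return np.sqrt(tension_n_per_m / (density_kg_m3 * thickness_m))

# Assumed, illustrative values: as thickness increases across the membrane,
# local wave speed drops, so wave energy localizes there (the shoaling analogy).
rho = 1200.0  # kg/m^3, assumed tissue density
for h_um in (1.0, 5.0, 20.0):
    c = membrane_wave_speed(0.1, rho, h_um * 1e-6)
    print(f"thickness {h_um:4.1f} um -> wave speed {c:5.1f} m/s")
```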

