passive hearing
Recently Published Documents


TOTAL DOCUMENTS: 24 (five years: 4)
H-INDEX: 6 (five years: 1)

2020
Author(s): Gioia De Franceschi, Tania Rinaldi Barkat

Sensory processing varies depending on behavioral context. Here, we asked how task-engagement modulates neurons in the auditory system. We trained mice in a simple tone-detection task, and compared their neuronal activity during passive hearing and active listening. Electrophysiological extracellular recordings in the inferior colliculus, medial geniculate body, primary auditory cortex and anterior auditory field revealed widespread modulations across all regions and cortical layers, and in both putative regular and fast-spiking cortical neurons. Clustering analysis unveiled ten distinct modulation patterns that could either enhance or suppress neuronal activity. Task-engagement changed the tone-onset response in most neurons. Such modulations first emerged in subcortical areas, ruling out cortical feedback from primary auditory areas as the only mechanism underlying subcortical modulations. Half the neurons additionally displayed late modulations associated with licking, arousal or reward. Our results reveal the presence of functionally distinct subclasses of neurons, differentially sensitive to specific task-related variables but anatomically distributed along the auditory pathway.


2019
Vol 24 (3), pp. 297-306
Author(s): Emma-Kate Matthews

This article discusses the importance of active listening when engaging new audiences with experimental and unfamiliar musical formats. Spatial music is examined as a physically immersive medium in which the audience is implicated as an active component in the execution of its performance. A brief account of the historic context of spatial music will be presented alongside speculation on the trajectory of its future, particularly its potential as a model for audience engagement. This article will first consider how spatially immersive performances have the capacity to activate listeners and how this can help to engage new audiences with new ways of listening. It will also question the notion of inhabiting spatial music, with an investigation of the multiple ways in which spatial music relates to physical space and the terms of its inhabitation. The concept of virtual listening will be discussed in response to trends towards passive hearing, as driven by recent technological developments in acoustic software and hardware, and the resultant abstraction of the spatial and social dynamics of sound in virtual space. The physiological and psychological differences between listening and hearing will also be examined as a means of establishing fundamental differences in the ways that we interact with music, and of questioning what our listening habits tell us about audience engagement in the context of experimental music performance. This article will also question the individual roles of the musician, composer, architect/designer and audience in the ongoing responsibility to improve audience engagement in new, or unfamiliar, musical works. Importantly, this article will also explicitly examine who we are referring to when we use the term ‘new audiences’. Major developments in acoustic technology during the last few decades have somewhat confused the diagram between music, space and listener.
The understanding of which elements are active and which are passive is especially ambiguous at a time when ambisonic and binaural technologies have become developed enough to provide accurate simulations of the abstract, acoustic qualities of spaces, but on virtual terms. Architects, composers, musicians, engineers and audiences are at a crossroads in the development of new music and experimental, spatiosonic practice. ‘Spatiosonic’ is a hybrid term which is used throughout this article to describe work and phenomena which regard space (spatio) and sound (sonic) as equal, interactive partners. This article considers some of the opportunities and limitations at stake in current techniques of composition, performance and listening.


2019
Vol 12 (1)
Author(s): Pan-tong Yao, Jia Shen, Liang Chen, Shaoyu Ge, Qiaojie Xiong

Abstract Selective attention modulates sensory cortical activity. It remains unclear how auditory cortical activity represents stimuli that differ behaviorally. We designed a cross-modality task in which mice made decisions to obtain rewards based on attended visual or auditory stimuli. We recorded auditory cortical activity in behaving mice attending to, ignoring, or passively hearing auditory stimuli. Engaging in the task bidirectionally modulates neuronal responses to the auditory stimuli in both the attended and ignored conditions compared to passive hearing. Neuronal ensemble activity in response to stimuli under the attended, ignored and passive conditions is readily distinguishable. Furthermore, ensemble activity under the attended and ignored conditions occupies closer states than under the passive condition, and the two share a component of attentional modulation that drives them in the same direction in population activity space. Our findings suggest that the ignored condition is very different from the passive condition, and that auditory cortical sensory processing is modulated differently under the ignored, attended and passive conditions.
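The notion of conditions occupying "closer states" in population activity space can be made concrete with distances between population response vectors. The sketch below is purely illustrative, with synthetic data: the vectors, the shared attentional component, and the noise level are invented for demonstration and are not the authors' data or analysis.

```python
import numpy as np

# Hypothetical trial-averaged population response vectors (one entry per
# recorded neuron) for the three conditions described in the abstract.
rng = np.random.default_rng(0)
passive = rng.normal(size=50)
shared_shift = rng.normal(size=50)  # common attentional component
attended = passive + shared_shift + 0.2 * rng.normal(size=50)
ignored = passive + shared_shift + 0.2 * rng.normal(size=50)

def dist(a, b):
    """Euclidean distance between two population states."""
    return float(np.linalg.norm(a - b))

# Because attended and ignored share the attentional shift, they lie
# closer to each other than either does to the passive state.
d_att_ign = dist(attended, ignored)
d_att_pas = dist(attended, passive)
d_ign_pas = dist(ignored, passive)
```

Here the shared shift models the common attentional component the abstract describes; removing it would collapse all three states onto one another up to noise.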


2019
Author(s): Pan-tong Yao, Jia Shen, Liang Chen, Shaoyu Ge, Qiaojie Xiong

Abstract Selective attention modulates sensory cortical activity. It remains unclear how auditory cortical activity represents stimuli that differ behaviorally. We designed a cross-modality task in which mice made decisions to obtain rewards based on attended visual or auditory stimuli. We recorded auditory cortical activity in behaving mice attending to, ignoring, or passively hearing auditory stimuli. Engaging in the task bidirectionally modulates neuronal responses to the auditory stimuli in both the attended and ignored conditions compared to passive hearing. Neuronal ensemble activity in response to stimuli under the attended, ignored and passive conditions is readily distinguishable. Furthermore, ensemble activity under the attended and ignored conditions occupies closer states than under the passive condition, and the two share a component of attentional modulation that drives them in the same direction in population activity space. Our findings suggest that task engagement changes sensory cortical representations across modalities in the same directions, and that cross-modality attention may differentially modulate the attended and ignored modalities.


2017
Vol 57 (sup1), pp. S51-S60
Author(s): Chantal Laroche, Christian Giguère, Véronique Vaillancourt, Karine Roy, Louis-Philippe Pageot, ...

2016
Vol 136 (6), pp. 556-558
Author(s): Mario Emilio Zernotti, Maria Fernanda Di Gregorio, Pablo Galeazzi, Paola Tabernero

PLoS ONE
2015
Vol 10 (8), pp. e0136568
Author(s): Andrew D. Brown, Brianne T. Beemer, Nathaniel T. Greene, Theodore Argo, G. Douglas Meegan, ...

2015
Vol 2 (8), pp. 150225
Author(s): G. Arditi, A. J. Weiss, Y. Yovel

Determining the location of a sound source is crucial for survival. Both predators and prey usually produce sound while moving, revealing valuable information about their presence and location. Animals have thus evolved morphological and neural adaptations allowing precise sound localization. Mammals rely on the temporal and amplitude differences between the sound signals arriving at their two ears, as well as on the spectral cues available in the signal arriving at a single ear to localize a sound source. Most mammals rely on passive hearing and are thus limited by the acoustic characteristics of the emitted sound. Echolocating bats emit sound to perceive their environment. They can, therefore, affect the frequency spectrum of the echoes they must localize. The biosonar sound beam of a bat is directional, spreading different frequencies into different directions. Here, we analyse mathematically the spatial information that is provided by the beam and could be used to improve sound localization. We hypothesize how bats could improve sound localization by altering their echolocation signal design or by increasing their mouth gape (the size of the sound emitter) as they, indeed, do in nature. Finally, we also reveal a trade-off according to which increasing the echolocation signal's frequency improves the accuracy of sound localization but might result in undesired large localization errors under low signal-to-noise ratio conditions.
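The interaural time difference mentioned above — one of the binaural cues mammals use for localization — can be illustrated with a standard far-field approximation. The function name, head-width value, and simplified formula below are illustrative assumptions, not the mathematical analysis carried out in the paper.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def interaural_time_difference(azimuth_deg, ear_distance_m=0.015):
    """Far-field approximation of the interaural time difference (ITD)
    in seconds for a source at the given azimuth (0 = straight ahead,
    90 = directly to one side). ear_distance_m is an illustrative
    inter-ear spacing for a small mammal."""
    azimuth_rad = math.radians(azimuth_deg)
    return ear_distance_m * math.sin(azimuth_rad) / SPEED_OF_SOUND

# A source directly to the side produces the maximum ITD for this
# head width; a source straight ahead produces no ITD at all.
itd_side = interaural_time_difference(90.0)
itd_front = interaural_time_difference(0.0)
```

The resulting ITDs are on the order of tens of microseconds for small heads, which is why spectral cues and, for echolocating bats, control over the emitted beam become valuable additional sources of localization information.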

