perceptual weights
Recently Published Documents

TOTAL DOCUMENTS: 19 (five years: 4)
H-INDEX: 6 (five years: 1)
2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Cora Kubetschek ◽  
Christoph Kayser

Abstract: Many studies speak in favor of a rhythmic mode of listening, whereby the encoding of acoustic information is structured by rhythmic neural processes on a time scale of about 1 to 4 Hz. Indeed, psychophysical data suggest that humans do not sample acoustic information in extended soundscapes uniformly, but weigh the evidence at different moments for their perceptual decision on a time scale of about 2 Hz. Here we test the critical prediction that such rhythmic perceptual sampling is directly related to the state of ongoing brain activity prior to the stimulus. Human participants judged the direction of frequency sweeps in 1.2 s long soundscapes while their EEG was recorded. We computed the perceptual weights attributed to different epochs within these soundscapes contingent on the phase or power of pre-stimulus EEG activity. This revealed a direct link between 4 Hz EEG phase and power prior to the stimulus and the phase of the rhythmic component of these perceptual weights. Hence, the temporal pattern by which acoustic information is sampled over time for behavior is directly related to pre-stimulus brain activity in the delta/theta band. These results close a gap in the mechanistic picture linking ongoing delta band activity with its role in shaping the segmentation and perceptual influence of subsequent acoustic information.


2021 ◽  
Author(s):  
Kyle Jasmin ◽  
Adam Tierney ◽  
Lori Holt

Abstract: Segmental speech units (e.g. phonemes) are described as multidimensional categories wherein perception involves contributions from multiple acoustic input dimensions, and the relative perceptual weights of these dimensions respond dynamically to context. Can prosodic aspects of speech spanning multiple phonemes, syllables or words be characterized similarly? Here we investigated the relative contribution of two acoustic dimensions to word emphasis. Participants categorized instances of a two-word phrase pronounced with typical covariation of fundamental frequency (F0) and duration, and in the context of an artificial ‘accent’ in which F0 and duration covaried atypically. When categorizing ‘accented’ speech, listeners rapidly down-weighted the secondary dimension (duration) while continuing to rely on the primary dimension (F0). This addresses two core theoretical questions, showing that: 1) prosodic categories are signalled by multiple acoustic input dimensions, and 2) perceptual cue weights for prosodic categories dynamically adapt to local regularities of speech input.

Highlights:
- Prosodic categories are signalled by multiple acoustic dimensions.
- The influence of these dimensions flexibly adapts to changes in local speech input.
- This adaptive plasticity may help tune perception to atypical accented speech.
- Similar learning models may account for segmental and suprasegmental flexibility.
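The rapid down-weighting of an uninformative cue can be illustrated with a toy cue-weighting model. The sketch below fits a logistic model whose coefficients play the role of perceptual cue weights; the stimulus statistics, the gradient-ascent learning rule, and all numbers are invented for illustration and are not the authors' model. When duration stops covarying with category (the 'accent' condition), its fitted weight collapses while the F0 weight is retained:

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_weights(f0, dur, label, n_iter=5000, lr=0.5):
    """Fit a logistic cue-combination model by gradient ascent:
    P(emphasis) = sigmoid(w_f0 * f0 + w_dur * dur + bias).
    The fitted coefficients act as perceptual cue weights."""
    X = np.column_stack([f0, dur, np.ones_like(f0)])
    w = np.zeros(3)
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ w))
        w += lr * X.T @ (label - p) / len(label)
    return w

n = 1000
label = rng.integers(0, 2, n).astype(float)          # emphasized vs. not
f0 = label + rng.normal(scale=0.5, size=n)           # F0 is always informative

dur_typical = label + rng.normal(scale=0.5, size=n)  # duration covaries with category
dur_accent = rng.normal(scale=0.5, size=n)           # 'accent': duration uninformative

w_typ = fit_weights(f0, dur_typical, label)
w_acc = fit_weights(f0, dur_accent, label)

# In the 'accent' condition the duration weight collapses toward zero
# while the F0 weight is retained.
print(w_typ[1] > w_acc[1])
```

The same fitting procedure applied to the two exposure conditions yields the qualitative pattern described in the abstract: a large duration weight under typical covariation, a near-zero one under the accent.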


2020 ◽  
Author(s):  
Cora Kubetschek ◽  
Christoph Kayser

Abstract: Many studies speak in favor of a rhythmic mode of listening, whereby the encoding of acoustic information is structured by rhythmic neural processes on a time scale of about 1 to 4 Hz. Indeed, psychophysical data suggest that humans do not sample acoustic information in extended soundscapes uniformly, but weigh the evidence at different moments for their perceptual decision on a time scale of about 2 Hz. Here we test the critical prediction that such rhythmic perceptual sampling is directly related to the state of ongoing brain activity prior to the stimulus. Human participants judged the direction of frequency sweeps in 1.2 s long soundscapes while their EEG was recorded. Computing the perceptual weights attributed to different epochs within these soundscapes contingent on the phase or power of pre-stimulus oscillatory EEG activity revealed a direct link between the 4 Hz EEG phase and power prior to the stimulus and the phase of the rhythmic component of these perceptual weights. Hence, the temporal pattern by which acoustic information is sampled over time for behavior is directly related to pre-stimulus brain activity in the delta/theta band. These results close a gap in the mechanistic picture linking ongoing delta band activity with its role in shaping the segmentation and perceptual influence of subsequent acoustic information.


2019 ◽  
Author(s):  
Christoph Kayser

Abstract: Converging results suggest that perception is controlled by rhythmic processes in the brain. In the auditory domain, neuroimaging studies show that the perception of brief sounds is shaped by rhythmic activity prior to the stimulus, and electrophysiological recordings have linked delta band (1-2 Hz) activity to the functioning of individual neurons. These results have promoted theories of rhythmic modes of listening and generally suggest that the perceptually relevant encoding of acoustic information is structured by rhythmic processes along auditory pathways. A prediction from this perspective – which so far has not been tested – is that such rhythmic processes also shape how acoustic information is combined over time to judge extended soundscapes. The present study was designed to directly test this prediction. Human participants judged the overall change in perceived frequency content in temporally extended (1.2 to 1.8 s) soundscapes, while the perceptual use of the available sensory evidence was quantified using psychophysical reverse correlation. Model-based analysis of individual participants' perceptual weights revealed a rich temporal structure, including linear trends, a U-shaped profile tied to the overall stimulus duration, and importantly, rhythmic components at the time scale of 1 to 2 Hz. The collective evidence found here across four versions of the experiment supports the notion that rhythmic processes operating on the delta band time scale structure how perception samples temporally extended acoustic scenes.
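The psychophysical reverse correlation named above can be sketched in a few lines: simulate an observer whose binary choices depend on a weighted sum of per-epoch evidence, then recover the weights from the choices alone via a classification-image-style contrast. The observer model, the 2 Hz weight profile, and all numbers are invented for illustration and are not the authors' analysis pipeline:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setup: 12 evidence epochs per 1.2 s soundscape
n_trials, n_epochs = 5000, 12
t = np.arange(n_epochs) * 0.1                      # epoch onsets (s)
true_w = 1.0 + 0.8 * np.sin(2 * np.pi * 2.0 * t)   # rhythmic ~2 Hz weighting

# Per-epoch sensory evidence (e.g. signed sweep-direction strength)
evidence = rng.normal(size=(n_trials, n_epochs))

# Simulated observer: weighted evidence + internal noise -> binary choice
decision_var = evidence @ true_w + rng.normal(scale=2.0, size=n_trials)
choice = (decision_var > 0).astype(float)

# Classification-image style reverse correlation:
# mean evidence on "up" trials minus mean evidence on "down" trials, per epoch
est_w = evidence[choice == 1].mean(0) - evidence[choice == 0].mean(0)

# The estimated profile recovers the rhythmic weight shape up to scale
corr = np.corrcoef(est_w, true_w)[0, 1]
print(round(corr, 2))
```

For a linear observer with Gaussian evidence, this trial-sorted difference is proportional to the true weights, which is why the correlation between estimated and generating profiles comes out high.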


2018 ◽  
Author(s):  
Kyle Jasmin ◽  
Fred Dick ◽  
Lori Holt ◽  
Adam Tierney

Abstract: In speech, linguistic information is conveyed redundantly by many simultaneously present acoustic dimensions, such as fundamental frequency, duration and amplitude. Listeners show stable tendencies to prioritize these acoustic dimensions differently, relative to one another, which suggests individualized speech perception ‘strategies’. However, it is unclear what drives these strategies, and more importantly, what impact they have on diverse aspects of communication. Here we show that such individualized perceptual strategies can be related to individual differences in perceptual ability. In a cue weighting experiment, we first demonstrate that individuals with a severe pitch perception deficit (congenital amusics) categorize linguistic stimuli similarly to controls when their deficit is unrelated to the main distinguishing cue for that category (in this case, durational or temporal cues). In contrast, in a prosodic task where pitch-related cues are typically more informative, amusics place less importance on this pitch-related information when categorizing speech. Instead, they rely more on duration information. Crucially, these differences in perceptual weights were observed even when pitch-related differences were large enough to be perceptually distinct to amusic listeners. In a second set of experiments involving musical and prosodic phrase interpretation, we found that this reliance on duration information allowed amusics to overcome their perceptual deficits and perceive both speech and music successfully. These results suggest that successful speech (and potentially music) comprehension is achieved through multiple perceptual strategies whose underlying weights may in part reflect individuals' perceptual abilities.


2017 ◽  
Author(s):  
Stephanie C Boyle ◽  
Stephanie J Kayser ◽  
Christoph Kayser

To make accurate perceptual estimates, observers must take the reliability of sensory information into account. Despite many behavioural studies showing that subjects weight individual sensory cues in proportion to their reliabilities, it is still unclear when during a trial neuronal responses are modulated by the reliability of sensory information, or when they reflect the perceptual weights attributed to each sensory input during decision making. We investigated these questions using a combination of psychophysics, EEG-based neuroimaging and single-trial decoding. Our results show that the weighted integration of sensory information in the brain is a dynamic process; effects of sensory reliability on task-relevant EEG components were evident around 84 ms after stimulus onset, while neural correlates of perceptual weights emerged around 120 ms after stimulus onset. These neural processes also had different underlying topographies, arising from areas consistent with sensory and parietal regions. Together these results reveal the temporal dynamics of perceptual and neural audio-visual integration and support the notion of temporally early and functionally specific multisensory processes in the brain.
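Weighting cues "in proportion to their reliabilities" is commonly formalized as maximum-likelihood fusion of independent Gaussian cues: each cue's weight is its inverse variance, normalized across cues, and the fused estimate has lower variance than any single cue. A minimal sketch (the cue values and variances below are invented, not data from the study):

```python
import numpy as np

def fuse(estimates, variances):
    """Maximum-likelihood fusion of independent Gaussian cues:
    weights are inverse variances, normalized to sum to 1."""
    variances = np.asarray(variances, float)
    w = 1.0 / variances
    w /= w.sum()                                # reliability-proportional weights
    est = np.dot(w, estimates)                  # fused estimate
    var = 1.0 / np.sum(1.0 / variances)         # fused variance
    return est, var, w

# Hypothetical audio-visual example: the visual cue is four times more reliable
est, var, w = fuse(estimates=[10.0, 14.0], variances=[1.0, 4.0])
print(w)    # [0.8, 0.2]: the reliable cue dominates
print(est)  # 10.8: pulled toward the reliable cue
print(var)  # 0.8: below the best single cue's variance of 1.0
```

This is the standard normative benchmark against which the behavioural weighting in such studies is usually compared; the EEG analyses in the abstract then ask when neural responses start reflecting these weights.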


PLoS ONE ◽  
2016 ◽  
Vol 11 (9) ◽  
pp. e0162876 ◽  
Author(s):  
Wei Hu ◽  
Lin Mi ◽  
Zhen Yang ◽  
Sha Tao ◽  
Mingshuang Li ◽  
...  

2016 ◽  
Vol 13 (118) ◽  
pp. 20160057 ◽  
Author(s):  
Erin E. Sutton ◽  
Alican Demir ◽  
Sarah A. Stamper ◽  
Eric S. Fortune ◽  
Noah J. Cowan

Animal nervous systems resolve sensory conflict for the control of movement. For example, the glass knifefish, Eigenmannia virescens, relies on visual and electrosensory feedback as it swims to maintain position within a moving refuge. To study how signals from these two parallel sensory streams are used in refuge tracking, we constructed a novel augmented reality apparatus that enables the independent manipulation of visual and electrosensory cues to freely swimming fish (n = 5). We evaluated the linearity of multisensory integration, the change to the relative perceptual weights given to vision and electrosense in relation to sensory salience, and the effect of the magnitude of sensory conflict on sensorimotor gain. First, we found that tracking behaviour obeys superposition of the sensory inputs, suggesting linear sensorimotor integration. In addition, fish rely more on vision when electrosensory salience is reduced, suggesting that fish dynamically alter sensorimotor gains in a manner consistent with Bayesian integration. However, the magnitude of sensory conflict did not significantly affect sensorimotor gain. These studies lay the theoretical and experimental groundwork for future work investigating multisensory control of locomotion.
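The superposition test mentioned above amounts to checking whether the response to combined cue perturbations equals the sum of the responses to each perturbation alone. A toy first-order linear tracker makes the logic concrete; the dynamics, gains and inputs below are invented for illustration and are not the fish model:

```python
import numpy as np

def track(visual, electro, g_v=0.6, g_e=0.4, a=0.9):
    """First-order linear tracker:
    x[t+1] = a * x[t] + g_v * visual[t] + g_e * electro[t]."""
    x = np.zeros(len(visual) + 1)
    for t in range(len(visual)):
        x[t + 1] = a * x[t] + g_v * visual[t] + g_e * electro[t]
    return x[1:]

rng = np.random.default_rng(0)
v = rng.normal(size=200)   # visual refuge motion
e = rng.normal(size=200)   # electrosensory refuge motion

r_v = track(v, np.zeros_like(e))   # vision-only perturbation
r_e = track(np.zeros_like(v), e)   # electrosense-only perturbation
r_both = track(v, e)               # both cues perturbed together

# For a linear system, the combined response equals the sum of the parts
print(np.allclose(r_both, r_v + r_e))
```

A nonlinear tracker (e.g. with a saturating gain) would fail this check, which is why superposition of the measured responses is evidence for linear sensorimotor integration.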


2014 ◽  
Vol 136 (2) ◽  
pp. 728-735 ◽  
Author(s):  
Walt Jesteadt ◽  
Daniel L. Valente ◽  
Suyash N. Joshi ◽  
Kendra K. Schmid
