auditory dominance
Recently Published Documents

TOTAL DOCUMENTS: 26 (FIVE YEARS: 9)
H-INDEX: 8 (FIVE YEARS: 1)

2021 ◽ pp. 1-19 ◽ Author(s): Alexandra N. Scurry, Daniela M. Lemus, Fang Jiang

Abstract: Reliable duration perception is an integral aspect of daily life that impacts everyday perception, motor coordination, and the subjective passage of time. The Scalar Expectancy Theory (SET) is a common model that explains how an internal pacemaker, gated by an external stimulus-driven switch, accumulates pulses during sensory events and compares these accumulated pulses to a reference memory duration for subsequent duration estimation. Second-order mechanisms, such as multisensory integration (MSI) and attention, can influence this model and affect duration perception. For instance, diverting attention away from temporal features could delay the switch closure or temporarily open the accumulator, altering pulse accumulation and distorting duration perception. In crossmodal duration perception, auditory signals of unequal duration can induce perceptual compression and expansion of the durations of visual stimuli, presumably via auditory influence on the visual clock. The current project aimed to investigate the role of temporal (stimulus alignment) and nontemporal (stimulus complexity) features in crossmodal, specifically auditory-over-visual, duration perception. While temporal alignment had a larger impact on the strength of crossmodal duration percepts than stimulus complexity, both features demonstrate auditory dominance in the processing of visual duration.
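To make the pacemaker-accumulator idea concrete, the following is a minimal Python sketch of an SET-style clock, assuming an illustrative pacemaker rate, switch latency, and ratio comparison rule (none of these values come from the abstract). Delaying switch closure, as diverting attention might, reduces the accumulated pulse count and therefore compresses the perceived duration.

```python
import random

def accumulate_pulses(stimulus_ms, pacemaker_hz=50.0, switch_latency_ms=30.0):
    """SET-style pacemaker-accumulator sketch (illustrative parameters only).

    Pulses emitted at `pacemaker_hz` are gated by a switch that closes
    `switch_latency_ms` after stimulus onset; the accumulated count is the
    internal representation of the stimulus duration.
    """
    gated_ms = max(0.0, stimulus_ms - switch_latency_ms)  # switch closes after a latency
    mean_pulses = gated_ms / 1000.0 * pacemaker_hz
    # scalar property: trial-to-trial noise proportional to the mean count
    return max(0.0, random.gauss(mean_pulses, 0.1 * mean_pulses))

def judged_same(test_pulses, reference_pulses, threshold=0.15):
    """Ratio comparison of the test count against a reference memory count."""
    return abs(test_pulses - reference_pulses) / reference_pulses < threshold

reference = accumulate_pulses(800)                         # encode an 800 ms standard
distracted = accumulate_pulses(800, switch_latency_ms=80)  # attention diverted: later switch closure
print(judged_same(distracted, reference))                  # fewer pulses -> duration seems compressed
```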


2021 ◽ Vol 15 ◽ Author(s): Hiroshi Yoshimatsu, Yuko Yotsumoto

We constantly integrate multiple types of information from different sensory modalities. Generally, such integration is influenced by the modality that we attend to. For duration perception, however, it has been shown that when duration information from the visual and auditory modalities is integrated, the perceived duration of the visual stimulus leans toward the duration of the auditory stimulus, irrespective of which modality is attended. In these studies, auditory dominance was assessed using visual and auditory stimuli of different durations, whose onset and offset timings would themselves affect perception. In the present study, we aimed to investigate the effect of attention on duration integration using visual and auditory stimuli of the same duration. Since the duration of a visual flicker tends to be perceived as longer than its physical duration, and that of an auditory flutter as shorter, we used a 10 Hz visual flicker and auditory flutter with the same onset and offset timings but different perceived durations. The participants were asked to attend to the visual, the auditory, or both modalities. Contrary to the attention-independent auditory dominance reported in previous studies, we found that the perceived duration of the simultaneously presented flicker and flutter depended on which modality the participants attended. To further investigate the process of duration integration across the two modalities, we applied Bayesian hierarchical modeling, which enabled us to define a flexible model in which the multisensory duration is represented by the weighted average of each sensory modality. In addition, to examine whether auditory dominance results from the higher reliability of auditory stimuli, we applied further models that take stimulus reliability into account. These behavioral and modeling results suggest the following: (1) the perceived duration of visual and auditory stimuli is influenced by which modality the participants attend to once the confounding effect of stimulus onset-offset timing is controlled, and (2) attention increases the weight given to the attended modality and thereby affects duration integration, even when the effect of stimulus reliability is controlled. Our models can be extended to investigate the neural basis of duration integration and the effects of other sensory modalities.
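The weighted-average idea can be illustrated with a short sketch of reliability-weighted cue combination in which attention is modeled, purely as an assumption, as a multiplicative gain on a modality's weight; the durations, reliabilities, and gain values below are hypothetical and are not the authors' fitted hierarchical model.

```python
def integrated_duration(dur_visual_ms, dur_auditory_ms,
                        reliability_visual, reliability_auditory,
                        attn_gain_visual=1.0, attn_gain_auditory=1.0):
    """Weighted average of unisensory duration estimates.

    reliability ~ 1 / variance of each unisensory estimate; attention is
    treated (hypothetically) as a multiplicative gain on that reliability.
    """
    w_v = attn_gain_visual * reliability_visual
    w_a = attn_gain_auditory * reliability_auditory
    return (w_v * dur_visual_ms + w_a * dur_auditory_ms) / (w_v + w_a)

# Hypothetical percepts of a 600 ms stimulus: the 10 Hz flicker is overestimated
# and the flutter underestimated, as described in the abstract.
flicker_ms, flutter_ms = 620.0, 560.0
rel_v, rel_a = 1 / 40.0**2, 1 / 25.0**2   # assumed unisensory variances (ms^2)

# Unattended baseline: the integrated estimate is pulled toward the auditory flutter.
print(integrated_duration(flicker_ms, flutter_ms, rel_v, rel_a))
# Attending vision raises the visual weight and shifts the estimate back toward the flicker.
print(integrated_duration(flicker_ms, flutter_ms, rel_v, rel_a, attn_gain_visual=3.0))
```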


2021 ◽ Author(s): Allison Fitch, Nilam Thaker, Zsuzsa Kaldy

Verbal labels have been shown to help preverbal infants’ performance on various cognitive tasks, such as categorization. Redundant labels also aid adults’ visual working memory (WM), but it is not known if this linguistic benefit extends to preverbal infants’ WM. In two eye-tracking studies, we tested whether 8- and 10-month-old infants’ WM performance would improve with the presence of redundant labels in a Delayed Match Retrieval (DMR) paradigm that tested infants’ WM for object-location bindings. Findings demonstrated that infants at both ages were unable to remember two object-location bindings when co-presented with labels at encoding. Moreover, infants who encoded the object-location bindings with labels were not significantly better than those who did so in silence. These findings are discussed in the context of label advantages in cognition and auditory dominance.


2020 ◽ Author(s): Paddy Ross, Beth Atkins, Laura Allison, Holly Simpson, Catherine Duffell, ...

Effective emotion recognition is imperative to successfully navigating social situations. Research suggests differing developmental trajectories for the recognition of bodily and vocal emotion, but emotions are usually studied in isolation and rarely considered as multimodal stimuli in the literature. The Colavita effect suggests that adults show a visual dominance when presented with basic multimodal sensory stimuli, whereas more recent research finds that an auditory sensory dominance may be present in children under 8 years of age. However, it is not currently known whether this phenomenon holds for more complex multimodal social stimuli. Here we presented children and adults with multimodal social stimuli consisting of emotional bodies and voices, asking them to recognise the emotion in one modality while ignoring the other. We found that adults can perform this task with no detriment to performance, regardless of whether the ignored emotion was congruent or not. Children, however, found it extremely challenging to recognise bodily emotion while trying to ignore incongruent vocal emotional information. In several instances they performed below chance level, indicating that the auditory modality actively informed their choice of bodily emotion. This is therefore the first evidence, to our knowledge, of an auditory dominance in children presented with emotionally meaningful stimuli.


2020 ◽ Author(s): Kyongsik Yun, Joydeep Bhattacharya, Simone Sandkuhler, Yong-Jun Lin, Sunao Iwaki, ...

Abstract: When different senses are in conflict, one sense may dominate the perception of the other, but it is not known whether the sensory cortex associated with the dominant modality exerts a directional influence, at the functional brain level, over the sensory cortex associated with the dominated modality; in short, the link between sensory dominance and neuronal dominance is not established. In a task involving audio-visual conflict, using magnetoencephalography recordings in humans, we first demonstrated that neuronal dominance (the visual cortex being functionally influenced by the auditory cortex) was associated with sensory dominance (participants' visual perception being qualitatively altered by sound). Further, we found that prestimulus auditory-to-visual connectivity could predict the perceptual outcome on a trial-by-trial basis. Subsequently, we performed an effective connectivity-guided neurofeedback electroencephalography experiment and showed that participants who were briefly trained to increase the neuronal dominance from the auditory to the visual cortex also showed higher sensory (i.e., auditory) dominance during the conflict task immediately after the training. The results shed new light on the interactive neuronal nature of multisensory integration and open up exciting opportunities for enhancing or suppressing targeted mental functions subserved by effective connectivity.


Vision ◽ 2020 ◽ Vol 4 (1) ◽ pp. 14 ◽ Author(s): Margeaux Ciraolo, Samantha O’Hanlon, Christopher Robinson, Scott Sinnett

Investigations of multisensory integration have demonstrated that, under certain conditions, one modality is more likely to dominate the other. While the direction of this relationship typically favors the visual modality, the effect can be reversed to show auditory dominance under some conditions. The experiments presented here use an oddball detection paradigm with variable stimulus timings to test the hypothesis that a stimulus that is presented earlier will be processed first and will therefore contribute to sensory dominance. Additionally, we compared two measures of sensory dominance (slowdown scores and error rates) to determine whether the type of measure used can affect which modality appears to dominate. When stimuli were presented asynchronously, analysis of slowdown scores and error rates yielded the same result: for both the 1- and 3-button versions of the task, participants were more likely to show auditory dominance when the auditory stimulus preceded the visual stimulus, whereas evidence for visual dominance was observed when the auditory stimulus was delayed. In contrast, for the simultaneous condition, slowdown scores indicated auditory dominance, whereas error rates indicated visual dominance. Overall, these results provide empirical support for the hypothesis that the modality that engages processing first is more likely to show dominance, and suggest that more explicit measures of sensory dominance may favor the visual modality.
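For readers unfamiliar with the two measures, a purely hypothetical sketch follows; the abstract does not give the scoring formulas, so the definitions used here (slowdown score as the cross-modal minus unimodal response-time cost for a modality, error rate as the proportion of missed or incorrect oddball responses) are assumptions for illustration only.

```python
from statistics import mean

def slowdown_score(unimodal_rts_ms, crossmodal_rts_ms):
    """Assumed definition: RT cost of cross-modal presentation for one modality."""
    return mean(crossmodal_rts_ms) - mean(unimodal_rts_ms)

def error_rate(n_errors, n_trials):
    """Proportion of missed or incorrect oddball responses for one modality."""
    return n_errors / n_trials

# Hypothetical data: visual oddball responses suffer the larger cross-modal cost,
# which on the slowdown measure would point to auditory dominance.
visual_cost = slowdown_score([450, 470, 460], [520, 540, 530])     # +70 ms
auditory_cost = slowdown_score([430, 440, 450], [445, 455, 460])   # ~+13 ms
print(visual_cost > auditory_cost)      # True -> auditory dominance by this measure

# The same participants could still miss more auditory than visual oddballs,
# so the error-rate measure can point to visual dominance instead.
auditory_errors = error_rate(12, 60)    # hypothetical: 20% of auditory oddballs missed
visual_errors = error_rate(6, 60)       # hypothetical: 10% of visual oddballs missed
print(auditory_errors > visual_errors)  # True -> visual dominance by the error measure
```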


2020 ◽ Author(s): Christopher W Robinson

The current study examined how simple tones affect speeded visual responses in a visual-spatial sequence learning task. Across the three reported experiments, participants were presented with a visual target that appeared in different locations on a touchscreen monitor, and they were instructed to touch the visual targets as quickly as possible. Response times typically sped up across training, and participants were slower to respond to the visual stimuli when the sequences were paired with tones. Moreover, these interference effects were more pronounced early in training, and explicit instructions directing attention to the visual modality had little effect on eliminating auditory interference, suggesting that these interference effects may stem from bottom-up factors and do not appear to be under attentional control. These findings have implications for tasks that require the processing of simultaneously presented auditory and visual information and provide support for a proposed mechanism underlying auditory dominance on a task that is typically better suited to the visual modality.


2020 ◽ Author(s): Christopher W Robinson

The current study used cross-modal oddball tasks to examine cardiac and behavioral responses to changing auditory and visual information. When instructed to press the same button for auditory and visual oddballs, participants showed auditory dominance, with cross-modal presentation slowing down visual response times more than auditory response times (Experiment 1). When instructed to make separate responses to auditory and visual oddballs, participants showed visual dominance, with cross-modal presentation decreasing auditory discrimination; participants also made more visual-based than auditory-based errors on cross-modal trials (Experiment 2). Experiment 3 increased task demands while requiring a single button press and found evidence of auditory dominance, suggesting that it is unlikely that increased task demands can account for the reversal in Experiment 2. Examination of cardiac responses that were time-locked to stimulus onset showed cross-modal facilitation effects, with auditory and visual discrimination occurring earlier in the course of processing in the cross-modal condition than in the unimodal conditions. The current findings, showing that response-demand manipulations reversed modality dominance and that time-locked cardiac responses show cross-modal facilitation rather than interference, suggest that both auditory and visual dominance effects may occur later in the course of processing rather than stemming from disrupted encoding.


2017 ◽ Vol 35 (1) ◽ pp. 77-93 ◽ Author(s): Marilyn G. Boltz

Although the visual modality often dominates the auditory one, one exception occurs in the presence of tempo discrepancies between the two perceptual systems: variations in auditory rate typically have a greater influence on perceived visual rate than vice versa. This phenomenon, termed “auditory driving,” is investigated here through certain techniques used in cinematic art. Experiments 1 and 2 relied on montages (slideshows) of still photos accompanied by musical selections, in which the perceived rate of one modality was assessed through a recognition task while the rate of the other modality was systematically varied. A similar methodological strategy was used in Experiments 3 and 4, in which film excerpts of various moving objects were accompanied by the sounds they typically produce. In both cases, auditory dominance was observed, which has implications at both theoretical and applied levels.

