crossmodal interactions
Recently Published Documents


TOTAL DOCUMENTS: 45 (five years: 11)
H-INDEX: 10 (five years: 1)

2021 ◽  
Author(s):  
Daria Kvasova ◽  
Travis Stewart ◽  
Salvador Soto-Faraco

In real-world scenes, the different objects and events available to our senses are interconnected within a rich web of semantic associations. These semantic links help parse information and make sense of the environment. For example, during goal-directed attention, characteristic everyday object sounds speed up visual search for the corresponding objects in natural, dynamic environments. However, it is not known whether semantic correspondences also play a role under spontaneous observation. Here, we investigated whether crossmodal semantic congruence can drive spontaneous, overt visual attention in free-viewing conditions. We used eye-tracking whilst participants (N=45) viewed video clips of realistic complex scenes presented alongside sounds of varying semantic congruency with objects in the videos. We found that characteristic sounds increased the probability of looking at, the number of fixations on, and the total dwell time on the semantically corresponding visual objects, compared to when the same scenes were presented with semantically neutral sounds or with background noise only. Our results suggest that crossmodal semantic congruence affects spontaneous gaze and eye movements, and therefore how attention samples information in a free-viewing paradigm. Our findings extend beyond known effects of object-based crossmodal interactions with simple stimuli and shed new light upon how audio-visual semantically congruent relationships play out in everyday life scenarios.


2021 ◽  
Vol 39 (1) ◽  
pp. 1-20
Author(s):  
Zachary Wallmark ◽  
Linh Nghiem ◽  
Lawrence E. Marks

Musical timbre is often described using terms from non-auditory senses, mainly vision and touch, but it is not clear whether crossmodality in timbre semantics reflects multisensory processing or merely linguistic convention. If multisensory processing is involved in timbre perception, the mechanism governing the interaction remains unknown. To investigate whether timbres commonly perceived as “bright-dark” facilitate or interfere with visual perception of brightness and darkness, we designed two speeded classification experiments. Participants were presented with consecutive images of slightly varying (or identical) brightness along with task-irrelevant auditory primes (“bright” or “dark” tones) and asked to quickly identify whether the second image was brighter or darker than the first. Incongruent prime-stimulus combinations produced significantly more response errors than congruent combinations, but choice reaction time was unaffected. Furthermore, responses in a deceptive identical-image condition indicated a subtle, semantically congruent response bias. Additionally, in Experiment 2 (which also incorporated a spatial texture task), measures of reaction time (RT) and accuracy were used to construct speed-accuracy tradeoff functions (SATFs) in order to critically compare two hypothesized mechanisms for timbre-based crossmodal interactions: sensory response change vs. shift in response criterion. Results of the SATF analysis are largely consistent with the response-criterion hypothesis, although they do not conclusively rule out sensory change.
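A speed-accuracy tradeoff function of the kind described above can be approximated by sorting trials by reaction time, splitting them into equal-count bins, and computing mean RT and accuracy per bin. A minimal sketch with hypothetical data (the function name, bin count, and simulated accuracy-RT relationship are illustrative assumptions, not details from the study):

```python
import numpy as np

def satf(rts, correct, n_bins=5):
    """Bin trials by reaction time and compute accuracy per bin,
    approximating a speed-accuracy tradeoff function (SATF)."""
    order = np.argsort(rts)
    rts = np.asarray(rts)[order]
    correct = np.asarray(correct)[order]
    bins = np.array_split(np.arange(len(rts)), n_bins)  # equal-count RT bins
    mean_rt = np.array([rts[b].mean() for b in bins])
    accuracy = np.array([correct[b].mean() for b in bins])
    return mean_rt, accuracy

# Hypothetical data: slower responses tend to be more accurate.
rng = np.random.default_rng(0)
rts = rng.uniform(0.3, 1.2, 500)                 # reaction times in seconds
p_correct = 0.5 + 0.4 * (rts - 0.3) / 0.9        # accuracy grows with RT
correct = rng.random(500) < p_correct
mean_rt, acc = satf(rts, correct)
```

Comparing how such curves shift between congruent and incongruent prime conditions is one way to distinguish a change in sensory evidence from a shift in response criterion.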


2021 ◽  
Vol 12 ◽  
Author(s):  
Sandra Courrèges ◽  
Rim Aboulaasri ◽  
Anjali Bhatara ◽  
Marie-Héloïse Bardel

In the present series of studies, we investigated crossmodal perception of odor and texture. In four studies, participants tried two textures of face creams, one high viscosity (HV) and one low viscosity (LV), each with one of three levels of added odor (standard level, half of standard, or base [no added odor]), and then reported their levels of well-being. They also reported their perceptions of the face creams, including liking (global liking of the product, liking of its texture) and “objective” evaluations on just-about-right (JAR) scales (texture and visual appearance evaluations). In Study 1, women in France tried the creams on their hands, as they would when testing them in a store, and in Study 2, a second group of French women tried the creams on their faces, as they would at home. In Studies 3 and 4, these same two procedures were repeated in China. Results showed that both odor and texture had effects on well-being, liking, and JAR ratings, including interaction effects. Though effects varied by country and context (hand or face), the addition of odor to the creams generally increased reports of well-being, global liking, and texture liking, in some cases affecting the “objective” evaluations of texture. This is one of the first investigations of the impact of crossmodal olfactory-tactile perception on well-being, and it reinforces previous literature showing the importance of olfaction for well-being.


Author(s):  
Ryan J. Ward ◽  
Sophie M. Wuerger ◽  
Alan Marshall

Olfaction is ingrained into the fabric of our daily lives and constitutes an integral part of our perceptual reality. Within this reality, there are crossmodal interactions and sensory expectations; understanding how olfaction interacts with other sensory modalities is crucial for augmenting interactive experiences with more advanced multisensorial capabilities. This knowledge will eventually lead to better designs, more engaging experiences, and enhancing the perceived quality of experience. Toward this end, the authors investigated a range of crossmodal correspondences between ten olfactory stimuli and different modalities (angularity of shapes, smoothness of texture, pleasantness, pitch, colors, musical genres, and emotional dimensions) using a sample of 68 observers. Consistent crossmodal correspondences were obtained in all cases, including our novel modality (the smoothness of texture). These associations are most likely mediated by both the knowledge of an odor’s identity and the underlying hedonic ratings: the knowledge of an odor’s identity plays a role when judging the emotional and musical dimensions but not for the angularity of shapes, smoothness of texture, perceived pleasantness, or pitch. Overall, hedonics was the most dominant mediator of crossmodal correspondences.


2020 ◽  
Vol 20 (11) ◽  
pp. 1768
Author(s):  
Armand R. Tanguay, Jr. ◽  
Noelle R. B. Stiles ◽  
Ishani Ganguly ◽  
Shinsuke Shimojo

2020 ◽  
Vol 33 (4-5) ◽  
pp. 457-478
Author(s):  
Louise Manfron ◽  
Valéry Legrain ◽  
Lieve Filbrich

Abstract Examining the mechanisms underlying crossmodal interaction between nociceptive and visual stimuli is crucial to understand how humans handle potential bodily threats in their environment. It has recently been shown that nociceptive stimuli can affect the perception of visual stimuli, provided that they occur close together in external space. The present study addresses whether these crossmodal interactions between nociceptive and visual stimuli are mediated by the visually perceived proximity between the visual stimuli and the limb on which nociceptive stimuli are applied, by manipulating the presence vs. absence of visual feedback about the position of the stimulated limb. Participants performed temporal order judgments on pairs of visual stimuli, shortly preceded by nociceptive stimuli applied either on one hand or on both hands simultaneously. The hands were placed near the visual stimuli and could either be seen directly, seen through a glass barrier, or hidden from sight by a wooden board. Unilateral nociceptive stimuli induced spatial biases to the advantage of visual stimuli presented near the stimulated hand, and these biases were greater in the conditions in which the hands were seen than in the condition in which vision was prevented. Spatial biases were not modulated by the presence of the glass barrier, minimizing the possibility that the differential effect between the vision and no-vision conditions was due solely to the presence of a barrier between the hands and the visual stimuli. These findings highlight the importance of visual feedback in determining the spatial mapping between nociceptive and visual stimuli for crossmodal interaction.


2020 ◽  
Author(s):  
Rebecca Hirst ◽  
David McGovern ◽  
Annalisa Setti ◽  
ladan shams ◽  
Fiona Newell

In the Sound-Induced Flash Illusion (SIFI), sound dramatically alters visual perception: presenting a single flash with two beeps results in the perception of two flashes. In this comprehensive review, we synthesise 20 years of research using the SIFI, drawn from over 100 studies. We discuss the neural and computational principles governing this illusion and examine the influence of perceptual experience, development, ageing, and clinical conditions. Convergent findings show that the SIFI results from optimal integration and probabilistic inference and directly reflects crossmodal interactions in the temporal domain. Its neural basis lies in early modulation of visual cortex by auditory and multisensory regions. The SIFI shows increasingly strong potential as an efficient tool for measuring multisensory processing. Greater harmonisation across studies is now required to maximise this potential. We therefore propose considerations for researchers relating to the choice of stimulus parameters and signpost directions for future research.
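The optimal-integration account of the illusion can be illustrated with a minimal discrete Bayesian sketch (all probability values below are illustrative assumptions, not parameters from any cited study): when the auditory count is more reliable than the visual one, the posterior over the number of events is pulled toward the auditory report.

```python
import numpy as np

# Discrete Bayesian sketch of the sound-induced flash illusion.
# The true number of events n is 1 or 2; visual and auditory reports
# are noisy, with audition more reliable in the temporal domain.
prior = np.array([0.5, 0.5])          # P(n=1), P(n=2)
p_vis = np.array([[0.7, 0.3],         # P(visual report | n): rows n=1, n=2
                  [0.3, 0.7]])        # columns: report "one", report "two"
p_aud = np.array([[0.9, 0.1],         # P(auditory report | n): sharper,
                  [0.1, 0.9]])        # i.e., audition is more reliable

def posterior(vis_report, aud_report):
    """Posterior over n given conditionally independent visual and
    auditory reports (0 = 'one event', 1 = 'two events')."""
    likelihood = p_vis[:, vis_report] * p_aud[:, aud_report]
    post = prior * likelihood
    return post / post.sum()

# One flash seen, two beeps heard: the posterior favors two events,
# so an ideal observer reports an illusory second flash.
post = posterior(0, 1)
```

Under these assumed numbers the posterior probability of two events is 0.27/0.34 ≈ 0.79, reproducing the qualitative signature of the illusion: the less reliable visual estimate is captured by the more reliable auditory one.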


2019 ◽  
Author(s):  
Daria Kvasova ◽  
Laia Garcia-Vernet ◽  
Salvador Soto-Faraco

Abstract Real-world multisensory events provide not only temporally and spatially correlated information, but also semantic correspondences about object identity. Semantically consistent sounds can enhance visual detection, identification, and search performance, but these effects have so far been demonstrated only in simple, stereotyped displays that lack ecological validity. In order to address identity-based crossmodal relationships in real-world scenarios, we designed a visual search task using complex, dynamic scenes. Participants searched for objects in video clips of real-life scenes with background sounds. Auditory cues embedded in the background sounds could be target-consistent, distracter-consistent, or neutral, or there could be no added sound (background noise only). We found that characteristic sounds enhance visual search for relevant objects in natural scenes but fail to increase the salience of irrelevant distracters. Our findings generalize previous results on object-based crossmodal interactions with simple stimuli and shed light upon how audio-visual semantically congruent relationships play out in real-life contexts.


2019 ◽  
Author(s):  
Camille Vanderclausen ◽  
Marion Bourgois ◽  
Anne De Volder ◽  
Valéry Legrain

Abstract Adequately localizing pain is crucial to protect the body against physical damage and to react to the stimulus in external space that caused it. Accordingly, it is hypothesized that nociceptive inputs are remapped from a somatotopic reference frame, representing the skin surface, to a spatiotopic frame, representing the body parts in external space, and that this ability is developed and shaped by early visual experience. To test this hypothesis, normally sighted and early blind participants performed temporal order judgment tasks in which they judged which of two nociceptive stimuli, applied to the dorsum of each hand, was perceived as delivered first. Crucially, tasks were performed with the hands either in an uncrossed posture or crossed over the body midline. Whereas the early blind participants were not affected by posture, performance of the normally sighted participants decreased in the crossed relative to the uncrossed condition. This indicates that nociceptive stimuli were automatically remapped into a spatiotopic representation that interfered with somatotopy in normally sighted individuals, whereas early blind participants seemed to rely mostly on a somatotopic representation to localize nociceptive inputs. Accordingly, the plasticity of the nociceptive system would depend not purely on bodily experiences but also on crossmodal interactions between nociception and vision during early sensory experience.

