Crossmodal Correspondences: Standing Issues and Experimental Guidelines

2016 ◽  
Vol 29 (1-3) ◽  
pp. 7-28 ◽  
Author(s):  
Cesare V. Parise

Crossmodal correspondences refer to the systematic associations often found across seemingly unrelated sensory features from different sensory modalities. Such phenomena constitute a universal trait of multisensory perception even in non-human species, and seem to result, at least in part, from the adaptation of sensory systems to natural scene statistics. Despite recent developments in the study of crossmodal correspondences, there are still a number of standing questions about their definition, their origins, their plasticity, and their underlying computational mechanisms. In this paper, I will review such questions in the light of current research on sensory cue integration, where crossmodal correspondences can be conceptualized in terms of natural mappings across different sensory cues that are present in the environment and learnt by the sensory systems. Finally, I will provide some practical guidelines for the design of experiments that might shed new light on crossmodal correspondences.

2020 ◽  
Author(s):  
Ryan Joseph Ward ◽  
Sophie Wuerger ◽  
Alan Marshall

Crossmodal correspondences are the associations between apparently distinct stimuli in different sensory modalities. These associations, albeit surprising, are generally shared across most of the population. Olfaction is ingrained in the fabric of our daily life and constitutes an integral part of our perceptual reality; as olfaction becomes more commonly used in the entertainment and analytical domains, it is crucial to uncover the robust correspondences underlying common aromatic compounds. Towards this end, we investigated an aggregate of crossmodal correspondences between ten olfactory stimuli and other modalities (angularity of shapes, smoothness of texture, pleasantness, pitch, colours, musical genres and emotional dimensions) using a large sample of 68 observers. We uncover the correspondences between these modalities and the extent of these associations with respect to explicit knowledge of the respective aromatic compound. The results confirm the robustness of prior studies and contribute new evidence on olfactory integration across an aggregate of other dimensions. Comparing ratings made with and without knowledge of an odour's identity indicates that these associations are, for the most part, relatively robust and do not rely on explicit knowledge of the odour. Through principal component analysis of the perceptual ratings, new crossmodal mediations have been uncovered between odours and their intercorrelated sensory dimensions. Our results demonstrate a collection of associations between olfaction and other dimensions, potential crossmodal mediations via exploratory factor analysis, and the robustness of these correspondences with respect to explicit knowledge of an odour. We anticipate that the findings reported in this paper could be used as a psychophysical framework aiding applications ranging from olfaction-enhanced multimedia to marketing.
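The principal component analysis mentioned above can be sketched in a few lines. This is an illustration only: the ratings matrix, its dimensions, and the dimension names are hypothetical, not the study's data; PCA is run on the correlation matrix (z-scored ratings) so dimensions with different rating scales contribute equally.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical ratings: 68 observers x 5 perceptual dimensions
# (e.g., angularity, smoothness, pleasantness, pitch, brightness).
n_obs, n_dims = 68, 5
ratings = rng.normal(size=(n_obs, n_dims))

# Standardize each dimension (z-scores), so PCA runs on the correlation matrix.
z = (ratings - ratings.mean(axis=0)) / ratings.std(axis=0)

# Eigendecomposition of the correlation matrix gives the principal components.
corr = np.corrcoef(z, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)
order = np.argsort(eigvals)[::-1]      # sort components by explained variance
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

explained = eigvals / eigvals.sum()    # proportion of variance per component
scores = z @ eigvecs                   # observer scores on each component
```

Dimensions that load strongly on the same component are the "intercorrelated sensory dimensions" through which crossmodal mediations would be read off.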


2014 ◽  
Vol 369 (1635) ◽  
pp. 20120512 ◽  
Author(s):  
Rebecca Knight ◽  
Caitlin E. Piette ◽  
Hector Page ◽  
Daniel Walters ◽  
Elizabeth Marozzi ◽  
...  

How the brain combines information from different sensory modalities and of differing reliability is an important and still-unanswered question. Using the head direction (HD) system as a model, we explored the resolution of conflicts between landmarks and background cues. Sensory cue integration models predict averaging of the two cues, whereas attractor models predict capture of the signal by the dominant cue. We found that a visual landmark mostly captured the HD signal at low conflicts; however, there was an increasing propensity for the cells to integrate the cues thereafter. A large conflict presented to naive rats resulted in greater visual cue capture (less integration) than in experienced rats, revealing an effect of experience. We propose that weighted cue integration in HD cells arises from dynamic plasticity of the feed-forward inputs to the network, causing within-trial spatial redistribution of the visual inputs onto the ring. This suggests that an attractor network can implement decision processes about cue reliability using simple architecture and learning rules, thus providing a potential neural substrate for weighted cue integration.
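The "averaging" prediction of cue integration models is standardly formalized as reliability-weighted (maximum-likelihood) averaging, with each cue weighted by its inverse variance. A minimal sketch, with illustrative numbers not taken from the paper:

```python
# Reliability-weighted cue averaging: each cue contributes in proportion
# to its inverse variance (1/sigma^2). All values are illustrative.
def integrate(landmark_deg, background_deg, sigma_landmark, sigma_background):
    w_l = 1.0 / sigma_landmark ** 2
    w_b = 1.0 / sigma_background ** 2
    return (w_l * landmark_deg + w_b * background_deg) / (w_l + w_b)

# A 45-degree conflict between equally reliable cues is split evenly...
print(integrate(0.0, 45.0, 10.0, 10.0))   # -> 22.5
# ...whereas a much more reliable landmark pulls the estimate toward itself,
# approximating the "capture" regime predicted by attractor models.
print(integrate(0.0, 45.0, 2.0, 20.0))    # -> ~0.45 (close to the landmark)
```

The paper's finding, on this reading, is that the HD system sits between the two limits: near-capture at small conflicts, shifting toward weighted averaging as conflict grows.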


2012 ◽  
Vol 25 (0) ◽  
pp. 44
Author(s):  
Valeria Occelli ◽  
Gianluca Esposito ◽  
Paola Venuti ◽  
Peter Walker ◽  
Massimiliano Zampini

The label ‘crossmodal correspondences’ has been used to define the nonarbitrary associations that appear to exist between different basic physical stimulus attributes in different sensory modalities. For instance, it has been consistently shown in the neurotypical population that higher pitched sounds are more frequently matched with visual patterns which are brighter, smaller, and sharper than those associated with lower pitched sounds. Some evidence suggests that patients with ASDs tend not to show this crossmodal preferential association pattern (e.g., curvilinear shapes and labial/lingual consonants vs. rectilinear shapes and plosive consonants). In the present study, we compared the performance of children with ASDs (6–15 years) and matched neurotypical controls in a non-verbal crossmodal correspondence task. The participants were asked to indicate which of two bouncing visual patterns was making a centrally located sound. In intermixed trials, the visual patterns varied in either size, surface brightness, or shape, whereas the sound varied in pitch. The results showed that, whereas the neurotypical controls reliably matched the higher pitched sound to a smaller and brighter visual pattern, the performance of participants with ASDs was at chance level. In the condition where the visual patterns differed in shape, no inter-group difference was observed. Children’s matching performance cannot be attributed to intensity matching or to difficulties in understanding the instructions, both of which were controlled for. These data suggest that the tendency to associate congruent visual and auditory features varies as a function of the presence of ASDs, possibly pointing to a poorer capability to integrate auditory and visual inputs in this population.


Neuroscience ◽  
2019 ◽  
Vol 408 ◽  
pp. 378-387 ◽  
Author(s):  
Qadeer Arshad ◽  
Marta Casanovas Ortega ◽  
Usman Goga ◽  
Rhannon Lobo ◽  
Shuaib Siddiqui ◽  
...  

2019 ◽  
Vol 32 (3) ◽  
pp. 235-265 ◽  
Author(s):  
Charles Spence

This review deals with the question of the relative vs absolute nature of crossmodal correspondences, with a specific focus on those correspondences involving the auditory dimension of pitch. Crossmodal correspondences have been defined as the often-surprising crossmodal associations that people experience between features, attributes, or dimensions of experience in different sensory modalities, when either physically present, or else merely imagined. In the literature, crossmodal correspondences have often been contrasted with synaesthesia in that the former are frequently said to be relative phenomena (e.g., it is the higher-pitched of two sounds that is matched with the smaller of two visual stimuli, say, rather than there being a specific one-to-one crossmodal mapping between a particular pitch of sound and size of object). By contrast, in the case of synaesthesia, the idiosyncratic mapping between inducer and concurrent tends to be absolute (e.g., it is a particular sonic inducer that elicits a specific colour concurrent). However, a closer analysis of the literature soon reveals that the distinction between relative and absolute in the case of crossmodal correspondences may not be as clear-cut as some commentators would have us believe. Furthermore, it is important to note that the relative vs absolute question may receive different answers depending on the particular (class of) correspondence under empirical investigation.


2013 ◽  
Vol 14 (6) ◽  
pp. 429-442 ◽  
Author(s):  
Christopher R. Fetsch ◽  
Gregory C. DeAngelis ◽  
Dora E. Angelaki

Perception ◽  
2013 ◽  
Vol 42 (4) ◽  
pp. 477-479
Author(s):  
Paul Barry Hibbard

2019 ◽  
Author(s):  
Kenneth W. Latimer ◽  
Dylan Barbera ◽  
Michael Sokoletsky ◽  
Bshara Awwad ◽  
Yonaton Katz ◽  
...  

Sensory systems encounter remarkably diverse stimuli in the external environment. Natural stimuli exhibit timescales and amplitudes of variation that span a wide range. Mechanisms of adaptation, a ubiquitous feature of sensory systems, allow for the accommodation of this range of scales. Are there common rules of adaptation across different sensory modalities? We measured the membrane potential responses of individual neurons in the visual, somatosensory and auditory cortices to discrete, punctate stimuli delivered at a wide range of fixed and nonfixed frequencies. We find that the adaptive profile of the response is largely preserved across these three areas, exhibiting attenuation and responses to the cessation of stimulation, which are signatures of responses to changes in stimulus statistics. We demonstrate that these adaptive responses can emerge from a simple model based on the integration of fixed filters operating over multiple timescales.
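The fixed-filters idea can be illustrated with a toy model (an illustration of the concept, not the paper's fitted model; the time constants and gain are made up): the response is the stimulus minus exponentially filtered copies of its own history, one filter per timescale.

```python
# Toy adaptation model: response = drive minus adaptation states that are
# fixed exponential filters of the stimulus at several time constants.
# All parameters (taus, gain) are illustrative.
def adapt(stimulus, taus=(5.0, 50.0), gain=0.5):
    states = [0.0] * len(taus)
    response = []
    for s in stimulus:
        # Drive minus adaptation accumulated at each timescale
        r = s - gain * sum(states)
        response.append(r)
        # Each filter integrates the stimulus with its own time constant
        states = [x + (s - x) / tau for x, tau in zip(states, taus)]
    return response

# Sustained drive followed by silence: the response attenuates as the slow
# filters charge, then dips below baseline when stimulation ceases, giving
# an offset ("cessation") response from purely fixed filters.
stim = [1.0] * 100 + [0.0] * 50
resp = adapt(stim)
```

The attenuation and the below-baseline offset response both fall out of the same fixed machinery, which is the sense in which such a model signals changes in stimulus statistics.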


2018 ◽  
pp. 48-59
Author(s):  
Marta González-Colominas

Materials can be considered the interface of a product as they mediate between user, environment and object (Karana, Pedgley and Rognoli 2014). They characterize the physical world and generate a continuous flow of sensory interactions. In this age of mass production, engineers and designers are in a unique position to use the opportunities presented by materials development and apply them in creative ways to trigger meaningful user experiences. Dynamism is considered a very promising material experience in terms of creating meaningful interactions, and, consequently, user attachment to a product (Rognoli, Ferrara and Arquilla 2016). Dynamic products are those that show sensory features that change over time in a proactive and reversible way, activating one or more of the user's sensory modalities and aiming at enhancing the user's experience (Colombo 2016). Smart materials could be considered the most suitable candidates to provide dynamic experiences. They react to external stimuli, such as pressure, temperature or electric fields, changing properties such as shape or colour. They are capable of both sensing and responding to the environment, as well as exerting active control of their responses (Addington and Schodek 2004). Compared to traditional materials, smart materials involve additional technical complexity. The aim of this paper is to share how the Material Driven Design (MDD) method (Karana et al. 2015) has been applied and to analyse a set of 10 projects, grouped into 5 case studies, developed by students from ELISAVA over the last 3 years, in order to improve ways to implement the method. We have analysed the case studies in terms of the changes observed in the sensory features, using a sensory map proposed by Sara Colombo (Colombo 2016). By comparing different projects, the paper shows how sensory aspects are evoked by different smart material properties.
The 5 case studies have integrated the smart materials into functional prototypes for different application sectors, such as healthcare, energy harvesting or fashion. We have found that only three sensory modalities (sound, sight and touch) were involved in the user experience, with sight being the most predominant sensory perception. This study aims to serve as a springboard for other scholars interested in designing dynamic products with smart materials.

