Cross-Modality and Embodiment of Tempo and Timing

2021 ◽  
pp. 214-234
Author(s):  
Renee Timmers

This chapter explores the insights that research into cross-modal correspondences and multisensory integration offers for our understanding and investigation of tempo and timing in music performance. As tempo and timing are generated through action, actions and sensory modalities are coupled in performance and form a multimodal unit of intention. This coupled intention is likely to demonstrate characteristics of cross-modal correspondences, linking movement and sound. Research into cross-modal correspondences offers testable predictions that have so far mainly been confirmed in controlled perceptual experiments. For example, fast tempo is predicted to be linked to smaller movement that is higher in space. Confirmation in the context of performance is complicated by interacting associations with intentions related to, for example, dynamics and energy, which can be addressed through appropriate experimental manipulation. This avenue of research highlights the close association between action and cross-modality, conceiving action as a source of cross-modal correspondences as well as indicating the cross-modal basis of actions. For timing and tempo concepts, action and cross-modality offer concrete and embodied modalities of expression.

2019 ◽  
Vol 9 (1) ◽  
Author(s):  
Sabrina Laura López ◽  
Rodrigo Laje

Abstract Paced finger tapping is a sensorimotor synchronization task where a subject has to keep pace with a metronome while the time differences (asynchronies) between each stimulus and its response are recorded. A usual way to study the underlying error correction mechanism is to perform unexpected temporal perturbations to the stimuli sequence. An overlooked issue is that at the moment of a temporal perturbation two things change: the stimuli period (a parameter) and the asynchrony (a variable). In terms of experimental manipulation, it would be desirable to have separate, independent control of parameter and variable values. In this work we perform paced finger tapping experiments combining simple temporal perturbations (tempo step change) and spatial perturbations with temporal effect (raised or lowered point of contact). In this way we decouple the parameter-and-variable confounding, performing novel perturbations where either the parameter or the variable changes. Our results show nonlinear features like asymmetry and are compatible with a common error correction mechanism for all types of asynchronies. We suggest taking this confounding into account when analyzing perturbations of any kind in finger tapping tasks but also in other areas of sensorimotor synchronization, like music performance experiments and paced walking in gait coordination studies.
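The confounding the authors describe is easy to see in a minimal linear phase-correction simulation (a standard textbook model, not the authors' own; the correction gain and all timing values below are illustrative assumptions). At a tempo step, the stimulus period (parameter) and the asynchrony (variable) change on the same event:

```python
def simulate_tempo_step(alpha=0.5, n_taps=30, step_at=10,
                        period=500.0, new_period=550.0):
    """Linear phase correction: each tap is timed from the last
    observed inter-stimulus interval, minus a fraction (alpha)
    of the previous asynchrony. Times are in milliseconds."""
    stim = [0.0]
    for i in range(1, n_taps):
        stim.append(stim[-1] + (period if i <= step_at else new_period))
    taps, asyn = [0.0], [0.0]
    for i in range(1, n_taps):
        last_isi = stim[i - 1] - stim[i - 2] if i >= 2 else period
        taps.append(taps[-1] + last_isi - alpha * asyn[-1])
        asyn.append(taps[-1] - stim[i])
    return asyn

asyn = simulate_tempo_step()
# The tap right after the step is early by the full period change (-50 ms),
# then the asynchrony decays geometrically back toward zero.
```

In this sketch the period jump forces an asynchrony of the same size, which is exactly why a tempo step cannot manipulate the parameter and the variable independently.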



Author(s):  
Hans Colonius ◽  
Adele Diederich

The notion of copula has attracted attention from the field of contextuality and probability. A copula is a function that joins a multivariate distribution to its one-dimensional marginal distributions. Thereby, it allows characterizing the multivariate dependency separately from the specific choice of margins. Here, we demonstrate the use of copulas by investigating the structure of dependency between processing stages in a stochastic model of multisensory integration, which describes the effect of stimulation by several sensory modalities on human reaction times. We derive explicit terms for the covariance and Kendall's tau between the processing stages and point out the specific role played by two stochastic order relations, the usual stochastic order and the likelihood ratio order, in determining the sign of dependency. This article is part of the theme issue ‘Contextuality and probability in quantum mechanics and beyond’.
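As a concrete illustration of the copula idea (a generic sketch, not the model in the article; the Clayton family and the parameter value are illustrative choices): samples from a Clayton copula have uniform margins but tunable dependence, and their Kendall's tau equals θ/(θ+2) regardless of which margins are later attached.

```python
import random

def sample_clayton(theta, n, seed=0):
    """Draw n pairs (u, v) from a Clayton copula by conditional sampling.
    Both margins are Uniform(0, 1); theta > 0 sets the dependence strength."""
    rng = random.Random(seed)
    pairs = []
    for _ in range(n):
        u, t = rng.random(), rng.random()
        v = (u ** (-theta) * (t ** (-theta / (1.0 + theta)) - 1.0) + 1.0) ** (-1.0 / theta)
        pairs.append((u, v))
    return pairs

def kendall_tau(pairs):
    """Empirical Kendall's tau: concordant minus discordant pairs, normalized."""
    n, s = len(pairs), 0
    for i in range(n):
        for j in range(i + 1, n):
            s += 1 if (pairs[i][0] - pairs[j][0]) * (pairs[i][1] - pairs[j][1]) > 0 else -1
    return 2.0 * s / (n * (n - 1))

# For theta = 2 the theoretical Kendall's tau is theta / (theta + 2) = 0.5.
tau = kendall_tau(sample_clayton(theta=2.0, n=1500))
```

The empirical tau approaches the theoretical value as n grows, which is the sense in which the copula carries the dependency structure separately from the margins.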


2020 ◽  
Vol 71 (1) ◽  
pp. 193-219 ◽  
Author(s):  
Mark T. Wallace ◽  
Tiffany G. Woynaroski ◽  
Ryan A. Stevenson

During our everyday lives, we are confronted with a vast amount of information from several sensory modalities. This multisensory information needs to be appropriately integrated for us to effectively engage with and learn from our world. Research carried out over the last half century has provided new insights into the way such multisensory processing improves human performance and perception; the neurophysiological foundations of multisensory function; the time course for its development; how multisensory abilities differ in clinical populations; and, most recently, the links between multisensory processing and cognitive abilities. This review summarizes the extant literature on multisensory function in typical and atypical circumstances, discusses the implications of the work carried out to date for theory and research, and points toward next steps for advancing the field.


2008 ◽  
Vol 31 (3) ◽  
pp. 335-336 ◽  
Author(s):  
Andrew J. Bremner ◽  
Charles Spence

Abstract Mareschal and his colleagues argue that cognition consists of partial representations emerging from organismic constraints placed on information processing through development. However, any notion of constraints must consider multiple sensory modalities, and their gradual integration across development. Multisensory integration constitutes one important way in which developmental constraints may lead to enriched representations that serve more than immediate behavioural goals.


Author(s):  
Lana Kühle

This chapter considers how we might understand the effect that emotions have on the justification of our perceptual beliefs about the world, beliefs that we acquire from a variety of sensory modalities—audition, gustation, olfaction, and so on. The chapter takes the problem to be associated with one of two forms of perceptual influence: penetration or multisensory integration. In any given perceptual moment there are multiple sensory modalities and mental states at play, each affecting the overall experience. Whether we understand the influence of emotion on perception as a form of non-perceptual penetration or a form of non-visual sensory perception of the inner body—interoception—the potential epistemological difficulties remain: How can we be said to acquire justified beliefs and knowledge on the basis of such influenced perceptual experience? As has been the norm, only the five exteroceptive senses of vision, audition, olfaction, taste, and touch are typically discussed in the context of sensory perception. However, as this chapter argues, there is strong reason to accept the claim that emotional experience is a form of interoception, and that interoception ought to be considered when discussing sensory perception. In this way, then, the chapter proposes that clarifying the role played by interoception in sense perception across modalities is necessary if we are to make progress on the epistemological problems at hand.


2019 ◽  
Vol 31 (1) ◽  
pp. 184-193
Author(s):  
Nicole E Munoz ◽  
Daniel T Blumstein

Abstract Animals are often confronted with potentially informative stimuli from a variety of sensory modalities. Although there is a large proximate literature demonstrating multisensory integration, no general framework explains why animals integrate. We developed and tested a quantitative model that explains why multisensory integration is not always adaptive and explains why unimodal decision-making might be favored over multisensory integration. We present our model in terms of a prey that must determine the presence or absence of a predator. A greater chance of encountering a predator, a greater benefit of correctly responding to a predator, a lower benefit of correctly foraging, or a greater uncertainty of the second stimulus favors integration. Uncertainty of the first stimulus may either increase or decrease the favorability of integration. In three field studies, we demonstrate how our model can be empirically tested. We evaluated the model with field studies of yellow-bellied marmots (Marmota flaviventer) by presenting marmots with an olfactory-acoustic predator stimulus at a feed station. We found some support for the model's prediction that integration is favored when the second stimulus is less noisy. We hope additional predictions of the model will guide future empirical work that seeks to understand the extent to which multimodal integration might be situation dependent. We suggest that the model is generalizable beyond antipredator contexts and can be applied within or between individuals, populations, or species.

Multisensory integration is often studied from a very proximate view that simply describes the process of integration. We developed a model, the first of its kind, to investigate the situations under which multisensory integration is adaptive. We empirically evaluated the model by investigating the conditions under which yellow-bellied marmots integrated predatory scents and sounds. We found that integration can depend on an animal's situation at a given point in time.
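The trade-offs listed in the abstract can be made concrete with a simple value-of-information sketch (an illustrative Bayesian formulation, not the authors' published model; all payoffs and cue reliabilities below are invented numbers). Attending to a second cue pays off only when it can flip the decision the first cue alone would dictate:

```python
def value_of_second_cue(prior, b_predator, b_forage, hit, false_alarm):
    """Expected gain from observing a binary second cue before choosing
    between fleeing (pays b_predator if a predator is present) and
    foraging (pays b_forage if it is absent).
    prior: P(predator) after the first stimulus.
    hit / false_alarm: P(cue fires | predator) / P(cue fires | no predator)."""
    best_without = max(prior * b_predator, (1 - prior) * b_forage)
    p_fire = prior * hit + (1 - prior) * false_alarm
    best_with = 0.0
    for p_cue, p_pred in ((p_fire, prior * hit / p_fire),
                          (1 - p_fire, prior * (1 - hit) / (1 - p_fire))):
        best_with += p_cue * max(p_pred * b_predator,
                                 (1 - p_pred) * b_forage)
    return best_with - best_without  # >= 0; integrate only if it beats the cost

# A reliable second cue (hit=0.9, fa=0.1) is worth attending to; a noisy one
# (hit=0.6, fa=0.4) can add nothing, so a unimodal decision is favored.
reliable = value_of_second_cue(0.3, 5.0, 1.0, 0.9, 0.1)
noisy = value_of_second_cue(0.3, 5.0, 1.0, 0.6, 0.4)
```

Once attending to the second cue carries any cost, integration is adaptive only when this value of information exceeds that cost, which is one way to read the abstract's claim that integration is not always adaptive.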


2015 ◽  
Vol 28 (1-2) ◽  
pp. 71-99 ◽  
Author(s):  
Monica Gori

During the first years of life, the sensory modalities communicate with each other. This process is fundamental for the development of unisensory and multisensory skills, and the absence of one sensory input impacts the development of the other modalities. Since 2008 we have studied these aspects and developed our cross-sensory calibration theory. The theory emerged from the observation that children start to integrate multisensory information (such as vision and touch) only after 8–10 years of age. Before this age the more accurate sense teaches (calibrates) the others; when a calibrating modality is missing, the other modalities are impaired. Children with visual disability have problems understanding the haptic or auditory perception of space, and children with motor disabilities have problems understanding the visual dimension of objects. This review presents our recent studies on multisensory integration and cross-sensory calibration in children and adults with and without sensory and motor disabilities. The goal of this review is to show the importance of interaction between sensory systems during the early period of life in order for correct perceptual development to occur.
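The adult endpoint of this development is usually benchmarked against the standard maximum-likelihood (inverse-variance-weighted) cue-combination rule, which mature observers approximate but children under 8–10 do not (a generic textbook formula, not code from the review; the numbers are illustrative):

```python
def mle_combine(est_a, var_a, est_b, var_b):
    """Optimal fusion of two noisy estimates of the same quantity:
    weight each cue by its reliability (1 / variance). The fused
    variance is always below that of either single cue."""
    w_a = (1.0 / var_a) / (1.0 / var_a + 1.0 / var_b)
    fused = w_a * est_a + (1.0 - w_a) * est_b
    fused_var = 1.0 / (1.0 / var_a + 1.0 / var_b)
    return fused, fused_var

# Vision says the bar is 10.0 cm (variance 1.0), touch says 12.0 cm (variance 4.0):
size, var = mle_combine(10.0, 1.0, 12.0, 4.0)
# The fused estimate (10.4) sits nearer the more reliable cue, with variance 0.8.
```

Cross-sensory calibration, by contrast, is the pre-integration regime in which one modality simply sets the standard for another rather than being averaged with it.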


2012 ◽  
Vol 20 (1) ◽  
pp. 135-167 ◽  
Author(s):  
Irene Ronga ◽  
Carla Bazzanella ◽  
Ferdinando Rossi ◽  
Giandomenico Iannetti

Recent studies on cortical processing of sensory information highlight the importance of multisensory integration, and define precise rules governing reciprocal influences between inputs of different sensory modalities. We propose that psychophysical interactions between different types of sensory stimuli and linguistic synaesthesia share common origins and mechanisms. To test this hypothesis, we compare neurophysiological findings with corpus-based analyses relating to linguistic synaesthesia. Namely, we present Williams’ hypothesis and its recent developments about the hierarchy of synaesthetic pairings, and examine critical aspects of this theory concerning universality, directionality, sensory categories, and usage of corpora. These theoretical issues are verified against linguistic data derived from corpus-based analyses of Italian synaesthetic pairings related to auditory and tactile modalities. Our findings reveal a strong parallel between linguistic synaesthesia and neurophysiological interactions between different sensory stimuli, suggesting that linguistic synaesthesia is affected by tendencies similar to the rules underlying the perceptual association of distinct sensory modalities.
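Williams' directionality claim can be stated operationally. The sketch below flattens his original transfer diagram into a single linear order (touch < taste < smell < dimension < colour < sound), a simplification of the branching original, and checks whether a synaesthetic pairing runs "uphill" as the hierarchy predicts:

```python
# Simplified linear version of Williams' (1976) hierarchy; the original
# proposal is a branching diagram, so treat this ordering as a rough proxy.
HIERARCHY = ["touch", "taste", "smell", "dimension", "colour", "sound"]

def conforms_to_williams(source, target):
    """True if a synaesthetic transfer runs from a lower to a higher
    modality, e.g. 'sweet music' (taste -> sound) conforms, while
    'loud taste' (sound -> taste) does not."""
    return HIERARCHY.index(source) < HIERARCHY.index(target)
```

For instance, conforms_to_williams("touch", "sound") is True, while the reverse pairing is predicted to be rare in corpora.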


2012 ◽  
Vol 25 (0) ◽  
pp. 127 ◽  
Author(s):  
Dragan Jankovic

Crossmodal correspondences have been widely demonstrated, although the mechanisms behind the phenomenon have not yet been fully established. According to the evaluative similarity hypothesis, crossmodal correspondences are influenced by the evaluative (affective) similarity of stimuli from different sensory modalities (Jankovic, 2010, Journal of Vision 10(7), 859). On this view, detection of similar evaluative information in stimulation from different sensory modalities facilitates crossmodal correspondences and multisensory integration. The aim of this study was to test the evaluative similarity hypothesis of crossmodal correspondences in children. In Experiment 1, two groups of participants (nine- and thirteen-year-olds) were asked to make explicit matches between presented auditory stimuli (1 s long sound clips) and abstract visual patterns. In Experiment 2, the same participants judged the abstract visual patterns and auditory stimuli on a set of evaluative attributes measuring affective valence and arousal. The results showed that crossmodal correspondences are mostly influenced by the evaluative similarity of visual and auditory stimuli in both age groups. The most frequently matched were visual and auditory stimuli congruent in both valence and arousal, followed by stimuli congruent in valence, and finally stimuli congruent in arousal. Evaluatively incongruent stimuli showed low crossmodal association, especially in the older group.
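The reported ordering of matches (congruent in both valence and arousal > valence only > arousal only > incongruent) is what a distance rule in affect space would produce if valence is weighted more heavily than arousal. A hypothetical scoring sketch (the weights are invented, not estimated from the study):

```python
def evaluative_distance(a, b, w_valence=2.0, w_arousal=1.0):
    """Weighted city-block distance between two stimuli described by
    (valence, arousal) coordinates; a smaller distance predicts a more
    likely crossmodal match. The heavier valence weight is an assumed
    value chosen to reproduce the reported ordering."""
    return w_valence * abs(a[0] - b[0]) + w_arousal * abs(a[1] - b[1])

sound = (1.0, 1.0)  # a pleasant, arousing sound clip
d_both = evaluative_distance(sound, (1.0, 1.0))      # congruent in both
d_valence = evaluative_distance(sound, (1.0, -1.0))  # valence only
d_arousal = evaluative_distance(sound, (-1.0, 1.0))  # arousal only
d_neither = evaluative_distance(sound, (-1.0, -1.0))
# Predicted match likelihood: d_both < d_valence < d_arousal < d_neither.
```

Under this rule the incongruent pair is the worst match, mirroring the low crossmodal association the study reports for evaluatively incongruent stimuli.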

