Crossmodal metaperception: Visual and tactile confidence share a common scale

2021 ◽  
Author(s):  
Lena Klever ◽  
Marie Mosebach ◽  
Katja Fiehler ◽  
Pascal Mamassian ◽  
Jutta Billino

Perceptual decisions are typically accompanied by a subjective sense of (un)certainty. There is robust evidence that observers have access to a reliable estimate of their own uncertainty and can judge the validity of their perceptual decisions. However, it is still debated to what extent these meta-perceptual judgements rest on a common mechanism that can monitor perceptual decisions across different sensory modalities. It has been suggested that perceptual confidence can be evaluated on an abstract scale that is not only task-independent but also modality-independent. We aimed to scrutinize these findings by measuring visual contrast and tactile vibration discrimination thresholds in a confidence forced-choice task. A total of 56 participants took part in our study. We determined thresholds for trials in which perceptual decisions were chosen as confident and for those declined as confident. Confidence comparisons were made between perceptual decisions either within a single modality, visual or tactile, or across both modalities. Furthermore, we assessed executive functions to explore a possible link between cognitive control and meta-perceptual capacities. We found that perceptual performance was a good predictor of confidence judgments and that the threshold modulation was similarly pronounced in both modalities. Most importantly, participants compared their perceptual confidence across visual and tactile decisions with the same precision as within a single modality. Cognitive control capacities were not related to meta-perceptual performance. In conclusion, our findings corroborate that perceptual uncertainty can be accessed on an abstract scale, allowing for confidence comparisons across sensory modalities.
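A minimal sketch (in Python, with simulated data) of the kind of threshold comparison described above: a cumulative-Gaussian psychometric function is fitted separately to trials chosen as confident and to trials declined as confident, and the two thresholds are compared. The intensity levels, trial counts, and function names are illustrative assumptions, not the authors' analysis code.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

rng = np.random.default_rng(0)

def psychometric(x, threshold, slope):
    # Proportion correct in a discrimination task, rising from 0.5 to 1.0.
    return 0.5 + 0.5 * norm.cdf(x, loc=threshold, scale=slope)

# Simulated stimulus intensities (e.g., contrast differences); values are invented.
intensities = np.linspace(0.02, 0.20, 8)

def simulate(true_threshold, n_trials=40):
    p = psychometric(intensities, true_threshold, 0.03)
    return rng.binomial(n_trials, p) / n_trials

prop_chosen = simulate(0.06)    # trials chosen as confident: lower threshold
prop_declined = simulate(0.10)  # trials declined as confident: higher threshold

def fit_threshold(proportions):
    params, _ = curve_fit(psychometric, intensities, proportions,
                          p0=[0.08, 0.05], bounds=([0.0, 1e-3], [0.3, 0.2]))
    return params[0]

print("threshold, chosen confident:  ", round(fit_threshold(prop_chosen), 3))
print("threshold, declined confident:", round(fit_threshold(prop_declined), 3))
```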


2010 ◽  
Vol 41 (3) ◽  
pp. 122-126 ◽  
Author(s):  
Czeslaw Nosal

The structure and regulative function of the cognitive styles: a new theory. The organization of human cognitive styles can be described as a kind of functional system, or holon. Within this framework it is possible to propose a new theoretical basis for classifying the primary cognitive styles. The fundamental thesis is that all styles share one common mechanism for forming and scanning the perceptual and memory field induced by the situation, and that differences in how field scanning and code interference are carried out depend on the range of conceptual equivalence and on the cognitive control of behavior. A functional description of the basic set of cognitive styles must take into account three elements of a chain: neurobiological modules → organization of the cognitive holon → behavioral manifestation of styles.


2015 ◽  
Vol 77 (4) ◽  
pp. 1295-1306 ◽  
Author(s):  
Ai Koizumi ◽  
Brian Maniscalco ◽  
Hakwan Lau

2020 ◽  
Vol 82 (8) ◽  
pp. 4058-4083 ◽  
Author(s):  
Marie Chancel ◽  
H. Henrik Ehrsson

Abstract The experience of one’s body as one’s own is referred to as the sense of body ownership. This central part of human conscious experience determines the boundary between the self and the external environment, a crucial distinction in perception, action, and cognition. Although body ownership is known to involve the integration of signals from multiple sensory modalities, including vision, touch, and proprioception, little is known about the principles that determine this integration process, and the relationship between body ownership and perception is unclear. These uncertainties stem from the lack of a sensitive and rigorous method to quantify body ownership. Here, we describe a two-alternative forced-choice discrimination task that allows precise and direct measurement of body ownership as participants decide which of two rubber hands feels more like their own in a version of the rubber hand illusion. In two experiments, we show that the temporal and spatial congruence principles of multisensory stimulation, which determine ownership discrimination, impose tighter constraints than previously thought and that texture congruence constitutes an additional principle; these findings are compatible with theoretical models of multisensory integration. Taken together, our results suggest that body ownership constitutes a genuine perceptual multisensory phenomenon that can be quantified with psychophysics in discrimination experiments.
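A minimal sketch of how such two-alternative forced-choice ownership judgements can be quantified psychophysically: the proportion of trials on which the test rubber hand is chosen as feeling more like one's own is modelled as a cumulative Gaussian over the asynchrony difference between the two hands, yielding a point of subjective equality (PSE) and a discrimination slope. The asynchrony levels and choice counts below are invented for illustration and do not reproduce the authors' data or fitting procedure.

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize

# Asynchrony difference between the two rubber hands (reference minus test, ms):
# positive values mean the test hand was the more synchronous one.
delta = np.array([-300, -150, -50, 0, 50, 150, 300], dtype=float)
n_trials = np.full(7, 20)
n_chosen = np.array([2, 4, 7, 10, 13, 16, 18])  # "test hand feels more like mine"

def neg_log_likelihood(params):
    pse, sigma = params
    sigma = max(abs(sigma), 1e-3)
    p = norm.cdf((delta - pse) / sigma)       # probability of choosing the test hand
    p = np.clip(p, 1e-6, 1 - 1e-6)
    return -np.sum(n_chosen * np.log(p) + (n_trials - n_chosen) * np.log(1 - p))

fit = minimize(neg_log_likelihood, x0=[0.0, 100.0], method="Nelder-Mead")
pse, sigma = fit.x[0], abs(fit.x[1])
print(f"PSE = {pse:.1f} ms, slope (sigma) = {sigma:.1f} ms")
```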


2021 ◽  
Author(s):  
Lena Klever ◽  
Pascal Mamassian ◽  
Jutta Billino

Visual perception is not only shaped by sensitivity, but also by confidence, i.e., the ability to estimate the accuracy of a visual decision. There is robust evidence that younger observers have access to a reliable measure of their own uncertainty when making visual decisions. This metacognitive ability might be challenged during aging due to increasing sensory noise and decreasing cognitive control resources. We investigated age effects on visual confidence using a confidence forced-choice paradigm. We determined discrimination thresholds for trials in which perceptual judgements were indicated as confident and for those in which they were declined as confident. Younger adults (19-38 years) showed significantly lower discrimination thresholds than older adults (60-78 years). In both age groups, perceptual performance was linked to confidence judgements, but overall results suggest reduced confidence efficiency in older adults. However, we observed substantial variability of confidence effects across all participants. This variability was closely linked to individual differences in cognitive control capacities, i.e., executive functions. Our findings provide evidence for age-related differences in meta-perceptual efficiency that present a specific challenge to perceptual performance in old age. We propose that these age effects are primarily mediated by cognitive control resources, supporting their crucial role for metacognitive efficiency.
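One simple, generic way to summarise the threshold modulation per participant and compare it between age groups is a normalised difference between thresholds on declined-confident and chosen-confident trials. The index and the simulated values below are assumptions for illustration only; they are not the confidence-efficiency measure reported in the study.

```python
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(1)

def modulation_index(threshold_declined, threshold_chosen):
    # Positive values: lower thresholds on trials chosen as confident.
    return (threshold_declined - threshold_chosen) / (threshold_declined + threshold_chosen)

# Simulated per-participant discrimination thresholds (arbitrary contrast units).
young_chosen = rng.normal(0.05, 0.010, 25)
young_declined = rng.normal(0.08, 0.015, 25)
older_chosen = rng.normal(0.09, 0.020, 25)
older_declined = rng.normal(0.11, 0.020, 25)

young_idx = modulation_index(young_declined, young_chosen)
older_idx = modulation_index(older_declined, older_chosen)

t, p = ttest_ind(young_idx, older_idx)
print(f"mean index: young = {young_idx.mean():.3f}, older = {older_idx.mean():.3f}")
print(f"t = {t:.2f}, p = {p:.3f}")
```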


2019 ◽  
Author(s):  
Jan R. Wessel ◽  
David E. Huber

Abstract. The brain constantly generates predictions about the environment to guide action. Unexpected events lead to surprise and can necessitate the modification of ongoing behavior. Surprise can occur for any sensory domain, but it is not clear how these separate surprise signals are integrated to affect motor output. By applying a trial-to-trial Bayesian surprise model to human electroencephalography data recorded during a cross-modal oddball task, we tested whether there are separate predictive models for different sensory modalities (visual, auditory), or whether expectations are integrated across modalities such that surprise in one modality decreases surprise for a subsequent unexpected event in the other modality. We found that while surprise was represented in a common frontal signature across sensory modalities (the fronto-central P3 event-related potential), the single-trial amplitudes of this signature more closely conformed to a model with separate surprise terms for each sensory domain. We then investigated whether surprise-related fronto-central P3 activity indexes the rapid inhibitory control of ongoing behavior after surprise, as suggested by recent theories. Confirming this prediction, the fronto-central P3 amplitude after both auditory and visual unexpected events was highly correlated with the fronto-central P3 found after stop-signals (measured in a separate stop-signal task). Moreover, surprise-related and stopping-related activity loaded onto the same component in a cross-task independent components analysis. Together, these findings suggest that medial frontal cortex maintains separate predictive models for different sensory domains, but engages a common mechanism for inhibitory control of behavior regardless of the source of surprise. Author summary: Surprise is an elementary cognitive computation that the brain performs to guide behavior. We investigated how the brain tracks surprise across different senses: Do unexpected sounds make subsequent unexpected visual stimuli less surprising? Or does the brain maintain separate expectations of environmental regularities for different senses? We found that the latter is the case. However, even though surprise was separately tracked for auditory and visual events, it elicited a common signature over frontal cortex in both sensory domains. Importantly, we observed the same neural signature when actions had to be stopped after non-surprising stop-signals in a motor inhibition task. This suggests that this signature reflects a rapid interruption of ongoing behavior when our surroundings do not conform to our expectations.
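A minimal sketch of a trial-by-trial surprise model of the kind described above: a Dirichlet-categorical learner tracks how often each stimulus has occurred and emits the surprise (negative log predictive probability) of each new event. Running one learner per modality versus one shared learner makes the contrast between separate and integrated predictive models concrete; the prior, the toy sequence, and the class name are assumptions for illustration, not the authors' exact model.

```python
import numpy as np

class DirichletSurprise:
    """Categorical observer with a Dirichlet prior over stimulus types."""
    def __init__(self, n_categories, prior=1.0):
        self.counts = np.full(n_categories, prior)

    def update(self, category):
        p = self.counts[category] / self.counts.sum()  # posterior predictive probability
        self.counts[category] += 1
        return -np.log(p)                              # surprise in nats

# Toy cross-modal oddball sequence of (modality, stimulus) pairs:
# modality 'V' or 'A', stimulus 0 = standard, 1 = unexpected oddball.
sequence = [("V", 0), ("V", 0), ("A", 0), ("V", 1), ("A", 0), ("A", 1), ("V", 0)]

separate = {"V": DirichletSurprise(2), "A": DirichletSurprise(2)}  # one model per modality
shared = DirichletSurprise(2)                                      # one integrated model

for modality, stim in sequence:
    s_sep = separate[modality].update(stim)
    s_int = shared.update(stim)
    print(f"{modality} stim {stim}: separate = {s_sep:.2f}, integrated = {s_int:.2f}")
```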


2016 ◽  
Vol 224 (2) ◽  
pp. 125-132 ◽  
Author(s):  
Stefan Brodoehl ◽  
Carsten Klingner ◽  
Denise Schaller ◽  
Otto W. Witte

Abstract. During everyday experiences, people sometimes close their eyes to better understand spoken words, to listen to music, or when touching textures and objects. A plausible explanation for this observation is that a reversible loss of vision changes the perceptual function of the remaining non-deprived sensory modalities. Within this work, we discuss general aspects of the effects of visual deprivation on the perceptual performance of the non-deprived sensory modalities with a focus on the time dependency of these modifications. In light of ambiguous findings concerning the effects of short-term visual deprivation and because recent literature provides evidence that the act of blindfolding can change the function of the non-deprived senses within seconds, we performed additional psychophysiological and functional magnetic resonance imaging (fMRI) analysis to provide new insight into this matter. Eye closure for several seconds led to a substantial impact on tactile perception probably caused by an unmasking of preformed neuronal pathways.


Author(s):  
Nuphar Katzman ◽  
Tal Oron-Gilad

Vibro-tactile interfaces can support users in various tasks and contexts. Despite their inherent advantages, it is important to realize that they are limited in the type and capacity of information they can convey. This study is part of a series of experiments that aim to develop and evaluate a “tactile taxonomy” for dismounted operational environments. The current experiment includes a simulation of an operational mission with a remote Unmanned Ground Vehicle (UGV). During the mission, 20 participants were required to interpret notifications that they received in one (or more) of the following modalities: auditory, visual, and/or tactile. Three specific notification types were chosen based on previous studies in order to provide an intuitive connection between each notification and its semantic meaning. Response times to notifications, the ability to distinguish between the information types they conveyed, and operational mission performance metrics were collected. Results indicate that it is possible to use a limited “tactile taxonomy” in a visually loaded and auditorily noisy scene while performing a demanding operational task. Using the tactile modality alongside other sensory modalities leverages the participants’ ability to perceive and identify the notifications.


2019 ◽  
Author(s):  
Mohsen Mosleh ◽  
Katelynn Kyker ◽  
Jonathan D. Cohen ◽  
David Gertler Rand

The scale of human interaction patterns is larger now than ever before: people regularly interact with and learn from others around the world, and we each have the ability to impact the global environment that is shared by all. The consequences of local versus global interaction, particularly for the evolution of cooperation, have been studied extensively by evolutionary game theorists for decades. Here, we use this lens to explore a new question: How does the scale of interaction affect the evolution of cognition, and in particular the use of automatic (e.g., reflexive or habitual) versus controlled (e.g., deliberative) cognitive processing? We find robust evidence of cycles of automaticity versus control, and that these dynamics are influenced by the scale of interaction. Specifically, a globalized environment disfavors cognitive control; globalized direct contact can either favor or disfavor control, depending on whether controlled agents are harmed or helped by contact with automatic agents; and globalized learning phase-locks the whole population and destroys meso-scale communities of more versus less controlled agents. These results emphasize the importance of the scale of interaction for the evolution of cognition, and help shed light on challenges currently facing our species.
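To make the local-versus-global manipulation concrete, here is a generic, heavily simplified sketch of payoff-biased imitation on a ring of agents that are either "controlled" or "automatic"; the payoff values, population size, and update rule are placeholders chosen for illustration and do not reproduce the model analysed in the paper.

```python
import random

random.seed(0)
N = 100
agents = [random.choice("CA") for _ in range(N)]  # 'C' = controlled, 'A' = automatic

# Assumed payoff to a focal agent of each type against a partner of each type.
payoff = {("C", "C"): 3.0, ("C", "A"): 1.0, ("A", "C"): 2.0, ("A", "A"): 1.5}

def step(global_learning: bool):
    # Each agent interacts with its right-hand neighbour on the ring.
    earnings = [payoff[(agents[i], agents[(i + 1) % N])] for i in range(N)]
    new = agents[:]
    for i in range(N):
        # Learning is either global (any agent) or local (an adjacent neighbour).
        model = random.randrange(N) if global_learning else (i + random.choice([-1, 1])) % N
        if earnings[model] > earnings[i]:  # payoff-biased imitation
            new[i] = agents[model]
    agents[:] = new

for _ in range(50):
    step(global_learning=True)
print("fraction of controlled agents after 50 global-learning steps:", agents.count("C") / N)
```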

