Processes

Author(s):  
Casey O'Callaghan

Crossmodal perceptual illusions such as ventriloquism, the McGurk effect, the rubber hand, and the sound-induced flash demonstrate that one sense can causally impact perceptual processing and experience associated with another sense. This chapter argues that such causal interactions between senses are not merely accidental; they are part of typical perceptual functioning. Unlike synesthesia, they reveal principled perceptual strategies for dealing with noisy, fallible sensory stimulation from multiple sources. Recalibrations resolve conflicts between senses, weighting estimates in deference to the more reliable modality. Coordination between senses thus improves the coherence and the reliability of human perceptual capacities. Therefore, some perceptual processes of the sort relevant to empirical psychology are multisensory.
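The claim that recalibration defers to the more reliable modality is standardly modeled as reliability-weighted cue combination, in which each modality's estimate is weighted by its inverse variance. The sketch below is illustrative only (the chapter itself gives no model); the function name and the example numbers are assumptions chosen to mimic ventriloquism, where precise vision "captures" noisy auditory location.

```python
# Minimal sketch (assumed, not the chapter's own model): maximum-likelihood
# cue combination. Each modality's estimate is weighted by its reliability
# (inverse variance), so the less noisy sense dominates the fused percept.

def fuse(estimate_a, var_a, estimate_b, var_b):
    """Combine two noisy cues; weights are proportional to 1/variance."""
    w_a = 1.0 / var_a
    w_b = 1.0 / var_b
    fused = (w_a * estimate_a + w_b * estimate_b) / (w_a + w_b)
    fused_var = 1.0 / (w_a + w_b)  # fused estimate beats either cue alone
    return fused, fused_var

# Ventriloquism-style example: vision locates a source at 0 deg with low
# noise, audition at 10 deg with high noise; the fused location sits near
# the visual estimate, and its variance is lower than either input's.
loc, var = fuse(0.0, 1.0, 10.0, 9.0)  # loc = 1.0, var = 0.9
```

Note that the fused variance is always smaller than either input variance, which is the formal sense in which coordination between senses improves reliability.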

Author(s):  
Dan Cavedon-Taylor

What is the correct procedure for determining the contents of perception? Philosophers tackling this question increasingly rely on empirically oriented procedures. This chapter argues that this strategy constitutes an improvement over the armchair methodology of phenomenal contrast arguments, but that there is a respect in which current empirical procedures remain limited: they are unimodal in nature, wrongly treating the senses as isolatable. The chapter thus has two aims: first, to motivate a reorientation of the admissible contents debate into a multimodal framework; second, to explore whether experimental studies of multimodal perception support a so-called Liberal account of perception’s admissible contents. The chapter concludes that the McGurk effect and the ventriloquist effect are both explicable without the postulation of high-level content, but that at least one multimodal experimental paradigm may necessitate such content: the rubber hand illusion. One upshot is that Conservatives who claim that the Liberal view intolerably broadens the scope of perceptual illusions, particularly from the perspective of perceptual psychology, should pursue other arguments against that view.


2009 ◽  
Vol 20 (5) ◽  
pp. 529-533 ◽  
Author(s):  
Gary Bargary ◽  
Kylie J. Barnett ◽  
Kevin J. Mitchell ◽  
Fiona N. Newell

Although it is estimated that as many as 4% of people experience some form of enhanced cross talk between (or within) the senses, known as synaesthesia, very little is understood about the level of information processing required to induce a synaesthetic experience. In work presented here, we used a well-known multisensory illusion called the McGurk effect to show that synaesthesia is driven by late, perceptual processing, rather than early, unisensory processing. Specifically, we tested 9 linguistic-color synaesthetes and found that the colors induced by spoken words are related to what is perceived (i.e., the illusory combination of audio and visual inputs) and not to the auditory component alone. Our findings indicate that color-speech synaesthesia is triggered only when a significant amount of information processing has occurred and that early sensory activation is not directly linked to the synaesthetic experience.


2018 ◽  
Author(s):  
Maria Laura Filippetti ◽  
Louise P. Kirsch ◽  
Laura Crucianelli ◽  
Aikaterini Fotopoulou

Our sense of body ownership relies on integrating different sensations according to their temporal and spatial congruency. Nevertheless, there is ongoing controversy about the role of affective congruency during multisensory integration, i.e. whether the stimuli to be perceived by the different sensory channels are congruent or incongruent in terms of their affective quality. In the present study, we applied a widely used multisensory integration paradigm, the Rubber Hand Illusion, to investigate the role of affective, top-down aspects of sensory congruency between visual and tactile modalities in the sense of body ownership. In Experiment 1 (N = 36), we touched participants with either soft or rough fabrics on their unseen hand, while they watched a rubber hand being touched synchronously with the same fabric or with a ‘hidden’ fabric of ‘uncertain roughness’. In Experiment 2 (N = 50), we used the same paradigm as in Experiment 1, but replaced the ‘uncertainty’ condition with an ‘incongruent’ one, in which participants saw the rubber hand being touched with a fabric of incongruent roughness and hence opposite valence. We found that certainty (Experiment 1) and congruency (Experiment 2) between the felt and vicariously perceived tactile affectivity led to higher subjective embodiment compared to uncertainty and incongruency, respectively, irrespective of any valence effect. Our results suggest that congruency in the affective, top-down aspects of sensory stimulation is important to the multisensory integration process leading to embodiment, over and above temporal and spatial properties.


Author(s):  
Bence Nanay

There has been a lot of discussion about how the cognitive penetrability of perception may or may not have important implications for understanding perceptual justification. The aim of this chapter is to argue that a different set of findings in perceptual psychology poses an even more serious challenge to the very idea of perceptual justification. These findings are about the importance of perceptual processing that is not driven by corresponding sensory stimulation in the relevant sense modality (such as amodal completion and multimodal completion). These findings show that everyday perception is in fact a mixture of sensory-stimulation-driven perceptual processing and perceptual processing that is not driven by corresponding sensory stimulation in the relevant sense modality and that we have strong reasons to doubt the epistemic pedigree of the latter process. The implication of this is not that we should become skeptics or deny the possibility of perceptual justification. It is, rather, that the only way in which we can understand when and whether a perceptual state justifies beliefs is by paying close attention to empirical facts about the reliability of perceptual processing that is not driven by corresponding sensory stimulation in the relevant sense modality. In this sense (a very narrow sense) epistemology needs to be naturalized.


2013 ◽  
Vol 311 ◽  
pp. 491-496 ◽  
Author(s):  
Chia Ju Liu ◽  
Chin Fei Huang ◽  
Chia Yi Chou ◽  
Ming Chi Lu ◽  
Cheng Hsieh Yu ◽  
...  

Auditory phase synchronization near 40 Hz is reportedly related to sensory stimulation. This study applied phase synchrony and bicoherence analyses to electroencephalographic measurements. Four experimental stages were conducted with 34 healthy high school students to collect the data: (A) resting with eyes closed, (B) listening to classical music, (C) resting with eyes closed, and (D) listening to popular music. The results show that whole-brain phase synchronization occurs at 40 Hz and lasts about 400 ms, which differs markedly from the estimated 40 Hz phase coupling lasting about 20–25 ms in previous studies and seems to play an important role in inducing auditory attention loss. Additionally, the results show that hypersynchronous states may affect perceptual processing. This study develops an original nonlinear time-series analytical approach and suggests that 40 Hz phase synchronization might be an important indicator of perceptual processing.
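The abstract does not specify the authors' analysis pipeline, so as a generic illustration only, here is one standard phase-synchrony measure from the EEG literature: the phase-locking value (PLV), computed from instantaneous phases obtained via the Hilbert transform. The function name and the 40 Hz toy signals are assumptions, not taken from the study.

```python
import numpy as np
from scipy.signal import hilbert

# Illustrative sketch (assumed, not the study's pipeline): the phase-locking
# value (PLV) between two signals. Instantaneous phase comes from the analytic
# signal (Hilbert transform); the PLV is the magnitude of the mean
# phase-difference vector: 1 means perfect phase locking, values near 0 mean
# the phase difference drifts randomly.

def phase_locking_value(x, y):
    phase_x = np.angle(hilbert(x))
    phase_y = np.angle(hilbert(y))
    return np.abs(np.mean(np.exp(1j * (phase_x - phase_y))))

# Two 40 Hz signals with a fixed phase lag are (nearly) perfectly locked.
t = np.linspace(0, 1, 1000, endpoint=False)
a = np.sin(2 * np.pi * 40 * t)
b = np.sin(2 * np.pi * 40 * t + 0.5)
plv = phase_locking_value(a, b)  # close to 1.0
```

In practice the signals would first be band-pass filtered around the frequency of interest (here, 40 Hz) before phase extraction; that step is omitted for brevity.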


2016 ◽  
Vol 2016 ◽  
pp. 1-9 ◽  
Author(s):  
Woong Choi ◽  
Liang Li ◽  
Satoru Satoh ◽  
Kozaburo Hachimura

Improving the sense of immersion is one of the core issues in virtual reality. Perceptual illusions of ownership can be perceived over a virtual body in a multisensory virtual reality environment. The Rubber Hand and Virtual Hand Illusions showed that body ownership can be manipulated by applying suitable visual and tactile stimulation. In this study, we investigate the effects of multisensory integration in the Virtual Hand Illusion with active movement. We constructed a virtual xylophone-playing system that can interactively provide synchronous visual, tactile, and auditory stimulation. We conducted two experiments regarding different movement conditions and different sensory stimulations. Our results demonstrate that multisensory integration with free active movement can improve the sense of immersion in virtual reality.


2021 ◽  
pp. 174702182110248
Author(s):  
V. Botan ◽  
Abigail Salisbury ◽  
H.D. Critchley ◽  
Jamie Ward

Some people report localised pain on their body when seeing other people in pain (sensory-localised vicarious pain responders). In this study, we assess whether this is related to atypical computations of body ownership which, in paradigms such as the Rubber Hand Illusion (RHI), can be conceptualised as a Bayesian inference as to whether multiple sources of sensory information (visual, somatosensory) belong together on a single body (one’s own) or are distributed across several bodies (vision = other, somatosensory = self). According to this model, computations of body ownership depend on the degree (and precision) of sensory evidence, rather than synchrony per se. Sensory-localised vicarious pain responders exhibit the RHI following synchronous stroking and – unusually – also after asynchronous stroking. Importantly, this occurs only in asynchronous conditions in which the stroking is predictable (alternating) rather than unpredictable (random). There was no evidence that their bottom-up proprioceptive signals are less precise, suggesting individual differences in the top-down weighting of sensory evidence. Finally, the Enfacement Illusion (EI) was also employed as a conceptually related bodily illusion paradigm that involves a completely different response judgment (based on vision rather than proprioception). Sensory-localised responders show a comparable pattern on this task after synchronous and asynchronous stroking. This is consistent with the idea that they have top-down (prior) differences in the way body ownership is inferred that transcend the exact judgment being made (visual or proprioceptive).
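The Bayesian framing in this abstract can be made concrete with a toy causal-inference model: compare a common-cause hypothesis (one body, so visual and somatosensory samples share a latent source) against a separate-cause hypothesis (two bodies). Everything below is an illustrative assumption, not the study's fitted model: the function name, the flat-range likelihood under separate causes, and the numbers are invented to show how a stronger common-cause prior yields ownership from weaker sensory evidence.

```python
import math

# Toy sketch (assumed, not the study's model): posterior probability that
# visual and somatosensory cues share a common cause (one's own body).
# Under the common-cause model the likelihood falls off with the
# discrepancy between the cues; under separate causes each cue is
# unconstrained over an assumed range R of plausible positions.

def posterior_common_cause(visual, somato, sigma, prior_common):
    R = 100.0  # assumed range of plausible positions
    d = visual - somato
    # Integrating out the shared latent position (uniform over R) gives a
    # Gaussian in the cue discrepancy with variance 2 * sigma**2:
    like_common = math.exp(-d**2 / (4 * sigma**2)) / (R * math.sqrt(4 * math.pi) * sigma)
    like_separate = 1.0 / R**2
    p = like_common * prior_common
    q = like_separate * (1 - prior_common)
    return p / (p + q)
```

With the cue discrepancy held fixed, raising `prior_common` raises the ownership posterior, which is one way to cash out the proposed top-down (prior) difference in sensory-localised responders: they infer a common cause (and so feel the illusion) even under asynchronous stimulation that would not convince others.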


Author(s):  
Ruth Garrett Millikan

Perceptual processing is translation of patterns in the data of sense into cognitive understanding without uniceptual inference. Understanding language differs from ordinary perceptual processing in that the signs it translates are detached rather than attached. This similarity is obscured because ordinary uses of the verbs of perception do not track a kind of psychological processing. Their use is mostly factive, which encourages overlooking the fallibility of perception. One result is the mistaken view that perceptual illusions are an anomaly and that perception is cognitively impenetrable. The assumption that each of the senses has its own proprietary level of perception and the assumption that differences in the result of perceptual processing are always accompanied by differences in perceptual experience are questioned. Finally, a number of intuitive objections to the idea that understanding language is a form of perceptual processing are discussed.


2004 ◽  
Vol 27 (4) ◽  
pp. 593-594
Author(s):  
Jocelyn Faubert ◽  
Armando Bertone

Phillips & Silverstein (P&S, 2003) have proposed that NMDA-receptor hypofunction is the central reason for impaired cognitive coordination and abnormal gestalt-like perceptual processing in schizophrenia. We suggest that this model may also be applicable to non-pathological (or normal) aging given the compelling evidence of NMDA-receptor involvement during the aging process that results in age-related change in higher-level perceptual performance. Given that such deficits are present in other neurological disorders such as autism, an argument for a systematic assessment of perceptual functioning in these conditions may be posited.


2021 ◽  
Author(s):  
Elizabeth A. Kaplan-Kahn ◽  
Aesoon Park ◽  
Natalie Russo

Autistic individuals show enhanced perceptual functioning on many behavioral tasks. Neurophysiological evidence also supports the conclusion that autistic individuals utilize perceptual processes to a greater extent than neurotypical comparisons to support problem solving and reasoning; however, how atypicalities in early perceptual processing influence subsequent cognitive processes remains to be elucidated. The goals of the present study were to test the relationship between early perceptual and subsequent cognitive event-related potentials (ERPs) and their relationship to levels of autism traits. Sixty-two neurotypical adults completed the Autism Spectrum Quotient (AQ) and participated in an ERP task. Path models were compared to test causal relationships among an early perceptual ERP (the P1 component), a subsequent cognitive ERP (the N400 effect), and the Attention to Detail subscale of the AQ. The size of participants’ P1 components was positively correlated with the size of their N400 effect and their Attention to Detail score. Model comparisons supported the model specifying that variation in Attention to Detail scores predicted meaningful differences in participants’ ERP waveforms. The relationship between Attention to Detail scores and the size of the N400 effect was significantly mediated by the size of the P1 effect. This study revealed that neurotypical adults with higher levels of Attention to Detail show larger P1 differences, which, in turn, correspond to larger N400 effects. Findings support the Enhanced Perceptual Functioning model of autism, suggesting that early perceptual processing differences may cascade forward and result in modifications to later cognitive mechanisms.

