The interplay between multisensory integration and perceptual decision making
Abstract

Facing perceptual uncertainty, the brain combines information from different senses to shape optimal decision making and to guide behavior. Despite overlapping neural networks underlying multisensory integration and perceptual decision making, the process chain of decision formation has been studied mostly in unimodal contexts and is thought to be supramodal. To reveal whether and how multisensory processing interplays with perceptual decision making, we devised a paradigm mimicking naturalistic situations in which human participants were exposed to continuous cacophonous audiovisual inputs containing an unpredictable signal cue in one or two modalities. Using multivariate pattern analysis on concurrently recorded EEG, we decoded the neural signatures of the sensory encoding and decision formation stages. Generalization analyses across conditions and time revealed that multisensory signal cues were processed faster during both processing stages. We further established that this acceleration of neural dynamics was directly linked to two distinct multisensory integration processes and was associated with multisensory behavioral benefit. Our results, substantiated in both detection and categorization tasks, provide evidence that the brain integrates signals from different modalities at both the sensory encoding and the decision formation stages.