Multisensory binding is driven by the strength of stimulus correlation
Our perceptual system is adept at exploiting sensory regularities to better extract information about our environment. One clear example is how the sensory and multisensory systems can use consistency to group sensory features into a perceptual object and to segregate objects from each other and from background noise. Leveraging tenets of object-based attention and multisensory binding, we asked whether this ability scales with the strength of that consistency. We presented participants with amplitude-modulated (AM) auditory and visual streams and asked them to detect embedded orthogonal, near-threshold frequency-modulation (FM) events. We modulated the correlation between the streams by varying the phase of the visual AM. In line with a previous report, we first observed that peak performance was shifted away from 0°. After accounting for this shift, we found that across participants, discriminability of the FM event improved linearly with correlation. Additionally, we sought to answer a question left open by our previous report: what explains the phase shift? We found that the phase shift correlated with differences between auditory and visual response times, but not with the point of subjective simultaneity, suggesting that differences in sensory processing times may account for the observed phase shift. These results suggest that our perceptual system can bind multisensory features across a spectrum of temporal correlations, a process necessary for multisensory binding in complex environments where unrelated signals may have small errant correlations.
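The key manipulation, varying the phase of the visual AM to modulate the correlation between the two streams, can be illustrated with a minimal sketch: for two equal-frequency sinusoidal envelopes, the Pearson correlation over whole modulation cycles equals the cosine of their phase offset. The parameter values below (modulation rate, duration, sampling rate) are illustrative assumptions, not the study's actual stimulus parameters.

```python
import numpy as np

def envelope_correlation(phase_deg, am_freq=1.0, duration=4.0, fs=1000.0):
    """Pearson correlation between two AM envelopes that differ
    only by a phase offset (illustrative parameters)."""
    t = np.arange(0.0, duration, 1.0 / fs)
    phase = np.deg2rad(phase_deg)
    auditory_env = np.sin(2 * np.pi * am_freq * t)          # auditory AM envelope
    visual_env = np.sin(2 * np.pi * am_freq * t + phase)    # phase-shifted visual AM envelope
    return np.corrcoef(auditory_env, visual_env)[0, 1]

# Correlation falls from +1 at 0 deg through 0 at 90 deg to -1 at 180 deg.
for phi in (0, 45, 90, 135, 180):
    print(phi, round(envelope_correlation(phi), 3))
```

Sweeping the phase offset in this way yields a continuum of stream correlations from perfectly correlated to anti-correlated, which is what allows discriminability to be examined as a function of correlation strength.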