An ERP study of audiovisual simultaneity perception

2012, Vol 25 (0), pp. 159. Author(s): Marek Binder

The aim of this study was to examine the relation between the conscious perception of the temporal relation between the elements of an audiovisual pair and the dynamics of the accompanying neural activity, using a simultaneity judgment task and EEG event-related potentials (ERPs). In Experiment 1, pairs of 10 ms white-noise bursts and flashes were used; after each pair, subjects pressed one of two buttons to indicate whether the pair was synchronous. Values of stimulus onset asynchrony (SOA) were based on individual estimates of simultaneity thresholds (a 50/50 probability of either response), obtained prior to the EEG measurement with an interleaved staircase procedure involving both sound-first and flash-first pairs. Experiment 2 had an identical setup, except that subjects indicated whether the audiovisual pair began simultaneously (stimulus termination was synchronous). ERP waveforms were time-locked to the second stimulus in the pair, and the effects of synchrony perception were studied by comparing ERPs from trials judged as simultaneous and non-simultaneous. Subjects were divided into two subgroups with similar SOA values. In both experiments, at about 200 ms after the onset of the second stimulus, a stronger ERP positivity for trials judged as non-simultaneous was observed at parieto-central sites. This effect held for both sound-first and flash-first pairs and for both SOA subgroups. The results demonstrate that the perception of temporal relations between multimodal stimuli with identical physical parameters is reflected in localized ERP differences. Given their localization over posterior parietal regions, these differences may be viewed as correlates of the conscious perception of temporal integration versus separation of audiovisual stimuli.
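The interleaved staircase used to estimate the individual simultaneity thresholds (the SOA with a 50/50 probability of either response) can be sketched as follows. This is a minimal Python illustration; the starting SOA, step size, trial count, and the simulated observer are assumptions for the sketch, not the procedure actually used in the study.

import random

def simulate_response(soa_ms, threshold_ms=120.0):
    """Placeholder observer: the probability of reporting 'simultaneous'
    falls off with |SOA|; replace with real button presses."""
    p_simultaneous = 1.0 / (1.0 + (abs(soa_ms) / threshold_ms) ** 4)
    return random.random() < p_simultaneous

def run_interleaved_staircases(n_trials=80, start_soa=300.0, step=20.0):
    # Two staircases, one for sound-first (negative SOA) and one for
    # flash-first (positive SOA), interleaved trial by trial.
    soas = {"sound_first": -start_soa, "flash_first": start_soa}
    history = {"sound_first": [], "flash_first": []}
    for trial in range(n_trials):
        track = "sound_first" if trial % 2 == 0 else "flash_first"
        soa = soas[track]
        history[track].append(soa)
        simultaneous = simulate_response(soa)
        # 1-up/1-down rule converging on the 50% 'simultaneous' point:
        # grow |SOA| after a 'simultaneous' report, shrink it otherwise.
        direction = -1 if track == "sound_first" else 1
        if simultaneous:
            soas[track] += direction * step
        else:
            soas[track] -= direction * step
    # Threshold estimate: mean of the last few SOAs on each track.
    return {k: sum(v[-10:]) / len(v[-10:]) for k, v in history.items()}

if __name__ == "__main__":
    print(run_interleaved_staircases())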


2012, Vol 25 (0), pp. 83. Author(s): Miketa Arvaniti, Noam Sagiv, Lucille Lecoutre, Argiro Vatakis

Our research project aimed at investigating multisensory temporal integration in synesthesia and at exploring whether there are commonalities in the sensory experiences of synesthetes and non-synesthetes. Specifically, we investigated whether synesthetes are better integrators than non-synesthetes by examining the strength of multisensory binding (i.e., the unity effect) with an unspeeded temporal order judgment task. We used audiovisual stimuli based on grapheme-colour synesthetic associations (Experiment 1) and on crossmodal correspondences (e.g., high pitch with light colours; Experiment 2), presented at various stimulus onset asynchronies (SOAs) with the method of constant stimuli. Presenting these stimuli in congruent and incongruent formats allowed us to examine whether congruent stimuli lead to a stronger unity effect than incongruent ones in synesthetes and non-synesthetes and, thus, whether synesthetes show enhanced multisensory integration compared with non-synesthetes. Preliminary data support the hypothesis that congruent crossmodal correspondences lead to a stronger unity effect than incongruent ones in both groups, with this effect being stronger in synesthetes than in non-synesthetes. We also found that synesthetes show a stronger unity effect for idiosyncratically congruent grapheme-colour associations than for incongruent ones, relative to non-synesthetes trained on particular grapheme-colour associations. Currently, we are investigating (Experiment 3) whether trained non-synesthetes exhibit enhanced integration when presented with synesthetic associations that occur frequently among synesthetes. With this design we aim to provide psychophysical evidence on multisensory integration in synesthesia and on possible processing mechanisms common to synesthetes and non-synesthetes.
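The unity effect in an unspeeded temporal order judgment task is typically quantified by fitting a psychometric function to the proportion of "vision-first" responses across SOAs and reading off the just noticeable difference (JND); a larger JND for congruent pairs indicates stronger binding. A minimal Python sketch, in which the SOA levels and response proportions are illustrative assumptions rather than the study's data:

import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

# Illustrative data: SOAs (ms; negative = audio first) and the observed
# proportion of "vision first" responses at each SOA for one condition.
soas = np.array([-300, -200, -100, -50, 0, 50, 100, 200, 300], dtype=float)
p_vision_first = np.array([0.02, 0.05, 0.20, 0.35, 0.55, 0.70, 0.85, 0.95, 0.98])

def cumulative_gaussian(soa, pss, sigma):
    # pss = point of subjective simultaneity, sigma = spread of the fit
    return norm.cdf(soa, loc=pss, scale=sigma)

(pss, sigma), _ = curve_fit(cumulative_gaussian, soas, p_vision_first,
                            p0=[0.0, 100.0])

# JND: half the SOA range between the 25% and 75% points of the fitted curve.
jnd = (norm.ppf(0.75, loc=pss, scale=sigma) -
       norm.ppf(0.25, loc=pss, scale=sigma)) / 2.0
print(f"PSS = {pss:.1f} ms, JND = {jnd:.1f} ms")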


2010, Vol 24 (3), pp. 198-209. Author(s): Yan Wang, Jianhui Wu, Shimin Fu, Yuejia Luo

In the present study, we used event-related potentials (ERPs) and behavioral measurements in a peripherally cued line-orientation discrimination task to investigate the underlying mechanisms of orienting and focusing in voluntary and involuntary attention conditions. An informative peripheral cue (75% valid) with a long stimulus onset asynchrony (SOA) was used in the voluntary attention condition; an uninformative peripheral cue (50% valid) with a short SOA was used in the involuntary attention condition. Both orienting and focusing were affected by attention type. Results for attention orienting in the voluntary attention condition confirmed the "sensory gain control theory," as attention enhanced the amplitude of the early ERP components, P1 and N1, without latency changes. In the involuntary attention condition, compared with invalid trials, targets in valid trials elicited larger and later contralateral P1 components, and smaller and later contralateral N1 components. Furthermore, only in the voluntary attention condition did targets in valid trials elicit larger N2 and P3 components than in invalid trials. Attention focusing in the involuntary attention condition resulted in larger P1 components elicited by targets in small-cue trials than in large-cue trials, whereas in the voluntary attention condition, larger P1 components were elicited by targets in large-cue trials than in small-cue trials. There was no interaction between orienting and focusing. These results suggest that orienting and focusing of visual-spatial attention are deployed independently, regardless of attention type. In addition, the present results provide evidence of a dissociation between voluntary and involuntary attention during the same task.
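The component measures contrasted here (e.g., P1 amplitude for validly versus invalidly cued targets) come from averaging time-locked, baseline-corrected epochs per condition and taking the mean amplitude in a latency window. A minimal Python sketch with an assumed sampling rate, assumed window boundaries, and simulated data standing in for real epochs:

import numpy as np

FS = 500                     # assumed sampling rate (Hz)
BASELINE = (-0.2, 0.0)       # seconds relative to target onset
P1_WINDOW = (0.08, 0.13)     # assumed P1 latency window (s)

def erp_and_p1(epochs, times):
    """epochs: trials x samples array for one electrode and one condition."""
    # Baseline-correct each trial, then average across trials.
    base = (times >= BASELINE[0]) & (times < BASELINE[1])
    corrected = epochs - epochs[:, base].mean(axis=1, keepdims=True)
    erp = corrected.mean(axis=0)
    # Mean amplitude in the P1 window as the component measure.
    win = (times >= P1_WINDOW[0]) & (times < P1_WINDOW[1])
    return erp, erp[win].mean()

# Usage with simulated data in place of real valid/invalid epochs:
times = np.arange(-0.2, 0.6, 1.0 / FS)
valid = np.random.randn(120, times.size)
invalid = np.random.randn(40, times.size)
_, p1_valid = erp_and_p1(valid, times)
_, p1_invalid = erp_and_p1(invalid, times)
print(f"P1 valid = {p1_valid:.2f} µV, invalid = {p1_invalid:.2f} µV")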


2015, Vol 114 (5), pp. 2672-2681. Author(s): Emanuel N. van den Broeke, André Mouraux, Antonia H. Groneberg, Doreen B. Pfau, Rolf-Detlef Treede, ...

Secondary hyperalgesia is believed to be a key feature of "central sensitization" and is characterized by enhanced pain to mechanical nociceptive stimuli. The aim of the present study was to characterize, using EEG, the effects of pinprick stimulation intensity on the magnitude of pinprick-elicited brain potentials [event-related potentials (ERPs)] before and after secondary hyperalgesia induced by intradermal capsaicin in humans. Pinprick-elicited ERPs and pinprick-evoked pain ratings were recorded in 19 healthy volunteers, with mechanical pinprick stimuli of varying intensities (a 0.25-mm probe applied with a force ranging between 16 and 512 mN). The recordings were performed before (T0) and 30 min after (T1) intradermal capsaicin injection. The contralateral noninjected arm served as control. ERPs elicited by stimulation of untreated skin were characterized by 1) an early-latency negative-positive complex peaking between 120 and 250 ms after stimulus onset (N120-P240) and maximal at the vertex and 2) a long-lasting positive wave peaking 400–600 ms after stimulus onset and maximal at more posterior sites (P500), which correlated with perceived pinprick pain. After capsaicin injection, pinprick stimuli were perceived as more intense in the area of secondary hyperalgesia, and this effect was stronger for lower than for higher stimulus intensities. In addition, there was an enhancement of the P500 elicited by stimuli of intermediate intensity, which was significant for 64 mN. The other components of the ERPs were unaffected by capsaicin. Our results suggest that the increase in P500 magnitude after capsaicin is mediated by facilitated mechanical nociceptive pathways.
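The intensity-dependence of the capsaicin effect can be summarized as the per-force change in P500 mean amplitude from before (T0) to after (T1) the injection. The Python sketch below illustrates that computation with placeholder amplitudes, not the published values; the reported effect would appear as a peak of the difference around the intermediate 64 mN force.

import numpy as np

# Pinprick forces used in the study (mN) and illustrative P500 mean
# amplitudes (µV) before (T0) and after (T1) capsaicin; the amplitude
# values are placeholders for the sketch only.
forces = np.array([16, 32, 64, 128, 256, 512])
p500_t0 = np.array([1.2, 1.8, 2.4, 3.5, 4.6, 5.8])
p500_t1 = np.array([1.4, 2.1, 4.0, 4.1, 5.0, 6.0])

# Capsaicin effect per intensity: an enhancement restricted to the
# intermediate force shows up as a peak in this difference.
enhancement = p500_t1 - p500_t0
for f, d in zip(forces, enhancement):
    print(f"{f:>3d} mN: change in P500 = {d:+.1f} µV")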


2020, Vol 8 (3-4), pp. 254-278. Author(s): Lisa V. Eberhardt, Ferdinand Pittino, Anna Scheins, Anke Huckauf, Markus Kiefer, ...

Emotional stimuli such as emotional faces have frequently been shown to be temporally overestimated compared to neutral ones. This effect is commonly explained by arousal induced by emotional processing, which accelerates an internal-clock-like pacemaker. However, some studies report contradictory effects and others point to relevant moderating variables. Given this controversy, we aimed at investigating the processes underlying the temporal overestimation of emotional faces by combining behavioral and electrophysiological correlates in a temporal bisection task. We assessed duration estimation of angry and neutral faces using anchor durations of 400 ms and 1600 ms while recording event-related potentials. Subjective ratings and the early posterior negativity confirmed the encoding and processing of the stimuli's emotionality. However, temporal ratings did not differ between angry and neutral faces. In line with this behavioral result, the Contingent Negative Variation (CNV), an electrophysiological index of temporal accumulation, was not modulated by the faces' emotionality. Duration estimates, i.e., "short" or "long" responses to stimuli with an ambiguous duration of 1000 ms, were nevertheless associated with a differential CNV amplitude. Interestingly, CNV modulation was already observed at 600–700 ms after stimulus onset, i.e., long before stimulus offset. The results are discussed in light of the information-processing model of time perception as well as possible factors of the experimental setup that moderate the temporal overestimation of emotional stimuli. In sum, combining behavioral and electrophysiological measures seems promising for a clearer understanding of the complex processes leading to the illusion of temporal lengthening of emotional faces.
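In a temporal bisection task, the behavioral measure is the bisection point: the probe duration between the 400 ms and 1600 ms anchors that is judged "long" on half the trials; temporal overestimation of angry faces would show up as a lower bisection point for angry than for neutral faces. A minimal Python sketch with assumed probe durations and response proportions:

import numpy as np
from scipy.optimize import curve_fit

# Probe durations (ms) between the 400/1600 ms anchors and illustrative
# proportions of "long" responses for one face category.
durations = np.array([400, 600, 800, 1000, 1200, 1400, 1600], dtype=float)
p_long = np.array([0.03, 0.12, 0.35, 0.55, 0.78, 0.93, 0.98])

def logistic(t, bp, slope):
    # bp = bisection point (50% "long"), slope = steepness of the curve
    return 1.0 / (1.0 + np.exp(-(t - bp) / slope))

(bp, slope), _ = curve_fit(logistic, durations, p_long, p0=[1000.0, 150.0])
print(f"Bisection point = {bp:.0f} ms")
# A smaller bisection point for angry than for neutral faces would
# indicate temporal overestimation of the emotional stimuli.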


2003, Vol 15 (7), pp. 1039-1051. Author(s): Ute Leonards, Julie Palix, Christoph Michel, Vicente Ibanez

Functional magnetic resonance imaging studies have indicated that efficient feature search (FS) and inefficient conjunction search (CS) activate partially distinct frontoparietal cortical networks. However, it remains a matter of debate whether the differences in these networks reflect differences in the early processing during FS and CS. In addition, the relationship between the differences in the networks and spatial shifts of attention also remains unknown. We examined these issues by applying a spatio-temporal analysis method to high-resolution visual event-related potentials (ERPs) and investigated how spatio-temporal activation patterns differ for FS and CS tasks. Within the first 450 msec after stimulus onset, scalp potential distributions (ERP maps) revealed 7 different electric field configurations for each search task. Configuration changes occurred simultaneously in the two tasks, suggesting that contributing processes were not significantly delayed in one task compared to the other. Despite this high spatial and temporal correlation, two ERP maps (120–190 and 250–300 msec) differed between the FS and CS. Lateralized distributions were observed only in the ERP map at 250–300 msec for the FS. This distribution corresponds to that previously described as the N2pc component (a negativity in the time range of the N2 complex over posterior electrodes of the hemisphere contralateral to the target hemifield), which has been associated with the focusing of attention onto potential target items in the search display. Thus, our results indicate that the cortical networks involved in feature and conjunction searching partially differ as early as 120 msec after stimulus onset and that the differences between the networks employed during the early stages of FS and CS are not necessarily caused by spatial attention shifts.
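The spatio-temporal analysis applied here segments the ERP into periods of quasi-stable scalp topography; its two basic ingredients are the global field power (the spatial standard deviation across electrodes at each time point) and the spatial similarity between momentary scalp maps. A minimal Python sketch, assuming a 64-channel average-referenced montage, a 500 Hz sampling rate, and random placeholder data:

import numpy as np

def global_field_power(erp):
    """erp: channels x samples average-referenced ERP.
    GFP is the spatial standard deviation across electrodes per sample."""
    return erp.std(axis=0)

def map_similarity(map_a, map_b):
    """Spatial correlation between two scalp maps (channel vectors),
    ignoring overall polarity and field strength."""
    a = (map_a - map_a.mean()) / map_a.std()
    b = (map_b - map_b.mean()) / map_b.std()
    return abs(np.corrcoef(a, b)[0, 1])

# Placeholder data: 64 channels, 450 ms epoch at 500 Hz.
erp = np.random.randn(64, 225)
gfp = global_field_power(erp)
# Compare the mean maps of the 120-190 ms and 250-300 ms periods
# (sample indices at 500 Hz):
sim = map_similarity(erp[:, 60:95].mean(axis=1), erp[:, 125:150].mean(axis=1))
print(f"Peak GFP = {gfp.max():.2f}, map similarity = {sim:.2f}")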


Perception, 1996, Vol 25 (1_suppl), pp. 147-147. Author(s): P Stivalet, Y Moreno, C Cian, J Richard, P-A Barraud

In a visual search paradigm we measured the stimulus onset asynchrony (SOA) between stimulus and mask that was required to reach 90% correct responses. This procedure has the advantage of taking into account the real processing time while excluding the time needed to generate the motor response. Twelve congenitally deaf adult subjects and twelve normal-hearing subjects were given a visual search task for a target letter O among a varying number of distractor letters Q, and vice versa. In both groups we found the asymmetrical visual search pattern classically observed, with parallel processing in the search for the target Q and serial processing in the search for the target O (Treisman, 1985, Computer Vision, Graphics, and Image Processing, 31, 156–177). The difference between the mean search slopes for an O target was not statistically significant between the groups; this might be due to the variability within the groups. Visual search in the congenitally deaf thus does not seem to benefit from a compensatory effect related to the auditory deprivation. Our results seem to confirm data reported by Neville (1990, Annals of the New York Academy of Sciences, 71–91) obtained with an electrophysiological technique based on event-related potentials. Nevertheless, the deaf subjects were 2.5 times faster at the visual search task.
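The parallel-versus-serial asymmetry is usually quantified as the search slope: the increase in the critical SOA (or response time) per added distractor, obtained by linear regression against display set size. A minimal Python sketch in which the set sizes and SOA values are illustrative placeholders, not the measured data:

import numpy as np

# Display set sizes and illustrative critical SOAs (ms) needed to reach
# 90% correct for each target type (placeholder values).
set_sizes = np.array([4, 8, 12, 16], dtype=float)
soa_target_q = np.array([55, 58, 60, 61], dtype=float)    # Q among Os: flat slope
soa_target_o = np.array([70, 110, 150, 195], dtype=float)  # O among Qs: steep slope

for label, soas in [("Q among Os", soa_target_q), ("O among Qs", soa_target_o)]:
    slope, intercept = np.polyfit(set_sizes, soas, 1)
    print(f"{label}: {slope:.1f} ms/item (intercept {intercept:.0f} ms)")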


2020, Vol 10 (1). Author(s): Jing Meng, Zuoshan Li, Lin Shen

This study tested the hypothesis that autistic traits influence the neuronal habituation that underlies the processing of others' pain. Based on their autism-spectrum quotient (AQ), participants were classified into two groups according to their autistic traits: a High-AQ and a Low-AQ group. Their event-related potentials in response to trains of three identical audio recordings, expressing either painful or neutral feelings of others, were compared across three experimental tasks. (1) In a Pain Judgment Task, participants were instructed to focus on pain-related cues in the presented audio recordings. (2) In a Gender Judgment Task, participants were instructed to focus on non-pain-related cues in the presented audio recordings. (3) In a Passive Listening Task, participants were instructed to listen passively. In the High-AQ group, an altered empathic pattern of habituation, indexed by frontal-central P2 responses to the second presentation of the painful audio recordings, was found during the Passive Listening Task. Nevertheless, both the High-AQ and Low-AQ groups exhibited similar patterns of habituation to hearing others' voices, both neutral and painful, in the Pain Judgment and Gender Judgment Tasks. These results suggest altered empathic neuronal habituation in the passive processing of others' vocal pain by individuals with autistic traits.
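Habituation in this design can be quantified as the decrement of the frontal-central P2 amplitude across the three identical presentations in a train; the group difference reported for the Passive Listening Task concerns the drop from the first to the second painful recording. A minimal Python sketch with simulated amplitudes standing in for the real epochs:

import numpy as np

def habituation_index(p2_by_repetition):
    """p2_by_repetition: trials x 3 array of frontal-central P2 mean
    amplitudes for the 1st, 2nd and 3rd identical recording in a train.
    Returns the drop from the first to the second presentation, which is
    where the altered pattern was reported for the High-AQ group."""
    means = p2_by_repetition.mean(axis=0)
    return means[0] - means[1]

# Placeholder data standing in for real painful-voice epochs:
high_aq = np.random.normal([4.0, 3.9, 3.5], 0.5, size=(30, 3))
low_aq = np.random.normal([4.0, 3.0, 2.8], 0.5, size=(30, 3))
print(f"High-AQ P2 drop: {habituation_index(high_aq):.2f} µV")
print(f"Low-AQ  P2 drop: {habituation_index(low_aq):.2f} µV")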

