Differential generation of saccade, fixation and image onset event-related potentials in the human mesial temporal lobe

2018 ◽  
Author(s):  
Chaim N. Katz ◽  
Kramay Patel ◽  
Omid Talakoub ◽  
David Groppe ◽  
Kari Hoffman ◽  
...  

The electrophysiological signatures of encoding and retrieval recorded from mesial temporal lobe (MTL) structures are observed as event-related potentials (ERPs) during visual memory tasks. The waveforms of the ERPs associated with the onset of visual stimuli (image onset) and eye movements (saccades and fixations) provide insights into the mechanisms of their generation. We hypothesized that since eye movements and image onset (common methods of stimulus presentation when testing memory) both provide MTL structures with salient visual information, they may engage similar neural mechanisms. To explore this question, we used intracranial electroencephalographic (iEEG) data from the MTLs of 11 patients with medically refractory epilepsy who participated in a visual search task. We sought to characterize the electrophysiological responses of MTL structures to saccades, fixations, and image onset. We demonstrate that the image-onset response is an evoked/additive response with a low-frequency power increase and post-stimulus phase clustering. In contrast, ERPs following eye movements appeared to arise from phase resetting of higher frequencies than the image-onset ERP. Intriguingly, this reset was associated with saccade onset rather than saccade termination (fixation), suggesting that it reflects an MTL response to a corollary discharge rather than to visual stimulation, in stark contrast to the image-onset response. The distinct mechanistic underpinnings of these two ERPs may help guide future development of visual memory tasks.

2020 ◽  
Vol 30 (10) ◽  
pp. 5502-5516
Author(s):  
Chaim N Katz ◽  
Kramay Patel ◽  
Omid Talakoub ◽  
David Groppe ◽  
Kari Hoffman ◽  
...  

Abstract Event-related potentials (ERPs) are a commonly used electrophysiological signature for studying mesial temporal lobe (MTL) function during visual memory tasks. The ERPs associated with the onset of visual stimuli (image onset) and eye movements (saccades and fixations) provide insights into the mechanisms of their generation. We hypothesized that since eye movements and image onset both provide MTL structures with salient visual information, they may engage similar neural mechanisms. To explore this question, we used intracranial electroencephalographic data from the MTLs of 11 patients with medically refractory epilepsy who participated in a visual search task. We characterized the electrophysiological responses of MTL structures to saccades, fixations, and image onset. We demonstrated that the image-onset response is an evoked/additive response with a low-frequency power increase. In contrast, ERPs following eye movements appeared to arise from phase resetting of higher frequencies than the image-onset ERP. Intriguingly, this reset was associated with saccade onset rather than termination (fixation), suggesting that it reflects an MTL response to a corollary discharge rather than to visual stimulation. We discuss the distinct mechanistic underpinnings of these responses, which shed light on the underlying neural circuitry involved in visual memory processing.
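The evoked-versus-phase-reset distinction drawn above is commonly quantified with inter-trial phase clustering (ITPC): an additive evoked response raises post-stimulus power, whereas phase resetting raises phase alignment across trials without a corresponding power increase. A minimal NumPy sketch of ITPC at a single frequency, hedged as an illustration only (the Morlet wavelet parameters, test frequency, and sampling rate are assumptions, not values from the study):

```python
import numpy as np

def morlet(freq, srate, n_cycles=5):
    """Complex Morlet wavelet centered at `freq` Hz (illustrative parameters)."""
    dur = n_cycles / freq
    t = np.arange(-dur, dur, 1 / srate)
    sigma = n_cycles / (2 * np.pi * freq)  # Gaussian envelope width
    return np.exp(2j * np.pi * freq * t) * np.exp(-t**2 / (2 * sigma**2))

def itpc(trials, freq, srate):
    """Inter-trial phase clustering over time.

    trials: (n_trials, n_samples) array of single-trial voltage traces.
    Returns values in [0, 1]: 1 = identical phase across trials, ~0 = random.
    """
    w = morlet(freq, srate)
    # Analytic signal per trial via complex wavelet convolution
    analytic = np.array([np.convolve(tr, w, mode="same") for tr in trials])
    phases = np.angle(analytic)
    # Length of the mean resultant vector across trials at each time point
    return np.abs(np.mean(np.exp(1j * phases), axis=0))
```

Phase-locked trials drive ITPC toward 1 around the event, while uniformly jittered phases yield values near 1/sqrt(n_trials), which is how phase resetting can be dissociated from an additive power increase.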


2019 ◽  
Author(s):  
Sebastian Schindler ◽  
Maximilian Bruchmann ◽  
Bettina Gathmann ◽  
Robert Moeck ◽  
Thomas Straube

Emotional facial expressions lead to modulations of early event-related potentials (ERPs). However, it has so far remained unclear to what extent these modulations represent face-specific effects rather than differences in low-level visual features, and to what extent they depend on available processing resources. To examine these questions, we conducted two preregistered independent experiments (N = 40 in each experiment) using different variants of a novel task that manipulates peripheral perceptual load across levels while keeping overall visual stimulation constant. Centrally, task-irrelevant angry, neutral, and happy faces and their Fourier phase-scrambled versions, which preserved low-level visual features, were presented. The results of both studies showed load-independent P1 and N170 emotion effects. Importantly, Bayesian analyses confirmed that these emotion effects were face-independent for the P1 but not for the N170 component. We conclude, first, that ERP modulations during the P1 interval strongly depend on low-level visual information, whereas the emotional N170 modulation requires the processing of figural facial features; second, that both P1 and N170 modulations appear to be immune to a large range of variations in perceptual load.


2021 ◽  
Vol 15 ◽  
Author(s):  
Thorben Hülsdünker ◽  
David Riedel ◽  
Hannes Käsbauer ◽  
Diemo Ruhnow ◽  
Andreas Mierau

Although vision is the dominant sensory system in sports, many situations require multisensory integration. Faster processing of auditory information in the brain may facilitate time-critical abilities such as reaction speed; however, previous research has been limited by generic auditory and visual stimuli that did not consider audio-visual characteristics in ecologically valid environments. This study investigated reaction speed in response to sport-specific monosensory (visual and auditory) and multisensory (audio-visual) stimulation. Neurophysiological analyses identified the neural processes contributing to differences in reaction speed. Nineteen elite badminton players participated in this study. In an initial recording phase, the sound profile and shuttle speed of smash and drop strokes were captured on a badminton court using high-speed video cameras and binaural recordings. The speed and sound characteristics were transferred into auditory and visual stimuli and presented in a lab-based experiment, where participants reacted to sport-specific monosensory or multisensory stimulation. Auditory signal presentation was delayed by 26 ms to account for realistic audio-visual signal interaction on the court. N1 and N2 event-related potentials, indicators of auditory and visual information perception/processing, respectively, were identified using a 64-channel EEG. Despite the 26 ms delay, auditory reactions were significantly faster than visual reactions (236.6 ms vs. 287.7 ms, p < 0.001) but still slower than reactions to multisensory stimulation (224.4 ms, p = 0.002). Across conditions, response times to smashes were faster than to drops (233.2 ms vs. 265.9 ms, p < 0.001). Faster reactions were paralleled by a lower latency and higher amplitude of the auditory N1 and visual N2 potentials. The results emphasize the potential of auditory information to accelerate reaction time in sport-specific multisensory situations.
This highlights auditory processes as a promising target for training interventions in racquet sports.
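Latency and amplitude comparisons like the N1/N2 analysis above are typically based on peak measures within a component-specific search window of the averaged waveform. A minimal sketch under assumed parameters (the helper name `negative_peak`, the window bounds, and the synthetic waveform are illustrative, not taken from the study):

```python
import numpy as np

def negative_peak(erp, srate, t_min, t_max, t0=0.0):
    """Return (latency_s, amplitude) of the most negative point in [t_min, t_max].

    erp: 1-D averaged waveform (one channel).
    srate: sampling rate in Hz.
    t0: time of sample 0 relative to stimulus onset (negative for a baseline).
    """
    times = np.arange(erp.size) / srate + t0
    mask = (times >= t_min) & (times <= t_max)
    idx = np.argmin(erp[mask])  # most negative sample inside the window
    return times[mask][idx], erp[mask][idx]
```

Peak latency picked this way is what underlies statements such as "faster reactions were paralleled by a lower latency of the N1"; amplitude is simply the voltage at that peak.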


2017 ◽  
Author(s):  
Christ Devia ◽  
Rodrigo Montefusco-Siegmund ◽  
José Ignacio Egaña ◽  
Pedro E. Maldonado

Abstract Perception is the result of ongoing brain activity combined with sensory stimuli. In natural vision, changes in the visual input typically occur as the result of self-initiated eye movements. Nonetheless, in most studies, stimuli are flashed, and natural eye movements are avoided or restricted. As a consequence, the neural sensory processing associated with active vision is poorly understood. Here, we show that occipital event-related potentials (ERPs) to eye movements during free exploration of natural images exhibited different amplitudes, time courses, and motor dependencies than those from the same flashed stimuli. We found that the ERP to visual fixations doubles in P1 magnitude and does not show a late component, which is classically seen with flashed stimuli [1,2]. In addition, we discovered that the ERP to saccade onset was as large as the ERP to fixation onset, with an early component that preceded the visual input, suggesting a motor modulation associated with the saccades [3]. Furthermore, the use of different visual scenes revealed that both the ERP amplitude and time course were dependent on the type of image explored. Our results demonstrate that during active vision, the nervous system engages a mechanism of sensory modulation that is precisely timed to the self-initiated stimulus changes. This mechanism could help coordinate neural activity across different cortical areas and, by extension, serve as a general mechanism for the global coordination of neural networks.


2009 ◽  
Vol 23 (2) ◽  
pp. 63-76 ◽  
Author(s):  
Silke Paulmann ◽  
Sarah Jessen ◽  
Sonja A. Kotz

The multimodal nature of human communication has been well established. Yet few empirical studies have systematically examined the widely held belief that multimodal perception is facilitated in comparison to unimodal or bimodal perception. In the current experiment, we first explored the processing of unimodally presented facial expressions. Furthermore, auditory (prosodic and/or lexical-semantic) information was presented together with the visual information to investigate the processing of bimodal (facial and prosodic cues) and multimodal (facial, lexical, and prosodic cues) human communication. Participants engaged in an identity identification task while event-related potentials (ERPs) were recorded to examine early processing mechanisms as reflected in the P200 and N300 components. While the former component has repeatedly been linked to the processing of physical stimulus properties, the latter has been linked to more evaluative, “meaning-related” processing. A direct relationship between P200 and N300 amplitude and the number of information channels present was found. The multimodal condition elicited the smallest amplitude in the P200 and N300 components, followed by an increased amplitude in each component for the bimodal condition. The largest amplitude was observed for the unimodal condition. These data suggest that multimodal information induces clear facilitation in comparison to unimodal or bimodal information. The advantage of multimodal perception, as reflected in the P200 and N300 components, may thus reflect one of the mechanisms allowing for fast and accurate information processing in human communication.


2015 ◽  
Vol 27 (3) ◽  
pp. 492-508 ◽  
Author(s):  
Nicholas E. Myers ◽  
Lena Walther ◽  
George Wallis ◽  
Mark G. Stokes ◽  
Anna C. Nobre

Working memory (WM) is strongly influenced by attention. In visual WM tasks, recall performance can be improved by an attention-guiding cue presented before encoding (precue) or during maintenance (retrocue). Although precues and retrocues recruit a similar frontoparietal control network, the two are likely to exhibit some processing differences, because precues invite anticipation of upcoming information whereas retrocues may guide prioritization, protection, and selection of information already in mind. Here we explored the behavioral and electrophysiological differences between precueing and retrocueing in a new visual WM task designed to permit a direct comparison between cueing conditions. We found marked differences in ERP profiles between the precue and retrocue conditions. In line with precues primarily generating an anticipatory shift of attention toward the location of an upcoming item, we found a robust lateralization in late cue-evoked potentials associated with target anticipation. Retrocues elicited a different pattern of ERPs that was compatible with an early selection mechanism, but not with stimulus anticipation. In contrast to the distinct ERP patterns, alpha-band (8–14 Hz) lateralization was indistinguishable between cue types (reflecting, in both conditions, the location of the cued item). We speculate that, whereas alpha-band lateralization after a precue is likely to enable anticipatory attention, lateralization after a retrocue may instead enable the controlled spatiotopic access to recently encoded visual information.


2007 ◽  
Vol 38 (3) ◽  
pp. 168-171 ◽  
Author(s):  
Wuttichai V. Chayasirisobhon ◽  
Sirichai Chayasirisobhon ◽  
Sue Nwe Tin ◽  
Ngoc Leu ◽  
Keo Tehrani ◽  
...  

We studied scalp-recorded auditory event-related potentials (ERPs) of 30 untreated patients with new-onset temporal lobe epilepsy and 30 age- and sex-matched normal controls. This study was designed to eliminate the effects of seizure intractability and chronic use of antiepileptic drugs on P300 auditory ERPs. There were no statistically significant differences in either latency or amplitude of the P300 between the two groups. Similar methods were used to analyze the component latencies and amplitudes of ERPs of 9 patients who had hippocampal sclerosis in comparison with control subjects. There were no statistically significant differences between these two groups either. Our study therefore does not support temporal lobe sources of P300 scalp-recorded auditory ERPs. We also conclude that scalp-recorded auditory ERPs are not a useful tool for evaluating temporal lobe epilepsy.

