Neuronal Synchronization and Selective Color Processing in the Human Brain

2004 ◽  
Vol 16 (3) ◽  
pp. 503-522 ◽  
Author(s):  
Matthias M. Müller ◽  
Andreas Keil

In the present study, subjects selectively attended to the color of checkerboards in a feature-based attention paradigm. Induced gamma band responses (GBRs), the induced alpha band, and the event-related potential (ERP) were analyzed to uncover neuronal dynamics during selective feature processing. Replicating previous ERP findings, the selection negativity (SN) with a latency of about 160 msec was extracted. Furthermore, and similarly to previous EEG studies, a gamma band peak in a time window between 290 and 380 msec was found. This peak had its major energy in the 55- to 70-Hz range and was significantly larger for the attended color. Contrary to previous human induced gamma band studies, a much earlier 40- to 50-Hz peak in a time window between 160 and 220 msec after stimulus onset, and thus concurrent with the SN, was prominent, with significantly more energy for the attended as opposed to the unattended color. The induced alpha band (9.8–11.7 Hz), on the other hand, exhibited a marked suppression for the attended color in a time window between 450 and 600 msec after stimulus onset. A comparison of the time course of the 40- to 50-Hz and 55- to 70-Hz induced GBRs, the induced alpha band, and the ERP revealed temporal coincidences for changes in the morphology of these brain responses. Despite these similarities in the time domain, the cortical source configuration was found to discriminate between the induced GBRs and the SN. Our results suggest that large-scale synchronous high-frequency brain activity, as measured in the human GBR, plays a specific role in the attentive processing of stimulus features.
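The distinction between induced and evoked band responses that this analysis rests on can be sketched numerically: induced power averages single-trial time–frequency power (so non-phase-locked activity survives), whereas evoked power is computed from the trial-averaged ERP, where non-phase-locked activity cancels. A minimal sketch in Python with NumPy, assuming epoched single-channel data; the Morlet parameters and sampling rate are illustrative, not those of the study.

```python
import numpy as np

def morlet(freq, fs, n_cycles=7):
    """Complex Morlet wavelet tuned to `freq` Hz."""
    sigma_t = n_cycles / (2 * np.pi * freq)
    t = np.arange(-4 * sigma_t, 4 * sigma_t, 1 / fs)
    return np.exp(2j * np.pi * freq * t) * np.exp(-t ** 2 / (2 * sigma_t ** 2))

def induced_and_evoked_power(epochs, fs, freq):
    """epochs: (n_trials, n_times) single-channel EEG segments."""
    w = morlet(freq, fs)
    # Induced: average the power of single trials (keeps non-phase-locked energy).
    induced = np.mean(
        [np.abs(np.convolve(tr, w, mode="same")) ** 2 for tr in epochs], axis=0
    )
    # Evoked: power of the averaged ERP (only phase-locked energy survives).
    erp = epochs.mean(axis=0)
    evoked = np.abs(np.convolve(erp, w, mode="same")) ** 2
    return induced, evoked
```

With gamma bursts whose phase jitters from trial to trial, as is characteristic of induced GBRs, the induced estimate stays large while the evoked estimate collapses.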

2012 ◽  
Vol 24 (2) ◽  
pp. 521-529 ◽  
Author(s):  
Frank Oppermann ◽  
Uwe Hassler ◽  
Jörg D. Jescheniak ◽  
Thomas Gruber

The human cognitive system is highly efficient in extracting information from our visual environment. This efficiency is based on acquired knowledge that guides our attention toward relevant events and promotes the recognition of individual objects as they appear in visual scenes. The experience-based representation of such knowledge contains not only information about the individual objects but also about relations between them, such as the typical context in which individual objects co-occur. The present EEG study aimed at exploring the availability of such relational knowledge in the time course of visual scene processing, using oscillatory evoked gamma-band responses as a neural correlate for a currently activated cortical stimulus representation. Participants decided whether two simultaneously presented objects were conceptually coherent (e.g., mouse–cheese) or not (e.g., crown–mushroom). We obtained increased evoked gamma-band responses for coherent scenes compared with incoherent scenes beginning as early as 70 msec after stimulus onset within a distributed cortical network, including the right temporal, the right frontal, and the bilateral occipital cortex. This finding provides empirical evidence for the functional importance of evoked oscillatory activity in high-level vision beyond the visual cortex and, thus, gives new insights into the functional relevance of neuronal interactions. It also indicates the very early availability of experience-based knowledge that might be regarded as a fundamental mechanism for the rapid extraction of the gist of a scene.


2019 ◽  
Author(s):  
Mattson Ogg ◽  
Thomas A. Carlson ◽  
L. Robert Slevc

Human listeners are bombarded by acoustic information that the brain rapidly organizes into coherent percepts of objects and events in the environment, which aids speech and music perception. The efficiency of auditory object recognition belies the critical constraint that acoustic stimuli necessarily require time to unfold. Using magnetoencephalography (MEG), we studied the time course of the neural processes that transform dynamic acoustic information into auditory object representations. Participants listened to a diverse set of 36 tokens comprising everyday sounds from a typical human environment. Multivariate pattern analysis was used to decode the sound tokens from the MEG recordings. We show that sound tokens can be decoded from brain activity beginning 90 milliseconds after stimulus onset with peak decoding performance occurring at 155 milliseconds post stimulus onset. Decoding performance was primarily driven by differences between category representations (e.g., environmental vs. instrument sounds), although within-category decoding was better than chance. Representational similarity analysis revealed that these emerging neural representations were related to harmonic and spectrotemporal differences among the stimuli, which correspond to canonical acoustic features processed by the auditory pathway. Our findings begin to link the processing of physical sound properties with the perception of auditory objects and events in cortex.


2020 ◽  
Vol 32 (1) ◽  
pp. 111-123 ◽  
Author(s):  
Mattson Ogg ◽  
Thomas A. Carlson ◽  
L. Robert Slevc

Human listeners are bombarded by acoustic information that the brain rapidly organizes into coherent percepts of objects and events in the environment, which aids speech and music perception. The efficiency of auditory object recognition belies the critical constraint that acoustic stimuli necessarily require time to unfold. Using magnetoencephalography, we studied the time course of the neural processes that transform dynamic acoustic information into auditory object representations. Participants listened to a diverse set of 36 tokens comprising everyday sounds from a typical human environment. Multivariate pattern analysis was used to decode the sound tokens from the magnetoencephalographic recordings. We show that sound tokens can be decoded from brain activity beginning 90 msec after stimulus onset with peak decoding performance occurring at 155 msec poststimulus onset. Decoding performance was primarily driven by differences between category representations (e.g., environmental vs. instrument sounds), although within-category decoding was better than chance. Representational similarity analysis revealed that these emerging neural representations were related to harmonic and spectrotemporal differences among the stimuli, which correspond to canonical acoustic features processed by the auditory pathway. Our findings begin to link the processing of physical sound properties with the perception of auditory objects and events in cortex.
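Time-resolved multivariate pattern analysis of the kind used here fits a classifier at each poststimulus sample and traces decoding accuracy over time. A minimal sketch with scikit-learn, assuming an epochs array of shape (trials × sensors × time); the classifier and cross-validation scheme are illustrative assumptions, not necessarily those used in the study.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def decoding_timecourse(X, y, cv=5):
    """Decode class labels `y` from sensor patterns at each time sample.

    X: (n_trials, n_sensors, n_times). Returns mean CV accuracy per sample;
    the onset of above-chance accuracy estimates when information emerges.
    """
    clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
    return np.array(
        [cross_val_score(clf, X[:, :, t], y, cv=cv).mean()
         for t in range(X.shape[2])]
    )
```

Plotting the returned accuracies against time yields the kind of decoding timecourse from which onset (here, 90 msec) and peak (155 msec) latencies are read off.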


2013 ◽  
Vol 2013 ◽  
pp. 1-6 ◽  
Author(s):  
Jing Zhao ◽  
John X. Zhang ◽  
Hong-Yan Bi

The present study explored the time course of the neighborhood frequency effect at early processing stages, using event-related potentials (ERPs) to examine whether orthographic neighbors of higher frequency influence target processing, particularly at the phonological stage. Thirteen undergraduate students covertly named Chinese characters with or without higher-frequency neighbors (HFNs) while their brain activity was recorded. Results showed a significant neighborhood frequency effect on frontocentral P2 amplitude, with a reduction for naming characters with HFNs compared to those without, whereas there was no effect on posterior N1 amplitude. The presence of a neighborhood frequency effect only in the P2 component suggests a special role for HFNs in the phonological access of Chinese characters. The decrease in amplitude for naming with-HFN characters might reflect phonological interference from higher-frequency neighbors, whose pronunciations differ from those of the target characters.


2006 ◽  
Vol 18 (9) ◽  
pp. 1488-1497 ◽  
Author(s):  
James W. Tanaka ◽  
Tim Curran ◽  
Albert L. Porterfield ◽  
Daniel Collins

Electrophysiological studies using event-related potentials have demonstrated that face stimuli elicit a greater negative brain potential in right posterior recording sites 170 msec after stimulus onset (N170) relative to nonface stimuli. Results from repetition priming paradigms have shown that repeated exposures of familiar faces elicit a larger negative brainwave (N250r) at inferior temporal sites compared to repetitions of unfamiliar faces. However, less is known about the time course and learning conditions under which the N250 face representation is acquired. In the familiarization phase of the Joe/no Joe task, subjects studied a target “Joe” face (“Jane” for female subjects) and, during the course of the experiment, identified a series of sequentially presented faces as either Joe or not Joe. The critical stimulus conditions included the subject's own face, a same-sex Joe (Jane) face, and a same-sex “other” face. The main finding was that the subject's own face produced a focal negative deflection (N250) in posterior channels relative to nontarget faces. The task-relevant Joe target face was not differentiated from other nontarget faces in the first half of the experiment. However, in the second half, the Joe face produced an N250 response similar in magnitude to that of the subject's own face. These findings suggest that the N250 indexes two types of face memories: a preexperimentally familiar face representation (i.e., the “own face”) and a newly acquired face representation (i.e., the Joe/Jane face) formed during the course of the experiment.


2014 ◽  
Author(s):  
Jaime Martin del Campo ◽  
John Maltby ◽  
Giorgio Fuggetta

The present study tested the Dysexecutive Luck hypothesis by examining whether deficits in the early stage of top-down attentional control lead to increased neural activity at later stages of response-related selection among those who consider themselves unlucky. Individuals with such beliefs were compared to a control group using an event-related potential (ERP) measure of the neural activity underlying semantic inhibition while completing a Stroop test. Results showed stronger main interference effects in the former group, reflected in greater reaction times and a more negative distributed-scalp late ERP component during incongruent trials in the time window of 450–780 ms post stimulus onset. Further, less efficient maintenance of task set in the former group was associated with greater late response-related ERP activation, compensating for the lack of top-down attentional control. These findings provide electrophysiological evidence supporting the Dysexecutive Luck hypothesis.


2019 ◽  
Vol 122 (2) ◽  
pp. 539-551 ◽  
Author(s):  
David W. Sutterer ◽  
Joshua J. Foster ◽  
John T. Serences ◽  
Edward K. Vogel ◽  
Edward Awh

A hallmark of episodic memory is the phenomenon of mentally reexperiencing the details of past events, and a well-established concept is that the neuronal activity that mediates encoding is reinstated at retrieval. Evidence for reinstatement has come from multiple modalities, including functional magnetic resonance imaging and electroencephalography (EEG). These EEG studies have shed light on the time course of reinstatement but have been limited to distinguishing between a few categories. The goal of this work was to use recently developed experimental and technical approaches, namely continuous report tasks and inverted encoding models, to determine which frequencies of oscillatory brain activity support the retrieval of precise spatial memories. In experiment 1, we establish that an inverted encoding model applied to multivariate alpha topography tracks the retrieval of precise spatial memories. In experiment 2, we demonstrate that the frequencies and patterns of multivariate activity at study are similar to the frequencies and patterns observed during retrieval. These findings highlight the broad potential for using encoding models to characterize long-term memory retrieval.

NEW & NOTEWORTHY Previous EEG work has shown that category-level information observed during encoding is recapitulated during memory retrieval, but studies with this time-resolved method have not demonstrated the reinstatement of feature-specific patterns of neural activity during retrieval. Here we show that EEG alpha-band activity tracks the retrieval of spatial representations from long-term memory. Moreover, we find considerable overlap between the frequencies and patterns of activity that track spatial memories during initial study and at retrieval.
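An inverted encoding model of this kind treats each electrode's alpha power as a weighted sum of a small set of spatially tuned "channels": weights are estimated from training data and then inverted to reconstruct channel responses (and hence the remembered location) from held-out data. A minimal sketch; the basis set, dimensions, and simulated mixing are all illustrative assumptions, not the authors' exact pipeline.

```python
import numpy as np

def basis_set(n_chan, positions):
    """Predicted channel responses: graded circular spatial tuning.

    positions: trial locations in [0, 2*pi). Returns (n_chan, n_trials).
    The raised half-cosine tuning shape is an illustrative choice.
    """
    centers = np.linspace(0, 2 * np.pi, n_chan, endpoint=False)
    d = centers[:, None] - positions[None, :]
    return np.cos(d / 2) ** 8  # peaks at d = 0, falls to 0 at d = pi

def iem_train(B_train, C_train):
    """Least-squares weights W solving B = W @ C (electrodes x channels)."""
    return B_train @ np.linalg.pinv(C_train)

def iem_invert(W, B_test):
    """Reconstruct channel responses from held-out electrode data."""
    return np.linalg.pinv(W) @ B_test
```

The peak of each reconstructed channel-response profile serves as the model's estimate of the remembered location, which is what allows memory precision to be tracked continuously rather than categorically.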


Sensors ◽  
2019 ◽  
Vol 19 (1) ◽  
pp. 190 ◽  
Author(s):  
Siddharth Kohli ◽  
Alexander J. Casson

Transcranial electrical stimulation is a widely used non-invasive brain stimulation approach. To date, EEG has been used to evaluate the effect of transcranial Direct Current Stimulation (tDCS) and transcranial Alternating Current Stimulation (tACS), but most studies have been limited to exploring changes in EEG before and after stimulation due to the presence of stimulation artifacts in the EEG data. This paper presents two different algorithms for removing the gross tACS artifact from simultaneous EEG recordings. These give different trade-offs in removal performance, in the amount of data required, and in their suitability for closed-loop systems. Superposition of Moving Averages and Adaptive Filtering techniques are investigated, with significant emphasis on verification. We present head phantom testing results for controlled analysis, together with on-person EEG recordings in the time domain, frequency domain, and Event Related Potential (ERP) domain. The results show that EEG during tACS can be recovered free of large-scale stimulation artifacts. Previous studies have not quantified the performance of tACS artifact removal procedures, instead focusing on the removal of second-order artifacts such as respiration-related oscillations. We focus on the unresolved challenge of removing the first-order stimulation artifact, presented with a new multi-stage validation strategy.
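The moving-average idea can be sketched as template subtraction: because the tACS artifact repeats every stimulation cycle while brain activity does not, averaging neighboring cycles yields an artifact template that can be subtracted cycle by cycle. A minimal sketch of this principle, assuming the sampling rate is an integer multiple of the stimulation frequency; it illustrates the general approach, not the authors' exact algorithm.

```python
import numpy as np

def remove_tacs_artifact(sig, fs, f_stim, n_avg=20):
    """Subtract a moving-average artifact template, cycle by cycle.

    Assumes fs is an integer multiple of f_stim so every stimulation
    cycle spans the same number of samples.
    """
    period = int(round(fs / f_stim))
    n_cycles = len(sig) // period
    cycles = sig[:n_cycles * period].reshape(n_cycles, period)
    cleaned = np.empty_like(cycles)
    half = n_avg // 2
    for i in range(n_cycles):
        lo, hi = max(0, i - half), min(n_cycles, i + half + 1)
        # Average nearby cycles: the periodic artifact survives,
        # non-periodic EEG largely averages out.
        template = cycles[lo:hi].mean(axis=0)
        cleaned[i] = cycles[i] - template
    return cleaned.ravel()
```

A sliding window (rather than one global template) also tracks slow drifts in artifact amplitude, at the cost of attenuating any brain activity locked to the stimulation frequency.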


2006 ◽  
Vol 18 (12) ◽  
pp. 2108-2129 ◽  
Author(s):  
Gilles Pourtois ◽  
Michael De Pretto ◽  
Claude-Alain Hauert ◽  
Patrik Vuilleumier

People often remain “blind” to visual changes occurring during a brief interruption of the display. The processing stages responsible for such failure remain unresolved. We used event-related potentials to determine the time course of brain activity during conscious change detection versus change blindness. Participants saw two successive visual displays, each with two faces, and reported whether one of the faces changed between the first and second displays. Relative to blindness, change detection was associated with a distinct pattern of neural activity at several successive processing stages, including an enhanced occipital P1 response and a sustained frontal activity (CNV-like potential) after the first display, before the change itself. The amplitudes of the N170 and P3 responses after the second visual display were also modulated by awareness of the face change. Furthermore, a unique topography of event-related potential activity was observed during correct change and correct no-change reports, but not during blindness, with a recurrent time course in the stimulus sequence and simultaneous sources in the parietal and temporo-occipital cortex. These results indicate that awareness of visual changes may depend on the attentional state subserved by coordinated neural activity in a distributed network, before the onset of the change itself.


2020 ◽  
Author(s):  
Clara Cámara ◽  
Cristina de la Malla ◽  
Josep Marco-Pallarés ◽  
Joan López-Moliner

Every time we use our smartphone, tablet, or other electronic devices, we are exposed to temporal delays between our actions and the sensory feedback. We can compensate for such delays by adjusting our motor commands, and doing so likely requires establishing new temporal mappings between motor areas and sensory predictions. However, little is known about the neural underpinnings that support building new temporal correspondences between different brain areas. Here we address the possibility that communication through coherence, which is thought to support interareal neural communication, underlies the neural processes by which humans cope with additional delays between motor and sensory areas. We recorded EEG activity while participants intercepted moving targets, seeing a cursor that followed their hand with a delay rather than the hand itself. Participants adjusted their movements to the delayed visual feedback and intercepted the target with the cursor. The EEG data show a significant increase in coherence in the beta and gamma bands between visual and motor areas during the ongoing hand movement toward interception. However, across participants, only the increase in the gamma band correlated with the level of temporal adaptation. We describe the time course of the coherence using coupled oscillators, showing that the times at which high coherence is achieved fall within ranges useful for solving the task. Altogether, these results demonstrate the functional relevance of brain coherence in a complex task where adapting to new delays is crucial.

AUTHOR SUMMARY: Humans are often exposed to delays between their actions and the resulting sensory feedback. While there have been advances in understanding the conditions under which temporal adaptation can occur, little is known about the neural mechanisms enabling it. In the present study we measured brain activity (EEG) to investigate whether communication through coherence between motor and sensory areas might be responsible for one's ability to cope with externally imposed delays in an interception task. We show evidence that neural coherence in the gamma band between visual and motor areas is related to the degree of adaptation to temporal delays.
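Interareal coupling of the kind reported here is commonly quantified with magnitude-squared coherence between channel pairs within a frequency band. A minimal sketch using SciPy, assuming two single-channel signals; the band limits and window length are illustrative, and a real analysis would use source-level signals and trial-based estimators.

```python
import numpy as np
from scipy.signal import coherence

def band_coherence(x, y, fs, band, nperseg=None):
    """Mean magnitude-squared coherence between x and y within `band` (Hz)."""
    f, cxy = coherence(x, y, fs=fs, nperseg=nperseg or fs)
    mask = (f >= band[0]) & (f <= band[1])
    return cxy[mask].mean()
```

Two signals sharing a common oscillatory drive show high coherence at that frequency even when their independent noise dominates elsewhere, which is why band-limited coherence can index selective visuomotor coupling.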

