Intermodal Auditory, Visual, and Tactile Attention Modulates Early Stages of Neural Processing

2009 · Vol 21 (4) · pp. 669-683
Author(s): Christina M. Karns, Robert T. Knight

We used event-related potentials (ERPs) and gamma band oscillatory responses (GBRs) to examine whether intermodal attention operates early in the auditory, visual, and tactile modalities. To control for the effects of spatial attention, we spatially coregistered all stimuli and varied the attended modality across counterbalanced blocks in an intermodal selection task. In each block, participants selectively responded to either auditory, visual, or vibrotactile stimuli from the stream of intermodal events. Auditory and visual ERPs were modulated at the latencies of early cortical processing, whereas attention effects manifested later for tactile ERPs. For ERPs, auditory processing was modulated at the latency of the Na (29 msec), which indexes early cortical or thalamocortical processing, and at the subsequent P1 (90 msec) component. Visual processing was modulated at the latency of the early phase of the C1 (62–72 msec), thought to be generated in the primary visual cortex, and at the subsequent P1 and N1 (176 msec). Tactile processing was modulated at the latency of the N160 (165 msec), likely generated in the secondary association cortex. Intermodal attention enhanced early sensory GBRs for all three modalities: auditory (onset 57 msec), visual (onset 47 msec), and tactile (onset 27 msec). Together, these results suggest that intermodal attention enhances neural processing relatively early in the sensory stream, independently of the differential effects of spatial and intramodal selective attention.
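The GBR onsets above refer to stimulus-evoked increases in gamma-band power. As a rough illustration of how such an onset latency can be estimated (not the authors' pipeline), the Python sketch below band-pass filters single trials, takes the Hilbert amplitude envelope, and finds the first post-stimulus sample exceeding a baseline-derived threshold. The sampling rate, gamma band, array shapes, and threshold rule are assumptions made for the example.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 1000                                  # sampling rate in Hz (assumed)
trials = np.random.randn(100, 1000)        # placeholder: 100 trials x 1000 samples per epoch
t = np.arange(trials.shape[1]) / fs - 0.2  # epoch running from -200 ms to +800 ms

# Band-pass in an assumed low-gamma range (30-60 Hz)
b, a = butter(4, [30, 60], btype="bandpass", fs=fs)
gamma = filtfilt(b, a, trials, axis=1)

# Single-trial amplitude envelope, averaged across trials
envelope = np.abs(hilbert(gamma, axis=1)).mean(axis=0)

# Onset = first post-stimulus sample exceeding baseline mean + 2 SD (a simple heuristic)
baseline = envelope[t < 0]
threshold = baseline.mean() + 2 * baseline.std()
crossed = (t >= 0) & (envelope > threshold)
onset_ms = 1000 * t[crossed][0] if crossed.any() else None
print("estimated GBR onset (ms):", onset_ms)
```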

2021
Author(s): Wei Dou, Audrey Morrow, Luca Iemi, Jason Samaha

The neural generators of alpha-band (8-13 Hz) activity have been characterized across many different animal experiments. However, the functional role that alpha oscillations play in perception and behavior has largely been attributed to two contrasting hypotheses, with human evidence in favor of either (or both, or neither) remaining sparse. On the one hand, alpha generators have been observed in relay sectors of the visual thalamus and are postulated to phasically inhibit afferent visual input in a feedforward manner [1-4]. On the other hand, evidence also suggests that the influence of alpha activity propagates backwards along the visual hierarchy, reflecting a feedback influence upon the visual cortex [5-9]. The primary source of human evidence regarding the role of alpha phase in visual processing has come from perceptual reports [10-16], which could be modulated either by feedforward or feedback alpha activity. Thus, although these two hypotheses are not mutually exclusive, human evidence clearly supporting either one is lacking. Here, we present human subjects with large, high-contrast visual stimuli that elicit robust C1 event-related potentials (ERPs), which peak between 70 and 80 milliseconds post-stimulus and are thought to reflect afferent input to primary visual cortex (V1) [17-20]. We find that the phase of ongoing alpha oscillations modulates the global field power (GFP) of the EEG during this first volley of stimulus processing (the C1 time window). On the standard assumption [21-23] that this early activity reflects postsynaptic potentials being relayed to visual cortex from the thalamus, our results suggest that alpha phase gates visual responses during the first feedforward sweep of processing.
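As a loose illustration of this kind of analysis (not the authors' code), the sketch below computes global field power as the across-channel standard deviation, averages it in an assumed 70-80 ms C1 window, and bins trials by the pre-stimulus alpha phase estimated at one assumed posterior channel. All shapes, windows, and channel indices are placeholders.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 500
epochs = np.random.randn(200, 64, 500)      # trials x channels x samples (placeholder)
t = np.arange(epochs.shape[2]) / fs - 0.5   # -500 ms to +500 ms

# Global field power = standard deviation across channels at each time point
gfp = epochs.std(axis=1)                    # trials x samples

# Mean GFP in the assumed C1 window (70-80 ms post-stimulus)
c1 = (t >= 0.070) & (t <= 0.080)
c1_gfp = gfp[:, c1].mean(axis=1)

# Alpha (8-13 Hz) phase at stimulus onset, taken from one assumed posterior channel
b, a = butter(4, [8, 13], btype="bandpass", fs=fs)
alpha = filtfilt(b, a, epochs, axis=2)
onset_idx = np.argmin(np.abs(t))
phase = np.angle(hilbert(alpha, axis=2))[:, 30, onset_idx]   # channel 30 is hypothetical

# Sort trials into eight phase bins and compare C1-window GFP across bins
edges = np.linspace(-np.pi, np.pi, 9)
bins = np.digitize(phase, edges) - 1
print([c1_gfp[bins == k].mean() for k in range(8)])
```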


2019
Author(s): Stefania Ferraro, Markus J. Van Ackeren, Roberto Mai, Laura Tassi, Francesco Cardinale, ...

Unequivocally demonstrating the presence of multisensory signals at the earliest stages of cortical processing remains challenging in humans. In our study, we relied on the unique spatio-temporal resolution provided by intracranial stereotactic electroencephalographic (SEEG) recordings in patients with drug-resistant epilepsy to characterize the signal extracted from early visual (calcarine and pericalcarine) and auditory (Heschl’s gyrus and planum temporale) regions during a simple audio-visual oddball task. We provide evidence that both cross-modal responses (visual responses in auditory cortex, or the reverse) and multisensory processing (alteration of the unimodal responses during bimodal stimulation) can be observed in intracranial event-related potentials (iERPs) and in power modulations of oscillatory activity at different temporal scales within the first 150 ms after stimulus onset. The temporal profiles of the iERPs are compatible with the hypothesis that multisensory integration (MSI) occurs by means of direct pathways linking early visual and auditory regions. Our data indicate, moreover, that MSI mainly relies on modulations of the low-frequency bands (foremost the theta band in the auditory cortex and the alpha band in the visual cortex), suggesting the involvement of feedback pathways between the two sensory regions. Remarkably, we also observed high-gamma power modulations by sounds in the early visual cortex, suggesting the presence of neuronal populations involved in auditory processing in the calcarine and pericalcarine region in humans.
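To make the oscillatory side of this analysis concrete, here is a heavily simplified Python sketch (using MNE-Python's Morlet wavelet transform, which the authors may or may not have used) that contrasts theta, alpha, and high-gamma power between unimodal and bimodal trials in a 0-150 ms window. Contact counts, sampling rate, band edges, and wavelet settings are all illustrative assumptions.

```python
import numpy as np
from mne.time_frequency import tfr_array_morlet

fs = 1000
uni = np.random.randn(80, 10, 1200)   # trials x SEEG contacts x samples (placeholder)
bi = np.random.randn(80, 10, 1200)
t = np.arange(1200) / fs - 0.3        # -300 ms to +900 ms

bands = {"theta": np.arange(4, 8),
         "alpha": np.arange(8, 13),
         "high_gamma": np.arange(70, 150, 5)}

def band_power(epochs, freqs):
    # Morlet-wavelet power averaged over trials, frequencies, and the 0-150 ms window
    power = tfr_array_morlet(epochs, sfreq=fs, freqs=freqs,
                             n_cycles=freqs / 2.0, output="power")
    win = (t >= 0.0) & (t <= 0.150)
    return power[..., win].mean(axis=(0, 2, 3))   # one value per contact

for name, freqs in bands.items():
    print(name, band_power(bi, freqs) - band_power(uni, freqs))
```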


2010 · Vol 1 (2)
Author(s): Joshua Baruth, Manuel Casanova, Lonnie Sears, Estate Sokhadze

It has been reported that individuals with autism spectrum disorder (ASD) have abnormal responses to the sensory environment. For these individuals, sensory overload can impair functioning, raise physiological stress, and adversely affect social interaction. Early-stage (i.e., within 200 ms of stimulus onset) auditory processing abnormalities have been widely examined in ASD using event-related potentials (ERPs), whereas ERP studies investigating early-stage visual processing in ASD are less frequent. We tested the hypothesis of early-stage visual processing abnormalities in ASD by investigating ERPs elicited in a visual oddball task using illusory figures. Our results indicate that, compared to the control group, individuals with ASD have abnormally large cortical responses to task-irrelevant stimuli over both parieto-occipital and frontal regions of interest (ROIs) during early stages of visual processing. Furthermore, the ASD group showed signs of an overall disruption in stimulus discrimination and had a significantly higher rate of motor response errors.
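As a sketch of how such ROI comparisons are typically quantified (the channel groupings, time window, and shapes below are hypothetical, not the study's montage), one can average the ERP within each ROI and early time window per subject and compare groups:

```python
import numpy as np

fs = 500
asd = np.random.randn(20, 64, 400)   # ASD group: subjects x channels x samples (placeholder)
ctl = np.random.randn(20, 64, 400)   # control group
t = np.arange(400) / fs - 0.2

rois = {"parieto_occipital": [55, 56, 57, 60, 61, 62],   # hypothetical channel indices
        "frontal": [2, 3, 4, 8, 9, 10]}
early = (t >= 0.0) & (t <= 0.2)       # "early-stage" window (< 200 ms)

for name, chans in rois.items():
    a = asd[:, chans][:, :, early].mean(axis=(1, 2))   # one value per ASD subject
    c = ctl[:, chans][:, :, early].mean(axis=(1, 2))
    print(name, "group difference:", a.mean() - c.mean())
```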


2018 · Vol 35 (3) · pp. 315-331
Author(s): Paula Virtala, Minna Huotilainen, Esa Lilja, Juha Ojala, Mari Tervaniemi

Guitar distortion used in rock music modifies a chord so that new frequencies appear in its harmonic structure. A distorted dyad (power chord) has a special role in heavy metal music because its harmonics create a major third interval, making it sound similar to a major chord. We investigated how distortion affects cortical auditory processing of chords in musicians and nonmusicians. Electric guitar chords with or without distortion and with or without the interval of the major third (i.e., triads or dyads) were presented in an oddball design in which one of them served as a repeating standard stimulus and the others served as occasional deviants. This enabled the recording of event-related potentials (ERPs) of the electroencephalogram (EEG) related to deviance processing (the mismatch negativity MMN and the attention-related P3a component) in an ignore condition. MMN and P3a responses were elicited in most paradigms. Distorted chords in a nondistorted context elicited only early P3a responses. However, the power chord did not demonstrate a special role at the level of the ERPs. Earlier and larger MMN and P3a responses were elicited when distortion was modified than when only harmony (triad vs. dyad) was modified between standards and deviants. The MMN responses were largest when distortion and harmony deviated simultaneously. Musicians demonstrated larger P3a responses than nonmusicians. The results suggest mostly independent cortical auditory processing of distortion and harmony in Western individuals, and facilitated chord-change processing in musicians compared to nonmusicians. While distortion has been used in heavy rock music for decades, this study is among the first to shed light on its cortical basis.
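For readers unfamiliar with how MMN and P3a are extracted from such an oddball design, the sketch below computes a deviant-minus-standard difference wave at one assumed fronto-central channel and reads off peak amplitudes and latencies in conventional windows; the channel index, windows, and shapes are assumptions, not the study's parameters.

```python
import numpy as np

fs = 500
standard = np.random.randn(300, 32, 350)   # trials x channels x samples (placeholder)
deviant = np.random.randn(60, 32, 350)
t = np.arange(350) / fs - 0.1
fcz = 5                                    # hypothetical index of a fronto-central channel

# Difference wave: average deviant ERP minus average standard ERP at one channel
diff = deviant.mean(axis=0)[fcz] - standard.mean(axis=0)[fcz]

# MMN: most negative point in ~100-250 ms; P3a: most positive point in ~250-400 ms
mmn_win = (t >= 0.10) & (t <= 0.25)
p3a_win = (t >= 0.25) & (t <= 0.40)
mmn_lat = t[mmn_win][np.argmin(diff[mmn_win])]
p3a_lat = t[p3a_win][np.argmax(diff[p3a_win])]
print(f"MMN {diff[mmn_win].min():.2f} uV at {1000*mmn_lat:.0f} ms; "
      f"P3a {diff[p3a_win].max():.2f} uV at {1000*p3a_lat:.0f} ms")
```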


Author(s): Luodi Yu, Jiajing Zeng, Suiping Wang, Yang Zhang

Purpose: This study aimed to examine whether abstract knowledge of word-level linguistic prosody is independent of or integrated with phonetic knowledge. Method: Event-related potential (ERP) responses were measured from 18 adult listeners while they listened to native and nonnative word-level prosody in speech and in nonspeech. The prosodic phonology (speech) conditions included disyllabic pseudowords spoken in Chinese and in English matched for syllabic structure, duration, and intensity. The prosodic acoustic (nonspeech) conditions were hummed versions of the speech stimuli, which eliminated the phonetic content while preserving the acoustic prosodic features. Results: We observed a language-specific effect on the ERP: native stimuli elicited larger late negative response (LNR) amplitudes than nonnative stimuli in the prosodic phonology conditions. No such effect was observed in the phoneme-free prosodic acoustic control conditions. Conclusions: The results support the integration view that word-level linguistic prosody likely relies on the phonetic content in which the acoustic cues are embedded. It remains to be examined whether the LNR may serve as a neural signature for language-specific processing of prosodic phonology beyond auditory processing of the critical acoustic cues at the suprasyllabic level.
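A minimal sketch of the kind of amplitude contrast reported for the LNR (the time window, channel, and shapes are assumptions, not the study's settings): average the single-channel ERP in a late window per listener and run a paired comparison between native and nonnative conditions.

```python
import numpy as np
from scipy.stats import ttest_rel

fs = 250
native = np.random.randn(18, 300)      # listeners x samples at one channel (placeholder)
nonnative = np.random.randn(18, 300)
t = np.arange(300) / fs - 0.2

lnr = (t >= 0.40) & (t <= 0.70)        # assumed late time window
stat, p = ttest_rel(native[:, lnr].mean(axis=1), nonnative[:, lnr].mean(axis=1))
print("paired t =", stat, "p =", p)
```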


Author(s): Wessam Mostafa Essawy

Background: Amblyaudia is a weakness in the listener's binaural processing of auditory information. Subjects with amblyaudia also demonstrate binaural integration deficits and may display similar patterns in their evoked responses in terms of latency and amplitude. The purpose of this study was to identify the presence of amblyaudia in a population of young children and to measure mismatch negativity (MMN), P300, and cortical auditory evoked potentials (CAEPs) in those individuals. Methods: Subjects were divided into two groups: a control group of 20 normal-hearing subjects with normal developmental milestones and normal speech development, and a study group (GII) of 50 subjects with central auditory processing disorders (CAPDs) diagnosed by central auditory screening tests. Results: Using dichotic tests, including the dichotic digits test (DDT) and the competing sentence test (CST), the cases were classified as normal, dichotic dysaudia, amblyaudia, or amblyaudia plus (40%, 14%, 38%, and 8%, respectively). Using event-related potentials, we found that P300 and MMN are more specific in detecting neurocognitive dysfunction related to the allocation of attentional resources and immediate memory in these cases. Conclusions: Amblyaudia is present in cases of central auditory processing disorders (CAPDs), and event-related potentials provide an objective tool for diagnosis, prognosis, and follow-up after rehabilitation.


2019 · Vol 9 (12) · pp. 362
Author(s): Antonia M. Karellas, Paul Yielder, James J. Burkitt, Heather S. McCracken, Bernadette A. Murphy

Multisensory integration (MSI) is necessary for the efficient execution of many everyday tasks. Alterations in sensorimotor integration (SMI) have been observed in individuals with subclinical neck pain (SCNP). Altered audiovisual MSI has previously been demonstrated in this population using performance measures, such as reaction time. However, neurophysiological techniques have not been combined with performance measures in the SCNP population to determine differences in neural processing that may contribute to these behavioral characteristics. Electroencephalography (EEG) event-related potentials (ERPs) have been successfully used in recent MSI studies to show differences in neural processing between different clinical populations. This study combined behavioral and ERP measures to characterize MSI differences between healthy and SCNP groups. EEG was recorded as 24 participants performed 8 blocks of a simple reaction time (RT) MSI task, with each block consisting of 34 auditory (A), visual (V), and audiovisual (AV) trials. Participants responded to the stimuli by pressing a response key. Both groups responded fastest to the AV condition. The healthy group demonstrated significantly faster RTs for the AV and V conditions. There were significant group differences in neural activity from 100–140 ms post-stimulus onset, with the control group demonstrating greater MSI. Differences in brain activity and RT between individuals with SCNP and a control group indicate neurophysiological alterations in how individuals with SCNP process audiovisual stimuli. This suggests that SCNP alters MSI. This study presents novel EEG findings that demonstrate MSI differences in a group of individuals with SCNP.
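As an illustration of the two measures combined here, the following sketch computes mean reaction times per condition and a group difference in mean ERP amplitude over the 100-140 ms window; all values and array shapes are placeholders rather than study data.

```python
import numpy as np

# Placeholder reaction times (seconds) for the three stimulus conditions
rt = {"A": np.random.uniform(0.25, 0.45, 34 * 8),
      "V": np.random.uniform(0.25, 0.45, 34 * 8),
      "AV": np.random.uniform(0.20, 0.40, 34 * 8)}
for cond, samples in rt.items():
    print(cond, f"{1000 * samples.mean():.0f} ms")

# Placeholder ERP data: subjects x channels x samples per group
fs = 500
healthy = np.random.randn(12, 64, 300)
scnp = np.random.randn(12, 64, 300)
t = np.arange(300) / fs - 0.1
win = (t >= 0.100) & (t <= 0.140)
print("group ERP difference (100-140 ms):", healthy[..., win].mean() - scnp[..., win].mean())
```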


2020 · Vol 14
Author(s): Luiza Kirasirova, Vladimir Bulanov, Alexei Ossadtchi, Alexander Kolsanov, Vasily Pyatin, ...

A P300 brain-computer interface (BCI) is a paradigm in which text characters are decoded from event-related potentials (ERPs). In a popular implementation, called the P300 speller, a subject looks at a display where characters are flashing and selects one character by attending to it. The selection is recognized as the item that evokes the strongest ERP. The speller performs well when cortical responses to target and non-target stimuli are sufficiently different. Although many strategies have been proposed for improving BCI spelling, a relatively simple one has received insufficient attention in the literature: reducing the visual field to diminish the contribution from non-target stimuli. Previously, this idea was implemented in a single-stimulus switch that issued an urgent command, such as stopping a robot. To explore this approach further, we ran a pilot experiment in which ten subjects operated a traditional P300 speller either with unrestricted vision or while wearing a binocular aperture that confined their sight to the central visual field. As intended, visual field restriction resulted in a replacement of non-target ERPs with EEG rhythms asynchronous to the stimulus periodicity. Changes in target ERPs were found in half of the subjects and were individually variable. While classification accuracy was slightly better for the aperture condition (84.3 ± 2.9%, mean ± standard error) than for the no-aperture condition (81.0 ± 2.6%), this difference was not statistically significant for the entire sample of subjects (N = 10). For both the aperture and no-aperture conditions, classification accuracy improved over 4 days of training, more so for the aperture condition (from 72.0 ± 6.3% to 87.0 ± 3.9% and from 72.0 ± 5.6% to 97.0 ± 2.2% for the no-aperture and aperture conditions, respectively). Although BCI performance was not substantially altered in this study, we suggest that with further refinement this approach could speed up BCI operation and reduce user fatigue. Additionally, instead of wearing an aperture, non-targets could be removed algorithmically or with a hybrid interface that utilizes an eye tracker. We further discuss how a P300 speller could be improved by taking advantage of the different physiological properties of central and peripheral vision. Finally, we suggest that the proposed experimental approach could be used in basic research on the mechanisms of visual processing.
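The classification accuracies above come from distinguishing target from non-target ERPs; a common way to do this in P300 spellers is a shrinkage-regularized linear discriminant on flattened epochs. The sketch below (shapes, labels, and settings are assumptions, not the authors' classifier) shows the idea with scikit-learn:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

# Placeholder epochs: flashes x channels x samples, with binary target labels
n_trials, n_channels, n_samples = 600, 8, 150
X = np.random.randn(n_trials, n_channels, n_samples)
y = np.random.randint(0, 2, n_trials)    # 1 = target flash, 0 = non-target

# Regularized LDA on flattened epochs, scored by 5-fold cross-validation
clf = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto")
scores = cross_val_score(clf, X.reshape(n_trials, -1), y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")
```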


2020 · Vol 25 (5) · pp. 237-248
Author(s): Maojin Liang, Jiahao Liu, Yuexin Cai, Fei Zhao, Suijun Chen, ...

Objective: The present study investigated the characteristics of visual processing in the auditory-associated cortex in adults with hearing loss using event-related potentials. Methods: Ten subjects with bilateral postlingual hearing loss were recruited, along with ten age- and sex-matched normal-hearing controls. Visual evoked potentials to “sound” and “non-sound” photographs were recorded. The P170 response in the occipital area as well as the N1 and N2 responses at FC3 and FC4 were analyzed. Results: Adults with hearing loss had higher P170 amplitudes, significantly higher N2 amplitudes, and shorter N2 latencies in response to “sound” and “non-sound” photo stimuli at both FC3 and FC4, with the exception of the N2 amplitude in response to “sound” photo stimuli at FC3. Topographic mapping analysis further revealed that patients showed a large difference between responses to “sound” and “non-sound” photos over the right frontotemporal area, starting from approximately 200 to 400 ms. Source localization placed this difference in the middle frontal gyrus (BA10) at around 266 ms. Conclusions: The significantly stronger responses to visual stimuli indicate enhanced visual processing in the auditory-associated cortex in adults with hearing loss, which may be attributed to cortical visual reorganization involving the right frontotemporal cortex.
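A rough sketch of how such component measures can be extracted from condition-averaged ERPs (channel indices and windows below are hypothetical, not the study's montage): take the maximum in a P170 window at an occipital site and the minimum and its latency in an N2 window at fronto-central sites.

```python
import numpy as np

fs = 500
evoked = np.random.randn(64, 400)      # channels x samples, one condition average (placeholder)
t = np.arange(400) / fs - 0.1
oz, fc3, fc4 = 30, 5, 40               # hypothetical channel indices

p170_win = (t >= 0.14) & (t <= 0.20)   # assumed P170 window
n2_win = (t >= 0.20) & (t <= 0.35)     # assumed N2 window

p170_amp = evoked[oz, p170_win].max()
n2_amp = evoked[[fc3, fc4]][:, n2_win].min(axis=1)
n2_lat = t[n2_win][np.argmin(evoked[[fc3, fc4]][:, n2_win], axis=1)]
print("P170 amplitude:", p170_amp)
print("N2 amplitude (FC3, FC4):", n2_amp, "latency (ms):", 1000 * n2_lat)
```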

