Frontal Brain Activity during Episodic and Semantic Retrieval: Insights from Event-Related Potentials

1999 ◽  
Vol 11 (6) ◽  
pp. 598-609 ◽  
Author(s):  
Charan Ranganath ◽  
Ken A. Paller

Previous neuropsychological and neuroimaging results have implicated the prefrontal cortex in memory retrieval, although its precise role is unclear. In the present study, we examined patterns of brain electrical activity during retrieval of episodic and semantic memories. In the episodic retrieval task, participants retrieved autobiographical memories in response to event cues. In the semantic retrieval task, participants generated exemplars in response to category cues. Novel sounds presented intermittently during memory retrieval elicited a series of brain potentials including one identifiable as the P3a potential. Based on prior research linking P3a with novelty detection and with the frontal lobes, we predicted that P3a would be reduced to the extent that novelty detection and memory retrieval interfere with each other. Results during episodic and semantic retrieval tasks were compared to results during a task in which subjects attended to the auditory stimuli. P3a amplitudes were reduced during episodic retrieval, particularly at right lateral frontal scalp locations. A similar but less lateralized pattern of frontal P3a reduction was observed during semantic retrieval. These findings support the notion that the right prefrontal cortex is engaged in the service of memory retrieval, particularly for episodic memories.
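ERP measures like the P3a amplitudes analysed above rest on time-locked signal averaging across trials. The following is a minimal synthetic sketch of that step (all amplitudes, latencies, noise levels, and trial counts are illustrative, not data from the study): averaging epochs aligned to stimulus onset attenuates uncorrelated noise by roughly 1/sqrt(N) and leaves the event-locked component visible.

```python
import numpy as np

rng = np.random.default_rng(1)
fs = 250                          # sampling rate (Hz), illustrative

# Simulate 40 single-trial epochs, 0-600 ms post-stimulus, each containing
# a small positive deflection around 300 ms (a P3a-like component) buried
# in noise. All numbers here are illustrative, not the study's data.
t = np.arange(0, 0.6, 1 / fs)
component = 2.0 * np.exp(-((t - 0.3) ** 2) / (2 * 0.03 ** 2))
epochs = component + rng.normal(0, 2.0, (40, t.size))

# Averaging time-locked epochs cancels noise (~1/sqrt(N)) and leaves the ERP.
erp = epochs.mean(axis=0)
peak_latency_ms = 1000 * t[np.argmax(erp)]
print(f"averaged peak near {peak_latency_ms:.0f} ms")
```

In a real analysis the same averaging is done per condition and per electrode, and component amplitudes are then compared across conditions, as in the episodic vs. semantic contrast above.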

Author(s):  
Hisashi Toyoshima ◽  
Takahiro Yamanoi ◽  
Toshimasa Yamazaki ◽  
Shin-ichi Ohnishi ◽  
...  

Nineteen-channel event-related potentials (ERPs) were recorded during recognition of hiragana (one type of Japanese phonetic character). Using a field-sequential stereoscopic 3D display with a liquid-crystal shutter, a word and a nonword were presented simultaneously and independently to opposite eyes (word to the left eye and nonword to the right, or vice versa). Each word consisted of three hiragana characters. Three subjects were instructed to press a button when they understood the meaning of the visual stimuli, after 3,000 ms poststimulus. Equivalent current dipole source localization (ECDL) with three unconstrained ECDs was applied to the ERPs. For the right-handed subjects, ECDs were localized to Wernicke's area at around 600 ms; for the left-handed subject, the ECD was localized to Wernicke's homologue. ECDs were then localized to the prefrontal area, the superior frontal gyrus, and the middle frontal gyrus. At around 800 ms, ECDs were localized to Broca's area, and thereafter relocalized to Wernicke's area and to Broca's area.


1979 ◽  
Vol 47 (4) ◽  
pp. 450-459 ◽  
Author(s):  
Judith M Ford ◽  
Walton T Roth ◽  
Richard C Mohs ◽  
William F Hopkins ◽  
Bert S Kopell

Author(s):  
Takahiro Yamanoi ◽  
Hisashi Toyoshima ◽  
Toshimasa Yamazaki ◽  
Shin-ichi Ohnishi ◽  
...  

In order to develop a brain-machine interface, the authors investigated brain activity during human recognition of characters and symbols with directional meaning. They recorded electroencephalograms (EEGs) from subjects viewing four types of Kanji (Chinese characters currently used in the Japanese language) and arrows presented on a CRT. Each character or symbol denoted one of four directions: upward, downward, leftward, or rightward. Subjects were asked to read the characters or symbols silently. EEGs were averaged for each stimulus type and direction to obtain event-related potentials (ERPs). The equivalent current dipole source localization (ECDL) method was applied to these ERPs. In both cases, equivalent current dipoles (ECDs) were localized to areas related to working memory for spatial perception, such as the right upper or right middle frontal areas. On the basis of these findings, the authors examined single-trial EEGs in detail after 400 ms latency and determined effective sampling latencies for discriminant analysis of the four arrow types: ↑, ↓, ←, and →. EEG data were sampled at latencies from 400 ms to 900 ms at 25 ms intervals from three channels over the right upper and right middle frontal gyri. Discriminant analysis with the four directions as objective variates yielded discrimination rates above 80%. By transmitting one of four infrared codes from a PC according to the discrimination results, the authors controlled a micro robot, the e-puck, with four commands: move forward, rotate clockwise, rotate counterclockwise, and stop.
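The single-trial classification step described above can be sketched as follows. The feature layout (3 channels sampled at 21 latencies from 400 to 900 ms at 25 ms steps) follows the abstract; everything else is illustrative, and a nearest-class-centroid classifier stands in for the paper's discriminant analysis on purely synthetic features:

```python
import numpy as np

rng = np.random.default_rng(0)

# Feature layout from the study: 3 frontal channels, latencies 400-900 ms
# sampled every 25 ms -> 21 time points -> 63 features per trial.
n_channels, n_latencies = 3, 21
n_features = n_channels * n_latencies
classes = ["up", "down", "left", "right"]

# Synthetic training data: one mean ERP pattern per arrow direction
# plus trial-to-trial noise (all values illustrative, not real EEG).
means = {c: rng.normal(0, 1, n_features) for c in classes}

def make_trials(c, n):
    return means[c] + rng.normal(0, 0.3, (n, n_features))

train = {c: make_trials(c, 20) for c in classes}
centroids = {c: train[c].mean(axis=0) for c in classes}

def classify(trial):
    """Nearest-class-centroid stand-in for the paper's discriminant analysis."""
    return min(classes, key=lambda c: np.linalg.norm(trial - centroids[c]))

# Evaluate on fresh synthetic trials (10 per direction).
test_trials = [(c, trial) for c in classes for trial in make_trials(c, 10)]
acc = np.mean([classify(trial) == c for c, trial in test_trials])
print(f"classification accuracy: {acc:.0%}")
```

The predicted class label could then be mapped to one of the four robot commands, in the same way the paper maps discrimination results to infrared codes for the e-puck.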


2012 ◽  
Vol 108 (11) ◽  
pp. 3068-3072 ◽  
Author(s):  
Jean Decety ◽  
Stephanie Cacioppo

Neuroscience research indicates that moral reasoning is underpinned by distinct neural networks including the posterior superior temporal sulcus (pSTS), amygdala, and ventromedial prefrontal cortex, which support communication between computational systems underlying affective states, cognitions, and motivational processes. To characterize real-time neural processing underpinning moral computations, high-density event-related potentials were measured in participants while they viewed short, morally laden visual scenarios depicting intentional and accidental harmful actions. Current source density maxima in the right pSTS as fast as 62 ms poststimulus first distinguished intentional vs. accidental actions. Responses in the amygdala/temporal pole (122 ms) and ventromedial prefrontal cortex (182 ms) were then evoked by the perception of harmful actions, indicative of fast information processing associated with early stages of moral cognition. Our data strongly support the notion that intentionality is the first input to moral computations. They also demonstrate that emotion acts as a gain antecedent to moral judgment by alerting the individual to the moral salience of a situation and provide evidence for the pervasive role of affect in moral sensitivity and reasoning.


2007 ◽  
Vol 105 (2) ◽  
pp. 587-608 ◽  
Author(s):  
Metehan Çiçek ◽  
Erhan Nalçacı ◽  
Canan Kalaycioğlu

The aim of this study was to investigate the dynamic nature of cortical visuospatial attention processes during the line bisection test, which is sensitive to perceptual asymmetries. EEGs of 26 normal volunteers were recorded during the administration of a computerized line bisection test, which requires participants to mark the midpoint of lines using a mouse. Two event-related potentials subsequent and time-locked to the line presentations, namely P300 and a positive slow wave, were obtained. Findings suggested that both potentials were related to test performance and that the right hemisphere was more active. Analysis suggested a right parietotemporal and superior parietal locus for the P300 and right prefrontal activity for the positive slow wave. A dynamic asymmetrical pattern of activity was identified, such that after primary visual perception, spatial processing is initiated in the right parietotemporal cortex and then proceeds to the right prefrontal cortex.


2011 ◽  
Vol 301-303 ◽  
pp. 834-839 ◽  
Author(s):  
Du Hong Peng ◽  
Jian Chen ◽  
Hong Tao Wei

Strategy use is a mental operation aimed directly at its target. To investigate brain activity during the use of mental arithmetic strategies, subtraction answer-discrimination tasks were administered to 8 healthy subjects using the event-related potentials (ERPs) technique. The results showed: (1) the principal ERP components were N1, P1, P2, P300, and N400; (2) at the F3 and C3 sites, stage 4 elicited a more negative wave than stage 1, while at the F4, C4, P3, and P4 sites, stage 4 elicited a more positive wave than stage 1; (3) the activated brain areas mainly included the temporal lobe, prefrontal cortex, and parieto-occipital cortex; (4) as the subjects gradually became skilled in using the strategies, the activated brain regions moved from posterior to anterior, and the right hemisphere showed a higher activation level than the left. Further work is needed to describe in more detail the brain potentials elicited during the use of mental arithmetic strategies in other conditions.


2021 ◽  
Vol 11 (1) ◽  
pp. 48
Author(s):  
John Stein

(1) Background: the magnocellular hypothesis proposes that impaired development of the visual timing systems in the brain that are mediated by magnocellular (M-) neurons is a major cause of dyslexia. Their function can now be assessed quite easily by analysing averaged visually evoked event-related potentials (VERPs) in the electroencephalogram (EEG). Such analysis might provide a useful, objective biomarker for diagnosing developmental dyslexia. (2) Methods: in adult dyslexics and normally reading controls, we recorded steady-state VERPs, and their frequency content was computed using the fast Fourier transform. The visual stimulus was a black and white checkerboard whose checks reversed contrast every 100 ms. M-cells respond to this stimulus mainly at 10 Hz, whereas parvocellular (P-) cells do so at 5 Hz. Left and right visual hemifields were stimulated separately in some subjects to see if there were latency differences between the M- inputs to the right vs. left hemispheres, and these were compared with the subjects' handedness. (3) Results: controls demonstrated a larger 10 Hz than 5 Hz fundamental peak in the spectra, whereas the dyslexics showed the reverse pattern. The ratio of subjects' 10/5 Hz amplitudes predicted their reading ability. The latency of the 10 Hz peak was shorter during left than during right hemifield stimulation, and shorter in controls than in dyslexics. The latter difference correlated weakly with handedness. (4) Conclusion: steady-state visual ERPs may conveniently be used to identify developmental dyslexia. However, due to the limited number of subjects in each sub-study, these results need confirmation.
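The 10 Hz vs. 5 Hz spectral comparison at the heart of this design can be sketched with a plain FFT. The trace below is synthetic (sampling rate, duration, and component amplitudes are illustrative assumptions, not the study's recordings); it simulates a control-like response with a 10 Hz component twice the amplitude of the 5 Hz one and recovers that ratio from the spectrum:

```python
import numpy as np

def band_amplitude(signal, fs, freq):
    """Amplitude of the FFT component nearest `freq` (Hz)."""
    spectrum = np.abs(np.fft.rfft(signal)) / len(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    return spectrum[np.argmin(np.abs(freqs - freq))]

fs = 250                      # sampling rate (Hz), illustrative
t = np.arange(0, 10, 1 / fs)  # 10 s of steady-state stimulation
# Checks reversing every 100 ms drive a 10 Hz response (M- pathway);
# a control-like trace has a stronger 10 Hz than 5 Hz component.
eeg = 2.0 * np.sin(2 * np.pi * 10 * t) + 1.0 * np.sin(2 * np.pi * 5 * t)

ratio = band_amplitude(eeg, fs, 10) / band_amplitude(eeg, fs, 5)
print(f"10/5 Hz amplitude ratio: {ratio:.2f}")  # ~2.0 for this synthetic trace
```

For a dyslexic-like pattern the component amplitudes would be reversed, driving the ratio below 1; the study's biomarker is essentially this ratio computed from real steady-state VERPs.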


2002 ◽  
Vol 13 (01) ◽  
pp. 001-013 ◽  
Author(s):  
James Jerger ◽  
Rebecca Estes

We studied auditory evoked responses to the apparent movement of a burst of noise in the horizontal plane. Event-related potentials (ERPs) were measured in three groups of participants: children in the age range from 9 to 12 years, young adults in the age range from 18 to 34 years, and seniors in the age range from 65 to 80 years. The topographic distribution of grand-averaged ERP activity was substantially greater over the right hemisphere in children and seniors but slightly greater over the left hemisphere in young adults. This finding may be related to age-related differences in the extent to which judgments of sound movement are based on displacement versus velocity information.


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Saugat Bhattacharyya ◽  
Davide Valeriani ◽  
Caterina Cinel ◽  
Luca Citi ◽  
Riccardo Poli

In this paper we present, and test in two realistic environments, collaborative Brain-Computer Interfaces (cBCIs) that can significantly increase both the speed and the accuracy of perceptual group decision-making. The key distinguishing features of this work are: (1) our cBCIs combine behavioural, physiological and neural data in such a way as to be able to provide a group decision at any time after the quickest team member casts their vote, but the quality of a cBCI-assisted decision improves monotonically the longer the group decision can wait; (2) we apply our cBCIs to two realistic scenarios of military relevance (patrolling a dark corridor and manning an outpost at night where users need to identify any unidentified characters that appear) in which decisions are based on information conveyed through video feeds; and (3) our cBCIs exploit Event-Related Potentials (ERPs) elicited in brain activity by the appearance of potential threats but, uniquely, the appearance time is estimated automatically by the system (rather than being unrealistically provided to it). As a result of these elements, in the two test environments, groups assisted by our cBCIs make both more accurate and faster decisions than when individual decisions are integrated in more traditional manners.
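The core fusion idea, combining individual decisions weighted by per-member confidence estimates, can be sketched as a confidence-weighted vote. This is a simplified stand-in, not the authors' actual cBCI algorithm; in their system the weights would be derived from behavioural, physiological and ERP features rather than supplied by hand:

```python
def group_decision(votes, confidences):
    """Confidence-weighted majority vote: a simplified stand-in for the
    cBCI fusion of behavioural, physiological and neural evidence.
    votes: per-member booleans ("threat present?"); confidences: weights in [0, 1]."""
    score = sum(c if v else -c for v, c in zip(votes, confidences))
    return score > 0

# Example: three observers judging "threat present?" in a video feed.
votes = [True, False, True]    # individual button presses
confidences = [0.9, 0.4, 0.7]  # e.g. derived from response time / ERP features
print(group_decision(votes, confidences))  # True: weighted evidence favours "threat"
```

A decision can be emitted as soon as the first vote arrives (using only the evidence available so far) and refined as more members respond, which mirrors the anytime property described in point (1).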

