Neuronal activity encoding temporal-order memory of visual objects in the macaque medial temporal lobe

2010 ◽  
Vol 68 ◽  
pp. e68-e69 ◽
Author(s):  
Yuji Naya ◽  
Wendy A. Suzuki

2017 ◽
Vol 114 (51) ◽  
pp. 13555-13560 ◽  
Author(s):  
Yuji Naya ◽  
He Chen ◽  
Cen Yang ◽  
Wendy A. Suzuki

Neuropsychological and neurophysiological studies have emphasized the role of the prefrontal cortex (PFC) in maintaining information about the temporal order of events or items for upcoming actions. However, the medial temporal lobe (MTL) has also been considered critical for binding individual events or items to their temporal context in episodic memory. Here we characterize the contributions of these brain areas by comparing single-unit activity in the dorsal and ventral regions of macaque lateral PFC (d-PFC and v-PFC) with activity in MTL areas including the hippocampus (HPC), entorhinal cortex, and perirhinal cortex (PRC), as well as in area TE, during the encoding phase of a temporal-order memory task. v-PFC cells signaled specific items at particular time periods of the task. By contrast, MTL cortical cells signaled specific items across multiple time periods and discriminated the items between time periods by modulating their firing rates. Analysis of the temporal dynamics of these signals showed that the conjunctive signal of item and temporal-order information in PRC developed earlier than that seen in v-PFC. During the delay interval between the two cue stimuli, v-PFC showed prominent stimulus-selective delay activity, whereas the MTL areas did not. Both regions of PFC and HPC exhibited an incremental timing signal that appeared to represent the continuous passage of time during the encoding phase; however, this signal was more prominent in HPC than in PFC. These results suggest that PFC and MTL contribute in distinct ways to encoding the integration of item and timing information.
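
As an illustration of the kind of single-unit analysis this abstract describes, the sketch below tests one hypothetical cell for conjunctive item-and-time-period coding using a two-way ANOVA on simulated spike counts. The variable names and data are invented for the example; this is a minimal, generic sketch of the concept, not the authors' actual analysis pipeline.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

rng = np.random.default_rng(0)

# Hypothetical spike counts for one cell: trials cross item identity
# (which visual object was shown) with task period (1st cue vs. 2nd cue).
n_per_condition = 40
df = pd.DataFrame({
    "spikes": rng.poisson(lam=8, size=n_per_condition * 4),
    "item":   np.repeat(["A", "B"], n_per_condition * 2),
    "period": np.tile(np.repeat(["cue1", "cue2"], n_per_condition), 2),
})

# Two-way ANOVA: a significant item x period interaction would indicate that
# the cell's item selectivity changes across task periods, i.e., a conjunctive
# item-and-temporal-order signal of the kind described in the abstract.
model = ols("spikes ~ C(item) * C(period)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```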


Brain ◽  
1990 ◽  
Vol 113 (4) ◽  
pp. 1093-1112 ◽  
Author(s):  
GARY HEIT ◽  
MICHAEL E. SMITH ◽  
ERIC HALGREN

2015 ◽  
Vol 114 (2) ◽  
pp. 1227-1238 ◽  
Author(s):  
Janice Chen ◽  
Paul A. Cook ◽  
Anthony D. Wagner

Emerging human, animal, and computational evidence suggests that, within the hippocampus, stored memories are compared with current sensory input to compute novelty, i.e., to detect when inputs deviate from expectations. Hippocampal subfield CA1 is thought to detect mismatches between past and present, and detected novelty is thought to modulate encoding processes, providing a mechanism for gating the entry of information into memory. Using high-resolution functional MRI, we examined human hippocampal subfield and medial temporal lobe cortical activation during prediction violations within a sequence of events unfolding over time. Subjects encountered sequences of four visual stimuli that were then reencountered in the same temporal order (Repeat) or a rearranged order (Violation). Prediction strength was manipulated by varying whether the sequence was initially presented once (Weak) or thrice (Strong) prior to the critical Repeat or Violation sequence. Analyses of blood oxygen level-dependent signals revealed that task-responsive voxels in anatomically defined CA1, CA23/dentate gyrus, and perirhinal cortex were more active when expectations were violated than when they were confirmed. Additionally, stronger prediction violations elicited greater activity than weaker violations in CA1, and CA1 contained the greatest proportion of voxels displaying this prediction-violation pattern relative to other medial temporal lobe regions. Finally, a memory test with a separate group of subjects showed that subsequent recognition memory was superior for items that had appeared in prediction violation trials than for items in prediction confirmation trials. These findings indicate that CA1 responds to temporal-order prediction violations and that this response is modulated by prediction strength.
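
The sketch below illustrates the general shape of such a region-of-interest contrast: comparing Violation versus Repeat responses across an ROI's voxels and reporting the proportion of voxels showing the violation > repeat pattern. The beta values are simulated and the analysis choices (paired test across voxels) are assumptions for the example, not the authors' actual fMRI pipeline.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical trial-averaged BOLD estimates (e.g., beta values) for one
# subject's CA1 ROI: one value per voxel for Repeat and for Violation trials.
n_voxels = 200
repeat_betas = rng.normal(loc=0.2, scale=0.5, size=n_voxels)
violation_betas = repeat_betas + rng.normal(loc=0.1, scale=0.3, size=n_voxels)

# Paired comparison across voxels: is the ROI more active for Violation than
# Repeat, mirroring the mismatch/novelty contrast described in the abstract?
t, p = stats.ttest_rel(violation_betas, repeat_betas)

# Proportion of voxels showing the violation > repeat pattern, analogous to
# the "proportion of voxels displaying this prediction-violation pattern".
prop = np.mean(violation_betas > repeat_betas)
print(f"paired t = {t:.2f}, p = {p:.3g}, proportion violation > repeat = {prop:.2f}")
```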


2021 ◽  
Author(s):  
Vincent van de Ven ◽  
Guyon Kleuters ◽  
Joey Stuiver

We memorize our daily life experiences, which are often multisensory in nature, by segmenting them into distinct event models in accordance with perceived contextual or situational changes. However, very little is known about how multisensory integration affects segmentation, as most studies have focused on unisensory (visual or auditory) segmentation. In three experiments, we investigated the effect of multisensory integration on segmentation in memory and perception. In Experiment 1, participants encoded lists of visual objects while audio and visual contexts changed synchronously or asynchronously. After each list, we tested recognition and temporal associative memory for pictures that were encoded in the same audio-visual context or that crossed a synchronous or an asynchronous multisensory change. We found no effect of multisensory integration on recognition memory: synchronous and asynchronous changes similarly impaired recognition for pictures encoded at those changes, compared to pictures encoded further away from them. Multisensory integration did affect temporal associative memory, which was worse for pictures encoded at synchronous than at asynchronous changes. Follow-up experiments showed that this effect was not due to the higher complexity of multisensory over unisensory contexts (Experiment 2), nor to the temporal unpredictability of the contextual changes inherent to Experiment 1 (Experiment 3). We argue that participants formed situational expectations through multisensory integration, such that synchronous multisensory changes deviated more strongly from those expectations than asynchronous changes did. We discuss our findings in light of supporting and conflicting findings on uni- and multisensory segmentation.
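
To make the key behavioral comparison concrete, the sketch below scores hypothetical temporal associative memory trials by whether the tested picture pair spanned no context change, an asynchronous change, or a synchronous audio-visual change. The data and condition labels are invented; this is only a minimal sketch of the scoring logic, not the authors' analysis.

```python
import pandas as pd

# Hypothetical trial-level results for the temporal associative memory test:
# each row is a tested picture pair, labelled by the kind of context change
# it spanned at encoding.
trials = pd.DataFrame({
    "boundary": ["none", "none", "async", "async", "sync", "sync"] * 20,
    "correct":  [1, 1, 1, 0, 0, 1] * 20,
})

# Mean temporal-order accuracy per condition; the abstract's key result is
# lower accuracy for pairs spanning synchronous than asynchronous changes.
print(trials.groupby("boundary")["correct"].mean())
```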


2020 ◽  
Author(s):  
Nora A. Herweg ◽  
Lukas Kunz ◽  
Armin Brandt ◽  
Paul A. Wanda ◽  
Ashwini D. Sharan ◽  
...  

We examined the information coded by the human medial temporal lobe (MTL) during navigation of a complex virtual environment. Using multivariate decoding, we show that local field potentials in the MTL represent places, their semantic meaning, and the temporal order in which they are visited. These representations strengthen with experience, suggesting that the MTL constructs a learned map of physical and semantic spaces that can be used to predict future trajectories.
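
As a generic illustration of multivariate decoding of the kind described above, the sketch below runs a cross-validated classifier on hypothetical LFP-derived features to decode which place was visited in each navigation epoch. The feature construction, classifier, and all names are assumptions for the example rather than the authors' pipeline.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)

# Hypothetical feature matrix: one row per navigation epoch, columns are
# spectral power estimates from MTL electrodes (e.g., band x channel).
n_epochs, n_features, n_places = 120, 40, 6
X = rng.normal(size=(n_epochs, n_features))
y = rng.integers(0, n_places, size=n_epochs)   # place visited in each epoch
X[np.arange(n_epochs), y] += 1.0               # inject weak place information

# Cross-validated multivariate decoding of place identity from the features.
clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
scores = cross_val_score(clf, X, y, cv=5)
print(f"decoding accuracy: {scores.mean():.2f} (chance = {1 / n_places:.2f})")
```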


2020 ◽  
Author(s):  
Susan L. Benear ◽  
Elizabeth A. Horwath ◽  
Emily Cowan ◽  
M. Catalina Camacho ◽  
Chi Ngo ◽  
...  

The medial temporal lobe (MTL) undergoes critical developmental change throughout childhood, which aligns with developmental changes in episodic memory. We used representational similarity analysis to compare neural pattern similarity for children and adults in hippocampus and parahippocampal cortex during naturalistic viewing of clips from the same movie or different movies. Some movies were more familiar to participants than others. Neural pattern similarity was generally lower for clips from the same movie, indicating that related content taxes pattern separation-like processes. However, children showed this effect only for movies with which they were familiar, whereas adults showed the effect consistently. These data suggest that children need more exposures to stimuli in order to show mature pattern separation processes.
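
The sketch below illustrates the core representational similarity comparison the abstract describes: correlating multivoxel patterns across movie clips and contrasting same-movie with different-movie clip pairs. The patterns and movie labels are simulated, and the ROI handling is simplified; this is a minimal sketch of the idea, not the authors' analysis.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical ROI patterns (e.g., hippocampus): one multivoxel pattern per
# movie clip, with a label indicating which movie each clip came from.
n_clips, n_voxels = 24, 300
movie_labels = np.repeat(np.arange(6), 4)          # 6 movies x 4 clips each
patterns = rng.normal(size=(n_clips, n_voxels))

# Pairwise pattern similarity (Pearson correlation between clip patterns).
sim = np.corrcoef(patterns)

# Compare mean similarity for same-movie vs. different-movie clip pairs,
# using only the upper triangle to exclude self- and duplicate pairs.
iu, ju = np.triu_indices(n_clips, k=1)
same = movie_labels[iu] == movie_labels[ju]
print("same-movie similarity:     ", sim[iu, ju][same].mean())
print("different-movie similarity:", sim[iu, ju][~same].mean())
```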

