A shared representation of order between encoding and recognition in visual short-term memory

2016
Author(s): Kristjan Kalm, Dennis Norris

Abstract. Most complex tasks require people to bind individual stimuli into a sequence in short-term memory (STM). For this purpose, information about the order of the individual stimuli in the sequence needs to be held in an active and accessible form in STM over a period of a few seconds. Here we investigated how temporal order information is shared between the presentation and response phases of an STM task. We trained a classification algorithm on the fMRI activity patterns from the presentation phase of the STM task to predict the order of the items during the subsequent recognition phase. While voxels in a number of brain regions represented positional information during either the presentation or the recognition phase, only voxels in the lateral prefrontal cortex (PFC) and the anterior temporal lobe (ATL) represented position consistently across task phases. A shared positional code in the ATL might reflect verbal recoding of visual sequences to facilitate the maintenance of order information over several seconds.
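The core analysis described above is a cross-phase decoding scheme: a classifier trained on presentation-phase voxel patterns is tested on recognition-phase patterns, so that above-chance accuracy indicates a positional code shared across task phases. The sketch below illustrates that logic only; the classifier choice, data shapes, and variable names (X_encode, X_recog, y_position) are placeholder assumptions, not details taken from the study.

```python
# Minimal sketch of cross-phase decoding of serial position, assuming voxel
# patterns have already been extracted per trial. All values are placeholders.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_voxels, n_positions = 120, 500, 4

# Placeholder data: one pattern per trial, labelled by serial position.
X_encode = rng.standard_normal((n_trials, n_voxels))   # presentation phase
X_recog = rng.standard_normal((n_trials, n_voxels))    # recognition phase
y_position = rng.integers(0, n_positions, n_trials)

# Sanity check: within-phase decoding of position (cross-validated).
clf = LinearSVC(max_iter=10000)
within = cross_val_score(clf, X_encode, y_position, cv=5).mean()

# Cross-phase generalisation: train on encoding patterns, test on
# recognition patterns. Above-chance accuracy here would indicate a
# positional code shared across task phases.
clf.fit(X_encode, y_position)
cross_phase = clf.score(X_recog, y_position)

print(f"within-phase accuracy: {within:.2f}")
print(f"cross-phase accuracy:  {cross_phase:.2f} (chance = {1/n_positions:.2f})")
```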

2016
Author(s): Michele Veldsman, Daniel J. Mitchell, Rhodri Cusack

Abstract. Recent evidence suggests that visual short-term memory (VSTM) capacity estimated using simple objects, such as colours and oriented bars, may not generalise well to more naturalistic stimuli. More visual detail can be stored in VSTM when complex, recognisable objects are maintained than when simple objects are maintained. It is not yet known whether it is recognisability that enhances memory precision, nor whether maintenance of recognisable objects is achieved with the same network of brain regions that supports maintenance of simple objects.

We used a novel stimulus generation method to parametrically warp photographic images along a continuum, allowing separate estimation of the precision of memory representations and the number of items retained. The stimulus generation method was also designed to create unrecognisable, though perceptually matched, stimuli in order to investigate the impact of recognisability on VSTM. We adapted the widely used change detection and continuous report paradigms for use with complex, photographic images.

Across three functional magnetic resonance imaging (fMRI) experiments, we demonstrated greater precision for recognisable objects in VSTM compared to unrecognisable objects. This clear behavioural advantage was not the result of recruitment of additional brain regions, or of stronger mean activity within the core network. Representational similarity analysis revealed greater variability across item repetitions in the representations of recognisable, compared to unrecognisable, complex objects. We therefore propose that a richer range of neural representations supports VSTM for complex recognisable objects.
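The representational similarity result described above rests on comparing each item's voxel pattern across its repetitions: the lower the pattern correlation between repetitions, the more variable the representation. The following sketch shows that comparison in minimal form; the array shapes, the two-repetition assumption, and all variable names are illustrative, not taken from the study.

```python
# Hedged sketch of repetition variability: correlate the voxel pattern
# evoked by each item across its repetitions; lower item-wise correlations
# imply more variable (less repeatable) representations.
import numpy as np

rng = np.random.default_rng(1)
n_items, n_reps, n_voxels = 30, 2, 400

def repetition_similarity(patterns):
    """patterns: (n_items, n_reps, n_voxels) -> mean Pearson r across repetitions."""
    rs = [np.corrcoef(patterns[i, 0], patterns[i, 1])[0, 1]
          for i in range(patterns.shape[0])]
    return float(np.mean(rs))

# Placeholder data standing in for recognisable vs. unrecognisable objects.
recognisable = rng.standard_normal((n_items, n_reps, n_voxels))
unrecognisable = rng.standard_normal((n_items, n_reps, n_voxels))

print("recognisable  :", repetition_similarity(recognisable))
print("unrecognisable:", repetition_similarity(unrecognisable))
# In the reported result, recognisable objects showed lower repetition
# similarity (greater variability) than unrecognisable ones.
```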


2003
Vol. 26 (6), pp. 752-753
Author(s): William A. Phillips

Although visual long-term memory (VLTM) and visual short-term memory (VSTM) can be distinguished from each other (and from visual sensory storage [SS]), they are embodied within the same modality-specific brain regions, but in very different ways: VLTM as patterns of connectivity and VSTM as patterns of activity. Perception and VSTM do not “activate” VLTM. They use VLTM to create novel patterns of activity relevant to novel circumstances.
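One toy way to picture the distinction Phillips draws is a Hopfield-style network, in which long-term knowledge lives in a weight (connectivity) matrix while the short-term trace is a pattern of activity shaped by those weights. This analogy and the code below are illustrative assumptions, not part of the commentary itself.

```python
# Toy illustration: connectivity as "VLTM", activity as "VSTM".
import numpy as np

rng = np.random.default_rng(2)
n = 64

# "VLTM": connectivity learned (Hebbian) from a few stored patterns.
stored = rng.choice([-1.0, 1.0], size=(3, n))
W = stored.T @ stored / n
np.fill_diagonal(W, 0.0)

# "VSTM": a noisy input held as a pattern of activity and reshaped by the weights.
cue = np.sign(stored[0] + 0.8 * rng.standard_normal(n))
activity = cue.copy()
for _ in range(5):                  # a few recurrent update steps
    activity = np.sign(W @ activity)

print("overlap with stored pattern:", float(activity @ stored[0]) / n)
# The weights (connectivity) persist across trials; the activity is the
# transient, task-specific state, which is the sense in which perception
# and VSTM "use" VLTM rather than merely activating it.
```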


Author(s):  
Kevin Dent

In two experiments, participants retained a single color or a set of four spatial locations in memory. During a 5 s retention interval, participants viewed either flickering dynamic visual noise or a static matrix pattern. In Experiment 1, memory was assessed using a recognition procedure in which participants indicated whether a particular test stimulus matched the memorized stimulus. In Experiment 2, participants either attempted to reproduce the locations or selected the color from a full range of possibilities. Both experiments revealed effects of dynamic visual noise (DVN) on memory for colors but not for locations. The implications of the results for theories of working memory and the methodological prospects for DVN as an experimental tool are discussed.
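Dynamic visual noise of the kind used as the interference stimulus is typically a grid of cells whose black/white states change randomly from frame to frame. The sketch below generates such a frame sequence; the grid size, frame rate, and duration are placeholder values rather than the study's actual parameters.

```python
# Illustrative generator for a dynamic-visual-noise (DVN) stream: a grid of
# cells whose binary state is redrawn on every frame. Parameter values are
# placeholders, not taken from the study.
import numpy as np

def dvn_frames(grid=32, frame_rate_hz=10, duration_s=5.0, seed=0):
    """Return a (n_frames, grid, grid) array of 0/1 noise frames."""
    rng = np.random.default_rng(seed)
    n_frames = int(frame_rate_hz * duration_s)
    return rng.integers(0, 2, size=(n_frames, grid, grid))

frames = dvn_frames()
print(frames.shape)   # (50, 32, 32): 5 s of noise at 10 Hz with these placeholder values
```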


Author(s):  
Yuhong Jiang

Abstract. When two dot arrays are briefly presented, separated by a short interval of time, visual short-term memory of the first array is disrupted if the interval between arrays is shorter than 1300-1500 ms (Brockmole, Wang, & Irwin, 2002). Here we investigated whether such a time window is triggered by the necessity to integrate the arrays. Using a probe task, we removed the need for integration but retained the requirement to represent the images. We found that a long time window was needed for performance to reach asymptote even when integration across images was not required. Furthermore, this window was lengthened if subjects had to remember the locations of the second array, but not if they only conducted a visual search within it. We suggest that a temporal window is required for consolidation of the first array, which is vulnerable to disruption by subsequent images that also need to be memorized.
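One way to quantify the time window described above is to fit an exponential approach-to-asymptote to accuracy as a function of the inter-array interval and read off when performance comes within a criterion of its asymptote. The sketch below shows that fitting step only; the data points, starting values, and 95% criterion are invented for illustration and are not the study's results.

```python
# Hedged sketch: estimate a consolidation "time window" by fitting an
# exponential rise-to-asymptote to accuracy vs. inter-array interval.
import numpy as np
from scipy.optimize import curve_fit

def rise_to_asymptote(t, asymptote, baseline, tau):
    """Accuracy climbs from baseline toward asymptote with time constant tau (ms)."""
    return asymptote - (asymptote - baseline) * np.exp(-t / tau)

# Hypothetical accuracy at several inter-array intervals (ms).
isi_ms = np.array([100, 300, 500, 800, 1100, 1500, 2000], dtype=float)
accuracy = np.array([0.58, 0.65, 0.72, 0.78, 0.82, 0.84, 0.85])

params, _ = curve_fit(rise_to_asymptote, isi_ms, accuracy, p0=[0.85, 0.55, 400.0])
asymptote, baseline, tau = params

# Interval at which performance is within 5% of the full rise to asymptote,
# one possible operationalisation of the consolidation window.
t_window = -tau * np.log(0.05)
print(f"tau = {tau:.0f} ms; window (95% of asymptote) = {t_window:.0f} ms")
```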


2013
Author(s): Deepna T. Devkar, Wei Ji Ma, Jeffrey S. Katz, Anthony A. Wright
