Reflections of Idiographic Long-Term Memory Characteristics In Resting-State Neuroimaging Data

2020 ◽  
Author(s):  
Peiyun Zhou ◽  
Florian Sense ◽  
Hedderik van Rijn ◽  
Andrea Stocco

Abstract
Translational applications of cognitive science depend on having predictive models at the individual, or idiographic, level. However, idiographic model parameters, such as working memory capacity, often need to be estimated from specific tasks, making them dependent on task-specific assumptions. Here, we explore the possibility that idiographic parameters reflect an individual’s biology and can be identified from task-free neuroimaging measures. To test this hypothesis, we correlated a reliable behavioral trait, the individual rate of forgetting in long-term memory, with a readily available task-free neuroimaging measure, the resting-state EEG spectrum. Using an established, adaptive fact-learning procedure, the rate of forgetting for verbal and visual materials was measured in a sample of 50 undergraduates from whom we also collected eyes-closed resting-state EEG data. Statistical analyses revealed that the individual rates of forgetting were significantly correlated across verbal and visual materials. Importantly, both rates correlated with resting-state power levels in the lower (13-15 Hz) and upper (15-17 Hz) portions of the beta frequency band. These correlations were particularly strong for visuospatial materials, were distributed over multiple fronto-parietal locations, and remained significant even after a correction for multiple comparisons (False Discovery Rate) and robust correlation methods were applied. These results suggest that computational models could be individually tailored for prediction using idiographic parameter values derived from inexpensive, task-free imaging recordings.
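The mass-univariate analysis described in the abstract (correlating each participant's rate of forgetting with band power at many electrode sites, then controlling the False Discovery Rate across sites) can be sketched as follows. This is an illustrative outline only, not the authors' code: the function names and the toy data are assumptions, and the published analysis additionally used robust correlation methods not shown here.

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / sqrt(sxx * syy)

def fdr_bh(pvals, alpha=0.05):
    """Benjamini-Hochberg procedure: return one True/False rejection
    flag per p-value, controlling the False Discovery Rate at alpha."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    # Largest rank k whose sorted p-value is <= alpha * k / m.
    k_max = 0
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= alpha * rank / m:
            k_max = rank
    # Reject all hypotheses with rank <= k_max.
    reject = [False] * m
    for rank, i in enumerate(order, start=1):
        if rank <= k_max:
            reject[i] = True
    return reject

# Toy usage: forgetting rates for 4 participants and beta power at 3
# hypothetical channels; correlate per channel, then FDR-correct the
# resulting p-values (p-values here would come from a test of r).
rates = [0.30, 0.35, 0.42, 0.50]
beta_power = {"Fz": [1.1, 1.3, 1.6, 1.9], "Pz": [2.0, 1.1, 1.8, 1.5],
              "Cz": [0.9, 1.0, 1.2, 1.4]}
rs = {ch: pearson_r(rates, pw) for ch, pw in beta_power.items()}
```

In the real pipeline the p-value list would contain one entry per electrode-by-band test, which is exactly the multiple-comparisons setting the Benjamini-Hochberg step is designed for.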

Author(s):  
Stoo Sepp ◽  
Steven J. Howard ◽  
Sharon Tindall-Ford ◽  
Shirley Agostinho ◽  
Fred Paas

In 1956, Miller first reported on a capacity limitation in the amount of information the human brain can process, which was thought to be seven plus or minus two items. The system of memory used to process information for immediate use was coined “working memory” by Miller, Galanter, and Pribram in 1960. In 1968, Atkinson and Shiffrin proposed their multistore model of memory, which theorized that the memory system was separated into short-term memory, long-term memory, and the sensory register, the latter of which temporarily holds and forwards information from sensory inputs to short-term memory for processing. Baddeley and Hitch built upon the concept of multiple stores, leading to the development of the multicomponent model of working memory in 1974, which described two stores devoted to the processing of visuospatial and auditory information, both coordinated by a central executive system. Cowan’s theorizing focused on attentional factors in the effortful and effortless activation and maintenance of information in working memory; in 1988, Cowan published his model—the scope and control of attention model. In contrast, since the early 2000s Engle has investigated working memory capacity through the lens of his individual differences model, which does not seek to quantify capacity in the same way as Miller or Cowan. Instead, this model describes working memory capacity as the interplay between primary memory (working memory), the control of attention, and secondary memory (long-term memory). This affords the opportunity to focus on individual differences in working memory capacity and extend theorizing beyond storage to the manipulation of complex information. These models and advancements have made significant contributions to our understanding of learning and cognition, informing educational research and practice in particular.
Emerging areas of inquiry include investigating use of gestures to support working memory processing, leveraging working memory measures as a means to target instructional strategies for individual learners, and working memory training. Given that working memory is still debated, and not yet fully understood, researchers continue to investigate its nature, its role in learning and development, and its implications for educational curricula, pedagogy, and practice.


2019 ◽  
Vol 34 (2) ◽  
pp. 268-281 ◽  
Author(s):  
Lea M. Bartsch ◽  
Vanessa M. Loaiza ◽  
Klaus Oberauer

2003 ◽  
Vol 26 (6) ◽  
pp. 742-742
Author(s):  
Janice M. Keenan ◽  
Jukka Hyönä ◽  
Johanna K. Kaakinen

Ruchkin et al.'s view of working memory as activated long-term memory is more compatible with language processing than models such as Baddeley's, but it raises questions about individual differences in working memory and the validity of domain-general capacity estimates. Does it make sense to refer to someone as having low working memory capacity if capacity depends on particular knowledge structures tapped by the task?


2020 ◽  
pp. 116-149 ◽  
Author(s):  
Klaus Oberauer

Working memory provides a medium for building and manipulating new representations that control our thoughts and actions. To fulfil this function, a working memory system needs to meet six requirements: (1) it must have a mechanism for rapidly forming temporary bindings to combine elements into new structures; (2) it needs a focus of attention for selectively accessing individual elements for processing; (3) it must hold both declarative representations of what is the case, and procedural representations of how to act on the current situation; (4) it needs a process for rapid updating, including rapid removal of outdated contents. Moreover, the contents of working memory (5) need to be shielded from interference from long-term memory, while (6) working memory should be able to use information in long-term memory when it is useful. This chapter summarizes evidence in support of these mechanisms and processes. It presents three computational models, each of which implements some of these mechanisms and explains a different subset of empirical findings about working memory: the SOB-CS model accounts for behaviour in tests of immediate serial recall, including complex-span tasks. The interference model explains data from a common test of visual working memory, the continuous-reproduction task. The set-selection model explains how people learn memory sets and task sets, how these sets are retrieved from long-term memory, and how these mechanisms enable switching between memory sets and task sets.


1998 ◽  
Vol 21 (6) ◽  
pp. 845-846 ◽  
Author(s):  
John Sweller

The metric devised by Halford, Wilson & Phillips may have considerable potential in distinguishing between the working memory demands of different tasks but may be less effective in distinguishing working memory capacity between individuals. Despite the strengths of the metric, determining whether an effect is caused by relational complexity or by differential levels of expertise is currently problematic.
