Promoting Long-lasting Learning Through Instructional Design

Author(s):  
Patrik Hultberg ◽  
David Santandreu Calonge ◽  
Askhat Eugene Safiullin Lee

Passively listening to a lecture (deWinstanley & Bjork, 2002), skimming a textbook chapter, or googling for an answer to a homework problem is not conducive to deep and lasting higher-order learning. At the same time, presenting complex concepts in problem-based classes might overload students’ working memory capacity. Effective learning requires students to process information in their working memory and to store information, facts, and skills in their long-term memory. Students must then be able to retrieve this information into working memory in the future in order to apply it to new situations. That is, students must be able to recognize the characteristics of a future situation or problem and correctly retrieve the appropriate information stored in long-term memory (Kirschner, Sweller, & Clark, 2006) to tackle the issue. Using the framework of Cognitive Load Theory, this article proposes an instructional model that promotes five strategies for learning and teaching (spacing, retrieval practice, elaboration, interleaving, and concrete examples) to help students effectively store and retrieve information from long-term memory.

Author(s):  
Stoo Sepp ◽  
Steven J. Howard ◽  
Sharon Tindall-Ford ◽  
Shirley Agostinho ◽  
Fred Paas

In 1956, Miller first reported on a capacity limitation in the amount of information the human brain can process, which was thought to be seven plus or minus two items. The system of memory used to process information for immediate use was coined “working memory” by Miller, Galanter, and Pribram in 1960. In 1968, Atkinson and Shiffrin proposed their multistore model of memory, which theorized that the memory system was separated into short-term memory, long-term memory, and the sensory register, the latter of which temporarily holds and forwards information from sensory inputs to short-term memory for processing. Baddeley and Hitch built upon the concept of multiple stores, leading to the development of the multicomponent model of working memory in 1974, which described two stores devoted to the processing of visuospatial and auditory information, both coordinated by a central executive system. Later, Cowan’s theorizing focused on attentional factors in the effortful and effortless activation and maintenance of information in working memory, and in 1988 he published his scope and control of attention model. In contrast, since the early 2000s Engle has investigated working memory capacity through the lens of his individual differences model, which does not seek to quantify capacity in the same way as Miller or Cowan. Instead, this model describes working memory capacity as the interplay between primary memory (working memory), the control of attention, and secondary memory (long-term memory). This affords the opportunity to focus on individual differences in working memory capacity and extends theorizing beyond storage to the manipulation of complex information. These models and advancements have made significant contributions to our understanding of learning and cognition, informing educational research and practice in particular.
Emerging areas of inquiry include investigating the use of gestures to support working memory processing, leveraging working memory measures as a means to target instructional strategies for individual learners, and working memory training. Given that working memory is still debated, and not yet fully understood, researchers continue to investigate its nature, its role in learning and development, and its implications for educational curricula, pedagogy, and practice.


2019 ◽  
Vol 34 (2) ◽  
pp. 268-281 ◽  
Author(s):  
Lea M. Bartsch ◽  
Vanessa M. Loaiza ◽  
Klaus Oberauer

2003 ◽  
Vol 26 (6) ◽  
pp. 742-742
Author(s):  
Janice M. Keenan ◽  
Jukka Hyönä ◽  
Johanna K. Kaakinen

Ruchkin et al.'s view of working memory as activated long-term memory is more compatible with language processing than models such as Baddeley's, but it raises questions about individual differences in working memory and the validity of domain-general capacity estimates. Does it make sense to describe someone as having low working memory capacity if capacity depends on the particular knowledge structures tapped by the task?


1998 ◽  
Vol 21 (6) ◽  
pp. 845-846 ◽  
Author(s):  
John Sweller

The metric devised by Halford, Wilson & Phillips may have considerable potential in distinguishing between the working memory demands of different tasks but may be less effective in distinguishing working memory capacity between individuals. Despite the strengths of the metric, determining whether an effect is caused by relational complexity or by differential levels of expertise is currently problematic.

