The Data-augmentation Techniques in Item Response Modeling: Current Approaches and New Developments

2014 ◽ Vol 22 (6) ◽ pp. 1036
Author(s): Wei TIAN ◽ Tao XIN ◽ Chunhua KANG

2004
Author(s): Kate E. Walton ◽ Brent W. Roberts ◽ Avshalom Caspi ◽ Terrie E. Moffitt

2021 ◽ Vol 6
Author(s): John Fitzgerald Ehrich ◽ Steven J. Howard ◽ Sahar Bokosmaty ◽ Stuart Woodcock

The accurate measurement of the cognitive load a learner encounters in a given task is critical to the understanding and application of Cognitive Load Theory (CLT). However, as a covert psychological construct, cognitive load presents a challenging measurement problem. To date, this challenge has been met mostly by subjective self-reports of the cognitive load experienced in a learning situation. In this paper, we find that a valid and reliable index of cognitive load can be obtained through item response modeling of student performance. Specifically, estimates of relative difficulty derived from item response modeling (i.e., the difference between item difficulty and person ability locations) can function as a linear measure that combines the key components of cognitive load (i.e., mental load, mental effort, and performance). This cognitive load (relative difficulty) index was tested for criterion (concurrent) validity against Year 2 learners' (N = 91) performance on standardized educational numeracy and literacy assessments. Learners' working memory (WM) capacity significantly predicted the proposed cognitive load (relative difficulty) index across both the numeracy and literacy domains. That is, higher levels of WM were related to lower levels of cognitive load (relative difficulty), in line with fundamental predictions of CLT. These results illustrate the validity, utility, and potential of this objective item response modeling approach to capturing individual differences in cognitive load across discrete learning tasks.
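
As a pointer to how such a relative-difficulty index is typically formed, the lines below are a minimal sketch assuming the standard dichotomous Rasch parameterization; the symbols θ_i (person ability location), b_j (item difficulty location), and the index name D_ij are illustrative choices, not notation taken from the paper.

% Rasch model: probability that person i answers item j correctly,
% with person ability \theta_i and item difficulty b_j on a common logit scale.
\begin{equation}
  P(X_{ij} = 1 \mid \theta_i, b_j) = \frac{\exp(\theta_i - b_j)}{1 + \exp(\theta_i - b_j)}
\end{equation}

% Relative difficulty: the gap between item difficulty and person ability.
% Larger D_{ij} means the item is harder relative to the learner's ability,
% i.e., a higher cognitive load under the index described in the abstract above.
\begin{equation}
  D_{ij} = b_j - \theta_i
\end{equation}

Read this way, the reported negative relationship between WM capacity and cognitive load would correspond to learners with greater WM capacity having higher ability locations θ_i and therefore lower relative difficulty D_ij on the same items.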

