Retrofitting Diagnostic Classification Models to Responses From IRT-Based Assessment Forms

2017 · Vol 78 (3) · pp. 357-383
Author(s): Ren Liu, Anne Corinne Huggins-Manley, Okan Bulut

Developing a diagnostic tool within the diagnostic measurement framework is the optimal approach to obtaining multidimensional, classification-based feedback on examinees. However, end users may seek diagnostic feedback from existing item responses to assessments designed under the classical test theory or item response theory frameworks. Retrofitting diagnostic classification models to assessments designed under other psychometric frameworks can be a plausible way to obtain more actionable scores or to learn more about the constructs themselves. This study (a) discusses the possibility and problems of retrofitting, (b) proposes a step-by-step retrofitting framework, and (c) explores the information one can gain from retrofitting through an empirical application example. While retrofitting may not always be an ideal approach to diagnostic measurement, this article aims to invite discussion by presenting the possibility, challenges, process, and product of retrofitting.
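As a rough illustration of the core retrofitting step, the sketch below fits a DINA model to existing dichotomous item responses given a hypothesised Q-matrix, using a marginal maximum-likelihood EM algorithm. The model choice, the Q-matrix, and all function names are assumptions for illustration only, not the authors' framework.

```python
import numpy as np
from itertools import product

def fit_dina(X, Q, n_iter=200, tol=1e-6):
    """Fit a DINA model to existing 0/1 item responses X (persons x items)
    given a hypothesised items x attributes Q-matrix Q (0/1), via EM."""
    X = np.asarray(X, dtype=float)
    Q = np.asarray(Q, dtype=int)
    n_persons, n_items = X.shape
    n_attr = Q.shape[1]
    # Enumerate all 2^K attribute profiles (latent classes).
    profiles = np.array(list(product([0, 1], repeat=n_attr)))
    # Ideal response: 1 only if the class masters every attribute the item requires.
    eta = np.all(profiles[:, None, :] >= Q[None, :, :], axis=2).astype(float)

    slip = np.full(n_items, 0.2)
    guess = np.full(n_items, 0.2)
    class_prob = np.full(len(profiles), 1.0 / len(profiles))
    prev_ll = -np.inf

    for _ in range(n_iter):
        # E-step: class-conditional likelihood of each response pattern.
        p_correct = eta * (1 - slip) + (1 - eta) * guess              # classes x items
        lik = np.prod(np.where(X[:, None, :] == 1, p_correct, 1 - p_correct), axis=2)
        joint = lik * class_prob                                      # persons x classes
        marg = joint.sum(axis=1, keepdims=True)
        post = joint / marg                                           # posterior class membership
        ll = np.log(marg).sum()

        # M-step: update class proportions and slip/guess parameters.
        class_prob = post.mean(axis=0)
        for j in range(n_items):
            w1 = post @ eta[:, j]            # expected membership in mastery classes for item j
            w0 = 1 - w1
            slip[j] = (w1 * (1 - X[:, j])).sum() / max(w1.sum(), 1e-10)
            guess[j] = (w0 * X[:, j]).sum() / max(w0.sum(), 1e-10)

        if abs(ll - prev_ll) < tol:
            break
        prev_ll = ll

    return {"profiles": profiles, "class_prob": class_prob,
            "slip": slip, "guess": guess, "posterior": post}
```

In practice the Q-matrix would come from a content review of the existing IRT-based form, and model-data fit would need to be checked before any diagnostic feedback is reported.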

2020
Author(s): Kazuhiro Yamaguchi, Jonathan Templin

Quantifying the reliability of latent variable estimates in diagnostic classification models has been a difficult topic, complicated by the classification-based nature of these models. In this study, we derive observed score reliability indices based on diagnostic classification models as an extension of classical test theory-based reliability. Additionally, we derive conditional observed sum- and sub-score distributions. In this manner, various conditional expectations and conditional standard error of measurement estimates can be calculated for both total- and sub-scores of a test. The proposed methods provide a variety of expectations and standard errors for attribute estimates, which we demonstrate in an analysis of an empirical test.
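A minimal sketch of one building block the abstract describes: under local independence given the attribute profile, the conditional distribution of the observed sum score within a latent class is the convolution of item-level Bernoulli distributions, from which a conditional expectation and a conditional standard error of measurement follow. The class-conditional probabilities and function names below are made up for illustration; this is not the authors' code.

```python
import numpy as np

def conditional_sum_score(p_correct_given_class):
    """Conditional distribution of the observed sum score within one latent
    class. p_correct_given_class holds the per-item correct-response
    probabilities implied by a fitted DCM for that class; local independence
    given the class makes the sum score a convolution of Bernoulli variables.
    Returns P(sum score = 0), ..., P(sum score = J)."""
    dist = np.array([1.0])
    for p in p_correct_given_class:
        new = np.zeros(len(dist) + 1)
        new[:-1] += dist * (1 - p)   # item answered incorrectly
        new[1:] += dist * p          # item answered correctly
        dist = new
    return dist

def conditional_moments(dist):
    """Conditional expected sum score and conditional standard error of
    measurement (standard deviation of the observed score within the class)."""
    scores = np.arange(len(dist))
    mean = float((scores * dist).sum())
    var = float(((scores - mean) ** 2 * dist).sum())
    return mean, var ** 0.5

# Hypothetical class-conditional correct probabilities for a 5-item test.
dist = conditional_sum_score([0.9, 0.85, 0.7, 0.8, 0.95])
mu, csem = conditional_moments(dist)
```

The same convolution applied to the items of a single subdomain gives a conditional sub-score distribution in the same way.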


2017 · Vol 78 (6) · pp. 1072-1088
Author(s): Ren Liu, Hong Qian, Xiao Luo, Ada Woo

Subscore reporting under item response theory models has always been a challenge, partly because the test length within each subdomain is too short to locate individuals precisely on multiple continua. Diagnostic classification models (DCMs), which provide a pass/fail decision and an associated probability of passing each subdomain, are promising alternatives for subscore reporting. However, it may not be appropriate to report those binary decisions or probabilities to examinees when (1) an overall score with a pass/fail decision from a standard-setting procedure is already provided or (2) absolute decisions or probabilities are not the purpose of reporting subscores. To support such score-reporting scenarios, this article introduces the relative diagnostic profile (RDP), a framework that uses DCMs to score item responses but withholds absolute decisions and probabilities. In the RDP framework, a person's overall ability is represented as a pie in which stronger subdomains are larger slices. Beyond this within-individual information, the framework can also be used to compare a person's relative strengths with those of a chosen group of examinees.
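The pie analogy could be read, very loosely, as normalising a person's attribute-level results so that only their relative sizes are reported. The sketch below is one plausible reading under that assumption; it is not the RDP scoring rule from the article, and all names and inputs are hypothetical.

```python
import numpy as np

def relative_profile(mastery_probs):
    """Turn one examinee's attribute mastery probabilities from a DCM into
    relative 'slices' that sum to 1, so subdomains are reported only
    relative to one another and no absolute probability is exposed."""
    p = np.asarray(mastery_probs, dtype=float)
    return p / p.sum()

def relative_to_group(mastery_probs, group_mastery_probs):
    """Compare one examinee's relative slices with the average slices of a
    chosen reference group (rows = examinees, columns = attributes).
    Positive entries mark subdomains that are relatively stronger for the
    examinee than for the group."""
    person = relative_profile(mastery_probs)
    group = np.asarray(group_mastery_probs, dtype=float)
    return person - relative_profile(group.mean(axis=0))

# Hypothetical mastery probabilities for four subdomains.
person_slices = relative_profile([0.9, 0.6, 0.8, 0.3])
```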


2019 · Vol 45 (1) · pp. 5-31
Author(s): Matthew S. Johnson, Sandip Sinharay

One common score reported from diagnostic classification assessments is the vector of posterior means of the skill mastery indicators. As with any assessment, it is important to derive and report estimates of the reliability of the reported scores. After reviewing a reliability measure suggested by Templin and Bradshaw, this article proposes three new measures of the reliability of the posterior means of skill mastery indicators, along with methods for estimating those measures when the number of items on the assessment and the number of skills being assessed make exact calculation computationally burdensome. The utility of the new measures is demonstrated with simulated and real data examples. Two of the proposed measures are recommended for future use.
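For context, the score whose reliability is being assessed here, the vector of posterior means of the skill mastery indicators, can be computed directly from a fitted DCM's posterior class-membership probabilities. The sketch below shows only that computation, with hypothetical variable names; the reliability measures themselves are not reproduced here.

```python
import numpy as np

def posterior_attribute_means(post, profiles):
    """Posterior mean of each skill mastery indicator for each examinee.

    post: persons x classes matrix of posterior class-membership
    probabilities from a fitted diagnostic classification model.
    profiles: classes x attributes 0/1 matrix of attribute profiles.
    Returns a persons x attributes matrix of posterior mastery probabilities
    (the posterior means of the skill mastery indicators)."""
    post = np.asarray(post, dtype=float)
    profiles = np.asarray(profiles, dtype=float)
    return post @ profiles
```

These are the quantities obtainable from the "posterior" and "profiles" outputs of the DINA sketch earlier in this listing.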

