An Automated Test Assembly Design for a Large-Scale Chinese Proficiency Test

2016 ◽ Vol 40 (3) ◽ pp. 233-237
Author(s): Shiyu Wang, Yi Zheng, Chanjin Zheng, Ya-Hui Su, Peize Li
Psych ◽ 2021 ◽ Vol 3 (2) ◽ pp. 96-112
Author(s): Benjamin Becker, Dries Debeer, Karoline A. Sachse, Sebastian Weirich

Combining items from an item pool into test forms (test assembly) is a frequent task in psychological and educational testing. Although efficient methods for automated test assembly exist, they are often unknown or unavailable to practitioners. In this paper we present the R package eatATA, which makes several mixed-integer programming solvers available for automated test assembly in R. We describe the general functionality and the common workflow of eatATA using a minimal example. We also provide four more elaborate use cases of automated test assembly: (a) the assembly of multiple test forms for a pilot study; (b) the assembly of blocks of items for a multiple-matrix booklet design in the context of a large-scale assessment; (c) the assembly of two linear test forms for individual diagnostic purposes; (d) the assembly of multi-stage testing modules for individual diagnostic purposes. All use cases are accompanied by example item pools and commented R code.
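Conceptually, such ATA models attach a binary decision variable to each item and ask a solver for an admissible selection. The following sketch illustrates that idea with a hypothetical six-item pool, using plain brute force in Python rather than a MIP solver; the pool, constraints, and function names are illustrative assumptions, not eatATA's API:

```python
from itertools import combinations

# Hypothetical mini item pool: (item_id, difficulty, is_open_ended)
pool = [
    ("i1", 0.2, False), ("i2", 0.5, True), ("i3", 0.8, False),
    ("i4", 0.4, False), ("i5", 0.6, True), ("i6", 0.3, False),
]

def assemble(pool, length=3, max_open=1, target=1.2):
    """Pick `length` items whose difficulty sum is closest to `target`,
    with at most `max_open` open-ended items (brute force, no solver)."""
    best, best_dev = None, float("inf")
    for form in combinations(pool, length):
        if sum(is_open for _, _, is_open in form) > max_open:
            continue  # content constraint violated
        dev = abs(sum(diff for _, diff, _ in form) - target)
        if dev < best_dev:
            best, best_dev = form, dev
    return best

print([item_id for item_id, _, _ in assemble(pool)])
```

A real MIP solver replaces the exhaustive loop with branch-and-bound over the same binary variables, which is what makes pools of thousands of items tractable.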


Psych ◽ 2020 ◽ Vol 2 (4) ◽ pp. 315-337
Author(s): Giada Spaccapanico Proietti, Mariagiulia Matteucci, Stefania Mignani

In testing situations, automated test assembly (ATA) is used to assemble single or multiple test forms that share the same psychometric characteristics, given a set of specific constraints, by means of specific solvers. However, in complex situations, which are typical of large-scale assessments, ATA models may be infeasible due to the large number of decision variables and constraints involved in the problem. The purpose of this paper is to formalize a standard procedure and two different strategies—namely, additive and subtractive—for overcoming practical ATA concerns with large-scale assessments and to show their effectiveness in two case studies. The MAXIMIN and MINIMAX ATA methods are used to assemble multiple test forms based on item response theory models for binary data. The main results show that the additive strategy is able to identify the specific constraints that make the model infeasible, while the subtractive strategy is a faster but less accurate process, which may not always be optimal. Overall, the procedures are able to produce parallel test forms with similar measurement precision and contents, and they minimize the number of items shared among the test forms. Further research could be done to investigate the properties of the proposed approaches under more complex testing conditions, such as multi-stage testing, and to blend the proposed approaches in order to obtain the solution that satisfies the largest set of constraints.
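For reference, the MAXIMIN and MINIMAX objectives mentioned above are commonly written as follows (notation follows standard ATA practice, e.g. van der Linden, 2005): binary variables \(x_{if}\) indicate whether item \(i\) enters form \(f\), \(I_i(\theta_k)\) is the information of item \(i\) at ability point \(\theta_k\), and \(T(\theta_k)\) is a target information value.

```latex
% MAXIMIN: push the minimum information across forms and ability points as high as possible
\max\; y \quad \text{s.t.} \quad \sum_i I_i(\theta_k)\, x_{if} \ge y
\quad \forall f,k, \qquad x_{if} \in \{0,1\}

% MINIMAX: keep the largest deviation from the target information as small as possible
\min\; y \quad \text{s.t.} \quad
\Bigl|\sum_i I_i(\theta_k)\, x_{if} - T(\theta_k)\Bigr| \le y
\quad \forall f,k, \qquad x_{if} \in \{0,1\}
```

Content, length, and overlap constraints enter as additional linear restrictions on the \(x_{if}\), which is where the infeasibility issues discussed above typically arise.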


2013 ◽ Vol 221 (3) ◽ pp. 190-200
Author(s): Jörg-Tobias Kuhn, Thomas Kiefer

Several techniques have been developed in recent years to generate optimal large-scale assessments (LSAs) of student achievement. These techniques often represent a blend of procedures from such diverse fields as experimental design, combinatorial optimization, particle physics, or neural networks. However, despite the theoretical advances in the field, there still exists a surprising scarcity of well-documented test designs in which all factors that have guided design decisions are explicitly and clearly communicated. This paper therefore has two goals. First, a brief summary of relevant key terms, as well as experimental designs and automated test assembly routines in LSA, is given. Second, conceptual and methodological steps in designing the assessment of the Austrian educational standards in mathematics are described in detail. The test design was generated using a two-step procedure, starting at the item block level and continuing at the item level. Initially, a partially balanced incomplete item block design was generated using simulated annealing, whereas in a second step, items were assigned to the item blocks using mixed-integer linear optimization in combination with a shadow-test approach.
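The simulated annealing step used for the block design follows the generic recipe below: always accept improving moves, accept worsening moves with probability \(\exp(-\Delta/t)\), and cool the temperature geometrically. This is a toy sketch in pure Python (balancing two item blocks by difficulty, a stand-in problem chosen for brevity), not the authors' implementation:

```python
import math
import random

def anneal(init, neighbor, cost, t0=1.0, cooling=0.995, steps=4000, seed=1):
    """Generic simulated annealing: accept improvements always, accept
    worse candidates with probability exp(-delta / t), cool geometrically."""
    rng = random.Random(seed)
    cur = best = init
    t = t0
    for _ in range(steps):
        cand = neighbor(cur, rng)
        delta = cost(cand) - cost(cur)
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            cur = cand
            if cost(cur) < cost(best):
                best = cur
        t = max(t * cooling, 1e-9)
    return best

# Toy design task: split eight item difficulties into two balanced blocks.
difficulties = [0.9, 0.3, 0.7, 0.5, 0.8, 0.2, 0.6, 0.4]

def cost(assign):
    a = sum(d for d, in_a in zip(difficulties, assign) if in_a)
    b = sum(d for d, in_a in zip(difficulties, assign) if not in_a)
    return abs(a - b)

def neighbor(assign, rng):
    i = rng.randrange(len(assign))  # move one item to the other block
    return assign[:i] + (not assign[i],) + assign[i + 1:]

best = anneal(tuple(False for _ in difficulties), neighbor, cost)
print(round(cost(best), 6))
```

For the real booklet design, the state would be an assignment of item blocks to booklet positions and the cost a measure of imbalance in block pairings and positions; the annealing loop itself is unchanged.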


2011 ◽ Vol 35 (8) ◽ pp. 643-644
Author(s): Ryoungsun Park, Jiseon Kim, Barbara G. Dodd, Hyewon Chung

JPLEX, short for Java simPLEX, is an automated test assembly (ATA) program. It is a mixed-integer linear programming (MILP) solver written in Java: it reads a configuration file, solves the minimization problem, and produces an output file for postprocessing. It implements the simplex algorithm to create a fully relaxed solution and the branch-and-bound (BB) method to find an integer solution (Bazaraa, Jarvis, & Sherali, 1990). The input configuration file format for linear programming (LP) modeling (van der Linden, 2005) is tailored for ATA and easy to build without the help of other modeling languages. JPLEX was originally designed for mid-sized test assembly applications, but there is no restriction on the number of decision variables. The solver may be useful for research students who need to assemble a test from an item pool and for those who want to learn or teach LP concepts.
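The relaxation-plus-branch-and-bound scheme described above can be illustrated on a small 0/1 selection problem. In this sketch the simplex-based LP relaxation is replaced by a greedy fractional bound, which is the LP optimum for a single-constraint (knapsack-style) problem; this is a pedagogical Python example, not JPLEX's code:

```python
def branch_and_bound(values, weights, capacity):
    """Maximize total value of selected items subject to a weight capacity,
    pruning branches whose relaxed (fractional) bound cannot beat the incumbent."""
    # Sort items by value density so the fractional bound fills greedily.
    order = sorted(range(len(values)),
                   key=lambda i: values[i] / weights[i], reverse=True)
    best = 0

    def bound(idx, val, cap):
        # LP-style relaxation: allow a fractional piece of the next item.
        for j in order[idx:]:
            if weights[j] <= cap:
                cap -= weights[j]
                val += values[j]
            else:
                return val + values[j] * cap / weights[j]
        return val

    def rec(idx, val, cap):
        nonlocal best
        if idx == len(order):
            best = max(best, val)
            return
        if bound(idx, val, cap) <= best:
            return  # prune: relaxation cannot beat the incumbent
        j = order[idx]
        if weights[j] <= cap:
            rec(idx + 1, val + values[j], cap - weights[j])  # include item j
        rec(idx + 1, val, cap)                               # exclude item j
    rec(0, 0, capacity)
    return best

print(branch_and_bound([60, 100, 120], [10, 20, 30], 50))  # optimal value: 220
```

ATA problems add many constraints (content coverage, word counts, enemy items), so the relaxation is solved by simplex rather than greedily, but the branching-and-pruning skeleton is the same.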

