ANALISIS RESPONS BUTIR PADA TES BAKAT SKOLASTIK (Item Response Analysis of a Scholastic Aptitude Test)

2018 ◽  
Vol 17 (1) ◽  
pp. 1
Author(s):  
Farida Agus Setiawati ◽  
Rita Eka Izzaty ◽  
Veny Hidayat

This study aims to analyze the characteristics of the Scholastic Aptitude Test (SAT), which consists of verbal and numerical subtests. We used a descriptive quantitative approach, describing the SAT's characteristics in terms of item difficulty, the item discrimination index, the pseudo-guessing index, the test information function, and the standard error of measurement. The data are responses to the SAT instrument, collected from 1,047 subjects in Yogyakarta using the documentation technique. The data were then analyzed with the Item Response Theory (IRT) approach, with the help of the BILOG program, under all logistic parameter models, preceded by checking item fit to each model. The analysis concludes that the verbal subtest tends to fit the 2PL and 3PL models, whereas the numerical subtest fits only the 2PL model. The majority of SAT items have good indices of item difficulty, item discrimination, and pseudo-guessing, and, based on the test information function, the SAT measures accurately under the 1PL, 2PL, and 3PL IRT models across all ability levels.
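The quantities this abstract lists (difficulty, discrimination, pseudo-guessing, test information, standard error of measurement) all fall out of the 3PL model. As a minimal sketch of how they fit together, the following uses hypothetical item parameters (not the study's BILOG estimates) to compute the test information function and the standard error of measurement at one ability point:

```python
import math

def p_3pl(theta, a, b, c):
    """3PL probability of a correct response at ability theta:
    c + (1 - c) / (1 + exp(-a(theta - b)))."""
    return c + (1.0 - c) / (1.0 + math.exp(-a * (theta - b)))

def item_info_3pl(theta, a, b, c):
    """Fisher information of a 3PL item at ability theta."""
    p = p_3pl(theta, a, b, c)
    q = 1.0 - p
    return (a ** 2) * (q / p) * ((p - c) / (1.0 - c)) ** 2

# hypothetical (a, b, c) triples, for illustration only
items = [(1.2, -0.5, 0.2), (0.8, 0.0, 0.25), (1.5, 0.7, 0.2)]

theta = 0.0
test_info = sum(item_info_3pl(theta, a, b, c) for a, b, c in items)
sem = 1.0 / math.sqrt(test_info)  # standard error of measurement at theta
print(round(test_info, 3), round(sem, 3))
```

Setting c = 0 recovers the 2PL information a²pq, which is why the same code covers the simpler models the study also fitted.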

2011 ◽  
Vol 17 (1) ◽  
pp. 42-61 ◽  
Author(s):  
Natalja Kosareva ◽  
Aleksandras Krylovas

In this paper, the approach to forecasting knowledge-testing results proposed earlier by the authors is extended with four classes of parametric functions, from which the best-fitting one is selected to approximate the item characteristic function. The mathematical model is illustrated with two numerical experiments. The first experiment demonstrates the procedure of selecting the most appropriate item characteristic function and adjusting the model parameters; a goodness-of-fit statistic for detecting misfit of the selected model is calculated. In the second experiment, a 10-item test is constructed for a population whose latent ability is normally distributed. The probability distribution of the total test result and the test information function are calculated when the item characteristic functions are drawn from the four classes of parametric functions. It is then shown how the value of the test information function can be increased by adjusting the parameters of the item characteristic functions to the observed population. The model can be used not only for knowledge testing but also for diagnostic tasks in various fields of human activity. Another advantage of the method is that it reduces the resources required for testing by adjusting the model parameters more precisely and decreasing the standard error of measurement of the estimated examinee ability. In the presented example, the methodology is applied to the problem of microclimate evaluation in office rooms.
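The core step described here, choosing the best-fitting item characteristic function from several parametric classes, can be sketched as a least-squares model selection. The two candidate families below (logistic and normal ogive) and the observed proportions are illustrative stand-ins, not the paper's four classes or its data:

```python
import math

def logistic_icc(theta, a, b):
    """Logistic item characteristic curve."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def normal_ogive_icc(theta, a, b):
    """Normal-ogive item characteristic curve via the error function."""
    return 0.5 * (1.0 + math.erf(a * (theta - b) / math.sqrt(2)))

# synthetic observed proportions correct at a few ability levels
thetas = [-2, -1, 0, 1, 2]
observed = [0.12, 0.27, 0.55, 0.80, 0.93]

def sse(icc, a, b):
    """Sum of squared errors between the candidate curve and the data."""
    return sum((icc(t, a, b) - o) ** 2 for t, o in zip(thetas, observed))

# grid search over discrimination a and difficulty b for each class,
# keeping the (class, a, b) with the smallest misfit
best = None
for icc in (logistic_icc, normal_ogive_icc):
    for a in [x / 10 for x in range(5, 26)]:
        for b in [x / 10 for x in range(-20, 21)]:
            err = sse(icc, a, b)
            if best is None or err < best[0]:
                best = (err, icc.__name__, a, b)

print(best)
```

A formal goodness-of-fit statistic, as in the abstract, would replace the raw SSE, but the selection logic is the same: fit every candidate class and keep the one with the least misfit.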


This study aimed to reveal the effect of the number of alternatives in multiple-choice tests on the item and test information functions according to the three-parameter model under item response theory. To achieve this objective, a multiple-choice achievement test was constructed on the second part of the mathematics subject for 10th-grade students in public schools in the capital city of Amman. The final test consists of 38 items, prepared in three forms that differ only in the number of alternatives per item. The study sample consisted of 1,530 students. The results showed statistically significant differences in reliability in favor of the five-alternative form, followed by the four-alternative form; the results also showed no statistically significant differences between the arithmetic means of the information function attributable to the number of item alternatives.
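The link between the number of alternatives and the 3PL information function can be made concrete if one models chance success as c = 1/k for k alternatives. This is a simplifying assumption for illustration (the study estimated c empirically rather than fixing it), but it shows why more alternatives tend to raise information at low ability:

```python
import math

def info_3pl(theta, a, b, c):
    """Fisher information of a 3PL item at ability theta."""
    p = c + (1 - c) / (1 + math.exp(-a * (theta - b)))
    return a ** 2 * ((1 - p) / p) * ((p - c) / (1 - c)) ** 2

# hypothetical item (a = 1, b = 0) evaluated below its difficulty,
# where guessing matters most; c = 1/k is an assumed guessing level
a, b, theta = 1.0, 0.0, -1.0
for k in (3, 4, 5):
    c = 1.0 / k
    print(k, round(info_3pl(theta, a, b, c), 4))
```

Information rises as k grows (c shrinks), which is consistent with the reliability advantage the study found for the five-alternative form even though the mean information differences were not significant.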


2015 ◽  
Vol 9 (1) ◽  
pp. 161
Author(s):  
Eman Rasmi Abed ◽  
Mohammad Mustafa Al-Absi ◽  
Yousef Abdelqader Abu shindi

<p class="apa">The purpose of the present study is developing a test to measure the numerical ability for students of education. The sample of the study consisted of (504) students from 8 universities in Jordan. The final draft of the test contains 45 items distributed among 5 dimensions.</p><p class="apa">The results revealed that acceptable psychometric properties of the test; items parameters (difficulty, discrimination) were estimated by item response theory IRT, the reliability of the test was assessed by: Cronbach’s Alpha, average of inter-item correlation, and test information function (IRT), and the validity of the test was assessed by: arbitrator's views, factor analysis, RMSR, and Tanaka Index.</p><p class="apa">The numerical ability test can be used to measure the strength and weaknesses in numerical ability for educational faculty students, and the test can be used to classify students on levels of numerical ability.</p>


2001 ◽  
Vol 26 (2) ◽  
pp. 180-198 ◽  
Author(s):  
Ing-Long Wu

This paper discusses a simultaneous approach for generating two-stage and multistage tests from an item bank. Most previous test-generation problems have been formulated as binary programming models, for which the efficiency of the available solution algorithms is the major concern. This study therefore considers two important aspects of the solution process: alternative ways to formulate the mathematical models, and alternative solution algorithms. Based on these two aspects, the contribution of this paper is twofold. First, two binary programming models with a special network structure that can be exploited computationally are presented for modeling these problems: the first model maximizes the test information function at one specified ability point, and the second matches a target test information function at several specified ability points as closely as possible. Second, an efficient special-purpose network algorithm is used to solve the two models. The test construction process is thereby improved in terms of both computational effort and test quality. An empirical study shows results in line with these two criteria.
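The first of the two models, selecting items to maximize the test information function at one ability point, can be illustrated on a toy item bank. Brute-force enumeration over subsets stands in here for the paper's special-purpose network algorithm (feasible only because the bank and test length are tiny), and the 2PL parameters are hypothetical:

```python
import math
from itertools import combinations

def info_2pl(theta, a, b):
    """2PL item information a^2 * p * (1 - p) at ability theta."""
    p = 1.0 / (1.0 + math.exp(-a * (theta - b)))
    return a * a * p * (1.0 - p)

# hypothetical item bank of (a, b) pairs
bank = [(0.6, -1.5), (1.4, -0.2), (0.9, 0.1), (1.8, 0.3),
        (1.1, 1.0), (0.7, 0.8), (1.5, -0.8), (1.2, 0.4)]
theta, test_len = 0.0, 3  # target ability point and fixed test length

# enumerate all fixed-length subsets; keep the one with maximal
# test information at theta (the binary-programming objective)
best = max(combinations(range(len(bank)), test_len),
           key=lambda idx: sum(info_2pl(theta, *bank[i]) for i in idx))
print(sorted(best))
```

The second model in the paper replaces this single-point objective with minimizing the deviation from a target information curve at several ability points; the decision variables (include item i or not) are the same.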

