Gender differences in item performance and predictive validity on the DAT Quantitative Reasoning Test

1989 ◽  
Vol 53 (12) ◽  
pp. 708-711 ◽  
Author(s):  
RM Smith ◽  
GA Kramer ◽  
AT Kubiak

2019 ◽  
Vol 35 (3) ◽  
pp. 392-402 ◽  
Author(s):  
Ricarda Steinmayr ◽  
Birgit Spinath

Abstract. Gender differences in the numerical domain vary greatly according to the assessment method used. We suggest that strict time constraints, as employed on most numerical intelligence tests but not on mathematical competency tests, unduly increase the gender gap in measured numerical intelligence if the test focuses on reasoning. Two studies were conducted. First, 666 11th and 12th graders were randomly assigned to speeded or nonspeeded versions of verbal, figural, and numerical reasoning tests. Extending the test time reduced gender differences in numerical but not in verbal or figural reasoning. To rule out ceiling effects and to test for potential motivational and emotional effects on test performance, a second sample of 542 students completed both a speeded and a nonspeeded numerical reasoning test as well as several motivational and emotional questionnaires. In the nonspeeded condition, girls improved their performance more than boys did. This effect was especially strong for female students at medium and high performance levels and was largely, but not fully, explained by emotional and motivational factors. We conclude that girls are prevented from showing their actual potential on speeded numerical reasoning tests.
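The abstract reports gender gaps that shrink when time limits are lifted; such gaps are conventionally expressed as standardized mean differences. As a minimal sketch (the function is standard Cohen's d; the score summaries below are illustrative placeholders, not values from the study), the comparison could be computed like this:

```python
import math

def cohens_d(mean_a, mean_b, sd_a, sd_b, n_a, n_b):
    """Standardized mean difference using the pooled standard deviation."""
    pooled_var = ((n_a - 1) * sd_a**2 + (n_b - 1) * sd_b**2) / (n_a + n_b - 2)
    return (mean_a - mean_b) / math.sqrt(pooled_var)

# Hypothetical score summaries (boys vs. girls) on a numerical reasoning
# test, under a strict time limit and with extended time.
d_speeded = cohens_d(22.0, 20.0, 5.0, 5.0, 330, 336)     # gap under time pressure
d_nonspeeded = cohens_d(25.0, 24.4, 5.0, 5.0, 330, 336)  # smaller gap with more time
```

A shrinking d from the speeded to the nonspeeded condition is the pattern the study describes: the measured gap depends on the administration conditions, not only on ability.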


2020 ◽  
Vol 74 (4_Supplement_1) ◽  
pp. 7411500048p1
Author(s):  
Catherine Cavaliere ◽  
Pamela Story ◽  
Joanna Such ◽  
Aileen Burke ◽  
Kathryn Kendrick

AERA Open ◽  
2017 ◽  
Vol 3 (1) ◽  
pp. 233285841769299
Author(s):  
Benjamin W. Domingue ◽  
David Lang ◽  
Martha Cuevas ◽  
Melisa Castellanos ◽  
Carolina Lopera ◽  
...  

Technical schools are an integral part of the education system, and yet little is known about student learning at such institutions. We consider whether assessments of student learning can be jointly administered to both university and technical school students. We examine whether differential test functioning may bias inferences regarding the relative performance of students in quantitative reasoning and critical reading. We apply item response theory models that allow response behavior to differ as a function of school context. Items show small yet consistent differential functioning in favor of university students, especially on the quantitative reasoning test. These differences affect inferences about effect size differences between university and technical students (effect sizes can fall by 44% in quantitative reasoning and 24% in critical reading). Differential test functioning shifts the rank ordering of institutions by roughly 5 percentile points on average.
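The abstract does not specify the IRT model used; as a minimal sketch under stated assumptions, the kind of group-dependent response behavior it describes can be illustrated with a two-parameter logistic (2PL) item, where uniform differential item functioning (DIF) is a difficulty shift applied to one group. All parameter values below are illustrative, not estimates from the study:

```python
import math

def p_correct(theta, a, b, dif=0.0, group=0):
    """2PL probability of a correct response.

    theta: latent ability; a: discrimination; b: difficulty.
    `dif` is a uniform-DIF difficulty shift applied only to group 1,
    so dif > 0 makes the item harder for that group at equal ability.
    """
    return 1.0 / (1.0 + math.exp(-a * (theta - (b + dif * group))))

# Same ability, same item: a 0.3-logit DIF against technical-school
# students (group 1) lowers their success probability.
p_univ = p_correct(theta=0.0, a=1.2, b=-0.2, dif=0.3, group=0)
p_tech = p_correct(theta=0.0, a=1.2, b=-0.2, dif=0.3, group=1)
```

Aggregated over items, many such small shifts in the same direction produce exactly the effect the abstract reports: group mean differences (and institutional rankings) that partly reflect test functioning rather than ability.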

