Effect of Text Coherence on Item Difficulty for the Most Difficult Questions in the 2019 College Scholastic Aptitude Test

2020, Vol. 25 (4), pp. 703-734
Author(s): Hoyeol Ryu

1968, Vol. 23 (1), pp. 119-134
Author(s): Leon H. Belcher, Joel T. Campbell

Two 50-word association lists were each administered to 50 Negro college students. Forty-one words were taken from the Kent-Rosanoff list, 29 from the Palermo-Jenkins list, and 30 were words used in analogy items of the Scholastic Aptitude Test. Comparisons with previous normative studies showed generally similar results. The present study did, however, yield slightly smaller proportions of matching form-class primary responses to noun, pronoun, and adverb stimulus words and of opposite responses to “opposite-evoking stimuli.” A number of the responses indicated reading difficulty or misunderstanding of the word.


1984, Vol. 54 (4), pp. 389-413
Author(s): Brian Powell, Lala Carr Steelman

Public attention has been drawn to recent reports of state-by-state variation in standardized test scores, in particular the Scholastic Aptitude Test (SAT). In this paper, Brian Powell and Lala Carr Steelman attempt to show how the dissemination of uncorrected state SAT scores may have created an inaccurate public and governmental perception of the variation in educational quality. Their research demonstrates that comparing state SAT averages is ill-advised unless these ratings are corrected for compositional and demographic factors for which states may not be directly responsible.

