Reviews of English Language Proficiency Tests. J. Charles Alderson, Karl J. Krahnke, and Charles W. Stansfield (Eds.). Washington, DC: Teachers of English to Speakers of Other Languages, 1987. Pp. iv + 88. $16.50.

1989
Vol 11 (3)
pp. 358-358
Author(s):  
Dan Douglas
2017
Vol 35 (2)
pp. 297-317
Author(s):
Tanya Longabach
Vicki Peyton

K–12 English language proficiency tests that assess multiple content domains (e.g., listening, speaking, reading, writing) often have subsections based on these domains; scores assigned to these subsections are commonly known as subscores. In today's accountability-oriented educational environment, testing programs face increasing demands to report subscores in addition to total test scores. Although subscores can provide much-needed information for teachers, administrators, and students about proficiency in the tested domains, a major drawback of subscore reporting is their lower reliability compared to the test as a whole. In addition, treating language domains as if they were unrelated, and reporting subscores without accounting for the relationships between domains, may run counter to theories of language acquisition. This study explored several methods of assigning subscores to the four domains of a state English language proficiency test: classical test theory (CTT)-based number correct, unidimensional item response theory (UIRT), augmented item response theory (A-IRT), and multidimensional item response theory (MIRT). It compared the reliability and precision of these methods across language domains and grade bands. The first two methods estimate proficiency in each domain separately, without considering the relationships between domains; the last two take those relationships into account. The reliability and precision of the CTT and UIRT methods were similar and lower than those of A-IRT and MIRT for most domains and grade bands; MIRT was found to be the most reliable method. Policy implications and limitations of this study, as well as directions for further research, are discussed.
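The following is a minimal sketch, not the authors' code, of the CTT number-correct baseline mentioned in the abstract: per-domain number-correct subscores and their Cronbach's alpha reliability. The item-response matrices and domain labels are illustrative assumptions.

```python
# Hedged sketch: CTT number-correct subscores and Cronbach's alpha per domain.
# All data here are fabricated placeholders, not from the study.
import numpy as np

def cronbach_alpha(responses: np.ndarray) -> float:
    """Cronbach's alpha for a (test-takers x items) matrix of 0/1 item scores."""
    k = responses.shape[1]
    item_variance_sum = responses.var(axis=0, ddof=1).sum()
    total_variance = responses.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variance_sum / total_variance)

rng = np.random.default_rng(0)
# Hypothetical scored responses: 500 examinees, 20 items per domain.
# (Independent random items give alpha near zero; real item data would not.)
domains = {d: (rng.random((500, 20)) < 0.6).astype(int)
           for d in ("listening", "speaking", "reading", "writing")}

for name, items in domains.items():
    subscore = items.sum(axis=1)      # CTT number-correct subscore
    alpha = cronbach_alpha(items)     # subscale reliability
    print(f"{name:9s} mean subscore = {subscore.mean():.1f}, alpha = {alpha:.2f}")
```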


2019
Vol 9 (3)
pp. 319
Author(s):  
Aisha M. Alhussain

The IELTS English proficiency tests are widely regarded as highly effective in determining students' level of proficiency in the language. However, this study points out that the processes involved in administering the tests, along with the associated cost, may limit their practicality for assessing learners. A Sentence Pattern (SP) test is offered as an alternative, and 97 participants took part in the assessment to test its effectiveness. Each of the non-native study participants took both the SP test and the IELTS test so that the correlation between the results of the two tests could be established. The findings demonstrate that the students' performance on the IELTS test correlated with their corresponding SP test results: students who performed well on the IELTS test also posted high scores on the SP test. As demonstrated in the study, this correlation supports the effectiveness of the SP test as an alternative to the IELTS test for establishing English language proficiency.
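As a minimal sketch of the kind of correlation analysis described above (not the study's actual code), Pearson's r between SP test scores and IELTS band scores could be computed as follows; the score arrays are made-up placeholders for the 97 participants.

```python
# Hedged sketch: correlation between hypothetical IELTS bands and SP test scores.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
ielts_band = np.round(rng.uniform(4.0, 8.5, size=97) * 2) / 2   # hypothetical half-band scores
sp_score = 10 * ielts_band + rng.normal(0, 5, size=97)          # hypothetical SP test scores

r, p_value = stats.pearsonr(ielts_band, sp_score)
print(f"Pearson r = {r:.2f} (p = {p_value:.3g})")
# A high, significant r would support using the SP test as a proxy for IELTS.
```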


Author(s):
Talip Karanfil
Steve Neufeld

High-stakes and high-volume English language proficiency tests typically rely on multiple-choice questions (MCQs) to assess reading and listening skills. Due to the Covid-19 pandemic, more institutions are delivering MCQs via online assessment platforms, which facilitate shuffling the order of options within test items to minimize cheating. There is scant research on the role that the order and sequence of options play in MCQs, so this study examined the results of a paper-based, high-stakes English proficiency test administered in two versions. Each version had identical three-option MCQs but with different orderings of the options. The test-takers were chosen to ensure a very similar profile of language ability and level across the groups who took the two versions. The findings indicate that one in four questions exhibited significantly different levels of difficulty and discrimination between the two versions. The study identifies order dominance and sequence priming as two factors that influence the outcomes of MCQs, both of which can accentuate or diminish the power of attraction of the correct and incorrect options. These factors should be carefully considered when designing MCQs for high-stakes language proficiency tests and when shuffling options in either paper-based or computer-based testing.
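A minimal sketch of the classical item analysis implied above, assuming fabricated response data rather than the study's: per-item difficulty (proportion correct) and discrimination (point-biserial correlation with total score) are computed for two orderings of the same MCQs, and items whose difficulty shifts noticeably are flagged.

```python
# Hedged sketch: comparing item difficulty and discrimination across two MCQ orderings.
import numpy as np

def item_stats(scored: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
    """scored: (test-takers x items) 0/1 matrix. Returns (difficulty, discrimination)."""
    difficulty = scored.mean(axis=0)
    total = scored.sum(axis=1)
    # Point-biserial discrimination: correlation of each item with the total score.
    discrimination = np.array([np.corrcoef(scored[:, i], total)[0, 1]
                               for i in range(scored.shape[1])])
    return difficulty, discrimination

rng = np.random.default_rng(2)
version_a = (rng.random((300, 40)) < 0.65).astype(int)   # hypothetical responses, version A
version_b = (rng.random((300, 40)) < 0.65).astype(int)   # hypothetical responses, version B

diff_a, disc_a = item_stats(version_a)
diff_b, disc_b = item_stats(version_b)
flagged = np.where(np.abs(diff_a - diff_b) > 0.10)[0]    # illustrative threshold, not the study's
print("Items whose difficulty shifts by more than 0.10 between versions:", flagged)
```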


2016
Vol 9 (5)
pp. 147
Author(s):
Smitha Dev
Sura Qiqieh

The present study aims to examine the relationship between English language proficiency, self-esteem, and academic achievement of students at Abu Dhabi University (ADU). The variables were analyzed using t-tests, chi-square, and Pearson's product-moment correlation. A self-rating scale, a self-esteem inventory, and language proficiency tests were used to measure the variables. The data were collected from 200 male and female students at Abu Dhabi University. The study did not find any positive relationship among the variables; it also revealed that language fluency (IELTS) has no direct impact on ADU students' self-esteem scores or academic achievement (GPA).
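A minimal sketch, with fabricated placeholder data, of the kind of correlational analysis reported above: a Pearson correlation matrix among IELTS scores, self-esteem inventory scores, and GPA for 200 students.

```python
# Hedged sketch: correlation matrix among hypothetical IELTS, self-esteem, and GPA values.
import numpy as np
import pandas as pd

rng = np.random.default_rng(3)
df = pd.DataFrame({
    "ielts": np.round(rng.uniform(4.5, 8.0, 200) * 2) / 2,   # hypothetical IELTS bands
    "self_esteem": rng.integers(10, 51, 200),                # hypothetical inventory scores
    "gpa": np.round(rng.uniform(2.0, 4.0, 200), 2),          # hypothetical GPA
})
print(df.corr(method="pearson").round(2))
# Near-zero off-diagonal values would mirror the study's finding of no direct
# relationship between language fluency, self-esteem, and academic achievement.
```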


1990
Vol 74 (2)
pp. 224
Author(s):
Mary Emily Call
J. Charles Alderson
Karl J. Krahnke
Charles Stansfield

2019
Vol 9 (2)
Author(s):
Anik Nunuk Wulyani
Irina Elgort
Averil Coxhead

This paper reports on a study of the reading and writing proficiency and vocabulary knowledge of Indonesian EFL teachers, the relationship between proficiency and years of service, and the teachers' own perceptions of their proficiency in English. Three proficiency tests (the Vocabulary Levels Test/VLT and reading and writing tests), a questionnaire, and interviews were used to collect data. The results point to mixed levels of English language proficiency, negative correlations between years of service and vocabulary, reading, and writing test results, and difficulty among the teachers in judging their own English language proficiency. Factors that inhibit the capacity of teachers to focus on their English proficiency are presented. Limitations of the study, as well as implications for EFL teachers' professional development (PD) and future research, are also discussed.


2019
Vol 3 (2)
pp. 153
Author(s):  
Nurti Rahayu

The growing importance of English competence in various industries is one of the pull factors behind the increasing need for English language proficiency (ELP) among university graduates. One way to motivate students to develop English competence is to set English language proficiency as a graduation benchmark. This study aims to describe students' perspectives on such a policy and how it affects their learning. A descriptive qualitative design was employed, with a questionnaire used to gather the primary data. The research participants were students of Trisakti School of Tourism who had experienced the policy. The questionnaire comprised two parts: demographic data and questions on students' perceptions of various aspects of the use of English language proficiency tests; students' perceptions of various campus programs to improve English competence were also elicited. The results reveal that students are familiar with international standardized tests such as TOEIC, TOEFL, and IELTS and make various preparations for them. As many as 60% of the respondents agree with the campus policy of setting TOEIC and TOEFL scores as graduation requirements. As for students' perceptions of campus programs, awarding scholarships to top-achieving students and a live-in program in an English-speaking environment were considered the most effective, while class grouping based on test scores was considered the least effective. The results can be a valuable source for policy makers in the institution to develop more effective programs to improve students' English competence.

