Language Testing
Recently Published Documents


TOTAL DOCUMENTS: 824 (FIVE YEARS: 146)

H-INDEX: 34 (FIVE YEARS: 2)

2022 ◽  
pp. 19-35
Author(s):  
Marcelo L. Berthier

2022 ◽  
Vol 12 (1) ◽  
pp. 203-211
Author(s):  
Ibtessam Abdulaziz Binnahedh



2022 ◽  
Vol 12 (1) ◽  
pp. 55-64
Author(s):  
Shifa Alotibi ◽  
Abdullah Alshakhi

This study explores the factors that influence EFL instructors’ rating decisions when using holistic and analytic rubrics. Few studies have examined the factors that shape EFL instructors’ rating practices, particularly in the Saudi EFL context. This study addresses that gap and contributes more broadly to an understanding of the interplay between EFL instructors and the use of holistic and analytic rubrics. The data were collected in a preparatory year program (PYP) at a Saudi university. The study draws on semi-structured interviews with eleven EFL instructors of different nationalities. Guided by critical language testing as a theoretical framework and using qualitative analysis, the study reveals that critical language testing can minimize the negative consequences of graders’ writing assessment; however, students’ low English proficiency, time constraints, and heavy workloads can negatively affect rating practices. Finally, several pedagogical implications, insights, and recommendations for future research are put forward in the conclusion.


2022 ◽  
Vol 39 (1) ◽  
pp. 3-6
Author(s):  
Luke Harding ◽  
Paula Winke

2021 ◽  
pp. 403-416
Author(s):  
Phuong Nguyen ◽  
Volker Hegelheimer

In second language (L2) spoken assessment, one challenge has been to mimic real-life situations. New technologies may help improve test authenticity by placing language learners in authentic settings. This chapter provides an overview of the use and usefulness of new technologies in L2 spoken assessment and outlines the opportunities and challenges these technologies present. Specifically, it discusses commonly used technologies for test design, delivery, and the scoring of examinees’ responses. It concludes by revisiting Douglas’s warning that “language testing … driven by technology, rather than technology being employed in the services of language testing, is likely to lead us down a road best not traveled” in light of recent technological advances.


2021 ◽  
Author(s):  
Lộc Thị Huỳnh Nguyễn

The importance of teachers’ assessment literacy has been increasingly emphasised in the literature. However, very little research has paid attention to pre-service EFL teachers’ assessment literacy and how they develop it during teacher training programmes. Moreover, there is a paucity of research on Vietnamese pre-service EFL teachers’ assessment literacy. This study was conducted in three phases to address these gaps: (1) Phase 1 provided a description of current assessment training at four Vietnamese teacher training universities, (2) Phase 2 mapped out pre-service EFL teachers’ confidence levels in assessment literacy, and (3) Phase 3 focused on the development of four pre-service EFL teachers’ assessment literacy during their nine-week practicum at Bach Dang University (pseudonym).

Phase 1 relied on individual semi-structured interviews with four Vietnamese teacher-trainers to describe the current status of assessment training for pre-service EFL teachers at four key teacher training universities in terms of: (1) teacher-trainers’ background, (2) course content, (3) method of instruction, (4) support for assessment training, and (5) constraints of assessment training. The teacher-trainers noted their lack of professional development in testing and assessment. The method of instruction varied across the teacher training universities. The results showed a greater emphasis on training in summative rather than formative assessment. Teacher-trainers also identified two main constraints in the current training programmes: (i) the lack of systematic innovation in language testing and assessment, and (ii) the lack of staffing, facilities and time for language testing and assessment training.

In Phase 2, a questionnaire on pre-service EFL teachers’ confidence levels in assessment literacy was developed and validated. It was then administered to 365 pre-service EFL teachers. The results indicated high confidence levels in assessment literacy. Gender and career choice did not influence confidence levels in assessment literacy, while teaching experience and training in language testing and assessment did. However, those who had had more training reported lower confidence levels in assessment literacy.

Phase 3 was conducted in two parts and focused on the assessment literacy development of four pre-service EFL teachers. Part 1 had two stages. Stage 1 administered the same questionnaire as in Phase 2 to thirty-one pre-service EFL teachers to investigate their confidence levels in assessment literacy at three time points: before their language testing and assessment course, before their practicum, and after their practicum. The findings showed a statistically significant increase in their assessment literacy confidence levels. In Stage 2, eighteen of the pre-service EFL teachers from Stage 1 participated in two semi-structured focus group interviews to check whether their confidence levels reflected their assessment literacy. The results indicated a need for data triangulation when making claims about assessment literacy on the basis of confidence levels.

Part 2 employed different research instruments, including interviews, observation, stimulated recalls, and questionnaires, to examine the assessment literacy development of four pre-service EFL teachers over a nine-week practicum. The data indicated three main themes in pre-service EFL teachers’ development in assessment literacy: (1) development in (i) giving feedback, (ii) designing test items, (iii) administering tests, (iv) observing students’ learning, (v) giving instructions, and (vi) improving their content knowledge; (2) individual differences in their assessment literacy development; and (3) incident-based learning of assessment literacy.

Overall, this study offered insights into the dynamic, situated and developmental nature of pre-service EFL teachers’ assessment literacy, with useful implications for theory, research methodology and assessment training for pre-service EFL teachers. The findings are also of practical value for different levels of administration, and for my role as a teacher-trainer in Vietnam.


