Computer Assisted Language Testing and the Washback Effect on Language Learning

Author(s): Zhang Hongjun, Pan Feng
2004, Vol 37 (1), pp. 66-69

04–83 Akiyama, Tomoyasu (U. Melbourne, Australia). Assessing speaking: issues in school-based assessment and the introduction of speaking tests into the Japanese senior high school entrance examination. JALT Journal (Tokyo, Japan), 25, 2 (2003), 117–141.
04–84 Chiang, Steve (Yuan Ze University, Taiwan). The importance of cohesive conditions to perceptions of writing quality at the early stages of foreign language learning. System (Oxford, UK), 31 (2003), 471–484.
04–85 Escamilla, Kathy, Mahon, Elizabeth, Riley-Bernal, Heather and Rutledge, David (U. of Colorado, Boulder, USA). High-stakes testing, Latinos, and English language learners: lessons from Colorado. Bilingual Research Journal (Arizona, USA), 27, 1 (2003), 25–49.
04–86 Gorsuch, Greta (Texas Tech U., USA; Email: [email protected]). Test takers' experiences with computer-administered listening comprehension tests: interviewing for qualitative explorations of test validity. CALICO Journal (Texas, USA), 21, 2 (2004), 339–371.
04–87 Hardcastle, Peter. How to not test language (Part 2). Language Testing Update (Lancaster, UK), 33 (2003), 28–35.
04–88 Hemard, D. and Cushion, S. (London Metropolitan University, UK; Email: [email protected]). Design and evaluation of an online test: assessment conceived as a complementary CALL tool. Computer Assisted Language Learning (Lisse, The Netherlands), 16, 2–3 (2003), 119–139.
04–89 Ishii, David N. and Baba, Kyoko (U. of Toronto, Canada; Email: [email protected]). Locally developed oral skills evaluation in ESL/EFL classrooms: a checklist for developing meaningful assessment procedures. TESL Canada Journal/Revue TESL du Canada (Burnaby, Canada), 21, 1 (2003), 79–96.
04–90 Iwashita, Noriko and Grove, Elizabeth (University of Melbourne, Australia). A comparison of analytic and holistic scales in the context of a specific-purpose speaking test. Prospect (Sydney, Australia), 18, 3 (2003), 25–35.
04–91 Lee, Yong-Won (Educational Testing Service, Princeton, NJ, US; Email: [email protected]). Examining passage-related local item dependence (LID) and measurement construct using Q3 statistics in an EFL reading comprehension test. Language Testing (London, UK), 21, 1 (2004), 74–100.
04–92 Qian, David D. (Hong Kong Polytechnic U., Hong Kong; Email: [email protected]) and Schedl, Mary (Educational Testing Service, Princeton, NJ, US). Evaluation of an in-depth vocabulary knowledge measure for assessing reading performance. Language Testing (London, UK), 21, 1 (2004), 28–52.
04–93 Rea-Dickins, Pauline (University of Bristol, UK). Classroom assessment of English as an additional language: Key Stage 1 contexts – summary of research findings. Language Testing Update (Lancaster, UK), 33 (2003), 48–53.
04–94 Rodgers, Catherine, Meara, Paul and Jacobs, Gabriel (U. of Wales Swansea, UK). Factors affecting the standardisation of translation examinations. Language Learning Journal (London, UK), 28 (Winter 2003), 49–54.


ReCALL, 2002, Vol 14 (1), pp. 167-181
Author(s): Marie J. Myers

With innovative ways now available to assess language performance through computer technology, practitioners have to rethink their preferred language-testing strategies. They need to take into account both new developments in language learning and teaching research and the latest features computers offer for language assessment. In addition to the best practices developed over the years in the field, provision must be made for authentic assessment of intercultural communication abilities. After a review of the recent language-testing literature and a discussion of the problems identified in it, this paper explores the latest developments in computer technology and proposes areas of language testing in the light of these findings. A practical application follows: an adaptation, in an Ontario school board, of the latest evaluation model. The model represents unit planning as an isosceles triangle, with assessed assignments stacked in horizontal bands from the base to the vertex (the top). The paper suggests that this approach can be enriched by turning the triangle into a pyramid with a different model on each face. Access to the four faces by rotating the pyramid allows a broader range of activities, culminating in one final assessment task at the summit.
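As a rough illustration of the pyramid idea, the sketch below models four unit plans (one per face) as stacks of assessment bands that converge on a single summit task, with rotation bringing a different face into view. The class names, field names, and example activities are hypothetical, invented for illustration, and are not taken from the article.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class AssessmentBand:
    """One horizontal band of assessed assignments on a triangular face."""
    level: int                      # 0 = base of the triangle; higher = closer to the vertex
    assignments: List[str]

@dataclass
class Face:
    """One side of the pyramid: a unit plan organised as stacked bands."""
    model_name: str                 # the evaluation model or skill area shown on this face
    bands: List[AssessmentBand] = field(default_factory=list)

@dataclass
class AssessmentPyramid:
    """Four faces that rotate into view, all converging on one final summit task."""
    faces: List[Face]
    summit_task: str

    def rotate(self, steps: int = 1) -> Face:
        """Bring the next face into view by simple circular rotation."""
        self.faces = self.faces[steps:] + self.faces[:steps]
        return self.faces[0]

# Hypothetical usage: four faces, each with two bands, sharing one summit task.
pyramid = AssessmentPyramid(
    faces=[
        Face("oral interaction", [AssessmentBand(0, ["role play"]), AssessmentBand(1, ["interview"])]),
        Face("listening", [AssessmentBand(0, ["dictation"]), AssessmentBand(1, ["news summary"])]),
        Face("reading", [AssessmentBand(0, ["skimming task"]), AssessmentBand(1, ["critical review"])]),
        Face("writing", [AssessmentBand(0, ["guided paragraph"]), AssessmentBand(1, ["essay"])]),
    ],
    summit_task="integrated intercultural project",
)
current = pyramid.rotate()
print(current.model_name, "->", pyramid.summit_task)
```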


2002, pp. 3-6
Author(s): Magdolna Feketéné Silye, Troy B. Wiwczaroski

Over the past few decades, a fairly large literature examining the effectiveness of computer-assisted language learning (CALL) has accumulated. The findings indicate that language learners generally have positive attitudes toward using computers in the classroom. Less is known, however, about the more specific area of computer use in language testing. The purpose of this article is to examine recent developments in language testing that directly involve computer use. After a brief overview of computer-based testing (CBT) in general, web-based testing (WBT) is defined and several related issues are reviewed.


2018, Vol 3 (2)
Author(s): Ramia Dirar Shehadeh Musmar

Integrating scaffolding-learning technologies has been recognized for its potential to create intellectual and engaging classroom interactions. In the United Arab Emirates, having language teachers employ computers as an innovative pedagogical instrument for teaching second languages gave rise to Computer-Assisted Language Learning (CALL) as a means of facilitating and scaffolding language learning, in the hope that it would lead to improved English language attainment and better assessment results. This study investigates the perspectives of students and teachers on the advantageous and disadvantageous impacts of CALL on learning and teaching English as a second language in one public school in the emirate of Abu Dhabi. The results show that CALL has a facilitating role in the L2 classroom and that using CALL activities is advantageous in reducing English-learning anxiety, boosting motivation, catering for student diversity, and promoting self-directed and scaffolded language learning. The results additionally indicate that several factors, such as time constraints, teachers' limited computer skills, insufficient computer facilities, and inflexible school curricula, adversely affect the implementation of CALL in English classrooms. Further studies are recommended to investigate the actual effect of CALL on students' language proficiency.


2020, Vol 0 (0)
Author(s): Fridah Katushemererwe, Andrew Caines, Paula Buttery

This paper describes an endeavour to build natural language processing (NLP) tools for Runyakitara, a group of four closely related Bantu languages spoken in western Uganda. In contrast with major world languages such as English, for which corpora are comparatively abundant and NLP tools are well developed, computational linguistic resources for Runyakitara are in short supply. First, therefore, we need to collect corpora for these languages before we can proceed to the design of a spell-checker, grammar-checker and applications for computer-assisted language learning (CALL). We explain how we are collecting primary data for a new Runya Corpus of speech and writing, outline the design of a morphological analyser, and discuss how we can use these new resources to build NLP tools. We are initially working with Runyankore–Rukiga, a closely related pair of Runyakitara languages, and we frame our project in the context of NLP for low-resource languages, as well as CALL for the preservation of endangered languages. We put our project forward as a test case for the revitalization of endangered languages through education and technology.
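To make the kind of pipeline described here concrete, the following is a minimal, hypothetical sketch of one planned tool type: a spell-checker that suggests corrections from a corpus-derived wordlist ranked by edit distance and frequency. The wordlist, example tokens, and function names are placeholders chosen for illustration; they are not drawn from the Runya Corpus or from the authors' actual implementation.

```python
from collections import Counter
from typing import Iterable, List

def edit_distance(a: str, b: str) -> int:
    """Standard Levenshtein distance via row-by-row dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, start=1):
        curr = [i]
        for j, cb in enumerate(b, start=1):
            curr.append(min(
                prev[j] + 1,               # deletion
                curr[j - 1] + 1,           # insertion
                prev[j - 1] + (ca != cb),  # substitution (0 cost if characters match)
            ))
        prev = curr
    return prev[-1]

def build_wordlist(corpus_tokens: Iterable[str]) -> Counter:
    """Frequency list from tokenised corpus text (placeholder for a real corpus)."""
    return Counter(corpus_tokens)

def suggest(word: str, wordlist: Counter, max_dist: int = 2, n: int = 3) -> List[str]:
    """Return up to n corrections, ranked by edit distance, then by corpus frequency."""
    candidates = [
        (edit_distance(word, w), -freq, w)
        for w, freq in wordlist.items()
        if abs(len(w) - len(word)) <= max_dist   # cheap length filter before full DP
    ]
    candidates = [c for c in candidates if c[0] <= max_dist]
    return [w for _, _, w in sorted(candidates)[:n]]

# Hypothetical usage with invented tokens; real input would come from the corpus.
wordlist = build_wordlist(["omuntu", "abantu", "omuntu", "ekitabo"])
print(suggest("omunto", wordlist))   # -> ['omuntu']
```

A production tool for an agglutinative Bantu language would likely pair such a candidate generator with the morphological analyser mentioned above, so that well-formed but unseen inflections are not flagged as errors.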

