Designing, implementing, and evaluating an automated writing evaluation tool for improving EFL graduate students’ abstract writing: a case in Taiwan

2015
Author(s):
Hui-Hsien Feng

2020
Author(s):
Anongnad Petchprasert

Abstract Recently, the integration of linguistics and technology has been promoted and widely applied in linguistics and English writing research for several purposes. One of those purposes is to evaluate English as a Foreign Language (EFL) writing ability with electronic assessment tools. In the current study, an automated writing evaluation tool (Coh-Metrix) was used to characterize English-major students’ writing performance based on the discourse components of their texts. The English texts generated for each of two writing tasks on different topics were collected. The corpus analyses produced by Coh-Metrix identified linguistic and discourse features that were interpreted to assess the English writing abilities of 40 EFL undergraduate students. The students wrote and revised their essays by hand in class and then resubmitted corrected versions in digital form. The results showed that these students demonstrated linguistic flexibility across the writing assignments they produced. The analyses also indicated that text length, word concreteness, and referential and deep cohesion affected the students’ writing performance across the writing tasks. In addition, the findings suggest practical value in using automated text analysis to support teachers’ instructional decisions and to help identify improvement in students’ writing skills.
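Coh-Metrix itself is a web-based analysis tool, so its internals are not reproduced here. As a rough illustration of the three kinds of indices the abstract names (text length, word concreteness, and cohesion between adjacent sentences), the Python sketch below computes naive stand-ins; the mini concreteness lexicon and the overlap measure are hypothetical simplifications, not Coh-Metrix’s actual algorithms.

```python
# Illustrative approximation of three Coh-Metrix-style indices: text length
# (word count), mean word concreteness, and referential cohesion (word overlap
# between adjacent sentences; Coh-Metrix's noun-overlap index is CRFNO1).
# This is NOT the Coh-Metrix implementation; the tiny concreteness lexicon is
# a hypothetical stand-in for a full norms database (e.g., published
# concreteness ratings on a 1-5 scale).
import re

CONCRETENESS = {"essay": 3.9, "paper": 4.9, "idea": 1.6, "student": 4.4,
                "write": 3.3, "argument": 2.1}

def sentences(text):
    """Naive sentence splitter; a real system would use a trained tokenizer."""
    return [s for s in re.split(r"[.!?]+\s*", text) if s]

def words(sentence):
    return re.findall(r"[a-z']+", sentence.lower())

def text_features(text):
    sents = sentences(text)
    all_words = [w for s in sents for w in words(s)]
    rated = [CONCRETENESS[w] for w in all_words if w in CONCRETENESS]
    # Cohesion proxy: share of adjacent sentence pairs with any word overlap.
    pairs = list(zip(sents, sents[1:]))
    overlap = (sum(bool(set(words(a)) & set(words(b))) for a, b in pairs)
               / len(pairs)) if pairs else 0.0
    return {"word_count": len(all_words),
            "mean_concreteness": sum(rated) / len(rated) if rated else None,
            "adjacent_overlap": overlap}

print(text_features("The student wrote an essay. The essay made an argument. "
                    "It was a good idea."))
```

Interpreting such feature vectors across a student’s drafts, rather than any single score, is what supports the instructional decisions the abstract describes.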


Author(s):  
Dragana Lazic

The poster discusses the possibilities of technology-assisted peer feedback in English as a Foreign Language (EFL) writing classrooms with low-proficiency students. It is part of an ongoing research project developed after a study conducted in the first half of 2019 (Lazic & Tsuji, 2020a, 2020b). The first goal is to explore the effectiveness of in-class activities that include technology-assisted peer feedback in improving global aspects of writing, i.e., paragraph structure and content, and to examine the uptake of peer feedback delivered via an Automated Writing Evaluation (AWE) tool, Educational Testing Service (ETS) Criterion®. Second, the study looks at students’ perceptions. Participants were 15 first-year students taking an academic writing class.


2021
Vol 27 (1)
pp. 41
Author(s):
Meilisa Sindy Astika Ariyanto
Nur Mukminatien
Sintha Tresnadewi

Automated Writing Evaluation (AWE) programs have emerged as the latest trend in EFL writing classes. AWE programs act as a supplement to teacher feedback, offering automated suggestions and corrections for students’ linguistic errors in grammar, vocabulary, and mechanics. As there is a need for better recognition of how different AWE brands serve students at different levels, this research sheds light on six university students’ views of one AWE program, ProWritingAid (PWA). The six students are categorized as having high or low writing achievement. This descriptive study delineates the students’ perceptions qualitatively; a semi-structured interview was used to collect the data. The findings suggest the students held positive views of PWA because it made class time more effective, provided useful feedback on grammar, vocabulary choice, and mechanics, and built students’ confidence in their compositions. In addition, for different reasons, the students engaged differently with PWA to enhance their drafts, e.g., using it only for their first drafts or for both first and final drafts. Finally, despite the students’ constructive views of PWA, there was a risk that students engaged only superficially with the program by accepting its corrections directly.
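ProWritingAid’s checking engine is proprietary and its API is not described in the article, so the sketch below substitutes the open-source language_tool_python package to show the general shape of AWE feedback: a list of flagged spans with explanations and candidate corrections that a student can accept or reject. The function name and output format are illustrative assumptions, not PWA’s interface.

```python
# Minimal sketch of AWE-style feedback, assuming the open-source
# language_tool_python package as a stand-in for a commercial AWE engine
# such as ProWritingAid or Criterion (neither tool's API is used here).
import language_tool_python

def awe_feedback(draft: str):
    """Return automated suggestions for grammar/mechanics errors in a draft."""
    tool = language_tool_python.LanguageTool("en-US")
    feedback = []
    for match in tool.check(draft):
        feedback.append({
            "span": draft[match.offset:match.offset + match.errorLength],
            "message": match.message,           # human-readable explanation
            "suggestions": match.replacements,  # candidate corrections
        })
    tool.close()
    return feedback

draft = "He have wrote three essay last semester."
for item in awe_feedback(draft):
    print(item["span"], "->", item["suggestions"][:3], "|", item["message"])
```

Applying the first entry in "suggestions" without reading the accompanying message is precisely the one-click, superficial engagement the study cautions against; pedagogical use would ask students to weigh each suggestion before revising.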

