Utilizing an Automated Analysis Tool to Evaluate EFL Students’ Writing Performance
Abstract
Recently, the integration of linguistics and technology has been promoted and widely applied in the fields of linguistics and English writing for several purposes. One of those purposes is to evaluate EFL writing ability by using electronic assessment tools in language teaching or rhetorical studies. In this study, an automated writing evaluation tool (Coh-Metrix version 3.0) was used to describe English-major students’ writing performance based on six discourse components of their texts and to determine the associations among those six Coh-Metrix measures. Eighty EFL texts, produced in response to writing tasks on two different topics, were collected. The corpus analyses obtained from Coh-Metrix identify linguistic and discourse features that were interpreted to characterize the English writing abilities of the 40 EFL undergraduate students. The students wrote and revised their essays in handwritten form in class and resubmitted them in digital form with corrections made. The results showed that these writers demonstrated linguistic flexibility across the writing prompts. The analyses also indicated that text length, word concreteness, and the use of referential and deep cohesion affected the students’ writing performance across the writing tasks. In addition, the findings suggest that Coh-Metrix has practical value in supporting teachers’ instructional decisions and in helping to identify improvement in students’ writing skills.
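To illustrate the kind of text-level indices an automated evaluation tool reports, the sketch below computes a few simple descriptive measures over an essay: length, mean sentence length, lexical diversity, and a crude adjacent-sentence word-overlap proxy for referential cohesion. This is a minimal illustration in standard-library Python only; it does not reproduce Coh-Metrix's actual algorithms, component definitions, or scoring, and the function and field names are hypothetical.

```python
import re

def descriptive_indices(text):
    """Compute simple, illustrative text-level indices.

    These are rough stand-ins for the descriptive and cohesion
    measures an automated tool might report, not Coh-Metrix's
    own computations.
    """
    # Naive sentence split on terminal punctuation.
    sentences = [s for s in re.split(r"[.!?]+\s*", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text.lower())
    n_words = len(words)
    n_sentences = len(sentences)

    # Lexical diversity: type-token ratio.
    ttr = len(set(words)) / n_words if n_words else 0.0

    # Crude referential-cohesion proxy: word overlap between
    # adjacent sentences, normalized by the shorter sentence.
    overlaps = []
    prev = None
    for s in sentences:
        toks = set(re.findall(r"[A-Za-z']+", s.lower()))
        if prev is not None and prev and toks:
            overlaps.append(len(prev & toks) / min(len(prev), len(toks)))
        prev = toks
    cohesion = sum(overlaps) / len(overlaps) if overlaps else 0.0

    return {
        "words": n_words,
        "sentences": n_sentences,
        "mean_sentence_length": n_words / n_sentences if n_sentences else 0.0,
        "type_token_ratio": round(ttr, 3),
        "adjacent_overlap": round(cohesion, 3),
    }

sample = "The cat sat on the mat. The cat then slept. A dog barked outside."
print(descriptive_indices(sample))
```

In practice, longer texts would need stemming or lemmatization before measuring overlap, since surface word matching undercounts cohesion carried by pronouns and inflected forms.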