automated feedback
Recently Published Documents

TOTAL DOCUMENTS: 221 (five years: 101)
H-INDEX: 19 (five years: 4)

2022, Vol 112, pp. 103631
Author(s): Jennifer Jacobs, Karla Scornavacco, Charis Harty, Abhijit Suresh, Vivian Lai, ...

2021, pp. 238-242
Author(s): Allan Nicholas, John Blake, Maxim Mozgovoy

Email remains a key mode of communication between faculty and students in higher education institutions. Composing appropriate email texts is an important skill for learners; however, little technological support is available for the pragmatic aspect of email communication – the ways in which social context influences language choices. Furthermore, pragmatics can be undertaught in the language classroom. One approach to providing support for learners while also addressing the issue of giving instruction to large class sizes is via computerisation. In this ongoing research project, we describe the development of a Computerised Diagnostic Language Assessment (C-DLA) of L2 English email writing for Japanese English as a Foreign Language (EFL) learners in Japanese higher education. The C-DLA provides automated feedback to learners on the pragmatic aspects of their draft email texts, with feedback adapting to learners’ success in resolving identified issues. We report on the development phases of the project, challenges encountered, and implications for further research.
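
The abstract does not describe how the C-DLA's adaptive feedback is implemented, so the following Python sketch is only a hypothetical illustration of the general idea it reports: rule-based pragmatic checks on a draft email, with prompts that become more explicit each time the same issue persists across successive drafts. The rule names, prompt texts, and functions are all invented for illustration.

```python
import re

# Hypothetical pragmatic rules; the C-DLA's actual checks and feedback texts
# are not given in the abstract, so these are illustrative placeholders.
RULES = {
    "greeting": {
        "check": lambda text: bool(re.match(r"\s*(dear|hello|hi)\b", text, re.I)),
        # Graduated prompts: an implicit hint first, more explicit advice if the issue persists.
        "prompts": [
            "Look again at how your email opens.",
            "Emails to faculty usually begin with a greeting.",
            "Add a greeting such as 'Dear Professor ...,' at the start.",
        ],
    },
    "request_softening": {
        "check": lambda text: bool(re.search(r"\b(could you|would you|i was wondering)\b", text, re.I)),
        "prompts": [
            "Consider how direct your request sounds.",
            "Requests to faculty are usually softened.",
            "Rephrase the request, e.g. 'I was wondering if you could ...'.",
        ],
    },
}

def diagnose(draft: str, history: dict) -> list:
    """Return feedback for one draft; 'history' records how often each issue has already been flagged."""
    feedback = []
    for name, rule in RULES.items():
        if rule["check"](draft):
            history[name] = 0            # issue absent or resolved: reset the prompt level
            continue
        level = min(history.get(name, 0), len(rule["prompts"]) - 1)
        feedback.append(rule["prompts"][level])
        history[name] = level + 1        # the next draft gets a more explicit prompt
    return feedback

history = {}
print(diagnose("Send me the grades for the midterm.", history))
print(diagnose("Hi Professor, send me the grades for the midterm.", history))
```

In this sketch the first draft receives vague hints and, if the learner does not resolve an issue, later drafts receive progressively more explicit advice, which is one simple way to realise feedback that "adapts to learners’ success in resolving identified issues".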


2021
Author(s): Vivien Challis, Roger Cook, Pranit Anand

This paper outlines an initiative to implement ‘Numbas’, a computer-based tool to support mathematics learning, within the existing learning management system at Queensland University of Technology, where students engaged in formative assessment activities independently and received automated feedback along the way. An initial evaluation was undertaken by learning designers using the ‘Assessment Design Decisions Framework’; although more rigorous evaluation is underway, the results indicate positive outcomes, and appropriate adjustments are likely to be made before the tool is rolled out to other units within the School of Mathematical Sciences. This paper will be of interest to other educators looking for ways to embed independent computer-aided learning of mathematics.
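
Numbas itself is authored through its own web-based editor rather than hand-written scripts, and the abstract gives no implementation detail, so the sketch below is not Numbas code. It is a generic, hypothetical Python illustration of the pattern the initiative relies on: a randomised question generated per student, automatic marking, and immediate formative feedback targeted at the likely mistake. The question type, feedback wording, and function names are assumptions made for illustration.

```python
import random

def make_question(seed: int) -> dict:
    """Generate a randomised power-rule question: differentiate f(x) = a*x^n."""
    rng = random.Random(seed)                     # a per-student seed gives each student a different variant
    a, n = rng.randint(2, 9), rng.randint(2, 5)
    return {"prompt": f"Differentiate f(x) = {a}x^{n}", "coeff": a * n, "power": n - 1}

def mark(question: dict, coeff: int, power: int) -> str:
    """Mark the submitted derivative and return immediate formative feedback."""
    if (coeff, power) == (question["coeff"], question["power"]):
        return "Correct."
    if power == question["power"] + 1:
        return "Not quite: after multiplying by the power, reduce the power by one."
    if coeff != question["coeff"]:
        return "Check the coefficient: it should be the original coefficient times the original power."
    return "Check both the coefficient and the power."

q = make_question(seed=42)
print(q["prompt"])
print(mark(q, coeff=q["coeff"], power=q["power"]))        # a correct attempt
print(mark(q, coeff=q["coeff"], power=q["power"] + 1))    # common slip: forgetting to reduce the power
```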


2021, Vol 14 (12), pp. 189
Author(s): Ameni Benali

It is undeniable that attempts to develop automated feedback systems that support and enhance language learning and assessment have increased in the last few years. The growing demand for using technology in the classroom, together with the promotion of automated written feedback programs by their developers and designers, drives many educational institutions to acquire and use these tools for educational purposes (Chen & Cheng, 2008). It remains debatable, however, whether students’ use of these tools leads to improvement in their essay quality or writing outcomes. In this paper I investigate the affordances and shortcomings of automated writing evaluation (AWE) for students’ writing in ESL/EFL contexts. My discussion shows that AWE can improve the quality of writing and learning outcomes if it is integrated with and supported by human feedback. I provide recommendations for further research into improving AWE tools so that they give more effective and constructive feedback.


2021, Vol 19 (6), pp. 559-574
Author(s): Olav Dæhli, Bjørn Kristoffersen, Per Lauvås jr, Tomas Sandnes

Data modeling is an essential part of IT studies. Learning how to design and structure a database is important when storing data in a relational database, and it is common practice in the IT industry. Most students need much practice and tutoring to master the skill of data modeling and database design, and feedback is important while a student is in the learning process. As class sizes grow and teaching is no longer campus-based only, providing feedback to each individual student can be difficult. Our study proposes a tool to use when introducing database modeling to students. In a collaborative project between the University of South-Eastern Norway (USN) and Kristiania University College (KUC), we have developed a web-based tool named LearnER to teach basic data modeling skills. The tool has been used in six different courses over a period of four academic years. In LearnER, students solve modeling assignments with different levels of difficulty. When they are done, or when they need help, they receive automated feedback, including visual cues. To increase the motivation for solving many assignments, LearnER also includes gamifying elements: each assignment has a maximum score, points are deducted from that score when students ask for help, and students who manage to solve many assignments with little help may end up on a leaderboard. This paper summarizes how students use and experience LearnER. We examine whether students find the exercises interesting, useful, and of reasonable difficulty, whether the automated feedback is valuable, and whether the gamifying elements contribute to their learning. As we have made additions and refinements to LearnER over several years, we also compare student responses from surveys and interviews across these years. In addition, we analyze usage data extracted from the application to learn more about student activity. The results are promising: student activity increases in newer versions of LearnER, most students report that the feedback they receive helps them correct mistakes when solving modeling assignments, and the gamifying elements are also well received. Based on LearnER usage data, we also find and describe typical errors students make and which types of assignments they prefer to solve.
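
The abstract describes the gamification only in outline: a maximum score per assignment, point deductions for requested help, and a leaderboard for students who solve many assignments with little help. The exact point values and ranking rules are not given, so the Python sketch below is a hypothetical rendering of that kind of scheme, with invented numbers, purely to make the mechanic concrete.

```python
from dataclasses import dataclass

@dataclass
class Attempt:
    """One student's attempt at a modeling assignment; the point values are illustrative only."""
    student: str
    max_score: int = 100
    hint_cost: int = 10        # assumed deduction per requested hint
    hints_used: int = 0
    solved: bool = False

    def request_hint(self) -> None:
        self.hints_used += 1   # asking for help costs points later

    def score(self) -> int:
        if not self.solved:
            return 0
        return max(self.max_score - self.hints_used * self.hint_cost, 0)

def leaderboard(attempts: list, top_n: int = 3) -> list:
    """Rank students by score; only those who solved assignments with little help rise to the top."""
    ranked = sorted(((a.student, a.score()) for a in attempts), key=lambda x: x[1], reverse=True)
    return ranked[:top_n]

kari = Attempt("Kari", solved=True)                         # solved without help: full score
ola = Attempt("Ola", solved=True)
ola.request_hint(); ola.request_hint()                      # two hints: 20 points deducted
print(leaderboard([kari, ola]))                             # [('Kari', 100), ('Ola', 80)]
```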


2021, Vol 5 (1), pp. 29
Author(s): Sumin Wang, Yizhong Xu

The present study constructs a college EFL self-access writing mode based on automated feedback, guided by Formative Assessment Theory and Autonomous Learning Theory, and applies it in college EFL teaching practice. Findings of this empirical study suggest that the self-access writing mode contributes to the enhancement of students’ English writing competence and writing motivation, as well as their autonomy in self-revision.


2021, pp. 364-370
Author(s): G. R. Marvez, Joshua Littenberg-Tobias, Teresa Ortega, Joel Breakstone, Justin Reich

2021
Author(s): Nathan Laundry, Denis Nikitenko, Dan Gillis, Judi McCuaig
