Online Peer Assessment in Teacher Education

Author(s):  
Fatma Betül KURNAZ ADIBATMAZ
2014 ◽  
Vol 31 (1) ◽  
pp. 1-15 ◽  
Author(s):  
Fred Phillips

ABSTRACT: This paper describes an online system that facilitates peer assessment of students' course work and then uses data from individual case writing assignments in introductory financial accounting to empirically examine associations between peer assessment and case writing performance. Through this description and empirical analysis, the paper addresses the following questions: (1) Why use peer assessment? (2) How does online peer assessment work? (3) Is student peer assessment reliable? (4) What do students think of peer assessment? (5) Does student peer assessment contribute to academic performance? Three key findings from this study are that students at the sophomore level were able to generate reasonably reliable feedback for peers, they valued the experiences involved in providing peer feedback, and giving quality feedback had a more significant and enduring impact on students' accounting case analyses than did receiving quality feedback, after controlling for differences in accounting knowledge and case writing skills.
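The "after controlling for differences" claim above implies a covariate-adjusted model relating feedback quality to case performance. The sketch below is not taken from the paper; it is a minimal illustration, assuming synthetic data and hypothetical variable names, of how such a regression could be fitted with statsmodels.

```python
# Illustrative sketch only: variable names and synthetic data are assumptions,
# not the paper's actual dataset or model specification.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 120  # hypothetical number of students

df = pd.DataFrame({
    "prior_knowledge": rng.normal(0, 1, n),   # accounting knowledge covariate
    "writing_skill": rng.normal(0, 1, n),     # case-writing skill covariate
    "given_quality": rng.normal(0, 1, n),     # quality of feedback the student gave
    "received_quality": rng.normal(0, 1, n),  # quality of feedback the student received
})
# Synthetic outcome: case-analysis score with arbitrary effect sizes plus noise.
df["case_score"] = (
    0.5 * df["prior_knowledge"] + 0.4 * df["writing_skill"]
    + 0.3 * df["given_quality"] + 0.1 * df["received_quality"]
    + rng.normal(0, 1, n)
)

# Regress case performance on giving/receiving feedback quality while
# controlling for prior knowledge and writing skill.
model = smf.ols(
    "case_score ~ given_quality + received_quality + prior_knowledge + writing_skill",
    data=df,
).fit()
print(model.params)
```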


Author(s):  
Ioannis Giannoukos ◽  
Ioanna Lykourentzou ◽  
Giorgos Mpardis ◽  
Vassilis Nikolopoulos ◽  
Vassili Loumos ◽  
...  

2019 ◽  
Author(s):  
Leonam Oliveira ◽  
Wellton Costa de Oliveira ◽  
Selma Santos Rosa ◽  
Andrey Pimentel

2020 ◽  
Vol 8 (2) ◽  
pp. 43
Author(s):  
Ren-Yu Liao ◽  
Ching-Tao Chang ◽  
Chun-Ying Chen

This paper reports on a study involving the design of online peer assessment (PA) activities to support university students’ small-group project-based learning in an introductory course. The study aimed to investigate how different types of PA, namely the rubric (quantitative ratings), peer feedback (qualitative comments), and a hybrid (a combination of the rubric and peer feedback), influence students’ project performance, and to further explore students’ perspectives on online PA. The quantitative findings suggested that (a) students in the hybrid condition tended to have better project performance than those in the peer feedback condition, and (b) students in the rubric condition performed as well as those in both the hybrid and peer feedback conditions. The qualitative findings suggested that, besides the type of assessment, other possible confounding variables that might affect performance included perceived learning benefits, professional assessment, acceptance, and the online PA system.
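Comparing project performance across the rubric, peer-feedback, and hybrid conditions amounts to a three-group comparison. The following is a minimal sketch of such a comparison, assuming a one-way ANOVA on synthetic scores; the group means, spreads, and sample sizes are illustrative and do not come from the study.

```python
# Illustrative sketch only: all numbers are invented, not the study's data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
rubric = rng.normal(75, 8, 30)         # rubric-only condition
peer_feedback = rng.normal(74, 8, 30)  # qualitative-comments condition
hybrid = rng.normal(80, 8, 30)         # rubric + comments condition

# One-way ANOVA across the three PA conditions.
f_stat, p_value = stats.f_oneway(rubric, peer_feedback, hybrid)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```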


Author(s):  
Michiko Kobayashi

The study investigated the effects of anonymity on online peer assessment by comparing three conditions. Fifty-eight preservice teachers at a mid-size US university engaged in a series of online peer assessments during fall 2017. Peer assessment was embedded in a blended course as a required asynchronous activity using the Canvas learning management system. Students were randomly assigned to three peer assessment conditions: anonymous, partially anonymous, and identifiable. They were asked to provide feedback comments and rate the quality of peers’ work. The researcher examined to what extent the three conditions influenced the quality of feedback comments, measured quantitatively through the number of words and the number of negative statements. At the end of the semester, a survey that included 5-point Likert-scale items and several open-ended questions was also distributed to analyse students’ perceptions of peer assessment and anonymity. The results indicate that although students prefer anonymity, it may not be a necessary condition for increasing student engagement.
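Feedback quality is operationalised here through word counts and counts of negative statements. The sketch below shows one plausible way to compute those two metrics from raw comments; the negative-cue word list and the sentence-splitting heuristic are assumptions, not the study's actual coding scheme.

```python
# Illustrative sketch only: the negative-cue list and sentence splitting are
# simple assumptions, not the coding scheme used in the study.
NEGATIVE_CUES = {"not", "never", "lacks", "lacking", "missing", "unclear",
                 "wrong", "confusing", "weak"}

def comment_metrics(comment: str):
    """Return (word count, number of sentences containing a negative cue)."""
    words = comment.split()
    text = comment.replace("!", ".").replace("?", ".")
    sentences = [s.strip() for s in text.split(".") if s.strip()]
    negative = sum(
        1 for s in sentences
        if any(w.strip(",;:").lower() in NEGATIVE_CUES for w in s.split())
    )
    return len(words), negative

sample_feedback = [
    "The argument is clear, but the conclusion is missing supporting evidence.",
    "Great structure! The citations are not formatted consistently.",
]
for comment in sample_feedback:
    print(comment_metrics(comment))
```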

