The Development of an Online Peer Assessment Tool Using Google Applications

Author(s):  
Pongrapee Kaewsaiha ◽  
Sumalee Chanchalor

Author(s):  
Peter M. Ostafichuk ◽  
Carol P. Jaeger

Abstract
This paper explores the implementation, outcomes, and student perceptions of an online tool for anonymous peer assessment of student work. Peer assessment, in which one student assesses the work of another, provides an opportunity for important skill development as well as a fully scalable strategy for rich, timely, and frequent feedback. In first- and third-year engineering courses at the University of British Columbia, we have begun using an online peer assessment tool (peerScholar). The tool divides the peer assessment process into three phases: a creation phase, where the work is written or uploaded; an assessment phase, where students are randomly assigned to assess the work of a set number of their peers; and a review phase, where students review the feedback they received, with options to revise their work or assess the quality of that feedback. We have successfully used this tool in two large (n = 750) classes and one moderate-sized (n = 130) class, with a wide range of student work, including letters, technical memoranda, detailed design reports, and video presentations. Survey results show that student feedback on the tool and the process has been positive. Students at both year levels overwhelmingly recognize the importance of peer assessment: over 90% identified it as an essential skill for an engineer, and over 85% felt that opportunities for peer assessment should be embedded in the curriculum. Both groups indicated that reviewing others' work was beneficial for their own understanding of the material; however, first-year students were more likely than third-year students to put more effort into their work knowing it would be peer assessed, and to find the content of the feedback they received helpful to their learning.
In a third-year mechanical design course, three design assignments were independently assessed by students using peerScholar and by teaching assistants. The outcomes across all measures were encouraging: for each assignment, the student and teaching assistant scores had similar mean, standard deviation, minimum, and maximum values, as well as reasonable correlation (r = 0.5 overall). Overall, we consider the adoption of peerScholar a success: students have been receptive, challenges have been minor, and feedback is more detailed and frequent.


2014 ◽  
Vol 31 (1) ◽  
pp. 1-15 ◽  
Author(s):  
Fred Phillips

ABSTRACT
This paper describes an online system that facilitates peer assessment of students' course work and then uses data from individual case writing assignments in introductory financial accounting to empirically examine associations between peer assessment and case writing performance. Through this description and empirical analysis, the paper addresses the following questions: (1) Why use peer assessment? (2) How does online peer assessment work? (3) Is student peer assessment reliable? (4) What do students think of peer assessment? (5) Does student peer assessment contribute to academic performance? Three key findings from this study are that students at the sophomore level were able to generate reasonably reliable feedback for peers, they valued the experiences involved in providing peer feedback, and giving quality feedback had a more significant and enduring impact on students' accounting case analyses than did receiving quality feedback, after controlling for differences in accounting knowledge and case writing skills.


Author(s):  
Ioannis Giannoukos ◽  
Ioanna Lykourentzou ◽  
Giorgos Mpardis ◽  
Vassilis Nikolopoulos ◽  
Vassili Loumos ◽  
...  

2019 ◽  
Vol 10 ◽  
pp. 1478-1487
Author(s):  
Sonia Dutta Gupta ◽  
Fatimah Abdullah ◽  
Gu Li ◽  
Yang Xueshuang

Peer assessment has attracted increasing attention as an effective assessment tool in recent years. Peer assessment refers to an arrangement in which peers consider the quality of the learning outcomes of others of similar status. It has received attention in various studies owing to the growing interest in student-centered approaches, in which learners are involved throughout the learning process, including assessment. This paper is a critical review of previous studies on peer assessment in English as a Second/Foreign Language (ESL/EFL) contexts. Fifteen peer assessment studies from 2004 to 2017 were extensively reviewed and systematically analyzed. The reviewed studies applied peer assessment to the quality of students' writing outcomes in EFL/ESL contexts, as writing skill occupies an important role in English language teaching. As a critical review, this paper highlights the practical use of peer assessment and the important challenges and issues that need to be considered when utilizing peer assessment in the classroom, in the hope that practical measures of peer assessment will be utilized effectively by educators in ESL and EFL classrooms in the near future.


2019 ◽  
Author(s):  
Leonam Oliveira ◽  
Wellton Costa de Oliveira ◽  
Selma Santos Rosa ◽  
Andrey Pimentel

2020 ◽  
Vol 8 (2) ◽  
pp. 43
Author(s):  
Ren-Yu Liao ◽  
Ching-Tao Chang ◽  
Chun-Ying Chen

This paper reports on a study involving the design of online peer assessment (PA) activities to support university students' small-group project-based learning in an introductory course. The study aimed to investigate how different types of PA, namely the rubric type (quantitative ratings), the peer feedback type (qualitative comments), and the hybrid type (a combination of the rubric and peer feedback), influence students' project performance, and to further explore students' perspectives on online PA. The quantitative findings suggested that (a) students in the hybrid condition tended to achieve better project performance than those in the peer feedback condition, and (b) students in the rubric condition performed as well as those in both the hybrid and peer feedback conditions. The qualitative findings suggested that besides the type of assessment, other possible confounding variables that might affect performance included perceived learning benefits, professional assessment, acceptance, and the online PA system.
