peer rating
Recently Published Documents


TOTAL DOCUMENTS

76
(FIVE YEARS 8)

H-INDEX

13
(FIVE YEARS 0)

2021 ◽  
pp. 1-57
Author(s):  
Lydia Garms ◽  
Siaw-Lynn Ng ◽  
Elizabeth A. Quaglia ◽  
Giulia Traverso

When peers rate each other, they may rate inaccurately to boost their own reputation or unfairly lower another’s. This could be mitigated by having a reputation server incentivise accurate ratings with a reward. However, assigning rewards becomes challenging when ratings are anonymous, since the reputation server cannot tell which peers to reward for rating accurately. To address this, we propose an anonymous peer rating system in which users can be rewarded for accurate ratings, and we formally define its model and security requirements. In our system ratings are rewarded in batches, so that users claiming their rewards only reveal they authored one in this batch of ratings. To ensure the anonymity set of rewarded users is not reduced, we also split the reputation server into two entities, the Rewarder, who knows which ratings are rewarded, and the Reputation Holder, who knows which users were rewarded. We give a provably secure construction satisfying all the security properties required. For our construction we use a modification of a Direct Anonymous Attestation scheme to ensure that peers can prove their own reputation when rating others, and that multiple feedback on the same subject can be detected. We then use Linkable Ring Signatures to enable peers to be rewarded for their accurate ratings, while still ensuring that ratings are anonymous. Our work results in a system which allows accurate ratings to be rewarded, whilst still providing anonymity of ratings with respect to the central entities managing the system.
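The detection of multiple feedback on the same subject relies on a linking tag that is deterministic per (signer, subject) pair. The toy sketch below illustrates only that linking intuition with a plain hash; it is not the paper's construction (a real Direct Anonymous Attestation or linkable ring signature scheme would prove knowledge of the secret key in zero knowledge rather than hashing it directly), and all names are hypothetical.

```python
import hashlib

def link_tag(secret_key: bytes, subject: bytes) -> str:
    """Deterministic per-(key, subject) tag: the same rater always
    produces the same tag for the same subject, so duplicate feedback
    is detectable, while different raters' tags stay unlinked."""
    return hashlib.sha256(secret_key + b"|" + subject).hexdigest()

# Two raters, one subject being rated:
alice, bob = b"alice-secret", b"bob-secret"
subject = b"seller-42"

t1 = link_tag(alice, subject)
t2 = link_tag(alice, subject)   # Alice rates the same subject again
t3 = link_tag(bob, subject)

assert t1 == t2   # repeated feedback by one rater links
assert t1 != t3   # tags from different raters do not link
```

In the actual scheme the tag would be accompanied by a zero-knowledge proof tying it to a valid credential, so the server can detect duplicates without learning which user produced the rating.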


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Ali Khodi

The present study investigated factors that affect EFL writing scores using generalizability theory (G-theory). To this end, one hundred and twenty students completed one independent and one integrated writing task. Their performances were then scored by six raters: one self-rating, three peer ratings, and two instructor ratings. The main purpose of the study was to determine the relative and absolute contributions of different facets, such as student, rater, task, method of scoring, and background of education, to the validity of writing assessment scores. The results indicated three major sources of variance: (a) the student by task by method of scoring (nested in background of education) interaction (STM:B), contributing 31.8% of the total variance; (b) the student by rater by task by method of scoring (nested in background of education) interaction (SRTM:B), contributing 26.5%; and (c) the student by rater by method of scoring (nested in background of education) interaction (SRM:B), contributing 17.6%. With regard to the G-coefficients in the G-study (relative G-coefficient ≥ 0.86), the assessment results were found to be highly valid and reliable. The sources of error variance were the student by rater (nested in background of education) interaction (SR:B) and the rater by background of education interaction, contributing 99.2% and 0.8% of the error variance, respectively. Additionally, ten separate G-studies were conducted to investigate the contribution of different facets across rater, task, and method of scoring as the differentiation facet. These studies suggested that peer rating, the analytical scoring method, and integrated writing tasks were the most reliable and generalizable designs for writing assessment.
Finally, five decision studies (D-studies) were conducted at the optimization level, indicating that at least four raters (G-coefficient = 0.80) are necessary for a valid and reliable assessment. Based on these results, to achieve the greatest gain in generalizability, teachers should have their students take two writing assessments and have their performance rated with at least two scoring methods by at least four raters.
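The D-study logic above can be sketched for a simple persons-by-raters design (not the study's full nested design): the relative G-coefficient is Eρ² = σ²_p / (σ²_p + σ²_{pr,e}/n_r), so adding raters shrinks the error term. The variance components below are illustrative assumptions, not the study's estimates.

```python
def relative_g(var_p: float, var_pr_e: float, n_raters: int) -> float:
    """Relative G-coefficient for a persons x raters design:
    person variance over person variance plus averaged interaction/error."""
    return var_p / (var_p + var_pr_e / n_raters)

# Hypothetical variance components (person, person-by-rater/error):
var_p, var_pr_e = 1.0, 1.0

for n in range(1, 7):
    print(f"{n} raters -> G = {relative_g(var_p, var_pr_e, n):.2f}")
# With these components, four raters give G = 0.80, matching the
# threshold the D-studies above treat as acceptable.
```

The same calculation, repeated over candidate numbers of raters, tasks, or scoring methods, is how a D-study identifies the cheapest design that still reaches the target coefficient.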


2021 ◽  
Vol 124 ◽  
pp. 01001
Author(s):  
Hairuzila Idrus ◽  
Herri Mulyono ◽  
Raihan Mahirah Ramli ◽  
Ena Bhattacharyya ◽  
Siti Zulaiha

Online assessment has become increasingly popular in tertiary education and has brought significant changes to the assessment process. Several studies on online assessment, particularly online peer assessment, have found that it benefits students in teaching, learning and assessment. This paper discusses the results of a study conducted in a Chemical Engineering class that used online peer assessment as one of its assessment tools. The objective of the study was to investigate the effectiveness of online peer assessment for students’ learning from the students’ perspective. The students’ perspective was chosen because they are the ones going through the process; the effectiveness of the assessment for their learning can therefore only be known if the information is obtained from them. The students were divided into small groups and asked to conduct online peer assessment using any free online survey tool. Each group had to conduct two or three cycles of peer assessment to ensure the validity of the ratings. The peer ratings were used to produce individual marks for group work. At the end of the semester, the students were asked to write a reflection on the effectiveness of online peer assessment in their learning process for the course. The study found that students have positive perceptions of the effectiveness of this form of assessment, as discussed further in the paper. Pedagogical implications of using such online peer assessment in students’ learning tasks are also discussed.
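One common way to turn peer ratings into individual marks for group work, as described above, is a WebPA-style weighting: each member's mark is the group mark scaled by that member's share of the peer-rating mean. The paper does not specify its exact formula, so the sketch below is a hypothetical illustration.

```python
def individual_marks(group_mark: float,
                     peer_ratings: dict[str, float]) -> dict[str, float]:
    """Scale a common group mark by each member's averaged peer rating
    relative to the group's mean rating (WebPA-style weighting)."""
    mean = sum(peer_ratings.values()) / len(peer_ratings)
    return {name: round(group_mark * rating / mean, 1)
            for name, rating in peer_ratings.items()}

# Hypothetical averaged peer scores from two or three rating cycles:
ratings = {"Ana": 4.5, "Ben": 4.0, "Cho": 3.5}
print(individual_marks(80.0, ratings))
# -> {'Ana': 90.0, 'Ben': 80.0, 'Cho': 70.0}
```

Running several rating cycles, as the students in the study did, stabilises the averaged scores before they are fed into a weighting like this one.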


Author(s):  
Lydia Garms ◽  
Siaw-Lynn Ng ◽  
Elizabeth A. Quaglia ◽  
Giulia Traverso
Keyword(s):  

Author(s):  
Maizey Benner ◽  
Daniel Ferguson ◽  
Behzad Beigpourian ◽  
Mathew Ohland
Keyword(s):  

2018 ◽  
Vol 44 (6) ◽  
pp. 835-847 ◽  
Author(s):  
Jin Liu ◽  
Xiuyan Guo ◽  
Ruiqin Gao ◽  
Paul Fram ◽  
Yu Ling ◽  
...  

Author(s):  
Giulia Traverso ◽  
Denis Butin ◽  
Johannes Buchmann ◽  
Alex Palesandro
Keyword(s):  
