Student evaluations of teaching: the impact of faculty procedures on response rates

2018 ◽  
Vol 44 (1) ◽  
pp. 37-49 ◽  
Author(s):  
Karen Young ◽  
Jeffrey Joines ◽  
Trey Standish ◽  
Victoria Gallagher
PeerJ ◽  
2017 ◽  
Vol 5 ◽  
pp. e3299 ◽  
Author(s):  
Bob Uttl ◽  
Dylan Smibert

Anonymous student evaluations of teaching (SETs) are used by colleges and universities to measure teaching effectiveness and to make decisions about faculty hiring, firing, re-appointment, promotion, tenure, and merit pay. Although numerous studies have found that SETs correlate with various teaching effectiveness irrelevant factors (TEIFs) such as subject, class size, and grading standards, it has been argued that such correlations are small and do not undermine the validity of SETs as measures of professors’ teaching effectiveness. However, previous research has generally used inappropriate parametric statistics and effect sizes to examine and evaluate the influence of TEIFs on personnel decisions. Accordingly, we examined the influence of quantitative vs. non-quantitative courses on SET ratings and SET-based personnel decisions using 14,872 publicly posted class evaluations, where each evaluation represents a summary of SET ratings provided by the individual students responding in each class. In total, 325,538 individual student evaluations from a US mid-size university contributed to these class evaluations. The results demonstrate that class subject (math vs. English) is strongly associated with SET ratings and has a substantial impact on professors being labeled satisfactory vs. unsatisfactory and excellent vs. non-excellent, and that the impact varies substantially depending on the criteria used to classify professors as satisfactory vs. unsatisfactory. Professors teaching quantitative courses are far more likely not to receive tenure, promotion, and/or merit pay when their performance is evaluated against common standards.


2021 ◽  
Vol 4 (1) ◽  
Author(s):  
Angela Page ◽  
Jennifer Charteris

Student evaluations of teaching in universities have long been contested. Many see value in them for ongoing improvement and for holding faculty to account for their pedagogical practice. However, the anonymity of these online surveys, which permit students enrolled in units to provide feedback on teaching and learning, can produce ‘keyboard warriors’. Anonymous surveys can provide a platform for students to engage in cyber-aggressive behaviours that damage staff health and wellbeing and are a concern for workplace safety. We draw on published results from an existing study of student evaluations of teaching to signal that, in the worst instances, student evaluations of teaching evoke student cyberaggression.


2021 ◽  
Vol 5 (1) ◽  
Author(s):  
Edvard Pedersen

Background: Summative student evaluations of teaching are widely used to evaluate course content and instruction. However, this feedback does not benefit the students who are providing it, and it may arrive too late to correct issues in the course. Several issues with this type of feedback are tackled in this work: (i) the evaluation does not help the current students; (ii) the feedback is often imprecise; (iii) evaluations focus on the quality of the teaching rather than the impact of the learning. Previous work [1, 2] has shown that continuous student involvement can improve the feedback received. The idea in this project is to perform this student panel interaction in an informal setting, continuously throughout the semester.

Methods: This intervention was performed for two consecutive semesters in different courses. During the first lecture of the semester, students were recruited to participate in a weekly informal meeting to discuss the instruction in the course. Weekly meetings were held for one full semester, with each meeting lasting around 45 minutes. Notes were taken on all actionable items, and a summarizing report was written. The intervention was evaluated through inspection of the actionable items in the notes, as well as of the items implemented.

Results: 104/116 actionable items were identified during spring/fall 2020, of which 57/65 were implemented immediately. Participation in the student panel group was high and stable.

