Embracing Change in the Pursuit of Excellence: Transitioning to the Clinical Internship Evaluation Tool for Student Clinical Performance Assessment

2020 ◽ Vol 34 (4) ◽ pp. 313-320
Author(s): Sara North, Amanda Sharp

2012 ◽ Vol 68 (3)
Author(s): C. Joseph, J. Frantz, C. Hendricks, M. Smith

Clinical practice is an essential requirement of any graduate physiotherapy programme. Valid and reliable assessment tools are therefore paramount for measuring key competencies in the real-world setting. This study aimed to determine the internal consistency and inter-rater reliability of a newly developed and validated clinical performance assessment form. A cross-sectional quantitative research design was used, comprising paired evaluations of 32 student examinations (17 treatment and 15 assessment) performed by two independent clinical educators. Cronbach's alpha was computed to assess internal consistency, and intraclass correlation coefficients (ICCs) with 95% confidence intervals were computed to determine agreement between paired examiners. Internal consistency was substantial for all key performance areas of both examinations, except for time and organisational management (0.21) in the treatment examination and professionalism (0.42) in the assessment examination. Overall internal consistency was 0.89 for the treatment examination and 0.73 for the assessment examination, indicating substantial agreement. For agreement between raters, the ICCs for the overall marks were 0.90 (treatment) and 0.97 (assessment). Clinical educators demonstrated a high level of reliability in assessing students' competence with the newly developed clinical performance assessment form. These findings underscore the reliability of results obtained through observation of student examinations and add a further tool for quality assurance in physiotherapy clinical practice assessment.
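To illustrate the internal-consistency statistic named above, the sketch below computes Cronbach's alpha from an examinees-by-items score matrix. This is a minimal illustration only: the `cronbach_alpha` helper and the score values are invented placeholders, not the study's data or analysis code.

```python
# Illustrative sketch of Cronbach's alpha; the scores are made-up placeholders.
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for an (examinees x items) matrix of scores."""
    k = scores.shape[1]                          # number of items (performance areas)
    item_vars = scores.var(axis=0, ddof=1)       # sample variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)   # sample variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical ratings for 5 examinees on 4 performance areas (1-5 scale)
scores = np.array([
    [4, 5, 4, 5],
    [3, 4, 3, 4],
    [5, 5, 4, 5],
    [2, 3, 2, 3],
    [4, 4, 4, 4],
])
print(round(cronbach_alpha(scores), 2))  # → 0.97
```

Values near 1 indicate that the performance areas vary together across examinees; the low 0.21 reported for time and organisational management would mean that item tracked the others poorly.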


2002 ◽ Vol 14 (2) ◽ pp. 124-132
Author(s): Robert J. Bulik, Ann W. Frye, Michael R. Callaway, Cecilia M. Romero, Diedra J. Walters

2002 ◽ Vol 36 (9) ◽ pp. 827-832
Author(s): Simone Gorter, Jan-Joost Rethans, Désirée Van Der Heijde, Albert Scherpbier, Harry Houben, ...

2006 ◽ Vol 12 (2) ◽ pp. 239-260
Author(s): Marjan J. B. Govaerts, Cees P. M. van der Vleuten, Lambert W. T. Schuwirth, Arno M. M. Muijtjens

2021 ◽ Vol Publish Ahead of Print
Author(s): Ignacio Pla-Gil, María Aragonés Redó, Tomàs Pérez-Carbonell, Paz Martínez-Beneyto, Miguel Orts Alborch, ...

Author(s): John Woller, Sean Tackett, Ariella Apfel, Janet Record, Danelle Cayea, ...

We aimed to determine whether it was feasible to assess medical students as they completed a virtual sub-internship. Six students (out of 31 who completed an in-person sub-internship) participated in a 2-week virtual sub-internship, caring for patients remotely. Residents and attendings assessed those 6 students in 15 domains using the same assessment measures from the in-person sub-internship. Raters marked “unable to assess” in 75/390 responses (19%) for the virtual sub-internship versus 88/3,405 (2.6%) for the in-person sub-internship (P=0.01), most frequently for the virtual sub-internship in the domains of the physical examination (21, 81%), rapport with patients (18, 69%), and compassion (11, 42%). Students received complete assessments in most areas. Scores were higher for the in-person than the virtual sub-internship (4.67 vs. 4.45, P<0.01) for students who completed both. Students uniformly rated the virtual clerkship positively. Students can be assessed in many domains in the context of a virtual sub-internship.
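The headline rates above follow directly from the reported counts; a quick check of that arithmetic (using only the numbers given in the abstract):

```python
# Sanity check of the "unable to assess" rates reported in the abstract.
virtual_unable, virtual_total = 75, 390       # virtual sub-internship responses
inperson_unable, inperson_total = 88, 3405    # in-person sub-internship responses

print(f"virtual:   {virtual_unable / virtual_total:.1%}")     # → 19.2%
print(f"in-person: {inperson_unable / inperson_total:.1%}")   # → 2.6%
```

Both match the abstract's figures of 19% and 2.6%; the P=0.01 comparison is the authors' own analysis and is not reproduced here.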

