Direct Observation of Clinical Skills Feedback Scale: Development and Validity Evidence

2016 ◽ Vol 28 (4) ◽ pp. 385-394
Author(s): Samantha Halman, Nancy Dudek, Timothy Wood, Debra Pugh, Claire Touchie, ...
Methodology ◽ 2018 ◽ Vol 14 (4) ◽ pp. 156-164
Author(s): Keith A. Markus

Abstract. Bollen and colleagues have advocated the use of formative scales even though formative scales lack an adequate underlying theory, such as the one that underlies reflective scales, to guide development or validation. Three conceptual impediments hinder the development of such theory: the redefinition of measurement restricted to the context of model fitting, the inscrutable notion of conceptual unity, and a systematic conflation of item scores with attributes. Setting aside these impediments opens the door to progress in developing the theory needed to support formative scale use. A broader perspective facilitates consideration of standard scale development concerns as applied to formative scales, including scale construction, item analysis, reliability, and item bias. While formative scales require a different pattern of emphasis, all five of the traditional sources of validity evidence apply to them. Responsible use of formative scales requires greater attention to developing the requisite underlying theory.
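As background for readers less familiar with the distinction at issue, here is a minimal sketch in conventional structural equation modeling notation (a textbook formalization, not taken from the article itself): in a reflective model the latent variable causes the item scores, whereas in a formative model the composite is defined by them.

```latex
% Reflective measurement model: the latent variable \eta causes each item x_i
x_i = \lambda_i \eta + \varepsilon_i, \qquad i = 1, \dots, k

% Formative measurement model: the composite \eta is determined by the items
\eta = \sum_{i=1}^{k} \gamma_i x_i + \zeta
```

Because the arrows run from items to construct in the formative case, reflective tools such as internal-consistency reliability do not transfer directly, which is precisely the theoretical gap the abstract describes.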


JAMA ◽ 2009 ◽ Vol 302 (12) ◽ pp. 1316
Author(s): Jennifer R. Kogan, Eric S. Holmboe, Karen E. Hauer

2021
Author(s): Todd Guth, Yoon Soo Park, Janice Hanson, Rachel Yudkowsky

Abstract
Background: The Core Physical Exam (CPE) has been proposed as a set of key physical exam (PE) items for teaching and assessing PE skills in medical students, and as the basis of a Core + Cluster curriculum. Beyond the initial development of the CPE and the proposal of the Core + Cluster curriculum, no additional validity evidence has been presented for the use of the CPE to teach or assess the PE skills of medical students. A modified version of the CPE was therefore developed by faculty at the University of Colorado School of Medicine (UCSOM) and implemented in the school's clinical skills course in the context of an evolving Core + Cluster curriculum.
Methods: Validity evidence for the 25-item UCSOM CPE was analyzed using longitudinal assessment data from 366 medical students (Classes of 2019 and 2020), obtained from September 2015 through December 2019. Using Messick's unified validity framework, validity evidence specific to content, response process, internal structure, relationship to other variables, and consequences was gathered.
Results: Content and response process validity evidence included expert content review and rater training. For internal structure, a generalizability study phi coefficient of 0.258 suggests low reliability for a single assessment, due to variability in learner performance by occasion and across CPE items. Correlations of performance on the UCSOM CPE with other PE assessments were low, ranging from .00 to .34. Consequences were explored through determination of a pass-fail cut score: following a modified Angoff process, clinical skills course directors selected a consensus pass-fail cut score of 80% as a defensible and practical threshold for entry into precepted clinical experiences.
Conclusions: Validity evidence supports the use of the UCSOM CPE as an instructional strategy for teaching PE skills and as a formative assessment of readiness for precepted clinical experiences. The low generalizability coefficient suggests that inferences about PE skills based on the UCSOM CPE alone should be made with caution, and that in isolation it should be used primarily as a formative assessment.
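As a hedged illustration of the internal-structure evidence mentioned above, the sketch below computes a generalizability-theory phi (dependability) coefficient for a person x occasion x item design. The function follows the standard G-theory formula for absolute decisions such as pass/fail; the variance components and facet sizes are invented placeholders, not the study's estimates.

```python
# Illustrative G-theory phi (dependability) coefficient for a
# person x occasion x item (p x o x i) design. Variance components
# would normally come from an ANOVA or mixed-model fit; the numbers
# below are made-up placeholders.

def phi_coefficient(var_p, other_components, facet_sizes):
    """Phi = var_p / (var_p + absolute error variance).

    other_components: dict mapping a variance component's subscript to its
        estimate, e.g. {"o": ..., "i": ..., "po": ..., "pi": ..., "oi": ..., "poi": ...}
    facet_sizes: dict mapping each facet ("o", "i") to the number of levels
        averaged over in the decision study.
    """
    abs_error = 0.0
    for name, var in other_components.items():
        # Each component is divided by the product of the sizes of the
        # facets appearing in its subscript (persons "p" excluded).
        divisor = 1
        for facet, n in facet_sizes.items():
            if facet in name:
                divisor *= n
        abs_error += var / divisor
    return var_p / (var_p + abs_error)

# Placeholder variance components (NOT the study's estimates):
components = {"o": 0.10, "i": 0.20, "po": 0.15, "pi": 0.25, "oi": 0.05, "poi": 0.40}
print(phi_coefficient(var_p=0.12, other_components=components,
                      facet_sizes={"o": 2, "i": 25}))
```

A phi well below the conventional 0.8 threshold, as reported above, is exactly the signal that a single administration should be treated as formative rather than high-stakes.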


2011 ◽ Vol 11 (5) ◽ pp. 394-402
Author(s): Ellen K. Hamburger, Sandra Cuzzi, Dale A. Coddington, Angela M. Allevi, Joseph Lopreiato, ...

Author(s): Reza M. Munandar, Yoyo Suhoyo, Tridjoko Hadianto

Background: The Mini-CEX was developed to assess clinical skills by direct observation. As a clinical skills assessment tool, the Mini-CEX must fulfill four requirements: validity, reliability, educational effect on students, and practicality. The purpose of this study was to evaluate the validity, reliability, and feasibility of the Mini-CEX as a clinical skills assessment tool in the core clerkship program at the Faculty of Medicine, Universitas Gadjah Mada.
Method: Seventy-four clerkship students from Internal Medicine and 42 clerkship students from the Neurology Department were asked to complete observed Mini-CEX encounters (a minimum of four in Internal Medicine and two in the Neurology Department) between September 2010 and January 2011. Validity was analyzed with the Kruskal-Wallis test for Internal Medicine and the Mann-Whitney test for the Neurology Department, reliability was analyzed based on the G coefficient, and feasibility was analyzed using descriptive statistics.
Results: Validity of the Mini-CEX was indicated by p < 0.001 in Internal Medicine and p = 0.250 in the Neurology Department. The G coefficient for Internal Medicine and the Neurology Department was 0.98 and 0.61, respectively. Feasibility in Internal Medicine and the Neurology Department was 79.7% and 100%, respectively.
Conclusion: The Mini-CEX is valid and reliable in Internal Medicine but not in the Neurology Department. Feasibility is good in both departments.
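A minimal sketch of the two significance tests named in the abstract, using SciPy; the score lists are fabricated placeholders standing in for the Mini-CEX ratings, which are not available here.

```python
# Illustrative use of the two tests named in the abstract; the score
# lists are invented placeholders, not the study's data.
from scipy.stats import kruskal, mannwhitneyu

# Internal Medicine: Kruskal-Wallis compares three or more independent
# groups (e.g., Mini-CEX scores grouped by rotation period).
group_a = [5.0, 5.5, 6.0, 6.5]
group_b = [6.0, 6.5, 7.0, 7.5]
group_c = [7.0, 7.5, 8.0, 8.5]
h_stat, p_im = kruskal(group_a, group_b, group_c)
print(f"Kruskal-Wallis: H = {h_stat:.2f}, p = {p_im:.4f}")

# Neurology: Mann-Whitney U compares two independent groups.
early = [5.0, 5.5, 6.0, 6.5, 7.0]
late = [6.0, 6.5, 7.0, 7.5, 8.0]
u_stat, p_neu = mannwhitneyu(early, late, alternative="two-sided")
print(f"Mann-Whitney: U = {u_stat:.1f}, p = {p_neu:.4f}")
```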


2021 ◽ Vol 11 (1)
Author(s): Shideh Dabir, Mohammad Hoseinzadeh, Faramarz Mosaffa, Behnam Hosseini, Mastaneh Dahi, ...

Background: The ultimate result of patient care is one of the most important outcomes in medical education. Several methods, including the direct observation of procedural skills (DOPS), have been proposed to assess professional competencies in clinical practice.
Objectives: This study aimed to assess the effects of the Repeated DOPS (R-DOPS) method on the performance of procedural skills in anesthesiology residents.
Methods: The procedural skill performance of anesthesiology residents was assessed using a standard DOPS protocol from May to October 2019. Scores were recorded objectively, and satisfaction rates with the two DOPS exams were assessed.
Results: We found a considerable improvement in anesthesiology residents' procedural skill performance, especially in the basic items of the anesthesiology residency curriculum. In addition, residents' satisfaction was significantly improved after the second DOPS.
Conclusions: R-DOPS leads to improved training outcomes, including procedural skill assessment, time to feedback for trainees, and trainee satisfaction.
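The abstract does not name the statistical test behind "significantly improved"; purely as an illustrative assumption, a paired non-parametric comparison of first- versus repeated-DOPS scores might look like the following (all numbers invented).

```python
# Hypothetical paired comparison of residents' first vs. repeated DOPS
# scores; the abstract does not specify the test used, and these numbers
# are invented placeholders.
from scipy.stats import wilcoxon

first_dops  = [62, 70, 58, 75, 66, 71, 60, 68]   # per-resident scores, round 1
second_dops = [70, 74, 65, 80, 72, 78, 66, 73]   # same residents, round 2

stat, p_value = wilcoxon(first_dops, second_dops)  # paired, non-parametric
print(f"Wilcoxon signed-rank: W = {stat:.1f}, p = {p_value:.4f}")
```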


