The validity and reliability of a Direct Observation of Procedural Skills assessment tool: assessing colonoscopic skills of senior endoscopists

2012, Vol 75 (3), pp. 591-597
Author(s): John Roger Barton, Sally Corbett, Cees Petronella van der Vleuten

2019, Vol 115 (2), pp. 234-243
Author(s): Keith Siau, James Crossley, Paul Dunckley, Gavin Johnson, Mark Feeney, ...

Anaesthesia, 2014, Vol 69 (6), pp. 604-612
Author(s): M. J. Watson, D. M. Wong, R. Kluger, A. Chuan, M. D. Herrick, ...

2016, Vol 44 (2), pp. 201-206
Author(s): A. Chuan, S. Thillainathan, P. L. Graham, B. Jolly, D. M. Wong, ...

Author(s): Reza M. Munandar, Yoyo Suhoyo, Tridjoko Hadianto

Background: The Mini-CEX was developed to assess clinical skills by direct observation. As a clinical skills assessment tool, the Mini-CEX must satisfy four requirements: validity, reliability, effects on students, and practicality. The purpose of this study was to evaluate the validity, reliability, and feasibility of the Mini-CEX as a clinical skills assessment tool in the core clerkship program at the Faculty of Medicine, Universitas Gadjah Mada. Method: Seventy-four clerkship students from Internal Medicine and 42 from the Neurology Department were asked to complete observed Mini-CEX encounters (a minimum of four in Internal Medicine and two in Neurology) between September 2010 and January 2011. Validity was analyzed with the Kruskal-Wallis test for Internal Medicine and the Mann-Whitney test for the Neurology Department, reliability was analyzed based on the G coefficient, and feasibility was analyzed using descriptive statistics. Results: Validity was supported in Internal Medicine (p < 0.001) but not in the Neurology Department (p = 0.250); the G coefficients for Internal Medicine and the Neurology Department were 0.98 and 0.61, respectively. Feasibility was 79.7% in Internal Medicine and 100% in the Neurology Department. Conclusion: The Mini-CEX is valid and reliable in Internal Medicine but not in the Neurology Department. Feasibility is good in both departments.
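To make the analysis pipeline concrete, here is a minimal sketch in Python of the tests named above, using invented scores rather than the study's data. The generalizability (G) coefficient is computed for an assumed crossed students-by-encounters design, since the abstract does not specify the G-study layout.

# Hypothetical sketch of the abstract's analyses; all scores are invented.
import numpy as np
from scipy import stats

# Validity: do Mini-CEX scores discriminate between groups of students?
early = [5.2, 6.1, 6.8, 7.0, 6.4]    # hypothetical early-rotation scores
middle = [6.5, 7.2, 7.8, 8.1, 7.4]   # hypothetical mid-rotation scores
late = [7.9, 8.4, 8.8, 9.0, 8.2]     # hypothetical late-rotation scores
h_stat, p_kw = stats.kruskal(early, middle, late)                        # 3+ groups (Internal Medicine)
u_stat, p_mw = stats.mannwhitneyu(early, late, alternative="two-sided")  # 2 groups (Neurology)

# Reliability: relative G coefficient for a crossed students x encounters design,
# G = var(person) / (var(person) + var(residual) / n_encounters).
scores = np.array([
    [6.0, 6.5, 7.0, 6.8],   # rows = students, columns = encounters
    [7.5, 7.8, 8.0, 7.9],
    [5.0, 5.5, 5.2, 5.8],
    [8.1, 8.4, 8.0, 8.6],
])
n_p, n_e = scores.shape
grand = scores.mean()
person_means = scores.mean(axis=1)
encounter_means = scores.mean(axis=0)
ms_person = n_e * ((person_means - grand) ** 2).sum() / (n_p - 1)
resid = scores - person_means[:, None] - encounter_means[None, :] + grand
ms_resid = (resid ** 2).sum() / ((n_p - 1) * (n_e - 1))
var_person = max((ms_person - ms_resid) / n_e, 0.0)   # person variance component
g_coef = var_person / (var_person + ms_resid / n_e)   # relative G coefficient
print(f"Kruskal-Wallis p = {p_kw:.4f}; Mann-Whitney p = {p_mw:.4f}; G = {g_coef:.2f}")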


2010, Vol 2 (1), pp. 108-110
Author(s): Joseph Gigante, Rebecca Swan

Abstract Background: The Accreditation Council for Graduate Medical Education promotes direct observation of residents as a key assessment tool for competency in patient care, professionalism, and communication skills. Although tools exist, validity and reliability have not been demonstrated for most, and many tools may have limited feasibility because of time constraints and other reasons. We conducted a study to measure the feasibility of a simplified observation tool to evaluate these competencies and provide timely feedback. Methods: In the pediatric resident continuity clinic of a large children's hospital, we used a direct observation form with a 3-point scale for 16 items in the domains of patient care, professionalism, and communication skills. The form was divided by portion of visit, with specific items mapped to 1 or more of the competencies, and was used to provide direct oral feedback to the resident. Faculty and residents completed surveys rating the process (ease of use, satisfaction, and self-assessed usefulness) on a 5-point Likert scale. Results: The study encompassed 89 surveys completed by attending physicians; 98% (87 of 89) of the time the form was easy to use, 99% (88) of the time its use did not interfere with patient flow, and 93% (83) of the observations provided useful information for resident feedback. Residents completed 70 surveys, with the majority (69%, 48) reporting they were comfortable with being observed by an attending physician; 87% (61) thought that direct observation did not significantly affect their efficiency. Ninety-seven percent of the time (68), residents reported that direct observation provided useful feedback. Conclusion: The data suggest the form was well received by both faculty and residents and enabled attending physicians to provide useful feedback.
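The survey percentages above are simple agreement proportions over completed forms. As a minimal sketch (the ratings below are invented, since the raw responses are not reported), one such 5-point Likert item can be tallied like this:

# Hypothetical sketch: tallying one 5-point Likert survey item into an
# agreement percentage like those reported above (ratings are invented).
from collections import Counter

ratings = [5, 4, 5, 5, 3, 4, 5, 4, 2, 5]   # one rating per completed survey
counts = Counter(ratings)
agree = sum(n for score, n in counts.items() if score >= 4)  # 4 = agree, 5 = strongly agree
print(f"{agree} of {len(ratings)} surveys agreed ({100 * agree / len(ratings):.0f}%)")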


Endoscopy, 2018, Vol 50 (08), pp. 770-778
Author(s): Keith Siau, Paul Dunckley, Roland Valori, Mark Feeney, Neil Hawkes, ...

Abstract Background: Direct Observation of Procedural Skills (DOPS) is an established competence assessment tool in endoscopy. In July 2016, the DOPS scoring format changed from a performance-based scale to a supervision-based scale. We aimed to evaluate the impact of changes to the DOPS scale format on the distribution of scores in novice trainees and on competence assessment. Methods: We performed a prospective, multicenter (n = 276), observational study of formative DOPS assessments in endoscopy trainees with ≤ 100 lifetime procedures. DOPS were submitted in the 6 months before July 2016 (old scale) and after (new scale) for gastroscopy (n = 2998), sigmoidoscopy (n = 1310), colonoscopy (n = 3280), and polypectomy (n = 631). Scores for old and new DOPS were aligned to a 4-point scale and compared. Results: 8219 DOPS (43% new and 57% old) submitted for 1300 trainees were analyzed. Compared with the old DOPS, use of the new DOPS was associated with greater utilization of the lowest score (2.4% vs. 0.9%; P < 0.001), a broader range of scores, and a reduction in competent scores (60.8% vs. 86.9%; P < 0.001). The reduction in competent scores was evident on subgroup analysis across all procedure types (P < 0.001) and for each quartile of endoscopy experience. The new DOPS was superior in characterizing the endoscopy learning curve, demonstrating progression of competent scores across quartiles of procedural experience. Conclusions: Endoscopy assessors applied a greater range of scores using the new, supervision-based DOPS scale in two cohorts of trainees matched for experience. Our study provides construct validity evidence in support of the new scale format.
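The headline result, a drop in competent scores from 86.9% under the old scale to 60.8% under the new one, is a two-proportion contrast. A hedged sketch of such a comparison follows; the cell counts are rough reconstructions from the reported percentages and totals, not the published table.

# Sketch of the old-vs-new scale comparison as a chi-square test on
# competent vs. non-competent score counts (counts reconstructed from the
# reported percentages, not taken from the published data).
from scipy.stats import chi2_contingency

#                  competent  not competent
old_scale_counts = [4071, 614]    # ~86.9% of ~4685 old-scale DOPS
new_scale_counts = [2149, 1385]   # ~60.8% of ~3534 new-scale DOPS
chi2, p, dof, _ = chi2_contingency([old_scale_counts, new_scale_counts])
print(f"chi2({dof}) = {chi2:.1f}, p = {p:.3g}")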


Anaesthesia, 2015, Vol 70 (12), pp. 1401-1411
Author(s): A. Chuan, P. L. Graham, D. M. Wong, M. J. Barrington, D. B. Auyong, ...

2020, Vol 10 (9), p. 47
Author(s): Martha Mbewe, Natalia Mbewe, Catherine M. Ngoma

Direct observation of procedural skills is an evidence-based assessment tool used for assessing competence in the practical procedures that nursing students undertake during clinical placement. Knowledge of students' satisfaction with their education is important, as it helps both faculty and students achieve educational goals. However, the factors that may influence student satisfaction with this method of assessment are not known in the School of Nursing Sciences at the University of Zambia. The purpose of this study was to investigate factors influencing student satisfaction with direct observation of procedural skills in order to obtain students' views on this assessment tool. A cross-sectional quantitative survey was conducted with 92 conveniently sampled final-year undergraduate nursing students. Data were collected using a validated self-report questionnaire and analysed using IBM SPSS Version 20. Fisher's exact tests were used to determine associations between student satisfaction and the independent variables; a p-value of ≤ .05 was considered statistically significant. The major findings revealed that a poor clinical environment (98.9%) and faculty non-availability (98%) negatively influenced student satisfaction. Other significant associations were with clinical experiences and feedback (p ≤ .05). In conclusion, the major factors that influenced student satisfaction were a poor clinical environment, which was not conducive to assessments as it lacked many essential medical-surgical supplies, and the non-availability of faculty in the clinical area, who were needed to provide feedback, guidance, and supervision to the students. Improving the clinical environment and ensuring faculty availability to provide timely and constructive feedback may help students achieve their learning objectives. Direct observation of procedural skills remains an appropriate tool for assessing student clinical competence. However, further research into and development of cheaper, less stressful methods of clinical evaluation to blend with this tool is required in the School of Nursing Sciences.
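The association tests here are Fisher's exact tests on 2x2 tables of satisfaction against each factor. A minimal sketch follows; the cell counts are invented for illustration, since the abstract does not reproduce the cross-tabulations.

# Minimal sketch: Fisher's exact test of satisfaction vs. one factor
# (cell counts are invented, not the study's cross-tabulation).
from scipy.stats import fisher_exact

#                       satisfied  not satisfied
adequate_environment = [18, 6]
poor_environment     = [12, 56]   # 18 + 6 + 12 + 56 = 92 students
odds_ratio, p = fisher_exact([adequate_environment, poor_environment])
print(f"odds ratio = {odds_ratio:.2f}, p = {p:.3g}")  # p <= .05 => significant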

