Colonoscopy Direct Observation of Procedural Skills Assessment Tool for Evaluating Competency Development During Training

2019, Vol. 115 (2), pp. 234-243
Author(s): Keith Siau, James Crossley, Paul Dunckley, Gavin Johnson, Mark Feeney, ...
Anaesthesia
2014, Vol. 69 (6), pp. 604-612
Author(s): M. J. Watson, D. M. Wong, R. Kluger, A. Chuan, M. D. Herrick, ...

2016, Vol. 44 (2), pp. 201-206
Author(s): A. Chuan, S. Thillainathan, P. L. Graham, B. Jolly, D. M. Wong, ...

2019, Vol. 34 (1), pp. 105-114
Author(s): Keith Siau, James Crossley, Paul Dunckley, Gavin Johnson, ...

Abstract
Background: Validated competency assessment tools, and the data supporting milestone development during gastroscopy training, are lacking. We aimed to assess the validity of the formative direct observation of procedural skills (DOPS) assessment tool in diagnostic gastroscopy and to study competency development using DOPS.
Methods: This was a prospective multicentre (N = 275) analysis of formative gastroscopy DOPS assessments. Internal structure validity was tested using exploratory factor analysis, and reliability was estimated using generalisability theory. Item and global DOPS scores were stratified by lifetime procedure count to define learning curves, using a threshold determined from receiver operating characteristic (ROC) analysis. Multivariable binary logistic regression analysis was performed to identify independent predictors of DOPS competence.
Results: In total, 10,086 DOPS were submitted for 987 trainees. Exploratory factor analysis identified three distinct item groupings, representing ‘pre-procedure’, ‘technical’, and ‘post-procedure non-technical’ skills. From generalisability analyses, sources of variance in overall DOPS scores included trainee ability (31%), assessor stringency (8%), assessor subjectivity (18%), and trainee case-to-case variation (43%). The combination of three assessments from three assessors was sufficient to achieve the reliability threshold of 0.70. On ROC analysis, a mean score of 3.9 provided optimal sensitivity and specificity for determining competency. This threshold was attained in the order of ‘pre-procedure’ (100–124 procedures), ‘technical’ (150–174 procedures), ‘post-procedure non-technical’ skills (200–224 procedures), and global competency (225–249 procedures). Higher lifetime procedure count, higher DOPS count, surgical trainees and assessors, higher trainee seniority, and lower case difficulty were significant multivariable predictors of DOPS competence.
Conclusion: This study establishes milestones for competency acquisition during gastroscopy training and provides validity and reliability evidence to support gastroscopy DOPS as a competency assessment tool.
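The reliability claim in this abstract follows from a generalisability-theory decision (D) study. As a rough illustration, the sketch below recomputes a relative G coefficient from the reported variance shares; it assumes a persons × (cases nested in assessors) design and reads "three assessments from three assessors" as three cases scored by each of three assessors, neither of which is spelled out in the abstract.

```python
# Illustrative D-study calculation from the reported variance components.
# Assumptions (not stated in the abstract): a persons x (cases : assessors)
# design, and "three assessments from three assessors" read as three cases
# scored by each of three assessors. Variance shares are used directly.

VAR_TRAINEE = 0.31            # trainee ability (universe score variance)
VAR_ASSESSOR = 0.08           # assessor stringency (affects absolute, not relative, error)
VAR_TRAINEE_ASSESSOR = 0.18   # assessor subjectivity (trainee x assessor interaction)
VAR_CASE_RESIDUAL = 0.43      # trainee case-to-case variation plus residual


def g_coefficient(n_assessors: int, n_cases_per_assessor: int) -> float:
    """Relative (norm-referenced) generalisability coefficient for a
    p x (c:a) decision study using the variance shares above."""
    relative_error = (VAR_TRAINEE_ASSESSOR / n_assessors
                      + VAR_CASE_RESIDUAL / (n_assessors * n_cases_per_assessor))
    return VAR_TRAINEE / (VAR_TRAINEE + relative_error)


if __name__ == "__main__":
    for n_a in (1, 2, 3):
        for n_c in (1, 2, 3):
            print(f"{n_a} assessor(s) x {n_c} case(s) each: "
                  f"G = {g_coefficient(n_a, n_c):.2f}")
    # Under these assumptions, 3 assessors scoring 3 cases each gives
    # G ~ 0.74, clearing the 0.70 threshold quoted in the abstract.
```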


2019, Vol. 28 (1), pp. 33-40
Author(s): Keith Siau, James Crossley, Paul Dunckley, Gavin Johnson, Adam Haycock, ...

Background & Aims: Data supporting milestone development during flexible sigmoidoscopy (FS) training are lacking. We aimed to present validity evidence for our formative direct observation of procedural skills (DOPS) assessment in FS, and to use DOPS to establish competency benchmarks and define learning curves for a national training cohort.
Methods: This prospective UK-wide (211 centres) study included all FS formative DOPS assessments submitted to the national e-portfolio. Reliability was estimated from generalisability theory analysis. Item and global DOPS scores were correlated with lifetime procedure count to study learning curves, with competency benchmarks defined using contrasting groups analysis. Multivariable binary logistic regression was performed to identify independent predictors of DOPS competence.
Results: This analysis included 3,616 DOPS submitted for 468 trainees. From generalisability analysis, sources of overall competency score variance included: trainee ability (27%), assessor stringency (15%), assessor subjectivity attributable to the trainee (18%), and case-to-case variation (40%), which enabled the modelling of reliability estimates. The competency benchmark (mean DOPS score: 3.84) was achieved after 150-174 procedures. Across the cohort, competency development occurred in the order of: pre-procedural (50-74), non-technical (75-149), technical (125-174), and post-procedural (175-199) skills. Lifetime procedure count (p<0.001), case difficulty (p<0.001), and lifetime formative DOPS count (p=0.001) were independently associated with DOPS competence, but trainee and assessor specialty were not.
Conclusion: Sigmoidoscopy DOPS can provide valid and reliable assessments of competency during training and can be used to chart competency development. Contrary to earlier studies based on destination-orientated endpoints, overall competency in sigmoidoscopy was attained after 150 lifetime procedures.
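The competency benchmark reported here comes from contrasting groups analysis, in which the pass mark is placed where the score distributions of assessments globally rated competent and not yet competent cross. The sketch below illustrates the idea on synthetic scores; the group means, spreads, and the normal approximation are assumptions for illustration, not the study's data.

```python
# Illustrative contrasting-groups calculation: the pass mark is taken as the
# point where fitted score distributions for "competent" and "not yet
# competent" assessments intersect. The two groups below are synthetic
# stand-ins, not the study's data.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
competent = rng.normal(loc=4.2, scale=0.35, size=500)       # hypothetical mean item scores
not_competent = rng.normal(loc=3.3, scale=0.45, size=500)   # hypothetical mean item scores

mu1, sd1 = competent.mean(), competent.std(ddof=1)
mu0, sd0 = not_competent.mean(), not_competent.std(ddof=1)

# Intersection of the two fitted normal densities, searched on a fine grid
# between the two group means' neighbourhood.
grid = np.linspace(2.0, 5.0, 3001)
cut = grid[np.argmin(np.abs(norm.pdf(grid, mu1, sd1) - norm.pdf(grid, mu0, sd0)))]
print(f"Contrasting-groups pass mark (synthetic data): {cut:.2f}")
```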


Author(s): Reza M. Munandar, Yoyo Suhoyo, Tridjoko Hadianto

Background: The Mini-CEX was developed to assess clinical skills by direct observation. As a clinical skills assessment tool, the Mini-CEX has to fulfill four requirements: validity, reliability, effects on students, and practicality. The purpose of this study was to evaluate the validity, reliability, and feasibility of the Mini-CEX as a clinical skills assessment tool in the medical core clerkship program at the Faculty of Medicine, Universitas Gadjah Mada.
Method: Seventy-four clerkship students from Internal Medicine and 42 from the Neurology Department were asked to complete observed Mini-CEX encounters (a minimum of four in Internal Medicine and two in the Neurology Department) between September 2010 and January 2011. Validity was analyzed with the Kruskal-Wallis method for Internal Medicine and the Mann-Whitney method for the Neurology Department, reliability was analyzed based on the G coefficient, and feasibility was analyzed using descriptive statistics.
Results: Validity was supported in Internal Medicine (p < 0.001) but not in the Neurology Department (p = 0.250). The G coefficient was 0.98 for Internal Medicine and 0.61 for the Neurology Department. Feasibility was 79.7% in Internal Medicine and 100% in the Neurology Department.
Conclusion: The Mini-CEX is valid and reliable in Internal Medicine but not in the Neurology Department. Feasibility is good in both departments.
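For readers unfamiliar with the tests named above, the sketch below shows how a Kruskal-Wallis and a Mann-Whitney comparison of Mini-CEX scores could be run with SciPy. The grouping variable the authors actually used is not stated in this abstract, so grouping by encounter number, and the scores themselves, are purely illustrative.

```python
# Illustrative non-parametric validity checks of the kind named in the
# abstract. Scores are synthetic and grouping by encounter number is an
# assumption made only for this sketch.
import numpy as np
from scipy.stats import kruskal, mannwhitneyu

rng = np.random.default_rng(1)

# Internal Medicine: four encounters per student -> Kruskal-Wallis across encounters.
im_scores = [rng.normal(5.0 + 0.4 * i, 1.0, size=74).round(1) for i in range(4)]
h_stat, p_im = kruskal(*im_scores)

# Neurology: two encounters per student -> Mann-Whitney U between encounters.
neuro_first = rng.normal(5.5, 1.0, size=42).round(1)
neuro_second = rng.normal(5.6, 1.0, size=42).round(1)
u_stat, p_neuro = mannwhitneyu(neuro_first, neuro_second, alternative="two-sided")

print(f"Internal Medicine Kruskal-Wallis: H = {h_stat:.2f}, p = {p_im:.3f}")
print(f"Neurology Mann-Whitney U: U = {u_stat:.1f}, p = {p_neuro:.3f}")
```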


Endoscopy
2018, Vol. 50 (08), pp. 770-778
Author(s): Keith Siau, Paul Dunckley, Roland Valori, Mark Feeney, Neil Hawkes, ...

Abstract
Background: Direct Observation of Procedural Skills (DOPS) is an established competence assessment tool in endoscopy. In July 2016, the DOPS scoring format changed from a performance-based scale to a supervision-based scale. We aimed to evaluate the impact of changes to the DOPS scale format on the distribution of scores in novice trainees and on competence assessment.
Methods: We performed a prospective, multicenter (n = 276), observational study of formative DOPS assessments in endoscopy trainees with ≤ 100 lifetime procedures. DOPS were submitted in the 6 months before July 2016 (old scale) and the 6 months after (new scale) for gastroscopy (n = 2998), sigmoidoscopy (n = 1310), colonoscopy (n = 3280), and polypectomy (n = 631). Scores for old and new DOPS were aligned to a 4-point scale and compared.
Results: 8219 DOPS (43% new and 57% old) submitted for 1300 trainees were analyzed. Compared with the old DOPS, use of the new DOPS was associated with greater utilization of the lowest score (2.4% vs. 0.9%; P < 0.001), a broader range of scores, and a reduction in competent scores (60.8% vs. 86.9%; P < 0.001). The reduction in competent scores was evident on subgroup analysis across all procedure types (P < 0.001) and for each quartile of endoscopy experience. The new DOPS was superior in characterizing the endoscopy learning curve by demonstrating progression of competent scores across quartiles of procedural experience.
Conclusions: Endoscopy assessors applied a greater range of scores using the new supervision-based DOPS scale in two cohorts of trainees matched for experience. Our study provides construct validity evidence in support of the new scale format.
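The shift in score utilization between scales (e.g. lowest score used in 2.4% vs. 0.9% of assessments) can be checked with a simple two-proportion chi-square test. The sketch below reconstructs approximate counts from the reported percentages and the 43%/57% split of 8219 DOPS; these counts are estimates for illustration, not the study's exact contingency table.

```python
# Illustrative two-proportion comparison of the kind summarised in the
# abstract. Counts are reconstructed approximately from the reported
# percentages, so the exact chi-square value will differ from the study's.
import numpy as np
from scipy.stats import chi2_contingency

n_new, n_old = round(8219 * 0.43), round(8219 * 0.57)  # approximate cohort sizes
lowest_new = round(n_new * 0.024)                      # ~2.4% lowest scores on new scale
lowest_old = round(n_old * 0.009)                      # ~0.9% lowest scores on old scale

table = np.array([
    [lowest_new, n_new - lowest_new],   # new supervision-based scale
    [lowest_old, n_old - lowest_old],   # old performance-based scale
])
chi2, p, dof, _ = chi2_contingency(table)
print(f"chi-square = {chi2:.1f}, dof = {dof}, p = {p:.2e}")
```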

