Psychometric evaluation of a direct observation of procedural skills assessment tool for ultrasound-guided regional anaesthesia

Anaesthesia, 2014, Vol. 69(6), pp. 604-612. Author(s): M. J. Watson, D. M. Wong, R. Kluger, A. Chuan, M. D. Herrick, et al.
2016, Vol. 44(2), pp. 201-206. Author(s): A. Chuan, S. Thillainathan, P. L. Graham, B. Jolly, D. M. Wong, et al.

2019, Vol. 115(2), pp. 234-243. Author(s): Keith Siau, James Crossley, Paul Dunckley, Gavin Johnson, Mark Feeney, et al.

Anaesthesia, 2015, Vol. 70(12), pp. 1401-1411. Author(s): A. Chuan, P. L. Graham, D. M. Wong, M. J. Barrington, D. B. Auyong, et al.

Author(s): Reza M. Munandar, Yoyo Suhoyo, Tridjoko Hadianto

Background: The Mini-CEX was developed to assess clinical skills by direct observation. As a clinical skills assessment tool, the Mini-CEX must fulfill four requirements: validity, reliability, educational effect on students, and practicality. The purpose of this study was to evaluate the validity, reliability, and feasibility of the Mini-CEX as a clinical skills assessment tool in the core clerkship program at the Faculty of Medicine, Universitas Gadjah Mada.
Method: Seventy-four clerkship students from Internal Medicine and 42 from the Neurology Department were asked to complete observed Mini-CEX encounters, with a minimum of four in Internal Medicine and two in Neurology, between September 2010 and January 2011. Validity was analyzed with the Kruskal-Wallis test for Internal Medicine and the Mann-Whitney test for the Neurology Department, reliability was analyzed using the G coefficient, and feasibility was analyzed with descriptive statistics.
Results: Validity was demonstrated by p < 0.001 in Internal Medicine, whereas in the Neurology Department p = 0.250. The G coefficients for Internal Medicine and the Neurology Department were 0.98 and 0.61, respectively. Feasibility was 79.7% in Internal Medicine and 100% in the Neurology Department.
Conclusion: The Mini-CEX is valid and reliable in Internal Medicine but not in the Neurology Department. Feasibility is good in both departments.
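For readers unfamiliar with the rank-based validity analysis described above, the following is a minimal sketch of how such tests could be run in Python with SciPy. All scores and group sizes below are hypothetical placeholders, not the study's data.

```python
# Minimal sketch of the validity analysis described above.
# All scores below are hypothetical, not study data.
from scipy.stats import kruskal, mannwhitneyu

# Internal Medicine: Mini-CEX ratings across four successive observed
# encounters -> Kruskal-Wallis test across the four groups.
im_encounters = [
    [5, 6, 5, 6, 7],  # encounter 1
    [6, 6, 7, 7, 6],  # encounter 2
    [7, 7, 8, 7, 8],  # encounter 3
    [8, 7, 8, 9, 8],  # encounter 4
]
h_stat, p_im = kruskal(*im_encounters)
print(f"Internal Medicine: H = {h_stat:.2f}, p = {p_im:.4f}")

# Neurology: only two encounters -> Mann-Whitney U test between them.
neuro_first = [6, 7, 6, 7, 7]
neuro_second = [7, 7, 6, 8, 7]
u_stat, p_neuro = mannwhitneyu(neuro_first, neuro_second)
print(f"Neurology: U = {u_stat:.1f}, p = {p_neuro:.4f}")
```

Under this reading, a significant increase in scores across successive encounters is taken as evidence that the tool detects growth in clinical skill, which is why the significant Internal Medicine result supports validity while the non-significant Neurology result does not.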


Endoscopy, 2018, Vol. 50(8), pp. 770-778. Author(s): Keith Siau, Paul Dunckley, Roland Valori, Mark Feeney, Neil Hawkes, et al.

Abstract
Background: Direct Observation of Procedural Skills (DOPS) is an established competence assessment tool in endoscopy. In July 2016, the DOPS scoring format changed from a performance-based scale to a supervision-based scale. We aimed to evaluate the impact of this change on the distribution of scores in novice trainees and on competence assessment.
Methods: We performed a prospective, multicenter (n = 276), observational study of formative DOPS assessments in endoscopy trainees with ≤ 100 lifetime procedures. DOPS were submitted in the 6 months before July 2016 (old scale) and the 6 months after (new scale) for gastroscopy (n = 2998), sigmoidoscopy (n = 1310), colonoscopy (n = 3280), and polypectomy (n = 631). Scores for old and new DOPS were aligned to a 4-point scale and compared.
Results: 8219 DOPS (43% new and 57% old) submitted for 1300 trainees were analyzed. Compared with the old DOPS, use of the new DOPS was associated with greater utilization of the lowest score (2.4% vs. 0.9%; P < 0.001), a broader range of scores, and a reduction in competent scores (60.8% vs. 86.9%; P < 0.001). The reduction in competent scores was evident on subgroup analysis across all procedure types (P < 0.001) and in each quartile of endoscopy experience. The new DOPS was superior in characterizing the endoscopy learning curve, demonstrating progression of competent scores across quartiles of procedural experience.
Conclusions: Endoscopy assessors applied a greater range of scores using the new, supervision-based DOPS scale in two cohorts of trainees matched for experience. Our study provides construct validity evidence in support of the new scale format.
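As a rough illustration of the headline comparison (60.8% vs. 86.9% competent scores), the sketch below rebuilds an approximate 2×2 contingency table from the reported percentages and tests it with a chi-squared test. The counts are back-calculated approximations for illustration, not the study dataset.

```python
# Approximate reconstruction of the competent-score comparison above.
# Counts are back-calculated from the reported figures (8219 DOPS,
# ~43% new scale, ~57% old scale) and are illustrative only.
import numpy as np
from scipy.stats import chi2_contingency

n_new = round(0.43 * 8219)            # ~3534 new-scale DOPS
n_old = 8219 - n_new                  # ~4685 old-scale DOPS
competent_new = round(0.608 * n_new)  # 60.8% rated competent (new scale)
competent_old = round(0.869 * n_old)  # 86.9% rated competent (old scale)

# Rows: scale version; columns: competent vs. not-yet-competent scores.
table = np.array([
    [competent_new, n_new - competent_new],
    [competent_old, n_old - competent_old],
])
chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p_value:.2e}")
```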


2020, Vol. 10(9), p. 47. Author(s): Martha Mbewe, Natalia Mbewe, Catherine M. Ngoma

Direct observation of procedural skills is an evidence-based assessment tool used to assess competence in the practical procedures that nursing students undertake during clinical placement. Knowing how satisfied students are with their assessments is important, as it helps both faculty and students achieve educational goals. However, the factors that may influence student satisfaction with this method of assessment are not known in the School of Nursing Sciences at the University of Zambia. The purpose of this study was to investigate factors influencing student satisfaction with direct observation of procedural skills in order to obtain students' views on this assessment tool. A cross-sectional quantitative survey was conducted with ninety-two (92) conveniently sampled final-year undergraduate nursing students. Data were collected using a validated self-reported questionnaire and analyzed using IBM SPSS Version 20. Fisher's exact tests were used to determine associations between student satisfaction and the independent variables. A p-value of ≤ .05 was considered statistically significant. The major findings were that a poor clinical environment (98.9%) and faculty non-availability (98%) negatively influenced student satisfaction. Other significant associations were clinical experience and feedback (p ≤ .05). In conclusion, the major factors that influenced student satisfaction were a poor clinical environment, which was not conducive to assessments because it lacked many essential medical-surgical supplies, and the non-availability in the clinical area of faculty, who were needed to provide feedback, guidance, and supervision to the students. Improving the clinical environment and ensuring faculty availability to provide timely and constructive feedback may help students achieve their learning objectives. Direct observation of procedural skills remains an appropriate tool for assessing students' clinical competence. However, further research into, and development of, cheaper and less stressful methods of clinical evaluation to blend with this tool is required in the School of Nursing Sciences.
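The association tests reported above can be reproduced in outline with SciPy's Fisher's exact test. The 2×2 table below is hypothetical (student satisfaction cross-tabulated against faculty availability), not the survey data.

```python
# Minimal sketch of a Fisher's exact test of association, as used in
# the survey above. The 2x2 counts are hypothetical, not study data.
from scipy.stats import fisher_exact

# Rows: faculty available / faculty not available in the clinical area
# Columns: satisfied / not satisfied with the DOPS assessment
table = [
    [30, 10],  # faculty available
    [5, 47],   # faculty not available
]
odds_ratio, p_value = fisher_exact(table)
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.4f}")
print("significant at p <= .05" if p_value <= 0.05 else "not significant")
```

Fisher's exact test is a sensible choice here because, with 92 respondents spread over several categories, some cells can be too small for the chi-squared approximation to be reliable.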


2020, Vol. 3(Supplement_1), pp. 149-150. Author(s): U. Khan, A. N. Barkun, E. I. Benchimol, M. Salim, J. J. Telford, et al.

Abstract
Background: Previous studies have demonstrated that many graduating trainees may not have all of the skills needed to practice endoscopic retrograde cholangiopancreatography (ERCP) independently, safely, and effectively. As part of competency-based curriculum development, it is essential to provide formative feedback to trainees and ensure they acquire the knowledge and skills for independent practice.
Aims: To assess the performance of advanced endoscopy trainees across Canada using the Canadian Direct Observation of Procedural Skills (CanDOPS) ERCP assessment tool. Procedural items evaluated include both technical (cannulation, sphincterotomy, stone extraction, tissue sampling, and stent placement) and non-technical (leadership, communication and teamwork, judgment and decision making) skills.
Methods: We conducted a prospective, national, multi-centre study. Advanced endoscopy trainees with at least two years of gastroenterology training or five years of general surgery training in North America and minimal ERCP experience (fewer than 100 ERCP procedures) were invited to participate. The CanDOPS tool was used to assess every fifth ERCP performed by trainees over a 12-month fellowship training period. ERCPs were evaluated by experienced staff endoscopists at each study site under standard clinical protocol. Cumulative sum (CUSUM) analyses were used to generate learning curves.
Results: Eleven trainees from five Canadian sites participated in the study. A total of 261 ERCP evaluations were completed. The median numbers of evaluations per site and per trainee were 49 (IQR 31-76) and 15 (IQR 11-45), respectively. The median number of cases trainees had performed prior to their ERCP training was 50 (IQR 25-400). There was significant improvement in almost all scores over time, including selective cannulation, sphincterotomy, biliary stenting, and all non-technical skills (P < 0.01). CUSUM analyses using acceptable and unacceptable failure rates of 20% and 50% demonstrated that trainees achieved competency for most measures in the final month of their training. Competency in tissue sampling was not achieved within the one-year training period.
Conclusions: This is the first ERCP performance evaluation tool that examines multiple technical and non-technical aspects of the procedure. Although trainees' ERCP skills improve during their training period, there is notable variability in time to competency across the different skills measured with the CanDOPS tool. Larger prospective studies are required to determine whether competency is achieved under more stringent definitions of ERCP competency and to identify factors associated with reaching competency.
Funding Agencies: None
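Below is a minimal sketch of the CUSUM learning-curve construction mentioned above, using the stated acceptable (20%) and unacceptable (50%) failure rates. The outcome sequence and the log-likelihood-ratio weighting are illustrative assumptions; the study's exact CUSUM parameterisation and decision boundaries may differ.

```python
# Minimal CUSUM learning-curve sketch using the acceptable (20%) and
# unacceptable (50%) failure rates cited above. The outcome sequence
# and the log-likelihood-ratio weights are illustrative assumptions.
import math

p0, p1 = 0.20, 0.50                     # acceptable / unacceptable failure rates
s_fail = math.log(p1 / p0)              # positive step on a failed attempt
s_pass = math.log((1 - p1) / (1 - p0))  # negative step on a success

# Hypothetical per-procedure outcomes (True = failed cannulation attempt).
outcomes = [True, False, True, False, False, False,
            True, False, False, False, False, False]

cusum, running = [], 0.0
for failed in outcomes:
    running += s_fail if failed else s_pass
    cusum.append(running)

# A sustained downward drift suggests the trainee's failure rate sits
# nearer the acceptable rate; crossing a preset lower boundary would be
# read as evidence of competence for that skill.
print([f"{v:+.2f}" for v in cusum])
```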

