Test of Integrated Professional Skills: Objective Structured Clinical Examination/Simulation Hybrid Assessment of Obstetrics-Gynecology Residents' Skill Integration

2014 ◽  
Vol 6 (1) ◽  
pp. 117-122 ◽  
Author(s):  
Abigail Ford Winkel ◽  
Colleen Gillespie ◽  
Marissa T. Hiruma ◽  
Alice R. Goepfert ◽  
Sondra Zabar ◽  
...  

Abstract
Background: Assessment of obstetrics-gynecology residents' ability to integrate clinical judgment, interpersonal skills, and technical ability in a uniform fashion is required to document achievement of competency benchmarks. An objective structured clinical examination that incorporates simulation and bench models uses direct observation of performance to generate formative feedback and standardized evaluation.
Methods: The Test of Integrated Professional Skills (TIPS) is a 5-station performance-based assessment that uses standardized patients and complex scenarios involving ultrasonography, procedural skills, and evidence-based medicine. Standardized patients and faculty rated residents using behaviorally anchored checklists. Mean TIPS scores were compared across competency domains and by developmental level (using analysis of variance) and then compared with standard faculty clinical evaluations (using Spearman ρ). Participating faculty and residents were also asked to evaluate the usefulness of the TIPS.
Results: Twenty-four residents participated in the TIPS. Checklist items used to assess competency were sufficiently reliable, with Cronbach α estimates from 0.69 to 0.82. Performance improved with level of training, although with wide variation. Standard faculty evaluations did not correlate with TIPS performance: several residents rated average or above average by faculty performed poorly on the TIPS (> 1 SD below the mean). Both faculty and residents found the TIPS format useful, providing meaningful evaluation and opportunity for feedback.
Conclusions: A simulation-based objective structured clinical examination facilitates observation of a range of skills, including competencies that are difficult to observe and measure in a standardized way. Debriefing with faculty provides an important interface for identifying performance gaps and individualizing learning plans.
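The reliability and validity statistics this abstract reports (Cronbach α for checklist items, Spearman ρ against faculty evaluations) have standard computations. Below is a minimal Python sketch; the resident count is taken from the abstract, but the checklist scores, item count, and faculty ratings are simulated placeholders, not the study's data:

```python
import numpy as np
from scipy.stats import spearmanr

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_residents x n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # per-item variance
    total_var = items.sum(axis=1).var(ddof=1)   # variance of summed scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical data: 24 residents x 10 checklist items, each scored 0-2.
rng = np.random.default_rng(0)
checklist = rng.integers(0, 3, size=(24, 10))

alpha = cronbach_alpha(checklist)

# Validity check: correlate TIPS totals with routine faculty ratings
# (the faculty ratings here are stand-ins, not the study's evaluations).
tips_total = checklist.sum(axis=1)
faculty_rating = rng.uniform(1, 5, size=24)
rho, p = spearmanr(tips_total, faculty_rating)

print(f"Cronbach alpha = {alpha:.2f}, Spearman rho = {rho:.2f} (p = {p:.3f})")
```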

1999 ◽  
Vol 91 (1) ◽  
pp. 288-298 ◽  
Author(s):  
Armin Schubert ◽  
John E. Tetzlaff ◽  
Ming Tan ◽  
Victor J. Ryckman ◽  
Edward Mascha

Background: Oral practice examinations (OPEs) are used extensively in many anesthesiology programs for various reasons, including assessment of clinical judgment, yet oral examinations have been criticized for their subjectivity. The authors studied the reliability, consistency, and validity of their OPE program to determine whether it was a useful assessment tool.
Methods: From 1989 through 1993, we prospectively studied 441 OPEs given to 190 residents. The examination format closely approximated that used by the American Board of Anesthesiology. The OPE results of interest were the pass-fail grade and an overall numerical score. Internal consistency and inter-rater reliability were determined using agreement measures. To assess their validity in describing competence, OPE results were correlated with in-training examination results and faculty evaluations. We also analyzed the relationship of OPE results with implicit indicators of resident preparation, such as length of training.
Results: The internal consistency coefficient for the overall numerical score was 0.82, indicating good correlation among component scores. Interexaminer agreement was 0.68, indicating moderate to good agreement beyond that expected by chance; actual agreement among examiners on pass-fail was 84%. Correlation of the overall numerical score with in-training examination scores and faculty evaluations was moderate (r = 0.47 and 0.41, respectively; P < 0.01). OPE results were significantly (P < 0.01) associated with training duration, previous OPE experience, trainee preparedness, and trainee anxiety.
Conclusion: Our results show the substantial internal consistency and reliability of OPE results at a single institution. The positive correlation of OPE scores with in-training examination scores, faculty evaluations, and other indicators of preparation suggests that OPEs are a reasonably valid tool for assessing resident performance.
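The abstract does not specify which chance-corrected agreement statistic was used for the 0.68 interexaminer figure; Cohen's kappa is one common choice and is sketched below alongside raw percent agreement, using illustrative pass/fail grades rather than the study's data:

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score

# Hypothetical pass/fail grades (1 = pass, 0 = fail) from two examiners
# rating the same oral practice examinations; values are illustrative only.
examiner_a = np.array([1, 1, 0, 1, 0, 1, 1, 1, 0, 1, 1, 0])
examiner_b = np.array([1, 1, 0, 1, 1, 1, 0, 1, 0, 1, 1, 0])

raw_agreement = np.mean(examiner_a == examiner_b)   # simple % agreement
kappa = cohen_kappa_score(examiner_a, examiner_b)   # corrected for chance

print(f"Raw agreement = {raw_agreement:.0%}, Cohen's kappa = {kappa:.2f}")
```

Kappa discounts the agreement two examiners would reach by guessing alone, which is why it runs below the raw 84% figure reported above.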


1998 ◽  
Vol 3 (1) ◽  
pp. 4301 ◽  
Author(s):  
Kathryn L. Lovell ◽  
Brian E. Mavis ◽  
Jane L. Turner ◽  
Karen S. Ogle ◽  
Marilee Griffith

2016 ◽  
Vol 39 (3) ◽  
pp. 388-399 ◽  
Author(s):  
Xuemei Zhu ◽  
Li Yang ◽  
Ping Lin ◽  
Guizhi Lu ◽  
Ningning Xiao ◽  
...  

The objectives of this study were to develop, implement, and evaluate an innovative modified Objective Structured Clinical Examination (OSCE) model, and to compare students' performance of different clinical skills as assessed by standardized patients and OSCE examiners. Data were obtained from final-year undergraduate students undergoing the modified OSCE as a graduation examination. Seventy-seven students rotated through four stations (nine substations). Standardized patients scored students higher than examiners in history taking (9.14 ± 0.92 vs. 8.42 ± 0.85), response to an emergency event (8.88 ± 1.12 vs. 7.62 ± 1.54), execution of medical orders (8.77 ± 0.96 vs. 8.25 ± 1.43), technical operation (18.21 ± 1.26 vs. 16.91 ± 1.35), nursing evaluation (4.53 ± 0.28 vs. 4.29 ± 0.52), and health education (13.79 ± 1.31 vs. 11.93 ± 2.25; p < .01). The difference between standardized-patient and examiner scores for physical examination skills was nonsignificant (8.70 ± 1.18 vs. 8.80 ± 1.27; p > .05). The modified, problem-focused, nursing process–driven OSCE model effectively assessed nursing students' clinical competencies and their clinical and critical thinking.
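The abstract does not name the test behind its p-values; since each student receives both a standardized-patient and an examiner rating, a paired comparison is one plausible analysis. A minimal sketch using data simulated from the reported history-taking means, not the study's raw scores:

```python
import numpy as np
from scipy.stats import ttest_rel

# Hypothetical paired scores for one station: each of 77 students is rated
# both by a standardized patient and by an OSCE examiner. The means/SDs
# mimic the reported history-taking figures; the values are simulated.
rng = np.random.default_rng(1)
sp_scores = rng.normal(9.14, 0.92, size=77)        # SP ratings
examiner_scores = rng.normal(8.42, 0.85, size=77)  # examiner ratings

# Paired test, since both ratings describe the same student encounter.
t_stat, p_value = ttest_rel(sp_scores, examiner_scores)
print(f"mean SP = {sp_scores.mean():.2f}, "
      f"mean examiner = {examiner_scores.mean():.2f}, "
      f"t = {t_stat:.2f}, p = {p_value:.4f}")
```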


2017 ◽  
Author(s):  
Thane Blinman

Competency-Based Medical Education (CBME) has been pushed into place as the dominant paradigm for medical and surgical education. Premised on the notion that medical proficiency can be rendered into predefined observable behaviors, CBME takes an explicitly reductionistic and behaviorist approach to professional education. Unfortunately, this approach is fatally flawed: it relies on false premises about human action, employs unworkable methods, and is unsupported by empirical data. Deployment of CBME has produced a disintegration of surgical training, with loss of technical ability, degradation of professional ownership, and worsening clinical judgment. In contrast, an approach that updates working Halstedian techniques, explicitly favors cognitive integration, and deemphasizes artificial metrics of advancement holds more promise for producing proficient surgeons.


2021 ◽  
Vol 1 (3) ◽  
pp. 151-152
Author(s):  
M. Z. Nasyrov ◽  
Yu. P. Soldatov

Training traumatologist-orthopedists in transosseous osteosynthesis remains a pressing need. Purpose of the research: to develop OSCE stations for teaching the method of transosseous osteosynthesis. Material and methods: during the simulation, trainees practice the technique of passing wires through synthetic bone, assembling the apparatus, and controlling its biomechanics. Results: the developed OSCE stations allowed the level of training to be assessed objectively within a specified time frame. Conclusion: the OSCE is an effective tool for assessing knowledge, including of the methods of transosseous osteosynthesis. Including a transosseous osteosynthesis station in the accreditation of traumatologist-orthopedists is an urgent task.

