Assessment Tools to Measure Clinical Reasoning While Attending Simulation-Based Courses

Author(s):  
Emad Almomani ◽  
Guillaume Alinier ◽  
Natalie Pattison ◽  
Jisha Samuel

Clinical reasoning is interconnected with decision-making, which is a critical element in ensuring patient safety [1]. To avoid practice errors, healthcare professionals should be competent in effective clinical reasoning. To develop effective clinical reasoning skills, healthcare professionals need opportunities to practise and exposure to varied experiences and levels of patient complexity. Simulation can immerse learners in scenarios that mimic clinical situations while mitigating safety risks and increasing standardization in healthcare education [2]. Through simulation, learners can practise clinical reasoning in focussed learning opportunities [3]. Several assessment tools have been used to measure clinical reasoning during simulation-based activities. However, we wanted to identify the most valid and reliable tools for assessing clinical reasoning during simulation, and to find out whether these tools consider the seniority and competency levels of their users. A scoping review was undertaken to answer the questions: What are the best available valid and reliable tools to evaluate clinical reasoning while attending simulation-based activities? Are there valid and reliable clinical reasoning assessment tools for simulation that measure clinical reasoning across different seniority and competency levels? We searched Medline, Scopus, Education Research Complete, and Google Scholar to identify relevant primary research on this topic published from 2000 onwards. The search used the MeSH topics 'Clinical reasoning', 'Simulation-based courses', and 'Clinical reasoning tools'. The inclusion criteria were primary studies describing the use of tools to measure clinical reasoning while attending simulation-based courses. Two independent researchers agreed on the inclusion of the identified papers for full-text review.
This review followed the review guidelines of the Joanna Briggs Institute. There are valid and reliable tools to evaluate clinical reasoning while attending simulation: the Clinical Reasoning Evaluation Simulation Tool (CREST) [1]; 
the Lasater Clinical Judgment Rubric (LCJR) [4]; and the Creighton Competency Evaluation Instrument (C-SEI) [5]. 
However, the validity and reliability of these tools were tested on undergraduate nursing students, with no consideration of different seniority and competence levels or of applicability to other healthcare professions. There is an adequate number of tools to measure clinical reasoning while attending simulation. However, there is a strong case for testing the reliability and validity of these tools across different competence and seniority levels, and for their applicability to other healthcare professions.

2020 ◽  
Vol 5 (2) ◽  
pp. 28-35
Author(s):  
Jenny Novina Sitepu

Background: Clinical skills are one of the competencies of a doctor. The Objective Structured Clinical Examination (OSCE) is an ideal way to assess the clinical skills of undergraduate, graduate, and postgraduate clinical students. Low scores in some OSCE stations can be an input for teaching and curriculum improvement. This study aimed to analyze student competency achievement in the first term of the 2017/2018 academic year at Fakultas Kedokteran Universitas HKBP Nommensen. Methods: This was a qualitative study with a descriptive design. The sample was the OSCE scores from the first term of the 2017/2018 academic year. Student achievement was the mean score of every student across all OSCE stations. Competency achievement was the mean of students' scores for every competency in the OSCE. The stations were then categorized into practice/procedure skills stations and clinical reasoning skills stations. Skills achievement was obtained from the mean score (in percent) of the procedure skills and clinical reasoning stations. In-depth interviews with students and lecturers were held to learn their perceptions of the OSCE. Results: Students' achievement in the OSCE of the first term of the 2017/2018 academic year was 62.4% for the 2015 cohort and 64.6% for the 2016 cohort. The lowest competency achievement of the 2015 cohort was diagnosis and differential diagnosis; for the 2016 cohort, it was pharmacology treatment. Practice/procedure skills achievement was 61.34% (2015 cohort) and 74.4% (2016 cohort). Clinical reasoning skills achievement was 62.80% (2015 cohort) and 58.77% (2016 cohort). Based on the in-depth interviews, the factors behind students' low achievement were that students' clinical reasoning ability was still weak, the standardized patients involved in the OSCE did not act their roles properly, students' knowledge of medicines and prescribing was poor, and there were many learning schedules and subjects that students had to complete and learn. 
Conclusions: Students' achievement in the OSCE of the first term of the 2017/2018 academic year needs to be improved.


Author(s):  
Jordan D. Tayce ◽  
Ashley B. Saunders

The development of clinical reasoning skills is a high priority during clinical service, but an unpredictable case load and limited time for formal instruction make it challenging for faculty to foster and assess students' individual clinical reasoning skills. We developed an assessment-for-learning activity that helps students build their clinical reasoning skills, based on a modified version of the script concordance test (SCT). To modify the standard SCT, we simplified it by limiting students to a 3-point Likert scale instead of a 5-point scale and added a free-text box for students to justify their answers. Students completed the modified SCT during clinical rounds to prompt a group discussion with the instructor. Student feedback was positive, and the instructor gained valuable insight into the students' thought processes. A modified SCT can be adopted as part of a multimodal approach to teaching on the clinic floor. The purpose of this article is to describe our modifications to the standard SCT and our findings from implementing it in a clinical rounds setting as a method of formative assessment for learning and developing clinical reasoning skills.


Author(s):  
Fancy Paul K

Evidence-based educational practices pave the way for better reasoning, judgment, and decision-making in the clinical setting. The Clinical Reasoning Web is a method of critical patient analysis in which explaining relationships among nursing diagnoses supports the development of clinical reasoning skills; it helps students learn to think like a nurse. Effective clinical reasoning ability promotes the skills needed to collect data, solve problems, make decisions, provide quality care, and thrive in the workplace. Explanations also encourage nurses and nursing students to reason forwards from a problem to an outcome and backwards from the outcome or effect to the current state of the patient. In this paper, the author describes the nursing care of a client who was involved in a road traffic accident and diagnosed with a right fronto-temporo-parietal contusion, brainstem contusion, and an acute extradural hematoma in the left temporal region. Cranio-cerebral trauma and traumatic brain injury are general designations for injury to the skull, brain, or both that is of sufficient magnitude to interfere with normal function and require treatment. The patient was unconscious, with a GCS of E1 VT M1 = 2T/15 (verbal response could not be assessed because of the ET tube). The author illustrates how a clinical reasoning web can be formulated effectively by identifying the keystone issue and related problems.


Diagnosis ◽  
2019 ◽  
Vol 6 (2) ◽  
pp. 115-119 ◽  
Author(s):  
Shwetha Iyer ◽  
Erin Goss ◽  
Casey Browder ◽  
Gerald Paccione ◽  
Julia Arnsten

Abstract Background Errors in medicine are common and often tied to diagnosis. Educating physicians about the science of cognitive decision-making, especially during medical school and residency when trainees are still forming clinical habits, may enhance awareness of individual cognitive biases and has the potential to reduce diagnostic errors and improve patient safety. Methods The authors aimed to develop, implement and evaluate a clinical reasoning curriculum for Internal Medicine residents. The authors developed and delivered a clinical reasoning curriculum to 47 PGY2 residents in an Internal Medicine Residency Program at a large urban hospital. The clinical reasoning curriculum consists of six to seven sessions with the specific aims of: (1) educating residents on cognitive steps and reasoning strategies used in clinical reasoning; (2) acknowledging the pitfalls of clinical reasoning and learning how cognitive biases can lead to clinical errors; (3) expanding differential diagnostic ability and developing illness scripts that incorporate discrete clinical prediction rules; and (4) providing opportunities for residents to reflect on their own clinical reasoning (also known as metacognition). Results Forty-seven PGY2 residents participated in the curriculum (2013–2016). Self-assessed comfort in recognizing and applying clinical reasoning skills increased in 15 of 15 domains (p < 0.05 for each). Resident mean scores on the knowledge assessment improved from 58% pre-curriculum to 81% post curriculum (p = 0.002). Conclusions A case vignette-based clinical reasoning curriculum can effectively increase residents’ knowledge of clinical reasoning concepts and improve residents’ self-assessed comfort in recognizing and applying clinical reasoning skills.

