Teaching Clinical Reasoning to Undergraduate Medical Students

2005 ◽  
pp. 53-71 ◽  
Author(s):  
Jochanan Benbassat


2019 ◽  
Vol 19 (1) ◽  
Author(s):  
Fares Gouzi ◽  
Christophe Hédon ◽  
Léo Blervaque ◽  
Emilie Passerieux ◽  
Nils Kuster ◽  
...  

Abstract Background Over-testing of patients is a significant problem in clinical medicine that can be tackled by education. Clinical reasoning learning (CRL) is a potentially relevant method for teaching test ordering and interpretation, and its feasibility might be improved by using an interactive whiteboard (IWB) during CRL sessions to enhance student perceptions and behaviours around diagnostic tests. Overall, IWB/CRL could improve students' test ordering and interpretation skills. Methods Third-year undergraduate medical students enrolled in a vertically integrated curriculum were randomized into two groups before a clinical placement in either a respiratory disease or a respiratory physiology unit: IWB-based CRL plus clinical mentoring (IWB/CRL + CM: n = 40) or clinical mentoring only (CM-only: n = 40). Feasibility and learning outcomes were assessed. In addition, questionnaire feedback from the IWB students and their classmates (n = 233) was compared. Results Analyses of the IWB/CRL sessions (n = 40, 27 paperboards) revealed that they met validated learning objectives. Students perceived the IWB as useful and easy to use. After the IWB/CRL + CM sessions, students mentioned more hypothesis-based indications in a test ordering file (p < 0.001) and looked for more nonclinical signs directly on raw test data (p < 0.01) than students in the CM-only group. Last, among students who attended pre- and post-assessments (n = 23), the number of diagnostic tests ordered did not change in the IWB/CRL + CM group (+7%; p = n.s.), whereas it increased among CM-only students (+30%; p < 0.001). Test interpretability increased significantly in the IWB/CRL + CM group (from 4.7 to 37.2%; p < 0.01) but not in the CM-only group (from 2.4 to 9.8%; p = 0.36). Conclusions Integrating an IWB into CRL sessions is a feasible way to teach test ordering and interpretation to undergraduate students. Moreover, student feedback and the prospective assessment suggested a positive impact of IWB/CRL sessions on students' learning.


2020 ◽  
Vol 20 (1) ◽  
Author(s):  
Sophie Fürstenberg ◽  
Tillmann Helm ◽  
Sarah Prediger ◽  
Martina Kadmon ◽  
Pascal O. Berberat ◽  
...  

Abstract Background The clinical reasoning process, which requires biomedical knowledge, knowledge about problem-solving strategies, and knowledge about reasons for diagnostic procedures, is a key element of physicians' daily practice but is difficult to assess. The aim of this study was to empirically develop a Clinical Reasoning Indicators-History Taking-Scale (CRI-HT-S) and to assess the clinical reasoning ability of advanced medical students during a simulation involving history taking. Methods The CRI-HT-S, including a 5-point Likert scale for assessment, was designed from clinical reasoning indicators identified in a qualitative study in 2017. To assess indicators of clinical reasoning ability, 65 advanced medical students (semester 10, n = 25 versus final year, n = 40) from three medical schools participated in a 360-degree competence assessment in the role of beginning residents during a simulated first workday in hospital. This assessment included a videotaped consultation hour with five simulated patients. Videos of 325 patient consultations were assessed using the CRI-HT-S. A factor analysis was conducted, and the students' results were compared according to their advancement in undergraduate medical training. Results The clinical reasoning indicators of the CRI-HT-S loaded on three factors relevant for clinical reasoning: 1) focusing questions, 2) creating context, and 3) securing information. Students reached significantly different scores (p < .001) for the three factors (factor 1: 4.07 ± .47, factor 2: 3.72 ± .43, factor 3: 2.79 ± .83). Students in semester 10 reached significantly lower scores for factor 3 than students in their final year (p < .05). Conclusions The newly developed CRI-HT-S worked well for quantitative assessment of clinical reasoning indicators during history taking. Its three-factor structure helped to explore different aspects of clinical reasoning. 
Whether the CRI-HT-S has the potential to be used as a scale in objective structured clinical examinations (OSCEs) or in workplace-based assessments of clinical reasoning has to be investigated in further studies with larger student cohorts.


2021 ◽  
Vol 2 (2) ◽  
pp. 27-32
Author(s):  
Alexandra Rogler ◽  
Sophie Freilinger ◽  
Peter Pokieser ◽  
Michaela Wagner-Menghin

Clinical reasoning, the application of medical knowledge to a patient's problem, requires training in a safe environment. Learning tasks based on virtual patients (VP-tasks) simulate the clinical setting in a safe way and integrate well into blended-learning environments, as synchronous tasks (face-to-face or online) or as asynchronous online tasks. This article presents the editorial process for developing VP-based self-study quizzes (SSQs) and field-study results on students' learning experiences and study habits. The editorial process initially involved only experienced clinical, educational, and technical experts; to better match the tasks' difficulty to students' knowledge, junior doctors and advanced medical students joined at a later stage. Students (n = 351) rated the SSQs (n = 10) produced by the expanded team as matching their knowledge better than the SSQs (n = 13) developed by the initial expert editorial team. Students also rated the online SSQs as more helpful than similar face-to-face VP-tasks, and their free comments indicate high acceptance of the SSQ format. The SSQ format is feasible for providing systematic online training in clinical reasoning, especially when working with a multi-level educational editorial team and when a systematically structured blueprint of topics and learning goals drives the editorial work.


2021 ◽  
Author(s):  
Ruth Plackett ◽  
Angelos P. Kassianos ◽  
Sophie Mylan ◽  
Maria Kambouri ◽  
Rosalind Raine ◽  
...  

Abstract Background Virtual patient educational tools could fill the current gap in the teaching of clinical reasoning skills, but there is a limited understanding of their effectiveness. The aim of this study was to synthesise the evidence on the effectiveness of virtual patient tools aimed at improving undergraduate medical students' clinical reasoning skills. Methods We searched MEDLINE, EMBASE, CINAHL, ERIC, Scopus, Web of Science and PsycINFO from 1990 to October 2020 to identify all experimental articles testing the effectiveness of virtual patient educational tools on medical students' clinical reasoning skills. Quality of the articles was assessed using an adapted form of the MERSQI and the Newcastle-Ottawa Scale. A narrative synthesis summarised intervention features, how virtual patient tools were evaluated, and reported effectiveness. Results The search revealed 7,290 articles, of which 20 met the inclusion criteria. Average study quality was moderate (M = 7.1, SD = 2.5), and around a third of articles did not report any measurement of validity or reliability for their clinical reasoning outcome measure (7/20, 35%). Eleven articles found a positive effect of virtual patient tools on reasoning (11/20, 55%), seven reported no significant effect or mixed effects (7/20, 35%), and two found a significantly negative effect (2/20, 10%). Several domains of clinical reasoning were evaluated. Data gathering, ideas about diagnosis and patient management were more often found to improve after virtual patient use (27/46 analyses, 59%) than knowledge, flexibility in thinking, problem-solving, and critical thinking (4/10 analyses, 40%). Conclusions There was some evidence that virtual patient educational tools can improve undergraduate medical students' clinical reasoning skills, so they could effectively complement current teaching, especially where opportunities for face-to-face teaching or other methods are limited. 
Evaluations that measured more case-specific clinical reasoning domains, such as data gathering, showed more consistent improvement than general measures like problem-solving; case-specific measures might be more sensitive to change given the context-dependent nature of clinical reasoning. Consistent use of validated clinical reasoning measures is needed to enable a meta-analysis to estimate effectiveness.


2016 ◽  
Vol 4 (2) ◽  
pp. 113-114
Author(s):  
Maryam Baradaran ◽  
Fariba Salek Ranjbarzadeh ◽  
Parisa Nikasa

2015 ◽  
Vol 11 (2) ◽  
pp. 117-124 ◽  
Author(s):  
Katrina A. Bramstedt ◽  
Ben Ierna ◽  
Victoria Woodcroft-Brown

Social media is a valuable tool in the practice of medicine, but it can also be an area of ‘treacherous waters’ for medical students. Those in their upper years of study are off-site and scattered broadly, undertaking clinical rotations; thus, in-house (university lecture) sessions are impractical. Nonetheless, during these clinical years students are generally high users of social media technology, putting them at risk of harm if they lack appropriate ethical awareness. We created a compulsory session in social media ethics (Doctoring and Social Media) offered in two online modes (narrated PowerPoint file or YouTube video) to fourth- and fifth-year undergraduate medical students. The novelty of our work was the use of SurveyMonkey® to deliver the file links, as well as to take attendance and deliver a post-session performance assessment. All 167 students completed the course and provided feedback. Overall, 73% agreed or strongly agreed that the session would aid their professionalism skills and behaviours, and 95% supported online delivery of the curriculum. The most frequent areas of learning were email correspondence with patients, medical photography, and awareness of medical apps. SurveyMonkey® is a valuable and efficient tool for curriculum delivery, attendance taking, and assessment activities.

