BACKGROUND
Improving clinical reasoning skills — the thought processes used by clinicians during consultations to formulate appropriate questions and diagnoses — is essential for reducing missed diagnostic opportunities. The electronic Clinical Reasoning Educational Simulation Tool (eCREST) was developed to improve future doctors’ clinical reasoning skills. A feasibility study demonstrated acceptability and potential impacts, but the processes by which students developed their clinical reasoning remained unknown.
OBJECTIVE
To identify and characterize final-year medical students’ clinical reasoning strategies while using eCREST, and to explore how students interacted with eCREST.
METHODS
A sequential mixed methods design was used. Quantitative data captured in a feasibility trial across three UK medical schools (n=148) were used to identify typologies of reasoning, based on the proportion of essential information students identified and the proportion of relevant questions they asked a virtual patient. Strategies were compared between the intervention and control groups. A qualitative think-aloud and semi-structured interview study was then undertaken with 16 final-year medical students from one medical school to explore how students reasoned while using eCREST. Themes generated from the qualitative data were used to expand the typologies of strategies.
RESULTS
Three types of clinical reasoning strategy were identified: ‘Focused’ (elicited most essential information and asked few irrelevant questions; n=78/148, 53%), ‘Thorough’ (elicited most essential information but asked many irrelevant questions; n=33/148, 22%), and ‘Succinct’ (elicited little essential information but asked few irrelevant questions; n=27/148, 18%). A fourth group was ‘Non-strategic’ (did not elicit enough essential information and asked mostly irrelevant questions; n=10/148, 7%). In the feasibility trial, the intervention group were significantly more likely than controls to adopt a ‘Thorough’ strategy (21/78, 27% vs 6/70, 9%) and less likely to adopt a ‘Succinct’ strategy (13/78, 17% vs 20/70, 29%); χ2 (3)=9.87, P=.02. Use of the other strategies was similar across groups. Thematic analysis identified three dimensions underpinning reasoning: data gathering processes; generating diagnostic hypotheses; and confidence and uncertainty. The mixed methods analysis indicated that those classified as ‘Thorough’ asked many questions to avoid missing key information and reported that eCREST helped them to manage uncertainty. The ‘Succinct’ group aimed to limit the number of questions asked, and eCREST helped them to focus on asking pertinent questions. The ‘Focused’ group had clear rationales for asking questions, but those who used a ‘Non-strategic’ approach did not and may have found eCREST less useful in developing their clinical reasoning.
CONCLUSIONS
Students apply a range of clinical reasoning strategies to online patient simulations such as eCREST. eCREST led students to use more ‘Thorough’ strategies, and students reported that it helped them to manage uncertainty, which could help future doctors to reduce missed diagnostic opportunities. Educators could also use eCREST to support students in developing their clinical reasoning strategies.