DOES CARDIAC PHYSICAL EXAM TEACHING USING A CARDIAC SIMULATOR IMPROVE MEDICAL STUDENTS’ DIAGNOSTIC SKILLS?

2017, Vol 33 (10), pp. S44-S45
Author(s): N. Gauthier, C. Johnson, M. Keenan, E. Stadnick, M. Sostok, ...
Cureus, 2019
Author(s): Nadine Gauthier, Christopher Johnson, Ellamae Stadnick, Marissa Keenan, Timothy Wood, ...

2020, Vol 72 (6), pp. 488-491
Author(s): Sumanas Bunyaratavej, Rungsima Kiratiwongwan, Pichaya Limphoka, Kamonpan Lertrujiwanit, Charussri Leeyaphan

Objective: To compare the pattern recognition abilities of final-year medical students and dermatology residents in distinguishing and classifying superficial fungal infections and resembling lesions.

Methods: The study was conducted at the Department of Dermatology, Faculty of Medicine, Siriraj Hospital, Mahidol University, Bangkok, Thailand, in 2019. Participants had to make a diagnosis from 78 images, including typical and atypical lesions, within 50 seconds. No history or other description was given. The answer sheets were then reviewed.

Results: Medical students (n = 18) and dermatology residents (n = 19) showed no significant difference in mean overall accuracy scores. Residents demonstrated a statistically higher mean score than the medical students in diagnosing anthropophilic infections, which mostly presented as typical lesions. However, there were no significant differences in mean scores for diagnoses of zoophilic dermatophytosis presenting as atypical lesions or of other skin lesions.

Conclusion: Pattern recognition was helpful for the diagnosis of cutaneous dermatophytosis, especially in cases of typical lesions. Nonetheless, pattern recognition alone is insufficient for diagnosing atypical dermatophytosis lesions; analytical diagnostic skills should also be strengthened to increase the accuracy of atypical-lesion diagnoses.


PEDIATRICS, 1972, Vol 49 (1), pp. 151-152
Author(s): Howard L. Weinberger, Herbert Schneiderman, Robert Jensen

We should like to present a form developed in a University Hospital Well Child Clinic for use by medical students and house officers. This form was designed to: (a) specify a minimal data base; (b) allow easy recording of normal data; (c) allow easy scanning for abnormal findings; (d) recognize and highlight changes in physical examination which are appropriate at different ages; and (e) allow for relative efficiency in retrieval of information for the supervisor or instructor.


2018, Vol 93 (5), pp. 736-741
Author(s): Toshiko Uchida, Francis I. Achike, Angela D. Blood, Mary Boyle, Jeanne M. Farnan, ...

2008, Vol 13 (1), pp. 4482
Author(s): Lawrence S. Amesse, Ealena Callendar, Teresa Pfaff-Amesse, Janice Duke, William N.P. Herbert

2008, Vol 23 (7), pp. 991-997
Author(s): Sheila Naghshineh, Janet P. Hafler, Alexa R. Miller, Maria A. Blanco, Stuart R. Lipsitz, ...

2020, Vol 129 (7), pp. 715-721
Author(s): Mads J. Guldager, Jacob Melchiors, Steven Arild Wuyts Andersen

Objective: Handheld otoscopy requires both technical and diagnostic skills, which are often reported to be insufficient after medical training. We aimed to develop and gather validity evidence for an assessment tool for handheld otoscopy using contemporary medical educational standards.

Study Design: Educational study.

Setting: University/teaching hospital.

Subjects and Methods: A structured Delphi methodology was used to develop the assessment tool: nine key opinion leaders (otologists) in undergraduate otoscopy training iteratively achieved consensus on its content. Next, validity evidence was gathered through videotaped assessment of two handheld otoscopy performances by each of 15 medical students (novices) and 11 specialists in otorhinolaryngology, using two raters. Standard setting (pass/fail criteria) was explored using the contrasting-groups and Angoff methods.

Results: The developed Copenhagen Assessment Tool of Handheld Otoscopy Skills (CATHOS) consists of 10 items rated on a 5-point Likert scale with descriptive anchors. Validity evidence was collected and structured according to Messick’s framework: for example, the CATHOS had excellent discriminative validity (mean difference in performance between novices and experts of 20.4 out of 50 points, P < .001) and high internal consistency (Cronbach’s alpha = 0.94). Finally, a pass/fail score was established at 30 points for medical students and 42 points for specialists in otorhinolaryngology.

Conclusion: We have developed and gathered validity evidence for an assessment tool of the technical skills of handheld otoscopy and set standards of performance. Standardized assessment allows for individualized learning to the level of proficiency, could be implemented in undergraduate and postgraduate handheld otoscopy training curricula, and is also useful in evaluating training interventions.

Level of Evidence: NA

