faculty assessment
Recently Published Documents


TOTAL DOCUMENTS

74
(FIVE YEARS 21)

H-INDEX

8
(FIVE YEARS 1)

2021 ◽  
Vol 4 (4) ◽  
Author(s):  
Ashley Fankhauser ◽  
Morgan Kessler ◽  
Cathy A. McCarty ◽  
Amy Greminger

to guide procedures. As the technology becomes more portable and affordable, schools have increasingly incorporated it into physician training. Ultrasonography may be especially useful in rural settings to address the limitations rural hospitals face in imaging capacity. The mission of many regional medical campuses is to train physicians to work in rural or underserved communities. Given this goal, we wanted to explore how regional medical campuses are utilizing ultrasound preclinically and to determine the best approach for developing a standardized ultrasound curriculum, keeping regional medical campus resources in mind. A literature review of medical schools’ preclinical ultrasound curricula was completed, and information was collected regarding curriculum programming, faculty, assessment, and student feedback. Based on data from this search, a fourteen-question Qualtrics survey was sent to regional medical campuses with questions regarding the use of ultrasound in their own preclinical curricula. Of the 11 campuses that responded, 10 (90.9%) indicated that they include ultrasound in their curriculum. Respondents from nine of these schools progressed through the survey, and the topics covered, the instructors, the patients used, the ultrasound equipment, and the assessment of student knowledge all varied among campuses. The data suggested that regional medical campuses are focusing on similar aspects of ultrasound education; however, a standardized curriculum does not currently exist to ensure that all students receive similar ultrasound training.


2021 ◽  
Vol 13 (6) ◽  
pp. 833-840
Author(s):  
Judith C. French ◽  
Lily C. Pien

ABSTRACT Background Written faculty feedback on resident performance is valuable when it includes components based on assessment for learning. However, it is not clear how often assessment forms include these components for summative and formative feedback. Objective To analyze prompts used in forms for faculty assessment of resident performance, guided by best practices in survey research methodology, self-regulation theory, and competency-based assessment. Methods A document analysis, a qualitative approach used to analyze the content and structure of texts, was completed on assessment forms nationally available in MedHub. Due to the number of forms available, only internal medicine and surgery specialties were included. A document summary form was created to guide researchers through the analysis of the assessments. Results Forty-eight forms were reviewed, each from a unique residency program. All forms provided a textbox for comments, and 54% made this textbox required for assessment completion. Eighty-three percent of assessments placed the open textbox at the end of the form. One-third of forms contained a simple prompt, “Comments,” for the narrative section. Fifteen percent of forms included a box to check if the information on the form had been discussed with the resident. Fifty percent of the assessments did not make clear whether they were meant to be formative or summative in nature. Conclusions Our document analysis of assessment forms revealed that they do not always follow best practices in survey design for narrative sections, nor do they universally address elements deemed important for the promotion of self-regulation and competency-based assessment.


2021 ◽  
Vol 111 (1) ◽  
pp. e6-e7
Author(s):  
Greeshma Rajeev-Kumar ◽  
Rajashri Manjunath ◽  
Rahul Tendulkar ◽  
Kimberly Corbin ◽  
Reshma Jagsi ◽  
...  

2021 ◽  
pp. 028418512098157
Author(s):  
Mary L Dinh ◽  
Rana Yazdani ◽  
Nikhil Godiyal ◽  
Cory M Pfeifer

Background Overnight radiology resident discrepancies have been described in multiple studies; however, study of resident discrepancies specific to pediatric radiology is limited. Purpose To examine radiology resident discrepancies as they pertain to a large pediatric hospital system. Material and Methods A total of 21,560 preliminary reports issued by 39 residents over a one-year period were scored as agreement, minor discrepancy, or major discrepancy by faculty members using a modification of the 2009 RADPEER scoring system. Residents were trainees of three different diagnostic radiology programs: large university-based, medium-sized community-based, or small community-based. Discrepancy rates were evaluated by resident postgraduate year, program, and imaging modality. The effect of a general pediatric radiology report versus a pediatric neuroradiology report of a CT scan was also tested; CT was the only modality in which comparable numbers of studies were scored by both general pediatric radiologists and neuroradiologists. Results The rate of major resident-to-faculty discrepancies was 1.01%, and the rate of minor resident-to-faculty discrepancies was 4.47%. Major discrepancy rates for postgraduate years 3, 4, and 5 were 1.08%, 0.75%, and 1.59%, respectively. Major discrepancy rates were highest for MR (11.22%), followed by CT (1.82%), radiographs (0.91%), and ultrasound (0.56%). There was no significant difference in discrepancy rate between residency programs or between general pediatric radiology and pediatric neuroradiology reports of CT scans. Conclusion Radiology discrepancy rates for residents issuing preliminary reports at a large children’s hospital system are similar to those reported for adult practice.


2020 ◽  
Author(s):  
Shaikha Alzaabi ◽  
Mohammed Nasaif ◽  
Amar Hassan Khamis ◽  
Farah Otaki ◽  
Nabil Zary ◽  
...  

BACKGROUND The utility of peer learning in clinical skills education is well recognized and researched, given its many benefits: enhanced learning, alleviation of the burden on faculty, and early development of future doctors' teaching skills. However, little is known about its effectiveness as an assessment tool and the extent to which peer assessment can be relied upon in the absence of faculty support. OBJECTIVE This study was conducted to assess medical students' perceptions of peer learning as an assessment tool and to compare peer and faculty evaluations of clinical skill performance. METHODS A cohort of 36 third-year medical students was exposed to clinical skills-focused, same-level peer learning for three months. A mixed-methods approach was adopted to collect data, including students' perceptions of peer learning, performance scores, and reflective observational analysis. A five-point Likert-scale instrument was used to assess students' (n=28) perceptions of the value of peer learning. The students were asked to assess their peers using a pre-set checklist of clinical skill performance, and scores were compared to faculty assessment scores. Reflective observational data were collected from video recordings of a subset of the peer learning sessions. RESULTS Twenty-five of the 28 students completed the survey. Twenty students perceived peer learning as valuable in clinical skills education. The mean peer assessment score was higher than the mean faculty assessment score. There was a significant difference in student performance between the supervised teaching and peer learning groups (P = 0.003). Observations showed that most students focused on mastery of the skill with little attention to the quality of technique. Students were also unable to appreciate the relevance of potential clinical findings on physical examination.
CONCLUSIONS Peer learning in clinical skills teaching empowers students to develop a more responsible approach toward their education. However, peer assessment is insufficient to evaluate clinical skill performance in the absence of faculty support. Therefore, we recommend the presence of faculty to guide and supervise peer learning activities.


2020 ◽  
Vol 32 (6) ◽  
pp. 10-13
Author(s):  
Karen Singer‐Freeman ◽  
Christine Robinson ◽  
Elise Demeter ◽  
Mitchel L. Cottenoir ◽  
Harriet T. Hobbs
