The Annual Program Evaluation, Self-Study, and 10-Year Accreditation Site Visit: Connected Steps in Facilitating Program Improvement

2017 ◽  
Vol 9 (1) ◽  
pp. 147-149 ◽  
2018 ◽  
Vol 29 (2) ◽  
pp. 15-16
Author(s):  
Margaret Thomas-Evans ◽  
Carrie Longley ◽  
M. Michaux Parker
Author(s):  
Shelley Anne Doucet ◽  
Diane MacKenzie ◽  
Elaine Loney ◽  
Anne Godden-Webster ◽  
Heidi Lauckner ◽  
...  

Background: The Dalhousie Health Mentors Program (DHMP) is a community-based, pre-licensure interprofessional education initiative that aims to prepare health professional students for collaborative practice in the care of patients with chronic conditions. This program evaluation explores the students’ 1) learning and plans to incorporate skills into future practice; 2) ratings of program content, delivery, and assignments; 3) perspectives of curricular factors that inadvertently acted as barriers to learning; and 4) program improvement suggestions. Methods: All students (N = 745) from the 16 participating health programs were invited to complete an online mixed-methods program evaluation survey at the conclusion of the 2012–2013 DHMP. A total of 295 students (40% response rate) responded to the Likert-type questions, which were analyzed using descriptive and non-parametric statistics. Of these students, 204 (69%) provided responses to 10 open-ended questions, which were analyzed thematically. Findings: While the majority of respondents agreed that they achieved the DHMP learning objectives, the mixed-methods approach identified curriculum integration, team composition, and effectiveness of learning assignments as factors that unintentionally acted as barriers to learning, with three key student recommendations for program improvement. Conclusions: Educators and program planners need to be aware that even well-intended learning activities may result in unintended experiences that hamper interprofessional learning.
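The descriptive and non-parametric analysis of Likert-type survey responses described in the Methods section can be sketched as follows. The data here are hypothetical, invented purely for illustration; they are not the DHMP survey responses, and the two-group comparison is only one plausible instance of the kind of non-parametric test such an evaluation might use.

```python
# Illustrative sketch (hypothetical data, not the DHMP survey):
# descriptive and non-parametric analysis of Likert-type responses.
import numpy as np
from scipy import stats

# Hypothetical 5-point Likert ratings from two health programs.
nursing = np.array([4, 5, 3, 4, 5, 4, 2, 5, 4, 3])
pharmacy = np.array([3, 2, 4, 3, 2, 3, 4, 2, 3, 3])

# Descriptive statistics: for ordinal Likert data, medians are
# generally preferred over means.
print("nursing median:", np.median(nursing))
print("pharmacy median:", np.median(pharmacy))

# Non-parametric comparison: the Mann-Whitney U test makes no
# normality assumption, which suits ordinal response scales.
u_stat, p_value = stats.mannwhitneyu(nursing, pharmacy,
                                     alternative="two-sided")
print(f"U = {u_stat:.1f}, p = {p_value:.4f}")
```

For matched pre/post ratings from the same students, a Wilcoxon signed-rank test (`stats.wilcoxon`) would be the analogous non-parametric choice.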


2020 ◽  
Vol 11 (4) ◽  
pp. 34-44
Author(s):  
Jahnette Wilson ◽  
Sam Brower ◽  
Teresa Edgar ◽  
Amber Thompson ◽  
Shea Culpepper

Accountability and rigor in teacher education have been the focus of recent policy initiatives, so data use practices have become increasingly critical to informing program improvement. Educational researchers have established self-study as a research methodology to be used intentionally by teacher educators to improve their practice. The purpose of the self-study described in this article was to examine the data use practices of one teacher preparation program in order to improve the program's capacity to use its own data. The qualitative data gathered in this case study proved pivotal to the program's continuous improvement efforts; the findings therefore have implications for how institutional self-study and qualitative data can complement quantitative programmatic data in facilitating program improvement initiatives.


1993 ◽  
Vol 44 (3) ◽  
pp. 190-199 ◽  
Author(s):  
Judy R. Wilkerson ◽  
Evelyn F. Searls ◽  
A. Edward Uprichard

2015 ◽  
Vol 30 (12) ◽  
pp. 1743 ◽  
Author(s):  
Hyun Bae Yoon ◽  
Jwa-Seop Shin ◽  
Seung-Hee Lee ◽  
Do-Hwan Kim ◽  
Jinyoung Hwang ◽  
...  

2014 ◽  
Vol 6 (1) ◽  
pp. 133-138 ◽  
Author(s):  
Aditee P. Narayan ◽  
Shari A. Whicker ◽  
Betty B. Staples ◽  
Jack Bookman ◽  
Kathleen W. Bartlett ◽  
...  

Abstract Background: Program evaluation is important for assessing the effectiveness of the residency curriculum. Limited resources are available, however, and curriculum evaluation processes must be sustainable and well integrated into program improvement efforts. Intervention: We describe the pediatric Clinical Skills Fair, an innovative method for evaluating the effectiveness of residency curriculum through assessment of trainees in 2 domains: medical knowledge/patient care and procedure. Each year from 2008 to 2011, trainees completed the Clinical Skills Fair as rising interns in postgraduate year (PGY)-1 (R1s) and again at the end of the intern year, as rising residents in PGY-2 (R2s), allowing each cohort to be assessed at the beginning and end of the year to determine how well the curriculum prepared them to meet the intern goals and objectives. Results: Participants were 48 R1s and 47 R2s. In the medical knowledge/patient care domain, intern scores improved from 48% to 65% correct (P < .001). Significant improvement was demonstrated in the following subdomains: jaundice (41% to 65% correct; P < .001), fever (67% to 94% correct; P < .001), and asthma (43% to 62% correct; P = .002). No significant change was noted within the arrhythmia subdomain. There was significant improvement in the procedure domain for all interns (χ² = 32.82, P < .001). Conclusions: The Clinical Skills Fair is a readily implemented and sustainable method for our residency program curriculum assessment. Its feasibility may allow other programs to assess their curriculum and track the impact of programmatic changes; it may be particularly useful for program evaluation committees.
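A chi-square comparison of pre/post pass rates, the kind of analysis reported for the procedure domain above, can be sketched as follows. The counts are hypothetical (only the cohort sizes, 48 R1s and 47 R2s, come from the abstract); the reported χ² = 32.82 was computed from the study's actual data, not these numbers.

```python
# Illustrative sketch (hypothetical pass/fail counts, not the
# study's raw data): chi-square test on a 2x2 contingency table.
from scipy.stats import chi2_contingency

# Rows: beginning vs end of intern year; columns: pass vs fail.
table = [[20, 28],   # rising PGY-1: 20 of 48 passed (assumed)
         [42, 5]]    # rising PGY-2: 42 of 47 passed (assumed)

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.2g}")
```

Because the same trainees are tested at both time points, a paired test such as McNemar's (`statsmodels.stats.contingency_tables.mcnemar`) would arguably be even more appropriate than an unpaired chi-square.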


2016 ◽  
Vol 16 (6) ◽  
pp. e24
Author(s):  
Cherie Lewis ◽  
Savanna Carson ◽  
Alan Chin ◽  
Kate Perkins ◽  
James Lee

Author(s):  
Dewa Gede Hendra Divayana ◽  
Baso Intang Sappaile ◽  
I Gusti Ngurah Pujawan ◽  
I Ketut Dibia ◽  
Luh Artaningsih ◽  
...  

<span lang="EN-GB">Expert system applications are one of the many types of software that can facilitate human activities. The concepts underlying expert systems are important enough to be shared with others, which is why institutions specializing in information technology offer dedicated expert system courses. To be effective, such a course must be well designed in terms of its instructional process. In practice, not every expert system course runs as effectively as expected in achieving high-quality learning outcomes, because numerous problems can arise during implementation. Determining how effective an expert system course actually is therefore requires a thorough program evaluation. This study recommends the CSE-UCLA model for conducting an in-depth instructional evaluation of an expert system course program. Through this model, the instructional effectiveness of the course can be measured across five components: system assessment, program planning, program implementation, program improvement, and program certification.</span>

