Evaluation of Professional Development in the Use of Arts-Integrated Activities with Mathematics Content: Findings About Program Implementation

Author(s):  
Meredith Jane Ludwig ◽  
Mengli Song ◽  
Akua Kouyate-Tate ◽  
Jennifer E. Cooper ◽  
Lori Phillips ◽  
...  
2002 ◽  
Vol 35 (2) ◽  
pp. 161-170 ◽  
Author(s):  
Albert S. Lozano ◽  
Hyekyung Sung ◽  
Amado M. Padilla ◽  
Duarte M. Silva

2018 ◽  
Author(s):  
Friyatmi

This study aimed to evaluate the effectiveness of the implementation of the boarding Teacher Professional Development Program (PPG SM-3T) at the State University of Padang (SUP). The research was an evaluation study using part of the CIPP model, namely the process evaluation. The research questions were: (1) How effective is the implementation of the PPG SM-3T boarding program? (2) What are the weaknesses in the implementation of the PPG SM-3T boarding program at SUP? Data were collected using questionnaires and interviews. The informants were the participants of the PPG SM-3T program at SUP and the program managers. The data were analyzed using descriptive statistical techniques. The results revealed that the overall management of the program implementation was considered less than effective by the participants. The weaknesses of the program implementation are as follows: (1) the boarding education program was poorly organized, so some activities did not function properly and the schedule was often not followed strictly; (2) a lack of coordination among managers and weak supervision of the boarding activities resulted in a poorly executed program; and (3) the meal service provider was unprofessional, resulting in low-quality and unvaried food being served to the participants.


2021 ◽  
pp. 0193841X2110553
Author(s):  
Giovanni Abbiati ◽  
Gianluca Argentin ◽  
Andrea Caputo ◽  
Aline Pennisi

Background: A recent stream of literature recognizes the impact of good or poor implementation on the effectiveness of programs. However, implementation is often disregarded in randomized controlled trials (RCTs) because they are run on a small scale. Replicated RCTs, although rare, provide a unique opportunity to study the relevance of implementation for program effectiveness. Objectives: Evaluating the effectiveness of an at-scale professional development program for lower secondary school math teachers through two repeated RCTs. Research Design: The program lasts a full school year and provides innovative methods for teaching math. The evaluation was conducted on two cohorts of teachers in the 2009/10 and 2010/11 school years. The program and RCTs were held at scale. Participating teachers and their classes were followed for 3 years. Impact is estimated by comparing the math scores of treatment and control students. Subjects: The evaluation involved 195 teachers and their 3940 students (first cohort) and 146 teachers and their 2858 students (second cohort). Measures: The key outcome is students’ math achievement, measured through standardized assessment. Results: In the first wave, the program had no impact on students’ achievement, while in the second wave, a positive, persistent, and non-negligible effect was found. After excluding other sources of change, the different findings across waves are interpreted in light of improvements in program implementation, such as the enrollment procedure, teacher collaboration, and integration of the innovation into daily teaching. Conclusions: Repeated assessment of interventions already at scale provides the opportunity to better identify and correct sources of weak implementation, potentially improving effectiveness.
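The impact estimate described in this abstract (comparing standardized math scores of treatment and control students) can be illustrated with a minimal sketch. This is not the authors' code: the column names (`math_score`, `treatment`, `teacher_id`), the data file, and the use of an OLS regression with teacher-clustered standard errors are assumptions for illustration only.

```python
# Minimal sketch of a treatment/control impact estimate (illustrative, not the
# authors' analysis). Assumed columns: math_score (standardized test score),
# treatment (0/1 indicator), teacher_id (cluster identifier, since teachers
# were the randomized units).
import pandas as pd
import statsmodels.formula.api as smf

students = pd.read_csv("students.csv")  # hypothetical data file

# Regress student scores on the treatment indicator; cluster standard errors
# at the teacher level to respect the randomization design.
model = smf.ols("math_score ~ treatment", data=students).fit(
    cov_type="cluster", cov_kwds={"groups": students["teacher_id"]}
)
print(model.summary())  # the 'treatment' coefficient is the estimated impact
```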


2002 ◽  
Vol 18 (1) ◽  
pp. 29-37 ◽  
Author(s):  
Carl J. Liaupsin

Training for pre-service and in-service teachers that is delivered through distance education methods solves various problems of traditional professional development, including providing a consistent message, training large numbers of personnel, and overcoming scheduling and funding constraints. As with other forms of instruction, professional development training materials should be subjected to a level of evaluation that increases the likelihood of successful implementation. However, it is unclear how this need for evaluation of professional development software is being met. Commercial software companies have been found to eschew evaluation of their products, and classroom teachers can be expected to conduct only limited evaluations of the software. This article contends that educational researchers are well positioned to conduct comprehensive evaluations of the professional development software they create. The purpose of this review of the literature is to (a) consolidate the database of literature with regard to professional development software, (b) examine the degree to which the software described in the literature has been comprehensively evaluated, and (c) provide suggestions for future research.


1998 ◽  
Vol 12 (2) ◽  
pp. 191-207 ◽  
Author(s):  
Artur Poczwardowski ◽  
Clay P. Sherman ◽  
Keith P. Henschen

This article outlines 11 factors that a consultant may consider when planning, implementing, and evaluating psychological services. These factors are professional boundaries; professional philosophy; making contact; assessment; conceptualizing athletes’ concerns and potential interventions; range, types, and organization of service; program implementation; managing the self as an intervention instrument; program and consultant evaluation; conclusions and implications; and leaving the setting. All 11 factors represent important considerations for applied sport psychology professionals. Although consultants each have their own unique style and approach, these 11 factors are prerequisite considerations that form the foundation of a consultant’s effective practice. These guidelines may provide direction for a practitioner’s professional development, and as such, need time and commitment to be realized.


2021 ◽  
Vol 5 (2) ◽  
pp. 162-171
Author(s):  
Santi Setiani Hasanah* ◽  
Anna Permanasari ◽  
Riandi Riandi

During the pandemic, face-to-face training could not be carried out, in order to reduce the spread of the COVID-19 outbreak. Online teacher professional development is therefore an alternative method to replace face-to-face training. In program implementation, evaluation is an important component for determining whether the program has been implemented well, and in this case for determining the effect of the online training on improving teachers' STEM PCK. The evaluation used the CIPP model (Context, Input, Process, and Product) with an embedded mixed-methods research design. The research was conducted with 60 junior secondary (SMP) science teachers who were alumni of the online training organized by PPPPTK IPA in the "Sayangi Bumi" classroom. The context evaluation shows that 100% of respondents stated that the program is in accordance with teachers' needs to strengthen their PCK, while the resources used in the program (input evaluation) were good. The program implementation process ran 100% as planned, and the product evaluation shows an increase in teachers' understanding of the STEM approach. Teachers considered the online professional development very useful and felt it can develop their skills in implementing STEM learning during distance learning. Apart from being low-cost, online PD can be an alternative way of improving teachers' STEM PCK competence, and it can also reach all islands in Indonesia.


Author(s):  
Nazmul Islam ◽  
Amy A. Weimer

Engaging undergraduate students in research not only improves discipline-specific knowledge and skillsets, but also exposes them to research-related career paths and motivates their pursuit of graduate study. With an interest in increasing these outcomes among students, the University of Texas Rio Grande Valley (UTRGV) developed the Student Mentoring and Research Training (SMART) program. The primary objective of the program was to provide an increased number of undergraduate (UG) research opportunities by building triadic teams comprised of (1) a faculty mentor, (2) a graduate student assistant, and (3) at least three undergraduate students. By utilizing graduate student mentors, in collaboration with faculty researchers, an increased number of undergraduates could benefit from participation in these experiential learning opportunities. SMART also encouraged graduate student professional development, as each graduate student oversaw a research project and was responsible for mentoring the UGs over a five-week period of employment. Three professional development workshops were implemented for graduate mentors, focused on teaching graduate students best practices in teaching research skills and in building motivation, teamwork, and leadership. Pre- and post-test surveys were used to assess program effectiveness. Findings are reported on SMART program outcomes, which include analyses of quantitative and qualitative data collected from undergraduate student mentees and graduate student mentors during the first year of program implementation.
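As a purely illustrative companion to the pre-/post-test survey design described in this abstract (not the SMART program's actual analysis), a minimal paired comparison might look like the following; the data file and the column names `pre_score` and `post_score` are assumptions.

```python
# Minimal sketch of a pre-/post-test comparison (illustrative only, not the
# SMART program's actual analysis). Assumed columns: pre_score and post_score,
# one row per participant, measured before and after the program.
import pandas as pd
from scipy import stats

surveys = pd.read_csv("smart_surveys.csv")  # hypothetical data file

# Paired t-test: did mean scores change from pre- to post-program?
t_stat, p_value = stats.ttest_rel(surveys["post_score"], surveys["pre_score"])
mean_gain = (surveys["post_score"] - surveys["pre_score"]).mean()
print(f"mean gain = {mean_gain:.2f}, t = {t_stat:.2f}, p = {p_value:.3f}")
```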

