Task and Performance-Based Assessment

2017 ◽  
pp. 121-133 ◽  
Author(s):  
Gillian Wigglesworth ◽  
Kellie Frost

2021 ◽  
Vol 5 (2) ◽  
pp. 109-127
Author(s):  
Yuliana Yuliana

Online learning became a requirement during the COVID-19 pandemic, with the Ministry of Education asking teachers and students to move to online classes. Teaching English using ICT is challenging because not all teachers are familiar with ICT. This paper aims to describe the role and implementation of ICT in teaching English during the COVID-19 pandemic. Method: this is a literature review. Literature was searched in the Science Direct and Google Scholar databases using the keywords COVID-19, English teaching, ICT, implementation, and role. Results revealed that ICT in English teaching has developed from CALL (Computer-Assisted Language Learning) and TELL (Technology-Enhanced Language Learning) to LMSs (Learning Management Systems) and blended learning. YouTube and WhatsApp are preferred because they are practical. Performance-based assessment is important in teaching English because students can learn to perform, speak, and debate during the performance. In conclusion, many systems are available for teaching English using ICT: CALL, TELL, LMSs, blended learning, WhatsApp, YouTube, and performance-based assessment. The main goal is students' understanding of English. The choice depends on the study goal, teachers' and students' preferences, and feasibility.  


2002 ◽  
Vol 46 (2) ◽  
pp. 365-378 ◽  
Author(s):  
Andrew Clifford

Abstract This article examines interpreter assessment and draws attention to the limits of a lexico-semantic approach. It proposes using features of discourse theory to identify some of the competencies needed to interpret and suggests developing assessment instruments with the technical rigour common in other fields. The author gives examples of discursive features in interpretation and shows how these elements might be used to construct a rubric for assessing interpreter performance.


2009 ◽  
Vol 19 (4) ◽  
pp. 633-640 ◽  
Author(s):  
Benedict Martin Wand ◽  
Lara A. Chiffelle ◽  
Neil Edward O’Connell ◽  
James Henry McAuley ◽  
Lorraine Hilary DeSouza

2000 ◽  
Vol 124 (2) ◽  
pp. 195-202 ◽  
Author(s):  
Peter J. Howanitz ◽  
Paul N. Valenstein ◽  
Glen Fine

Abstract Objective.—To survey employee competence assessment practices in departments of pathology and laboratory medicine and provide suggestions for improvement. Design.—A 3-part study consisting of a questionnaire about current competence assessment practices, an evaluation of compliance with stated competence assessment practices using personnel records of 30 employees, and a written appraisal of competence of 5 specimen-processing staff members per institution. Setting.—A total of 522 institutions participating in the College of American Pathologists 1996 Q-Probes program. Main Outcome Measures.—Institutional competence assessment practices, compliance of each institution with its own practices, and determination of competence of specimen-processing personnel. Results.—Of the participating institutions, 89.8% had a written competence plan and 98.1% reported reviewing employee competence at least yearly. General competence was reviewed by direct observations (87.5%), review of test or quality control results (77.4%), review of instrument preventive maintenance (60.0%), written testing (52.2%), and/or other methods (20.8%). In 8.6% of institutions, employees who failed competence assessment were not allowed to continue their usual work. On review of records of 14 029 employees for adherence to the laboratory's general competence plan, adherence was 89.7% for direct observations, 85.8% for review of quality control and test results, 78.0% for review of instrument records, and 74.0% for written testing. Employee failure rate ranged from 0.9% to 6.4%, depending on the competence evaluated. Adherence to an institution's plan was 90.4% for new employees, 93.1% for computer skills, 95.8% for laboratory safety, and 92.1% for continuing education. When a written competence assessment was given to 2853 specimen-processing staff members, 90.0% responded satisfactorily. 
Conclusions.—Opportunities for improvement in employee competence assessment are numerous, and we provide several specific suggestions.


2018 ◽  
Vol 55 (1) ◽  
pp. 52-77
Author(s):  
Derek M. Fay ◽  
Roy Levy ◽  
Vandhana Mehta
