DEVELOPMENT OF ANALYTIC RUBRICS FOR COMPETENCY ASSESSMENT

Author(s):  
Nikita Dawe ◽  
Gayle Lesmond ◽  
Susan McCahan ◽  
Lisa Romkey

This project aims to create and validate generic rubrics that can be used to authentically assess learning outcomes in core competency areas. As these rubrics are intended for ongoing use by students and educators who have had no involvement in their development, ensuring consistent interpretation and application is a challenge. This paper describes the rubric development methodology and progress to date on the Teamwork, Communication, and Design rubrics, including the refinement of outcomes, indicators, and descriptors in response to expert feedback. We also discuss challenges that have delayed testing and deployment, as well as future steps.
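As a rough illustration only, the analytic rubric structure described above (outcomes broken down into indicators, each with a descriptor per performance level) could be modelled as simple data objects. The Python sketch below uses hypothetical names, levels, and descriptor text for a single rubric row; it is not the project's actual Teamwork rubric.

```python
from dataclasses import dataclass, field

@dataclass
class Indicator:
    """One indicator under an outcome, with one descriptor per performance level."""
    name: str
    descriptors: dict = field(default_factory=dict)  # performance level -> descriptor text

@dataclass
class Outcome:
    """A competency outcome (e.g. Teamwork) broken down into indicators."""
    name: str
    indicators: list = field(default_factory=list)

# Hypothetical single row of a generic Teamwork rubric
teamwork = Outcome(
    name="Teamwork",
    indicators=[
        Indicator(
            name="Contribution to team goals",
            descriptors={
                "Below expectations": "Rarely completes assigned tasks on time.",
                "Meets expectations": "Completes assigned tasks and supports team decisions.",
                "Exceeds expectations": "Anticipates and completes tasks that advance team goals.",
            },
        )
    ],
)

print(teamwork.indicators[0].descriptors["Meets expectations"])
```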

Author(s):  
Gayle Lesmond ◽  
Nikita Dawe ◽  
Susan McCahan ◽  
Lisa Romkey

The shift towards outcomes-based assessment in higher education has necessitated the exploration and development of valid measurement tools. Given this trend, the current project seeks to develop a set of generic analytic rubrics for the purpose of assessing learning outcomes in the core competency areas of design, communication, teamwork, problem analysis and investigation. This paper will provide an update on the original paper presented at CEEA 2015, in which the approach to rubric development for communication, design and teamwork was discussed. The current paper will detail the process of testing the communication, design and teamwork rubrics. In particular, it will report on the progress achieved in shadow testing, where teaching assistants and/or course instructors with grading experience (“assessors”) are asked to evaluate samples of student work using selected rows from the rubrics. The results of shadow testing will be presented.
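To illustrate the kind of data shadow testing produces, the sketch below tabulates hypothetical assessor ratings on a single rubric row across a few work samples and counts how often the assessors agree exactly. This is an assumed analysis shown for illustration, not the authors' reported procedure, and all names and scores are made up.

```python
from collections import Counter

# Hypothetical shadow-test data: assessor -> {work sample -> level assigned on one rubric row}
scores = {
    "assessor_1": {"S1": "Meets", "S2": "Exceeds", "S3": "Below"},
    "assessor_2": {"S1": "Meets", "S2": "Meets",   "S3": "Below"},
    "assessor_3": {"S1": "Meets", "S2": "Exceeds", "S3": "Below"},
}

samples = sorted(next(iter(scores.values())))
unanimous = 0
for sample in samples:
    levels = [ratings[sample] for ratings in scores.values()]
    level, count = Counter(levels).most_common(1)[0]
    if count == len(scores):  # every assessor assigned the same level
        unanimous += 1

print(f"Samples rated unanimously: {unanimous}/{len(samples)}")  # 2/3 in this made-up data
```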


2015 ◽  
Vol 12 (1) ◽  
pp. 1-22 ◽  
Author(s):  
Imroatus Solikhah

This article aims to describe the Competency-Based Curriculum in response to the advent of the National Qualification Framework (KKNI), which sets an Outcomes-Based Curriculum across a wide range of education practices. The objectives of the article are to examine the nature of competency and the learning outcomes delineated in the KKNI, clarifying some terms that remain confusing. Concepts of curriculum design pertaining to the development of needs analysis are briefly discussed. In addition, a substantial discussion of the learning outcomes, core competencies, competencies, and objectives on which curriculum development is based is outlined. From the perspective of Indonesian policy, the Competency-Based Curriculum will no longer be implemented, as the advent of the KKNI will have a great impact on the Outcomes-Based Curriculum.


2015 ◽  
Vol 34 (6) ◽  
pp. 329-336 ◽  
Author(s):  
Leigh Ann Cates ◽  
Sheryl Bishop ◽  
Debra Armentrout ◽  
Terese Verklan ◽  
Jennifer Arnold ◽  
...  

Purpose: Determine content validity of global statements and operational definitions and choose scenarios for Competency, Assessment, Technology, Education, and Simulation (C.A.T.E.S.), an instrument in development to evaluate multidimensional competency of neonatal nurse practitioners (NNPs).
Design: Real-time Delphi (RTD) method pursuing four specific aims (SAs): (1) identify which cognitive, technical, or behavioral dimension of NNP competency accurately reflects each global statement; (2) map the global statements to the National Association of Neonatal Nurse Practitioners (NANNP) core competency domains; (3) define operational definitions for the novice-to-expert performance subscales; and (4) determine the essential scenarios to assess NNPs.
Sample: Twenty-five NNPs and nurses with competency and simulation experience.
Main outcome variable: One hundred percent of global statements correct for competency dimension and all but two correct for NANNP domain. One hundred percent of novice-to-expert operational definitions and eight scenarios chosen.
Results: Content validity determined for global statements and novice-to-expert definitions, and essential scenarios chosen.
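For scale, the reported mapping results reduce to simple proportions of correctly mapped statements. The figures below are hypothetical placeholders (the abstract does not state how many global statements were reviewed) used only to show the arithmetic.

```python
# Hypothetical counts, shown only to illustrate the proportion-correct arithmetic
total_statements = 30   # assumed number of panel-reviewed global statements
correct_dimension = 30  # all mapped to the intended competency dimension
correct_domain = 28     # "all but two" mapped to the intended NANNP domain

print(f"Dimension agreement: {correct_dimension / total_statements:.0%}")  # 100%
print(f"Domain agreement: {correct_domain / total_statements:.0%}")        # 93%
```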


2007 ◽  
Vol 64 (6) ◽  
pp. 390-394 ◽  
Author(s):  
Arnold Tabuenca ◽  
Richard Welling ◽  
Ajit K. Sachdeva ◽  
Patrice G. Blair ◽  
Karen Horvath ◽  
...  

2016 ◽  
Vol 36 (4) ◽  
pp. 295-299 ◽  
Author(s):  
Betsy White Williams ◽  
Phil D. Byrne ◽  
Dillon Welindt ◽  
Michael V. Williams

2020 ◽  
Vol 4 (3) ◽  
pp. 459
Author(s):  
Dea Rian Firmansyah ◽  
Nahadi Nahadi ◽  
Harry Firman

This study aims to develop a good-quality performance assessment instrument that can be used to measure students' scientific thinking skills. The research method used is development and validation. The study describes the five stages of instrument development, namely analysis of performance assessment journals, analysis of Core Competency and Basic Competency in the 2013 curriculum, field surveys, development of indicators and student worksheets, and development of target skills and rubrics. The developed instrument consists of 33 target skills (experimental problem solving) and 9 target skills (quantitative literacy). The expected outcome in this research is the accuracy of students in performing practical work. In the validity test, a CVR value of 1.00 was obtained for the 42 developed target skills, so the instrument was declared valid. Based on the analysis of the research data, it can be concluded that the developed performance assessment instrument is of good quality for measuring students' scientific thinking skills.
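The CVR reported here is presumably Lawshe's content validity ratio, CVR = (n_e - N/2) / (N/2), where n_e is the number of expert panelists rating an item as essential and N is the panel size; a CVR of 1.00 means every panelist rated the item essential. The sketch below shows the calculation with a hypothetical panel size, not the study's actual expert panel or data.

```python
def content_validity_ratio(n_essential: int, n_panelists: int) -> float:
    """Lawshe's CVR: (n_e - N/2) / (N/2)."""
    half = n_panelists / 2
    return (n_essential - half) / half

# Hypothetical panel of 7 experts
print(content_validity_ratio(n_essential=7, n_panelists=7))            # 1.0 -> unanimous "essential" ratings
print(round(content_validity_ratio(n_essential=5, n_panelists=7), 2))  # 0.43
```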

