CELBEST Project: Design and Implementation of the First Engineering Education-specific Assessment Tool for Professional Communicative Competence

Author(s):  
Cristina Fabretto

Following the 2010 review of engineering programs in Canada by the Canadian Engineering Accreditation Board (CEAB), the Faculty of Engineering and Applied Science at Memorial University introduced a number of changes to its undergraduate program in order to align with the new CEAB outcome-based accreditation approach [1-3]. As program accreditation reviews began to assess progress toward graduate attribute (GA) assessment, the 12 graduate attributes defined by the CEAB became the de facto undergraduate program outcomes at Memorial. This paper provides an overview of the Faculty’s approach to the development and progressive assessment of communication skills as a graduate attribute (GA 07) in a way that aligns with CEAB accreditation requirements while accounting for the unique challenges and opportunities inherent in its program.

Author(s):  
Shanzhong Shawn Duan ◽  
Kurt Bassett

The assessment of program outcomes for ABET accreditation has become a challenge for engineering programs nationwide. Various methods and approaches have been investigated to develop good practices for program assessment. At South Dakota State University (SDSU), an approach called Faculty Course Assessment Reports (FCAR) has been explored for mechanical engineering (ME) program assessment. The FCAR provides an assessment tool to correlate the ME program outcomes with the outcomes of the core ME courses, and to evaluate student performance at the course level based on the ABET outcome criteria. The process begins with the development of course objectives and outcomes, which are then mapped directly to the ME program objectives and outcomes, respectively. The quantitative and qualitative details generated in the FCAR are further aligned with ABET program outcome criteria a through k via FCAR rubrics. Through the FCAR process, all ME program outcomes are evaluated at the course level against the ABET program outcomes, and the assessment results are being used to improve the ME curriculum. The process was developed to provide an effective tool for ME program outcome assessment at the course level with reasonable effort.
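The outcome-mapping step the abstract describes can be sketched in code. The following is a minimal illustration, not the actual FCAR software: each course outcome maps to one or more ABET outcomes (a–k), and course-level rubric scores are rolled up into per-outcome averages. All course names, mappings, and scores here are invented for illustration.

```python
# Hypothetical course-outcome-to-ABET-outcome mapping (a-k).
# Each course outcome may support more than one ABET outcome.
course_to_abet = {
    "ME311-1": ["a", "e"],   # e.g., apply mechanics principles
    "ME311-2": ["k"],        # e.g., use modern engineering tools
}

# Course-level assessment scores on a 0-4 rubric scale (invented data).
scores = {"ME311-1": 3.2, "ME311-2": 2.8}

def program_outcome_averages(mapping, scores):
    """Average the course-level scores feeding each ABET outcome."""
    totals, counts = {}, {}
    for outcome, abet_list in mapping.items():
        for abet in abet_list:
            totals[abet] = totals.get(abet, 0.0) + scores[outcome]
            counts[abet] = counts.get(abet, 0) + 1
    return {abet: totals[abet] / counts[abet] for abet in totals}

averages = program_outcome_averages(course_to_abet, scores)
```

In a real FCAR process the inputs would come from rubric-scored student work rather than literals, but the roll-up logic is of this general shape.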


Author(s):  
Ever J. Barbero ◽  
Jacky C. Prucz ◽  
Larry E. Banta ◽  
Charles E. Stanley ◽  
Nilay Mukherjee

A comprehensive implementation of outcome portfolios is presented. Outcome portfolios are assessment tools used by the authors to accomplish triangulation in the Accreditation Board for Engineering and Technology (ABET) EC-2000 assessment process. Systematic and effective use of outcome portfolios has provided us with a convenient, reliable, and powerful tool for assessing the level of achievement of our graduates on all the program outcomes for the Aerospace Engineering and Mechanical Engineering programs at West Virginia University. The objective of this paper is to describe our approach to assembling, assessing, and improving outcome portfolios as an essential outcome assessment tool under ABET Criterion 3. The process is illustrated in detail using outcome “k” [1] as an example. Assessment data are presented to support the hypothesis that survey data alone are inconclusive and that outcome portfolios provide additional, valuable information for program enhancement. A comparison between the assessment data for the two programs, Aerospace Engineering and Mechanical Engineering, is used to support our conclusions.


2016 ◽  
Vol 25 (1) ◽  
pp. 43-53 ◽  
Author(s):  
Isabel Garcia de Quevedo ◽  
Felipe Lobelo ◽  
Loren Cadena ◽  
Madalena Soares ◽  
Michael Pratt

Non-communicable diseases (NCDs) are the leading causes of death worldwide, with higher rates of premature mortality in low- and middle-income countries (LMICs). This places a high economic burden on these countries, which usually have limited capacity to address this public health problem. We developed a guided self-assessment tool for describing national capacity for NCD prevention and control. The purpose of this tool was to assist countries in identifying key opportunities and gaps in NCD capacity. It was piloted in three countries between 2012 and 2013: Mozambique, Colombia, and the Dominican Republic. The tool includes details about NCD burden; health system infrastructure and primary care services; workforce capacity; surveillance; planning, policy, and program management; and partnerships. In the three pilot countries, the tool helped to identify differences in capacity needs pertaining to staff, training, and surveillance, but similarities were also found related to NCD challenges and opportunities. The NCD tool increased our understanding of needs and critical capacity elements for addressing NCDs in the three pilot countries. This tool can be used by other LMICs to map their efforts toward addressing NCD goals and defining priorities.


Author(s):  
Nazrul Islam

This chapter aims to provide a new readiness matrix called ‘innovative manufacturing readiness levels (IMRLs)’ to evaluate and assess the maturity of micro and nanotechnologies, including their performance. The study employs a case study approach through which the practicability and applicability of the IMRLs conceptual matrix were verified and confirmed. A case study with laser-based manufacturing technologies explores the stages of micro and nano technologies’ (MNTs’) maturity, including the key issues and performances that contributed to the development of a new assessment tool. Given the intense global R&D competition in MNTs, this study takes a forward-looking approach to assessing MNT maturity and performance. A generic conclusion is reached by which product designers and technology managers can position themselves and account for risk-reduction exercises related to MNTs. The novelty of the research is that organizations which develop and use MNTs have an opportunity to apply such a specific assessment matrix to quantify the technology readiness of unreleased MNTs.


Author(s):  
Aneta George ◽  
Liam Peyton

The Graduate Attribute Information Analysis system (GAIA) was developed at the University of Ottawa to support data collection and performance management of graduate attributes for engineering programs at the program level and at the course level [10]. This paper reports on our research to develop support for cohort analysis and reporting by providing a single consistent view of graduate attributes (GA) and performance indicators for groups of students who started and finished an engineering program at the same time. This is supported by two special-purpose reports: the Graduate Attribute Report per Cohort (GAR/C) and the Course Progression Report per Cohort (CPR/C). The former shows average GA data per attribute, while the latter tracks student achievement as students progress through their program. It also adds to the historical data trend analysis for a program. Furthermore, a COOP Progress Report per Cohort (COOPR/C) is generated.
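The cohort-level averaging behind a report like the GAR/C can be sketched briefly. This is a hypothetical illustration in the spirit of the abstract, not GAIA's implementation: scores for each graduate attribute are averaged over only those students belonging to a given cohort. All record fields, cohort labels, and numbers are invented.

```python
from statistics import mean

# Invented assessment records: (student_id, cohort, attribute, score).
records = [
    ("s1", "2015", "GA07", 78), ("s2", "2015", "GA07", 84),
    ("s3", "2016", "GA07", 90), ("s1", "2015", "GA04", 70),
]

def cohort_report(records, cohort):
    """Average score per graduate attribute for a single cohort."""
    by_attr = {}
    for _sid, c, attr, score in records:
        if c == cohort:
            by_attr.setdefault(attr, []).append(score)
    return {attr: mean(vals) for attr, vals in by_attr.items()}

report = cohort_report(records, "2015")  # averages for the 2015 cohort only
```

Restricting the aggregation to one cohort is what distinguishes this view from a program-wide average: student "s3" above, from a different cohort, does not affect the 2015 report.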


Author(s):  
Iman Moazzen ◽  
Mariel Miller ◽  
Peter Wild ◽  
LillAnne Jackson ◽  
Allyson Hadwin

Design is one of twelve graduate attributes that must be assessed as part of the accreditation process for engineering programs in Canada, as required by the Canadian Engineering Accreditation Board (CEAB). However, assessment of design competence is a complex task because the design process is non-linear and depends on many factors, including communication skills, teamwork skills, individual knowledge and skills, and project complexity. This study aims to capture undergraduate students’ design and teamwork skills and the challenges they face in their design projects. To this end, a low-cost assessment tool which can be implemented and analyzed relatively quickly is presented. The tool is a new survey which assesses students’ self-reported intentions and skills for four key dimensions of team-based engineering design: (a) design process, (b) design communication, (c) teamwork, and (d) regulation of teamwork. The survey was administered to first-year students enrolled in “Design and Communication I” after completion of a final design project. In this paper, the survey development and key findings from the collected data are discussed in detail.
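One common way to score a survey like the four-dimension instrument described above is to average each student's item responses within each dimension. The sketch below assumes a 1–5 Likert scale and a fixed item-to-dimension assignment; the item labels, groupings, and responses are all hypothetical, since the abstract does not publish the instrument itself.

```python
from statistics import mean

# Hypothetical item-to-dimension assignment for the four dimensions
# named in the abstract.
dimensions = {
    "design_process": ["q1", "q2"],
    "design_communication": ["q3"],
    "teamwork": ["q4", "q5"],
    "regulation_of_teamwork": ["q6"],
}

# One student's invented responses on a 1-5 Likert scale.
responses = {"q1": 4, "q2": 5, "q3": 3, "q4": 4, "q5": 2, "q6": 5}

def dimension_scores(dimensions, responses):
    """Mean response per dimension for a single respondent."""
    return {dim: mean(responses[q] for q in items)
            for dim, items in dimensions.items()}

scores = dimension_scores(dimensions, responses)
```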


Author(s):  
Jillian Seniuk Cicek ◽  
Sandra Ingram ◽  
Nariman Sepehri

This paper describes the third year of a study at the University of Manitoba aimed at exploring how the Canadian Engineering Accreditation Board (CEAB) graduate attributes are manifested and measured in the Faculty of Engineering’s curriculum. Instructors from the Departments of Biosystems, Civil, Electrical and Computer, and Mechanical Engineering were asked to consider the presence of four attributes and their subsequent indicators in one engineering course taught in the 2013-14 academic year. The attributes were: A Knowledge Base for Engineering, Individual and Team Work, Impact of Engineering on Society and the Environment, and Economics and Project Management. Data were gathered using a self-administered checklist, which was introduced to instructors in a workshop setting. The checklist has evolved over the three years in an effort to define student attribute competency levels and to create an assessment tool that meets the needs of both the researchers and the instructors, as we work together to examine the graduate attributes in our courses and implement an outcomes-based assessment protocol. The data from this third year give us the ability to report on how the remaining four CEAB graduate attributes are presently manifested and measured in our engineering faculty, to look for evidence of outcomes-based assessment, to evaluate the checklist as an assessment tool, and to reflect on the overall process.


2014 ◽  
Vol 79 (5) ◽  
pp. 798-807.e5 ◽  
Author(s):  
Catharine M. Walsh ◽  
Simon C. Ling ◽  
Nitin Khanna ◽  
Mary Anne Cooper ◽  
Samir C. Grover ◽  
...  

Author(s):  
Govind Gopakumar ◽  
Deborah Dusart-Gale ◽  
Ali Akgunduz

In 2009, the Canadian Engineering Accreditation Board (CEAB) announced its intention to require all undergraduate engineering programs in Canada to use twelve graduate attributes for assessing the capacities of their students. In response, engineering faculties across the country have been experimenting with processes that incorporate these graduate attributes as a means to stimulate program improvement and achieve curricular and program innovation. Many of the support resources (such as the inter-university collaboration EGAD) have focused largely on three directions: the definitional, programmatic, and information-management challenges faced by different engineering programs. Less attention has been given to identifying and addressing the leadership challenges faced by faculty administrators in piloting curricular and programmatic changes such as the CEAB graduate attributes. We argue that these challenges result from fundamental features of university educational culture: faculty members place great value upon autonomy in their workplace and likewise expect a high degree of intellectual independence in designing courses. The introduction of CEAB attributes, together with the mandated changes they will bring to course design, is perceived by faculty members as an external imposition. Such a perception, we suggest, breeds scepticism in the faculty about the efficacy of the change process, leading to disengagement from it. Thorough attention to these cultural factors affecting graduate attribute adoption is crucial to successful curriculum development. Describing these challenges in detail, this paper outlines some pathways that can circumvent these impediments to curricular innovation.

