Analysis of a Feedback Assessment Loop in Engineering Sciences Core Curriculum

Author(s):  
Amitabha Ghosh

A formal two-loop learning outcomes assessment process has been evaluated in the mechanical engineering department at Rochester Institute of Technology. This initiative, originally called the Engineering Sciences Core Curriculum (ESCC), provided systematic course learning outcomes and assessment data on student performance in Statics, Mechanics, Dynamics, Thermodynamics, Fluid Mechanics and Heat Transfer. This paper reports detailed analyses and important observations of the Statics-Dynamics sequence to identify obstacles to student performance. New data shows that students’ mastery of Dynamics is affected largely by incorrect interpretations and weak retention of fundamentals in Statics. Further evidence of students’ behavioral influences is discussed, indicating that this area requires future focus. This report completes the five-year feedback loop designed to achieve the ESCC goals for the Statics-Dynamics sequence.

Author(s):  
Amitabha Ghosh

A two-loop learning outcomes assessment process was followed to evaluate the core curriculum in Mechanical Engineering at Rochester Institute of Technology. This initiative, originally called the Engineering Sciences Core Curriculum, provided systematic course learning outcomes and assessment data on examination performance in Statics, Mechanics, Dynamics, Thermodynamics, Fluid Mechanics and Heat Transfer. This paper reports longitudinal data and important observations on the Statics-Dynamics sequence to determine efficacy and obstacles in student performance. An earlier paper showed that students’ mastery of Dynamics is affected largely by weak retention of fundamentals of Statics and mathematics. New observations recorded in this report suggest the need for better instructional strategies to teach certain focal areas in Statics. Subsequently offered Dynamics and Fluid Mechanics classes further need reinforcement of some of these fundamental Statics topics. This report completes a nine-year broader feedback loop designed to achieve the educational goals in the Statics-Dynamics sequence.


Author(s):  
Edward Hensel
Amitabha Ghosh

A formal two-loop learning outcomes assessment process has been implemented in the mechanical engineering department at Rochester Institute of Technology. The outer loop establishes high-level outcomes and objectives for the program, while the inner loop provides assessment of achievement and feedback for improvement. Planning for the two-loop assessment process was initiated in academic year 2005–06 with the establishment of four faculty workgroups, with each group assigned responsibility for conducting outcomes assessment on a subset of the ME core curriculum. The engineering science core curriculum inner loop assessment process was initiated in AY2006–07 and continues today. Results of a three-year longitudinal study of the engineering science course learning outcomes assessment and the details of the assessment and continuous improvement process are described herein. The three-year study period encompassed 83 sections of five courses, with a cumulative student enrollment of 2,853 individuals. Sample data from the longitudinal study of the fluid mechanics course (reviewing 19 sections with a cumulative enrollment of 619 students) are presented to illustrate improvement of student learning through refinement of the course delivery. A discussion of the assessment process, lessons learned, and conclusions are presented.
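The per-course longitudinal summary described above can be illustrated with a minimal sketch. The section labels, enrollments, and attainment fractions below are invented placeholders rather than the RIT ESCC data; the sketch only shows one plausible way of tabulating enrollment-weighted outcome attainment for a course across academic years.

```python
# Minimal sketch of aggregating course-learning-outcome attainment across
# sections of one course over several academic years. All section labels,
# enrollments, and attainment fractions are invented for illustration.
from collections import defaultdict

# Each record: (academic_year, section_id, enrollment,
#               fraction of enrolled students meeting the outcome benchmark)
sections = [
    ("AY2006-07", "FLUIDS-01", 34, 0.62),
    ("AY2006-07", "FLUIDS-02", 31, 0.58),
    ("AY2007-08", "FLUIDS-01", 33, 0.67),
    ("AY2008-09", "FLUIDS-01", 35, 0.71),
]

def yearly_attainment(records):
    """Return enrollment-weighted attainment per academic year."""
    totals = defaultdict(lambda: [0, 0.0])  # year -> [students, weighted attainment]
    for year, _section, enrolled, attained in records:
        totals[year][0] += enrolled
        totals[year][1] += enrolled * attained
    return {year: weighted / students for year, (students, weighted) in totals.items()}

for year, rate in sorted(yearly_attainment(sections).items()):
    print(f"{year}: {rate:.1%} of enrolled students met the benchmark")
```

A table of such year-by-year rates is one simple way a workgroup could watch for the kind of improvement in course delivery that the fluid mechanics sample data is meant to show.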


Author(s):  
Amitabha Ghosh

This paper highlights some important obstacles in student test performance resulting from different forms of testing procedures in Statics and Dynamics. A group approach dictates the core pedagogy in these classes, which are components of the Engineering Sciences Core Curriculum (ESCC) at Rochester Institute of Technology (RIT). Our observations indicate that the difficulties start before the engineering sciences, owing to incomplete understanding of mathematics and physics. While the human aspects of this assessment may not be revealed on tests, insights from long hours of counseling sessions between students, faculty, and academic advisors have now been embedded in the design of our program. Yet in spite of our streamlined processes for improved delivery and testing, many good students demonstrate superior test scores on essay-type questions but poor understanding of concepts, as revealed by the analysis of Multiple Choice (MC) responses. This performance gap has been traced to a narrow focus and a lack of retention of prior concepts in active memory. The paper discusses these topics using a select set of multiple choice questions administered on Statics and Dynamics examinations and offers remedial actions, including a proposal for a new course.
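The kind of multiple-choice analysis mentioned above is commonly summarized with item difficulty and discrimination statistics. The sketch below is a generic illustration of those two measures, not the procedure used in the paper; the 0/1 response matrix is invented.

```python
# Generic sketch of multiple-choice item analysis: difficulty (proportion
# correct) and point-biserial discrimination per item. The response matrix
# (rows = students, columns = items, 1 = correct) is invented.
import statistics

responses = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [0, 1, 1, 1],
    [1, 1, 1, 0],
    [0, 0, 0, 1],
]

def item_statistics(matrix):
    n_items = len(matrix[0])
    totals = [sum(row) for row in matrix]      # each student's total score
    mean_total = statistics.mean(totals)
    sd_total = statistics.pstdev(totals)
    stats = []
    for j in range(n_items):
        col = [row[j] for row in matrix]
        p = statistics.mean(col)               # difficulty: proportion correct
        # point-biserial: correlation between item score and total score
        mean_correct = statistics.mean(t for t, c in zip(totals, col) if c) if any(col) else 0.0
        r_pb = ((mean_correct - mean_total) / sd_total) * (p / (1 - p)) ** 0.5 if 0 < p < 1 and sd_total else 0.0
        stats.append((p, r_pb))
    return stats

for j, (p, r_pb) in enumerate(item_statistics(responses), start=1):
    print(f"item {j}: difficulty={p:.2f}, point-biserial={r_pb:.2f}")
```

Items with low difficulty and weak discrimination can flag concepts on which otherwise strong students stumble, which is the pattern the abstract describes for essay versus MC performance.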


2018
Author(s):  
Andrea A. Curcio

Law school institutional learning outcomes require measuring nuanced skills that develop over time. Rather than look at achievement just in our own courses, institutional outcome-measures assessment requires collective faculty engagement and critical thinking about our students’ overall acquisition of the skills, knowledge, and qualities that ensure they graduate with the competencies necessary to begin life as professionals. Even for those who believe outcomes assessment is a positive move in legal education, in an era of limited budgets and already over-burdened faculty, the new mandated outcomes assessment process raises cost and workload concerns. This essay addresses those concerns. It describes a relatively simple, low-cost model to measure institutional law school learning outcomes that does not require any initial changes in individual faculty members’ pedagogical approach or assessment methods. It explains how a rubric method, used by the Association of American Colleges and Universities [AAC&U] and medical educators to assess a wide range of nuanced skills such as critical thinking and analysis, written and oral communication, problem-solving, intercultural competence, teamwork, and self-reflection, could be adapted by law schools. The essay explains a five-step institutional outcomes assessment process:

1. Develop rubrics for institutional learning outcomes that can be assessed in law school courses;
2. Identify courses that will use the rubrics;
3. Ask faculty in designated courses to assess and grade as they usually do, adding only one more step – completion of a short rubric for each student;
4. Enter the rubric data; and
5. Analyze and use the data to improve student learning.

The essay appendix provides sample rubrics for a wide range of law school institutional learning outcomes. This outcomes assessment method provides an option for collecting data on institutional learning outcomes assessment in a cost-effective manner, allowing faculties to gather data that provides an overview of student learning across a wide range of learning outcomes. How faculties use that data depends upon the results as well as individual schools’ commitment to using the outcomes assessment process to help ensure their graduates have the knowledge, skills and values necessary to practice law.

Citation: Andrea A. Curcio, A Simple Low-Cost Institutional Learning-Outcomes Assessment Process, 67 J. Legal Educ. 489 (2018).
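Steps 4 and 5 above (entering and analyzing the rubric data) could be as lightweight as the sketch below. The outcome names, course titles, 4-point scale, and benchmark level are invented for illustration and are not drawn from the essay or the AAC&U rubrics.

```python
# Illustrative sketch of steps 4-5: record per-student rubric ratings and
# summarize attainment per institutional learning outcome. The outcomes,
# courses, 1-4 scale, and benchmark below are invented placeholders.
from collections import defaultdict

BENCHMARK = 3  # rating at or above which a student counts as proficient

# Each record: (outcome, course, student_id, rubric rating on a 1-4 scale)
ratings = [
    ("written communication", "Legal Writing II", "s01", 4),
    ("written communication", "Legal Writing II", "s02", 2),
    ("problem solving",       "Evidence",         "s01", 3),
    ("problem solving",       "Evidence",         "s03", 3),
    ("self-reflection",       "Clinic",           "s02", 1),
]

def summarize(records, benchmark=BENCHMARK):
    """Share of assessed students at or above the benchmark, per outcome."""
    counts = defaultdict(lambda: [0, 0])  # outcome -> [assessed, proficient]
    for outcome, _course, _student, rating in records:
        counts[outcome][0] += 1
        counts[outcome][1] += int(rating >= benchmark)
    return {o: proficient / assessed for o, (assessed, proficient) in counts.items()}

for outcome, share in summarize(ratings).items():
    print(f"{outcome}: {share:.0%} at or above benchmark")
```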


2021
Author(s):  
Walid Ibrahim
Hoda Amer

Learning Outcomes Assessment (LOA) provides educators with a practical instrument to review and enhance the alignment between the planned, delivered, and experienced curriculum. Effective LOA processes help educators decide on the proper actions to take and the strategies to implement in order to ensure the continuous improvement of the student learning experience and the attainment of the intended learning outcomes. Nonetheless, the adoption of LOA in higher education is still lagging, and the assessment loop is rarely closed. This is mainly due to poor implementation of the assessment processes and vague definition of responsibilities and quality assurance measures. This paper introduces a committee infrastructure to foster accountability and responsibility and to assure the quality of the implemented assessment processes. The infrastructure has been established successfully over the last few years, and a noticeable improvement in the execution of the assessment process has been observed.


Author(s):  
Amitabha Ghosh

Dynamics is a pivotal class in a student’s life-long learning profile since it builds upon the logical extensions of the Statics and Strength of Materials classes and provides a framework on which Fluid Mechanics concepts may be developed for deformable media. This paper establishes the contextual reference of Dynamics in this framework. An earlier paper by the author discussed the details of how the design of proper multiple choice questions is critical for assessment in Statics and Fluid Mechanics. This paper provides a progress report of such evaluations in Dynamics. In addition, it explores the pedagogical issues related to building a student’s learning profile. While comparing test results obtained in trailer sections of Dynamics with those obtained in sections taught by faculty teams, some structural differences were discovered. This report completes the feedback loop used by faculty in our Engineering Sciences Core Curriculum for improving student performance over time. The process may be developed further by using some of the similarities and differences in the performance data.
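The comparison of trailer sections with team-taught sections could, for example, be made with a simple two-sample test on common exam scores. The sketch below is generic, with invented score lists rather than the course data reported here; Welch's t-test from scipy is used because the two groups need not share a variance.

```python
# Generic sketch of comparing common-exam scores between a trailer section
# and team-taught sections of Dynamics. The score lists are invented.
from statistics import mean, stdev
from scipy.stats import ttest_ind

trailer_scores = [62, 71, 55, 68, 74, 60, 66, 59]
team_taught_scores = [70, 78, 65, 73, 81, 69, 75, 72]

# Welch's t-test: does not assume the two sections have equal variance.
t_stat, p_value = ttest_ind(team_taught_scores, trailer_scores, equal_var=False)

print(f"trailer mean = {mean(trailer_scores):.1f} (sd {stdev(trailer_scores):.1f})")
print(f"team-taught mean = {mean(team_taught_scores):.1f} (sd {stdev(team_taught_scores):.1f})")
print(f"Welch t = {t_stat:.2f}, p = {p_value:.3f}")
```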


Author(s):  
Sami Ainane
Chandrasekhar Thamire

In 2005, the undergraduate program offered by the Department of Mechanical Engineering at the University of Maryland, College Park, will be undergoing the ABET accreditation review process. In view of the recent changes in the EAC criteria, the Department is currently implementing a reformed assessment program. Direct assessment practices are now being utilized to assess the outcomes, along with the other assessment tools and methodologies used during the previous years. As part of this process, individual courses in the curriculum are identified to target selected learning outcomes, related student work is collected and examined by individual faculty and faculty committees, and the results are used to evaluate the outcomes and identify deficiencies. In this paper we present the outcomes assessment process developed for this purpose, which includes a description of the direct measures of student achievement in engineering courses and the traditional tools such as course and program evaluation surveys and inputs from various constituencies and committees. Specific student work targeted to achieve different learning outcomes is identified for selected outcomes.


2020
Vol 2 (2)
pp. 67-87
Author(s):  
Wesley Shumar

The article addresses the practice of program and learning outcomes assessment adopted by many American universities. Arguing that the justification of administrative intervention into faculty’s teaching is based upon the separation of the content of a course or program from the form of that material, the article demonstrates that the form/content distinction is a false opposition. Further, the article demonstrates that administrative efforts to assess learning outcomes, based on the idea that they will not affect the content, follow an older and outmoded transmission theory of learning. To impose outcomes assessment upon faculty is also to impose the transmission theory of learning, which would constitute a threat to academic freedom. The article concludes with a call for full faculty participation in the assessment process, including the design of assessments. Only when faculty are at the forefront of the process can academic freedom be preserved.

