Assessing Student Engagement in Online Programmes: Using Learning Design and Learning Analytics

2019 ◽  
Vol 8 (6) ◽  
pp. 171 ◽  
Author(s):  
Maria Toro-Troconis ◽  
Jesse Alexander ◽  
Manuel Frutos-Perez

This paper presents the learning design framework used in the design of the Online MA in Photography at Falmouth University. It discusses the importance of evaluating the success of online learning programmes by analysing learning analytics and student feedback within the overall pedagogic context and design of the programme. Linear regression analysis was used to analyse the engagement of three cohorts of students who completed four modules of the Online MA Photography (n=33), with over 80,000 entries in the dataset. The research explored student engagement with online content that promoted low-order cognitive skills (i.e. watching videos, reading materials and listening to podcasts) as well as high-order cognitive skills (i.e. participating in online forums and webinars). The results suggest there is weak evidence of an association between average overall mark in all modules and the level of engagement with self-directed content (P = 0.0187). There is also weak evidence of an association between average overall mark in all modules and the level of engagement in collaborative activities (P < 0.0528). Three major themes emerged from the focus group: 1) weekly forums and webinars, 2) self-directed learning materials and 3) learning design and support. Online learning was acceptable and convenient for postgraduate students. These findings are discussed further in the paper as potential predictors of student performance in online programmes.
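For illustration only, the kind of analysis described above can be sketched in a few lines of Python: regressing each student's average overall mark on summary counts of self-directed and collaborative engagement. The CSV file and column names below are hypothetical placeholders, not the study's dataset.

```python
# Minimal sketch (not the study's actual pipeline): ordinary least squares
# relating average overall mark to two per-student engagement summaries.
import pandas as pd
import statsmodels.api as sm

# Hypothetical per-student summary table: one row per student.
df = pd.read_csv("engagement_by_student.csv")

X = sm.add_constant(df[["self_directed_views", "collaborative_posts"]])
y = df["average_overall_mark"]

model = sm.OLS(y, X).fit()
print(model.summary())  # coefficients and p-values for each engagement measure
```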

Author(s):  
Yan Cong ◽  
Kerry Earl

The findings presented explore Chinese cultural influences, the aspects of instructional design that supported learning and achievement, and the influence of the culture in which the students were learning. Lessons for teaching staff, learning design staff and others involved in online learning for students of other cultures are outlined.


2019 ◽  
Vol 16 ◽  
Author(s):  
Airina Volungevičienė ◽  
Josep Maria Duart ◽  
Justina Naujokaitienė ◽  
Giedrė Tamoliūnė ◽  
Rita Misiulienė

The research analyses how learning analytics, as a metacognitive tool, can be used as a method by teachers as reflective professionals, and how it can help teachers think through and reach decisions about learning design and curriculum, the learning and teaching process, and its success. It builds on previous research results by interpreting learning analytics as a metacognitive tool for teachers as reflective professionals, and it also lays out new prospects for investigating how learning analytics are applied in open and online learning and teaching. The research draws on learning analytics data to implement the teacher inquiry cycle and to support reflection on open and online teaching, ultimately aiming to improve curriculum and learning design. The results demonstrate how the learning analytics method can support teachers as reflective professionals: helping them understand their students' different learning habits, recognise learners' behaviour, assess their thinking capacities and willingness to engage in the course and, based on this information, make real-time adjustments to their course curriculum.


2018 ◽  
Vol 30 (2) ◽  
Author(s):  
Ronald George Leppan ◽  
Reinhardt A Botha ◽  
Johan F Van Niekerk

Higher education institutions seem to have a haphazard approach to harnessing the ubiquitous data that learners generate on online educational platforms, despite the promising opportunities offered by this data. Several learning analytics process models have been proposed to optimise the learning environment based on this learner data. The model proposed in this paper addresses deficiencies in existing learning analytics models, which frequently emphasise only the technical aspects of data collection, analysis and intervention, yet remain silent on the ethical issues inherent in collecting and analysing student data and on pedagogy-based approaches to intervention. The proposed model describes how differentiated instruction can be provided based on a dynamic learner profile built through an ethical learning analytics process. Differentiated instruction optimises online learning by recommending learning objects tailored to the learner attributes stored in a learner profile. The proposed model provides a systematic and comprehensive abstraction of a differentiated learning design process informed by learning analytics. The model emerged by synthesising the steps of a tried-and-tested web analytics process with educational theory, an ethical learning analytics code of practice, principles of adaptive education systems and a layered abstraction of online learning design.
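As a purely illustrative sketch of the differentiated-instruction step (recommending learning objects tailored to attributes stored in a learner profile), the following Python fragment ranks a catalogue of learning objects against a simple profile. The profile fields, scoring rule and example data are assumptions for illustration, not the authors' specification.

```python
# Hypothetical learner profile and learning-object recommendation sketch.
from dataclasses import dataclass, field

@dataclass
class LearnerProfile:
    learner_id: str
    preferred_format: str                        # e.g. "video", "text", "interactive"
    mastery: dict = field(default_factory=dict)  # topic -> estimated mastery in [0, 1]

@dataclass
class LearningObject:
    title: str
    topic: str
    fmt: str           # delivery format of the object
    difficulty: float  # 0 (introductory) .. 1 (advanced)

def recommend(profile: LearnerProfile, catalogue: list[LearningObject], k: int = 3) -> list[LearningObject]:
    """Rank learning objects by topic need, difficulty fit and format preference."""
    def score(obj: LearningObject) -> float:
        mastery = profile.mastery.get(obj.topic, 0.0)
        need = 1.0 - mastery                       # prioritise weaker topics
        fit = 1.0 - abs(obj.difficulty - mastery)  # match difficulty to current level
        pref = 0.2 if obj.fmt == profile.preferred_format else 0.0
        return need + fit + pref
    return sorted(catalogue, key=score, reverse=True)[:k]

# Example: a learner weak on "statistics" is offered easy statistics objects first.
profile = LearnerProfile("s001", "video", {"statistics": 0.2, "research methods": 0.8})
catalogue = [
    LearningObject("Descriptive statistics basics", "statistics", "video", 0.2),
    LearningObject("Multilevel modelling", "statistics", "text", 0.9),
    LearningObject("Designing a survey", "research methods", "video", 0.3),
]
print([obj.title for obj in recommend(profile, catalogue)])
```

A fuller implementation would also need to reflect the ethical code of practice the authors describe, for example being transparent about which profile attributes drive each recommendation.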


Author(s):  
Karlis Krumins ◽  
Sarma Cakula

INTRODUCTION: Student performance prediction has become a viable means of improving academic performance and course content in online learning. Predictive models such as neural networks, decision trees and linear regression are used to transform inputs (e.g. past performance, social background, learning system usage patterns, test results) into outputs (course completion, expected grade, difficulties encountered, personalized suggestions). Often the existing quantitative data drive model design, especially when such models are applied to the conventional classroom, and the person delivering the course is a passive participant in designing models and supplying data. By seeking to capture and code as much of student behaviour and environment as possible so that learning analytics can be applied to a mostly conventional classroom, the most successful inputs (predictors) among existing models can be identified and categorised, and their common characteristics determined. Together with a study of formative and summative assessment methods (e.g. types of feedback and how they can be captured) and of factors affecting student performance in the classroom (e.g. environmental factors), this makes it possible to identify data that exist in classrooms but are not captured by current learning management systems, thereby allowing the expanded use of learning analytics and student performance prediction in traditional classrooms, with a focus on personalized suggestions. The goal of the paper is to identify patterns among the inputs used in existing models of student learning (based on online learning and learning management system data mining) that can also be applied to the traditional classroom. Research question: how can characteristics common to effective predictors of student performance be used to identify predictors among data produced in the traditional classroom?
MATERIAL AND METHODS: A literature review is performed in which the inputs captured and the features discovered in existing learning analytics systems are characterised, along with the methods used to identify them and the modelling approaches employed. An attempt is made to identify measures in online learning that may have analogues in the traditional classroom (e.g. seating patterns and communication in chatrooms) or for which proxies may be found (e.g. screen size and lighting quality, where the proxy is the classroom number). The corresponding outputs are recorded where possible, with a focus on those that allow feedback to be provided to individual students or to course/curriculum deliverers/designers (i.e. that help improve the success of future students in the course).
RESULTS: Successful predictors, and the characteristics common to them, are identified so that they can be used in feature engineering for student performance prediction models. Predictors used in online learning are categorised so that analogous inputs can be developed for use in traditional classrooms. The types of feedback provided by existing models of learning are identified where possible, along with the corresponding inputs (and their weights). Studies are identified in which learning personnel, not the researcher, were able to drive the model development process.
DISCUSSION: Recently there has been increasing focus on improving visibility into models of learning and on involving learning personnel in designing, modifying and running those models. Providing inputs and recognizing the features they represent determines the success of such models. Therefore, recognizing existing successes and applying them to formative assessment methods may be a means of identifying additional inputs to, and features used in, such models while involving educators. Applying learning models to the traditional classroom as an integrated part of the learning management (school record-keeping/grading) systems may allow their use to be expanded while simultaneously increasing the predictive power and the effectiveness of (personalized) suggestions, both by using existing data and by providing tools for educators to transform the feedback they already provide into data that can be used as model inputs.
CONCLUSION: Predictors used in learning models in online learning can be applied to the traditional classroom. Analogues may be found for predictors that are not available in the conventional classroom. Common characteristics and categorisation of predictors may be used to identify predictors among existing data, including data provided by students (e.g. formative feedback) that are not captured by the learning management systems currently in use.
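To make the input-to-output mapping concrete, here is a hedged sketch of the kind of model the abstract refers to: engineering a few features from records a traditional classroom already produces and fitting a simple predictor of the final grade. The file name, column names and model choice are illustrative assumptions, not the authors' design.

```python
# Illustrative sketch: classroom-derived features feeding a simple grade predictor.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor

# Hypothetical export of records a conventional classroom already keeps.
records = pd.read_csv("classroom_records.csv")

# Feature engineering from data not tied to an online learning management system.
features = pd.DataFrame({
    "past_grade_mean": records["previous_grade_mean"],                        # past performance
    "attendance_rate": records["sessions_attended"] / records["sessions_total"],
    "formative_feedback_count": records["feedback_entries"],                  # captured teacher feedback
})
target = records["final_grade"]

X_train, X_test, y_train, y_test = train_test_split(
    features, target, test_size=0.25, random_state=0
)
model = DecisionTreeRegressor(max_depth=4, random_state=0).fit(X_train, y_train)
print("Held-out R^2:", model.score(X_test, y_test))
```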


2021 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Chris Kossen ◽  
Chia-Yi Ooi

Purpose: This paper reports on how micro-learning design principles are being trialled in an Australian and a Malaysian university to make online courses more accessible, more attractive and a more positive experience, with the aim of increasing student success. Central to this approach is segmenting materials into "bite-size" instalments by way of short micro-lecture presentations and reducing other content. The aim of this "less is more" strategy is to reduce unnecessary cognitive load as an impediment to learning so that focus can shift to prioritising the most essential skills and content. The purpose of this trial is to explore the efficacy of micro-learning as a means of increasing student engagement and learning.
Design/methodology/approach: The trials involved a mixed-mode methodology drawing on qualitative and ratings data from course satisfaction surveys and records of grades and completion.
Findings: To date, results have shown significant increases in student engagement and satisfaction, and also in performance. Our application of micro-learning included reducing the volume of content based on its practical value, using novelty (e.g. infusing guest presenter input) and designing practical and collaborative student activities.
Research limitations/implications: Early results are encouraging regarding apparent utility for engaging learners and ease of application, i.e. implementability and transference potential. However, the rapidly expanding area of online learning requires further research to establish a well-validated evidence base for effective online teaching practices.
Practical implications: The findings are relevant to universities involved in online and blended learning. Micro-learning design methods show promise in being able to address major engagement barriers, including cognitive overload.
Social implications: More students are struggling with learning in today's social environment brought about by the massification of higher education. Micro-learning seeks to address major barriers these learners face with methods that go beyond traditional teaching practices.
Originality/value: The findings here are encouraging and contribute to existing understanding of ways to increase learner engagement in the competitive and fast-growing area of online learning for universities globally.




2021 ◽  
Vol 16 (23) ◽  
pp. 140-157
Author(s):  
Iman Rashid Al-Kindi ◽  
Zuhoor Al-Khanjari

Our motivation in this paper is to predict student Engagement (E), Behavior (B), Personality (P) and Performance (P) by designing a Tracking Student Performance Tool (TSPT) that obtains data directly from the Moodle logs of any selected course. The proposed tool follows the predictive EBP model, which focuses mainly on students' EBP and Performance, and the instructor can use it to monitor the overall performance of his/her students during the course. The results of testing the tool show that it produces the same results as manual analysis. Analyzing the Moodle log of any course using such a tool is expected to help with the implementation of similar courses and to help the instructor redesign the course in a way that is more beneficial to the students. This paper sheds light on the importance of studying students' EBPP and provides interesting possibilities for improving student performance, with a specific focus on designing online learning environments or contexts.
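As a rough illustration of the kind of Moodle-log processing such a tool performs, the sketch below reads a standard Moodle activity log export and summarises per-student engagement. The column names follow a typical Moodle log export, but the aggregation and the low-engagement rule are assumptions for illustration, not the authors' EBPP model.

```python
# Minimal sketch: summarise per-student engagement from an exported Moodle log.
import pandas as pd

log = pd.read_csv("moodle_course_log.csv")  # hypothetical export of the course "Logs" report

# Count events per student and per event type (course viewed, submission made, forum post, ...).
engagement = (
    log.groupby(["User full name", "Event name"])
       .size()
       .unstack(fill_value=0)
)
engagement["total_events"] = engagement.sum(axis=1)

# Simple flag for low engagement relative to the cohort median (an assumed threshold).
engagement["low_engagement"] = engagement["total_events"] < engagement["total_events"].median()
print(engagement.sort_values("total_events", ascending=False).head())
```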


2012 ◽  
Vol 16 (3) ◽  
Author(s):  
Laurie P Dringus

This essay presents a prospective stance on how learning analytics, as a core evaluative approach, must help instructors uncover the important trends and evidence of quality learner data in the online course. A critique is presented of strategic and tactical issues in learning analytics, approached through the lens of questioning the current status of applying learning analytics to online courses. The goal of the discussion is twofold: (1) to inform online learning practitioners (e.g. instructors and administrators) of the potential of learning analytics in online courses and (2) to broaden discussion in the research community about the advancement of learning analytics in online learning. In recognizing the full potential of formalizing big data in online courses, the community must also address this issue in the context of the potentially "harmful" application of learning analytics.

