Including Learning Analytics in the Loop of Self-Paced Online Course Learning Design

Author(s):  
Hongxin Yan ◽  
Fuhua Lin ◽  
Kinshuk

Online education is growing because of the benefits and advantages it offers students. Educational technologies (e.g., learning analytics, student modelling, and intelligent tutoring systems) bring great potential to online education. Many online courses, particularly in self-paced online learning (SPOL), face inherent barriers, such as limited learning awareness and a lack of timely academic intervention. These barriers can affect the academic performance of online learners. Recently, learning analytics has shown great potential for removing these barriers. However, it is challenging to realise the full potential of learning analytics within the traditional online course learning design model. Thus, focusing on SPOL, this study proposes that learning analytics be included in the course learning design loop to ensure data collection and pedagogical connection. We propose a novel learning design-analytics model in which course learning design and learning analytics support each other to increase learning success. Based on the proposed model, a set of online course design strategies is recommended for online educators who wish to use learning analytics to mitigate the learning barriers in SPOL. These strategies and technologies are inspired by Jim Greer's work on student modelling. Following these recommended design strategies, a computer science course is used as an example to show our initial practice of including learning analytics in the course learning design loop. Finally, future work on developing and evaluating learning analytics-enabled learning systems is outlined.
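As one way to picture the pedagogical connection the proposed model calls for, the following minimal sketch tags course activities with their designed pedagogical intent so that raw analytics events can be interpreted against the learning design. The schema, sample data, and function names are illustrative assumptions, not the authors' model.

```python
# Minimal sketch: tag course activities with their pedagogical intent so that
# logged learner events can be interpreted against the learning design.
# The schema below is an illustrative assumption, not the authors' model.
from dataclasses import dataclass

@dataclass
class Activity:
    activity_id: str
    pedagogical_intent: str  # what the designer expects this activity to achieve

activities = {
    "quiz_1": Activity("quiz_1", "check prerequisite knowledge before unit 2"),
    "forum_1": Activity("forum_1", "peer discussion of worked examples"),
}

def interpret(event):
    """Attach the designed intent to a raw analytics event."""
    act = activities.get(event["activity_id"])
    intent = act.pedagogical_intent if act else "unmapped activity"
    return {**event, "intent": intent}

print(interpret({"user": "u1", "activity_id": "quiz_1", "score": 0.4}))
```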

Author(s):  
Kevin P. Gosselin ◽  
Maria Northcote ◽  
Kristi D. Wuensche ◽  
Trudy Stoddard

Over the past few decades, substantial growth has occurred in online education in general, and this has been particularly true of the higher education sector. Most universities and post-secondary institutions now offer students the opportunity to enroll in online pre-tertiary, vocational, undergraduate and/or postgraduate courses. While some of these courses are successful for the learners who enroll in them, others have been found somewhat deficient, often criticized for their lack of humanization, interaction, communication and online presence. This chapter examines the role of the so-called soft skills of online course design and online teaching that are seen as vital for online educators who are responsible for the facilitation of high quality online learning. Along with a review of relevant literature about the soft skills of online teaching, the chapter presents three institutional case studies from which a set of practically-focused recommendations for promoting the design of humanized online learning environments has been developed.


2019 ◽  
Vol 2 (1) ◽  
pp. 23
Author(s):  
Sally Eberhard

“ETEC 565A: Understanding Learning Analytics” was a new course offered in the Master of Educational Technology programme at UBC in January 2019. To support students in exploring learning analytics in ways relevant to them, the final project allowed students to choose their own learning analytics adventure. This presentation is a showcase of, and a reflection on, learning from our final (group) project. Our group wanted to focus on learning design and learning analytics. There has been considerable interest in learning analytics in higher education; it has appeared in the EDUCAUSE Horizon Report for many years as a technology to adopt. We also know that learning technologies should support educational goals. Therefore, it is important for us to understand how one would combine learning design with learning analytics. Our instructor guided us to the work of Lockyer, Heathcote and Dawson (2013). Their article presented "learning design as a form of documentation of pedagogical intent that can provide the context for making sense of diverse sets of analytic data" (p. 1439). Lockyer et al. (2013) explored checkpoint and process analytics as broad categories of learning analytics and showed how this documentation of pedagogical intent, together with the related learning analytics that can be collected, could support pedagogical actions.

Our instructor gave us permission to use our course as an example to apply Lockyer et al.'s (2013) framework and conduct our analysis on the course. Our group also had access to some learning analytics data ourselves, through the "Threadz" tool for analysing our discussion forum activities. For all other types of data that we did not have access to, we commented on what the data could be used for and whether it would provide enough information to assess if the pedagogical intent was met. We also commented on potential data that could have better informed pedagogical actions but was either impossible, or too difficult and impractical, to obtain.

In the presentation, I will share some background on Lockyer et al.'s (2013) framework for aligning learning analytics with learning design, and show how one could use the framework to document one's own course design and identify potential learning analytics data sources, or the lack thereof, since the framework gives teachers and designers a tool to think, plan and reflect with. I will discuss some of our group's findings and reflections from the analysis of our own online course, and then discuss the potential of using such a framework in a more traditional face-to-face course.

As institutions and courses collect more data about their students, it is useful to have a framework that helps teachers think about how they might use learning analytics data to support their students by examining and documenting their pedagogical intents. It is also important to note what the existing data can and cannot do to support pedagogical goals.
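As an illustration of the kind of discussion-forum analysis a tool such as Threadz supports, here is a minimal sketch that builds a reply network from forum posts and reports simple engagement metrics. The (author, replied_to) record format, the sample data, and the use of the networkx library are illustrative assumptions, not the tool's actual implementation.

```python
# Minimal sketch: build a reply network from forum posts and report simple
# engagement metrics, similar in spirit to what a tool like Threadz visualises.
# The (author, replied_to) input format is an assumption for illustration.
import networkx as nx

replies = [
    ("alice", "bob"), ("bob", "alice"), ("carol", "alice"),
    ("dave", "carol"), ("alice", "dave"), ("bob", "carol"),
]

g = nx.DiGraph()
g.add_edges_from(replies)

# Degree centrality: who is most connected in the discussion?
centrality = nx.degree_centrality(g)
for student, score in sorted(centrality.items(), key=lambda kv: -kv[1]):
    print(f"{student}: centrality={score:.2f}")

# Density gives a coarse sense of how interactive the forum is overall.
print(f"network density: {nx.density(g):.2f}")
```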


2012 ◽  
Vol 16 (3) ◽  
Author(s):  
Laurie P Dringus

This essay presents a prospective stance on how learning analytics, as a core evaluative approach, must help instructors uncover important trends and evidence of quality learner data in the online course. A critique is presented of strategic and tactical issues of learning analytics. The critique is approached through the lens of questioning the current status of applying learning analytics to online courses. The goal of the discussion is twofold: (1) to inform online learning practitioners (e.g., instructors and administrators) of the potential of learning analytics in online courses and (2) to broaden discussion in the research community about the advancement of learning analytics in online learning. In recognizing the full potential of formalizing big data in online courses, the community must also address this issue in the context of the potentially "harmful" application of learning analytics.


Author(s):  
Marc R. Robinson

Student perceptions of online courses are likely influenced by two overarching aspects of quality: instructor quality and course design quality (Ortiz-Rodriguez, Telg, Irani, Roberts & Rhoades, 2005). Both of these forces in online education may be analyzed using a well-known model of instructional design: Gagné's instructional design and cognition theory, the centerpiece of which is the nine events of instruction (Gagné, Wager, Golas, & Keller, 2004). Multiple studies positively correlate learner attitudes and perceptions of the online course with instructor quality. Early studies evaluating instructor quality attempted to correlate instructor quality with the attitude and perception of the learner, but not directly with learner success or course design quality. Researchers of online courses, such as Palloff & Pratt (2003), discussed the role of the instructor in depth while neglecting the roles of the learner, the institution, and course design. The main focus remained instructor-centered and highlighted key instructor tasks such as understanding the virtual learner in terms of the roles the learner plays, fostering team roles for the learner, designing an effective course orientation, and identifying potential legal issues the instructor might face (Palloff & Pratt, 2002, p. 16). A distant secondary focus was on effective course design, which emphasized instructor tasks in building an effective online learning community without addressing the roles that effective communication tools would play.


2021 ◽  
Vol 18 (6) ◽  
Author(s):  
Krystyna Krzyszkowska ◽  
Maria Mavrommati

Education authorities in Norway endorse online courses for in-service teachers to raise education standards and to promote digital competence. Naturally, these offerings present teachers with opportunities to integrate new theoretical perspectives and their professional experience in an online learning community. Inquiry into one's professional practice, enhanced by critical reflection in a group of fellow professionals, is considered essential for a lifelong learning practitioner; however, the emerging examples of instructional design tend to prioritise content delivery rather than professional discourse. In this paper, we demonstrate how the Community of Inquiry (CoI) framework could be adopted to transform a learning design that prioritises the delivery of individual assignments into a more collaborative learning experience. Using the CoI instructional design principles and the associated questionnaire, we have investigated student perceptions of learning via an online course and formulated recommendations about how the course design can be refined to promote learning in the community. Despite the modest evidence, this investigation can serve as an example of how a concrete learning design can be improved based on this validated e-learning model.
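As an illustration of how responses to the CoI questionnaire mentioned above might be scored, here is a minimal sketch that averages 5-point Likert responses by presence subscale. The item-to-subscale mapping and the sample responses are illustrative assumptions, not the official survey key or the authors' data.

```python
# Minimal sketch: average 5-point Likert responses per CoI presence subscale.
# The item-to-subscale mapping below is illustrative, not the official survey key.
from statistics import mean

subscales = {
    "teaching_presence": ["q1", "q2", "q3"],
    "social_presence": ["q4", "q5", "q6"],
    "cognitive_presence": ["q7", "q8", "q9"],
}

# One respondent's answers on a 1-5 scale (hypothetical data).
responses = {"q1": 4, "q2": 5, "q3": 4, "q4": 3, "q5": 4,
             "q6": 3, "q7": 5, "q8": 4, "q9": 4}

for subscale, items in subscales.items():
    score = mean(responses[item] for item in items)
    print(f"{subscale}: {score:.2f}")
```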


Author(s):  
Teresa L. Coffman ◽  
Mary Beth Klinger

Online education is advancing the world over, and recent emphasis has focused on the quality of online learning and student outcomes. This chapter focuses on managing quality in online learning design through two different project management approaches at two different institutions of higher education. University X instituted a pilot program of faculty and instructional designers to initiate online course development at the university and to identify and define quality in the online course design process. College Y has had a successful cadre of online courses and programs and recently adopted a for-purchase quality initiative through Quality Matters. Courses are put through the Quality Matters evaluation process to determine strengths and weaknesses. Both institutions will continue to offer online education as an alternative to traditional classroom courses, and both will continue to monitor quality as a key indicator of student learning and online course success.


2020 ◽  
Vol 121 (5/6) ◽  
pp. 365-380
Author(s):  
Angela P. Murillo ◽  
Kyle M.L. Jones

Purpose
Quality Matters is one of the most widely regarded standards for online course design. Due to the COVID-19 pandemic, many instructors have needed to quickly convert face-to-face classes into an online environment. However, many instructors do not have online education expertise. Standards such as Quality Matters can help guide the creation of quality online course environments. This paper aims to provide a research-based and pragmatic approach for creating Quality Matters-informed online courses.

Design/methodology/approach
The Quality Matters Standards Rubric consists of eight General and 42 Specific Review Standards. Each standard was analyzed to determine the ease of implementation and the implementation approach for a Quality Matters-informed online course template.

Findings
Of the 42 Specific Review Standards, 16 (38%) are easily achievable, 20 (48%) are achievable but require some intervention, and six (14%) are difficult to achieve through a course template.

Practical implications
This study provides guidance for implementing Quality Matters-informed online course design. As many instructors without an instructional design or online education background now need to conduct online classes, Quality Matters provides structure and guidance to assist with creating high-quality learning environments. As receiving formal Quality Matters certification is time-consuming and requires peer review, this research provides guidance for creating Quality Matters-informed online courses in a timely manner.

Originality/value
This study is particularly timely due to the COVID-19 pandemic and will help prepare instructors for any second-wave scenarios. Furthermore, by providing guidance on Quality Matters-informed online course design, this paper gives instructors a greater chance of instructional success in online course delivery.
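As a small worked example of the Findings above, the following sketch tallies the 42 Specific Review Standards by how readily a course template can satisfy them and reproduces the reported percentages; the dictionary structure and category labels as written are assumptions for illustration.

```python
# Minimal sketch: tally Specific Review Standards by ease of implementation
# in a course template, reproducing the proportions reported in the findings.
counts = {
    "easily achievable": 16,
    "achievable with intervention": 20,
    "difficult to achieve via template": 6,
}

total = sum(counts.values())  # 42 Specific Review Standards
for category, n in counts.items():
    print(f"{category}: {n}/{total} ({n / total:.0%})")
```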


2021 ◽  
Vol 1 (1) ◽  
Author(s):  
Justi Echeles

Keeping diversity and inclusion in mind throughout the process of online course design and delivery can be daunting to instructors, course developers, and content creators. These concepts, along with access equity and legal compliance, can seem distant from the principal objective of content presentation and instruction. Recent public health circumstances prompted much of higher education's move to remote learning, revealing the need for quality online education that seeks to remove barriers and create challenging and engaging opportunities for all learners. This article presents research-based and established best practices and universal standards to help educators create accessible, usable, and inclusive online learning environments in a way that simplifies the process, meets rigorous standards, and improves the experience for all learners.


2019 ◽  
Vol 12 (21) ◽  
pp. 21 ◽  
Author(s):  
René Boyer Christiansen ◽  
Karsten Gynther ◽  
Rasmus Jørnø

This paper presents an approach to the meaningful use of learning analytics as a tool for teachers to improve the robustness of their learning designs. The approach is based on examining how participants act within a Massive Open Online Course (MOOC) format through learning analytics. We show that a teacher/designer can gain knowledge about his or her intended, implemented and attained learning design; about how MOOC participants act in response to these and about how students are able to develop ‘study efficiency’ when participating in a MOOC. The learning analytics approach makes it possible to follow certain MOOC students and their study behaviour (e.g. the participants who pass the MOOC by earning enough achievement badges) and to examine the role of the moderator in MOOCs, showing that scaffolding plays a central role in studying and learning processes in an educational format such as a MOOC. Key words: MOOCs, Massive Open Online Courses, data-saturated, learning analytics, learning design, educational design research, LMS.
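As an illustration of the kind of log analysis described above (following participants who pass the MOOC by earning enough achievement badges), here is a minimal sketch that filters badge earners from an event log and compares their activity volume with that of other participants. The event schema, pass threshold, and sample data are illustrative assumptions, not the authors' dataset.

```python
# Minimal sketch: identify MOOC participants who earn enough badges to pass
# and compare their activity volume with the rest. The event schema is assumed.
from collections import defaultdict
from statistics import mean

PASS_THRESHOLD = 5  # badges required to pass (illustrative)

events = [
    {"user": "u1", "type": "badge_earned"}, {"user": "u1", "type": "forum_post"},
    {"user": "u2", "type": "badge_earned"}, {"user": "u2", "type": "video_view"},
    # ... further log events would follow in real data
]

badges = defaultdict(int)
activity = defaultdict(int)
for e in events:
    activity[e["user"]] += 1
    if e["type"] == "badge_earned":
        badges[e["user"]] += 1

passers = {u for u, n in badges.items() if n >= PASS_THRESHOLD}
others = set(activity) - passers
if passers:
    print("mean activity of passers:", mean(activity[u] for u in passers))
if others:
    print("mean activity of others:", mean(activity[u] for u in others))
```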

