Scaling Course Design as a Learning Analytics Variable

2021 ◽  
pp. 73-93
Author(s):  
John Fritz ◽  
Thomas Penniston ◽  
Mike Sharkey ◽  
John Whitmer


2020 ◽  
Vol 2 (1) ◽  
pp. 42
Author(s):  
Steve Leichtweis

Universities are increasingly expected to ensure student success while at the same time delivering larger courses. Within this environment, providing effective and timely feedback to students and creating opportunities for genuine engagement between teachers and students is increasingly difficult, if not impossible, for many instructors, despite the known value and importance of feedback (Hattie & Timperley, 2007) and instructor presence (Garrison, Anderson & Archer, 2010). Like other tertiary institutions, the University of Auckland has adopted various technology-enhanced learning approaches and technologies, including learning analytics, in an attempt to support teaching and learning at scale. The increased use of educational technology to support learning provides a variety of data sources teachers can draw on to provide personalised feedback and improve the overall learning experience for students. This workshop is targeted at teachers interested in using learning data to provide personalised support to learners. Participants will have a hands-on opportunity to use the open-source tool OnTask (Pardo et al., 2018) within some common teaching scenarios, working with a synthetically generated data set. The facilitators will also share and discuss how OnTask is currently being used in universities to support the student experience, teaching practice, and course design. As this is a hands-on workshop, participants must bring a laptop computer to work with the online tool and the prepared scenarios.

References

Garrison, D. R., Anderson, T., & Archer, W. (2010). The first decade of the community of inquiry framework: A retrospective. The Internet and Higher Education, 13(1-2), 5-9.

Hattie, J., & Timperley, H. (2007). The power of feedback. Review of Educational Research, 77(1), 81-112.

Pardo, A., Bartimote-Aufflick, K., Shum, S. B., Dawson, S., Gao, J., Gašević, D., Leichtweis, S., Liu, D., Martínez-Maldonado, R., Mirriahi, N., & Moskal, A. C. M. (2018). OnTask: Delivering data-informed, personalized learning support actions. Journal of Learning Analytics, 5(3), 235-249.
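To give a flavour of the rule-based, data-driven personalised messaging this workshop explores, here is a minimal Python sketch. It does not use OnTask's actual API or data schema; the columns (name, logins, quiz_avg) and the thresholds are hypothetical, chosen only to show how conditions over learner data can select a feedback message per student.

```python
import pandas as pd

# Hypothetical engagement export: columns are assumptions, not OnTask's schema.
students = pd.DataFrame({
    "name": ["Aroha", "Ben", "Chen"],
    "logins": [14, 2, 8],           # LMS logins this term
    "quiz_avg": [0.82, 0.45, 0.67]  # mean formative quiz score
})

def feedback(row):
    """Pick a personalised message from simple if/else rules, mirroring
    the kind of condition-based rules an instructor might author."""
    if row.quiz_avg < 0.5:
        return (f"Hi {row['name']}, your quiz average is below 50%. "
                "Please review the recent materials and visit office hours.")
    if row.logins < 5:
        return (f"Hi {row['name']}, we've missed you online. "
                "Logging in regularly helps you keep pace with the course.")
    return f"Hi {row['name']}, great progress so far. Keep it up!"

students["message"] = students.apply(feedback, axis=1)
print(students[["name", "message"]].to_string(index=False))
```

Each student receives the first message whose condition matches, which is the core idea behind scaling personalised support to large cohorts.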


2019 ◽  
Vol 6 (2) ◽  
Author(s):  
Alyssa Friend Wise ◽  
Yeonji Jung

The process of using analytic data to inform instructional decision-making is acknowledged to be complex; however, details of how it occurs in authentic teaching contexts have not been fully unpacked. This study investigated five university instructors’ use of a learning analytics dashboard to inform their teaching. The existing literature was synthesized to create a template for inquiry that guided interviews, and inductive qualitative analysis was used to identify salient emergent themes in how instructors 1) asked questions, 2) interpreted data, 3) took action, and 4) checked impact. Findings showed that instructors did not always come to analytics use with specific questions, but rather with general areas of curiosity; questions additionally emerged and were refined through interaction with the analytics. Data interpretation involved two distinct activities, often accompanied by affective reactions to the data: reading data to identify noteworthy patterns and explaining their importance in the course using contextual knowledge. Pedagogical responses to the analytics included whole-class scaffolding, targeted scaffolding, and revising course design, as well as two new non-action responses: adopting a wait-and-see posture and engaging in deep reflection on pedagogy. Findings were synthesized into a model of instructor analytics use that offers useful categories of activities for future study and support.


Author(s):  
Kerry Wilkinson ◽  
Imogen McNamara ◽  
David Wilson ◽  
Karina Riggs

This case study describes the use of learning analytics to evaluate the transition of a postgraduate wine business course from face-to-face to online delivery using e-learning course design principles. Traditionally, Foundations of Wine Science lectures were delivered face-to-face; however, the decision to transition the course from semester to trimester format presented an opportunity for online delivery of lectures. This was initially achieved through audio recordings, then video lectures, supported by a range of digital learning resources intended to engage, support and enhance student learning and the student experience. Descriptive analysis of learning analytics, comprising assessment results, student evaluations of learning and teaching, and data sourced from the Learning Management System, was performed to evaluate the impact of online delivery of course content on student performance, satisfaction and engagement. The use of audio lecture recordings negatively impacted students’ perception of the overall quality of the course (including course organisation, learning strategies and learning resources). The subsequent implementation of e-learning designed video lectures was considered superior to audio recordings, although final grades did not differ significantly between delivery modes. However, student engagement was equal to, or better than, face-to-face delivery when content was designed specifically for an e-learning environment.
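A minimal sketch of the kind of descriptive comparison this case study reports, assuming a hypothetical gradebook with one final grade and a delivery-mode label per student (the column names, values, and Welch's t-test choice are illustrative assumptions; the study's actual data came from assessment results, evaluations, and LMS logs):

```python
import pandas as pd
from scipy import stats

# Hypothetical per-student records; real data came from the LMS and gradebook.
grades = pd.DataFrame({
    "mode":  ["face-to-face"] * 4 + ["online-video"] * 4,
    "final": [72, 65, 80, 70, 74, 68, 77, 71],
})

# Descriptive statistics per delivery mode.
print(grades.groupby("mode")["final"].describe())

# Independent-samples t-test: do final grades differ between modes?
f2f = grades.loc[grades["mode"] == "face-to-face", "final"]
online = grades.loc[grades["mode"] == "online-video", "final"]
t, p = stats.ttest_ind(f2f, online, equal_var=False)  # Welch's t-test
print(f"t = {t:.2f}, p = {p:.3f}")  # a large p suggests no significant difference
```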


2015 ◽  
Vol 1 (3) ◽  
pp. 1-3
Author(s):  
Negin Mirriahi ◽  
Shane Dawson ◽  
Dragan Gasevic ◽  
Philip D. Long

This issue of the Journal of Learning Analytics comprises two special issue sections. The first presents five papers from the 4th International Learning Analytics and Knowledge conference, held in Indianapolis. The second showcases the current or recent work of doctoral students who attended the 2nd Learning Analytics Summer Institute at Harvard University, Boston. The issue also includes two articles in the Hot Spots section, discussing the application of learning analytics initiatives in higher education institutions from different perspectives, from broad-scale initiatives to individual course design. The breadth and diversity of the articles in this issue demonstrate how the discipline has matured and moved towards understanding student learning to inform pedagogical practice and curricular redesign, coupled with strategies for the application and adoption of LA across institutions.


Author(s):  
Hongxin Yan ◽  
Fuhua Lin ◽  
Kinshuk

Online education is growing because of the benefits and advantages it offers students. Educational technologies (e.g., learning analytics, student modelling, and intelligent tutoring systems) bring great potential to online education. Many online courses, particularly in self-paced online learning (SPOL), face inherent barriers such as limited learning awareness and academic intervention. These barriers can affect the academic performance of online learners. Recently, learning analytics has been shown to have great potential for removing these barriers. However, it is challenging to achieve the full potential of learning analytics with the traditional online course learning design model. Thus, focusing on SPOL, this study proposes that learning analytics be included in the course learning design loop to ensure data collection and pedagogical connection. We propose a novel learning design-analytics model in which course learning design and learning analytics support each other to increase learning success. Based on the proposed model, a set of online course design strategies is recommended for online educators who wish to use learning analytics to mitigate the learning barriers in SPOL. These strategies and technologies are inspired by Jim Greer’s work on student modelling. Following these recommended design strategies, a computer science course is used as an example to show our initial practice of including learning analytics in the course learning design loop. Finally, future work on how to develop and evaluate learning analytics enabled learning systems is outlined.
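As a concrete, entirely hypothetical illustration of the learning-awareness barrier in self-paced courses, the sketch below flags learners whose completed-module count falls behind the pace implied by their own enrolment date; this is the sort of signal a course designed for analytics could collect and act on. The module count, pace, threshold, and names are my assumptions, not the study's.

```python
from datetime import date

# Hypothetical self-paced course: 12 modules, expected pace of ~1 module/week.
TOTAL_MODULES = 12
MODULES_PER_WEEK = 1.0

learners = [
    # (name, enrolment date, modules completed)
    ("Dana", date(2021, 1, 4), 9),
    ("Eli",  date(2021, 2, 1), 1),
]

def behind_pace(enrolled, completed, today=date(2021, 3, 1)):
    """Return how many modules a learner trails the expected pace by."""
    weeks_in = (today - enrolled).days / 7
    expected = min(TOTAL_MODULES, weeks_in * MODULES_PER_WEEK)
    return expected - completed

for name, enrolled, completed in learners:
    gap = behind_pace(enrolled, completed)
    if gap > 2:  # threshold for triggering an academic intervention
        print(f"{name}: {gap:.1f} modules behind pace -> send support nudge")
    else:
        print(f"{name}: on track")
```

Because each learner's expected progress is computed from their own start date, the same rule works even though the cohort has no shared schedule, which is the distinctive difficulty of SPOL.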


The Community of Inquiry (CoI) framework provides a three-fold, multi-faceted way to consider effectiveness within an online, digital, and/or blended course setting. A broader understanding of online learning as social and interactive (e.g., Anderson & Elloumi, 2004) provides a theoretical grounding for the CoI framework in both course design and research. This chapter also describes key ideas that will be discussed in later chapters, including an overview of the Community of Inquiry framework, an overview of big data, learning analytics, predictive analytics, computational linguistics, social network analysis, and other conceptual ideas that foster analysis of online learners in large course settings or across programs. The authors offer a current understanding of the overall extant literature on the CoI framework as it relates to these key ideas since its conception around the year 2000. Additional readings are provided.


2019 ◽  
Vol 2 (1) ◽  
pp. 23
Author(s):  
Sally Eberhard

“ETEC 565A: Understanding Learning Analytics” was a new course offered in the Master of Educational Technology programme at UBC in January 2019. To support students in exploring learning analytics in a way relevant to them, the final project allowed students to choose their own learning analytics adventure. This presentation will be a showcase of, and a reflection on, learning from our final (group) project. Our group wanted to focus on learning design and learning analytics. There has been a lot of interest in learning analytics in higher education; it has appeared in EDUCAUSE’s Horizon Report for many years as a technology to adopt. We also know that learning technologies should support educational goals. Therefore, it is important for us to understand how one would combine learning design with learning analytics. Our instructor guided us to the work of Lockyer, Heathcote, and Dawson (2013), whose article presented “learning design as a form of documentation of pedagogical intent that can provide the context for making sense of diverse sets of analytic data” (p. 1439). Lockyer et al. (2013) explored checkpoint and process analytics as broad categories of learning analytics, and how this documentation of pedagogical intent, together with the related learning analytics that can be collected, could support pedagogical actions.

Our instructor gave us permission to use our course as an example to apply Lockyer et al.’s (2013) framework and conduct our analysis on the course. Our group also had access to some learning analytics data ourselves, through the “Threadz” tool for analysing our discussion forum activities. For all other types of data we did not have access to, we commented on what the data could be used for, and whether it would provide enough information to assess if the pedagogical intent was met. We also noted potential data that could have been better for informing pedagogical actions but was either impossible to obtain or too difficult and impractical.

In the presentation, I will share some background on Lockyer et al.’s (2013) framework for aligning learning analytics with learning design, and how one could use the framework to document one’s own course design and identify potential learning analytics data sources (or their absence), as the framework provides teachers and designers with a tool to think, plan, and reflect. I will discuss some of our group’s findings and reflections from the analysis of our own online course, then discuss the potential of using such a framework in a more traditional face-to-face course.

As institutions and courses collect more data about their students, it is useful to have a framework that helps teachers think about how they might use learning analytics data to support their students by examining and documenting their pedagogical intents. It is also important to note what the existing data can and cannot do to support pedagogical goals.
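To give a flavour of the process analytics mentioned above (Threadz visualises discussion-forum interaction networks), here is a minimal sketch of one such signal. The reply data and the use of networkx are my own illustrative assumptions, not Threadz's implementation.

```python
import networkx as nx

# Hypothetical forum reply log: (author, replied_to) pairs.
replies = [
    ("Sam", "Lee"), ("Lee", "Sam"), ("Kim", "Sam"),
    ("Sam", "Kim"), ("Pat", "Lee"), ("Kim", "Lee"),
]

# Directed graph: an edge A -> B means A replied to B's post.
g = nx.DiGraph()
g.add_edges_from(replies)

# In-degree ~ how often a student's posts draw replies: a simple
# process-analytics signal of who anchors the discussion.
for student, indeg in sorted(g.in_degree(), key=lambda x: -x[1]):
    print(f"{student}: received {indeg} replies, sent {g.out_degree(student)}")
```

Comparing such interaction patterns against the documented pedagogical intent (e.g., "students should respond to peers, not only the instructor") is exactly the alignment Lockyer et al. (2013) propose.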


2008 ◽  
Vol 11 (2) ◽  
pp. 76-82 ◽  
Author(s):  
Sarah M. Ginsberg

This qualitative study examined student perceptions regarding a hybrid classroom format in which part of their learning took place in a traditional classroom and part occurred in an online platform. Pre-course and post-course anonymous essays suggest that students may be open to learning in this context; however, they have specific concerns as well. Students raised issues regarding faculty communication patterns, learning styles, and the value of clear connections between online and traditional learning experiences. Student concerns and feedback need to be addressed through the course design and by the instructor in order for students to have a positive learning experience in a hybrid format course.

