Learning Design and Learning Analytics: Snapshot 2020

2020 ◽  
Vol 7 (3) ◽  
pp. 6-12
Author(s):  
Leah P. Macfadyen ◽  
Lori Lockyer ◽  
Bart Rienties

“Learning design” belongs to that interesting class of concepts that appear on the surface to be simple and self-explanatory, but which are actually definitionally vague and contested in practice. Like “learning analytics,” the field of learning design aspires to improve teaching practice, the learning experience, and learning outcomes. And like learning analytics, this interdisciplinary field also lacks a shared language, common vocabulary, or agreement over its definition and purpose, resulting in uncertainty even about who its practitioners are — Educators? Designers? Researchers? All of these? (Law, Li, Farias Herrera, Chan & Pong, 2017). Almost a decade ago, however, learning analytics researchers pointed to the rich potential for synergies between learning analytics and learning design (Lockyer & Dawson, 2011). These authors (and others since, as cited below) argued that effective alignment of learning analytics and learning design would benefit both fields, and would offer educators and investigators the evidence they need that their efforts and innovations in learning design are “worth it” in terms of improving teaching practice and learning: "The integration of research related to both learning design and learning analytics provides the necessary contextual overlay to better understand observed student behavior and provide the necessary pedagogical recommendations where learning behavior deviates from pedagogical intention" (Lockyer & Dawson, 2011, p. 155).

Author(s):  
Yizhou Fan ◽  
Wannisa Matcha ◽  
Nora’ayu Ahmad Uzir ◽  
Qiong Wang ◽  
Dragan Gašević

The importance of learning design in education is widely acknowledged in the literature. Previous studies have shown that learners need strong skills for self-regulated learning (SRL) if they are to make effective use of the opportunities provided in a learning design, especially in online environments. The literature reporting on the use of learning analytics (LA) shows that SRL skills are best exhibited in choices of learning tactics that reflect metacognitive control and monitoring. However, despite its high significance for the evaluation of learning experience, the link between learning design and learning tactics has been under-explored. To fill this gap, this paper proposes a novel learning analytic method that combines three data analytic techniques: cluster analysis, process mining and epistemic network analysis. The proposed method was applied to a dataset collected in a massive open online course (MOOC) on teaching in flipped classrooms, offered on a Chinese MOOC platform to pre- and in-service teachers. The method detected four learning tactics (Search oriented, Content and assessment oriented, Content oriented and Assessment oriented) used by MOOC learners. The analysis of tactic usage across learning sessions revealed that learners from different performance groups had different priorities. The study also showed that learning tactics shaped by instructional cues were embedded in different units of study in the MOOC. Learners from the high-performance group showed a high level of regulation through strong alignment of their choices of learning tactics with the tasks provided in the learning design. The paper concludes with a discussion of implications for research and practice.
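The abstract names the three techniques but not their implementations. As a minimal, hypothetical sketch of the process-mining ingredient, one can estimate first-order transition probabilities between logged learning actions; the action names and traces below are invented for illustration:

```python
from collections import Counter, defaultdict

def transition_probabilities(sessions):
    """Estimate first-order transition probabilities between learning
    actions - a basic primitive behind process-mining analyses of
    learning tactics."""
    counts = defaultdict(Counter)
    for session in sessions:
        # Count each observed action-to-action transition.
        for a, b in zip(session, session[1:]):
            counts[a][b] += 1
    probs = {}
    for a, followers in counts.items():
        total = sum(followers.values())
        probs[a] = {b: n / total for b, n in followers.items()}
    return probs

# Hypothetical MOOC traces: each session is a sequence of logged actions.
sessions = [
    ["video", "reading", "quiz"],
    ["video", "quiz", "quiz"],
    ["reading", "quiz"],
]
probs = transition_probabilities(sessions)
```

Sessions could then be clustered by their transition profiles to surface recurring tactics, in the spirit of the combined method the paper proposes.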


2020 ◽  
Vol 2 (1) ◽  
pp. 42
Author(s):  
Steve Leichtweis

Universities are increasingly expected to ensure student success while delivering ever-larger courses. Within this environment, providing effective and timely feedback to students and creating opportunities for genuine engagement between teachers and students is increasingly difficult, if not impossible, for many instructors, despite the known value and importance of feedback (Hattie & Timperley, 2007) and instructor presence (Garrison, Anderson & Archer, 2010). Like other tertiary institutions, the University of Auckland has adopted various technology-enhanced learning approaches and technologies, including learning analytics, in an attempt to support teaching and learning at scale. The increased use of educational technology to support learning provides a variety of data sources that teachers can use to provide personalised feedback and improve the overall learning experience for students. This workshop is targeted at teachers interested in the use of learning data to provide personalised support to learners. Participants will have a hands-on opportunity to use the open-source tool OnTask (Pardo et al., 2018) within some common teaching scenarios, working with a synthetically generated data set. The facilitators will also share and discuss how OnTask is currently being used in universities to support student experience, teaching practice and course design. As this is a hands-on workshop, participants must bring a laptop computer to work with the online tool and the prepared scenarios.

References

Garrison, D. R., Anderson, T., & Archer, W. (2010). The first decade of the community of inquiry framework: A retrospective. The Internet and Higher Education, 13(1-2), 5-9.
Hattie, J., & Timperley, H. (2007). The power of feedback. Review of Educational Research, 77(1), 81-112.
Pardo, A., Bartimote-Aufflick, K., Shum, S. B., Dawson, S., Gao, J., Gašević, D., Leichtweis, S., Liu, D., Martínez-Maldonado, R., Mirriahi, N., & Moskal, A. C. M. (2018). OnTask: Delivering data-informed, personalized learning support actions. Journal of Learning Analytics, 5(3), 235-249.
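OnTask composes feedback messages from conditional text blocks keyed to learner indicator data. The sketch below is not OnTask's actual API; it is a minimal illustration of that rule-based idea, with hypothetical field names and thresholds:

```python
def personalised_message(student):
    """Compose a feedback message from conditional text fragments,
    in the spirit of rule-based tools such as OnTask. Field names
    and thresholds here are illustrative, not OnTask's schema."""
    parts = [f"Hi {student['name']},"]
    if student["quiz_score"] < 50:
        parts.append("your last quiz score suggests revisiting the week's readings.")
    else:
        parts.append("nice work on the last quiz - keep it up.")
    if student["videos_watched"] == 0:
        parts.append("The lecture videos for this week are still available.")
    return " ".join(parts)

# One row of (synthetic) learner data, as in the workshop's data set.
student = {"name": "Ana", "quiz_score": 42, "videos_watched": 0}
msg = personalised_message(student)
```

In OnTask itself, instructors author these conditions and text fragments through a web interface rather than in code; the point is the same mapping from data to personalised text.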


2020 ◽  
Author(s):  
Sebastian M. Herrmann

This article describes the ideas behind and the experiences with the experimental e-learning platform SHRIMP. Developed and deployed at American Studies Leipzig, the platform is used for the introductory Literature and Culture I seminar in the American Studies Bachelor of Arts program, and it serves as the main medium of instruction for around 80 students per year. It breaks up the linear form of the original seminar reader and instead offers students a hypertext of interconnected, short segments, enriched with social media and gamification elements, as well as a learning analytics component that invites students to take control of their own study and learning experience. It is driven by a dual assumption about digitization: that the digital age changes how students interact with text, and that digital textuality offers rich affordances beyond linear reading. Both can be harnessed to improve learning outcomes.


2020 ◽  
Vol 36 (6) ◽  
pp. 107-119
Author(s):  
Rita Prestigiacomo ◽  
Jane Hunter ◽  
Simon Knight ◽  
Roberto Martinez-Maldonado ◽  
Lori Lockyer

Data about learning can support teachers in their decision-making processes as they design tasks aimed at improving student educational outcomes. However, to achieve systemic impact, a deeper understanding of teachers' perspectives on, and expectations for, data as evidence is required. It is critical to understand how teachers' actions align with emerging learning analytics technologies, including the practices of pre-service teachers who are developing their perspectives on data use in classrooms during their initial teacher education programme. Misalignment may produce an integration gap in which technology and data literacy align poorly with expectations of the role of data and enabling technologies. This paper describes two participatory workshops that illustrate the value of human-centred approaches to understanding teachers' perspectives on, and expectations for, data as evidence. These workshops focused on the designs of pre-service teachers enrolled in teacher education programmes (N = 21) at two Australian universities. The approach points to the significance of (a) pre-service teachers' intentions to track their students' dispositions to learning and their ability to learn effectively, (b) the materiality of learning analytics as an enabling technology and (c) the alignment of learning analytics with learning design, including the human-centred, ethical and inclusive use of educational data in teaching practice.

Implications for practice or policy:

Pre-service teachers ought to be given opportunities to engage with and understand more about learning design, learning analytics and the use of data in classrooms.
Professional experience placements for pre-service teachers should include participatory data sessions or learning design workshops.
Teacher education academics in universities must be provided with ongoing professional development to support their preparation of pre-service teachers in data literacy, learning analytics and the increasing presence of data.


2020 ◽  
Vol 10 (10) ◽  
pp. 260
Author(s):  
Laura Dooley ◽  
Nikolas Makasis

The flipped classroom has been increasingly employed as a pedagogical strategy in the higher education classroom. This approach commonly involves pre-class learning activities delivered online through learning management systems, which collect learning analytics data on student access patterns. This study sought to use learning analytics data to understand student learning behavior in a flipped classroom. The analysis examined three key parameters: the number of online study sessions for each individual student, the size of the sessions (number of topics covered) and the timing of each student's first access to the materials relative to the relevant class date. The relationship between these parameters and academic performance was also explored. The study revealed that patterns of student access changed throughout the course period and that most students did access their study materials before the relevant classroom session. K-means clustering showed that consistent early access to learning materials was associated with improved academic performance in this context. Insights derived from this study informed iterative improvements to the learning design of the course. Similar analyses could be applied to other higher education learning contexts as a feedback tool for educators seeking to improve the online learning experience of their students.
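The abstract does not give the study's exact clustering pipeline. As a hedged sketch, a minimal k-means (Lloyd's algorithm) over the three parameters described above might look like this, with invented per-student feature values:

```python
import math

def kmeans(points, k, iters=20):
    """Minimal k-means (Lloyd's algorithm). Centroids are seeded with
    the first k points, so this toy run is deterministic; a real
    analysis would use multiple random restarts."""
    centroids = [list(p) for p in points[:k]]
    for _ in range(iters):
        # Assign each point to its nearest centroid.
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k), key=lambda c: math.dist(p, centroids[c]))
            clusters[i].append(p)
        # Recompute each centroid as the mean of its members.
        for i, members in enumerate(clusters):
            if members:
                centroids[i] = [sum(x) / len(members) for x in zip(*members)]
    return centroids, clusters

# Hypothetical per-student features: (study sessions, topics per session,
# days of first access before the class; negative = accessed late).
students = [(12, 4, 3), (11, 5, 4), (3, 1, -1), (2, 2, -2)]
centroids, clusters = kmeans(students, k=2)
```

On data like this, the clusters separate consistent early accessors from late, sparse accessors, which is the kind of grouping the study relates to academic performance.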


2020 ◽  
Vol 7 (3) ◽  
pp. 13-32
Author(s):  
Marion Blumenstein

The field of learning analytics (LA) has seen a gradual shift from purely data-driven approaches to more holistic views of improving student learning outcomes through data-informed learning design (LD). Despite the growing potential of LA in higher education (HE), the benefits are not yet convincing to the practitioner, in particular aspects of aligning LA data with LD toward desired learning outcomes. This review presents a systematic evaluation of effect sizes reported in 38 key studies in pursuit of effective LA approaches to measuring student learning gain for the enhancement of HE pedagogy and delivery. Large positive effects on student outcomes were found in LDs that fostered socio-collaborative and independent learning skills. Recent trends in personalization of learner feedback identified a need for the integration of student-idiosyncratic factors to improve the student experience and academic outcomes. Finally, key findings are developed into a new three-level framework, the LA Learning Gain Design (LALGD) model, to align meaningful data capture with pedagogical intentions and their learning outcomes. Suitable for various settings — face to face, blended, or fully online — the model contributes to data-informed learning and teaching pedagogies in HE.


Author(s):  
Lotfi Elaachak

Nowadays, learning via smartphones has become one of the most popular teaching tools used by young people, thanks to the ease of use of such devices in the field of education. A large number of both instructional applications and mobile serious games (MSGs) are now available in mobile application stores. The diversity of such applications, especially MSGs, can guarantee a personalized learning experience for each learner. However, it is difficult to decide whether a given MSG is efficient, because this decision depends on several factors. One of the major factors is its ability to transmit knowledge effectively to learners in order to teach them new skills. This ability can be measured and then analyzed using several techniques and algorithms, such as learning analytics, educational data mining and knowledge inference (e.g., Bayesian Knowledge Tracing). Hence the need for a user-friendly platform based on these algorithms; the proposed platform will make it easy to evaluate the learning outcomes of this kind of video game.
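The abstract names Bayesian Knowledge Tracing as one knowledge-inference technique such a platform could draw on. A single BKT update step, with illustrative (not fitted) parameter values, looks like this:

```python
def bkt_update(p_know, correct, p_slip=0.1, p_guess=0.2, p_learn=0.3):
    """One Bayesian Knowledge Tracing step: update the probability that
    the learner has mastered a skill after observing one answer.
    Parameter values here are illustrative, not fitted to data."""
    if correct:
        # Bayes: correct answers come from mastery (minus slips) or guesses.
        posterior = (p_know * (1 - p_slip)) / (
            p_know * (1 - p_slip) + (1 - p_know) * p_guess)
    else:
        # Incorrect answers come from slips or genuine non-mastery.
        posterior = (p_know * p_slip) / (
            p_know * p_slip + (1 - p_know) * (1 - p_guess))
    # Account for the chance of learning the skill after this attempt.
    return posterior + (1 - posterior) * p_learn

p = 0.2  # prior probability of mastery
for answer in [True, True, False, True]:
    p = bkt_update(p, answer)
```

Tracking this mastery estimate across in-game tasks is one way a platform could quantify whether an MSG actually transmits the skills it targets.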


2018 ◽  
Vol 10 (2) ◽  
pp. 3-9
Author(s):  
Anna Dipace ◽  
F. Feldia Loperfido ◽  
Alessia Scarinci

This article describes Learning Analytics (LA) as a predictive and formative approach that enables the planning of educational scenarios in line with students' needs and languages, in order to set up a priori and in-progress systems of control and inspection of the consistency, relevance and effectiveness of training objectives, curriculum paths, students' needs and learning outcomes. Thanks to LA, it is possible to understand how students learn. Training courses are designed to include the definition of those learning outcomes that respond effectively to students' needs in terms of contents, methodologies, tools and teaching resources. After reviewing the relevant literature, the article describes and discusses how LA represents valid support not only in designing student-centred training courses that assess outcomes, but also in carrying out formative assessment that considers the learning experience as a whole. The analysis of some case studies provided a good opportunity to reflect on and define the bridge between the use of LA for assessment purposes and personalized learning paths.

