Journal of Learning Analytics
Latest Publications


TOTAL DOCUMENTS: 259 (five years: 80)

H-INDEX: 22 (five years: 6)

Published by the Society for Learning Analytics Research
ISSN: 1929-7750

2021, Vol 8 (3), pp. 1-9
Author(s): Alyssa F. Wise, Simon Knight, Xavier Ochoa

The ongoing changes and challenges brought on by the COVID-19 pandemic have exacerbated long-standing inequities in education, leading many to question basic assumptions about how learning can best benefit all students. Thirst for data about learning is at an all-time high, sometimes without commensurate attention to ensuring the principles this community has long valued: privacy, transparency, openness, accountability, and fairness. How we navigate this dynamic context is critical for the future of learning analytics. Thinking about the issue through the lens of JLA publications over the last eight years, we highlight the important contributions of “problem-centric” rather than “tool-centric” research. We also value attention (proximal or distal) to the eventual goal of closing the loop, connecting the results of our analyses back to improve the learning from which they were drawn. Finally, we recognize the power of cycles of maturation: using information generated about real-world uses and impacts of a learning analytics tool to guide new iterations of data, analysis, and intervention design. A critical element of context for such work is that the learning problems we identify and choose to work on are never blank slates; they embed societal structures, reflect the influence of past technologies, and have prior enablers, barriers, and social mediation acting on them. In that context, we must ask the hard questions: What parts of existing systems is our work challenging? What parts is it reinforcing? Do these effects, intentional or not, align with our values and beliefs? In the end, what makes learning analytics matter is our ability to contribute to progress on both immediate and long-standing challenges in learning, not only improving current systems but also considering alternatives for what is and what could be. This requires including stakeholder voices in tackling important problems of learning with rigorous analytic approaches to promote equitable learning across contexts. This journal provides a central space for the discussion of such issues, acting as a venue for the whole community to share research, practice, data, and tools across the learning analytics cycle in pursuit of these goals.


2021, pp. 1-18
Author(s): Marcelo Worsley, Roberto Martinez-Maldonado, Cynthia D'Angelo

Multimodal learning analytics (MMLA) has increasingly been a topic of discussion within the learning analytics community. The Society for Learning Analytics Research is home to the CrossMMLA Special Interest Group and regularly hosts workshops on MMLA during the Learning Analytics Summer Institute (LASI). In this paper, we articulate a set of 12 commitments that we believe are critical for creating effective MMLA innovations. As MMLA grows in use, it is important to articulate such core commitments to help guide both MMLA researchers and the broader learning analytics community. The commitments that we describe are deeply rooted in the origins of MMLA and also reflect the ways that MMLA has evolved over the past 10 years. We organize the 12 commitments in terms of (i) data collection, (ii) analysis and inference, and (iii) feedback and data dissemination, and we argue why these commitments are important for conducting ethical, high-quality MMLA research. Furthermore, in using the language of commitments, we emphasize opportunities for MMLA research to align with established qualitative research methodologies and important concerns from critical studies.


2021, pp. 1-16
Author(s): Hamideh Iraj, Anthea Fudge, Huda Khan, Margaret Faulkner, Abelardo Pardo, ...

One of the major factors affecting student learning is feedback. Although the importance of feedback has long been recognized in educational institutions, dramatic changes, such as bigger class sizes and a more diverse student population, have challenged the provision of effective feedback. In light of these changes, educators have increasingly been using new digital tools to provide student feedback, given the broader adoption and availability of these technologies. However, despite these efforts, most educators have limited insight into the recipience of their feedback and wonder which students engage with it. This problem is referred to as the "feedback gap": the difference between the potential and actual use of feedback, which prevents educators and instructional designers from understanding feedback recipience among students. In this study, a set of trackable call-to-action (CTA) links were embedded in feedback messages focused on learning processes and self-regulation of learning in one fully online marketing course and one blended bioscience course. These links helped us examine the association between feedback engagement and course success. We also conducted two focus groups with students from one of the courses to further examine student perceptions of the feedback messages. Our results across both courses revealed that early engagement with feedback is positively associated with passing the course and that most students considered the feedback messages helpful to their learning. Our study also found some interesting demographic differences between students regarding their engagement with the feedback messages. Such insight enables instructors to ask "why" questions, support students' learning, improve feedback processes, and narrow the gap between the potential and actual use of feedback. The practical implications of our findings are further discussed.
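
The paper's implementation is not included in the abstract, but the tracking mechanism it describes is easy to sketch. Below is a minimal, hypothetical Python illustration (the URL, storage, and function names are ours, not the study's): each student's feedback message carries a uniquely tokenized CTA link, so a later click can be joined back to the student and, eventually, to course outcomes.

```python
import uuid

# Hypothetical sketch of trackable call-to-action (CTA) links.
# The redirect URL and in-memory storage are placeholders, not the study's system.
token_to_student = {}   # tokens issued when feedback messages are generated
click_log = {}          # token -> student_id, filled in when a link is opened

def make_cta_link(student_id: str, base_url: str = "https://example.edu/go") -> str:
    """Embed a unique token in the CTA link of one student's feedback message."""
    token = uuid.uuid4().hex
    token_to_student[token] = student_id
    return f"{base_url}?t={token}"

def record_click(token: str) -> None:
    """Called by the redirect endpoint; associates the click with a student."""
    if token in token_to_student:
        click_log[token] = token_to_student[token]

# Engagement signal: the set of students who opened at least one feedback link.
link = make_cta_link("s001")
record_click(link.split("t=")[1])
engaged = set(click_log.values())
print(engaged)  # {'s001'}
```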


2021, pp. 1-17
Author(s): Scott Harrison, Renato Villano, Grace Lynch, George Chen

Early alert systems (EAS) are an important technological tool for managing and improving student retention. Data spanning 16,091 students over 156 weeks were collected from a regionally based university in Australia to explore various microeconometric approaches that establish links between EAS and student retention outcomes. Controlling for numerous confounding variables, we identified significant relationships between the EAS and student retention. Capturing dynamic relationships between the explanatory variables and the hazard of discontinuing provides new insight into the factors behind student retention. We conclude that survival models are the best method for understanding student retention when temporal data are available.
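
The abstract does not reproduce the model specification, but the general approach, modelling the hazard of discontinuing with time-varying covariates such as EAS alerts, can be sketched with the lifelines library. All data and variable names below are illustrative, not the study's:

```python
import pandas as pd
from lifelines import CoxTimeVaryingFitter

# Illustrative long-format data: one row per student per observation interval.
# 'alerts' (an EAS signal) can change from interval to interval; 'event' = 1
# marks the interval in which the student discontinued. All values are made up.
df = pd.DataFrame({
    "id":     [1, 1, 1, 2, 2, 3, 4, 4, 5],
    "start":  [0, 4, 8, 0, 4, 0, 0, 4, 0],
    "stop":   [4, 8, 12, 4, 6, 12, 4, 9, 3],
    "alerts": [0, 1, 2, 0, 3, 0, 1, 2, 4],
    "event":  [0, 0, 0, 0, 1, 0, 0, 1, 1],
})

# A time-varying Cox model captures how covariates relate to the hazard of
# discontinuing as they evolve over the semester.
ctv = CoxTimeVaryingFitter()
ctv.fit(df, id_col="id", event_col="event", start_col="start", stop_col="stop")
ctv.print_summary()
```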


2021, pp. 1-16
Author(s): Hassan Khosravi, George Gyamfi, Barbara E. Hanna, Jason Lodge, Solmaz Abdi

The value of students developing the capacity to accurately judge the quality of their own work and that of others has been widely studied and recognized in the higher education literature. To date, much of the research and commentary on evaluative judgment has been theoretical and speculative in nature, focusing on perceived benefits and proposing strategies seen to hold the potential to foster evaluative judgment; the efficacy of these strategies remains largely untested. The rise of educational tools and technologies that generate data on learning activities at an unprecedented scale, alongside insights from the learning sciences and learning analytics communities, provides new opportunities for fostering and supporting empirical research on evaluative judgment. Accordingly, this paper offers a conceptual framework for data-driven investigation of how evaluative judgment can be enhanced, together with an instantiation of that framework in an educational tool called RiPPLE. Two case studies demonstrating how RiPPLE can foster and support empirical research on evaluative judgment are presented.
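
As one hypothetical example of the kind of data-driven measure such a framework enables (not taken from the paper or from RiPPLE), evaluative judgment can be operationalized as the agreement between a student's quality ratings of peer-created resources and a reference standard:

```python
from scipy.stats import spearmanr

# Hypothetical ratings of five peer-created resources on a 1-5 scale.
# A simple operationalization of evaluative judgment: rank agreement between
# one student's quality ratings and a reference standard (e.g., expert ratings).
expert_ratings  = [5, 3, 4, 2, 1]
student_ratings = [4, 3, 5, 2, 2]

rho, p = spearmanr(expert_ratings, student_ratings)
print(f"judgment accuracy (Spearman rho): {rho:.2f}, p = {p:.3f}")
```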


2021, pp. 1-22
Author(s): Hassan Khosravi, Shiva Shabaninejad, Aneesha Bakharia, Shazia Sadiq, Marta Indulska, ...

Learning analytics dashboards commonly visualize data about students with the aim of helping students and educators understand and make informed decisions about the learning process. To assist with making sense of complex and multidimensional data, many learning analytics systems and dashboards have relied heavily on AI algorithms based on predictive analytics. While predictive models have been successful in many domains, there is an increasing realization of the inadequacy of using predictive models, without human oversight, in decision-making tasks that affect individuals. In this paper, we employ a suite of state-of-the-art algorithms from the online analytical processing (OLAP), data mining, and process mining domains to present an alternative, human-in-the-loop AI method that enables educators to identify, explore, and apply appropriate interventions for the subpopulations of students whose performance or learning process deviates most from the rest of the class. We demonstrate an application of our proposed approach in an existing learning analytics dashboard (LAD) and explore the recommended drill-downs in a course with 875 students. The demonstration provides an example of the recommendations generated from real course data and shows how they can lead the user to interesting insights. Furthermore, we demonstrate how our approach can be employed to develop intelligent LADs.
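
The paper's algorithms are not detailed in the abstract, but the core idea of ranking candidate drill-downs by how strongly a subpopulation deviates from the rest of the class can be sketched in pandas. Column names and data below are hypothetical:

```python
import pandas as pd

# Hypothetical course records; a real LAD would have many more attributes.
df = pd.DataFrame({
    "programme": ["CS", "CS", "Arts", "Arts", "CS", "Arts"],
    "mode":      ["online", "campus", "online", "campus", "online", "online"],
    "grade":     [72, 81, 55, 78, 64, 52],
})

def rank_drilldowns(df, attrs, measure="grade"):
    """Score each attribute=value subgroup by its deviation from the rest of the class."""
    rows = []
    for attr in attrs:
        for value, group in df.groupby(attr):
            rest = df[df[attr] != value]
            if len(rest) == 0:
                continue
            rows.append({
                "drilldown": f"{attr}={value}",
                "n": len(group),
                "deviation": group[measure].mean() - rest[measure].mean(),
            })
    out = pd.DataFrame(rows)
    # Largest absolute deviations first: these are the drill-downs worth exploring.
    return out.reindex(out["deviation"].abs().sort_values(ascending=False).index)

print(rank_drilldowns(df, ["programme", "mode"]))
```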


2021, pp. 1-20
Author(s): Yi-Shan Tsai, Alexander Whitelock-Wainwright, Dragan Gašević

The adoption of learning analytics (LA) in complex educational systems is woven into sociocultural and technical challenges that have induced distrust in data and difficulties in scaling LA. This paper presents a study that investigated areas of distrust and threats to trustworthy LA through a series of consultations with teaching staff and students at a large UK university. Surveys and focus groups were conducted to explore participant expectations of LA. The observed distrust is broadly attributed to three areas: the subjective nature of numbers, the fear of power diminution, and approaches to design and implementation of LA. The paper highlights areas to maintain existing trust with policy procedures and areas to cultivate trust by engaging with tensions arising from the social process of LA.


2021, pp. 1-17
Author(s): Yingbin Zhang, Luc Paquette, Ryan S. Baker, Jaclyn Ocumpaugh, Nigel Bosch, ...

Confusion may benefit learning when it is resolved or partially resolved. Metacognitive strategies (MS) may help learners to resolve confusion when it occurs during learning and problem solving. This study examined the relationship between confusion and MS that students evoked in Betty’s Brain, a computer-based learning-by-modelling environment where elementary and middle school students learn science by building causal maps. Participants were sixth graders. Emotion data were collected from real-time observations by trained researchers. MS and task performance information were determined by analyzing the action logs. Pre- and post-tests were used to assess learning gains. The results revealed that the use of MS was a function of the state of student confusion. However, confusion resolution was not related to MS behaviour, and MS did not moderate the effect of confusion on student task performance in Betty’s Brain or on learning gains.
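
The coding of metacognitive strategies is specific to Betty's Brain, but the log-mining step can be sketched generically. The action names and the mapping to MS below are invented for illustration, not the study's scheme:

```python
from collections import Counter

# Hypothetical action-log entries: (student_id, action). The mapping of actions
# to metacognitive strategies (MS) is invented, not the study's coding scheme.
MS_ACTIONS = {"take_quiz", "check_causal_map", "reread_resource"}

log = [
    ("s1", "add_link"), ("s1", "take_quiz"), ("s1", "check_causal_map"),
    ("s2", "add_link"), ("s2", "add_link"), ("s2", "take_quiz"),
]

# Per-student count of MS-indicative actions, usable as a variable in later
# analyses relating MS use to confusion resolution or learning gains.
ms_counts = Counter(s for s, a in log if a in MS_ACTIONS)
print(ms_counts)  # Counter({'s1': 2, 's2': 1})
```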


2021, pp. 1-15
Author(s): Tom Olney, Steve Walker, Carlton Wood, Anactoria Clarke

Most higher education institutions view their increasing use of learning analytics as having significant potential to improve student academic achievement, retention outcomes, and learning and teaching practice, but the realization of this potential remains stubbornly elusive. While there is an abundance of published research on the creation of visualizations, dashboards, and predictive models, little work has explored the impact of learning analytics on the actual practice of teachers. Through the lens of social informatics (an approach that views the users of technologies as active social actors whose technological practices constitute a wider socio-technical system), this qualitative study reports on an investigation into the practice of 30 tutors in the STEM faculty at Europe's largest distance-learning organization, The Open University UK (OU). When the tutors were asked to incorporate learning analytics (including predictive learning analytics) from the Early Alert Indicator (EAI) dashboard into their practice during the 2017–2018 academic year, we found that they interacted with the dashboard in certain unanticipated ways and developed three identifiable "shadow practices".


2021, Vol 8 (2), pp. 6-21
Author(s): Anouschka van Leeuwen, Carolien A. N. Knoop-van Campen, Inge Molenaar, Nikol Rummel

Teacher dashboards are a specific form of analytics in which visual displays provide teachers with information about their students, for example, concerning student progress and performance on tasks during lessons or lectures. In the present paper, we focus on the role of teacher dashboards in teacher decision-making in K–12 education. There is large variation in how teachers use dashboards in the classroom, which could be explained by teacher characteristics. We therefore investigate the role of teacher characteristics, such as experience, age, gender, and self-efficacy, in how teachers use dashboards. More specifically, we present two case studies to understand how diversity in dashboard use is related to teacher characteristics. Surprisingly, in both case studies, teacher characteristics were not associated with dashboard use. Based on our findings, we propose an initial framework for understanding what contributes to diversity of dashboard use, which might support future research in explaining that diversity. This paper should be seen as a first step in examining the role of teacher characteristics in dashboard use in K–12 education.

