International Journal of Learning Analytics and Artificial Intelligence for Education (iJAI)
Latest Publications


TOTAL DOCUMENTS: 22 (FIVE YEARS: 22)

H-INDEX: 3 (FIVE YEARS: 3)

Published by the International Association of Online Engineering (IAOE)

ISSN: 2706-7564

Author(s):  
Susanne Jauhiainen ◽  
Tron Krosshaug ◽  
Erich Petushek ◽  
Jukka-Pekka Kauppi ◽  
Sami Äyrämö

Strength training exercises are essential in rehabilitation, health promotion, and sports. For optimal and safe training, educators and trainers in the industry should understand exercise form and technique. Currently, there is a lack of tools for measuring the in-depth skills of strength training experts. In this study, we investigate how data mining methods can be used to identify novel and useful skill patterns from a binary multiple-choice test designed to measure the knowledge level of strength training experts. A skill test assessing exercise technique expertise and comprehension was answered by 507 fitness professionals with varying backgrounds. A triangulated approach of clustering and non-negative matrix factorization (NMF) was used to discover skill patterns among participants and patterns in test questions. Clustering identified four distinct participant subgroups, and NMF revealed further patterns across the test questions. The results can be used, for example, to identify missing skills and knowledge in individual participants and in subgroups of participants, and to form general, personalized, or background-specific guidelines for future education. In addition, the test itself can be optimized, for example by checking whether some questions can be answered correctly even without the required skill or whether they appear to measure overlapping skills. Finally, this approach can be applied to other multiple-choice test data in future educational research.
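The triangulated clustering-and-NMF analysis described above can be illustrated with a minimal scikit-learn sketch on a simulated binary response matrix (participants × questions). The participant count and the four subgroups follow the abstract; the number of questions, the number of NMF components, the simulated data, and all other parameters are illustrative assumptions, not the study's actual setup.

```python
# Minimal sketch: clustering + NMF on a binary skill-test response matrix
# (participants x questions). Data are simulated; the study's real
# questionnaire and parameters are not reproduced here.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)
n_participants, n_questions = 507, 30                 # 507 from the abstract; 30 is illustrative
X = (rng.random((n_participants, n_questions)) < 0.6).astype(float)  # 1 = correct answer

# Cluster participants into skill subgroups (four, as in the study).
kmeans = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X)
print("participants per subgroup:", np.bincount(kmeans.labels_))

# NMF factorizes responses into participant loadings (W) and question patterns (H).
nmf = NMF(n_components=4, init="nndsvda", max_iter=500, random_state=0)
W = nmf.fit_transform(X)          # participants x components
H = nmf.components_               # components x questions

# Questions loading strongly on the same component may measure overlapping skills.
for k, row in enumerate(H):
    top = np.argsort(row)[::-1][:5]
    print(f"component {k}: top questions {top.tolist()}")
```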


Author(s):  
Mark T. Williams ◽  
Lesley Jan Lluka ◽  
Prasad Chunduri

Learning analytics (LA), a fast emerging concept in higher education, is used to understand and optimize the student learning process and the environment in which it occurs. Knowledge obtained from the LA paradigm is often utilized to construct statistical models aimed at identifying students who are at risk of failing the unit/course, and to subsequently design interventions that are targeted towards improving the course outcomes for these students. In previous studies, models were constructed using a wide variety of variables, but emerging evidence suggests that the models constructed using course-specific variables are more accurate, and provide a better understanding of the learning context. For our current study, student performance in the various course assessment tasks was used as a basis for the predictive models and future intervention design, as they are conventionally used to evaluate student learning outcomes and the degree to which the various course learning objectives are met. Further, students in our course are primarily first-year university students, who are still unfamiliar with the learning and assessment context of higher education, and this prevents them from adequately preparing for the tasks and consequently reduces their course performance and outcomes. We first constructed statistical models that would be used to identify students who are at risk of failing the course and to identify assessment tasks that students in our course find challenging, as a guide for the design of future interventional activities. Every constructed predictive model had an excellent capacity to discriminate between students who passed the course and those who failed. Analysis revealed that not only at-risk students, but the whole cohort, would benefit from interventions improving their conceptual understanding and ability to construct high-scoring answers to Short Answer Questions.
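As a rough illustration of the kind of course-specific predictive modelling the abstract describes, the sketch below fits a logistic regression on simulated assessment-task scores and reports cross-validated ROC AUC as the discrimination measure. The abstract does not specify the model; logistic regression, the feature names, and the data are stand-ins chosen for illustration.

```python
# Minimal sketch: assessment-task scores predicting pass/fail, with
# discrimination assessed via ROC AUC. Features and data are illustrative.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n = 400
df = pd.DataFrame({
    "quiz_avg": rng.uniform(0, 10, n),
    "saq_score": rng.uniform(0, 20, n),       # Short Answer Question score
    "midterm": rng.uniform(0, 50, n),
})
# Simulated outcome: higher scores make passing more likely.
logit = -6 + 0.3 * df["quiz_avg"] + 0.15 * df["saq_score"] + 0.08 * df["midterm"]
df["passed"] = rng.random(n) < 1 / (1 + np.exp(-logit))

model = LogisticRegression(max_iter=1000)
auc = cross_val_score(model, df[["quiz_avg", "saq_score", "midterm"]],
                      df["passed"], cv=5, scoring="roc_auc")
print(f"cross-validated ROC AUC: {auc.mean():.2f}")  # values above ~0.8 indicate strong discrimination
```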


Author(s):  
Sepinoud Azimi ◽  
Carmen-Gabriela Popa ◽  
Tatjana Cucić

<p class="0abstract"><span lang="EN-US">The birth of massive open online courses (MOOCs) has had an undeniable effect on how teaching is being delivered. It seems that traditional in-class teaching is becoming less popular with the young generation – the generation that wants to choose when, where and at what pace they are learning. As such, many universities are moving towards taking their courses, at least partially, online. However, online courses, although very appealing to the younger generation of learners, come at a cost. For example, the dropout rate of such courses are higher than that of more traditional ones, and the reduced in-person interaction with the teachers results in less timely guidance and intervention from the educators. Machine learning (ML)-based approaches have shown phenomenal successes in other domains. The existing stigma that applying ML-based techniques requires a large amount of data seems to be a bottleneck when dealing with small-scale courses with limited amounts of produced data. In this study, we show not only that the data collected from an online learning management system could be well utilized in order to predict students’ overall performance but also that it could be used to propose timely intervention strategies to boost the students’ performance level. The results of this study indicate that effective intervention strategies could be suggested as early as the middle of the course to change the course of students’ progress for the better. We also present an assistive pedagogical tool based on the outcome of this study, to assist in identifying challenging students and in suggesting early intervention strategies.</span></p>


Author(s):  
Nina Bergdahl ◽  
Jalal Nouri ◽  
Thashmee Karunaratne ◽  
Muhammad Afzaal ◽  
Mohammed Saqr

Learning Analytics (LA) approaches in Blended Learning (BL) research are becoming an established field. In light of previous critique of LA for not being grounded in theory, the General Data Protection Regulation, and a renewed focus on individuals' integrity, this review aims to explore the use of theories and the methodological and analytic approaches in educational settings, along with surveying ethical and legal considerations. The review also maps and explores the outcomes and discusses the pitfalls and potentials currently seen in the field. Journal articles and conference papers were identified through a systematic search across relevant databases. Seventy papers met the inclusion criteria: they applied LA within a BL setting, were peer-reviewed full papers, and were written in English. The results reveal that the theoretical and methodological approaches used were dispersed; we identified approaches to BL not covered by the categories in the existing BL literature, which we suggest may be referred to as hybrid blended learning, and found that ethical considerations and legal requirements have often been overlooked. We highlight critical issues that contribute to raising awareness and inform alignment for future research, in order to ameliorate diffuse applications within the field of LA.


Author(s):  
Christina Gloerfeld ◽  
Silke Wrede ◽  
Claudia De Witt ◽  
Xia Wang

Artificial intelligence is one of the disruptive technologies that drive change in our society and economy, but also in our educational system. Educational data mining, machine learning, and expert systems are increasingly being used to support studying and teaching. This article takes an educational science perspective and presents an approach to using a recommendation system for students to support inquiry-based learning and self-directed learning. Over the course of the semester, various AI-based applications, such as automatic assessments, interest visualizations, or a learning strategy finder, assist in its different phases. When planning and designing this recommendation system, the most important premise is to foster the self-determination of the students.
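Purely as an illustration of phase-specific recommendations, the sketch below maps a semester phase and a student's stated interests to a suggested activity using simple rules. The phases, rules, and resource names are hypothetical and invented for this example; they are not taken from the article's recommendation system.

```python
# Illustrative sketch only: a rule-based recommender keyed on semester phase.
# Phases, rules, and suggestions are hypothetical, not the article's design.
from dataclasses import dataclass

@dataclass
class StudentProfile:
    phase: str            # e.g. "planning", "inquiry", "reflection" (assumed phases)
    interests: list[str]  # topics the student marked as interesting

RULES = {
    "planning": "Try the learning strategy finder to structure your semester.",
    "inquiry": "Review the interest visualization and pick a research question.",
    "reflection": "Take the automatic self-assessment to check your progress.",
}

def recommend(profile: StudentProfile) -> str:
    """Return a phase-appropriate suggestion, lightly personalized by interest."""
    suggestion = RULES.get(profile.phase, "Consult your study plan.")
    if profile.interests:
        suggestion += f" Suggested starting topic: {profile.interests[0]}."
    return suggestion

print(recommend(StudentProfile(phase="inquiry", interests=["learning analytics"])))
```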


Author(s):  
David J Lemay ◽  
Ram B Basnet ◽  
Tenzin Doleck

This study examines the relationship between individuals' beliefs about artificial intelligence (AI) and their levels of anxiety with respect to their technology readiness. In this cross-sectional study, we surveyed 65 students at a southwestern US college. Using partial least squares analysis, we found that technology readiness contributors were significantly and positively related to only one AI anxiety factor: socio-technical illiteracy. In contrast, all four links between technology readiness inhibitors and AI anxiety factors were significant, with medium effect sizes: technology readiness inhibitors are positively related to the AI anxiety factors of learning, fears of job replacement, socio-technical illiteracy, and particular AI configurations. Thus, we conclude that AI anxiety runs along a spectrum: it is influenced by the real, practical consequences of increased automatization, but also by popular representations and discussions of the negative consequences of artificial general intelligence and killer robots. Addressing technology readiness is therefore unlikely to mitigate the effects of AI anxiety.
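The partial least squares analysis mentioned above can be approximated, very loosely, with scikit-learn's PLSRegression relating four technology-readiness dimensions to four AI anxiety factors. This is a simplified stand-in for the authors' PLS model; the dimension names, coefficients, and data are illustrative assumptions, and only the sample size of 65 comes from the abstract.

```python
# Simplified stand-in for a partial least squares analysis: PLSRegression
# relating technology-readiness items to AI anxiety factors on simulated data.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(3)
n = 65                                   # sample size from the abstract
X = rng.normal(size=(n, 4))              # readiness dimensions (illustrative)
coef = np.array([
    [0.1, 0.0, 0.4, 0.0],
    [0.0, 0.1, 0.3, 0.0],
    [0.0, 0.0, 0.5, 0.3],
    [0.0, 0.0, 0.2, 0.4],
])                                       # rows: readiness items, columns: anxiety factors
Y = X @ coef + rng.normal(scale=0.5, size=(n, 4))

pls = PLSRegression(n_components=2).fit(X, Y)
print("R^2 of the two-component PLS model:", round(pls.score(X, Y), 2))
print("coefficient matrix shape:", pls.coef_.shape)
```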


Author(s):  
Dirk Ifenthaler ◽  
Jane Yin-Kim Yau

<p class="0abstract"><span lang="EN-AU">Common factors, which are related to study success include students’ sociodemographic factors, cognitive capacity, or prior academic performance, and individual attributes as well as course related factors such as active learning and attention or environmental factors related to supportive academic and social embeddedness. In addition, there are various stages of a learner’s learning journey from the beginning when commencing learning until its completion, as well as different indicators or variables that can be examined to gauge or predict how successfully that journey can or will be at different points during that journey, or how successful learners may complete the study and thereby acquiring the intended learning outcomes. The aim of this research is to gain a deeper understanding of not only if learning analytics can support study success, but which aspects of a learner’s learning journey can benefit from the utilisation of learning analytics. We, therefore, examined different learning analytics indicators to show which aspect of the learning journey they were successfully supporting. Key indicators may include GPA, learning history, and clickstream data. Depending on the type of higher education institution, and the mode of education (face-to-face and/or distance), the chosen indicators may be different due to them having different importance in predicting the learning outcomes and study success.</span></p>


Author(s):  
David John Lemay ◽  
Tenzin Doleck

This paper presents a social learning network analysis of Twitter during the 2020 global shutdown due to the COVID-19 pandemic. Research concerning online learning environments is focused on the reproduction of conventional teaching arrangements, whereas social media technologies afford new channels for the dissemination of information and the sharing of knowledge and expertise. We examine the Twitter feeds around the hashtags #onlinelearning and #onlineteaching during the global shutdown to examine the spontaneous development of online learning communities. We find relatively small and ephemeral communities on the two topics. Most users make spontaneous contributions to the discussion but do not maintain a presence in the Twitter discourse. By optimizing the social learning network, we find that many potential efficiencies could be gained through more proactive efforts to connect knowledge seekers and knowledge disseminators. Considerations and prospects for supporting online informal social learning networks are discussed.
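A minimal sketch of the kind of hashtag network analysis described above, assuming a directed mention network built with networkx from a handful of made-up tweet records. Community detection is approximated here by weakly connected components, a simplification of the authors' social learning network analysis; real data would come from the Twitter API.

```python
# Minimal sketch: build a directed mention network from tweets on a hashtag
# and inspect its (typically small, ephemeral) communities. Records are made up.
import networkx as nx

tweets = [  # (author, users mentioned in the tweet)
    ("alice", ["bob"]),
    ("bob", ["carol", "alice"]),
    ("dave", []),          # spontaneous one-off contribution, no interaction
    ("erin", ["bob"]),
]

G = nx.DiGraph()
for author, mentions in tweets:
    G.add_node(author)
    for mentioned in mentions:
        G.add_edge(author, mentioned)

# Weakly connected components approximate separate conversation communities.
communities = list(nx.weakly_connected_components(G))
print("number of communities:", len(communities))
print("community sizes:", sorted(len(c) for c in communities))

# Frequently mentioned users are candidate knowledge disseminators to connect seekers with.
print("in-degree (how often each user is mentioned):", dict(G.in_degree()))
```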


Author(s):  
Clare Baek ◽  
Tenzin Doleck

To analyze the current research status and trends in the field of artificial intelligence in education, we applied bibliometric methods to the articles published from 2015 to 2019 in one of the field's representative journals, the International Journal of Artificial Intelligence in Education. We analyzed 135 articles retrieved from the Web of Science database and examined prolific countries, collaboration networks, prolific authors, keywords, and the citations the articles received. Through examining keywords, we found that the authors largely focused on students and learning. Through examining prolific authors and countries, we found active publication by corresponding authors from the United States, the United Kingdom, Canada, and Germany. We found international collaboration among some researchers and institutions, such as a strong collaboration network between the United States and Canada. We suggest building more widespread international partnerships and expanding the collaboration network to include diverse institutions. International collaboration and an expanded institutional network can improve research by incorporating various perspectives and expertise.
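The country-level collaboration analysis described above can be sketched with simple co-occurrence counting over per-article author-country lists. The records below are made up for illustration; the study worked from Web of Science exports.

```python
# Minimal sketch of a bibliometric collaboration analysis: count prolific
# countries and country co-occurrences across article author lists.
from collections import Counter
from itertools import combinations

articles = [  # countries of the authors on each article (illustrative records)
    ["United States", "Canada"],
    ["United Kingdom"],
    ["United States", "Canada", "Germany"],
    ["United States"],
]

country_counts = Counter(c for countries in articles for c in set(countries))
pair_counts = Counter(
    pair for countries in articles
    for pair in combinations(sorted(set(countries)), 2)
)

print("most prolific countries:", country_counts.most_common(3))
print("strongest collaboration links:", pair_counts.most_common(3))
```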


Author(s):  
Justian Knobbout ◽  
Esther Van der Stappen

Despite the promises of learning analytics and the existence of several learning analytics implementation frameworks, the large-scale adoption of learning analytics within higher educational institutions remains low. Extant frameworks either focus on a specific element of learning analytics implementation, for example policy or privacy, or lack operationalization of the organizational capabilities necessary for successful deployment. Therefore, this literature review addresses the research question "What capabilities for the successful adoption of learning analytics can be identified in existing literature on big data analytics, business analytics, and learning analytics?" Our research is grounded in resource-based view theory, and we extend the scope beyond the field of learning analytics to include capability frameworks from the more mature research fields of big data analytics and business analytics. This paper's contribution is twofold: 1) it provides a literature review of known capabilities for big data analytics, business analytics, and learning analytics, and 2) it introduces a capability model to support the implementation and uptake of learning analytics. During our study, we identified and analyzed 15 key studies. By synthesizing the results, we found 34 organizational capabilities important to the adoption of analytical activities within an institution and 461 ways to operationalize these capabilities. Five categories of capabilities can be distinguished: Data, Management, People, Technology, and Privacy & Ethics. Capabilities presently absent from existing learning analytics frameworks concern sourcing and integration, market, knowledge, training, automation, and connectivity. Based on the results of the review, we present the Learning Analytics Capability Model: a model that provides senior management and policymakers with concrete operationalizations to build the necessary capabilities for successful learning analytics adoption.

