The effectiveness of learning analytics for identifying at-risk students in higher education

2019 ◽  
Vol 45 (6) ◽  
pp. 842-854 ◽  
Author(s):  
Ed Foster ◽  
Rebecca Siddle

Author(s):  
Dennis Foung

The use of algorithms and data mining approaches is not new to Industry 4.0, but these techniques may be unfamiliar to students and educators in higher education. This chapter compares three classification techniques: classification trees, logistic regression, and artificial neural networks (ANN). The comparison focuses on each method's accuracy, algorithm, and practicality in higher education. The study draws on a dataset of more than 5,000 records from two academic writing courses at a university in Hong Kong. Results suggest that classification trees and logistic regression can be readily applied in the higher education context, whereas ANN may not be practical in such settings. The research team suggests that higher education administrators build on this research and design platforms that implement these classification algorithms to predict at-risk students.
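For readers who want a concrete starting point, the sketch below compares the three classifier families on a generic tabular dataset using scikit-learn. It is illustrative only: the file name, predictor columns, and at-risk label are assumptions for demonstration, not the chapter's actual features or pipeline.

```python
# Minimal sketch (not the chapter's pipeline) comparing the three classifier
# families discussed in the abstract. Column names such as "quiz_avg" are
# placeholders, not the study's real features.
import pandas as pd
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier

df = pd.read_csv("course_records.csv")              # hypothetical course-record table
X = df[["quiz_avg", "attendance", "draft_score"]]   # hypothetical predictors
y = df["at_risk"]                                    # 1 = at risk, 0 = not at risk

models = {
    "classification tree": DecisionTreeClassifier(max_depth=4, random_state=0),
    "logistic regression": make_pipeline(StandardScaler(),
                                         LogisticRegression(max_iter=1000)),
    "ANN (MLP)": make_pipeline(StandardScaler(),
                               MLPClassifier(hidden_layer_sizes=(32,),
                                             max_iter=500, random_state=0)),
}

# 5-fold cross-validated accuracy, mirroring the accuracy comparison in the chapter.
for name, model in models.items():
    acc = cross_val_score(model, X, y, cv=5, scoring="accuracy")
    print(f"{name}: mean accuracy = {acc.mean():.3f}")
```

A tree or logistic model can also be inspected directly (feature splits, coefficients), which is part of why the abstract judges them more practical for administrators than an ANN.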


2020 ◽  
Vol 10 (13) ◽  
pp. 4427 ◽  
Author(s):  
David Bañeres ◽  
M. Elena Rodríguez ◽  
Ana Elena Guerrero-Roldán ◽  
Abdulkadir Karadeniz

Artificial intelligence has impacted education in recent years. The datafication of education has enabled automated methods that detect patterns in extensive collections of educational data and estimate unknown information about students and their behavior. This research focuses on finding accurate predictive models to identify at-risk students, which can reduce students' risk of failure or disengagement by shortening the time lag between identification and the actual at-risk state. The contribution of this paper is threefold. First, an in-depth analysis of a predictive model to detect at-risk students is performed. The model is tested on an institutional data mart holding curated data from six semesters, and a method to obtain the best classifier and training set is proposed. Second, a method to determine a threshold for evaluating the quality of the predictive model is established. Third, an early warning system is developed and tested in a real educational setting, proving accurate and useful for detecting at-risk students in online higher education. The stakeholders (i.e., students and teachers) can analyze the information through different dashboards, and teachers can also send early feedback as an intervention mechanism to mitigate at-risk situations. The system was evaluated on two undergraduate courses, where results showed high accuracy in correctly detecting at-risk students.
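As a rough illustration of the threshold idea, the sketch below trains a generic classifier on hypothetical learning-management-system features, sweeps probability thresholds on held-out data, and flags students above the chosen cut-off for an early alert. The classifier choice, feature names, and the recall-weighted F2 criterion are assumptions for demonstration; they are not the method, data mart, or threshold procedure described in the paper.

```python
# Illustrative sketch only: one way to choose an alert threshold for an
# early-warning flag. Feature names and the F2 criterion are assumptions.
import numpy as np
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import precision_recall_curve

df = pd.read_csv("lms_activity.csv")               # hypothetical data-mart extract
X = df[["logins", "submissions", "forum_posts"]]   # hypothetical predictors
y = df["at_risk"]

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
clf = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)

# Sweep thresholds on held-out data and keep the one maximising F2,
# which weights recall (catching at-risk students) over precision.
probs = clf.predict_proba(X_te)[:, 1]
prec, rec, thr = precision_recall_curve(y_te, probs)
f2 = (5 * prec * rec) / np.clip(4 * prec + rec, 1e-12, None)
best = thr[np.argmax(f2[:-1])]
print(f"chosen alert threshold: {best:.2f}")

# Early-warning step: flag students whose predicted risk exceeds the threshold,
# e.g., to surface them on a teacher dashboard or trigger early feedback.
flags = probs >= best
print(f"{flags.sum()} of {len(flags)} students would receive an early alert")
```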


2021 ◽  
Vol 38 (2) ◽  
pp. 243-257
Author(s):  
Paul Joseph-Richard ◽  
James Uhomoibhi ◽  
Andrew Jaffrey

Purpose: The aims of this study are to examine the affective responses of university students when viewing their own predictive learning analytics (PLA) dashboards, and to analyse how those responses are perceived to affect their self-regulated learning behaviour.

Design/methodology/approach: A total of 42 Northern Irish students were shown their own predicted status of academic achievement on a dashboard. A list of emotions with definitions was provided, and respondents were instructed to verbalise their emotions during the experience. Post-hoc walk-through conversations with participants further clarified their responses. Content analysis methods were used to categorise response patterns.

Findings: There is significant variation in the ways students respond to the predictions: they were curious and motivated; comforted and sceptical; confused and fearful; or uninterested and doubtful of the predictions' accuracy. The authors show that not all PLA-triggered affective states motivate students to act in desirable and productive ways.

Research limitations/implications: This small-scale exploratory study was conducted in one higher education institution with a relatively small sample of students in one discipline. Although specific efforts were made to include "at-risk" students alongside the many other categories of students in the study, none responded. A larger, multi-disciplinary sample that includes students categorised as "at-risk" could further enhance the understanding.

Practical implications: The authors provide mixed evidence for students' openness to learn from predictive learning analytics scores. The implications of the study are not straightforward, except to proceed with caution, valuing the benefits while ensuring that students' emotional well-being is protected through mindful implementation of PLA systems.

Social implications: Understanding students' affective responses contributes to the quality of student support in higher education institutions. In the current era of online learning and increasing adaptation to living and learning online, the findings allow appropriate strategies to be developed for implementing affect-aware PLA systems.

Originality/value: The current study is unique in its research context and in its examination of the immediate affective states experienced by students who viewed their predicted scores, based on their own dynamic learning data, in their home institution. It brings out the complexities involved in implementing student-facing PLA dashboards in higher education institutions.


2015 ◽  
Vol 21 (2) ◽  
pp. 247-262 ◽  
Author(s):  
Jay Bainbridge ◽  
James Melitski ◽  
Anne Zahradnik ◽  
Eitel J. M. Lauría ◽  
Sandeep Jayaprakash ◽  
...  

Author(s):  
Nick Dix ◽  
Andrew Lail ◽  
Matt Birnbaum ◽  
Joseph Paris

Institutions of higher education often use the term "at-risk" to label undergraduate students who have a higher likelihood of not persisting. However, it is not clear how the use of this label impacts the perspectives of the higher education professionals who serve and support these students. Our qualitative study explores the descriptions and understandings of higher education professionals who serve and support at-risk students. We use thematic analysis (Braun & Clarke, 2006) to interpret our data and develop our themes. These themes include conflicting views of the "at-risk" definition, attempts to normalize "at-risk," fostering relationships, and "at-promise."

