Learning Factor Models of Students at Risk of Failing in the Early Stage of Tertiary Education

2016 · Vol 3 (2) · pp. 330-372
Author(s): Geraldine Gray, Colm McGuinness, Philip Owende, Markus Hofmann

This paper reports on a study to predict students at risk of failing, based on data available prior to commencement of the first year of study. The study was conducted over three years, 2010 to 2012, on a student population drawn from a range of academic disciplines (n = 1,207). Data were gathered both from student enrolment records maintained by college administration and from an online, self-reporting learner-profiling tool administered during first-year student induction. Factors considered included prior academic performance, personality, motivation, self-regulation, learning approaches, age and gender. Models were trained on data from the 2010 and 2011 student cohorts and tested on data from the 2012 cohort. A comparison of eight classification algorithms found that k-NN achieved the best model accuracy (72%), although results from other models were similar, including ensembles (71%), a support vector machine (70%) and a decision tree (70%). Models of subgroups by age and discipline achieved higher accuracies but were affected by sample size; samples of n < 900 under-represented patterns in the dataset. Results showed that the factors most predictive of academic performance in the first year of tertiary education included age, prior academic performance and self-efficacy. The study indicated that early modelling of first-year students yielded informative, generalisable models that identified students at risk of failing.
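As a rough illustration of the cohort-based evaluation described above, the following Python sketch trains several of the named classifiers on the 2010 and 2011 cohorts and tests them on the 2012 cohort. The file name, the cohort and at_risk columns, and the feature list are assumptions for illustration, not the study's actual variables or pipeline.

import pandas as pd
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score

# Hypothetical enrolment and profiling data, one row per student
df = pd.read_csv("student_profiles.csv")
features = ["prior_academic_performance", "self_efficacy", "age"]  # illustrative subset of the factors

train = df[df["cohort"].isin([2010, 2011])]   # train on the 2010 and 2011 cohorts
test = df[df["cohort"] == 2012]               # test on the 2012 cohort

models = {
    "k-NN": KNeighborsClassifier(n_neighbors=5),
    "SVM": SVC(),
    "Decision tree": DecisionTreeClassifier(),
    "Ensemble (random forest)": RandomForestClassifier(),
}
for name, model in models.items():
    model.fit(train[features], train["at_risk"])
    acc = accuracy_score(test["at_risk"], model.predict(test[features]))
    print(f"{name}: test accuracy = {acc:.2f}")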

Author(s): Pilar Gandía Herrero, Agustín Romero Medina

The quality of academic performance and of learning outcomes depends on various factors, both psychological and contextual. The academic context includes the training activities and the type of evaluation or examination, which in turn influences cognitive and motivational factors such as learning and study approaches and self-regulation. In our university context, the predominant type of exam is the multiple-choice test. The cognitive requirement of these questions may vary: following Bloom's taxonomy, questions range from lower to higher cognitive demand, covering factual, conceptual and application knowledge, among others. Teachers normally do not take these classifications into account when preparing this type of exam. We propose here an adapted model for classifying multiple-choice questions according to cognitive requirement (associative memorisation, comprehension, application), and put it to the test by analysing an examination from a Psychology degree subject and relating the results to measures of learning approaches (ASSIST and R-SPQ-2F questionnaires) and self-regulation in a sample of 87 students. The results show differential academic performance according to the "cognitive" type of question, and differences in approaches to learning and self-regulation. The value of taking these cognitive-requirement factors into account when writing multiple-choice questions is underlined.
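A minimal sketch of one way to relate per-category exam scores to learning-approach measures, in the spirit of the analysis described above. The file names, item columns, category mapping and the deep_approach column are hypothetical; the study's actual scoring of the ASSIST and R-SPQ-2F questionnaires is not reproduced here.

import pandas as pd
from scipy.stats import pearsonr

exam = pd.read_csv("exam_item_scores.csv")       # one row per student, one column per question (hypothetical)
approaches = pd.read_csv("r_spq_2f_scores.csv")  # learning-approach scores per student (hypothetical)

# Hypothetical mapping of exam items to cognitive-requirement categories
categories = {
    "associative memorisation": ["item_1", "item_2", "item_3"],
    "comprehension": ["item_4", "item_5", "item_6"],
    "application": ["item_7", "item_8", "item_9"],
}

for category, items in categories.items():
    category_score = exam[items].mean(axis=1)    # mean score on that category's questions
    r, p = pearsonr(category_score, approaches["deep_approach"])
    print(f"{category} vs deep approach: r = {r:.2f}, p = {p:.3f}")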


2019 · Vol 9 (4) · pp. 265
Author(s): Chambers, Salter, Muldrow

First-year students who enter college pursuing a STEM degree still face challenges persisting through the STEM pipeline (Chen, 2013; Leu, 2017). In this case study, researchers examine the impact of a utilitarian scientific literacy-based academic intervention on the retention of first-year students in STEM, using a mixed methods approach. A sample (n = 116) of first-year students identified as at risk of not persisting in STEM were enrolled in a for-credit utilitarian scientific literacy course. Participants of the semester-long course were then compared with a control group of first-year students also identified as at risk of not persisting in STEM. A two-proportion z-test was performed to assess the difference in retention between the two groups, and participants of the course were given a survey to gauge student experiences. Quantitative results (φ = 0.34, p < 0.05) indicate that the utilitarian scientific literacy course had a statistically significant impact on retention among first-year students at risk of not persisting in STEM. Moreover, qualitative data obtained from participant responses describe internal and external growth as positive outcomes associated with the intervention.
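The two-proportion z-test and φ effect size reported above can be computed along the following lines. The retention counts below are placeholders, not the study's data, and the control group size is assumed equal to the intervention group for illustration.

import numpy as np
from scipy.stats import chi2_contingency
from statsmodels.stats.proportion import proportions_ztest

retained = np.array([85, 52])   # retained students: intervention group, control group (hypothetical counts)
totals = np.array([116, 116])   # group sizes (the intervention group had n = 116; control size assumed equal)

z_stat, p_value = proportions_ztest(retained, totals)

# Phi coefficient for the 2x2 retention table: phi = sqrt(chi2 / N)
table = np.column_stack([retained, totals - retained])
chi2, _, _, _ = chi2_contingency(table, correction=False)
phi = np.sqrt(chi2 / totals.sum())
print(f"z = {z_stat:.2f}, p = {p_value:.3f}, phi = {phi:.2f}")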


2019 · Vol 9 (3) · pp. 448
Author(s): Fredys Simanca, Rubén González Crespo, Luis Rodríguez-Baena, Daniel Burgos

Learning analytics (LA) has become a key area of study in educology, where it could assist in customising teaching and learning. Accordingly, it is precisely this data analysis technique that is used in a sensor—AnalyTIC—designed to identify students who are at risk of failing a course, and to prompt subsequent tutoring. This instrument provides the teacher and the student with the necessary information to evaluate academic performance by using a risk assessment matrix; the teacher can then customise any tutoring for a student having problems, as well as adapt the course contents. The sensor was validated in a study involving 39 students in the first term of the Environmental Engineering program at the Cooperative University of Colombia. Participants were all enrolled in an Algorithms course. Our findings led us to assert that it is vital to identify struggling students so that teachers can take corrective measures. The sensor was initially created based on the theoretical structure of the processes and/or phases of LA. A virtual classroom was built after these phases were identified, and the tool for applying the phases was then developed. After the tool was validated, it was established that students’ educational experiences are more dynamic when teachers have sufficient information for decision-making, and that tutoring and content adaptation boost the students’ academic performance.
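As a loose illustration of how a risk assessment matrix can map indicators to a tutoring priority, the sketch below classifies a student from a grade average and an activity-completion rate. The thresholds, the 0-5 grade scale and the level names are assumptions for illustration; they are not AnalyTIC's actual rules.

def risk_level(avg_grade: float, activity_rate: float) -> str:
    """Classify risk of failing from a grade average (assumed 0-5 scale, 3.0 passing)
    and the fraction of course activities completed (0-1)."""
    if avg_grade < 3.0 and activity_rate < 0.5:
        return "high risk"     # weak on both axes: prioritise tutoring
    if avg_grade < 3.0 or activity_rate < 0.5:
        return "medium risk"   # weak on one axis: monitor and adapt content
    return "low risk"

# Example: a student averaging 2.4/5 who has completed 40% of the activities
print(risk_level(2.4, 0.4))    # -> high risk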


2000 · Vol 75 (Supplement) · pp. S78-S80
Author(s): Scott A. Fields, Cynthia Morris, William L. Toffler, Edward J. Keenan

2021 · Vol 11 (22) · pp. 10546
Author(s): Serepu Bill-William Seota, Richard Klein, Terence van Zyl

The analysis of student performance involves data modelling that enables the formulation of hypotheses and insights about student behaviour and personality. We extract online behaviours as proxies for Extraversion and Conscientiousness, traits that have been shown to correlate with academic performance. The personality proxies we obtain yield significant (p < 0.05) population correlation coefficients between trait and grade: 0.846 for Extraversion and 0.319 for Conscientiousness. Furthermore, we demonstrate that a student's e-behaviour and personality can be used with deep learning (an LSTM) to predict and forecast whether a student is at risk of failing the year. The machine learning procedures followed in this study provide a methodology to timeously identify students who are likely to become at risk of poor academic performance. Using engineered online-behaviour and personality features, we classify students at risk with an agreement of κ = 0.51. Lastly, we show that we can design an intervention process using machine learning that supplements existing performance analysis and intervention methods. The methodology presented in this article provides metrics that measure the factors affecting student performance, and complements existing performance evaluation and intervention systems in education.
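A minimal sketch of the kind of LSTM classifier described above: each student is represented by a sequence of weekly online-behaviour feature vectors with a binary at-risk label. The array shapes, random placeholder data and hyperparameters are illustrative assumptions, not the authors' architecture or features.

import numpy as np
import tensorflow as tf

n_students, n_weeks, n_features = 500, 26, 8   # hypothetical dimensions
X = np.random.rand(n_students, n_weeks, n_features).astype("float32")  # placeholder e-behaviour sequences
y = np.random.randint(0, 2, size=n_students)                            # placeholder at-risk labels

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(n_weeks, n_features)),
    tf.keras.layers.LSTM(32),                        # summarise each student's behaviour over time
    tf.keras.layers.Dense(1, activation="sigmoid"),  # probability of being at risk of failing
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=10, batch_size=32, validation_split=0.2, verbose=0)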

