student modelling
Recently Published Documents

TOTAL DOCUMENTS: 67 (FIVE YEARS: 8)
H-INDEX: 10 (FIVE YEARS: 0)

Author(s): Sandra Katz, Patricia Albacete, Irene-Angelica Chounta, Pamela Jordan, Bruce M. McLaren, et al.

Author(s): Hongxin Yan, Fuhua Lin, Kinshuk

Abstract

Online education is growing because of the benefits and advantages it offers students. Educational technologies (e.g., learning analytics, student modelling, and intelligent tutoring systems) bring great potential to online education. Many online courses, particularly in self-paced online learning (SPOL), face inherent barriers related to learning awareness and academic intervention. These barriers can affect the academic performance of online learners. Recently, learning analytics has been shown to have great potential in removing these barriers. However, it is challenging to achieve the full potential of learning analytics with the traditional online course learning design model. Thus, focusing on SPOL, this study proposes that learning analytics should be included in the course learning design loop to ensure data collection and pedagogical connection. We propose a novel learning design–analytics model in which course learning design and learning analytics can support each other to increase learning success. Based on the proposed model, a set of online course design strategies is recommended for online educators who wish to use learning analytics to mitigate the learning barriers in SPOL. These strategies and technologies are inspired by Jim Greer’s work on student modelling. A computer science course is used as an example to show our initial practice of following these recommended design strategies and including learning analytics in the course learning design loop. Finally, future work on how to develop and evaluate learning-analytics-enabled learning systems is outlined.


IEEE Access, 2019, Vol. 7, pp. 128242–128262
Author(s): Farman Ali Khan, Awais Akbar, Muhammad Altaf, Shujaat Ali Khan Tanoli, Ayaz Ahmad

2018, Vol. 5 (2)
Author(s): Nigel Bosch, Luc Paquette

Metrics including Cohen’s kappa, precision, recall, and F1 are common measures of performance for models of discrete student states, such as a student’s affect or behaviour. This study examined discrete model metrics for previously published student model examples to identify situations where metrics provided differing perspectives on model performance. Simulated models also systematically showed the effects of imbalanced class distributions in both data and predictions, in terms of the values of metrics and the chance levels (values obtained by making random predictions) for those metrics. Random chance level for F1 was also established and evaluated. Results for example student models showed that over-prediction of the class of interest (positive class) was relatively common. Chance-level F1 was inflated by over-prediction; conversely, maximum possible values for F1 and kappa were negatively impacted by over-prediction of the positive class. Additionally, normalization methods for F1 relative to chance are discussed and compared to kappa, demonstrating an equivalence between kappa and normalized F1. Finally, implications of results for choice of metrics are discussed in the context of common student modelling goals, such as avoiding false negatives for student states that are negatively related to learning.
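The relationships the abstract describes can be sketched in a few lines of Python (a minimal illustration under stated assumptions, not the authors' code): for a random predictor whose predictions flag the positive class at rate q while the true base rate is p, expected precision is p and expected recall is q, so chance-level F1 is 2pq/(p+q); over-predicting the positive class (q > p) therefore inflates chance-level F1, and normalizing F1 against that chance level coincides with Cohen's kappa, as the abstract's equivalence result states.

```python
def f1_and_kappa(y_true, y_pred):
    """F1 for the positive class and Cohen's kappa from binary label lists."""
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))
    n = tp + fp + fn + tn
    f1 = 2 * tp / (2 * tp + fp + fn)
    po = (tp + tn) / n                        # observed agreement
    pe = ((tp + fp) * (tp + fn)               # agreement expected by chance
          + (fn + tn) * (fp + tn)) / (n * n)
    kappa = (po - pe) / (1 - pe)
    return f1, kappa

def chance_f1(p_true, p_pred):
    """Expected F1 of a random predictor: precision -> p_true, recall -> p_pred."""
    return 2 * p_true * p_pred / (p_true + p_pred)

# Over-predicting the positive class inflates chance-level F1:
# true base rate 0.2, predictions flagging 50% vs. 20% positive.
print(chance_f1(0.2, 0.5))   # ~0.286 (inflated)
print(chance_f1(0.2, 0.2))   # ~0.200

# Normalizing F1 against its chance level recovers kappa
# (toy labels, chance F1 computed from the matched base rates):
y_true = [1, 1, 1, 0, 0, 0, 0, 0]
y_pred = [1, 1, 0, 1, 1, 0, 0, 0]
f1, kappa = f1_and_kappa(y_true, y_pred)
cf1 = chance_f1(sum(y_true) / len(y_true), sum(y_pred) / len(y_pred))
norm_f1 = (f1 - cf1) / (1 - cf1)
print(abs(norm_f1 - kappa) < 1e-9)  # normalized F1 matches kappa
```

On the toy labels above, F1 = 4/7 while kappa = 0.25: F1 rewards the over-predicting model, whereas kappa and chance-normalized F1 agree once chance performance is accounted for.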

