On Developing a Framework for Knowledge-Based Learning Indicator System in the Context of Learning Analytics

Author(s):  
Rami Hodrob ◽  
Ahmed Ewais ◽  
Mohammed Maree


2016 ◽  
Vol 3 (1) ◽  
Author(s):  
Mohammad Khalil ◽  
Martin Ebner

Learning Analytics has secured its position as an important field in the educational sector. However, the large-scale collection, processing, and analysis of data have pushed the field beyond its original boundaries and exposed it to an abundance of ethical breaches and constraints. Revealing learners’ personal information, attitudes, and activities are major aspects that can lead to personally identifying individuals. Yet, de-identification can keep the process of Learning Analytics in progress while reducing the risk of inadvertent disclosure of learners’ identities. In this paper, the authors discuss de-identification methods in the context of learning environments and propose a first prototype of a conceptual approach that combines anonymization strategies with Learning Analytics techniques.
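The abstract names anonymization strategies without detailing them. As a minimal illustrative sketch of one common de-identification technique, pseudonymization via salted hashing, consider the following Python fragment (all identifiers and field names here are hypothetical, not taken from the paper):

```python
import hashlib
import secrets

def pseudonymize(learner_id: str, salt: bytes) -> str:
    """Replace a learner identifier with a salted hash so that records
    can still be linked for analysis without exposing the identity."""
    return hashlib.sha256(salt + learner_id.encode("utf-8")).hexdigest()

# One secret salt per dataset release; discarding the salt afterwards
# makes re-identification by dictionary attack far harder.
salt = secrets.token_bytes(16)

records = [
    {"learner": "alice@example.edu", "logins": 12},
    {"learner": "bob@example.edu", "logins": 7},
]
deidentified = [
    {"learner": pseudonymize(r["learner"], salt), "logins": r["logins"]}
    for r in records
]
```

The same learner always maps to the same pseudonym within one release, so longitudinal analysis remains possible, while the raw identifier never appears in the analytics dataset.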


Author(s):  
Kam Hou Vat

This chapter investigates an ethical mechanism of organizational measurement for student learning, based on learning analytics gathered from various learning-related activities over an extended period of time. In the context of today’s Web 2.0, such learning analytics are often collected from an electronic learning environment, such as a Web-based course management system (CMS), which provides various tools for scaffolding student learning: blogs, wikis, online forums, RSS feeds, and many other innovative resources that facilitate learning online. This mechanism, intended to be ethically sound, can be considered an instance of the accountability systems typically installed in institutions of higher education and/or secondary schools, which serve to gather evidence of student learning in a virtual learning environment involving the electronic presence of both teachers and students. It is understood that today’s university, as a higher education institution (HEI), must put such an accountability system in place to measure the student college experience, as part of its sustained commitment to continuous improvement in the quality of student learning. Yet, without the context of data analysis, the transformation of any existing accountability infrastructure to support assessment for student learning can hardly be innovated effectively, especially regarding the productivity and coordination of its staff, both academic and administrative. The question is how innovatively an HEI can establish such an accountability system to measure and assess student learning responsibly by collecting, analyzing, and interpreting the learning analytics designed into students’ various learning activities.


2014 ◽  
Vol 1 (3) ◽  
pp. 211-222 ◽  
Author(s):  
Noureddine Elouazizi

This paper identifies some of the main challenges of data governance modeling in the context of learning analytics for higher education institutions and discusses the critical factors for designing data governance models for learning analytics. It identifies three fundamental challenges that cut across any learning analytics data governance model: the ownership of the learning analytics data sets, the interpretation of those data sets, and the enactment of decision-making based on them. It also proposes a set of high-level requirements necessary for modeling data governance for learning analytics.


2020 ◽  
Vol 36 (6) ◽  
pp. 1-6
Author(s):  
Linda Corrin ◽  
Maren Scheffel ◽  
Dragan Gašević

The field of learning analytics has evolved over the past decade to provide new ways to view, understand, and enhance learning activities and environments in higher education. It brings together research and practice traditions from multiple disciplines to provide an evidence base to inform student support and effective design for learning. This has resulted in a plethora of ideas and research exploring how data can be analysed and utilised not only to inform educators, but also to drive online learning systems that offer personalised learning experiences and/or feedback for students. However, a core challenge that the learning analytics community continues to face is how the impact of these innovations can be demonstrated. Where impact is positive, there is a case for continuing or increasing the use of learning analytics; however, there is also the potential for negative impact, which needs to be identified quickly and managed. As more institutions implement strategies to take advantage of learning analytics as part of core business, it is important that impact can be evaluated and addressed to ensure effectiveness and sustainability. In this editorial of the AJET special issue dedicated to the impact of learning analytics in higher education, we consider what impact can mean in the context of learning analytics and what the field needs to do to ensure that there are clear pathways to impact that result in the development of systems, analyses, and interventions that improve the educational environment.


2020 ◽  
Vol 36 (6) ◽  
pp. 89-106
Author(s):  
Jo-Anne Clark ◽  
Yulin Liu ◽  
Pedro Isaias

Critical success factors (CSFs) have been around since the late 1970s and have been used extensively in information systems implementations. CSFs provide a comprehensive understanding of the multiple layers and dimensions of implementation success. In the specific context of learning analytics (LA), identifying CSFs can maximise the possibilities of an effective implementation and harness the value of converting data into actionable information. This paper proposes a framework that aims to identify and explore the CSFs for the implementation of LA within the higher education sector by examining the viewpoints of higher education professionals. To obtain a rounded insight into stakeholders’ perceptions, we conducted a mixed-method inquiry with factor analysis, profile analysis, and thematic analysis of both quantitative and qualitative data collected with an online questionnaire from an international sample. The responses validate five CSFs of LA implementation: strategy and policy at the organisational level, information technological readiness, performance and impact evaluation, people’s skills and expertise, and data quality. Results also disclose diverse views about the CSFs’ priorities and the associated difficulties and achievements.

Implications for practice or policy:
- Higher education practitioners should consider CSFs when implementing an LA initiative.
- This study validates five dimensions of the CSFs of implementing LA in higher education.
- The validated framework enumerates several factors within each of the main dimensions for achieving optimum results.
- Stakeholders have diverse opinions about the priorities of CSFs, particularly regarding organisational commitment, data quality, and human capital.


2017 ◽  
Vol 38 (3) ◽  
pp. 133-143 ◽  
Author(s):  
Danny Osborne ◽  
Yannick Dufresne ◽  
Gregory Eady ◽  
Jennifer Lees-Marshment ◽  
Cliff van der Linden

Abstract. Research demonstrates that the negative relationship between Openness to Experience and conservatism is heightened among the informed. We extend this literature using national survey data (Study 1; N = 13,203) and data from students (Study 2; N = 311). As predicted, education – a correlate of political sophistication – strengthened the negative relationship between Openness and conservatism (Study 1). Study 2 employed a knowledge-based measure of political sophistication to show that the Openness × Political Sophistication interaction was restricted to the Openness aspect of Openness. These studies demonstrate that knowledge helps people align their ideology with their personality, but that the Openness × Political Sophistication interaction is specific to one aspect of Openness – nuances that are overlooked in the literature.

