Introduction to Section Two: MOOCs, Psychological Constructs, Communication Behaviors

2016 ◽  
Vol 20 (2) ◽  
Author(s):  
Peter Shea

This issue of Online Learning also contains four articles outside the theme of learning analytics. This section contains papers investigating MOOCs, a comparison of anxiety levels and the “imposter phenomenon” between online and classroom students, and a qualitative analysis of information behaviors among online students.

2012 ◽  
Vol 16 (3) ◽  
Author(s):  
Laurie P Dringus

This essay presents a prospective stance on how learning analytics, as a core evaluative approach, must help instructors uncover important trends and evidence of quality learner data in the online course. A critique is presented of strategic and tactical issues of learning analytics, approached through the lens of questioning the current status of applying learning analytics to online courses. The goal of the discussion is twofold: (1) to inform online learning practitioners (e.g., instructors and administrators) of the potential of learning analytics in online courses and (2) to broaden discussion in the research community about the advancement of learning analytics in online learning. In recognizing the full potential of formalizing big data in online courses, the community must also address this issue in the context of the potentially "harmful" application of learning analytics.


2020 ◽  
Vol 10 (24) ◽  
pp. 9148
Author(s):  
Germán Moltó ◽  
Diana M. Naranjo ◽  
J. Damian Segrelles

Cloud computing instruction requires hands-on experience with a myriad of distributed computing services from a public cloud provider. Tracking the progress of students, especially in online courses, requires automatically gathering evidence and producing learning analytics in order to further determine student behavior and performance. With this aim, this paper describes the experience from an online course in cloud computing with Amazon Web Services of creating an open-source data processing tool to systematically obtain learning analytics related to the hands-on activities carried out throughout the course. These data, combined with data obtained from the learning management system, have allowed a better characterization of student behavior in the course. Insights from a population of more than 420 online students across three academic years have been assessed, and the dataset has been released for increased reproducibility. The results corroborate that course length has an impact on online student dropout. In addition, a gender analysis indicated that there are no statistically significant differences in final marks between genders, but women show a greater degree of commitment to the activities planned in the course.
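The gender comparison of final marks described above can be illustrated with a minimal sketch of a two-sample significance test. The numbers and the choice of Welch's t-test are assumptions for illustration only, not the paper's actual data or method:

```python
# Hypothetical sketch: testing for a difference in final marks between two
# groups, as the study does for gender. All values are invented.
from scipy import stats

marks_group_a = [72, 85, 90, 65, 78, 88, 74, 81]
marks_group_b = [70, 86, 92, 68, 80, 85, 77, 83]

# Welch's t-test does not assume equal variances between the two groups.
t_stat, p_value = stats.ttest_ind(marks_group_a, marks_group_b, equal_var=False)

# A p-value above 0.05 would be consistent with the paper's finding of no
# statistically significant difference in final marks.
print(p_value > 0.05)
```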


2021 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Yating Li ◽  
Chi Zhou ◽  
Di Wu ◽  
Min Chen

Purpose: Advances in information technology now permit the recording of massive and diverse process data, thereby making data-driven evaluations possible. This study discusses whether teachers' information literacy can be evaluated based on their online information behaviors on online learning and teaching platforms (OLTPs).

Design/methodology/approach: First, to evaluate teachers' information literacy, process data from teachers on an OLTP were combined to describe nine third-level indicators across the richness, diversity, usefulness and timeliness analysis dimensions. Second, propensity score matching (PSM) and difference tests were used to analyze the differences between the performance groups with reduced selection bias. Third, to effectively predict the information literacy score of each teacher, four sets of input variables were used for prediction using supervised learning models.

Findings: The results show that the high-performance group performs better than the low-performance group on 6 indicators. In addition, information-based teaching and behavioral research data best reflect the level of information literacy. In the future, deeper explorations are needed with richer online information behavioral data and a more effective evaluation model to increase evaluation accuracy.

Originality/value: The evaluation based on online information behaviors has concrete application scenarios, positively correlated results and prediction interpretability. Therefore, information literacy evaluations based on behaviors have great potential and favorable prospects.
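The propensity score matching step named in the abstract can be sketched in two stages: estimate each unit's probability of belonging to one group from covariates, then pair units across groups with the closest scores. The data, covariates and nearest-neighbour matching rule below are invented for illustration; the study's actual procedure may differ:

```python
# Hedged sketch of propensity score matching (PSM) to reduce selection
# bias before comparing two groups. All data are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 200
covariates = rng.normal(size=(n, 3))                    # e.g. invented teacher attributes
treated = (covariates[:, 0] + rng.normal(size=n) > 0).astype(int)

# 1. Estimate propensity scores: P(group == 1 | covariates).
model = LogisticRegression().fit(covariates, treated)
scores = model.predict_proba(covariates)[:, 1]

# 2. Nearest-neighbour matching: pair each treated unit with the control
#    unit whose propensity score is closest.
treated_idx = np.where(treated == 1)[0]
control_idx = np.where(treated == 0)[0]
pairs = [(i, control_idx[np.argmin(np.abs(scores[control_idx] - scores[i]))])
         for i in treated_idx]

print(len(pairs) == len(treated_idx))  # every treated unit gets a match
```

Group differences are then computed over the matched pairs rather than the raw samples, which is what reduces the selection bias.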


2021 ◽  
Author(s):  
Jay Liebowitz

Author(s):  
D. Thammi Raju ◽  
G. R. K. Murthy ◽  
S. B. Khade ◽  
B. Padmaja ◽  
B. S. Yashavanth ◽  
...  

Building an effective online course requires an understanding of learning analytics. The study assumes significance in the COVID-19 pandemic situation, as there has been a sudden surge in online courses. Analysis of the online course using data generated from the Moodle Learning Management System (LMS), Google Forms and Google Analytics was carried out to understand the tenets of an effective online course. About 515 learners participated in the initial pre-training needs and expectations survey and 472 learners gave feedback at the end, apart from the real-time data generated by the LMS and Google Analytics during the course period. This case study analysed online learning behaviour and the supporting learning environment, and suggests critical factors that should take centre stage in the design and development of online courses, leading to an improved online learning experience and thus to higher quality education. User needs, quality of resources and effectiveness of the online course are equally important in learners' decisions to take further online courses.


2016 ◽  
Vol 45 (2) ◽  
pp. 165-187 ◽  
Author(s):  
Florence Martin ◽  
Abdou Ndoye ◽  
Patricia Wilkins

Quality Matters is recognized as a rigorous set of standards that guides designers and instructors in creating quality online courses. We explore how Quality Matters standards guide the identification and analysis of learning analytics data to monitor and improve online learning. Descriptive data were collected for frequency of use, time spent, and performance, and analyzed to identify patterns and trends in how students interact with online course components based on the Quality Matters standards. The major findings of this article provide a framework and guidance for instructors on how data might be collected and analyzed to improve online learning effectiveness.
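The descriptive measures this abstract names (frequency of use and time spent per course component) amount to a simple aggregation over activity logs. The column names and records below are invented for illustration of the kind of analysis described:

```python
# Illustrative sketch: aggregating LMS activity logs per course component.
# All column names and records are invented.
import pandas as pd

log = pd.DataFrame({
    "student":   ["s1", "s1", "s2", "s2", "s3"],
    "component": ["quiz", "forum", "quiz", "quiz", "forum"],
    "minutes":   [12, 30, 8, 15, 22],
})

summary = log.groupby("component").agg(
    accesses=("student", "count"),          # frequency of use
    total_minutes=("minutes", "sum"),       # time spent
    unique_students=("student", "nunique"), # reach of the component
)
print(summary)
```

An instructor could then compare such summaries against the expectations set by the relevant Quality Matters standard for each component.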


Author(s):  
Michelle Kilburn ◽  
Martha Henckell ◽  
David Starrett

Identifying the positive attributes of students and instructors in the online environment will contribute to the understanding of how we can enhance the learning experience for the student and the teaching experience for the instructor. This article will assist students and instructors in understanding the differences that may be experienced in the online environment versus the face-to-face environment and provide the opportunity to consider whether online learning and/or teaching is a “good fit” for them. Understanding why students and/or instructors might choose the online environment will also assist administrators in developing successful, quality online programs that enrich the experiences for both students and instructors.


2014 ◽  
Vol 31 ◽  
pp. 542-550 ◽  
Author(s):  
Ángel F. Agudo-Peregrina ◽  
Santiago Iglesias-Pradas ◽  
Miguel Ángel Conde-González ◽  
Ángel Hernández-García

2020 ◽  
Vol 9 (2) ◽  
pp. 231
Author(s):  
Justina Naujokaitienė ◽  
Giedrė Tamoliūnė ◽  
Airina Volungevičienė ◽  
Josep M. Duart

Student engagement is one of the most relevant topics within the academic and research community nowadays. Higher education curriculum, teaching and learning integrate new technology-supported learning solutions. New methods and tools enhance teacher and learner interactions and positively influence learner engagement. This research addresses the need to explore new ways of improving teaching practices to better engage students with the help of learning analytics. The paper investigates how university teachers use data from learning analytics to observe learners and to engage them in online learning. Qualitative inquiry was chosen to approach the research problem, and semi-structured interviews with teachers using (blended) online learning were conducted to explore teachers' practices in observing students' behaviour and engagement online, disclosing teachers' abilities to understand the challenging learner engagement process based on data from learning analytics. The new evidence provided by this research highlights successful practices in the use of learning analytics data to observe students' behaviour and engagement and to inform teachers of the presence needed to develop learner-centred activities and to make curriculum changes. A limitation of this study is that the different online teaching experiences of the research participants might have restricted their understanding of the use of LA data for curriculum development and learner engagement.

