Teaching and learning analytics to support teacher inquiry

Author(s):  
Demetrios Sampson
Author(s):  
Denis Dennehy ◽  
Kieran Conboy ◽  
Jaganath Babu

Understanding student sentiment plays a vital role in understanding the changes that could or should be made in curriculum design at university. Learning Analytics (LA) has shown potential for improving student learning experiences and supporting teacher inquiry. Yet, there is limited research that reports on the adoption and actual use of LA to support teacher inquiry. This four-year longitudinal study captures the sentiment of postgraduate students at a university in Ireland by integrating LA with the steps of teacher inquiry. The study makes three important contributions to the teaching and learning literature. First, it reports on the use of LA to support teacher inquiry over four one-year cycles of a Master of Science in Business Analytics programme between 2016 and 2020. Second, it provides evidence-based recommendations on how to optimise LA to support teacher inquiry, with specific attention to how these can improve the assimilation of LA into curriculum design and delivery. Third, the paper concludes with a research agenda to help improve the adoption and integration of LA in the future.
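The study above does not publish its analysis pipeline, but the core idea of scoring free-text student feedback for sentiment can be sketched with a simple lexicon-based scorer. This is a minimal illustration with invented word lists, not the study's actual method or data:

```python
# Minimal sketch of lexicon-based sentiment scoring of student feedback.
# The word lists and example comments are illustrative, not the study's data.
POSITIVE = {"helpful", "clear", "engaging", "useful", "enjoyed"}
NEGATIVE = {"confusing", "boring", "rushed", "unclear", "difficult"}

def sentiment_score(comment: str) -> float:
    """Return a score in [-1, 1]: share of positive minus negative words."""
    words = [w.strip(".,!?").lower() for w in comment.split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    total = pos + neg
    return 0.0 if total == 0 else (pos - neg) / total

comments = [
    "The labs were engaging and the feedback was helpful",
    "Lectures felt rushed and the assessment brief was unclear",
]
for c in comments:
    print(f"{sentiment_score(c):+.2f}  {c}")
```

In practice a longitudinal study like this would use a trained sentiment model rather than fixed word lists, but the shape of the output (one score per comment, aggregated per cycle) is the same.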


Author(s):  
Jacqueline Mayumi Akazaki ◽  
Leticia Rocha Machado ◽  
Ketia Kellen Araújo da Silva ◽  
Patricia Alejandra Behar

Virtual courses are increasingly being offered in Brazil, making it imperative to develop technological resources and research to support the teaching and learning processes in this modality. One approach is to analyze students' socio-affective profiles in Virtual Learning Environments (VLE). The co-operative learning network (ROODA) VLE has two features, the Social Map (SM) and the Affective Map (AM), which both contribute to the visualization of data on social interaction indicators and students' moods in the environment. The SM presents the social relations formed in the environment through indicators (absence, collaboration, distance from the class, evasion, informal groups, and popularity), enabling the identification of the participating subjects in the form of sociograms. The AM identifies students' moods graphically through indicators (excitement, discouragement, satisfaction, and dissatisfaction). Thus, this article aims to map the possible recurrent socio-affective scenarios in a VLE using Learning Analytics (LA). LA is defined as the measurement, collection, analysis, and reporting of data about students and their contexts in order to understand and optimize learning and the environments in which it occurs. It can also contribute to understanding a student's learning profile based on social and affective aspects, allowing the teacher to develop pedagogical strategies consistent with the needs of each subject. The importance of integrating the possible social and affective scenarios was verified using LA, making it possible to deepen the comprehension of the subjective and qualitative questions regarding students' interactions in the VLE. In this study, scenarios are understood as the intersection between the Affective Map and Social Map indicators identified in a VLE.
The study has both a qualitative and a quantitative approach. It is qualitatively justified because the research object involves social and affective phenomena subjectively expressed in texts and social interactions manifested in the ROODA VLE; it is quantitatively justified by the need to measure the mapping of socio-affective indicators through social parameters and moods by applying LA. The subjects were undergraduate students who participated in distance learning courses at a Brazilian public university that used the ROODA VLE in the second semester of 2019. Data were collected from the social and affective maps to identify whether there was a relationship between them. As a result, based on the existing indicators of social interactions and moods, socio-affective indicators were created using LA in order to analyze students' behavior in relation to the forms of interaction and communication that occur in the ROODA VLE.
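Since a scenario is defined here as the intersection of Social Map and Affective Map indicators, the mapping step can be sketched as crossing each student's two indicator sets. The indicator names follow the abstract; the student records are invented for illustration:

```python
# Hypothetical sketch: crossing Social Map (SM) and Affective Map (AM)
# indicators to form socio-affective "scenarios", in the spirit of the
# ROODA maps described above. Student data is invented.
social_map = {
    "ana":   {"collaboration", "popularity"},
    "bruno": {"distance from the class", "absence"},
}
affective_map = {
    "ana":   {"excitement", "satisfaction"},
    "bruno": {"discouragement"},
}

def socio_affective_scenarios(sm, am):
    """Pair each student's social indicators with their mood indicators."""
    return {
        student: sorted((s, a) for s in sm[student] for a in am.get(student, ()))
        for student in sm
    }

print(socio_affective_scenarios(social_map, affective_map))
```

Each (social, affective) pair is one candidate scenario; in the study these intersections are what the teacher inspects to choose pedagogical strategies.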


2020 ◽  
Vol 2 (1) ◽  
pp. 42
Author(s):  
Steve Leichtweis

Universities are increasingly expected to ensure student success while at the same time delivering larger courses. Within this environment, providing effective and timely feedback to students and creating opportunities for genuine engagement between teachers and students is increasingly difficult, if not impossible, for many instructors, despite the known value and importance of feedback (Hattie & Timperley, 2007) and instructor presence (Garrison, Anderson, & Archer, 2010). Similar to other tertiary institutions, the University of Auckland has adopted various technology-enhanced learning approaches and technologies, including learning analytics, in an attempt to support teaching and learning at scale. The increased use of educational technology to support learning provides a variety of data sources for teachers to provide personalized feedback and improve the overall learning experience for students. This workshop is targeted at teachers interested in the use of learning data to provide personalized support to learners. Participants will have a hands-on opportunity to use the open-source tool OnTask (Pardo et al., 2018) within some common teaching scenarios with a synthetically generated data set. The facilitators will also share and discuss how OnTask is currently being used in universities to support student experience, teaching practice, and course design. As this is a hands-on workshop, participants must bring a laptop computer to work with the online tool and the prepared scenarios.

References

Garrison, D. R., Anderson, T., & Archer, W. (2010). The first decade of the community of inquiry framework: A retrospective. The Internet and Higher Education, 13(1-2), 5-9.
Hattie, J., & Timperley, H. (2007). The power of feedback. Review of Educational Research, 77(1), 81-112.
Pardo, A., Bartimote-Aufflick, K., Shum, S. B., Dawson, S., Gao, J., Gašević, D., Leichtweis, S., Liu, D., Martínez-Maldonado, R., Mirriahi, N., & Moskal, A. C. M. (2018). OnTask: Delivering Data-Informed, Personalized Learning Support Actions. Journal of Learning Analytics, 5(3), 235-249.
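OnTask itself is a web application, so the snippet below is not its API; it is only a sketch of the core idea the workshop exercises: conditions evaluated over a student data table select which personalized message each student receives. Column names, thresholds, and messages are all invented:

```python
# Illustrative rule-based personalized feedback, in the style of the
# condition -> message workflow described above. Not OnTask's actual API;
# the data columns and rules are invented.
students = [
    {"name": "Mia",  "quiz_avg": 82, "videos_watched": 9},
    {"name": "Liam", "quiz_avg": 41, "videos_watched": 2},
]

RULES = [
    (lambda s: s["quiz_avg"] < 50,
     "Your quiz average is below 50%. Please review the worked examples."),
    (lambda s: s["videos_watched"] < 4,
     "You have watched few of the lecture videos; they cover exam material."),
    (lambda s: True,
     "Great progress so far. Keep it up!"),
]

def personalized_message(student):
    """Return the first message whose condition matches the student."""
    for condition, message in RULES:
        if condition(student):
            return f"Dear {student['name']}, {message}"

for s in students:
    print(personalized_message(s))
```

The first-matching-rule design keeps each student's message unambiguous; a teacher tunes the rules and message text, and the tool handles the merge and delivery at class scale.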


2017 ◽  
Vol 21 (3) ◽  
Author(s):  
Holly McKee

With the widespread use of learning analytics tools, there is a need to explore how these technologies can be used to enhance teaching and learning. Little research has been conducted on what human processes are necessary to facilitate meaningful adoption of learning analytics. The research problem is that there is a lack of evidence-based guidance on how instructors can effectively implement learning analytics to support students with the purpose of improving learning outcomes. The goal was to develop and validate a model to guide instructors in the implementation of learning analytics tools. Using design and development research methods, an implementation model was constructed and validated internally. The themes that emerged fell into the categories of adoption and caution: six themes fell under adoption (LA as evidence, reaching out, frequency, early identification/intervention, self-reflection, and aligning LA with pedagogical intent) and three fell under caution (skepticism, fear of overdependence, and questions of usefulness). The model should enhance instructors' use of learning analytics by enabling them to better take advantage of available technologies to support teaching and learning in online and blended learning environments. Researchers can further validate the model by studying its usability (i.e., usefulness, effectiveness, efficiency, and learnability), as well as how instructors' use of this model to implement learning analytics in their courses affects retention, persistence, and performance.


Author(s):  
Yaëlle Chaudy ◽  
Thomas M. Connolly

Assessment is a crucial aspect of any teaching and learning process. New tools such as educational games offer promising advantages: they can personalize feedback to students and save educators time by automating the assessment process. However, while many teachers agree that educational games increase motivation, learning, and retention, few are ready to fully trust them as an assessment tool. A likely reason behind this lack of trust is that educational games are distributed as black boxes, unmodifiable by educators and not providing enough insight into the gameplay. This chapter presents three systematic literature reviews looking into the integration of assessment, feedback, and learning analytics in educational games. It then proposes a framework and presents a fully developed engine. The engine is used by both developers and educators. Designed to separate the game from the assessment, it allows teachers to modify the assessment after distribution and to visualize gameplay data via a learning analytics dashboard.
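The key design idea, separating the game from the assessment, can be sketched as a gameplay event log scored against an external, teacher-editable spec. This is not the chapter's engine; event names and the spec format are invented to illustrate the decoupling:

```python
# Sketch of game/assessment separation: the game only emits events,
# while the scoring criteria live in a separate spec a teacher could
# edit after distribution. All names and values here are invented.
gameplay_events = [
    {"player": "p1", "event": "puzzle_solved", "attempts": 1},
    {"player": "p1", "event": "puzzle_solved", "attempts": 4},
    {"player": "p1", "event": "hint_used"},
]

# Teacher-editable assessment spec, decoupled from game code.
assessment_spec = {
    "points_per_solve": 10,
    "penalty_per_extra_attempt": 2,
    "penalty_per_hint": 1,
}

def assess(events, spec):
    """Score a player's event log against the external spec."""
    score = 0
    for e in events:
        if e["event"] == "puzzle_solved":
            score += spec["points_per_solve"]
            score -= spec["penalty_per_extra_attempt"] * (e["attempts"] - 1)
        elif e["event"] == "hint_used":
            score -= spec["penalty_per_hint"]
    return score

print(assess(gameplay_events, assessment_spec))  # 10 + 4 - 1 = 13
```

Because the spec is data rather than game code, changing how hints or retries are weighted requires no rebuild of the game, which is exactly the modifiability the black-box distribution model lacks.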


Author(s):  
M. Govindarajan

Educational data mining (EDM) has a high impact in the academic domain. EDM is concerned with developing new methods to discover knowledge from educational and academic databases and can be used for decision making in educational and academic systems. EDM is useful in many different areas, including identifying at-risk students, identifying priority learning needs for different groups of students, increasing graduation rates, effectively assessing institutional performance, maximizing campus resources, and optimizing subject curriculum renewal. This chapter discusses educational data mining, its applications, and the techniques that have to be adopted in order to successfully employ educational data mining and learning analytics for improving teaching and learning. The techniques and applications discussed in this chapter will give educational data mining researchers a clear idea of how to carry out their work in this field.
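One of the EDM tasks listed above, identifying at-risk students, can be illustrated with a simple threshold rule over student records. Real EDM work would train predictive models on institutional data; the records, columns, and cutoffs here are invented:

```python
# Hedged sketch of at-risk student identification via threshold rules.
# Illustrative only: real EDM systems fit models (e.g. classifiers) to
# institutional data rather than using hand-picked cutoffs.
records = [
    {"student": "s1", "attendance": 0.95, "assignment_avg": 78},
    {"student": "s2", "attendance": 0.55, "assignment_avg": 44},
    {"student": "s3", "attendance": 0.80, "assignment_avg": 52},
]

def at_risk(rec, min_attendance=0.7, min_avg=50):
    """Flag a student when attendance or grades fall below thresholds."""
    return rec["attendance"] < min_attendance or rec["assignment_avg"] < min_avg

flagged = [r["student"] for r in records if at_risk(r)]
print(flagged)  # ['s2']
```

The same flag-then-intervene pattern underlies the other applications named above, with the thresholds replaced by learned decision boundaries.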


2022 ◽  
pp. 1803-1846
Author(s):  
Yaëlle Chaudy ◽  
Thomas M. Connolly

Assessment is a crucial aspect of any teaching and learning process. New tools such as educational games offer promising advantages: they can personalize feedback to students and save educators time by automating the assessment process. However, while many teachers agree that educational games increase motivation, learning, and retention, few are ready to fully trust them as an assessment tool. A likely reason behind this lack of trust is that educational games are distributed as black boxes, unmodifiable by educators and not providing enough insight into the gameplay. This chapter presents three systematic literature reviews looking into the integration of assessment, feedback, and learning analytics in educational games. It then proposes a framework and presents a fully developed engine. The engine is used by both developers and educators. Designed to separate the game from the assessment, it allows teachers to modify the assessment after distribution and to visualize gameplay data via a learning analytics dashboard.

