Eye Movement Analysis of Digital Learning Content for Educational Innovation

2020 · Vol 12 (6) · pp. 2518
Author(s):  
Xiaolong Liu ◽  
Xuebai Zhang ◽  
Wei-Wen Chen ◽  
Shyan-Ming Yuan

Eye movement technology is highly valued for evaluating and improving digital learning content. In this paper, an educational innovation study of eye movement behaviors on digital learning content is presented. We propose three new eye movement metrics to characterize eye movement behaviors. In the proposed method, the digital content, consisting of slide-deck-like works, was classified into page categories according to the characteristics of each page. We interpreted the subjects' eye movement behaviors on the digital slide decks. After data regularization and filtering, the results were analyzed to provide guidance on how to design attractive digital learning content from the viewpoint of eye movement behaviors. The relationships between the subjects' evaluation scores, page categories, and eye movement metrics are discussed. The results demonstrate that the proposed fixation time percentage (FTP) is a representative, strong, and stable eye movement metric for measuring the subjects' interest. Moreover, a reasonable proportion of semantic content had a positive influence on the subjects' interest.
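The abstract names the fixation time percentage (FTP) metric but does not give its formula. A minimal sketch, under the assumption that FTP is the share of page-viewing time spent in fixations (the function name and definition are illustrative, not the authors' exact formulation):

```python
def fixation_time_percentage(fixation_durations_ms, total_viewing_ms):
    """Fixation time percentage (FTP), expressed as a percentage.

    Assumed definition for illustration: total fixation duration on a
    page divided by the total viewing time for that page, times 100.
    """
    if total_viewing_ms <= 0:
        raise ValueError("total viewing time must be positive")
    return 100.0 * sum(fixation_durations_ms) / total_viewing_ms
```

For example, three fixations totalling 1500 ms during a 2000 ms page view would give an FTP of 75%.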

Intelligence · 1984 · Vol 8 (3) · pp. 205-238
Author(s):  
Charles E. Bethell-Fox ◽  
David F. Lohman ◽  
Richard E. Snow

Author(s):  
Gavindya Jayawardena ◽  
Sampath Jayarathna

Eye-tracking experiments involve areas of interest (AOIs) for the analysis of eye gaze data. While there are tools to delineate AOIs for extracting eye movement data, they may require users to manually draw AOI boundaries on eye-tracking stimuli or to use markers to define AOIs. This paper introduces two novel techniques to dynamically filter eye movement data from AOIs for the analysis of eye metrics at multiple levels of granularity. The authors incorporate pre-trained object detectors and object instance segmentation models for offline detection of dynamic AOIs in video streams. This research presents the implementation and evaluation of object detectors and object instance segmentation models to find the best model to integrate into a real-time eye movement analysis pipeline. The authors filter gaze data that falls within the polygonal boundaries of detected dynamic AOIs and apply an object detector to find bounding boxes in a public dataset. The results indicate that the dynamic AOIs generated by object detectors capture 60% of eye movements, and object instance segmentation models capture 30% of eye movements.
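The filtering step described above — keeping only gaze samples that fall inside the polygonal boundary of a detected AOI — can be sketched with a standard ray-casting point-in-polygon test. This is a generic sketch; the function names are illustrative and not taken from the authors' pipeline:

```python
def point_in_polygon(x, y, polygon):
    """Ray-casting test: is the gaze point (x, y) inside the polygon?

    polygon: list of (x, y) vertices in order (clockwise or counter-clockwise).
    """
    inside = False
    n = len(polygon)
    j = n - 1
    for i in range(n):
        xi, yi = polygon[i]
        xj, yj = polygon[j]
        # count edge crossings of a horizontal ray extending from (x, y)
        if (yi > y) != (yj > y) and x < (xj - xi) * (y - yi) / (yj - yi) + xi:
            inside = not inside
        j = i
    return inside

def filter_gaze_to_aoi(gaze_points, aoi_polygon):
    """Keep only gaze samples (x, y) that fall within the AOI boundary."""
    return [p for p in gaze_points if point_in_polygon(p[0], p[1], aoi_polygon)]
```

With dynamic AOIs, the polygon would be updated per video frame from the segmentation model's output mask before filtering that frame's gaze samples.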


2014 · Vol 7 (1)
Author(s):  
Vassilios Krassanakis ◽  
Vassiliki Filippakopoulou ◽  
Byron Nakos

Eye movement recordings and their analysis constitute an effective way to examine visual perception. Dedicated computer software is needed to perform such data analysis. The present study describes the development of a new toolbox, called EyeMMV (Eye Movements Metrics & Visualizations), for post-experimental eye movement analysis. The detection of fixation events is performed using a newly introduced algorithm based on a two-step spatial dispersion threshold. Furthermore, EyeMMV is designed to support all well-known eye-tracking metrics and visualization techniques. The results of the fixation identification algorithm are compared with those of a dispersion-type algorithm with a moving window, implemented in another open-source analysis tool. The comparison produces outputs that are strongly correlated. The EyeMMV software is developed using the scripting language of MATLAB, and the source code is distributed through GitHub under the third version of the GNU General Public License (link: https://github.com/krasvas/EyeMMV).
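The dispersion-type algorithm with a moving window that the abstract compares against is commonly known as I-DT. A minimal generic sketch of that approach follows — this is not EyeMMV's two-step algorithm, and the default thresholds are illustrative, not the tool's parameters:

```python
def dispersion(window):
    """Spatial dispersion of a sample window: (max-min) in x plus (max-min) in y."""
    xs = [p[1] for p in window]
    ys = [p[2] for p in window]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))

def idt_fixations(samples, max_dispersion=30.0, min_duration=100.0):
    """Dispersion-threshold fixation detection with a moving window (generic I-DT).

    samples: list of (t_ms, x_px, y_px) gaze samples ordered by time.
    Returns a list of (centroid_x, centroid_y, duration_ms) fixations.
    """
    fixations = []
    i, n = 0, len(samples)
    while i < n:
        # grow the window until it spans at least min_duration
        j = i
        while j < n and samples[j][0] - samples[i][0] < min_duration:
            j += 1
        if j >= n:
            break
        if dispersion(samples[i:j + 1]) <= max_dispersion:
            # extend the window while dispersion stays under the threshold
            while j + 1 < n and dispersion(samples[i:j + 2]) <= max_dispersion:
                j += 1
            window = samples[i:j + 1]
            xs = [p[1] for p in window]
            ys = [p[2] for p in window]
            fixations.append((sum(xs) / len(xs), sum(ys) / len(ys),
                              window[-1][0] - window[0][0]))
            i = j + 1
        else:
            i += 1  # slide the window forward one sample
    return fixations
```

Samples within a low-dispersion window are merged into one fixation reported by its centroid; everything else is treated as saccadic movement and skipped.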


2013 · Vol 49 (Supplement) · pp. S182-S183
Author(s):  
Yasuo OKA ◽  
Iwataro OKA ◽  
Chieko NARITA ◽  
Yuka TAKAI ◽  
Akihiko GOTO ◽  
...  