Eye movement analysis with hidden Markov models (EMHMM) with co-clustering

Author(s):  
Janet H. Hsiao ◽  
Hui Lan ◽  
Yueyuan Zheng ◽  
Antoni B. Chan

The eye movement analysis with hidden Markov models (EMHMM) method provides quantitative measures of individual differences in eye-movement patterns. However, it is limited to tasks in which stimuli share the same feature layout (e.g., faces). Here we propose combining EMHMM with co-clustering, a data-mining technique, to discover participant groups with consistent eye-movement patterns across stimuli in tasks involving stimuli with different feature layouts. By applying this method to eye movements in scene perception, we discovered explorative (switching between foreground and background information or between different regions of interest) and focused (mainly looking at the foreground with less switching) eye-movement patterns among Asian participants. Higher similarity to the explorative pattern predicted better foreground-object recognition performance, whereas higher similarity to the focused pattern was associated with better feature integration in the flanker task. These results have important implications for using eye tracking as a window into individual differences in cognitive abilities and styles. Thus, EMHMM with co-clustering provides quantitative assessments of eye-movement patterns across stimuli and tasks. It can be applied to many other real-life visual tasks, making a significant impact on the use of eye tracking to study cognitive behavior across disciplines.
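In EMHMM, each participant's fixation sequences are summarized by a Gaussian-emission hidden Markov model, and similarity to a group's representative pattern is measured by the log-likelihood of the participant's data under that group's HMM. As a minimal numpy sketch (not the authors' MATLAB toolbox; the isotropic emissions and function names are illustrative assumptions), the forward algorithm computes such a log-likelihood for one fixation sequence:

```python
import numpy as np

def gaussian_logpdf(x, mean, var):
    """Log density of a 2-D isotropic Gaussian emission (shared variance var)."""
    d = x - mean
    return -0.5 * (2 * np.log(2 * np.pi * var) + np.sum(d * d) / var)

def hmm_loglik(fixations, pi, A, means, var):
    """Log-likelihood of a fixation sequence under a Gaussian-emission HMM.

    fixations: (T, 2) fixation coordinates
    pi:        (K,)  initial state probabilities
    A:         (K, K) state transition matrix
    means:     (K, 2) emission means (one ROI center per hidden state)
    var:       scalar emission variance (isotropic, for simplicity)
    """
    T, K = len(fixations), len(pi)
    # Per-time, per-state emission log-probabilities
    logB = np.array([[gaussian_logpdf(fixations[t], means[k], var)
                      for k in range(K)] for t in range(T)])
    log_alpha = np.log(pi) + logB[0]
    for t in range(1, T):
        # Forward recursion in log space (manual log-sum-exp for stability)
        m = log_alpha.max()
        log_alpha = np.log(np.exp(log_alpha - m) @ A) + m + logB[t]
    m = log_alpha.max()
    return float(m + np.log(np.sum(np.exp(log_alpha - m))))
```

Classifying a scanpath as, say, explorative or focused then amounts to comparing its log-likelihood under the two representative models.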

Sensors ◽  
2021 ◽  
Vol 21 (22) ◽  
pp. 7569
Author(s):  
Hsing-Hao Lee ◽  
Zih-Ling Chen ◽  
Su-Ling Yeh ◽  
Janet Hui-Wen Hsiao ◽  
An-Yeu (Andy) Wu

Mind-wandering has been shown to strongly influence learning efficiency, especially in today's digital, distraction-rich environments. Detecting mind-wandering thus becomes imperative in educational scenarios. Here, we used a wearable eye tracker to record eye movements during the sustained attention to response task. Eye movement analysis with hidden Markov models (EMHMM), which takes both spatial and temporal eye-movement information into account, was used to examine whether participants' eye-movement patterns can differentiate between states of focused attention and mind-wandering. Clustering with EMHMM revealed two representative eye-movement patterns: centralized and distributed. Participants with the centralized pattern performed better at detecting targets and rated themselves as more focused than those with the distributed pattern. This study indicates that distinct eye-movement patterns are associated with different attentional states (focused attention vs. mind-wandering) and demonstrates a novel approach to studying attention with EMHMM. Moreover, it offers a potential way to capture the mind-wandering state in the classroom without interrupting ongoing learning behavior.
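The centralized-versus-distributed distinction above reflects how widely fixations spread over the display. As a rough proxy only, assuming a simple two-means split over per-participant fixation dispersion rather than the full EMHMM clustering (all function names here are illustrative), the grouping idea can be sketched as:

```python
import numpy as np

def fixation_dispersion(fixations):
    """Mean Euclidean distance of a participant's fixations from their centroid."""
    fix = np.asarray(fixations, dtype=float)
    return float(np.mean(np.linalg.norm(fix - fix.mean(axis=0), axis=1)))

def split_two_groups(dispersions, n_iter=50):
    """1-D two-means clustering over dispersion values.

    Returns a boolean mask marking the higher-dispersion ('distributed') group;
    the rest form the 'centralized' group.
    """
    d = np.asarray(dispersions, dtype=float)
    lo, hi = d.min(), d.max()          # initialize the two cluster centers
    for _ in range(n_iter):
        mask = np.abs(d - hi) < np.abs(d - lo)   # closer to the high center
        if mask.all() or (~mask).all():
            break                       # degenerate split; stop early
        lo, hi = d[~mask].mean(), d[mask].mean()
    return mask
```

The published method instead clusters whole HMMs (capturing transitions between regions of interest, not just spatial spread), so this sketch should be read as intuition, not a replacement.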


2019 ◽  
Vol 52 (3) ◽  
pp. 1026-1043 ◽  
Author(s):  
Tim Chuk ◽  
Antoni B. Chan ◽  
Shinsuke Shimojo ◽  
Janet H. Hsiao

2014 ◽  
Vol 14 (10) ◽  
pp. 1212-1212 ◽  
Author(s):  
T. Chuk ◽  
A. X. Luo ◽  
K. Crookes ◽  
W. G. Hayward ◽  
A. B. Chan ◽  
...  

2014 ◽  
Vol 7 (3) ◽  
Author(s):  
Andreas Bulling ◽  
Roman Bednarik

Latest developments in remote and head-mounted eye tracking and automated eye-movement analysis point the way toward unobtrusive eye-based human-computer interfaces that will become pervasively usable in everyday life. We call this new paradigm pervasive eye tracking: continuous eye monitoring and analysis, 24/7. Pervasive Eye Tracking and Mobile Eye-Based Interaction (PETMEI) is a workshop series that revolves around the theme of pervasive eye tracking as a trailblazer for pervasive eye-based human-computer interaction and eye-based context-awareness. This special issue is composed of extended versions of the top-scoring papers from the 3rd workshop in the PETMEI series, held in 2013.


2020 ◽  
Vol 52 (6) ◽  
pp. 2515-2534 ◽  
Author(s):  
Diederick C. Niehorster ◽  
Raimondas Zemblys ◽  
Tanya Beelders ◽  
Kenneth Holmqvist

The magnitude of variation in the gaze position signals recorded by an eye tracker, also known as its precision, is an important aspect of an eye tracker’s data quality. However, data quality of eye-tracking signals is still poorly understood. In this paper, we therefore investigate the following: (1) How do the various available measures characterizing eye-tracking data during fixation relate to each other? (2) How are they influenced by signal type? (3) What type of noise should be used to augment eye-tracking data when evaluating eye-movement analysis methods? To support our analysis, this paper presents new measures to characterize signal type and signal magnitude based on RMS-S2S and STD, two established measures of precision. Simulations are performed to investigate how each of these measures depends on the number of gaze position samples over which they are calculated, and to reveal how RMS-S2S and STD relate to each other and to measures characterizing the temporal spectrum composition of the recorded gaze position signal. Further empirical investigations were performed using gaze position data recorded with five eye trackers from human and artificial eyes. We found that although the examined eye trackers produce gaze position signals with different characteristics, the relations between precision measures derived from simulations are borne out by the data. We furthermore conclude that data with a range of signal type values should be used to assess the robustness of eye-movement analysis methods. We present a method for generating artificial eye-tracker noise of any signal type and magnitude.
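The two established precision measures this paper builds on have standard definitions: RMS-S2S is the root mean square of the Euclidean distances between consecutive gaze samples, and STD combines the standard deviations of the horizontal and vertical components. A minimal numpy sketch (function names are illustrative):

```python
import numpy as np

def rms_s2s(gaze):
    """RMS sample-to-sample precision: root mean square of the Euclidean
    distances between consecutive (x, y) gaze samples."""
    g = np.asarray(gaze, dtype=float)
    diffs = np.diff(g, axis=0)                       # (n-1, 2) sample-to-sample steps
    return float(np.sqrt(np.mean(np.sum(diffs ** 2, axis=1))))

def std_precision(gaze):
    """STD precision: combined standard deviation of the horizontal and
    vertical gaze components, sqrt(std_x^2 + std_y^2)."""
    g = np.asarray(gaze, dtype=float)
    return float(np.sqrt(np.sum(np.var(g, axis=0))))
```

The two measures respond differently to the temporal structure of the noise (fast, uncorrelated jitter inflates RMS-S2S relative to STD, while slow drift does the opposite), which is the intuition behind characterizing signal type by relating them.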

