Characterizing gaze position signals and synthesizing noise during fixations in eye-tracking data

2020 ◽  
Vol 52 (6) ◽  
pp. 2515-2534 ◽  
Author(s):  
Diederick C. Niehorster ◽  
Raimondas Zemblys ◽  
Tanya Beelders ◽  
Kenneth Holmqvist

Abstract: The magnitude of variation in the gaze position signals recorded by an eye tracker, also known as its precision, is an important aspect of an eye tracker’s data quality. However, data quality of eye-tracking signals is still poorly understood. In this paper, we therefore investigate the following: (1) How do the various available measures characterizing eye-tracking data during fixation relate to each other? (2) How are they influenced by signal type? (3) What type of noise should be used to augment eye-tracking data when evaluating eye-movement analysis methods? To support our analysis, this paper presents new measures to characterize signal type and signal magnitude based on RMS-S2S and STD, two established measures of precision. Simulations are performed to investigate how each of these measures depends on the number of gaze position samples over which they are calculated, and to reveal how RMS-S2S and STD relate to each other and to measures characterizing the temporal spectrum composition of the recorded gaze position signal. Further empirical investigations were performed using gaze position data recorded with five eye trackers from human and artificial eyes. We found that although the examined eye trackers produce gaze position signals with different characteristics, the relations between precision measures derived from simulations are borne out by the data. We furthermore conclude that data with a range of signal type values should be used to assess the robustness of eye-movement analysis methods. We present a method for generating artificial eye-tracker noise of any signal type and magnitude.
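The two precision measures named in the abstract have simple closed forms; a minimal NumPy sketch (an illustrative implementation, not the authors' code; gaze positions assumed in degrees) might look like:

```python
import numpy as np

def rms_s2s(x, y):
    """RMS-S2S: root mean square of sample-to-sample distances."""
    dx, dy = np.diff(x), np.diff(y)
    return np.sqrt(np.mean(dx**2 + dy**2))

def std_precision(x, y):
    """STD: dispersion of gaze samples around their centroid."""
    return np.sqrt(np.var(x) + np.var(y))

# Example: pure white-noise "fixation" data
rng = np.random.default_rng(0)
x = 0.1 * rng.standard_normal(5000)
y = 0.1 * rng.standard_normal(5000)
print(rms_s2s(x, y), std_precision(x, y))
```

For uncorrelated (white) noise the two measures are linked, RMS-S2S ≈ √2 · STD, so their ratio is one way to characterize the temporal structure, or "type", of the noise rather than only its magnitude.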

2014 ◽  
Vol 7 (3) ◽  
Author(s):  
Andreas Bulling ◽  
Roman Bednarik

Latest developments in remote and head-mounted eye tracking and automated eye movement analysis point the way toward unobtrusive eye-based human-computer interfaces that will become pervasively usable in everyday life. We call this new paradigm pervasive eye tracking – continuous eye monitoring and analysis 24/7. Pervasive Eye Tracking and Mobile Eye-Based Interaction (PETMEI) is a workshop series that revolves around the theme of pervasive eye tracking as a trailblazer for pervasive eye-based human-computer interaction and eye-based context-awareness. This special issue is composed of extended versions of the top-scoring papers from the 3rd workshop in the PETMEI series held in 2013.


1999 ◽  
Author(s):  
Theodore T. Blackmon ◽  
Yeuk F. Ho ◽  
Dimitri A. Chernyak ◽  
Michela Azzariti ◽  
Lawrence W. Stark

Author(s):  
Janet H. Hsiao ◽  
Hui Lan ◽  
Yueyuan Zheng ◽  
Antoni B. Chan

Abstract: The eye movement analysis with hidden Markov models (EMHMM) method provides quantitative measures of individual differences in eye-movement patterns. However, it is limited to tasks where stimuli have the same feature layout (e.g., faces). Here we proposed to combine EMHMM with the data mining technique co-clustering to discover participant groups with consistent eye-movement patterns across stimuli for tasks involving stimuli with different feature layouts. Through applying this method to eye movements in scene perception, we discovered explorative (switching between the foreground and background information or different regions of interest) and focused (mainly looking at the foreground with less switching) eye-movement patterns among Asian participants. Higher similarity to the explorative pattern predicted better foreground object recognition performance, whereas higher similarity to the focused pattern was associated with better feature integration in the flanker task. These results have important implications for using eye tracking as a window into individual differences in cognitive abilities and styles. Thus, EMHMM with co-clustering provides quantitative assessments of eye-movement patterns across stimuli and tasks. It can be applied to many other real-life visual tasks, making a significant impact on the use of eye tracking to study cognitive behavior across disciplines.
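At the core of any HMM-based scanpath analysis is the likelihood of a fixation sequence under a candidate model. The sketch below is a minimal discrete-HMM forward algorithm; the two-state matrices, ROI coding, and "focused" vs. "explorative" labels are toy assumptions for illustration, not the EMHMM toolbox itself:

```python
import numpy as np

def scanpath_loglik(obs, pi, A, B):
    """Scaled forward algorithm: log-likelihood of a fixation-ROI sequence.

    obs: sequence of fixation ROI indices; pi: initial state probabilities;
    A[i, j]: state transition probabilities; B[i, k]: P(ROI k | state i).
    """
    alpha = pi * B[:, obs[0]]
    c = alpha.sum()
    loglik = np.log(c)
    alpha /= c
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]   # propagate states, then emit
        c = alpha.sum()                 # rescale to avoid underflow
        loglik += np.log(c)
        alpha /= c
    return loglik

# Two toy models: "focused" (sticky states) vs. "explorative" (frequent switching)
pi = np.array([0.5, 0.5])
A_focused = np.array([[0.9, 0.1], [0.1, 0.9]])
A_explorative = np.array([[0.2, 0.8], [0.8, 0.2]])
B = np.array([[0.9, 0.1], [0.1, 0.9]])  # state 0 ~ foreground ROI, state 1 ~ background ROI
scan = [0, 0, 0, 1, 1, 1]               # long dwells: better explained by the focused model
print(scanpath_loglik(scan, pi, A_focused, B) > scanpath_loglik(scan, pi, A_explorative, B))
```

A scanpath can then be assigned to whichever group model yields the higher log-likelihood, which is the spirit of the pattern-similarity measures described in the abstract.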


Sensors ◽  
2021 ◽  
Vol 21 (22) ◽  
pp. 7569
Author(s):  
Hsing-Hao Lee ◽  
Zih-Ling Chen ◽  
Su-Ling Yeh ◽  
Janet Hui-Wen Hsiao ◽  
An-Yeu (Andy) Wu

Mind-wandering has been shown to strongly influence learning efficiency, especially in today's digital, distraction-filled era. Detecting mind-wandering thus becomes imperative in educational scenarios. Here, we used a wearable eye-tracker to record eye movements during the sustained attention to response task. Eye movement analysis with hidden Markov models (EMHMM), which takes both spatial and temporal eye-movement information into account, was used to examine whether participants’ eye movement patterns can differentiate between the states of focused attention and mind-wandering. Two representative eye movement patterns were discovered through clustering using EMHMM: centralized and distributed patterns. Results showed that participants with the centralized pattern had better performance on detecting targets and rated themselves as more focused than those with the distributed pattern. This study indicates that distinct eye movement patterns are associated with different attentional states (focused attention vs. mind-wandering) and demonstrates a novel approach in using EMHMM to study attention. Moreover, this study provides a potential approach to capture the mind-wandering state in the classroom without interrupting the ongoing learning behavior.


i-Perception ◽  
2021 ◽  
Vol 12 (2) ◽  
pp. 204166952199839
Author(s):  
Linda Krauze ◽  
Ilze Ceple ◽  
Jurgis Skilters ◽  
Mara Delesa–Velina ◽  
Baingio Pinna ◽  
...  

This study explores perceptual organisation and shape perception when viewing a tetragon and an additional element (a dot) that is located at varying positions and distances next to the tetragon. The aim of the study is to determine the factors that can alter the interpretation of object configuration and impact whether the presented tetragon is perceived as a diamond or a square. Methods used in this study are a forced-choice task as a subjective measurement and eye tracking as an objective measurement of perceptual processes. Overall, 31 stimuli were presented to the participants: a tetragon in two different sizes with an additional element (a dot) located inside or outside the object at three different positions and three distances. The results indicate significant changes in shape perception, depending on the location of the additional element. The results are complemented with eye movement analysis indicating that as the distance between the elements increases, there is a higher probability of either of the two shape interpretations and the gaze is less likely to be directed to the area between the stimuli. Furthermore, the subjective perception of shape is codetermined by the shape perception when the tetragon is presented without the additional element.


Healthcare ◽  
2020 ◽  
Vol 9 (1) ◽  
pp. 10
Author(s):  
Chong-Bin Tsai ◽  
Wei-Yu Hung ◽  
Wei-Yen Hsu

Optokinetic nystagmus (OKN) is an involuntary eye movement induced by motion of a large proportion of the visual field. It consists of a “slow phase (SP)” with eye movements in the same direction as the movement of the pattern and a “fast phase (FP)” with saccadic eye movements in the opposite direction. Study of OKN can reveal valuable information in ophthalmology, neurology, and psychology. However, current commercially available high-resolution, research-grade eye trackers are usually expensive. Methods & Results: We developed a novel, fast, and effective system combined with a low-cost eye-tracking device to accurately and quantitatively measure OKN eye movements. Conclusions: The experimental results indicate that the proposed method achieves fast and promising results in comparison with several traditional approaches.
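A common way to separate the two OKN phases in practice is a velocity threshold on the position trace. The sketch below is a minimal illustration of that idea (the 50 deg/s cutoff, 250 Hz sampling rate, and sawtooth signal are illustrative assumptions, not the authors' method):

```python
import numpy as np

def detect_fast_phases(position, fs, vel_threshold=50.0):
    """Flag fast-phase (saccadic) samples in a 1-D OKN position trace.

    position: eye position in degrees; fs: sampling rate in Hz;
    vel_threshold: velocity cutoff in deg/s separating SP from FP.
    Returns a boolean array, True where |velocity| exceeds the cutoff.
    """
    velocity = np.gradient(position) * fs   # deg/s via central differences
    return np.abs(velocity) > vel_threshold

# Synthetic OKN trace: slow 10 deg/s drift with fast periodic resets
fs = 250.0
t = np.arange(0, 2, 1 / fs)
position = (10.0 * t) % 2.0        # sawtooth: slow rise, abrupt reset
fp = detect_fast_phases(position, fs)
print(fp.mean())                   # fraction of samples flagged as fast phase
```

Slow-phase samples (10 deg/s) stay well below the cutoff, while the reset samples produce velocity spikes far above it, so the boolean mask isolates the fast phases.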


Intelligence ◽  
1984 ◽  
Vol 8 (3) ◽  
pp. 205-238 ◽  
Author(s):  
Charles E. Bethell-Fox ◽  
David F. Lohman ◽  
Richard E. Snow
