Eye Movements While Reading Degraded On-Product Warnings

Author(s):  
Nathan T. Dorris ◽  
R. Brian Valimont ◽  
Eric J. Boelhouwer

This investigation tested whether heavily degraded warnings affected gaze patterns and resulted in longer viewing times than lightly degraded warnings. Sixteen participants viewed six matched pairs of lightly and heavily degraded warnings. Eye movements were recorded with an eye-tracking system, and total time on task was collected for each warning. Fixation times were also recorded as participants viewed the various panels of each warning. In the second part of the experiment, legibility and participant comprehension of each warning were tested. Paired t-tests showed that total time on task, total fixation time, and message-panel fixation time all differed significantly for the same three of the six pairs of warnings, with each of these times increasing when participants viewed the heavily degraded warning label. Additionally, participants were able to comprehend all warnings presented. The study also provides evidence that eye tracking can be a useful tool in warnings research.
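The core comparison described above is a within-subjects contrast of viewing-time measures between matched label versions. As a rough illustration of that kind of analysis (not the study's data or code), a paired t-test on per-participant fixation times might look like the sketch below; all values and variable names are invented for the example.

```python
# Illustrative paired comparison of total fixation time (seconds) on one
# matched pair of warning labels; the numbers are made up, not the study's data.
from scipy import stats

light = [4.1, 3.8, 5.2, 4.6, 3.9, 4.4, 5.0, 4.2]   # lightly degraded label
heavy = [5.0, 4.7, 6.1, 5.5, 4.3, 5.6, 5.9, 5.1]   # heavily degraded label

t_stat, p_value = stats.ttest_rel(heavy, light)
print(f"paired t = {t_stat:.2f}, p = {p_value:.4f}")
```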

Author(s):  
Davin Pavlas ◽  
Heather Lum ◽  
Eduardo Salas

Eye tracking, previously the purview of well-funded laboratories, is now available to any individual who wishes to study gaze patterns. Advances in eye-tracking technology have made it possible for those with meager budgets but an abundance of motivation to engage in studies that examine participants’ eye movements and fixations. This article presents a how-to guide for creating low-cost eye-tracking solutions and includes discussion of optical hardware, tracking software, and data analysis programs. The wider availability of eye-tracking technology ensures that the broader scientific community has access to techniques that can inform design and enhance research.
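As a concrete, hedged illustration of the kind of low-cost setup such a guide covers, the sketch below locates a dark pupil blob in a single webcam frame using OpenCV. It is only a starting point under assumed conditions (a close-up eye image, even lighting, a fixed intensity threshold), not a complete tracker and not the authors' software.

```python
# Minimal webcam pupil-locator sketch; the threshold and camera index are
# assumptions and would need tuning for any real setup.
import cv2

cap = cv2.VideoCapture(0)                      # default webcam
ok, frame = cap.read()
cap.release()
if ok:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (7, 7), 0)
    # The pupil is usually the darkest region of a close-up eye image.
    _, dark = cv2.threshold(blurred, 40, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(dark, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        pupil = max(contours, key=cv2.contourArea)
        m = cv2.moments(pupil)
        if m["m00"] > 0:
            cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
            print(f"pupil centroid at ({cx:.0f}, {cy:.0f}) px")
```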


2021 ◽  
pp. 39-60
Author(s):  
Sylwester Białowąs ◽  
Adrianna Szyszka

Eye movements provide information on subconscious reactions to stimuli and reflect attention and focus. With regard to visual activity, four types of eye movements can be distinguished: fixations, saccades, smooth pursuits and blinks. Fixation-based metrics, such as the number and distribution of fixations, total fixation time and average fixation duration, are among the most common measures. The method also allows the determination of scanpaths that trace the gaze across the image, as well as heat maps and focus maps, which visually represent points of gaze concentration. A key concept in eye tracking that allows for more in-depth analysis is the area of interest (AOI): measures can then be computed for selected parts of the visual stimulus. The area of gaze outside the scope of analysis is called white space. The software allows comparisons of static and non-static stimuli and provides a choice of template, dataset, metrics and data format. In conducting eye-tracking research, proper calibration is crucial, meaning that the participant's gaze must be aligned with the internal model of the eye-tracking software. Attention should also be paid to time and spatial control: the exposure time should be identical for each participant, and the testing space should be well lit and at a comfortable temperature.
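To make the AOI idea concrete, the short sketch below computes three of the metrics mentioned above (fixation count, total fixation time, average fixation duration) inside one rectangular area of interest. The data structure, field names and values are invented for illustration and are not tied to any particular eye-tracking software.

```python
# Illustrative AOI metrics from a list of detected fixations; the data
# and field names are assumptions made for this example.
fixations = [
    {"x": 310, "y": 220, "duration_ms": 240},
    {"x": 640, "y": 410, "duration_ms": 180},
    {"x": 355, "y": 250, "duration_ms": 300},
]
aoi = {"x_min": 280, "x_max": 420, "y_min": 200, "y_max": 320}

in_aoi = [f for f in fixations
          if aoi["x_min"] <= f["x"] <= aoi["x_max"]
          and aoi["y_min"] <= f["y"] <= aoi["y_max"]]

count = len(in_aoi)
total_ms = sum(f["duration_ms"] for f in in_aoi)
mean_ms = total_ms / count if count else 0.0
print(f"AOI fixations: {count}, total: {total_ms} ms, mean: {mean_ms:.0f} ms")
```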


Vision ◽  
2021 ◽  
Vol 5 (3) ◽  
pp. 39
Author(s):  
Julie Royo ◽  
Fabrice Arcizet ◽  
Patrick Cavanagh ◽  
Pierre Pouget

We introduce a blind spot method for creating image changes contingent on eye movements. One challenge of eye movement research is triggering display changes contingent on gaze: the eye-tracking system must capture an image of the eye, detect and track the pupil and corneal reflections to estimate gaze position, and then transfer these data to the computer that updates the display. All of these steps introduce delays that are often difficult to predict. To avoid these issues, we describe a simple blind spot method for generating gaze-contingent display manipulations without any eye-tracking system or display controls.
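Purely as an illustration of the geometry such a blind-spot approach depends on (the abstract does not give the authors' parameters), the sketch below places a stimulus at a textbook approximation of the blind spot's location, roughly 15 degrees temporal to fixation and about 1.5 degrees below the horizontal meridian. The viewing distance, screen resolution and eye used are all assumptions.

```python
# Hypothetical placement of a display change at an approximate blind-spot
# location; all constants are textbook approximations, not the paper's values.
import math

def deg_to_px(deg, viewing_distance_cm=57.0, pixels_per_cm=38.0):
    """Convert a visual angle to on-screen pixels for an assumed setup."""
    return math.tan(math.radians(deg)) * viewing_distance_cm * pixels_per_cm

def blind_spot_position(fix_x, fix_y, eye="right"):
    dx = deg_to_px(15.0)   # temporal offset from fixation
    dy = deg_to_px(1.5)    # below the horizontal meridian (y grows downward)
    return (fix_x + dx, fix_y + dy) if eye == "right" else (fix_x - dx, fix_y + dy)

print(blind_spot_position(960, 540))   # screen location where a change could be hidden
```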


2019 ◽  
Vol 90 (1) ◽  
pp. 109-117
Author(s):  
Moritz Försch ◽  
Lena Krull ◽  
Marlene Hechtner ◽  
Roman Rahimi ◽  
Susanne Wriedt ◽  
...  

ABSTRACT Objective To evaluate the perception of esthetic orthodontic appliances by means of eye-tracking measurements and a survey. Materials and Methods En face and close-up images showing different orthodontic appliances (aligner appliance [a], aligner appliance and attachments [b], lingual appliance [c], ceramic brackets [d], no appliance [e; control]) were shown to 140 participants. Eye movements and gaze direction were recorded with an eye-tracking system. For different anatomical areas and areas of the appliances, time to first fixation and total fixation time were recorded. Participants also answered questions about their subjective perception on a visual analog scale. Results For all groups, the anatomical landmarks were inspected in the following order: (1) eyes, (2) mouth, (3) nose, (4) hair, and (5) ears. Only in group d was the first fixation on the mouth region (1.10 ± 1.05 seconds). All appliances except the lingual appliance (1.87 ± 1.31 seconds) resulted in longer fixation on the mouth area (a, 2.97 ± 1.32 seconds; b, 3.35 ± 1.38 seconds; d, 3.29 ± 1.36 seconds). For close-up pictures, the fastest (0.58 seconds) and longest (3.14 seconds) fixation was found for group d, followed by group b (1.02 seconds/2.3 seconds), group a (2.57 seconds/0.83 seconds), and group c (3.28 seconds/0.05 seconds). Visual analog scale scoring of the visibility questions was consistent with the eye-tracking measurements: with increasing visibility, perceived esthetic impairment was higher. Conclusions Lingual orthodontic appliances do not change how the face is perceived. Other esthetic orthodontic appliances may change the pattern of facial inspection and differ in subjective perception.


Sensors ◽  
2020 ◽  
Vol 20 (17) ◽  
pp. 4956
Author(s):  
Jose Llanes-Jurado ◽  
Javier Marín-Morales ◽  
Jaime Guixeres ◽  
Mariano Alcañiz

Fixation identification is an essential task in the extraction of relevant information from gaze patterns, and various algorithms are used in the identification process. However, the thresholds used in these algorithms greatly affect their sensitivity. Moreover, the application of these algorithms to eye-tracking technologies integrated into head-mounted displays, where the subject's head position is unrestricted, is still an open issue. Therefore, the adaptation of eye-tracking algorithms and their thresholds to immersive virtual reality frameworks needs to be validated. This study presents the development of a dispersion-threshold identification algorithm applied to data obtained from an eye-tracking system integrated into a head-mounted display. Rule-based criteria are proposed to calibrate the algorithm's thresholds using features such as the number of fixations and the percentage of points that belong to a fixation. The results show that distance-dispersion thresholds between 1 and 1.6° and time windows between 0.25 and 0.4 s are acceptable parameter ranges, with 1° and 0.25 s being the optimum. The work presents a calibrated algorithm to be applied in future experiments with eye tracking integrated into head-mounted displays, as well as guidelines for calibrating fixation identification algorithms.
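For readers unfamiliar with dispersion-threshold identification, the sketch below is a generic I-DT pass using the parameter values reported above (1° maximum dispersion, 0.25 s minimum duration). It is a standard textbook formulation, not the authors' calibrated implementation, and it assumes gaze samples already expressed as (time in seconds, x in degrees, y in degrees).

```python
# Generic dispersion-threshold (I-DT) fixation identification sketch;
# samples are assumed to be (t_seconds, x_deg, y_deg) tuples.
def dispersion(window):
    xs = [s[1] for s in window]
    ys = [s[2] for s in window]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))

def idt_fixations(samples, max_dispersion=1.0, min_duration=0.25):
    """Return (start_t, end_t, centroid_x, centroid_y) for each fixation."""
    fixations = []
    i = 0
    while i < len(samples):
        # Smallest window starting at i that spans min_duration.
        j = i
        while j < len(samples) and samples[j][0] - samples[i][0] < min_duration:
            j += 1
        if j >= len(samples):
            break
        if dispersion(samples[i:j + 1]) <= max_dispersion:
            # Grow the window while dispersion stays below the threshold.
            while j + 1 < len(samples) and dispersion(samples[i:j + 2]) <= max_dispersion:
                j += 1
            window = samples[i:j + 1]
            cx = sum(s[1] for s in window) / len(window)
            cy = sum(s[2] for s in window) / len(window)
            fixations.append((window[0][0], window[-1][0], cx, cy))
            i = j + 1
        else:
            i += 1
    return fixations

# Synthetic usage example: 0.6 s of nearly stationary gaze yields one fixation.
samples = [(t * 0.01, 10.0 + 0.1 * (t % 3), 5.0) for t in range(60)]
print(idt_fixations(samples))
```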


2020 ◽  
Vol 12 (8) ◽  
Author(s):  
Soon Young Park ◽  
Catarina Espanca Bacelar ◽  
Kenneth Holmqvist

Eye movements of a species reflect the visual behavior strategy it has adapted to during its evolution. What are the eye movements of domestic dogs (Canis lupus familiaris) like? Dog eye movements per se have not been investigated, despite the increasing number of visuo-cognitive studies in dogs using eye-tracking systems. To fill this gap, we recorded dog eye movements using a video-based eye-tracking system and compared the dog data with those of humans. We found that dog saccades follow the systematic relationships between saccade metrics previously shown in humans and other animal species. However, the details of these relationships, and the magnitudes of the individual saccade and fixation metrics, differed from those of humans. Overall, dog saccades were slower and fixations were longer than those of humans. We hope our findings contribute to existing comparative analyses of eye movements across animal species, and also to the improvement of algorithms used for classifying dog eye movement data.


Author(s):  
Nicholas Moellhoff ◽  
Chiara Kandelhardt ◽  
Denis Ehrl ◽  
Lukas Kohler ◽  
Konstantin Koban ◽  
...  

Abstract Background The objective assessment of beauty is challenging and the subject of ongoing research efforts. Recently, a new means of objectively determining the aesthetic appeal of body features has been investigated by analyzing gaze patterns and eye movements. Objectives The objective of this study was to use objective eye-tracking technology to assess differences in observers' gaze patterns when presented with standardized 3-dimensional images showing different degrees of breast asymmetry. Methods A total of 83 Caucasian study participants with a mean age of 38.60 (19.8) years were presented with 5 images depicting varying degrees of breast symmetry. In addition to the assessment of eye movements, participants were asked to rate the aesthetic appeal and the asymmetry of the breasts on a 5-point Likert scale. Results Overall, the data show that participants' ratings of the breasts' aesthetic appeal were inversely related to the level of asymmetry. Time until fixation was shortest for the image depicting the greatest breast asymmetry (50 cc), at 0.77 (0.7) seconds, p < 0.001. In addition, the mammary region was viewed longest in this image, at 3.76 (0.5) seconds, p < 0.001. A volume difference of 35 cc between breasts deflected the observers' gaze significantly toward the larger of the asymmetrical breasts, p < 0.001. Conclusions Surgeons should aim for symmetrical breast volumes (ie, differences < 35 cc between breasts) to avoid noticeable asymmetry with regard to breast size.


2020 ◽  
Vol 25 (5) ◽  
pp. 270-275
Author(s):  
Ali S. Tejani ◽  
Bert B. Vargas ◽  
Emily F. Middleton ◽  
Mu Huang

Though studies describe postconcussive changes in eye movements, there is a need for data describing baseline eye movements. The purpose of this study was to describe baseline eye movements and visual contrast acuity using the King-Devick (KD) Eye Tracking System and KD Visual Contrast Sensitivity Chart. Fewer total saccades were noted in soccer players than basketball players (soccer, 56.9 ± 14.3; basketball, 101.1 ± 41.3; p = .0005). No significant differences were noted for the number of saccades between sexes (males, 60.4 ± 20.3; females, 84.9 ± 41.8; p = .100) or in contrast acuity between all groups (p > .05). These results suggest the presence of sport-specific trends that may invalidate the comparison of postconcussion evaluations to generic baseline athlete eye movements.


2019 ◽  
Vol 63 (6) ◽  
pp. 60403-1-60403-6
Author(s):  
Midori Tanaka ◽  
Matteo Paolo Lanaro ◽  
Takahiko Horiuchi ◽  
Alessandro Rizzi

Abstract The Random Spray Retinex (RSR) algorithm was developed from the mathematical description of Milano-Retinex, substituting random paths with random sprays. This article proposes two variants of RSR that mimic characteristics of the human visual system (HVS) by adding a region-of-interest (ROI) mechanism. In the first proposed model, a cone distribution based on anatomical data is used as the ROI. In the second model, the ROI is based on how visual resolution varies across the visual field, drawing on knowledge of visual information processing. We measured actual eye movements using an eye-tracking system and used the eye-tracking data to simulate the HVS on test images. Results show a qualitatively interesting computation of appearance in the processed area around real gaze points.
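As a loose, hypothetical sketch of the ROI weighting idea behind these variants (not the authors' RSR implementation), the snippet below builds a weight map that peaks at a measured gaze point and falls off with eccentricity, roughly mimicking how retinal resolution declines away from fixation. The exponential falloff and its constant are assumptions chosen only for illustration.

```python
# Hypothetical eccentricity-based ROI weight map around a gaze point;
# the falloff model and constant are illustrative assumptions.
import numpy as np

def eccentricity_weights(height, width, gaze_xy, falloff_px=120.0):
    """Weight map that peaks at the gaze point and decays with distance."""
    ys, xs = np.mgrid[0:height, 0:width]
    dist = np.hypot(xs - gaze_xy[0], ys - gaze_xy[1])   # pixel eccentricity
    return np.exp(-dist / falloff_px)

weights = eccentricity_weights(480, 640, gaze_xy=(320, 240))
print(weights.shape, float(weights[240, 320]))   # (480, 640) and 1.0 at the gaze point
```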


2006 ◽  
Vol 18 (06) ◽  
pp. 319-327 ◽  
Author(s):  
MU-CHUN SU ◽  
KUO-CHUNG WANG ◽  
GWO-DONG CHEN

The objective of this paper is to present a set of techniques integrated into a low-cost eye-tracking system. Eye-tracking systems have many potential applications, such as monitoring learners' emotions and detecting driver fatigue. In this paper, we report how we used an eye-tracking system to implement an "eye mouse" that provides computer access for people with severe disabilities. The proposed eye mouse allows people with severe disabilities to use their eye movements to manipulate computers. It requires only one low-cost Web camera and a personal computer. A five-stage algorithm is developed to estimate the direction of eye movements, and this direction information is then used to manipulate the computer. Several experiments were conducted to test the performance of the eye-tracking system.
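The abstract does not spell out the five stages, so the sketch below is only a hypothetical illustration of the final step such a system needs: mapping a detected pupil position, relative to the eye-region bounding box, to a coarse movement direction that could drive a cursor. The dead-zone value and coordinate conventions are assumptions.

```python
# Hypothetical mapping from pupil position to a coarse gaze direction for
# an "eye mouse"; thresholds and coordinate conventions are assumptions.
def eye_direction(pupil_x, pupil_y, eye_box, dead_zone=0.15):
    """Return 'left', 'right', 'up', 'down' or 'center' (y grows downward)."""
    x0, y0, width, height = eye_box                 # eye-region bounding box
    dx = (pupil_x - (x0 + width / 2)) / width       # normalized horizontal offset
    dy = (pupil_y - (y0 + height / 2)) / height     # normalized vertical offset
    if abs(dx) < dead_zone and abs(dy) < dead_zone:
        return "center"
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"

print(eye_direction(85, 40, (40, 20, 60, 40)))      # -> right
```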

