A Quality-Centered Analysis of Eye Tracking Data in Foveated Rendering

2017 ◽  
Vol 10 (5) ◽  
Author(s):  
Thorsten Roth ◽  
Martin Weier ◽  
André Hinkenjann ◽  
Yongmin Li ◽  
Philipp Slusallek

This work presents the analysis of data recorded by an eye tracking device in the course of evaluating a foveated rendering approach for head-mounted displays (HMDs). Foveated rendering methods adapt the image synthesis process to the user’s gaze, exploiting the limitations of the human visual system to increase rendering performance. Foveated rendering has particularly great potential when certain requirements have to be fulfilled, such as low-latency rendering to cope with high display refresh rates. This is crucial for virtual reality (VR), as a high level of immersion, which can only be achieved with high rendering performance and which also helps to reduce nausea, is an important factor in this field. We put things in context by first providing basic information about our rendering system, followed by a description of the user study and the collected data. This data stems from fixation tasks that subjects had to perform while being shown fly-through sequences of virtual scenes on an HMD. These fixation tasks consisted of a combination of various scenes and fixation modes. Besides static fixation targets, moving targets on randomized paths as well as a free focus mode were tested. Using this data, we estimate the precision of the utilized eye tracker and analyze the participants’ accuracy in focusing on the displayed fixation targets. Here, we also take a look at eccentricity-dependent quality ratings. Comparing this information with the users’ quality ratings given for the displayed sequences then reveals an interesting connection between fixation modes, fixation accuracy and quality ratings.
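
As a purely illustrative sketch (not the authors' analysis code), the kind of per-sample fixation error analysed here can be computed as the angle between each recorded gaze direction and the direction towards the fixation target; the vectors and values below are invented for the example.

    # Hypothetical sketch: angular offset (deg) between gaze samples and a fixation target.
    import numpy as np

    def angle_deg(a, b):
        """Angle in degrees between two 3-D direction vectors."""
        a = np.asarray(a, float); b = np.asarray(b, float)
        cosang = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
        return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

    # One direction vector per eye-tracker sample (head space), plus the target direction.
    gaze_dirs = [[0.01, 0.00, 1.0], [0.02, -0.01, 1.0], [0.00, 0.01, 1.0]]
    target_dir = [0.0, 0.0, 1.0]

    offsets = np.array([angle_deg(g, target_dir) for g in gaze_dirs])
    print("fixation accuracy (mean offset): %.2f deg" % offsets.mean())
    print("tracker precision (sample std):  %.2f deg" % offsets.std())

The mean of these offsets characterizes how accurately a participant fixated the target, while their spread gives one estimate of tracker precision.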

Sensors ◽  
2021 ◽  
Vol 21 (6) ◽  
pp. 2234
Author(s):  
Sebastian Kapp ◽  
Michael Barz ◽  
Sergey Mukhametov ◽  
Daniel Sonntag ◽  
Jochen Kuhn

Currently, an increasing number of head-mounted displays (HMDs) for virtual and augmented reality (VR/AR) are equipped with integrated eye trackers. Use cases of these integrated eye trackers include rendering optimization and gaze-based user interaction. In addition, visual attention in VR and AR is interesting for applied research based on eye tracking, for example in the cognitive or educational sciences. While some research toolkits for VR already exist, only a few target AR scenarios. In this work, we present an open-source eye tracking toolkit for reliable gaze data acquisition in AR based on Unity 3D and the Microsoft HoloLens 2, as well as an R package for seamless data analysis. Furthermore, we evaluate the spatial accuracy and precision of the integrated eye tracker for fixation targets with different distances and angles to the user (n=21). On average, we found that gaze estimates are reported with an angular accuracy of 0.83 degrees and a precision of 0.27 degrees while the user is resting, which is on par with state-of-the-art mobile eye trackers.
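
For illustration only (the toolkit's own R package may use different definitions): one common convention reports accuracy as the mean angular offset from the target and precision as the root-mean-square of sample-to-sample angular distances. The gaze vectors below are invented.

    # Illustrative accuracy/precision computation over gaze direction vectors.
    import numpy as np

    def ang(a, b):
        a, b = np.asarray(a, float), np.asarray(b, float)
        c = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
        return np.degrees(np.arccos(np.clip(c, -1.0, 1.0)))

    gaze = np.array([[0.010, 0.002, 1.0], [0.012, 0.001, 1.0], [0.009, 0.003, 1.0]])
    target = np.array([0.0, 0.0, 1.0])

    accuracy = np.mean([ang(g, target) for g in gaze])            # mean offset from target
    inter = [ang(gaze[i], gaze[i + 1]) for i in range(len(gaze) - 1)]
    precision_rms = np.sqrt(np.mean(np.square(inter)))            # RMS sample-to-sample spread
    print("accuracy: %.2f deg, precision (RMS-S2S): %.2f deg" % (accuracy, precision_rms))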


Healthcare ◽  
2020 ◽  
Vol 9 (1) ◽  
pp. 10
Author(s):  
Chong-Bin Tsai ◽  
Wei-Yu Hung ◽  
Wei-Yen Hsu

Optokinetic nystagmus (OKN) is an involuntary eye movement induced by motion of a large proportion of the visual field. It consists of a “slow phase (SP)” with eye movements in the same direction as the movement of the pattern and a “fast phase (FP)” with saccadic eye movements in the opposite direction. Study of OKN can reveal valuable information in ophthalmology, neurology and psychology. However, currently available commercial high-resolution, research-grade eye trackers are usually expensive. Methods & Results: We developed a novel fast and effective system, combined with a low-cost eye tracking device, to accurately and quantitatively measure OKN eye movement. Conclusions: The experimental results indicate that the proposed method achieves fast and promising results in comparison with several traditional approaches.
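
A minimal sketch of one common way to separate the two OKN phases, assuming a calibrated horizontal eye-position trace in degrees; the velocity threshold and toy data below are assumptions, not the paper's method.

    # Minimal sketch: label OKN slow phases (SP) and fast phases (FP) by velocity threshold.
    import numpy as np

    def label_okn_phases(position_deg, fs_hz, fast_thresh_deg_s=50.0):
        """Return an 'SP'/'FP' label per sample from the instantaneous velocity."""
        velocity = np.gradient(position_deg) * fs_hz              # deg/s
        return np.where(np.abs(velocity) > fast_thresh_deg_s, "FP", "SP")

    # Toy trace: slow drift (~5 deg/s) interrupted by one rapid return saccade (~200 deg/s).
    fs = 250.0
    slow = np.linspace(0, 4, 200)
    fast = np.linspace(4, 0, 5)
    trace = np.concatenate([slow, fast, slow])

    labels = label_okn_phases(trace, fs)
    print("fast-phase samples:", int((labels == "FP").sum()), "of", len(labels))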


2018 ◽  
Vol 2018 ◽  
pp. 1-11 ◽  
Author(s):  
Yea Som Lee ◽  
Bong-Soo Sohn

3D maps such as Google Earth and Apple Maps (3D mode), in which users can see and navigate in 3D models of real worlds, are widely available in current mobile and desktop environments. Users usually use a monitor for display and a keyboard/mouse for interaction. Head-mounted displays (HMDs) are currently attracting great attention from industry and consumers because they can provide an immersive virtual reality (VR) experience at an affordable cost. However, conventional keyboard and mouse interfaces decrease the level of immersion because the manipulation method does not resemble actual actions in reality, which often makes the traditional interface method inappropriate for the navigation of 3D maps in virtual environments. Motivated by this, we design immersive gesture interfaces for the navigation of 3D maps which are suitable for HMD-based virtual environments. We also describe a simple algorithm to capture and recognize the gestures in real-time using a Kinect depth camera. We evaluated the usability of the proposed gesture interfaces and compared them with conventional keyboard and mouse-based interfaces. Results of the user study indicate that our gesture interfaces are preferable for obtaining a high level of immersion and fun in HMD-based virtual environments.
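
As a hypothetical illustration of skeleton-based gesture recognition with a depth camera (the paper's actual gesture set, joints, and thresholds are not reproduced here), a simple rule-based classifier can map the hand pose relative to the shoulder to a navigation command:

    # Illustrative rule-based gesture classifier over depth-camera skeleton joints.
    from dataclasses import dataclass

    @dataclass
    class Joint:
        x: float   # meters, camera space
        y: float
        z: float   # distance from the sensor

    def classify_gesture(hand: Joint, shoulder: Joint,
                         forward_m: float = 0.45, raise_m: float = 0.25) -> str:
        """Map the hand position relative to the shoulder to a navigation command."""
        if shoulder.z - hand.z > forward_m:     # hand pushed towards the sensor
            return "move_forward"
        if hand.y - shoulder.y > raise_m:       # hand raised above the shoulder
            return "zoom_in"
        return "idle"

    print(classify_gesture(Joint(0.30, 0.10, 1.60), Joint(0.20, 0.00, 2.10)))  # move_forward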


Author(s):  
J. Mirijovsky ◽  
S. Popelka

The main aim of the presented paper is to find the most realistic and preferred color settings for different types of surfaces in aerial images. This is achieved through a user study with the use of eye-movement recording. Aerial images taken by an unmanned aerial system were used as stimuli. From each image, a squared crop area containing one of the studied types of surfaces (asphalt, concrete, water, soil, and grass) was selected. For each type of surface, the real reflectance value was found with the use of the precise spectroradiometer ASD HandHeld 2. The device was used at the same time as the aerial images were captured, so lighting conditions and the state of vegetation were equal. The spectral resolution of the ASD device is better than 3.0 nm. To define the RGB values of a selected type of surface, the spectral reflectance values recorded by the device were merged into wider groups, yielding three groups corresponding to the RGB color system. The captured images were edited with the graphic editor Photoshop CS6; contrast, clarity, and brightness were edited for all surface types. The result was a set of 12 images of the same area with different color settings. These images were arranged in a grid and used as stimuli for the eye-tracking experiment. Eye-tracking is one of the methods of usability studies and is considered relatively objective. The eye-tracker SMI RED 250 with a sampling frequency of 250 Hz was used in the study. The respondents were a group of 24 students of Geoinformatics and Geography. Their task was to select which image in the grid had the best color settings; the next task was to select which color settings they preferred. The respondents’ answers were evaluated and the most realistic and most preferable color settings were found. An advantage of the eye-tracking evaluation was that the process of selecting the answers could also be analyzed. Areas of Interest were marked around each image in the grid and the sequences of gaze movements were analyzed. A sequence chart was used for visualization and eye-tracking metrics were statistically tested. The presented paper shows the differences in the perception and preferences of aerial images with different color settings.
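
A rough sketch of the band-merging step described above, with assumed wavelength ranges for the R, G, and B groups and a dummy spectrum; the study's exact band limits are not given here.

    # Sketch: merge spectral reflectance into three groups approximating R, G and B.
    import numpy as np

    wavelengths = np.arange(400, 701, 3)                               # nm, ~3 nm resolution
    reflectance = np.clip(0.2 + 0.001 * (wavelengths - 400), 0, 1)     # dummy spectrum

    bands = {"B": (450, 495), "G": (495, 570), "R": (620, 700)}        # assumed band limits
    rgb = {}
    for name, (lo, hi) in bands.items():
        mask = (wavelengths >= lo) & (wavelengths < hi)
        rgb[name] = int(round(255 * reflectance[mask].mean()))         # mean reflectance -> 0..255

    print(rgb["R"], rgb["G"], rgb["B"])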


2020 ◽  
Vol 10 (9) ◽  
pp. 2062-2073
Author(s):  
M.V. Krasnova ◽  
K.A. Nefedova ◽  

In this article, we argue that marketing and advertising professionals have been aware of the limitations of traditional market research methods for decades, but only in recent years has science allowed the development of a more effective mechanism by which consumers’ thoughts can be deciphered: neuromarketing. Building an effective and successful communication policy through a variety of technologies and promotion methods is one of the primary tasks for a company’s effective functioning and development in the market, as well as for increasing its competitiveness. To this end, in parallel with measurements of electroencephalography, magnetic resonance imaging, brain scanners and galvanic skin response, an eye-tracking device is used, which allows one to accurately identify a stimulus that provides information about the consumer’s response to various commercial messages. In eye-tracking technology, the respondent is shown a visual stimulus (static or dynamic), while a special device records the trajectory and metrics of pupil movement. The respondent’s pupil is illuminated with infrared rays, while the trajectory of its movement is continuously recorded by several high-precision infrared cameras. The coordinates of pupil movement are recorded in a database; subsequently the data are analyzed and qualitative and quantitative reports are drawn up. This tool is used to analyze and understand people’s reactions to products and promotions, which makes it possible to increase the effectiveness of product promotion. The purpose of this article is to show the role that eye-tracking plays in the correct understanding of consumer needs, words and emotions. The main advantage of eye-tracking is the impartiality of the tested respondents, since the equipment used in this technology records the natural reactions of a person (by studying the movement and reaction of the pupil), which cannot be imitated. The main tool for applying the technology is the eye-tracker device, which recognizes and records pupil positions and eye movements.
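
As a generic illustration of how logged gaze coordinates can be turned into a quantitative report (not the specific software described above), the recorded samples can be aggregated into a coarse attention grid over the stimulus; the grid size and data are invented.

    # Sketch: count gaze samples per grid cell over a stimulus of given pixel size.
    import numpy as np

    def attention_grid(gaze_xy, width, height, cells=(4, 4)):
        """Return a (rows, cols) array of gaze-sample counts per stimulus region."""
        grid = np.zeros(cells, dtype=int)
        for x, y in gaze_xy:
            col = min(int(x / width * cells[1]), cells[1] - 1)
            row = min(int(y / height * cells[0]), cells[0] - 1)
            grid[row, col] += 1
        return grid

    samples = [(120, 80), (130, 90), (900, 500), (910, 510), (905, 495)]
    print(attention_grid(samples, width=1920, height=1080))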


2020 ◽  
Vol 9 (1) ◽  
pp. 39-45
Author(s):  
Katarzyna Kujawa ◽  
Grzegorz Żurek ◽  
Agata Gorączko ◽  
Roman Olejniczak ◽  
...  

Patients who do not communicate verbally or do not speak in an understandable way pose a serious problem for providing appropriate care, because their needs cannot be understood. Therefore, it is important that nursing staff have the knowledge and skills of alternative and assistive communication to communicate with patients with speech disorders. The purpose of this article is to present the current state of knowledge of alternative and augmentative communication, with special consideration of the signs used in Poland and a particular emphasis on a relevant description of the eye tracking device. The literature has been reviewed, covering the following related topics: alternative and augmentative communication, examples of signs used in alternative communication in Poland, and communication and eye tracking. Not everyone has the ability to communicate verbally with the environment. The solution to this problem is alternative and augmentative communication, which uses signs and devices to enable the patient to communicate with other people. (JNNN 2020;9(1):39–45) Key Words: alternative communication, augmentative communication, AAC system, speech disorders, eye track, C-Eye


2021 ◽  
Vol 4 ◽  
pp. 1-6
Author(s):  
Martin Knura ◽  
Jochen Schiewe

Abstract. With the beginning of the COVID-19 pandemic, the execution of eye-tracking user studies in indoor environments was no longer possible, and remote and contactless substitutes are needed. With this paper, we want to introduce an alternative method to eye tracking that is completely feasible under COVID-19 restrictions. Our main technique is think-aloud interviews, where participants constantly verbalize their thoughts as they move through a test. We record the screen and the mouse movements during the interviews, and analyse both the statements and the mouse positions afterwards. With this information, we can encode the approximate map position of the user’s attention for each second of the interview. This allows us to use the same visual methods as for eye-tracking studies, like attention maps or trajectory maps. We implement our method by conducting a user study with 21 participants to identify user behaviour while solving high-level interpretation tasks, and with the results of this study, we can show that our new method provides a useful substitute for eye-tracking user studies.
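
A small sketch of the per-second encoding step described above, under an assumed (timestamp, x, y) mouse-log layout that is not specified in the abstract.

    # Sketch: reduce a mouse-position log to one approximate attention coordinate per second.
    from collections import defaultdict

    def attention_per_second(mouse_log):
        """mouse_log: iterable of (timestamp_s, x, y); returns {second: (mean_x, mean_y)}."""
        buckets = defaultdict(list)
        for t, x, y in mouse_log:
            buckets[int(t)].append((x, y))
        return {sec: (sum(p[0] for p in pts) / len(pts),
                      sum(p[1] for p in pts) / len(pts))
                for sec, pts in sorted(buckets.items())}

    log = [(0.2, 100, 200), (0.8, 110, 210), (1.5, 400, 300), (2.1, 420, 310)]
    print(attention_per_second(log))   # {0: (105.0, 205.0), 1: (400.0, 300.0), 2: (420.0, 310.0)}

The resulting per-second coordinates can then feed the same attention maps or trajectory maps used in eye-tracking studies.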


2017 ◽  
Vol 2017 (2) ◽  
pp. 23-37 ◽  
Author(s):  
Yousra Javed ◽  
Mohamed Shehab

Abstract. Habituation is a key factor behind the lack of attention towards permission authorization dialogs during third party application installation. Various solutions have been proposed to combat the problem of achieving attention switch towards permissions. However, users continue to ignore these dialogs, and authorize dangerous permissions, which leads to security and privacy breaches. We leverage eye-tracking to approach this problem, and propose a mechanism for enforcing user attention towards application permissions before users are able to authorize them. We deactivate the dialog’s decision buttons initially, and use feedback from the eye-tracker to ensure that the user has looked at the permissions. After determining user attention, the buttons are activated. We implemented a prototype of our approach as a Chrome browser extension, and conducted a user study on Facebook’s application authorization dialogs. Using participants’ permission identification, eye-gaze fixations, and authorization decisions, we evaluate participants’ attention towards permissions. The participants who used our approach on authorization dialogs were able to identify the permissions better, compared to the rest of the participants, even after the habituation period. Their average number of eye-gaze fixations on the permission text was significantly higher than that of the other group’s participants. However, when examining the rate at which participants denied a dangerous and unnecessary permission, the hypothesized increase from the control group to the treatment group was not statistically significant.
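
The gating idea can be illustrated as follows; this is only a schematic sketch in Python, whereas the actual prototype is a Chrome browser extension, and the AOI coordinates, fixation format, and duration threshold are assumptions.

    # Sketch: keep decision buttons disabled until every permission AOI has received
    # at least one fixation of a minimum duration.
    def all_permissions_viewed(fixations, permission_aois, min_dur_ms=200):
        """fixations: (x, y, duration_ms); permission_aois: {name: (x0, y0, x1, y1)}."""
        viewed = set()
        for x, y, dur in fixations:
            if dur < min_dur_ms:
                continue
            for name, (x0, y0, x1, y1) in permission_aois.items():
                if x0 <= x <= x1 and y0 <= y <= y1:
                    viewed.add(name)
        return viewed == set(permission_aois)

    aois = {"email": (40, 100, 400, 130), "friend_list": (40, 140, 400, 170)}
    fix = [(120, 115, 260), (150, 155, 310)]
    print(all_permissions_viewed(fix, aois))   # True -> decision buttons may be enabled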


2020 ◽  
Vol 11 (12) ◽  
pp. 5977-5989 ◽  
Author(s):  
Christian Hirt ◽  
Marcel Eckard ◽  
Andreas Kunz

Abstract. In real life, it is well understood how stress can be induced and how it is measured. While virtual reality (VR) applications can resemble such stress inducers, it is still an open question if and how stress can be measured in a non-intrusive way during VR exposure. Usually, the quality of VR applications is estimated by user acceptance in the form of presence. Presence itself describes the individual’s acceptance of a virtual environment as real and is measured by specific questionnaires. Accordingly, it is expected that stress strongly affects this presence and thus also the quality assessment. Consequently, identifying the stress level of a VR user may enable content creators to engage users more immersively by adjusting the virtual environment to the measured stress. In this paper, we thus propose to use a commercially available eye tracking device to detect stress while users are exploring a virtual environment. We describe a user study in which a VR task was implemented to induce stress, while users’ pupil diameter and pulse were measured and evaluated against a self-reported stress level. The results show a statistically significant correlation between self-reported stress and users’ pupil dilation and pulse, indicating that stress measurements can indeed be conducted during the use of a head-mounted display. If this indication can be successfully proven in a larger scope, it will open up a new era of affective VR applications using individual and dynamic adjustments in the virtual environment.
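
As a toy illustration of the kind of analysis described (all numbers invented), the relation between per-trial pupil diameter and self-reported stress can be summarized as a Pearson correlation coefficient.

    # Sketch: Pearson correlation between mean pupil diameter and self-reported stress.
    import numpy as np

    pupil_mm = np.array([3.1, 3.4, 3.9, 4.2, 4.6])    # mean pupil diameter per trial (mm)
    reported = np.array([1, 2, 3, 4, 5])               # self-reported stress rating per trial

    r = np.corrcoef(pupil_mm, reported)[0, 1]
    print("Pearson r between pupil diameter and reported stress: %.2f" % r)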

