An Extended Method for Saccadic Eye Movement Measurements Using a Head-Mounted Display

Healthcare ◽  
2020 ◽  
Vol 8 (2) ◽  
pp. 104 ◽  
Author(s):  
Youngkeun Lee ◽  
Yadav Sunil Kumar ◽  
Daehyeon Lee ◽  
Jihee Kim ◽  
Junggwon Kim ◽  
...  

Saccadic eye movement is an important ability in daily life and is especially important in driving and sports. Traditionally, the Developmental Eye Movement (DEM) test and the King–Devick (K-D) test have been used to measure saccadic eye movement, but these yield only an “adjusted time”. A different approach is therefore required to obtain eye movement speed and reaction rate in more detail, because some eye movements are rapid while others are slow. This study proposed an extended method that can acquire the “rest time” and “transfer time”, as well as the “adjusted time”, by implementing a virtual reality-based DEM test using a FOVE virtual reality (VR) head-mounted display (HMD) equipped with an eye-tracking module. This approach was tested in 30 subjects with normal vision and no ophthalmologic disease at a 2-diopter (50-cm) viewing distance. It allowed measurement of the “adjusted time” and the “rest time” for focusing on each target number character and the “transfer time” for moving to the next target number character, as well as recording of the gaze-tracking log. The results of this experiment showed that the proposed method can analyze more parameters of saccadic eye movement than the traditional methods.
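As an illustration of how such additional timing parameters could be derived, the sketch below (not the authors' implementation) computes per-target “rest time” and inter-target “transfer time” from a hypothetical gaze log in which each sample has already been labeled with the target character the gaze falls on, or `None` during a saccade.

```python
# Illustrative sketch (not the authors' implementation): deriving per-target
# "rest time" and "transfer time" from a gaze-tracking log, assuming each
# sample is (timestamp_s, target_index or None), where target_index is the
# DEM chart character the gaze currently falls on.
from typing import Optional, List, Tuple, Dict

def rest_and_transfer_times(log: List[Tuple[float, Optional[int]]]):
    """Return (rest_times, transfer_times) in seconds.

    rest_times[i]     -- total dwell time on target i
    transfer_times[k] -- gap between leaving one target and landing on the next
    """
    rest: Dict[int, float] = {}
    transfers: List[float] = []
    prev_t, prev_target = log[0]
    leave_time = None  # time the gaze last left a target

    for t, target in log[1:]:
        dt = t - prev_t
        if prev_target is not None:
            rest[prev_target] = rest.get(prev_target, 0.0) + dt
        if prev_target is not None and target is None:
            leave_time = t                    # gaze left a target
        elif prev_target is None and target is not None and leave_time is not None:
            transfers.append(t - leave_time)  # gaze landed on the next target
            leave_time = None
        prev_t, prev_target = t, target

    return rest, transfers

# Hypothetical samples: gaze rests on target 0, moves, then rests on target 1.
log = [(0.00, 0), (0.05, 0), (0.10, None), (0.15, None), (0.20, 1), (0.30, 1)]
rest, transfers = rest_and_transfer_times(log)
print(rest)       # ≈ {0: 0.10, 1: 0.10}
print(transfers)  # ≈ [0.10]
```

The “adjusted time” itself would still come from the chart-reading time corrected for errors, as in the conventional DEM scoring.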

2017 ◽  
Vol 50 (5) ◽  
pp. 772-786 ◽  
Author(s):  
C-S Lee ◽  
J-H Lee ◽  
H Pak ◽  
SW Park ◽  
D-W Song

This paper evaluates the detectability of the phantom array and stroboscopic effects during light source motion, eye movement, and their combination, using time-modulated light-emitting diode (LED) light sources. It is well known that the phantom array can be observed when time-modulated light sources are viewed during saccadic eye movements. We investigated whether light source motion can cause similar effects when the subject's eyes are fixed. In addition, we estimated the detectability threshold frequency for the combination of the stroboscopic effect and the phantom array, which we name the stroboscopic-phantom array effect, during eye movements in two opposite directions under a unidirectionally rotating light source with variable speed. Our results indicate that one of the most important factors for the stroboscopic-phantom array effect is the eye movement speed relative to the speed of the light source. Accordingly, time-modulated moving light sources induce a stroboscopic effect in subjects with fixed eyes that is similar to the stroboscopic-phantom array effect observed during saccadic eye movement. Our findings are likely to be useful for predicting the stroboscopic effect and the stroboscopic-phantom array effect during the fast motion of time-modulated LED light sources, such as multi-functional rear lamps, in automotive lighting applications.
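A rough back-of-envelope sketch of why relative speed matters (an illustrative simplification, not the authors' model): the angular separation between successive flash images scales with the source's angular velocity relative to the eye divided by the modulation frequency, and a discrete array can only be resolved if that separation is large enough. The threshold value below is a placeholder, not a measured quantity.

```python
# Back-of-envelope sketch (not the authors' model): separation of successive
# flash images grows with the source speed *relative to the eye* divided by
# the modulation frequency; a discrete "array" rather than a smooth streak
# becomes plausible once that separation exceeds a resolvable threshold.
def image_separation_deg(eye_speed_deg_s: float,
                         source_speed_deg_s: float,
                         mod_frequency_hz: float) -> float:
    """Angular separation (deg) between consecutive flash images."""
    relative_speed = abs(eye_speed_deg_s - source_speed_deg_s)
    return relative_speed / mod_frequency_hz

THRESHOLD_DEG = 0.1  # hypothetical resolvability threshold, placeholder only

# Fixed eye, source moving at 300 deg/s, 2 kHz modulation:
sep = image_separation_deg(0.0, 300.0, 2000.0)
print(sep, sep > THRESHOLD_DEG)  # 0.15 True -> array-like percept plausible
```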


Sensors ◽  
2021 ◽  
Vol 21 (14) ◽  
pp. 4769
Author(s):  
Cristina Palmero ◽  
Abhishek Sharma ◽  
Karsten Behrendt ◽  
Kapil Krishnakumar ◽  
Oleg V. Komogortsev ◽  
...  

This paper summarizes the OpenEDS 2020 Challenge dataset, the proposed baselines, and the results obtained by the top three winners of each competition: (1) the Gaze Prediction Challenge, whose goal was to predict the gaze vector 1 to 5 frames into the future from a sequence of previous eye images, and (2) the Sparse Temporal Semantic Segmentation Challenge, whose goal was to use temporal information to propagate semantic eye labels to contiguous eye image frames. Both competitions were based on the OpenEDS2020 dataset, a novel dataset of eye-image sequences captured at a frame rate of 100 Hz under controlled illumination, using a virtual-reality head-mounted display with two synchronized eye-facing cameras. The dataset, which we make publicly available for the research community, consists of 87 subjects performing several gaze-elicited tasks and is divided into two subsets, one for each competition task. The proposed baselines, based on deep learning approaches, obtained an average angular error of 5.37 degrees for gaze prediction and a mean intersection over union (mIoU) score of 84.1% for semantic segmentation. The winning solutions outperformed the baselines, achieving as low as 3.17 degrees for the former task and up to 95.2% mIoU for the latter.
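For reference, the two metrics quoted above can be computed roughly as follows; the official OpenEDS2020 evaluation scripts may differ in details such as averaging order or the handling of classes absent from an image.

```python
# Minimal sketch of the two evaluation metrics: mean angular error between
# predicted and ground-truth 3D gaze vectors, and mean IoU over label maps.
import numpy as np

def mean_angular_error_deg(pred: np.ndarray, gt: np.ndarray) -> float:
    """pred, gt: (N, 3) gaze direction vectors; mean angle between them in degrees."""
    pred = pred / np.linalg.norm(pred, axis=1, keepdims=True)
    gt = gt / np.linalg.norm(gt, axis=1, keepdims=True)
    cos = np.clip(np.sum(pred * gt, axis=1), -1.0, 1.0)
    return float(np.degrees(np.arccos(cos)).mean())

def mean_iou(pred: np.ndarray, gt: np.ndarray, num_classes: int) -> float:
    """pred, gt: integer label maps of equal shape; mIoU over classes present."""
    ious = []
    for c in range(num_classes):
        inter = np.logical_and(pred == c, gt == c).sum()
        union = np.logical_or(pred == c, gt == c).sum()
        if union > 0:
            ious.append(inter / union)
    return float(np.mean(ious))
```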


2016 ◽  
Vol 9 (6) ◽  
Author(s):  
Jung-Ho Kim ◽  
Ho-Jun Son ◽  
Sung-Jin Lee ◽  
Deok-Young Yun ◽  
Soon-Chul Kwon ◽  
...  

By transplanting the Developmental Eye Movement (DEM) test chart to a virtual reality head-mounted display (VR HMD) system, this study sought to evaluate the effectiveness of the DEM test for measuring dynamic visual acuity. Thirty-nine adults aged 20–39 years of both genders were the subjects of the study. After their visual function was assessed through a medical questionnaire, interpupillary distance (IPD), near point of convergence (NPC), near point of accommodation (NPA), and far and near phoria, the correlation between the tests was analyzed by performing the DEM vertical and horizontal tests and the VR HMD DEM (VHD) vertical and horizontal tests. NPC and NPA decreased significantly after the VHD test, while phoria did not. The horizontal test was quicker than the vertical in the DEM test, and vice versa in the VHD test. DEM was quicker than VHD in both the vertical and horizontal directions. There was no notable difference in error frequency between DEM and VHD. For both the DEM and VHD tests, there was no notable difference in short-range IPD or subjective symptoms between the top 10 and bottom 10 subjects. There was also no notable difference between the exercise and non-exercise groups or between the game and non-game groups. The performance time for VHD, in which the chart must be read while moving the body, was longer than that of DEM. Therefore, based on the consistency of the results of both tests and the lack of a difference in error frequency and subjective symptoms, the VHD equipment proposed in this study is effective as dynamic visual acuity measurement equipment. In addition, the lack of a difference between the exercise and non-exercise groups and between the game and non-game groups shows that the amount of exercise and gaming by an ordinary person does not influence their dynamic visual function.
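For context, comparisons between DEM horizontal and vertical sub-test times conventionally use an error-adjusted horizontal time and the horizontal-to-vertical ratio. The sketch below shows that standard convention with made-up numbers; it is not taken from this study and may differ from its exact scoring.

```python
# Conventional DEM scoring sketch (illustrative, not this study's procedure):
# the horizontal time is adjusted for omission/addition errors, then the
# ratio of adjusted horizontal time to vertical time is reported.
def adjusted_horizontal_time(raw_time_s: float, omissions: int, additions: int,
                             n_targets: int = 80) -> float:
    """Scale the raw horizontal time by the number of targets actually required."""
    return raw_time_s * n_targets / (n_targets - omissions + additions)

def dem_ratio(adj_horizontal_s: float, vertical_s: float) -> float:
    """Horizontal-to-vertical time ratio used to separate eye-movement from naming speed."""
    return adj_horizontal_s / vertical_s

# Hypothetical numbers, for illustration only:
adj_h = adjusted_horizontal_time(45.0, omissions=2, additions=1)
print(round(adj_h, 2), round(dem_ratio(adj_h, vertical_s=40.0), 2))  # 45.57 1.14
```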


Author(s):  
Konstantin Ryabinin ◽  
Konstantin Belousov ◽  
Roman Chumakov

This paper is devoted to extending the previously created unified pipeline for conducting eye-tracking-based experiments in a virtual reality environment. In the previous work, we proposed using the SciVi semantic data mining platform, Unreal Engine, and an HTC Vive Pro Eye head-mounted display to study the reading process in immersive virtual reality. The currently proposed extension makes it possible to handle so-called polycode stimuli: compound visual objects that consist of individual parts carrying different semantics for the viewer. To segment polycode stimuli into areas of interest (regions where the informant's eye gaze is tracked), we adopt the Creative Maps Studio vector graphics editor. To integrate Creative Maps Studio into the existing pipeline, we created plugins for the SciVi platform to load and handle the segmented stimuli, place them in virtual reality scenes, collect the corresponding eye gaze tracking data, and perform visual analysis of the collected data. To analyze the eye gaze tracks, we utilize a circular graph that allows comprehensive visualization of hierarchical areas of interest (mapping them to color-coded graph nodes grouped into the hierarchy with the help of a multilevel circular scale) and the corresponding eye movements (mapped to the graph edges). We tested our pipeline on two different stimuli: an advertising poster and the painting “The Appearance of Christ Before the People” by A. Ivanov (1857).
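As an illustration of what such a graph's edges encode, the sketch below (not the SciVi plugin code) counts gaze transitions between consecutive fixations that have already been mapped to hierarchical area-of-interest labels; the labels and stimulus names are hypothetical.

```python
# Illustrative sketch: deriving edge weights of an AOI transition graph from
# a gaze track whose fixations are already mapped to hierarchical AOI labels.
# These edge counts are what the circular graph's edges would visualize.
from collections import Counter
from typing import List

def aoi_transitions(fixation_aois: List[str]) -> Counter:
    """Count gaze transitions between consecutive, distinct AOIs."""
    edges: Counter = Counter()
    for src, dst in zip(fixation_aois, fixation_aois[1:]):
        if src != dst:
            edges[(src, dst)] += 1
    return edges

# Hypothetical fixation sequence over a segmented poster stimulus:
track = ["poster/title", "poster/image", "poster/image", "poster/caption", "poster/title"]
print(aoi_transitions(track))
# Counter({('poster/title', 'poster/image'): 1,
#          ('poster/image', 'poster/caption'): 1,
#          ('poster/caption', 'poster/title'): 1})
```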

