Single web camera robust interactive eye-gaze tracking method

2015 ◽  
Vol 63 (4) ◽  
pp. 879-886 ◽  
Author(s):  
A. Wojciechowski ◽  
K. Fornalczyk

Abstract Eye-gaze tracking is an aspect of human-computer interaction that is still growing in popularity. Tracking the human gaze point can help control user interfaces and may help evaluate graphical user interfaces. At the same time, professional eye-trackers are very expensive and thus unavailable to most user interface researchers and small companies. The paper presents a very effective, low-cost, computer-vision-based, interactive eye-gaze tracking method. In contrast to other authors' results, the method achieves very high precision (about 1.5 deg horizontally and 2.5 deg vertically) at 20 fps, exploiting a simple HD web camera with reasonable environment restrictions. The paper describes the algorithms used in the eye-gaze tracking method and the results of experimental tests, covering both static absolute point-of-interest estimation and dynamic, functional gaze-controlled cursor steering.
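A minimal sketch of the kind of webcam-based pipeline the abstract describes, not the authors' exact algorithm: locate the face and eye with OpenCV Haar cascades, estimate the pupil centre as the darkest blob inside the eye region, and report a normalised pupil position that a per-user calibration (hypothetical here) would map to screen coordinates.

```python
import cv2

face_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_eye.xml")

def pupil_center(eye_gray):
    """Estimate the pupil centre as the centroid of the darkest pixels."""
    blurred = cv2.GaussianBlur(eye_gray, (7, 7), 0)
    _, thresh = cv2.threshold(blurred, 40, 255, cv2.THRESH_BINARY_INV)
    m = cv2.moments(thresh)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])

def track(frame):
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (fx, fy, fw, fh) in face_cascade.detectMultiScale(gray, 1.3, 5):
        roi = gray[fy:fy + fh, fx:fx + fw]
        for (ex, ey, ew, eh) in eye_cascade.detectMultiScale(roi):
            centre = pupil_center(roi[ey:ey + eh, ex:ex + ew])
            if centre is not None:
                # Normalised pupil position inside the eye box; a calibration
                # step (not shown) would map this to a gaze point on screen.
                return (centre[0] / ew, centre[1] / eh)
    return None

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gaze = track(frame)
    if gaze is not None:
        print("normalised pupil position:", gaze)
    cv2.imshow("frame", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
```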

2020 ◽  
Vol 1518 ◽  
pp. 012020
Author(s):  
Shengfu Lu ◽  
Richeng Li ◽  
Jinan Jiao ◽  
Jiaming Kang ◽  
Nana Zhao ◽  
...  

Author(s):  
Ahmed Hossam EL-Din ◽  
S.S Mekhamer ◽  
Hadi M.El-Helw

This paper presents a comparison between the conventional Perturb and Observe [P&O] method and particle swarm optimization [PSO] based MPPT algorithms for photovoltaic systems under uniform irradiance and temperature. The main idea is to show that the PSO method has a very high tracking speed and can track the MPP under different environmental conditions, in addition to allowing an easy hardware implementation on a low-cost microcontroller. MATLAB simulations are carried out under very challenging conditions, namely changes in irradiance and temperature, which reflect a change in the load [kW]. The results of the proposed PSO tracking method are compared with the conventional [P&O] method in MATLAB/Simulink.
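An illustrative sketch of the two MPPT strategies being compared, written against a hypothetical pv_power(duty) model of the panel-converter chain; it is not the authors' MATLAB/Simulink implementation, and all step sizes and PSO coefficients are example values.

```python
import random

def perturb_and_observe(pv_power, duty=0.5, step=0.01, iters=100):
    """Classic P&O: keep perturbing the duty cycle in the direction that
    increased the measured power, reverse otherwise."""
    p_prev = pv_power(duty)
    direction = 1
    for _ in range(iters):
        duty = min(max(duty + direction * step, 0.0), 1.0)
        p = pv_power(duty)
        if p < p_prev:          # power dropped, so reverse the perturbation
            direction = -direction
        p_prev = p
    return duty

def pso_mppt(pv_power, n_particles=5, iters=40, w=0.6, c1=1.5, c2=1.5):
    """Particle swarm search over the duty cycle for the maximum power point."""
    pos = [random.random() for _ in range(n_particles)]
    vel = [0.0] * n_particles
    best_pos = pos[:]
    best_val = [pv_power(x) for x in pos]
    g_best = max(range(n_particles), key=lambda i: best_val[i])
    for _ in range(iters):
        for i in range(n_particles):
            r1, r2 = random.random(), random.random()
            vel[i] = (w * vel[i]
                      + c1 * r1 * (best_pos[i] - pos[i])
                      + c2 * r2 * (best_pos[g_best] - pos[i]))
            pos[i] = min(max(pos[i] + vel[i], 0.0), 1.0)
            val = pv_power(pos[i])
            if val > best_val[i]:
                best_pos[i], best_val[i] = pos[i], val
                if val > best_val[g_best]:
                    g_best = i
    return best_pos[g_best]

# Toy power curve with a single maximum near duty = 0.62 (purely illustrative).
toy_curve = lambda d: 100 - 400 * (d - 0.62) ** 2
print(perturb_and_observe(toy_curve), pso_mppt(toy_curve))
```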


2021 ◽  
Vol 2120 (1) ◽  
pp. 012030
Author(s):  
J K Tan ◽  
W J Chew ◽  
S K Phang

Abstract The field of Human-Computer Interaction (HCI) has developed tremendously over the past decade. Smartphones and modern computers, which use touch, voice and typing as means of input, are already the norm in society. To further increase the variety of interaction, the human eyes are a good candidate for another form of HCI. The amount of information that the human eyes contain is extremely useful; hence, various methods and algorithms for eye gaze tracking are implemented in multiple sectors. However, some eye-tracking methods require infrared rays to be projected into the eye of the user, which could potentially cause enzyme denaturation when the eye is subjected to those rays under extreme exposure. Therefore, to avoid potential harm from eye-tracking methods that use infrared rays, this paper proposes an image-based eye tracking system using the Viola-Jones algorithm and the Circular Hough Transform (CHT) algorithm. The proposed method uses visible light instead of infrared rays to control the mouse pointer using the eye gaze of the user. This research aims to implement the proposed algorithm so that people with hand disabilities can interact with computers using their eye gaze.
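A hedged sketch of the detection pipeline the abstract names: Viola-Jones (a Haar cascade) to localise the eye, then a Circular Hough Transform on the eye patch to find the iris. The Hough thresholds are illustrative guesses, and the cursor mapping (e.g. via pyautogui) is only indicated in a comment.

```python
import cv2

eye_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_eye.xml")

def iris_position(frame):
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    eyes = eye_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=8)
    for (x, y, w, h) in eyes:
        patch = cv2.medianBlur(gray[y:y + h, x:x + w], 5)
        circles = cv2.HoughCircles(patch, cv2.HOUGH_GRADIENT, dp=1,
                                   minDist=w,          # expect one iris per eye box
                                   param1=100, param2=20,
                                   minRadius=w // 10, maxRadius=w // 3)
        if circles is not None:
            cx, cy, r = circles[0][0]
            # Normalised iris centre inside the detected eye box; a calibration
            # step would translate this into screen coordinates for the cursor.
            return cx / w, cy / h
    return None

cap = cv2.VideoCapture(0)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    pos = iris_position(frame)
    if pos is not None:
        print("iris position (normalised):", pos)
        # pyautogui.moveTo(pos[0] * screen_w, pos[1] * screen_h)  # hypothetical mapping
    cv2.imshow("frame", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
```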


Author(s):  
Prakash Kanade ◽  
Fortune David ◽  
Sunay Kanade

To curb the rising number of car-crash deaths, which are mostly caused by drivers' inattentiveness, a paradigm shift is expected. Knowledge of a driver's gaze area may provide useful details about his or her point of attention. Cars with accurate and low-cost gaze classification systems can increase driver safety. When drivers shift their eyes without turning their heads to look at objects, the margin of error in gaze detection increases. For new consumer electronic applications such as driver tracking systems and novel user interfaces, accurate and efficient eye gaze prediction is critical. Such systems must be able to run efficiently in difficult, unconstrained conditions while using reduced power and expense. A deep learning-based gaze estimation technique has been considered to solve this issue, with an emphasis on a WSN-based Convolutional Neural Network (CNN) system. The proposed study puts forward the following data-science-focused architecture: the first component is a novel neural network model programmed to manipulate any possible visual feature, such as the states of both eyes and the head location, as well as many augmentations; the second is a data fusion approach that incorporates several gaze datasets. However, due to factors such as changes in environmental light, reflections on the surface of glasses, and motion and optical blurring of the captured eye signal, the accuracy of detecting and classifying the pupil centre and corneal reflection centre depends on the car environment. This work also includes pre-trained models, network structures, and datasets for designing and developing CNN-based deep learning models for eye-gaze tracking and classification.
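A hedged sketch of a CNN gaze-estimation model of the kind the abstract discusses: a small convolutional branch per eye plus a head-pose vector, fused into a regression head that predicts gaze yaw and pitch. The layer sizes, 36x60 eye-crop resolution, and 3-element head-pose input are illustrative assumptions, not the paper's architecture.

```python
import torch
import torch.nn as nn

class EyeBranch(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Flatten(),
            nn.Linear(32 * 9 * 15, 128), nn.ReLU(),   # sized for 36x60 eye crops
        )

    def forward(self, x):
        return self.net(x)

class GazeNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.left, self.right = EyeBranch(), EyeBranch()
        self.head = nn.Sequential(nn.Linear(3, 32), nn.ReLU())   # head pose (roll, pitch, yaw)
        self.out = nn.Sequential(nn.Linear(128 + 128 + 32, 64), nn.ReLU(),
                                 nn.Linear(64, 2))                # gaze yaw, pitch

    def forward(self, left_eye, right_eye, head_pose):
        feats = torch.cat([self.left(left_eye), self.right(right_eye),
                           self.head(head_pose)], dim=1)
        return self.out(feats)

# Smoke test with random tensors shaped like grayscale 36x60 eye crops.
model = GazeNet()
gaze = model(torch.randn(4, 1, 36, 60), torch.randn(4, 1, 36, 60), torch.randn(4, 3))
print(gaze.shape)  # torch.Size([4, 2])
```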


2008 ◽  
Vol 20 (5) ◽  
pp. 319-337 ◽  
Author(s):  
Eui Chul Lee ◽  
Kang Ryoung Park

1997 ◽  
Vol 487 ◽  
Author(s):  
W. K. Warburton ◽  
D. A. Darknell ◽  
B. Hubbard-Nelson

Abstract The XIA DXP-4C is a 4-channel, CAMAC-based x-ray spectrometer which digitally processes directly digitized preamplifier signals. The DXP-4C was designed for instrumenting multi-detector arrays for synchrotron radiation applications and optimized for very high count rates at a low cost per detector channel. This produced a very compact and low-power (3.4 W/channel) instrument for its count rate and MCA capabilities, which thus provides a strong basis for portable applications. Because all functions are digitally controlled, it can be readily adapted to various user interfaces, including remote access interfaces. Here we describe the design and examine approaches to lowering its power to 50 mW/channel. We then consider the issues in applying it to three typical portable or remote spectrometry applications.
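The abstract notes that the DXP-4C digitally processes digitized preamplifier signals. As an illustration only, and assuming a trapezoidal shaping filter (a common choice for such processing, not stated in the abstract), the sketch below applies the filter to a synthetic step-like preamplifier pulse and reads the pulse height off the flat top.

```python
import numpy as np

def trapezoidal_filter(signal, rise=20, flat=10):
    """Difference of two delayed moving averages: a simple digital shaper."""
    kernel = np.concatenate([np.ones(rise), np.zeros(flat), -np.ones(rise)]) / rise
    return np.convolve(signal, kernel, mode="full")[:len(signal)]

# Synthetic preamplifier trace: a noisy step of height 1.0 at sample 200.
rng = np.random.default_rng(0)
trace = np.where(np.arange(1000) >= 200, 1.0, 0.0) + 0.02 * rng.standard_normal(1000)

shaped = trapezoidal_filter(trace)
print("estimated pulse height:", shaped.max())  # close to 1.0, read from the flat top
```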


2015 ◽  
Vol 45 (4) ◽  
pp. 419-430 ◽  
Author(s):  
Yiu-ming Cheung ◽  
Qinmu Peng
Keyword(s):  
Eye Gaze ◽  
