Direction Estimation Model for Gaze Controlled Systems

2016, Vol 9 (6)
Author(s): Anjana Sharma, Pawanesh Abrol

Gaze detection requires estimating the position of, and the relation between, the user's pupil and the glint. This position is mapped into the region of interest by detecting the glint coordinates with different edge detectors and then inferring the gaze direction. In this research paper, a Gaze Direction Estimation (GDE) model is proposed for the comparative analysis of two standard edge detectors, Canny and Sobel, for automatic detection of the glint, its coordinates, and subsequently the gaze direction. The results indicate a fairly good percentage of cases in which the correct glint coordinates, and subsequently the correct gaze direction quadrants, are estimated. These results can further be used to improve the accuracy and performance of different eye-gaze-based systems.
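As an illustration of the pipeline this abstract describes, the following is a minimal sketch, not the authors' implementation, of glint localization with the two compared edge detectors (Canny and Sobel) and a simple mapping of the glint coordinates to a gaze-direction quadrant. The thresholds, brightness-weighting scheme, and quadrant rule are assumptions made for demonstration, and `eye_roi.png` is a hypothetical input.

```python
import cv2
import numpy as np

def glint_coordinates(eye_roi_gray, detector="canny"):
    """Estimate (x, y) of the glint as the brightness-weighted centroid of edge pixels."""
    if detector == "canny":
        edges = cv2.Canny(eye_roi_gray, 100, 200)
    else:
        # Sobel: gradient magnitude, binarized at mean + 2*std
        gx = cv2.Sobel(eye_roi_gray, cv2.CV_64F, 1, 0, ksize=3)
        gy = cv2.Sobel(eye_roi_gray, cv2.CV_64F, 0, 1, ksize=3)
        mag = cv2.magnitude(gx, gy)
        edges = np.uint8(mag > mag.mean() + 2 * mag.std()) * 255

    ys, xs = np.nonzero(edges)
    if len(xs) == 0:
        return None
    # The glint is a small specular highlight, so weight edge pixels by brightness.
    weights = eye_roi_gray[ys, xs].astype(float) + 1.0
    return int(np.average(xs, weights=weights)), int(np.average(ys, weights=weights))

def gaze_quadrant(glint_xy, roi_shape):
    """Map glint coordinates to one of four gaze-direction quadrants of the ROI."""
    h, w = roi_shape[:2]
    x, y = glint_xy
    return ("up" if y < h / 2 else "down") + "-" + ("left" if x < w / 2 else "right")

# Hypothetical usage:
# roi = cv2.imread("eye_roi.png", cv2.IMREAD_GRAYSCALE)
# print(gaze_quadrant(glint_coordinates(roi, "canny"), roi.shape))
```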

2021
Author(s): Fumihiro Kano, Takeshi Furuichi, Chie Hashimoto, Christopher Krupenye, Jesse G Leinwand, ...

The gaze-signaling hypothesis and the related cooperative-eye hypothesis posit that humans have evolved special external eye morphology, including exposed white sclera (the white of the eye), to enhance the visibility of eye-gaze direction and thereby facilitate conspecific communication through joint-attentional interaction and ostensive communication. However, recent quantitative studies have questioned these hypotheses based on findings that humans are not necessarily unique in certain eye features compared to other great ape species. There is therefore an ongoing debate on whether the external eye features of humans are distinguishable from those of other apes and how such features contribute to the visibility of eye-gaze direction. This study leveraged updated image analysis techniques to test the uniqueness of human eye features in facial images of great apes. Although many eye features were similar between humans and other species, a key difference was that humans have a uniformly white sclera, which creates clear visibility of both the eye outline and the iris: the two essential features contributing to the visibility of eye-gaze direction. We then tested the robustness of the visibility of these features against visual noise such as darkening and distancing, and found that both features remain detectable in the human eye, while the eye outline becomes barely detectable in other species under these visually challenging conditions. Overall, we identified that humans have external eye morphology that is distinct among the great apes and that ensures the robustness of the eye-gaze signal across varied visual conditions. Our results support, and also critically update, the central premises of the gaze-signaling hypothesis.
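The key quantities in this argument are two luminance contrasts: iris against sclera (the gaze-direction cue) and sclera against surrounding skin (the eye outline). The toy sketch below, which is not the authors' analysis pipeline, illustrates how darkening can push one boundary below a detection threshold while the other survives; all luminance values, the additive noise floor, and the threshold are invented for demonstration.

```python
def michelson(a, b):
    """Michelson contrast between two mean luminance values."""
    return abs(a - b) / (a + b)

def visibility_under_darkening(sclera, iris, skin,
                               gains=(1.0, 0.5, 0.25),
                               noise_floor=0.02, threshold=0.1):
    """Scale luminances to mimic dimmer viewing; report which boundaries stay above threshold."""
    report = []
    for g in gains:
        s = sclera * g + noise_floor
        i = iris * g + noise_floor
        k = skin * g + noise_floor
        report.append({
            "gain": g,
            "gaze_cue_visible": michelson(s, i) > threshold,   # iris vs. sclera
            "outline_visible": michelson(s, k) > threshold,    # sclera vs. skin
        })
    return report

# Invented luminances: a uniformly white sclera (human-like) vs. a darker sclera (ape-like).
print(visibility_under_darkening(sclera=0.85, iris=0.15, skin=0.45))
print(visibility_under_darkening(sclera=0.30, iris=0.15, skin=0.35))
```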


2021
Author(s): Fumihiro Kano, Yuri Kawaguchi, Hanling Yeow

Hallmark social activities of humans, such as cooperation and cultural learning, involve eye-gaze signaling through joint attentional interaction and ostensive communication. The gaze-signaling and related cooperative-eye hypotheses posit that humans evolved unique external eye morphology, including exposed white sclera (the white of the eye), to enhance the visibility of eye-gaze for conspecifics. However, experimental evidence is still lacking. This study tested the ability of human and chimpanzee participants to detect the eye-gaze directions of human and chimpanzee images in computerized tasks. We varied the brightness and size of the stimulus images to examine the robustness of the eye-gaze directional signal under visually challenging conditions. We found that both humans and chimpanzees detected the gaze direction of the human eye better than that of the chimpanzee eye, particularly when the eye stimuli were darker and smaller. Participants of both species also detected the gaze direction of the chimpanzee eye better when its color was inverted than when it was normal; that is, when the chimpanzee eye had an artificial white sclera. White sclera thus enhances the visibility of eye-gaze direction even across species, particularly under visually challenging conditions. Our findings supported, but also critically updated, the central premises of the gaze-signaling hypothesis.


1999, Vol 42 (3), pp. 526-539
Author(s): Charissa R. Lansing, George W. McConkie

Two experiments were conducted to test the hypothesis that visual information related to segmental versus prosodic aspects of speech is distributed differently on the face of the talker. In the first experiment, eye gaze was monitored for 12 observers with normal hearing. Participants made decisions about segmental and prosodic categories for utterances presented without sound. The first experiment found that observers spent more time looking at, and directed more gazes toward, the upper part of the talker's face when making decisions about intonation patterns than about the words being spoken. The second experiment tested the Gaze Direction Assumption underlying Experiment 1: that people direct their gaze to the stimulus region containing the information required for their task. In this experiment, 18 observers with normal hearing made decisions about segmental and prosodic categories under conditions in which face motion was restricted to selected areas of the face. The results indicate that information in the upper part of the talker's face is more critical for intonation pattern decisions than for decisions about word segments or primary sentence stress, thus supporting the Gaze Direction Assumption. Proficiency in visual speech perception requires learning where to direct visual attention for cues related to different aspects of speech.


2007, Vol 60 (9), pp. 1169-1177
Author(s): Roger Watt, Ben Craven, Sandra Quinn

The human eye is unique amongst those of primates in having white sclera against which the dark iris is clearly visible. This high-contrast structure makes the gaze direction of a human potentially easily perceptible to others. For a social creature such as a human, the ability to perceive the direction of another's gaze may be very useful, since gaze usually signals attention. We report data showing that the accuracy of gaze deviation detection is independent of viewing distance up to a certain critical distance, beyond which it collapses. This is, of itself, surprising since most visual tasks are performed better at closer viewing distances. Our data also show that the critical distance, but not accuracy, is affected by the position of the eyebrows so that lowering the eyebrows reduces the critical distance. These findings show that mechanisms exist by which humans could expand or restrict the availability of their gaze direction to others. A way to regulate the availability of the gaze direction signal could be an advantage. We show that an interpretation of eyebrow function in these terms provides a novel explanation for several well-known eyebrow actions, including the eyebrow flash.


Author(s): Hideaki Touyama, Mitsuru Sakuda

In this paper, we propose a brain-computer interface (BCI) based on collaborative steady-state visually evoked potentials (SSVEP). A technique for estimating the common gaze direction of multiple subjects is studied with a view to controlling a virtual object in a virtual environment. The electroencephalograms (EEG) of eight volunteers are recorded simultaneously while two virtual cubes serve as visual stimuli. The two cubes flicker at different rates, 6 Hz and 8 Hz, and the corresponding SSVEP is observed around the occipital area. The amplitude spectra of the EEG activity of the individual subjects are analyzed, averaged, and synthesized to obtain the collaborative SSVEP. Machine learning is applied to estimate the common gaze direction of the eight subjects using supervised data from fewer than eight subjects. Perfect estimation accuracy is achieved only with the collaborative SSVEP. One-dimensional control of a virtual ball is performed by steering the common eye gaze direction, which induces the collaborative SSVEP.
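A minimal sketch of the collaborative-SSVEP idea described above: amplitude spectra are averaged across simultaneously recorded subjects, and the attended cube is chosen by comparing spectral power at the two flicker frequencies. The sampling rate, trial length, synthetic signals, and the simple power comparison (standing in for the paper's supervised machine-learning step) are all illustrative assumptions.

```python
import numpy as np

FS = 256            # assumed sampling rate (Hz)
FREQS = (6.0, 8.0)  # flicker rates of the two virtual cubes

def amplitude_spectrum(eeg_trial):
    """eeg_trial: 1-D occipital-channel signal for one subject and one trial."""
    spec = np.abs(np.fft.rfft(eeg_trial))
    freqs = np.fft.rfftfreq(len(eeg_trial), d=1.0 / FS)
    return freqs, spec

def collaborative_gaze_estimate(trials):
    """trials: list of 1-D arrays, one per subject, recorded simultaneously."""
    freqs, _ = amplitude_spectrum(trials[0])
    # Average the individual amplitude spectra to form the collaborative SSVEP.
    mean_spec = np.mean([amplitude_spectrum(t)[1] for t in trials], axis=0)
    # Decide the common gaze target by the stronger of the two flicker frequencies.
    powers = [mean_spec[np.argmin(np.abs(freqs - f))] for f in FREQS]
    return FREQS[int(np.argmax(powers))]

# Synthetic demo: all eight subjects attend the 8 Hz cube.
rng = np.random.default_rng(0)
t = np.arange(FS * 4) / FS  # 4-second trials
subjects = [np.sin(2 * np.pi * 8.0 * t) + rng.normal(0, 2.0, t.size) for _ in range(8)]
print(collaborative_gaze_estimate(subjects))  # expected: 8.0
```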


2016
Author(s): Helton M. Peixoto, Ana M. G. Guerreiro, Adrião D. D. Neto

Author(s): V. Vijaya Kishore, R.V.S. Satyanarayana

A key requirement for clinical diagnosis and treatment is a processing procedure that is widely adaptable. Computer-aided diagnosis (CAD) of various medical conditions has grown tremendously in recent years, and with expanding computing capacity, CAD frameworks are reaching new domains. The goal of the proposed work is to build an easy-to-use, multifunctional GUI tool for CAD that performs intelligent processing of lung CT images. The implemented functions perform region of interest (ROI) segmentation for nodule detection. Nodule extraction from the ROI is implemented by morphological operations, reducing complexity and making the system suitable for real-time applications. In addition, an interactive 3D viewer and a performance-measurement tool that quantifies and measures the nodules are integrated. The results are validated by a clinical expert. This serves as a foundation for determining the choice of treatment and the prospect of recovery.
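The following is a minimal sketch, not the authors' GUI tool, of the ROI segmentation and morphology-based nodule extraction the abstract outlines, applied to a single lung CT slice with OpenCV. The thresholds, kernel size, plausible nodule area range, and input file name are illustrative assumptions.

```python
import cv2
import numpy as np

def nodule_candidates(ct_slice_gray, min_area=10, max_area=500):
    # 1. Segment the lung ROI: Otsu threshold, keeping the dark lung fields.
    _, lung_mask = cv2.threshold(ct_slice_gray, 0, 255,
                                 cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    # 2. Morphological cleanup: close small holes in the lung mask.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    lung_mask = cv2.morphologyEx(lung_mask, cv2.MORPH_CLOSE, kernel)

    # 3. Nodules appear as bright blobs inside the lung fields:
    #    threshold the masked slice and open to drop speckle.
    inside = cv2.bitwise_and(ct_slice_gray, ct_slice_gray, mask=lung_mask)
    _, bright = cv2.threshold(inside, 180, 255, cv2.THRESH_BINARY)
    bright = cv2.morphologyEx(bright, cv2.MORPH_OPEN, kernel)

    # 4. Keep connected components in a plausible nodule size range and report
    #    a simple quantitative measure (centroid, area) for each candidate.
    n, labels, stats, centroids = cv2.connectedComponentsWithStats(bright)
    return [{"centroid": tuple(centroids[i]), "area_px": int(stats[i, cv2.CC_STAT_AREA])}
            for i in range(1, n)
            if min_area <= stats[i, cv2.CC_STAT_AREA] <= max_area]

# Hypothetical usage:
# ct = cv2.imread("lung_ct_slice.png", cv2.IMREAD_GRAYSCALE)
# print(nodule_candidates(ct))
```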


2014, Vol 2014, pp. 1-11
Author(s): Dina Tell, Denise Davidson, Linda A. Camras

The effects of eye gaze direction and expression intensity on emotion recognition in children with autism disorder and typically developing children were investigated. Children with autism disorder and typically developing children identified happy and angry expressions equally well. Children with autism disorder, however, were less accurate in identifying fear expressions across intensities and eye gaze directions. Children with autism disorder also rated expressions with direct eye gaze, and expressions at 50% intensity, as more intense than typically developing children did. A trend was also found for sad expressions, as children with autism disorder were less accurate than typically developing children in recognizing sadness at 100% intensity with direct eye gaze. Although the present research showed that children with autism disorder are sensitive to eye gaze direction, impairments in the recognition of fear, and possibly sadness, exist. Furthermore, children with autism disorder and typically developing children perceive the intensity of emotional expressions differently.

