eye motion
Recently Published Documents

TOTAL DOCUMENTS: 165 (FIVE YEARS: 30)
H-INDEX: 21 (FIVE YEARS: 3)

2021 ◽ Vol 12 (1) ◽ pp. 48
Author(s): Ihsan Ahmed ◽ Wasif Muhammad ◽ Ali Asghar ◽ Muhammad Jehanzeb Irshad

A quick, simultaneous movement of both eyes in the same direction is called a saccade, and the process of developing an internal model for controlling the eyes’ movements based on visual stimuli is called saccade learning. Humans use this type of eye motion to bring salient objects onto the foveal location of the retina, even when those objects are located randomly in the surrounding environment. Infants are initially unable to perform such eye movements, but sensory information motivates them to start learning saccadic behavior. In this paper, a sensory prediction-error-based, intrinsically motivated model is proposed for learning saccadic eye movements; this approach is more consistent with how biological systems learn saccades. A Predictive Coding/Biased Competition using Divisive Input Modulation (PC/BC-DIM) network is used for saccade learning driven by sensory prediction errors, and the quantification of these errors provides an intrinsic reward. A simulated humanoid agent, iCub, is used to assess and quantify the performance of the proposed model; the metrics used are the percentage mean post-saccadic distance and its standard deviation. The mean post-saccadic distance for the proposed model was less than 1°, which is biologically plausible.
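For readers unfamiliar with PC/BC-DIM, the sketch below illustrates the core idea of using divisive prediction errors as an intrinsic reward signal. It follows the spirit of Spratling's published PC/BC-DIM update equations, but the weight normalization, the constants, and the reward definition here are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def pcbc_dim_step(x, y, W, V, eps1=1e-6, eps2=1e-3):
    """One PC/BC-DIM iteration (after Spratling's formulation).

    x : input vector; y : prediction-neuron activations;
    W : feedforward weights; V : feedback weights (here a
    normalized copy of W, an assumption for this sketch).
    """
    r = V.T @ y                       # top-down reconstruction of the input
    e = x / (eps2 + r)                # divisive prediction error (ratio)
    y = (eps1 + y) * (W @ e)          # multiplicative activation update
    return y, e

def intrinsic_reward(e):
    """Illustrative intrinsic reward: how close the residual errors
    are to 1 (e = 1 means the input is perfectly predicted)."""
    return -np.mean(np.abs(e - 1.0))

# Minimal usage sketch on random data.
rng = np.random.default_rng(0)
W = rng.random((50, 100))             # 50 prediction neurons, 100 inputs
V = W / W.sum(axis=1, keepdims=True)  # feedback weights, row-normalized
x = rng.random(100)
y = np.zeros(50)
for _ in range(50):                   # iterate toward a steady state
    y, e = pcbc_dim_step(x, y, W, V)
print(intrinsic_reward(e))            # reward rises as prediction improves
```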


Author(s): Basil Wahn ◽ Laura Schmitz ◽ Alan Kingstone ◽ Anne Böckler-Raettig

Eye contact is a dynamic social signal that captures attention and plays a critical role in human communication. In particular, direct gaze often accompanies communicative acts in an ostensive function: a speaker directs her gaze towards the addressee to highlight the fact that the message is being intentionally communicated to her. The addressee, in turn, integrates the speaker’s auditory and visual speech signals (i.e., her vocal sounds and lip movements) into a unitary percept. It is an open question whether the speaker’s gaze affects how the addressee integrates the speaker’s multisensory speech signals. We investigated this question using the classic McGurk illusion, an illusory percept created by presenting mismatching auditory (vocal sounds) and visual (lip movements) information. Specifically, we manipulated whether the speaker (a) moved his eyelids up/down (i.e., opened/closed his eyes) prior to speaking or showed no eye motion, and (b) spoke with open or closed eyes. When the speaker’s eyes moved (i.e., opened or closed) before an utterance, and when the speaker spoke with closed eyes, the McGurk illusion was weakened (i.e., addressees reported significantly fewer illusory percepts). In line with previous research, this suggests that eye motion (opening or closing), as well as the closed state of the speaker’s eyes, captured addressees’ attention, thereby reducing the influence of the speaker’s lip movements on the addressees’ audiovisual integration process. Our findings reaffirm the power of speaker gaze to guide attention, showing that its dynamics can modulate low-level processes such as the integration of multisensory speech signals.


2021 ◽ Author(s): Wee Kiat Tay

Emotion analytics is the study of human behavior by analyzing people’s responses when they experience different emotions. In this thesis, we investigate emotion analytics solutions that use computer vision to detect emotions automatically from facial expressions in live video. Because anxiety is an emotion that can lead to more serious conditions such as anxiety disorders and depression, we propose two hypotheses for detecting anxiety from facial expressions. The first hypothesis is that the complex emotion “anxiety” is a subset of the basic emotion “fear”. The second is that anxiety can be distinguished from fear by differences in head and eye motion. We test the first hypothesis by implementing a basic-emotions detector based on the Facial Action Coding System (FACS) to detect fear in videos of anxious faces. When this proves less accurate than we would like, an alternative solution based on Gabor filters is implemented; a comparison between the two finds the Gabor-based solution inferior. The second hypothesis is tested using scatter plots and statistical analysis of the head and eye motions in videos of fear and anxiety expressions, which show that head pitch differs significantly between fear and anxiety. To conclude the thesis, we implement a software system using the FACS-based basic-emotions detector and evaluate it by comparing commercials using the emotions detected from viewers’ facial expressions.
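As a rough illustration of the Gabor-based alternative, the sketch below extracts a small Gabor feature vector from a grayscale face crop with OpenCV. The filter-bank parameters, the pooling scheme (mean response magnitude), and the file name are all assumptions for illustration; the thesis's actual pipeline may differ.

```python
import cv2
import numpy as np

def gabor_features(face_img, ksize=21, sigmas=(2.0, 4.0),
                   thetas=np.arange(0, np.pi, np.pi / 4),
                   lambd=10.0, gamma=0.5):
    """Convolve a grayscale face crop with a small Gabor filter bank
    and return the mean response magnitude per filter as a feature
    vector (bank size and parameters are illustrative assumptions)."""
    feats = []
    for sigma in sigmas:
        for theta in thetas:
            kernel = cv2.getGaborKernel((ksize, ksize), sigma, theta,
                                        lambd, gamma, psi=0)
            response = cv2.filter2D(face_img, cv2.CV_32F, kernel)
            feats.append(np.abs(response).mean())
    return np.array(feats)

# Usage sketch: extract features from one face crop, then feed them to
# any standard classifier (e.g., an SVM) trained on fear/anxiety labels.
face = cv2.imread("face_crop.png", cv2.IMREAD_GRAYSCALE)  # hypothetical file
if face is not None:
    features = gabor_features(face.astype(np.float32) / 255.0)
    print(features.shape)  # 2 sigmas x 4 orientations -> 8 features
```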


2021 ◽ Author(s): Norick R. Bowers ◽ Josselin Gautier ◽ Samantha Lin ◽ Austin Roorda

Human fixational eye movements are so small and precise that high-speed, accurate tools are required to fully reveal their properties and functional roles. Where the fixated image lands on the retina, and how it moves under different levels of visual task demand, is the subject of the current study. An Adaptive Optics Scanning Laser Ophthalmoscope (AOSLO) was used to image and track the retina while presenting Maltese cross, disk, concentric circle, Vernier, and tumbling-E letter fixation targets to healthy subjects. During these passive (static) and active (discriminating) fixation tasks under natural eye motion, the landing position of the target was tracked directly on the retinal image in space and time, yielding both the eye motion and the exact trajectory of the fixated target over the retina. We confirmed that, compared to passive fixation, active tasks elicited a partial inhibition of microsaccades, leading to longer drift periods compensated for by larger corrective saccades. Consequently, the overall spread of fixation positions was larger during active tasks than during passive tasks. The preferred retinal locus of fixation was the same for each task and did not coincide with the location of peak cone density.
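Fixation stability of this kind is commonly quantified with the bivariate contour ellipse area (BCEA); the sketch below computes it from a pair of position traces. This is a standard metric offered for illustration, not necessarily the exact measure used in this study.

```python
import numpy as np

def bcea(x, y, p=0.68):
    """Bivariate contour ellipse area: the area of the ellipse that
    contains proportion p of the position samples. A larger BCEA
    means less stable fixation.

    x, y : horizontal/vertical eye (or target-on-retina) positions,
    in degrees, sampled over one fixation epoch.
    """
    k = -np.log(1.0 - p)                  # from P = 1 - exp(-k)
    sx, sy = np.std(x, ddof=1), np.std(y, ddof=1)
    rho = np.corrcoef(x, y)[0, 1]         # correlation between the two axes
    return 2.0 * np.pi * k * sx * sy * np.sqrt(1.0 - rho ** 2)

# Usage sketch on a synthetic drift-like trace (degrees).
rng = np.random.default_rng(1)
trace = np.cumsum(rng.normal(0, 0.005, size=(2, 480)), axis=1)
print(f"BCEA(68%) = {bcea(trace[0], trace[1]):.4f} deg^2")
```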


Robotics ◽ 2021 ◽ Vol 10 (2) ◽ pp. 54
Author(s): Lorenzo Scalera ◽ Stefano Seriani ◽ Paolo Gallina ◽ Mattia Lentini ◽ Alessandro Gasparetto

In this paper, the authors present a novel architecture for controlling an industrial robot via an eye-tracking interface for artistic purposes. Humans and robots interact through an acquisition system based on an eye-tracker device that allows users to control the motion of a robotic manipulator with their gaze. The feasibility of the robotic system is evaluated in experimental tests in which the robot is teleoperated to draw artistic images. The tool can be used by artists to investigate novel forms of art and, as an assistive technology for artistic drawing and painting, by amputees and people with movement disorders or muscular paralysis, since eye motion is usually preserved in these cases.
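To make the interface concrete, here is a minimal sketch of how raw gaze coordinates might be mapped and smoothed into pen targets for such a system. The screen and canvas dimensions, the smoothing constant, and the omitted robot API are all assumptions, not the authors' implementation.

```python
import numpy as np

# Hypothetical mapping from gaze coordinates on the screen (pixels) to a
# drawing position on the robot's canvas (metres). All constants below
# are assumptions for illustration.
SCREEN_W, SCREEN_H = 1920, 1080          # eye-tracker display, px
CANVAS_W, CANVAS_H = 0.40, 0.30          # drawable area, m
ALPHA = 0.2                              # low-pass factor for jittery gaze

def gaze_to_canvas(gx, gy):
    """Linearly map a gaze point (px) to canvas coordinates (m)."""
    u = np.clip(gx / SCREEN_W, 0.0, 1.0)
    v = np.clip(gy / SCREEN_H, 0.0, 1.0)
    return u * CANVAS_W, (1.0 - v) * CANVAS_H  # flip y: screen down = canvas up

class GazeSmoother:
    """Exponential smoothing so fixational jitter and microsaccades
    do not translate directly into pen motion."""
    def __init__(self, alpha=ALPHA):
        self.alpha, self.state = alpha, None

    def update(self, point):
        p = np.asarray(point, dtype=float)
        if self.state is None:
            self.state = p
        else:
            self.state = self.alpha * p + (1 - self.alpha) * self.state
        return self.state

# Usage sketch: smooth each gaze sample, then send the canvas target to
# the manipulator's Cartesian controller (robot API not shown here).
smoother = GazeSmoother()
for gx, gy in [(960, 540), (970, 548), (985, 560)]:  # fake gaze stream
    target = smoother.update(gaze_to_canvas(gx, gy))
    print(f"pen target: x={target[0]:.3f} m, y={target[1]:.3f} m")
```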


2021 ◽ Vol 46 (4) ◽ pp. 753
Author(s): Ting Luo ◽ Raymond L. Warner ◽ Kaitlyn A. Sapoznik ◽ Brittany R. Walker ◽ Stephen A. Burns
