gaze shift
Recently Published Documents

TOTAL DOCUMENTS: 146 (last five years: 20)
H-INDEX: 34 (last five years: 1)

2022 · Vol 12 (1) · pp. 83
Author(s): Sohaib Siddique Butt, Mahnoor Fatima, Ali Asghar, Wasif Muhammad

Sound source localization (SSL) and shifting gaze toward the sound source are integral parts of a socially interactive humanoid robot's perception system. In noisy and reverberant environments, it is non-trivial to estimate the location of a sound source and accurately shift gaze in its direction. Previous SSL algorithms fall short in approximating the distance to audio sources and in accurately detecting, interpreting, and differentiating the actual sound from comparable sound sources in challenging acoustic environments. In this article, a learning-based model is presented to achieve noise- and reverberation-resistant sound source localization in real-world scenarios. The proposed system uses a multi-layered Generalized Cross-Correlation with Phase Transform (GCC-PHAT) signal processing technique as a baseline for a Generalized Cross-Correlation Convolutional Neural Network (GCC-CNN) model. The proposed model is integrated with an efficient rotation algorithm to predict and orient toward the sound source. The performance of the proposed method is compared with state-of-the-art deep network-based sound source localization methods. The proposed method outperforms the existing neural network-based approaches, achieving the highest accuracy of 96.21% for an active binaural auditory perceptual system.
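The GCC-PHAT baseline named in this abstract is a standard signal-processing technique for estimating the time difference of arrival (TDOA) between two microphones, from which an orientation angle toward the source can be derived. Below is a minimal sketch of a generic GCC-PHAT estimator in Python (NumPy only); the sampling rate, microphone spacing, and simulated delay are illustrative assumptions, not details from the paper.

```python
import numpy as np

def gcc_phat(sig, ref, fs, max_tau=None, interp=16):
    """Estimate the TDOA between two signals with GCC-PHAT:
    the cross-power spectrum is whitened to keep only phase information."""
    n = sig.shape[0] + ref.shape[0]
    SIG = np.fft.rfft(sig, n=n)
    REF = np.fft.rfft(ref, n=n)
    R = SIG * np.conj(REF)
    R /= np.abs(R) + 1e-15                      # PHAT weighting
    cc = np.fft.irfft(R, n=interp * n)          # interpolated cross-correlation
    max_shift = interp * n // 2
    if max_tau is not None:
        max_shift = min(int(interp * fs * max_tau), max_shift)
    cc = np.concatenate((cc[-max_shift:], cc[:max_shift + 1]))
    shift = np.argmax(np.abs(cc)) - max_shift
    return shift / float(interp * fs)           # TDOA in seconds

# Illustrative usage: two microphones 0.2 m apart, 16 kHz sampling,
# right channel delayed by 5 samples relative to the left channel.
fs, mic_distance, c = 16000, 0.2, 343.0
t = np.arange(fs) / fs
left = np.sin(2 * np.pi * 440 * t) + 0.01 * np.random.randn(fs)
right = np.roll(left, 5)
tau = gcc_phat(left, right, fs, max_tau=mic_distance / c)
azimuth = np.degrees(np.arcsin(np.clip(tau * c / mic_distance, -1.0, 1.0)))
print(f"TDOA = {tau * 1e3:.3f} ms, estimated azimuth = {azimuth:.1f} deg")
```

A multi-layered GCC-PHAT front end, as described in the abstract, would stack such correlation features (e.g. per frequency band or frame) as input channels for the GCC-CNN model.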


2021 · Vol 12 (1)
Author(s): Jeroen Brus, Helena Aebersold, Marcus Grueschow, Rafael Polania

Confidence, the subjective estimate of decision quality, is a cognitive process necessary for learning from mistakes and guiding future actions. The origins of confidence judgments resulting from economic decisions remain unclear. We devise a task and computational framework that allowed us to formally tease apart the impact of various sources of confidence in value-based decisions, such as uncertainty emerging from encoding and decoding operations, as well as the interplay between gaze-shift dynamics and attentional effort. In line with canonical decision theories, trial-to-trial fluctuations in the precision of value encoding impact economic choice consistency. However, this uncertainty has no influence on confidence reports. Instead, confidence is associated with endogenous attentional effort towards choice alternatives and downstream noise in the comparison process. These findings provide an explanation for confidence (mis)attributions in value-guided behaviour, suggesting mechanistic influences of endogenous attentional states for guiding decisions and metacognitive awareness of choice certainty.
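The reported dissociation, encoding noise affecting choice consistency but not confidence, can be illustrated with a toy simulation of the first half of that claim: as the precision of value encoding drops, repeated presentations of the same option pair yield less consistent choices. This is a minimal sketch under an assumed Gaussian encoding-noise model, not the authors' computational framework.

```python
import numpy as np

rng = np.random.default_rng(0)

def choice_consistency(encoding_sd, n_pairs=200, n_sims=2000):
    """Probability of repeating the same choice when an identical pair of
    option values is encoded twice with Gaussian noise of sd `encoding_sd`."""
    values = rng.uniform(0.0, 1.0, size=(n_pairs, 2))   # true subjective values
    repeats = 0.0
    for _ in range(n_sims):
        noisy = values[None] + rng.normal(0.0, encoding_sd, size=(2, n_pairs, 2))
        choices = noisy.argmax(axis=-1)                  # pick the higher encoded value
        repeats += np.mean(choices[0] == choices[1])
    return repeats / n_sims

for sd in (0.05, 0.2, 0.5):
    print(f"encoding sd = {sd:4.2f} -> choice consistency ~ {choice_consistency(sd):.2f}")
```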


2021 · pp. 1-19
Author(s): Jairo Perez-Osorio, Abdulaziz Abubshait, Agnieszka Wykowska

Understanding others' nonverbal behavior is essential for social interaction, as it allows us, among other things, to infer mental states. Although gaze communication, a well-established nonverbal social behavior, has shown its importance in inferring others' mental states, not much is known about the effects of irrelevant gaze signals on cognitive conflict markers in collaborative settings. Here, participants completed a categorization task in which they categorized objects based on their color while observing images of a robot. On each trial, participants observed the robot iCub grasping an object from a table and offering it to them to simulate a handover. Once the robot "moved" the object forward, participants were asked to categorize the object according to its color. Before participants were allowed to respond, the robot made a lateral head/gaze shift. The gaze shifts were either congruent or incongruent with the object's color. We expected that incongruent head cues would induce more errors (Study 1), would be associated with more curvature in eye-tracking trajectories (Study 2), and would induce larger amplitudes in electrophysiological markers of cognitive conflict (Study 3). Results of the three studies show more oculomotor interference as measured by error rates (Study 1), larger curvature of eye-tracking trajectories (Study 2), and higher amplitudes of the N2 ERP component of the EEG signal as well as higher event-related spectral perturbation amplitudes (Study 3) for incongruent compared with congruent trials. Our findings reveal that behavioral, ocular, and electrophysiological markers can index the influence of irrelevant signals during goal-oriented tasks.
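The trajectory-curvature measure used in Study 2 is commonly quantified as the maximum perpendicular deviation of the gaze path from the straight line connecting its start and end points. The sketch below shows that generic computation on 2D gaze samples; it is an assumed, standard formulation, not the authors' analysis code.

```python
import numpy as np

def max_trajectory_deviation(xy):
    """Maximum perpendicular deviation of a 2D trajectory from the straight
    line between its first and last samples (a common curvature index)."""
    xy = np.asarray(xy, dtype=float)
    start, end = xy[0], xy[-1]
    dx, dy = end - start
    length = np.hypot(dx, dy)
    if length == 0.0:
        return 0.0
    # Signed perpendicular distance of every sample from the start-end line
    deviations = (dx * (xy[:, 1] - start[1]) - dy * (xy[:, 0] - start[0])) / length
    return float(np.max(np.abs(deviations)))

# Illustrative use: a saccade trajectory that bends toward a distractor
trajectory = [(0, 0), (1, 0.4), (2, 0.9), (3, 0.7), (4, 0.2), (5, 0)]
print(f"max deviation: {max_trajectory_deviation(trajectory):.2f} (same units as samples)")
```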


2021 · Vol 13 (3) · pp. 389-402
Author(s): Vladimir I. Grachev, Vladimir V. Kolesov, Galina Ya. Menshikova, Viktor I. Ryabenkov, ...

The individual characteristics of the human visual apparatus are associated with the anatomical and psychophysiological parameters of the body. Using eye-tracking technology, the physiological aspects of visual information perception by the oculomotor apparatus that are not associated with active cognitive activity were investigated. Individual differences in fixation size when reading text and examining halftone graphic objects were examined across subjects, along with the durations of fixations, which are associated with the process of accommodation, and the internal structure of fixations. It is shown that the trajectory of the gaze shift within a fixation has a heterogeneous internal structure: the total trajectory of eye movement in the fixation area is composed of a set of successive clusters. This fixation structure is apparently associated with the restoration of the photosensitivity of rhodopsin in the photoreceptors of the retina. Across subjects and image types, these studies of fixations showed that the oculomotor system, given the physiological characteristics of the visual apparatus, is controlled in the same way by the "video processor" of the brain when the eye accommodates to image elements. The only objective individual feature of human vision that uniquely characterizes the perception of graphic information is the average displacement within a fixation. This value serves as the subject's "visiting card" and remains practically unchanged when reading, when examining halftone images, and in test validation with forced gaze fixation.
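The "average displacement within a fixation" that the abstract singles out as an individual signature, and the cluster structure of the fixation trajectory, can both be computed directly from gaze samples. The sketch below is a minimal, generic implementation under assumed inputs (gaze samples in degrees of visual angle; the cluster gap threshold is illustrative), not the authors' processing pipeline.

```python
import numpy as np

def mean_fixation_displacement(samples):
    """Mean sample-to-sample displacement of gaze within a single fixation."""
    samples = np.asarray(samples, dtype=float)
    steps = np.diff(samples, axis=0)
    return float(np.mean(np.linalg.norm(steps, axis=1)))

def split_into_clusters(samples, gap_threshold=0.1):
    """Split a within-fixation trajectory into successive clusters wherever
    the gaze jumps by more than `gap_threshold` (same units as the samples)."""
    samples = np.asarray(samples, dtype=float)
    jumps = np.linalg.norm(np.diff(samples, axis=0), axis=1) > gap_threshold
    boundaries = np.flatnonzero(jumps) + 1
    return np.split(samples, boundaries)

# Illustrative use with synthetic drifting gaze samples around a fixation point
rng = np.random.default_rng(1)
fixation = np.cumsum(rng.normal(0.0, 0.02, size=(120, 2)), axis=0)
print(f"mean displacement: {mean_fixation_displacement(fixation):.3f} deg")
print(f"clusters found: {len(split_into_clusters(fixation))}")
```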


2021
Author(s): Ying Wang, Marc M. van Wanrooij, Rowena Emaus, Jorik Nonneke, Michael X Cohen, ...

Background: Individuals with Parkinson disease can experience freezing of gait: a sudden, brief episode of an inability to move the feet despite the intention to walk. Since turning is the most sensitive condition for provoking freezing-of-gait episodes, and the eyes typically lead turning, we hypothesize that disturbances in saccadic eye movements are related to freezing-of-gait episodes. Objectives: This study explores the relationship between freezing-of-gait episodes and saccadic eye movements for gaze shift and gaze stabilization during turning. Methods: We analyzed 277 freezing-of-gait episodes provoked among 17 individuals with Parkinson disease during two conditions: 180-degree turns in alternating directions at self-selected speed and at rapid speed. Eye movements derived from electrooculography signals were characterized by the average gaze position, the amplitude of gaze shifts, and the speed of gaze stabilization. We analyzed these variables before and during freezing-of-gait episodes occurring at different phase angles of a turn. Results: Significant off-track changes of gaze position were observed almost one 180-degree-turn duration before freezing-of-gait episodes. In addition, the speed of gaze stabilization decreased significantly during freezing-of-gait episodes. Conclusions: We argue that off-track changes of gaze position could be a predictor of freezing-of-gait episodes, reflecting a continued failure in movement-error correction or insufficient preparation for eye-to-foot coordination during turning. The large decline in the speed of gaze stabilization during freezing-of-gait episodes is expected, given that body turning slows or stops; we argue that this could be evidence for a healthy compensatory system in individuals with freezing of gait.
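The gaze-shift amplitude and gaze-position variables described in the Methods can be derived from a calibrated horizontal electrooculography (EOG) trace. The sketch below uses a simple velocity-threshold saccade detector; the sampling rate, calibration to degrees, noise level, and threshold are illustrative assumptions rather than the authors' pipeline.

```python
import numpy as np

def gaze_shift_amplitudes(eog_deg, fs, vel_threshold=30.0):
    """Detect saccadic gaze shifts in a calibrated horizontal EOG trace (degrees)
    using a velocity threshold, and return the amplitude of each shift."""
    velocity = np.gradient(eog_deg) * fs                    # deg/s
    moving = np.abs(velocity) > vel_threshold
    edges = np.diff(moving.astype(int))
    onsets = np.flatnonzero(edges == 1) + 1                 # above-threshold starts
    offsets = np.flatnonzero(edges == -1) + 1               # above-threshold ends
    return np.array([eog_deg[end] - eog_deg[start]
                     for start, end in zip(onsets, offsets)])

# Illustrative use: synthetic EOG with two ~20-degree gaze shifts during a turn
fs = 250
t = np.arange(0, 4, 1 / fs)
eog = 20.0 * (t > 1.0) + 20.0 * (t > 2.5)
eog += np.random.default_rng(2).normal(0.0, 0.02, t.size)  # low-amplitude noise
print(np.round(gaze_shift_amplitudes(eog, fs), 1))          # ~[20.0, 20.0]
```

The average gaze position before an episode would follow from the same calibrated trace, e.g. as a windowed mean of `eog_deg` relative to the trunk-turn angle.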


Author(s): Samantha E. A. Gregory

This study aimed to investigate the facilitatory versus inhibitory effects of dynamic non-predictive central cues presented in a realistic environment. Realistic human avatars initiated eye contact and then dynamically looked to the left, right, or centre of a table. A moving stick served as a non-social control cue, and participants localised (Experiment 1) or discriminated (Experiment 2) a contextually relevant target (teapot/teacup). The cues' movement took 500 ms, and stimulus onset asynchronies (SOAs: 150 ms, 300 ms, 500 ms, 1000 ms) were measured from movement initiation. Similar cuing effects were seen for the social avatar and the non-social stick cue across tasks. Results showed facilitatory processes without inhibition, though there was some variation by SOA and task. This is the first time facilitatory versus inhibitory processes have been directly investigated where eye contact is initiated prior to the gaze shift. These dynamic stimuli allow a better understanding of how attention might be cued in more realistic environments.


Author(s): Palak Gupta, Sinem Beylergil, Jordan Murray, Camilla Kilbane, Fatema F Ghasia, ...

2021 · Vol 11 (1)
Author(s): Takao Fukui, Mrinmoy Chakrabarty, Misako Sano, Ari Tanaka, Mayuko Suzuki, ...

Eye movements toward sequentially presented face images with or without gaze cues were recorded to investigate whether individuals with autism spectrum disorder (ASD), in comparison to their typically developing (TD) peers, could prospectively perform the task according to gaze cues. Line-drawn face images were presented sequentially for one second each on a laptop display, shifting from side to side and up and down. In the gaze-cue condition, the gaze of the face image was directed to the position where the next face would be presented. Although the participants with ASD looked less at the eye area of the face image than their TD peers, they could perform comparably smooth gaze shifts to the gaze cue of the face image in the gaze-cue condition. This appropriate gaze shifting in the ASD group was more evident in the second half of trials than in the first half, as revealed by the mean proportion of fixation time in the eye area relative to valid gaze data in the early phase (during face image presentation) and by the time to first fixation on the eye area. These results suggest that individuals with ASD may benefit from the short-period trial experiment by enhancing their use of gaze cues.


Author(s): Ding Ding, Mark A Neerincx, Willem-Paul Brinkman

Virtual cognitions (VCs) are a stream of simulated thoughts people hear while immersed in a virtual environment, e.g. by hearing a simulated inner voice presented as a voice-over. As previous studies have shown, they can enhance people's self-efficacy and knowledge about, for example, social interactions. Ownership and plausibility of these VCs are regarded as important for their effect, and enhancing both might, therefore, be beneficial. A potential strategy for achieving this is synchronizing the VCs with people's eye fixations using eye-tracking technology embedded in a head-mounted display. Hence, this paper tests this idea in the context of a pre-therapy for spider and snake phobia to examine the ability to guide people's eye fixations. An experiment with 24 participants was conducted using a within-subjects design. Each participant was exposed to two conditions: one where the VCs were adapted to the participant's eye gaze, and a control condition where they were not adapted. The findings of a Bayesian analysis suggest that credibly more ownership was reported and more eye-gaze shift behaviour was observed in the eye-gaze-adapted condition than in the control condition. Compared to the alternative of no or negative mediation, the findings also lend some credibility to the hypothesis that ownership, at least partly, positively mediates the effect eye-gaze-adapted VCs have on eye-gaze shift behaviour. Only weak support was found for plausibility as a mediator. These findings help improve insight into how VCs affect people.

