Gaze estimation via bilinear pooling-based attention networks

Author(s):  
Dakai Ren ◽  
Jiazhong Chen ◽  
Jian Zhong ◽  
Zhaoming Lu ◽  
Tao Jia ◽  
...  
2018 ◽  
Vol 14 (2) ◽  
pp. 153-173 ◽  
Author(s):  
Jumana Waleed ◽  
Taha Mohammed Hasan ◽  
Qutaiba Kadhim Abed

2021 ◽  
Author(s):  
Dezhi Han ◽  
Shuli Zhou ◽  
Kuan Ching Li ◽  
Rodrigo Fernandes de Mello

Author(s):  
Pedro H. C. Avelar ◽  
Anderson R. Tavares ◽  
Thiago L. T. da Silveira ◽  
Cláudio R. Jung ◽  
Luis C. Lamb

Sensors ◽  
2020 ◽  
Vol 21 (1) ◽  
pp. 26
Author(s):  
David González-Ortega ◽  
Francisco Javier Díaz-Pernas ◽  
Mario Martínez-Zarzuela ◽  
Míriam Antón-Rodríguez

A driver’s gaze information can be crucial in driving research because of its relation to driver attention. In particular, including gaze data in driving simulators broadens the scope of research studies, as drivers’ gaze patterns can be related to their characteristics and performance. In this paper, we present two gaze region estimation modules integrated into a driving simulator: one uses the 3D Kinect device and the other the virtual reality Oculus Rift device. For every processed frame of the route, the modules detect which of the seven regions into which the driving scene was divided the driver is gazing at. Four gaze estimation methods, which learn the relation between gaze displacement and head movement, were implemented and compared: two simpler ones based on points that try to capture this relation, and two based on classifiers, namely an MLP and an SVM. Experiments were carried out with 12 users who drove the same scenario twice, each time with a different visualization display: first a big screen and later the Oculus Rift. Overall, the Oculus Rift outperformed the Kinect as hardware for gaze estimation, and the best Oculus-based gaze region estimation method achieved an accuracy of 97.94%. The information provided by the Oculus Rift module enriches the driving simulator data and makes a multimodal driving performance analysis possible, in addition to the immersion and realism of the virtual reality experience the Oculus provides.
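The classifier-based methods in the abstract learn a mapping from head movement to one of seven gaze regions. A minimal sketch of that idea, using scikit-learn's SVM and MLP classifiers on synthetic head-pose features, is shown below; the two-feature yaw/pitch layout, the region centers, and all data are illustrative assumptions, not details from the paper.

```python
# Hedged sketch: classifying the gaze region (1 of 7) from head-pose
# features, loosely following the paper's MLP/SVM comparison.
# The yaw/pitch feature layout and synthetic data are assumptions.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic stand-in: head yaw/pitch (degrees) per frame, with each of
# seven hypothetical screen regions centered on a distinct offset.
region_centers = np.array([[-30, 0], [-15, 0], [0, 0], [15, 0], [30, 0],
                           [0, 15], [0, -15]], dtype=float)
y = rng.integers(0, 7, size=700)                      # region label per frame
X = region_centers[y] + rng.normal(scale=2.0, size=(700, 2))  # noisy poses

# The paper compares an SVM and an MLP; defaults here are arbitrary.
svm = SVC(kernel="rbf").fit(X, y)
mlp = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                    random_state=0).fit(X, y)

print(f"SVM train accuracy: {svm.score(X, y):.2f}")
print(f"MLP train accuracy: {mlp.score(X, y):.2f}")
```

In a real setup, the features would come from Kinect or Oculus head tracking per frame, and accuracy would be evaluated on held-out drives rather than the training data.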


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Huoyin Zhang ◽  
Shiyunmeng Zhang ◽  
Jiachen Lu ◽  
Yi Lei ◽  
Hong Li

Previous studies in humans have shown that the brain regions activated by social exclusion overlap with those related to attention. However, in the context of social exclusion, how does behavioral monitoring affect individual behavior? In this study, we used the Cyberball game to induce a social exclusion effect in a group of participants. To explore the influence of social exclusion on the attention network, we administered the Attention Network Test (ANT) and compared the three subsystems of the attention network (orienting, alerting, and executive control) between exclusion (N = 60) and inclusion (N = 60) groups. Compared with the inclusion group, the exclusion group showed a shorter overall response time and better executive control performance, but no significant differences in orienting or alerting. The excluded individuals thus showed a stronger ability to detect and control conflicts; it appears that social exclusion does not always exert a negative influence on individuals. In future research, attention network scores could be used as indicators of social exclusion, which may further reveal how social exclusion affects individuals' psychosomatic mechanisms.
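The three ANT subsystem scores compared between groups are conventionally derived as reaction-time differences between cue and flanker conditions. A small sketch of that standard computation follows; the condition names follow the usual ANT design, and the example means are illustrative values, not data from this study.

```python
# Hedged sketch of the standard ANT subsystem scores, computed as
# differences of mean reaction times (ms). The example numbers are
# made up for illustration, not taken from the study.
def ant_scores(rt):
    """rt: dict of mean RTs (ms) keyed by cue/flanker condition."""
    return {
        "alerting": rt["no_cue"] - rt["double_cue"],        # benefit of a warning cue
        "orienting": rt["center_cue"] - rt["spatial_cue"],  # benefit of spatial information
        "executive": rt["incongruent"] - rt["congruent"],   # flanker conflict cost
    }

# Illustrative condition means (ms):
example = {"no_cue": 560, "double_cue": 520, "center_cue": 540,
           "spatial_cue": 495, "incongruent": 610, "congruent": 530}
print(ant_scores(example))
```

A smaller executive score indicates better conflict control, which is the pattern the abstract reports for the exclusion group.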

