Influence of soundtrack on eye movements during video exploration

2012 ◽ Vol 5 (4)
Author(s): Antoine Coutrot, Nathalie Guyader, Gelu Ionescu, Alice Caplier

Models of visual attention rely on visual features such as orientation, intensity or motion to predict which regions of complex scenes attract the gaze of observers. So far, sound has never been considered as a possible feature that might influence eye movements. Here, we evaluate the impact of non-spatial sound on the eye movements of observers watching videos. We recorded the eye movements of 40 participants watching assorted videos with and without their related soundtracks. We found that sound affects eye position, fixation duration and saccade amplitude. The effect of sound is not constant across time but becomes significant around one second after the beginning of video shots.
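
As a concrete illustration of the kind of analysis behind these measures, the sketch below detects saccades with a simple velocity threshold and derives fixation durations from one gaze trace; the sampling rate, threshold value and the synthetic trace are assumptions for illustration, not details taken from the study.

```python
# Hypothetical sketch: velocity-threshold saccade detection on a single gaze
# trace, yielding fixation durations that could be compared between the
# with-sound and without-sound conditions. All parameters are assumed.
import numpy as np

def detect_saccades(x, y, fs=250.0, vel_thresh=30.0):
    """Label each sample as saccade (True) or fixation (False).
    x, y: gaze position in degrees; fs: sampling rate in Hz;
    vel_thresh: speed threshold in deg/s (a common heuristic value)."""
    vx = np.gradient(x) * fs          # horizontal velocity, deg/s
    vy = np.gradient(y) * fs          # vertical velocity, deg/s
    return np.hypot(vx, vy) > vel_thresh

def fixation_durations(is_saccade, fs=250.0):
    """Durations (s) of contiguous fixation runs between saccades."""
    durations, run = [], 0
    for s in is_saccade:
        if not s:
            run += 1
        elif run:
            durations.append(run / fs)
            run = 0
    if run:
        durations.append(run / fs)
    return durations

# Synthetic 10 s trace standing in for one participant and one condition.
rng = np.random.default_rng(0)
x = rng.normal(0.0, 0.01, 2500)
y = rng.normal(0.0, 0.01, 2500)
print(np.mean(fixation_durations(detect_saccades(x, y))))
```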

Sensors ◽ 2021 ◽ Vol 21 (15) ◽ pp. 5178
Author(s): Sangbong Yoo, Seongmin Jeong, Seokyeon Kim, Yun Jang

Gaze movements and visual stimuli have been used to analyze human visual attention intuitively. Gaze behavior studies mainly show statistical analyses of eye movements and human visual attention. During these analyses, eye movement data and the saliency map are presented to analysts as separate views or merged views. However, analysts become frustrated when they need to memorize all of the separate views, or when the eye movements obscure the saliency map in the merged views. It is therefore not easy to analyze how visual stimuli affect gaze movements, since existing techniques focus excessively on the eye movement data. In this paper, we propose a novel visualization technique for analyzing gaze behavior that uses saliency features as visual clues to express an observer's visual attention. The visual clues that represent visual attention are analyzed to reveal which saliency features are prominent for the visual stimulus analysis. We visualize the gaze data together with the saliency features to interpret visual attention, and we analyze gaze behavior with the proposed visualization to evaluate whether embedding saliency features within the visualization helps us understand an observer's visual attention.
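
A minimal sketch of the underlying idea, with a synthetic saliency map and fabricated gaze samples standing in for real data: drawing the gaze trace directly over the saliency map lets attention and stimulus features be read together rather than across separate views.

```python
# Minimal sketch (synthetic data): plot gaze samples over a saliency map so
# that visual attention and saliency features can be inspected in one view.
import numpy as np
import matplotlib.pyplot as plt

h, w = 480, 640
yy, xx = np.mgrid[0:h, 0:w]
# Fake saliency map: a single Gaussian hotspot.
saliency = np.exp(-((xx - 400) ** 2 + (yy - 200) ** 2) / (2 * 60.0 ** 2))

# Fabricated gaze samples clustered near the hotspot.
rng = np.random.default_rng(1)
gaze_x = rng.normal(400, 40, 200)
gaze_y = rng.normal(200, 40, 200)

fig, ax = plt.subplots(figsize=(6, 4.5))
ax.imshow(saliency, cmap="viridis", origin="upper")
ax.plot(gaze_x, gaze_y, ".", color="white", alpha=0.6, markersize=3,
        label="gaze samples")
ax.legend(loc="lower right")
ax.set_title("Gaze samples over a saliency map (synthetic)")
plt.show()
```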


2011 ◽ Vol 106 (5) ◽ pp. 2536-2545
Author(s): Katharina Havermann, Eckart Zimmermann, Markus Lappe

Saccades are used by the visual system to explore visual space with the high accuracy of the fovea. The visual error after the saccade is used to adapt the control of subsequent eye movements of the same amplitude and direction in order to keep saccades accurate. Saccadic adaptation is thus specific to saccade amplitude and direction. In the present study we show that saccadic adaptation is also specific to the initial position of the eye in the orbit. This is useful, because saccades are normally accompanied by head movements and the control of combined head and eye movements depends on eye position. Many parts of the saccadic system contain eye position information. Using the intrasaccadic target step paradigm, we adaptively reduced the amplitude of reactive saccades to a suddenly appearing target at a selected position of the eyes in the orbit and tested the resulting amplitude changes for the same saccade vector at other starting positions. For central adaptation positions the saccade amplitude reduction transferred completely to eccentric starting positions. However, for adaptation at eccentric starting positions, there was a reduced transfer to saccades from central starting positions or from eccentric starting positions in the opposite hemifield. Thus eye position information modifies the transfer of saccadic amplitude changes in the adaptation of reactive saccades. A gain field mechanism may explain the observed eye position dependence.
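
The closing suggestion can be made concrete with a minimal gain-field weighting (our illustration, not the authors' fitted model): the amplitude change learned with the eyes at orbital position $e_a$ transfers to a test position $e_t$ in proportion to a Gaussian overlap of position-tuned gains,

$$\Delta A(e_t) = \Delta A(e_a)\,\exp\!\left(-\frac{(e_t - e_a)^2}{2\sigma^2}\right),$$

where $\sigma$ sets how broadly the adapted change generalizes across eye positions; letting $\sigma$ itself depend on the adaptation position (broad for central positions, narrower for eccentric ones) would reproduce the asymmetric transfer reported here.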


2017 ◽ Vol 102 (2) ◽ pp. 253-259
Author(s): Fatema F Ghasia, Jorge Otero-Millan, Aasef G Shaikh

Introduction: Fixational saccades are miniature eye movements that constantly change the gaze during attempted visual fixation. Visually guided saccades and fixational saccades represent an oculomotor continuum and are produced by common neural machinery. Patients with strabismus have disconjugate binocular horizontal saccades. We examined the stability and variability of eye position during fixation in patients with strabismus and correlated the severity of fixational instability with strabismus angle and binocular vision.
Methods: Eye movements were measured in 13 patients with strabismus and 16 controls during fixation and visually guided saccades under monocular viewing conditions. Fixational saccades and intersaccadic drifts were analysed in the viewing and non-viewing eye of patients with strabismus and controls.
Results: We found an increase in fixational instability in patients with strabismus compared with controls. We also found an increase in the disconjugacy of fixational saccades and intrasaccadic ocular drift in patients with strabismus compared with controls. The disconjugacy was worse in patients with large-angle strabismus and absent stereopsis. There was an increase in eye position variance during drifts in patients with strabismus. Our findings suggest that both fixational saccades and intersaccadic drifts are abnormal and likely contribute to the fixational instability in patients with strabismus.
Discussion: Fixational instability could be a useful tool for mass screenings of children to diagnose strabismus in the absence of amblyopia and latent nystagmus. The increased disconjugacy of fixational eye movements and visually guided saccades in patients with strabismus reflects the disruption of the fine-tuning of the motor and visual systems responsible for achieving binocular fusion in these patients.
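
The two metrics at the core of this analysis can be sketched in a few lines; the code below illustrates the definitions (per-saccade amplitude difference between the two eyes, and positional variance during intersaccadic drift), not the authors' processing pipeline.

```python
# Illustrative definitions only, not the published analysis pipeline.
import numpy as np

def saccade_amplitude(x, onset, offset):
    """Amplitude (deg) of one saccade in a single eye's horizontal trace."""
    return abs(x[offset] - x[onset])

def disconjugacy(x_right, x_left, onsets, offsets):
    """Per-saccade amplitude difference between the right and left eye (deg)."""
    return [abs(saccade_amplitude(x_right, a, b) - saccade_amplitude(x_left, a, b))
            for a, b in zip(onsets, offsets)]

def drift_position_variance(x, y, is_saccade):
    """Variance of eye position (deg^2) over intersaccadic drift samples."""
    drift = ~np.asarray(is_saccade)
    return np.var(np.asarray(x)[drift]) + np.var(np.asarray(y)[drift])
```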


2022
Author(s): Lisa M Kroell, Martin Rolfs

Despite the fovea's singular importance for active human vision, the impact of large eye movements on foveal processing remains elusive. Building on findings from passive fixation tasks, we hypothesized that during the preparation of rapid eye movements (saccades), foveal processing anticipates soon-to-be fixated visual features. Using a dynamic large-field noise paradigm, we indeed demonstrate that sensitivity for defining features of a saccade target is enhanced in the pre-saccadic center of gaze. Enhancement manifested in higher hit rates for foveal probes with target-congruent orientation, and in a sensitization to incidental, target-like orientation information in foveally presented noise. Enhancement was spatially confined to the center of gaze and its immediate vicinity. We suggest a crucial, previously overlooked contribution of foveal processing to trans-saccadic visual continuity: foveal processing of saccade targets commences before the movement is executed and thereby enables a seamless transition once the center of gaze reaches the target.
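
Since the enhancement is reported in terms of hit rates, a standard signal-detection summary makes the logic concrete; the sketch below (our illustration, with made-up trial counts) converts hit and false-alarm rates into sensitivity d'.

```python
# Minimal signal-detection sketch with fabricated counts; not the study's analysis.
from scipy.stats import norm

def d_prime(hits, misses, false_alarms, correct_rejections):
    """Sensitivity d' with a log-linear correction against infinite z-scores."""
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    return norm.ppf(hit_rate) - norm.ppf(fa_rate)

# e.g. probes with target-congruent vs. incongruent orientation (fabricated counts)
print(d_prime(42, 18, 12, 48), d_prime(30, 30, 12, 48))
```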


2009 ◽ Vol 3 (1)
Author(s): Oleg V. Komogortsev, Young Sam Ryu, Do H. Koh

This paper presents a new saccade amplitude prediction model. The model is based on a Kalman filter and regression analysis. The aim of the model is to predict a saccade's amplitude extremely quickly, i.e., within two eye position samples at the onset of a saccade. Specifically, the paper explores saccade amplitude prediction considering one or two samples at the onset of a saccade. The models' prediction performance was tested with 35 subjects. The amplitude accuracy results yielded approximately 5.26° prediction error, while the error for direction prediction was 5.3% for the one-sample model and 1.5% for the two-sample model. The practical use of the proposed model lies in the area of real-time gaze-contingent compression and extreme eye-gaze aware interaction applications. The paper provides a theoretical evaluation of the benefits of saccade amplitude prediction for gaze-contingent multimedia compression, estimating a 21% improvement in compression for short network delays.
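
A hedged sketch of the general approach (not the published implementation): a constant-velocity Kalman filter smooths the first eye-position samples after saccade onset, and a regression then maps the filtered onset velocity to the expected amplitude. The noise settings, regression coefficients and sample values below are placeholders.

```python
# Sketch of the general idea only; all parameters and coefficients are placeholders.
import numpy as np

def kalman_step(state, P, z, dt, q=1e2, r=0.5):
    """One predict/update step for state = [position, velocity] (deg, deg/s)."""
    F = np.array([[1.0, dt], [0.0, 1.0]])        # constant-velocity dynamics
    H = np.array([[1.0, 0.0]])                   # only position is observed
    Q = q * np.array([[dt**3 / 3, dt**2 / 2],
                      [dt**2 / 2, dt]])          # process noise
    R = np.array([[r]])                          # measurement noise (deg^2)

    state = F @ state                            # predict
    P = F @ P @ F.T + Q
    innovation = z - H @ state                   # update
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    state = state + (K @ innovation).ravel()
    P = (np.eye(2) - K @ H) @ P
    return state, P

def predict_amplitude(onset_velocity, slope=0.02, intercept=0.5):
    """Toy linear regression from filtered onset velocity (deg/s) to amplitude (deg).
    In practice the coefficients would be fit to training saccades."""
    return slope * abs(onset_velocity) + intercept

# Feed the first one or two samples after saccade onset through the filter:
state, P = np.array([0.0, 0.0]), np.eye(2)
for z in [0.3, 0.9]:                             # fabricated position samples (deg)
    state, P = kalman_step(state, P, z, dt=1 / 1000.0)
print(predict_amplitude(state[1]))
```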


Nutrients ◽ 2021 ◽ Vol 13 (9) ◽ pp. 2915
Author(s): Saar Bossuyt, Kathleen Custers, José Tummers, Laura Verbeyst, Bert Oben

Research on front-of-pack labels (FOPLs) has demonstrated that the Nutri-Score is one of the most promising FOPLs regarding healthfulness estimation accuracy. Nevertheless, as consumers are exposed to both the Nutri-Score and the mandatory Nutrition Facts Panel (NFP) in the supermarket, it is key to understand if and how both labels interact. This study investigates the contributions of the Nutri-Score and the NFP to healthfulness estimation accuracy, whether this impact differs depending on the product, and what role visual attention plays. We set up an eye-tracking experiment in a controlled setting in which 398 participants rated the healthfulness of 20 products. The results confirmed the positive impact of the Nutri-Score on healthfulness estimation accuracy, though the impact was larger for equivocal (i.e., difficult to judge) products. Interestingly, the NFP either had no effect (compared to a package without Nutri-Score or NFP) or a negative effect (compared to a package with the Nutri-Score alone) on healthfulness estimation accuracy. Eye-tracking data corroborated that 'cognitive overload' issues could explain why consumers exposed to the Nutri-Score alone outperformed those exposed to both the Nutri-Score and the NFP. This study offers food for thought for policymakers and the industry seeking to maximize the potential of the Nutri-Score.


Author(s): Heiner Deubel

Planning and execution of goal-directed actions are closely related to visual attention. This chapter gives an overview of research on this relationship, focusing on the role of attention in the preparation of eye movements, manual reaching, and grasping. The studies suggest that the major functions of attention during motor planning are to select the spatial goals of the movement and to prioritize those visual features that are important for the action. For complex movements involving more than a single spatial location, action preparation seems to be accompanied by a temporally changing 'attentional landscape' that includes multiple foci of attention.


2021 ◽ Vol ahead-of-print (ahead-of-print)
Author(s): Lijie Zhou, Fei Xue

Purpose: This paper aims to examine the effects of visual themes and view perspectives on users' visual attention to brand posts on Instagram. The impact of visual attention on brand attitude and recognition is also explored.
Design/methodology/approach: The authors conducted a 4 (visual theme: customer-centric, employee-centric, product-centric and symbolic visuals) × 2 (view perspective: first-person view vs third-person view) between-subjects factorial eye-tracking experiment to explore the effects of both factors on viewers' visual attention (fixation frequency and fixation duration), attitude toward the brand and brand recognition.
Findings: Results showed that, under a first-person view, participants spent the longest time viewing customer-centric images and paid the most attention to product-centric and customer-centric images. For images in the third-person view, product-centric images received the longest fixation duration and highest fixation frequency. Customer-centric and product-centric images generated significantly longer fixation durations and higher fixation frequencies than symbolic images, regardless of view perspective. Brand recognition was positively influenced by fixation frequency but not by fixation duration.
Originality/value: This study is an extended application of Aaker's (1996) brand identity planning model to visual branding on Instagram. As the findings indicate, the effective use of visual strategies can lead to more positive responses toward the brand. By understanding how visual elements stimulate the processing of visual branding, marketing professionals will be able to improve their information design skills on visual-based social media platforms (such as Instagram).


Aerospace ◽ 2021 ◽ Vol 8 (9) ◽ pp. 260
Author(s): Yanjun Wang, Rongjin Hu, Siyuan Lin, Michael Schultz, Daniel Delahaye

Air traffic controllers have to make quick decisions to keep air traffic safe. Their behaviors have a significant impact on the operation of the air traffic management (ATM) system. Automation tools have enhanced the ATM system's capability by reducing controllers' task load. Much attention has been devoted to developing advanced automation in the last decade. However, less is known about the impact of automation on the behaviors of air traffic controllers. Here, we empirically tested the effects of three levels of automation (manual, attention-guided, and automated) and of varying traffic levels on eye movements, situation awareness and mental workload. The results showed significant differences in gaze and saccade behaviors between the attention-guided group and the automated group. Traffic level affected eye movements under the manual and attention-guided modes, but had no effect on eye movements under the automated mode. The results also supported the use of automation for enhancing situation awareness while reducing mental workload. Our work has potential implications for the design of automation and operation procedures.


Author(s): Loïc Caroux, Ludovic Le Bigot, Nicolas Vibert

In many visual displays such as virtual environments, human tasks involve objects superimposed on backgrounds that are both complex and moving. However, most studies have investigated the influence of background complexity or background motion in isolation. Two experiments were designed to investigate the joint influences of background complexity and lateral motion on a simple shooting task typical of video games. Participants performed the task on moving and static versions of backgrounds at three levels of complexity while their eye movements were recorded. The backgrounds displayed either an abstract (Experiment 1) or a naturalistic (Experiment 2) virtual environment. The results showed that performance was impaired by background motion in both experiments. The effects of motion and complexity were additive for the abstract background and multiplicative for the naturalistic background. Eye movement recordings showed that the performance impairments reflected, at least in part, the impact of the background visual features on gaze control.
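
The additive-versus-multiplicative contrast corresponds to the absence or presence of a complexity × motion interaction; the sketch below shows how that contrast could be tested with a linear model (the column names and data frame are assumptions, not the authors' analysis).

```python
# Illustrative model comparison; column names are assumed placeholders.
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

def test_interaction(df: pd.DataFrame):
    """df columns assumed: score (task performance), complexity (low/medium/high),
    motion (static/moving); one row per participant and condition."""
    additive = smf.ols("score ~ C(complexity) + C(motion)", data=df).fit()
    full = smf.ols("score ~ C(complexity) * C(motion)", data=df).fit()
    # F-test on the added interaction term: an "additive" pattern leaves it
    # non-significant, a "multiplicative" pattern makes it significant.
    return anova_lm(additive, full)
```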

