Lateral presentation of faces alters overall viewing strategy

PeerJ ◽  
2016 ◽  
Vol 4 ◽  
pp. e2241 ◽  
Author(s):  
Christopher J. Luke ◽  
Petra M.J. Pollux

Eye tracking has been used during face categorisation and identification tasks to identify perceptually salient facial features and infer underlying cognitive processes. However, viewing patterns are influenced by a variety of gaze biases, drawing fixations to the centre of a screen and horizontally to the left side of face images (left-gaze bias). In order to investigate potential interactions between gaze biases uniquely associated with facial expression processing, and those associated with screen location, face stimuli were presented in three possible screen positions to the left, right and centre. Comparisons of fixations between screen locations highlight a significant impact of the screen centre bias, pulling fixations towards the centre of the screen and modifying gaze biases generally observed during facial categorisation tasks. A left horizontal bias for fixations was found to be independent of screen position but interacting with screen centre bias, drawing fixations to the left hemi-face rather than just to the left of the screen. Implications for eye tracking studies utilising centrally presented faces are discussed.
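
As a rough illustration of how such biases can be quantified, the sketch below (not the authors' analysis code) summarises, per screen position, the proportion of fixations landing on the left hemi-face and the mean horizontal shift of fixations towards the screen centre. The input DataFrame and its column names ('x', 'screen_pos', 'face_centre_x') are assumptions made for the example.

```python
# Minimal sketch (not the authors' analysis code): summarising the left-gaze
# bias and the screen-centre pull from raw fixation coordinates.
# Assumed input: a pandas DataFrame with one row per fixation and hypothetical
# columns 'x' (horizontal position, px), 'screen_pos' ('left'|'centre'|'right')
# and 'face_centre_x' (the horizontal midline of the face on that trial).
import numpy as np
import pandas as pd

def gaze_bias_summary(fixations: pd.DataFrame, screen_centre_x: float) -> pd.DataFrame:
    """Per screen position: proportion of fixations on the left hemi-face and
    mean horizontal shift of fixations towards the centre of the screen."""
    df = fixations.copy()
    # Left-gaze bias: the fixation lands left of the face's vertical midline.
    df["on_left_hemiface"] = df["x"] < df["face_centre_x"]
    # Screen-centre pull: signed offset from the face midline, oriented so that
    # positive values mean "shifted towards the screen centre" (direction is
    # zero, by convention, for centrally presented faces).
    direction = np.sign(screen_centre_x - df["face_centre_x"])
    df["centre_pull_px"] = (df["x"] - df["face_centre_x"]) * direction
    return df.groupby("screen_pos").agg(
        prop_left_hemiface=("on_left_hemiface", "mean"),
        mean_centre_pull_px=("centre_pull_px", "mean"),
        n_fixations=("x", "size"),
    )
```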


PeerJ ◽  
2018 ◽  
Vol 6 ◽  
pp. e5702
Author(s):  
Frouke Hermens ◽  
Marius Golubickis ◽  
C. Neil Macrae

Past studies examining how people judge faces for trustworthiness and dominance have suggested that they use particular facial features (e.g. mouth features for trustworthiness, eyebrow and cheek features for dominance ratings) to complete the task. Here, we examine whether eye movements during the task reflect the importance of these features. We compared eye movements for trustworthiness and dominance ratings of face images under three stimulus configurations: small images (mimicking large viewing distances), large images (mimicking face-to-face viewing), and a moving-window condition (removing extrafoveal information). Whereas the first area fixated, dwell times, and number of fixations depended on the size of the stimuli and the availability of extrafoveal vision, and varied substantially across participants, no clear task differences were found. These results indicate that gaze patterns for face stimuli are highly individual, do not vary between trustworthiness and dominance ratings, but are influenced by the size of the stimuli and the availability of extrafoveal vision.
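
For concreteness, a minimal sketch of how the reported measures (first area fixated, dwell time, number of fixations) can be derived from a chronologically ordered fixation list is given below. It is illustrative only; the rectangular AOI format and the example coordinates are assumptions, not the paper's actual regions.

```python
# Illustrative sketch (not the authors' code): first area fixated, dwell time
# and fixation count per AOI. AOIs are hypothetical rectangles given as
# (name, x_min, y_min, x_max, y_max); fixations are (x, y, duration_ms),
# assumed to be sorted by onset time.
from collections import defaultdict

def aoi_metrics(fixations, aois):
    """Return first AOI fixated, total dwell time (ms) and fixation count per AOI."""
    def hit(x, y):
        for name, x0, y0, x1, y1 in aois:
            if x0 <= x <= x1 and y0 <= y <= y1:
                return name
        return None

    first_aoi = None
    dwell = defaultdict(float)
    count = defaultdict(int)
    for x, y, dur in fixations:
        name = hit(x, y)
        if name is None:
            continue  # fixation outside all AOIs
        if first_aoi is None:
            first_aoi = name
        dwell[name] += dur
        count[name] += 1
    return first_aoi, dict(dwell), dict(count)

# Example with one 'eyes' and one 'mouth' AOI and three fixations.
aois = [("eyes", 100, 80, 300, 140), ("mouth", 150, 220, 250, 270)]
fix = [(180, 100, 240.0), (200, 250, 180.0), (210, 110, 300.0)]
print(aoi_metrics(fix, aois))
# -> ('eyes', {'eyes': 540.0, 'mouth': 180.0}, {'eyes': 2, 'mouth': 1})
```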


2018 ◽  
Vol 31 (2) ◽  
pp. 107-133 ◽  
Author(s):  
Edward J. Lynch ◽  
Lindsay M. Andiola

Recent advances in technology have increased the accessibility and ease of using eye-tracking as a research tool. These advances have the potential to benefit behavioral accounting researchers' understanding of the cognitive processes underlying individuals' judgments, decisions, and behaviors. However, despite its potential and wide use in other disciplines, few behavioral accounting studies use eye-tracking. The purpose of this paper is to familiarize accounting researchers with eye-tracking, including its advantages and limitations as a research tool. We start by providing an overview of eye-tracking and discussing essential terms and useful metrics, as well as the psychological constructs they proxy. We then summarize eye-tracking research across research domains, review accounting studies that use eye-tracking, and identify future research directions across accounting topics. Finally, we provide an instructional resource to guide those researchers interested in using eye-tracking, including important considerations at each stage of the study. JEL Classifications: M41; C91.


Author(s):  
Aideen McParland ◽  
Stephen Gallagher ◽  
Mickey Keenan

A defining feature of ASD is atypical gaze behaviour; however, eye-tracking studies in ‘real-world’ settings are limited, and the possibility of improving gaze behaviour for ASD children is largely unexplored. This study investigated the gaze behaviour of ASD and typically developing (TD) children in their classroom setting. Eye-tracking technology was used to develop and pilot an operant training tool to positively reinforce typical gaze behaviour towards faces. Visual and statistical analyses of eye-tracking data revealed different gaze behaviour patterns during live interactions for ASD and TD children depending on the interaction type. All children responded to operant training, with longer looking times observed on face stimuli post-training. The promising application of operant gaze training in ecologically valid settings is discussed.
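
A gaze-contingent operant loop of the kind described could, in outline, look like the sketch below. It is purely illustrative (the paper does not publish its training code), and the sample format, AOI rectangle, and criterion value are assumptions.

```python
# Illustrative sketch only: a gaze-contingent operant loop that delivers a
# (hypothetical) reinforcer whenever cumulative looking at a face AOI reaches
# a criterion, then resets the counter. `gaze_samples` is assumed to be a
# stream of (x, y, dt_s) gaze samples with dt_s the sample duration in seconds.

def run_operant_training(gaze_samples, face_aoi, criterion_s=2.0, reinforce=print):
    """Count reinforcer deliveries for gaze accumulated inside
    face_aoi = (x_min, y_min, x_max, y_max)."""
    x0, y0, x1, y1 = face_aoi
    accumulated = 0.0
    deliveries = 0
    for x, y, dt in gaze_samples:
        if x0 <= x <= x1 and y0 <= y <= y1:
            accumulated += dt
            if accumulated >= criterion_s:
                deliveries += 1
                reinforce(f"reinforcer #{deliveries} delivered")
                accumulated = 0.0
        # Off-face gaze does not reset the counter here; a stricter schedule
        # could zero `accumulated` on every off-face sample instead.
    return deliveries
```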


2021 ◽  
Vol 15 ◽  
pp. 183449092110004
Author(s):  
Jing Yu ◽  
Xue-Rui Peng ◽  
Ming Yan

People employ automatic inferential processing when confronting pragmatically implied claims in advertising. However, whether comprehension and memorization of pragmatic implications differ between young and older adults is unclear. In the present study, we used eye-tracking technology to investigate online cognitive processes during the reading of misleading advertisements. We found an interaction between age and advertising content: older participants generated higher misleading rates for health-related than for health-irrelevant products, whereas this content bias did not appear in their younger counterparts. Eye movement data further showed that the older adults spent more time processing critical claims for the health-related products than for the health-irrelevant products. Moreover, the correlations between fixation duration on pragmatic implications and misleading rates showed opposite trends in the two groups. This eye-tracking evidence newly suggests that young and older adults may adopt different information-processing strategies to comprehend pragmatic implications in advertising: more reading possibly enhances young adults’ gist memory, whereas it facilitates older adults’ verbatim memory instead.
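
The group-wise correlations mentioned above could be computed along the lines of the following sketch. It is illustrative only, and the column names ('group', 'fix_dur_ms', 'misleading_rate') are assumptions, not the authors' variable names.

```python
# Minimal sketch (not the authors' analysis): Pearson correlation between
# fixation duration on the critical claim and the misleading rate, computed
# separately for each age group. Assumed columns: 'group', 'fix_dur_ms',
# 'misleading_rate', with one row per participant.
import pandas as pd
from scipy.stats import pearsonr

def group_correlations(df: pd.DataFrame) -> dict:
    """Return r, p and n for each age group in the data."""
    out = {}
    for group, sub in df.groupby("group"):
        r, p = pearsonr(sub["fix_dur_ms"], sub["misleading_rate"])
        out[group] = {"r": r, "p": p, "n": len(sub)}
    return out
```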


2022 ◽  
Author(s):  
Ivan Bouchardet da Fonseca Grebot ◽  
Pedro Henrique Pinheiro Cintra ◽  
Emilly Fátima Ferreira de Lima ◽  
Michella Vaz de Castro ◽  
Rui de Moraes

2017 ◽  
Author(s):  
Chi-Hsun Chang ◽  
Dan Nemrodov ◽  
Andy C. H. Lee ◽  
Adrian Nestor

Visual memory for faces has been extensively researched, especially regarding the main factors that influence face memorability. However, what we remember exactly about a face, namely, the pictorial content of visual memory, remains largely unclear. The current work aims to elucidate this issue by reconstructing face images from both perceptual and memory-based behavioural data. Specifically, our work builds upon and further validates the hypothesis that visual memory and perception share a common representational basis underlying facial identity recognition. To this end, we derived facial features directly from perceptual data and then used such features for image reconstruction separately from perception and memory data. Successful levels of reconstruction were achieved in both cases for newly-learned faces as well as for familiar faces retrieved from long-term memory. Theoretically, this work provides insights into the content of memory-based representations while, practically, it opens the path to novel applications, such as computer-based ‘sketch artists’.
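
In outline, a reconstruction of this kind combines an average face with behaviourally weighted feature images. The sketch below is a generic illustration of that idea, not the authors' pipeline, and the toy data stand in for features that would actually be derived from perceptual or memory-based responses.

```python
# Illustrative sketch only: reconstruct a face image as the average face plus
# a weighted sum of feature images. In practice the feature images would be
# derived from perceptual data and the per-identity weights estimated from
# perceptual or memory-based behavioural responses.
import numpy as np

def reconstruct_face(mean_face: np.ndarray,
                     feature_images: np.ndarray,
                     weights: np.ndarray) -> np.ndarray:
    """mean_face: (H, W); feature_images: (k, H, W); weights: (k,)."""
    recon = mean_face + np.tensordot(weights, feature_images, axes=1)
    return np.clip(recon, 0.0, 1.0)  # keep pixel intensities in [0, 1]

# Toy usage with random arrays standing in for real derived features.
rng = np.random.default_rng(0)
mean_face = rng.uniform(0.4, 0.6, size=(64, 64))
features = rng.normal(scale=0.05, size=(10, 64, 64))
weights = rng.normal(size=10)
image = reconstruct_face(mean_face, features, weights)
```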


i-Perception ◽  
2021 ◽  
Vol 12 (6) ◽  
pp. 204166952110563
Author(s):  
Ronja Mueller ◽  
Sandra Utz ◽  
Claus-Christian Carbon ◽  
Tilo Strobach

Recognizing familiar faces requires a comparison of the incoming perceptual information with mental face representations stored in memory. Mounting evidence indicates that these representations adapt quickly to recently perceived facial changes. This becomes apparent in face adaptation studies where exposure to a strongly manipulated face alters the perception of subsequent face stimuli: original, non-manipulated face images then appear to be manipulated, while images similar to the adaptor are perceived as “normal.” The face adaptation paradigm serves as a good tool for investigating the information stored in facial memory. So far, most face adaptation studies have focused on configural (second-order relationship) face information, largely neglecting non-configural face information (i.e., information that does not affect spatial face relations), such as color, although several (non-adaptation) studies have demonstrated the importance of color information in face perception and identification. The present study therefore focuses on adaptation effects for saturation, a type of color information, and compares the results with previous findings on brightness. The study reveals differences in the effect pattern and robustness, indicating that adaptation effects vary considerably even within the same class of non-configural face information.
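
Saturation-manipulated adaptors of the kind used in such studies can be generated by scaling an image's colour saturation. The sketch below is a generic illustration (not the authors' stimulus code), and the file names and scaling factors are placeholders.

```python
# Illustrative sketch: create a saturation-manipulated adaptor image by scaling
# colour saturation (factor 0.0 = greyscale, 1.0 = original, 2.0 = strongly
# over-saturated). File paths below are placeholders, not the study's stimuli.
from PIL import Image, ImageEnhance

def make_saturation_adaptor(path_in: str, path_out: str, factor: float) -> None:
    """Save a copy of the face image with its saturation scaled by `factor`."""
    img = Image.open(path_in).convert("RGB")
    ImageEnhance.Color(img).enhance(factor).save(path_out)

# e.g. make_saturation_adaptor("face_original.png", "face_desaturated.png", 0.2)
```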


2021 ◽  
Author(s):  
Shira C. Segal

The ability to recognize facial expressions of emotion is a critical part of human social interaction. Infants improve in this ability across the first year of life, but the mechanisms driving these changes and the origins of individual differences in this ability are largely unknown. This thesis used eye tracking to characterize infant scanning patterns of expressions. In study 1 (n = 40), I replicated the preference for fearful faces, and found that infants either allocated more attention to the eyes or the mouth across both happy and fearful expressions. In study 2 (n = 40), I found that infants differentially scanned the critical facial features of dynamic expressions. In study 3 (n = 38), I found that maternal depressive symptoms and positive and negative affect were related to individual differences in infants’ scanning of emotional expressions. Implications for our understanding of the development of emotion recognition are discussed.
Key words: emotion recognition, infancy, eye tracking, socioemotional development

