Focusing on the face or getting distracted by social signals? The effect of distracting gestures on attentional focus in natural interaction

Author(s):  
Jasmin Kajopoulos ◽  
Gordon Cheng ◽  
Koichi Kise ◽  
Hermann J. Müller ◽  
Agnieszka Wykowska
2021 ◽  

Attentional orienting towards others’ gaze direction or pointing has been well investigated in laboratory conditions. However, less is known about the operation of attentional mechanisms in online naturalistic social interaction scenarios. It is equally plausible that following social directional cues (gaze, pointing) occurs reflexively, and/or that it is influenced by top-down cognitive factors. In a mobile eye-tracking experiment, we show that under natural interaction conditions overt attentional orienting is not necessarily reflexively triggered by pointing gestures or a combination of gaze shifts and pointing gestures. We found that participants conversing with an experimenter, who, during the interaction, would play out pointing gestures as well as directional gaze movements, continued to mostly focus their gaze on the face of the experimenter, demonstrating the significance of attending to the face of the interaction partner – in line with effective top-down control over reflexive orienting of attention in the direction of social cues.
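The dependent measure in such mobile eye-tracking studies is typically the share of gaze time spent on each area of interest (AOI), e.g. the partner's face versus the pointed-at location. A minimal sketch of that computation, with hypothetical AOI labels and durations (not the authors' analysis pipeline):

```python
def dwell_proportions(fixations):
    """Share of total fixation time per area of interest (AOI).

    `fixations` is a list of (aoi_label, duration_ms) tuples.
    """
    total = sum(dur for _, dur in fixations)
    shares = {}
    for aoi, dur in fixations:
        shares[aoi] = shares.get(aoi, 0) + dur
    # Normalize accumulated durations to proportions of the trial.
    return {aoi: t / total for aoi, t in shares.items()}

# Hypothetical trial: gaze stays mostly on the face despite a pointing cue.
fix = [("face", 3200), ("pointed_object", 400), ("face", 1400), ("background", 200)]
props = dwell_proportions(fix)
```

Here `props["face"]` comes out near 0.88, the kind of face-dominant pattern the abstract describes.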




2015 ◽  
Vol 112 (47) ◽  
pp. 14717-14722 ◽  
Author(s):  
Clark Fisher ◽  
Winrich A. Freiwald

The primate brain contains a set of face-selective areas, which are thought to extract the rich social information that faces provide, such as emotional state and personal identity. The nature of this information raises a fundamental question about these face-selective areas: Do they respond to a face purely because of its visual attributes, or because the face embodies a larger social agent? Here, we used functional magnetic resonance imaging to determine whether the macaque face patch system exhibits a whole-agent response above and beyond its responses to individually presented faces and bodies. We found a systematic development of whole-agent preference through the face patches, from subadditive integration of face and body responses in posterior face patches to superadditive integration in anterior face patches. Superadditivity was not observed for faces atop nonbody objects, implying categorical specificity of face–body interaction. Furthermore, superadditivity was robust to visual degradation of facial detail, suggesting whole-agent selectivity does not require prior face recognition. In contrast, even the body patches immediately adjacent to anterior face areas did not exhibit superadditivity. This asymmetry between face- and body-processing systems may explain why observers attribute bodies’ social signals to faces, and not vice versa. The development of whole-agent selectivity from posterior to anterior face patches, in concert with the recently described development of natural motion selectivity from ventral to dorsal face patches, identifies a single face patch, AF (anterior fundus), as a likely link between the analysis of facial shape and semantic inferences about other agents.


2018 ◽  
Vol 9 (2) ◽  
pp. jep.062917 ◽  
Author(s):  
Roy S. Hessels ◽  
Gijs A. Holleman ◽  
Tim H. W. Cornelissen ◽  
Ignace T. C. Hooge ◽  
Chantal Kemner

Research on social impairments in psychopathology has relied heavily on the face processing literature. However, although many sub-systems of facial information processing are described, recent evidence suggests that generalizability of these findings to social settings may be limited. The main argument is that in social interaction, the content of faces is more dynamic and dependent on the interplay between interaction partners, than the content of a non-responsive face (e.g. pictures or videos) as portrayed in a typical experiment. The question beckons whether gaze atypicalities to non-responsive faces in certain disorders generalize to faces in interaction. In the present study, a dual eye-tracking setup capable of recording gaze with high resolution was used to investigate how gaze behavior in interaction is related to traits of Autism Spectrum Disorder (ASD), and Social Anxiety Disorder (SAD). As clinical ASD and SAD groups have exhibited deficiencies in reciprocal social behavior, traits of these two conditions were assessed in a general population. We report that gaze behavior in interaction of individuals scoring high on ASD and SAD traits corroborates hypotheses posed in typical face-processing research using non-responsive stimuli. Moreover, our findings on the relation between paired gaze states (when and how often pairs look at each other’s eyes simultaneously or alternately) and ASD and SAD traits bear resemblance to prevailing models in the ASD literature (the ‘gaze aversion’ model) and SAD literature (the ‘vigilant-avoidance’ model). Pair-based analyses of gaze may reveal behavioral patterns crucial to our understanding of ASD and SAD, and more generally to our understanding of eye movements as social signals in interaction.
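The paired gaze states described above can be derived by aligning the two participants' gaze streams sample by sample and classifying each pair. A minimal sketch under the assumption of synchronized boolean "gaze on partner's eyes" samples (hypothetical data, not the study's pipeline):

```python
def paired_gaze_states(gaze_a, gaze_b):
    """Classify each synchronized sample pair from a dual eye-tracking setup.

    gaze_a / gaze_b: per-sample booleans, True when that participant's gaze
    falls on the partner's eye region. Returns counts of paired states.
    """
    counts = {"mutual": 0, "one_way": 0, "none": 0}
    for a, b in zip(gaze_a, gaze_b):
        if a and b:
            counts["mutual"] += 1    # both look at each other's eyes
        elif a or b:
            counts["one_way"] += 1   # exactly one partner looks
        else:
            counts["none"] += 1      # neither looks
    return counts

# Toy six-sample recording (hypothetical data):
states = paired_gaze_states([True, True, False, True, False, False],
                            [True, False, False, True, True, False])
```

Dividing each count by the sample rate yields the durations ("when and how often") the abstract relates to ASD and SAD traits.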


2009 ◽  
Vol 364 (1535) ◽  
pp. 3497-3504 ◽  
Author(s):  
Ursula Hess ◽  
Reginald B. Adams ◽  
Robert E. Kleck

Faces are not simply blank canvases upon which facial expressions write their emotional messages. In fact, facial appearance and facial movement are both important social signalling systems in their own right. We here provide multiple lines of evidence for the notion that the social signals derived from facial appearance on the one hand and facial movement on the other interact in a complex manner, sometimes reinforcing and sometimes contradicting one another. Faces provide information on who a person is. Sex, age, ethnicity, personality and other characteristics that can define a person and the social group the person belongs to can all be derived from the face alone. The present article argues that faces interact with the perception of emotion expressions because this information informs a decoder's expectations regarding an expresser's probable emotional reactions. Facial appearance also interacts more directly with the interpretation of facial movement because some of the features that are used to derive personality or sex information are also features that closely resemble certain emotional expressions, thereby enhancing or diluting the perceived strength of particular expressions.


1997 ◽  
Vol 25 (1) ◽  
pp. 62 ◽  
Author(s):  
D.I. Perrett ◽  
M.W. Oram ◽  
E. Lorincz ◽  
N.J. Emery ◽  
C. Baker

2019 ◽  
Vol 14 (1) ◽  
Author(s):  
Jimmy Debladis ◽  
Marion Valette ◽  
Kuzma Strenilkov ◽  
Carine Mantoulan ◽  
Denise Thuilleaux ◽  
...  

Background: Faces are critical social cues that must be perfectly processed in order to engage appropriately in everyday social interactions. In Prader-Willi Syndrome (PWS), a rare genetic disorder characterized by cognitive and behavioural difficulties including autism spectrum disorder, the literature referring to face processing is sparse. Given reports of poor social interactions in individuals with PWS, we sought to assess their face and emotion recognition skills during eye-tracking recordings. Results: Compared with controls, patients with PWS performed more poorly on face/emotion recognition. We observed atypical facial exploration by patients with maternal disomy. These patients looked preferentially at the mouth region, whereas patients with a deletion and controls were more attracted to the eye region. During social scenes, the exploration became more atypical as the social content increased. Conclusions: Our comprehensive study brings new insights into the face processing of patients with PWS. Atypical facial exploration was only displayed by patients with the maternal disomy subtype, corresponding to their higher rate of autism spectrum disorder. This finding strongly argues in favor of early identification of this genetic subgroup in order to optimize care by implementing tailored interventions for each patient as soon as possible.
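The eye-versus-mouth preference reported here is often summarized as a normalized dwell-time index. A hedged sketch, with illustrative dwell times that are not data from the study:

```python
def eye_mouth_index(eye_ms, mouth_ms):
    """Normalized preference for the eye over the mouth region.

    +1 = all face dwell time on the eyes, -1 = all on the mouth, 0 = balanced.
    """
    return (eye_ms - mouth_ms) / (eye_ms + mouth_ms)

# Illustrative values only: controls prefer the eyes, the maternal-disomy
# pattern described in the abstract would show the reverse sign.
control = eye_mouth_index(eye_ms=2600, mouth_ms=900)
disomy = eye_mouth_index(eye_ms=800, mouth_ms=2100)
```

A positive index reflects the eye preference seen in controls and deletion patients; a negative index reflects the mouth preference of the maternal disomy group.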


2018 ◽  
Vol 71 (3) ◽  
pp. 569-594 ◽  
Author(s):  
Andrew W Young

The fact that the face is a source of diverse social signals allows us to use face and person perception as a model system for asking important psychological questions about how our brains are organised. A key issue concerns whether we rely primarily on some form of generic representation of the common physical source of these social signals (the face) to interpret them, or instead create multiple representations by assigning different aspects of the task to different specialist components. Variants of the specialist components hypothesis have formed the dominant theoretical perspective on face perception for more than three decades, but despite this dominance of formally and informally expressed theories, the underlying principles and extent of any division of labour remain uncertain. Here, I discuss three important sources of constraint: first, the evolved structure of the brain; second, the need to optimise responses to different everyday tasks; and third, the statistical structure of faces in the perceiver’s environment. I show how these constraints interact to determine the underlying functional organisation of face and person perception.


The direction of eye gaze and orientation of the face towards or away from another are important social signals for man and for macaque monkey. We have studied the effects of these signals in a region of the macaque temporal cortex where cells have been found to be responsive to the sight of faces. Of cells selectively responsive to the sight of the face or head but not to other objects (182 cells) 63% were sensitive to the orientation of the head. Different views of the head (full face, profile, back or top of the head, face rotated by 45° up to the ceiling or down to the floor) maximally activated different classes of cell. All classes of cell, however, remained active as the preferred view was rotated isomorphically or was changed in size or distance. Isomorphic rotation by 90–180° increased cell response latencies by 10–60 ms. Sensitivity to gaze direction was found for 64% of the cells tested that were tuned to head orientation. Eighteen cells most responsive to the full face preferred eye contact, while 18 cells tuned to the profile face preferred averted gaze. Sensitivity to gaze was thus compatible with, but could be independent of, sensitivity to head orientation. Results suggest that the recognition of one type of object may proceed via the independent high level analysis of several restricted views of the object (viewer-centred descriptions).


Author(s):  
Kayley Birch-Hurst ◽  
Magdalena Rychlowska ◽  
Michael B. Lewis ◽  
Ross E. Vanderwert

People tend to automatically imitate others’ facial expressions of emotion. That reaction, termed “facial mimicry”, has been linked to sensorimotor simulation—a process in which the observer’s brain recreates and mirrors the emotional experience of the other person, potentially enabling empathy and deep, motivated processing of social signals. However, the neural mechanisms that underlie sensorimotor simulation remain unclear. This study tests how interfering with facial mimicry by asking participants to hold a pen in their mouth influences the activity of the human mirror neuron system, indexed by the desynchronization of the EEG mu rhythm. This response arises from sensorimotor brain areas during observed and executed movements and has been linked with empathy. We recorded EEG during passive viewing of dynamic facial expressions of anger, fear, and happiness, as well as nonbiological moving objects. We examine mu desynchronization under conditions of free versus altered facial mimicry and show that desynchronization is present when adult participants can freely move but not when their facial movements are inhibited. Our findings highlight the importance of motor activity and facial expression in emotion communication. They also have important implications for behaviors that involve occupying or hiding the lower part of the face.
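Mu desynchronization is conventionally quantified as the percentage change in mu-band (roughly 8–13 Hz) power during the event relative to a pre-stimulus baseline. A minimal sketch with illustrative power values (not data from the study):

```python
def mu_erd_percent(baseline_power, event_power):
    """Event-related (de)synchronization of mu-band power.

    Negative values = desynchronization (a power drop relative to baseline),
    the index of sensorimotor engagement used in such studies.
    """
    return (event_power - baseline_power) / baseline_power * 100.0

# Illustrative numbers: a clear power drop when mimicry is free,
# little change when facial movements are inhibited.
free_mimicry = mu_erd_percent(baseline_power=10.0, event_power=7.0)
blocked = mu_erd_percent(baseline_power=10.0, event_power=9.8)
```

A strongly negative value for the free-mimicry condition and a near-zero value for the blocked condition would correspond to the pattern the abstract reports.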

