Overt and covert attention shifts to emotional faces – combining EEG, eye-tracking and a Go/No-go paradigm

2021 ◽  
Author(s):  
Louisa Kulke ◽  
Lena Brümmer ◽  
Arezoo Pooresmaeili ◽  
Annekathrin Schacht

In everyday life, faces with emotional expressions quickly attract attention and eye-movements. To study the neural mechanisms of such emotion-driven attention by means of event-related brain potentials (ERPs), tasks that employ covert shifts of attention are commonly used, in which participants need to inhibit natural eye-movements towards stimuli. It remains unclear, however, how shifts of attention to emotional faces with and without eye-movements differ from each other. The current preregistered study aimed to investigate neural differences between covert and overt emotion-driven attention. We combined eye-tracking with measurements of ERPs to compare shifts of attention to faces with happy, angry or neutral expressions when eye-movements were either executed (Go conditions) or withheld (No-go conditions). Happy and angry faces led to larger EPN amplitudes, shorter latencies of the P1 component and faster saccades, suggesting that emotional expressions significantly affected shifts of attention. Several ERPs (N170, EPN, LPC) were augmented in amplitude when attention was shifted with an eye-movement, indicating enhanced neural processing of faces if eye-movements had to be executed together with a reallocation of attention. However, the modulation of ERPs by facial expressions did not differ between the Go and No-go conditions, suggesting that emotional content enhances both covert and overt shifts of attention. In summary, our results indicate that overt and covert attention shifts differ but are comparably affected by emotional content.

2019 ◽  
Author(s):  
Louisa Kulke

Emotional faces draw attention and eye-movements towards them. However, the neural mechanisms of attention have mainly been investigated during fixation, which is uncommon in everyday life, where people move their eyes to shift attention to faces. Therefore, the current study combined eye-tracking and electroencephalography (EEG) to measure neural mechanisms of overt attention shifts to faces with happy, neutral and angry expressions, allowing participants to move their eyes freely towards the stimuli. Saccade latencies towards peripheral faces did not differ depending on expression, and early neural response (P1) amplitudes and latencies were unaffected. However, the later-occurring Early Posterior Negativity (EPN) was significantly larger for emotional than for neutral faces. This response occurred after saccades towards the faces; emotion modulations therefore only emerged after an overt shift of gaze towards the stimulus had already been completed. Visual saliency rather than emotional content may thus drive early saccades, while later top-down processes reflect emotion processing.


PLoS ONE ◽  
2021 ◽  
Vol 16 (1) ◽  
pp. e0245777
Author(s):  
Fanny Poncet ◽  
Robert Soussignan ◽  
Margaux Jaffiol ◽  
Baptiste Gaudelus ◽  
Arnaud Leleu ◽  
...  

Recognizing facial expressions of emotions is a fundamental ability for adaptation to the social environment. To date, it remains unclear whether the spatial distribution of eye movements predicts accurate recognition or, on the contrary, confusion in the recognition of facial emotions. In the present study, we asked participants to recognize facial emotions while monitoring their gaze behavior using eye-tracking technology. In Experiment 1a, 40 participants (20 women) performed a classic facial emotion recognition task with a 5-choice procedure (anger, disgust, fear, happiness, sadness). In Experiment 1b, a second group of 40 participants (20 women) was exposed to the same materials and procedure except that they were instructed to say whether (i.e., Yes/No response) the face expressed a specific emotion (e.g., anger), with the five emotion categories tested in distinct blocks. In Experiment 2, two groups of 32 participants performed the same task as in Experiment 1a while exposed to partial facial expressions composed of action units (AUs) present or absent in some parts of the face (top, middle, or bottom). The coding of the AUs produced by the models showed complex facial configurations for most emotional expressions, with several AUs in common. Eye-tracking data indicated that relevant facial actions were actively gazed at by the decoders during both accurate recognition and errors. False recognition was mainly associated with additional visual exploration of less relevant facial actions in regions containing ambiguous AUs or AUs relevant to other emotional expressions. Finally, the recognition of facial emotions from partial expressions showed that no single facial action was necessary to effectively communicate an emotional state. Instead, the recognition of facial emotions relied on the integration of a complex set of facial cues.



Foods ◽  
2021 ◽  
Vol 10 (2) ◽  
pp. 354
Author(s):  
Jakub Berčík ◽  
Johana Paluchová ◽  
Katarína Neomániová

The appearance of food creates certain expectations regarding the harmonization of taste, delicacy, and overall quality, which subsequently affects not only intake itself but also many other features of the behavior of customers of catering facilities. The main goal of this article is to find out what effect the visual design of food (waffles) prepared from the same ingredients and served in three different ways—on a stone plate, in street-food style, and on a classic white plate—has on consumer preferences. In addition to the classic tablet-assisted personal interview (TAPI) tools, biometric methods such as eye tracking and face reading were used in order to obtain unconscious feedback. During testing, the air quality in the room was monitored with an Extech device, and the influence of the visual design of food on the perception of its smell was checked. At the end of the paper, we point out the importance of using classic feedback-collection techniques (TAPI) and extending them by measuring subconscious reactions based on monitoring the eye movements and facial expressions of the respondents, which provides a whole new perspective on the perception of visual design and the serving of food as well as more effective targeting and use of corporate resources.


2003 ◽  
Vol 358 (1431) ◽  
pp. 561-572 ◽  
Author(s):  
R. J. R. Blair

Human emotional expressions serve a crucial communicatory role, allowing the rapid transmission of valence information from one individual to another. This paper reviews the literature on the neural mechanisms necessary for this communication: both the mechanisms involved in the production of emotional expressions and those involved in the interpretation of the emotional expressions of others. Finally, reference is made to the neuropsychiatric disorders of autism, psychopathy and acquired sociopathy, conditions in which the appropriate processing of emotional expressions is impaired. In autism, it is argued that the basic response to emotional expressions remains intact but that there is an impaired ability to represent the referent of the individual displaying the emotion. In psychopathy, the response to fearful and sad expressions is attenuated, and this interferes with socialization, resulting in an individual who fails to learn to avoid actions that result in harm to others. In acquired sociopathy, the response to angry expressions in particular is attenuated, resulting in reduced regulation of social behaviour.


2016 ◽  
Vol 29 (8) ◽  
pp. 749-771 ◽  
Author(s):  
Min Hooi Yong ◽  
Ted Ruffman

Dogs respond to human emotional expressions. However, it is unknown whether dogs can match emotional faces to voices in an intermodal matching task or whether they show preferences for looking at certain emotional facial expressions over others, similar to human infants. We presented 52 domestic dogs and 24 seven-month-old human infants with two different human emotional facial expressions of the same gender simultaneously, while they listened to a human voice expressing an emotion that matched one of them. Consistent with most matching studies, neither dogs nor infants looked longer at the matching emotional stimuli, yet dogs and humans demonstrated an identical pattern of looking less at sad faces when paired with happy or angry faces (irrespective of the vocal stimulus), with no preference for happy versus angry faces. Discussion focuses on why dogs and infants might have an aversion to sad faces or, alternatively, a heightened interest in angry and happy faces.


2019 ◽  
Vol 31 (11) ◽  
pp. 1631-1640 ◽  
Author(s):  
Maria Kuehne ◽  
Isabelle Siwy ◽  
Tino Zaehle ◽  
Hans-Jochen Heinze ◽  
Janek S. Lobmaier

Facial expressions provide information about an individual's intentions and emotions and are thus an important medium for nonverbal communication. Theories of embodied cognition assume that facial mimicry and the resulting facial feedback play an important role in the perception of facial emotional expressions. Although behavioral and electrophysiological studies have confirmed the influence of facial feedback on the perception of facial emotional expressions, the influence of facial feedback on the automatic processing of such stimuli is largely unexplored. The automatic processing of unattended facial expressions can be investigated with the visual expression-related mismatch negativity (MMN). The expression-related MMN reflects a differential ERP of automatic detection of emotional changes elicited by rarely presented facial expressions (deviants) among frequently presented facial expressions (standards). In this study, we investigated the impact of facial feedback on the automatic processing of facial expressions. For this purpose, participants (n = 19) performed a centrally presented visual detection task while neutral (standard), happy, and sad faces (deviants) were presented peripherally. During the task, facial feedback was manipulated by different pen-holding conditions (holding the pen with the teeth, lips, or nondominant hand). Our results indicate that automatic processing of facial expressions is influenced by, and thus dependent on, one's own facial feedback.


2018 ◽  
Vol 122 (4) ◽  
pp. 1432-1448 ◽  
Author(s):  
Charlott Maria Bodenschatz ◽  
Anette Kersting ◽  
Thomas Suslow

Orientation of gaze toward specific regions of the face such as the eyes or the mouth helps to correctly identify the underlying emotion. The present eye-tracking study investigates whether facial features diagnostic of specific emotional facial expressions are processed preferentially, even when presented outside of subjective awareness. Eye movements of 73 healthy individuals were recorded while completing an affective priming task. Primes (pictures of happy, neutral, sad, angry, and fearful facial expressions) were presented for 50 ms with forward and backward masking. Participants had to evaluate subsequently presented neutral faces. Results of an awareness check indicated that participants were subjectively unaware of the emotional primes. No affective priming effects were observed but briefly presented emotional facial expressions elicited early eye movements toward diagnostic regions of the face. Participants oriented their gaze more rapidly to the eye region of the neutral mask after a fearful facial expression. After a happy facial expression, participants oriented their gaze more rapidly to the mouth region of the neutral mask. Moreover, participants dwelled longest on the eye region after a fearful facial expression, and the dwell time on the mouth region was longest for happy facial expressions. Our findings support the idea that briefly presented fearful and happy facial expressions trigger an automatic mechanism that is sensitive to the distribution of relevant facial features and facilitates the orientation of gaze toward them.


2013 ◽  
Vol 6 (4) ◽  
Author(s):  
Banu Cangöz ◽  
Arif Altun ◽  
Petek Aşkar ◽  
Zeynel Baran ◽  
Sacide Güzin Mazman

The main objective of the study is to investigate the effects of the age of the model, the gender of the observer, and lateralization on visual screening patterns while looking at emotional facial expressions. Data were collected through eye-tracking methodology. The areas of interest were set to include the eyes, nose and mouth. The selected eye metrics were first fixation duration, fixation duration and fixation count. These eye-tracking metrics were recorded for different emotional expressions (sad, happy, neutral) and conditions (age of model, part of face and lateralization). The results revealed that participants looked at older faces for a shorter time and fixated on them less than on younger faces. This study also showed that when participants were asked to passively look at facial expressions, the eyes were important areas in determining sadness and happiness, whereas the eyes and nose were important in determining neutral expressions. The longest-fixated face area was the eyes for both young and old models. Lastly, the hemispheric lateralization hypothesis regarding emotional face processing was supported.


Perception ◽  
1997 ◽  
Vol 26 (1_suppl) ◽  
pp. 297-297
Author(s):  
Y Osada ◽  
Y Nagasaka ◽  
R Yamazaki

We recorded eye movements by the method of corneal reflection while ten subjects viewed schematic faces drawn with lines. Each subject viewed different emotional faces: happy, angry, sad, disgusted, interested, frightened, and surprised. We measured the subjects' judgements in terms of percentage 'correct' and reaction time. Schematic faces were composed of the face outline contours and of the brow, eyes, nose, and mouth, which could all be modified to produce particular expressions. By masking parts of the face, we examined which features would have the greatest effects on judgements of emotion. Subjects always made a saccade to the eyes and fixated them even when the eyes were not important for the judgement. They also made a saccade to the centre of the face and fixated it even when only the mouth was presented. The presentation of only the brow decreased the correct rate for the expression of 'surprise' but played an important role in the 'sad' judgement. The 'angry' judgement depended significantly on the brow and mouth. The eyes contributed greatly to the 'disgusted' judgement. These results suggest that the judgement of facial expressions of emotion can be strongly affected by each part of the schematic face. The concentration of saccades on the centre of the face suggests that the 'configuration balance' of the face is also likely to be important.

