Overt and covert attention shifts to emotional faces: Combining EEG, eye tracking, and a go/no‐go paradigm

2021 ◽  
Author(s):  
Louisa Kulke ◽  
Lena Brümmer ◽  
Arezoo Pooresmaeili ◽  
Annekathrin Schacht

In everyday life, faces with emotional expressions quickly attract attention and eye-movements. To study the neural mechanisms of such emotion-driven attention by means of event-related brain potentials (ERPs), tasks that employ covert shifts of attention are commonly used, in which participants need to inhibit natural eye-movements towards stimuli. It remains, however, unclear how shifts of attention to emotional faces with and without eye-movements differ from each other. The current preregistered study aimed to investigate neural differences between covert and overt emotion-driven attention. We combined eye-tracking with measurements of ERPs to compare shifts of attention to faces with happy, angry or neutral expressions when eye-movements were either executed (Go conditions) or withheld (No-go conditions). Happy and angry faces led to larger EPN amplitudes, shorter latencies of the P1 component and faster saccades, suggesting that emotional expressions significantly affected shifts of attention. Several ERPs (N170, EPN, LPC) were augmented in amplitude when attention was shifted with an eye-movement, indicating enhanced neural processing of faces when eye-movements had to be executed together with a reallocation of attention. However, the modulation of ERPs by facial expressions did not differ between the Go and No-go conditions, suggesting that emotional content enhances both covert and overt shifts of attention. In summary, our results indicate that overt and covert attention shifts differ but are comparably affected by emotional content.
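As a minimal illustration of the kind of analysis this abstract describes (not the authors' actual pipeline), the sketch below extracts an EPN-style mean amplitude per condition from epoched EEG data. The sampling rate, time window, posterior channel indices, and the simulated data are all assumptions chosen for the example.

```python
import numpy as np

# Assumed recording parameters (illustrative, not the study's actual values)
SFREQ = 500                      # samples per second
TMIN = -0.2                      # epoch start relative to face onset (s)
EPN_WINDOW = (0.20, 0.35)        # assumed posterior-occipital EPN window (s)
POSTERIOR_CHANNELS = [0, 1, 2]   # indices of e.g. PO7, PO8, Oz in a hypothetical montage

def epn_mean_amplitude(epochs, window=EPN_WINDOW, channels=POSTERIOR_CHANNELS):
    """Mean amplitude over posterior channels in the EPN window.

    epochs: array of shape (n_trials, n_channels, n_samples), baseline-corrected
            and time-locked to face onset with TMIN seconds of pre-stimulus baseline.
    """
    start = int((window[0] - TMIN) * SFREQ)
    stop = int((window[1] - TMIN) * SFREQ)
    return epochs[:, channels, start:stop].mean(axis=(1, 2))

# Example with simulated data: 40 trials x 3 channels x 0.8 s epochs per condition
rng = np.random.default_rng(0)
n_samples = int(0.8 * SFREQ)
conditions = {
    ("happy", "go"): rng.normal(size=(40, 3, n_samples)),
    ("neutral", "nogo"): rng.normal(size=(40, 3, n_samples)),
}
for (emotion, task), data in conditions.items():
    amps = epn_mean_amplitude(data)
    print(f"{emotion:>7s} / {task:<4s}: mean EPN amplitude = {amps.mean():.3f}")
```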


2019 ◽  
Author(s):  
Louisa Kulke

Emotional faces draw attention and eye-movements towards them. However, the neural mechanisms of attention have mainly been investigated during fixation, which is uncommon in everyday life, where people move their eyes to shift attention to faces. Therefore, the current study combined eye-tracking and electroencephalography (EEG) to measure neural mechanisms of overt attention shifts to faces with happy, neutral and angry expressions, allowing participants to move their eyes freely towards the stimuli. Saccade latencies towards peripheral faces did not differ depending on expression, and amplitudes and latencies of the early neural response (P1) were unaffected. However, the later occurring Early Posterior Negativity (EPN) was significantly larger for emotional than for neutral faces. This response occurred after saccades towards the faces, so emotion modulations only arose after an overt shift of gaze towards the stimulus had already been completed. Visual saliency rather than emotional content may therefore drive early saccades, while later top-down processes reflect emotion processing.
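A hedged sketch of how a saccade latency towards a peripheral face might be derived from raw eye-tracker samples using a simple velocity-threshold criterion. The sampling rate, velocity threshold, and simulated trial are assumptions for illustration, not values reported in the study.

```python
import numpy as np

def first_saccade_latency(t, x, y, stim_onset, vel_threshold=30.0):
    """Latency (ms) of the first saccade after stimulus onset.

    t:    sample timestamps in seconds
    x, y: gaze position in degrees of visual angle
    vel_threshold: velocity criterion in deg/s (assumed value)
    Returns np.nan if no sample exceeds the threshold after onset.
    """
    vel = np.hypot(np.gradient(x, t), np.gradient(y, t))   # sample-to-sample gaze velocity
    after = (t >= stim_onset) & (vel > vel_threshold)
    if not after.any():
        return np.nan
    return (t[np.argmax(after)] - stim_onset) * 1000.0

# Simulated 500 Hz trial: fixation for ~180 ms after onset, then a rapid shift
# to a face presented at 8 degrees eccentricity
rng = np.random.default_rng(1)
t = np.arange(0, 0.6, 1 / 500)
x = np.where(t < 0.38, 0.0, 8.0) + rng.normal(0, 0.01, t.size)
y = rng.normal(0, 0.01, t.size)
print(f"saccade latency ≈ {first_saccade_latency(t, x, y, stim_onset=0.2):.0f} ms")
```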


2021 ◽  
Vol 12 ◽  
Author(s):  
Kendra Gimhani Kandana Arachchige ◽  
Wivine Blekic ◽  
Isabelle Simoes Loureiro ◽  
Laurent Lefebvre

Numerous studies have explored the benefit of iconic gestures for speech comprehension. However, only a few studies have investigated how visual attention is allocated to these gestures in the context of clear versus degraded speech, and how information is extracted from them to enhance comprehension. This study aimed to explore the effect of iconic gestures on comprehension and whether fixating the gesture is required for information extraction. Four types of gestures (i.e., semantically and syntactically incongruent iconic gestures, meaningless configurations, and congruent iconic gestures) were presented in a sentence context in three different listening conditions (i.e., clear, partly degraded or fully degraded speech). Using eye-tracking technology, participants' gaze was recorded while they watched video clips, after which they answered simple comprehension questions. Results first showed that different types of gestures attract attention differently and that the more the speech was degraded, the less participants paid attention to gestures. Furthermore, semantically incongruent gestures appeared to particularly impair comprehension even though they were not fixated, while congruent gestures appeared to improve comprehension despite likewise not being fixated. These results suggest that covert attention is sufficient to convey information that will be processed by the listener.
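A minimal sketch of the area-of-interest (AOI) style of gaze analysis this abstract implies: computing what proportion of gaze samples during a video clip fall inside a rectangular region covering the gesturing hands. The screen coordinates, AOI bounds, and simulated gaze are hypothetical.

```python
import numpy as np

# Hypothetical AOI covering the gesture space, in screen pixel coordinates
GESTURE_AOI = {"x_min": 400, "x_max": 880, "y_min": 500, "y_max": 900}

def dwell_proportion(gaze_x, gaze_y, aoi):
    """Proportion of gaze samples that fall inside a rectangular AOI."""
    inside = (
        (gaze_x >= aoi["x_min"]) & (gaze_x <= aoi["x_max"])
        & (gaze_y >= aoi["y_min"]) & (gaze_y <= aoi["y_max"])
    )
    return inside.mean()

# Simulated trial: gaze mostly on the speaker's face in the upper screen half,
# with occasional samples drifting down toward the gesturing hands
rng = np.random.default_rng(2)
gaze_x = rng.normal(640, 60, size=3000)
gaze_y = rng.normal(300, 80, size=3000)
gaze_y[::20] += 400   # a few samples land in the gesture region
print(f"dwell proportion on gesture AOI: {dwell_proportion(gaze_x, gaze_y, GESTURE_AOI):.3f}")
```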


2018 ◽  
Vol 9 ◽  
Author(s):  
Kathrin Cohen Kadosh ◽  
Simone P. Haller ◽  
Lena Schliephake ◽  
Mihaela Duta ◽  
Gaia Scerif ◽  
...  

2002 ◽  
Vol 42 (22) ◽  
pp. 2533-2545 ◽  
Author(s):  
Ziad M Hafed ◽  
James J Clark

2014 ◽  
Vol 29 (5) ◽  
pp. 807-815 ◽  
Author(s):  
S. V. Pavlov ◽  
V. V. Korenyok ◽  
N. V. Reva ◽  
A. V. Tumyalis ◽  
K. V. Loktev ◽  
...  

2020 ◽  
Vol 37 (7) ◽  
pp. 2166-2183
Author(s):  
Shayne Sanscartier ◽  
Jessica A. Maxwell ◽  
Penelope Lockwood

Attachment avoidance (discomfort with closeness and intimacy) has been inconsistently linked to visual disengagement from emotional faces, with some studies finding disengagement from specific emotional faces and others finding no effects. Although most studies use stranger faces as stimuli, attachment effects are likely most pronounced in the context of attachment relationships. The present study (N = 92) combined ecologically valid stimuli (i.e., pictures of the romantic partner's face) with eye-tracking methods to test more precisely whether highly avoidant individuals are faster at disengaging from emotional faces. Unexpectedly, attachment avoidance had no effect on saccadic reaction time, regardless of face type or emotion. Instead, all participants took longer to disengage from romantic partner faces than from strangers' faces, although this effect should be replicated in future work. Our results suggest that romantic attachments capture visual attention at an oculomotor level, regardless of one's personal attachment orientation.
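A sketch, under assumed coordinates and thresholds rather than the study's actual pipeline, of how a disengagement latency could be computed: the time from peripheral-target onset until gaze first leaves a central face AOI. The AOI radius, sampling rate, and simulated trial are all illustrative assumptions.

```python
import numpy as np

def disengagement_latency(t, x, y, target_onset, center=(0.0, 0.0), radius=3.0):
    """Time (ms) from peripheral-target onset until gaze first leaves a circular
    central AOI (e.g. the partner or stranger face); radius in degrees (assumed).
    Returns np.nan if gaze never leaves the AOI after onset.
    """
    dist = np.hypot(x - center[0], y - center[1])
    left = (t >= target_onset) & (dist > radius)
    if not left.any():
        return np.nan
    return (t[np.argmax(left)] - target_onset) * 1000.0

# Simulated 500 Hz trial: gaze stays on the central face, then jumps to a
# peripheral target 250 ms after the target appears
rng = np.random.default_rng(3)
t = np.arange(0, 1.0, 1 / 500)
x = np.where(t < 0.55, 0.0, 10.0) + rng.normal(0, 0.1, t.size)
y = rng.normal(0, 0.1, t.size)
print(f"disengagement latency ≈ {disengagement_latency(t, x, y, target_onset=0.3):.0f} ms")
```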


Author(s):  
Anna Kis ◽  
Anna Hernádi ◽  
Bernadett Miklósi ◽  
Orsolya Kanizsár ◽  
József Topál
