Pre-exposure to ambiguous faces modulates top-down control of attentional orienting to counterpredictive gaze cues

2020 ◽  
Author(s):  
Abdulaziz Abubshait ◽  
Ali Momen ◽  
Eva Wiese

Understanding and reacting to others’ nonverbal social signals, such as changes in gaze direction (i.e., gaze cues), is essential for social interactions, as it supports processes such as joint attention and mentalizing. Although attentional orienting in response to gaze cues has a strong reflexive component, accumulating evidence shows that it can be top-down controlled by context information regarding the signals’ social relevance. For example, when a gazer is believed to be an entity “with a mind” (i.e., mind perception), people exert more top-down control on attentional orienting. Although increasing an agent’s physical human-likeness can enhance mind perception, it could have negative consequences for top-down control of social attention when a gazer’s physical appearance is categorically ambiguous (i.e., difficult to categorize as human or nonhuman), as resolving this ambiguity would require cognitive resources that could otherwise be used to top-down control attentional orienting. To examine this question, we used mouse-tracking to explore whether categorically ambiguous agents are associated with increased processing costs (Experiment 1), whether categorically ambiguous stimuli negatively impact top-down control of social attention (Experiment 2), and whether resolving the conflict related to the agent’s categorical ambiguity (through exposure) restores top-down control of attentional orienting (Experiment 3). The findings suggest that categorically ambiguous stimuli are associated with cognitive conflict, which negatively impacts the ability to exert top-down control on attentional orienting in a counterpredictive gaze-cueing paradigm; this negative impact, however, is attenuated when participants are pre-exposed to the stimuli prior to the gaze-cueing task. Taken together, these findings suggest that manipulating physical human-likeness is a powerful way to affect mind perception in human-robot interaction but yields diminishing returns for social attention when the agent’s appearance is categorically ambiguous, due to the drain on cognitive resources and the resulting impairment of top-down control.
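As a rough illustration of how a counterpredictive gaze-cueing effect is typically scored (this is not the authors' analysis code; the trial fields and RT values below are hypothetical), the sketch computes the mean response-time difference between targets at gazed-at and non-gazed-at locations. In a counterpredictive design, targets appear mostly opposite the gaze, so a negative effect would indicate top-down reorienting away from the cue.

```python
# Minimal sketch, assuming hypothetical trial records; not the study's pipeline.
from statistics import mean

trials = [
    # one dict per trial: was the target at the gazed-at location, and the RT in ms
    {"target_at_gazed_location": True,  "rt_ms": 412.0},
    {"target_at_gazed_location": False, "rt_ms": 388.0},
    {"target_at_gazed_location": True,  "rt_ms": 431.0},
    {"target_at_gazed_location": False, "rt_ms": 395.0},
]

rt_cued = mean(t["rt_ms"] for t in trials if t["target_at_gazed_location"])
rt_uncued = mean(t["rt_ms"] for t in trials if not t["target_at_gazed_location"])

# > 0: faster at the gazed-at location (reflexive gaze following)
# < 0: faster at the opposite, counterpredicted location (top-down reorienting)
cueing_effect = rt_uncued - rt_cued
print(f"gaze-cueing effect: {cueing_effect:.1f} ms")
```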

2019 ◽  
Vol 374 (1771) ◽  
pp. 20180430 ◽  
Author(s):  
Eva Wiese ◽  
Abdulaziz Abubshait ◽  
Bobby Azarian ◽  
Eric J. Blumberg

In social interactions, we rely on non-verbal cues like gaze direction to understand the behaviour of others. How we react to these cues is determined by the degree to which we believe that they originate from an entity with a mind capable of having internal states and showing intentional behaviour, a process called mind perception. While prior work has established a set of neural regions linked to mind perception, research has just begun to examine how mind perception affects social-cognitive mechanisms like gaze processing at the neuronal level. In the current experiment, participants performed a social attention task (i.e. attentional orienting to gaze cues) with either a human or a robot agent (i.e. manipulation of mind perception) while transcranial direct current stimulation (tDCS) was applied to prefrontal and temporo-parietal brain areas. The results show that temporo-parietal stimulation did not modulate mechanisms of social attention in response to either the human or the robot agent, whereas prefrontal stimulation enhanced attentional orienting in response to human gaze cues and attenuated attentional orienting in response to robot gaze cues. The findings suggest that mind perception modulates low-level mechanisms of social cognition via prefrontal structures, and that a certain degree of mind perception is essential in order for prefrontal stimulation to affect mechanisms of social attention. This article is part of the theme issue ‘From social brains to social robots: applying neurocognitive insights to human–robot interaction’.


2018 ◽  
Author(s):  
Eva Wiese ◽  
Aziz Abubshait ◽  
Bobby Azarian ◽  
Eric J. Blumberg

In social interactions, we rely on nonverbal cues like gaze direction to understand the behavior of others. How we react to these cues is determined by the degree to which we believe that they originate from an entity with a mind capable of having internal states and showing intentional behavior, a process called mind perception. While prior work has established a set of neural regions linked to mind perception, research has just begun to examine how mind perception affects social-cognitive mechanisms like gaze processing at the neuronal level. In the current experiment, participants performed a social attention task (i.e., attentional orienting to gaze cues) with either a human or a robot agent (i.e., variation of mind perception), while transcranial direct current stimulation (tDCS) was applied either to prefrontal or temporo-parietal areas, both regions that have been linked to mind perception in previous studies. The results show that stimulation of temporo-parietal areas did not modulate social attention in response to either the human or the robot agent. In contrast, stimulation of prefrontal areas enhanced attentional orienting in response to human gaze cues and attenuated attentional orienting in response to robot gaze cues. Post-hoc analyses revealed that prefrontal stimulation particularly affected those participants who had followed human gaze more strongly than robot gaze at baseline. These findings suggest that mind perception modulates low-level mechanisms of social cognition via prefrontal structures, and that a certain degree of mind perception is essential in order to benefit from active stimulation of prefrontal areas.


2020 ◽  
Author(s):  
Abdulaziz Abubshait ◽  
Agnieszka Wykowska

Gaze behavior is an important social signal between humans, as it communicates locations of interest. People typically orient their attention to where others look, as this informs them about others’ intentions and future actions. Studies have shown that humans can engage in similar gaze behavior with robots, but presumably more so when they adopt the intentional stance towards them (i.e., believe that robot behaviors are intentional). In laboratory settings, the phenomenon of attending towards the direction of others’ gaze has been examined with the gaze-cueing paradigm. While the gaze-cueing paradigm has been successful in investigating the relationship between adopting the intentional stance towards robots and attentional orienting to gaze cues, it is unclear whether the repetitiveness of the gaze-cueing paradigm influences adopting the intentional stance. Here, we examined whether the duration of exposure to repetitive robot gaze behavior in a gaze-cueing task has a negative impact on the subjective attribution of intentionality. Participants performed a short, medium, or long face-to-face gaze-cueing paradigm with an embodied robot while subjective ratings were collected before and after the interaction. Results show that participants in the long-exposure condition had the smallest change, if any, in their intention-attribution scores, whereas those in the short-exposure condition showed a positive change, indicating that participants attributed more intention to the robot after short interactions. The results also show that attentional orienting to robot gaze cues was positively related to how much intention was attributed to the robot, but this relationship became more negative as the length of exposure increased. In contrast to the subjective ratings, gaze-cueing effects increased as a function of the duration of exposure to the repetitive behavior. The data suggest a tradeoff between the number of trials needed to observe various mechanisms of social cognition, such as gaze-cueing effects, and the likelihood of adopting the intentional stance towards a robot.
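A minimal sketch of the kind of relationship described here, assuming hypothetical participant-level data (field names, rating scales, and values are illustrative, not the study's data): each participant contributes a pre-to-post change in intention-attribution ratings and a gaze-cueing effect, and the two are correlated.

```python
# Minimal sketch, assuming made-up participant records; not the study's data or code.
import numpy as np

participants = [
    # exposure: "short" | "medium" | "long"; ratings on an arbitrary scale; cueing in ms
    {"exposure": "short", "intention_pre": 3.1, "intention_post": 4.0, "cueing_ms": 18.0},
    {"exposure": "short", "intention_pre": 2.8, "intention_post": 3.5, "cueing_ms": 22.0},
    {"exposure": "long",  "intention_pre": 3.4, "intention_post": 3.3, "cueing_ms": 35.0},
    {"exposure": "long",  "intention_pre": 3.0, "intention_post": 2.9, "cueing_ms": 31.0},
]

change = np.array([p["intention_post"] - p["intention_pre"] for p in participants])
cueing = np.array([p["cueing_ms"] for p in participants])

# Pearson correlation between intention-attribution change and the gaze-cueing effect;
# the abstract reports this relationship shifting with exposure duration.
r = np.corrcoef(change, cueing)[0, 1]
print(f"r(intention change, cueing effect) = {r:.2f}")
```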


2019 ◽  
Vol 31 (5) ◽  
pp. 686-698 ◽  
Author(s):  
Brianna Ruth Doherty ◽  
Freek van Ede ◽  
Alexander Fraser ◽  
Eva Zita Patai ◽  
Anna Christina Nobre ◽  
...  

Social attention when viewing natural social (compared with nonsocial) images has functional consequences on contextual memory in healthy human adults. In addition to attention affecting memory performance, strong evidence suggests that memory, in turn, affects attentional orienting. Here, we ask whether the effects of social processing on memory alter subsequent memory-guided attention orienting and the corresponding anticipatory dynamics of 8–12 Hz alpha-band oscillations as measured with EEG. Eighteen young adults searched for targets in scenes that contained either social or nonsocial distractors, and their memory precision was tested. Subsequently, RT was measured as participants oriented to targets appearing in those scenes at either valid (previously learned) locations or invalid (different) locations. Memory precision was poorer for target locations in social scenes. In addition, distractor type moderated the validity effect during memory-guided attentional orienting, with a larger RT cost when targets appeared at invalid (different) locations within scenes containing social distractors. The poorer memory performance was also marked by reduced anticipatory dynamics of spatially lateralized 8–12 Hz alpha-band oscillations for scenes with social distractors. The functional consequences of a social attention bias therefore extend from memory to memory-guided attention orienting, a bidirectional chain that may further reinforce attentional biases.
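One common way to quantify spatially lateralized anticipatory alpha is a normalized contralateral-minus-ipsilateral power index over posterior electrodes in the pre-target window; the sketch below illustrates that index with simulated power values (the authors' exact EEG pipeline and index may differ, and all values here are hypothetical).

```python
# Minimal sketch of a standard alpha (8-12 Hz) lateralization index; not the study's pipeline.
import numpy as np

def alpha_lateralization(contra_power: np.ndarray, ipsi_power: np.ndarray) -> float:
    """Normalized difference of alpha power contralateral vs. ipsilateral to the
    anticipated (memory-guided) target location, averaged over electrodes and
    the anticipatory time window."""
    contra = float(np.mean(contra_power))
    ipsi = float(np.mean(ipsi_power))
    # More negative values = stronger contralateral alpha suppression,
    # i.e., stronger anticipatory spatial orienting.
    return (contra - ipsi) / (contra + ipsi)

# Hypothetical power arrays (electrodes x time samples), arbitrary units.
rng = np.random.default_rng(0)
contra = rng.uniform(0.8, 1.0, size=(4, 250))
ipsi = rng.uniform(0.9, 1.1, size=(4, 250))
print(f"alpha lateralization index: {alpha_lateralization(contra, ipsi):+.3f}")
```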


2021 ◽  
Vol 12 ◽  
Author(s):  
Zelin Chen ◽  
Sarah D. McCrackin ◽  
Alicia Morgan ◽  
Roxane J. Itier

The gaze cueing effect is characterized by faster attentional orienting to a gazed-at than to a non-gazed-at target. This effect is often enhanced when the gazing face bears an emotional expression, though this finding is modulated by a number of factors. Here, we tested whether the type of task performed might be one such modulating factor. Target localization and target discrimination tasks are the two most commonly used gaze cueing tasks, and they arguably differ in the cognitive resources they require, which could impact how emotional expression and gaze cues are integrated to orient attention. In a within-subjects design, participants performed both target localization and discrimination gaze cueing tasks with neutral, happy, and fearful faces. The gaze cueing effect for neutral faces was greatly reduced in the discrimination task relative to the localization task, and the emotional enhancement of the gaze cueing effect was present only in the localization task and only when this task was performed first. These results suggest that cognitive resources are needed both for gaze cueing and for the integration of emotional expressions and gaze cues. We propose that a shift toward local processing may be the mechanism by which the discrimination task interferes with the emotional modulation of gaze cueing. The results support the idea that gaze cueing can be greatly modulated by top-down influences and cognitive resources and thus taps into endogenous attention. Results are discussed within the context of the recently proposed EyeTune model of social attention.


PLoS ONE ◽  
2021 ◽  
Vol 16 (2) ◽  
pp. e0246577 ◽  
Author(s):  
Ronda F. Lo ◽  
Andy H. Ng ◽  
Adam S. Cohen ◽  
Joni Y. Sasaki

We examined whether activating an independent or interdependent self-construal modulates attention shifting in response to group gaze cues. European Canadians (Study 1) and East Asian Canadians (Study 2) primed with independence vs. interdependence completed a multi-gaze cueing task with a central face gazing left or right, flanked by multiple background faces that either matched or mismatched the direction of the foreground gaze. Results showed that European Canadians (Study 1) mostly ignored the background gaze cues and were uninfluenced by the self-construal primes. However, East Asian Canadians (Study 2), who have cultural backgrounds relevant to both independence and interdependence, showed different attention patterns by prime: those primed with interdependence were more distracted by mismatched (vs. matched) background gaze cues, whereas there was no change for those primed with independence. These findings suggest that activating an interdependent self-construal modulates social attention mechanisms to attend more broadly, but only for those who may find these representations meaningful.


2019 ◽  
Author(s):  
Brianna Doherty ◽  
Frederik van Ede ◽  
Eva Zita Patai ◽  
Alex Fraser ◽  
Anna C. Nobre ◽  
...  

Social attention when viewing natural social (compared to non-social) images has functional consequences on contextual memory in healthy human adults. In addition to attention affecting memory performance, strong evidence suggests that memory in turn affects attentional orienting. Here we ask whether the effects of social processing on memory alter subsequent memory-guided attention orienting, and corresponding anticipatory dynamics of 8-12 Hz alpha-band oscillations as measured with EEG. Eighteen young adults searched for targets in scenes that contained either social or non-social distractors, and their memory precision was tested. Subsequently, reaction time was measured as participants oriented to targets appearing in those scenes at either valid (previously learned) locations or invalid (different) locations. Memory precision was poorer for target locations in social scenes. In addition, distractor type moderated the validity effect during memory-guided attentional orienting, with a larger cost in reaction time when targets appeared at invalid (different) locations within scenes with social distractors. The poorer memory performance was also marked by reduced anticipatory dynamics of spatially lateralized 8-12 Hz alpha-band oscillations for scenes with social distractors. The functional consequences of a social attention bias therefore extend from memory to memory-guided attention orienting, a bi-directional chain that may further reinforce attentional biases.


2020 ◽  
Author(s):  
Abdulaziz Abubshait ◽  
Patrick P. Weis ◽  
Eva Wiese

Social signals, such as changes in gaze direction, are essential cues to predict others’ mental states and behaviors (i.e., mentalizing). Studies show that humans can mentalize with non-human agents when they perceive a mind in them (i.e., mind perception). Robots that physically and/or behaviorally resemble humans likely trigger mind perception, which enhances the relevance of social cues and improves social-cognitive performance. The current experiments examine whether the effect of physical and behavioral influencers of mind perception on social-cognitive processing is modulated by the lifelikeness of a social interaction. Participants interacted with robots of varying degrees of physical (human-like vs. robot-like) and behavioral (reliable vs. random) human-likeness while the lifelikeness of a social attention task was manipulated across five experiments. The first four experiments manipulated lifelikeness via the physical realism of the robot images (Studies 1 and 2), the biological plausibility of the social signals (Study 3), and the plausibility of the social context (Study 4). They showed that human-like behavior affected social attention, whereas appearance affected mind perception ratings. However, when the lifelikeness of the interaction was increased by using videos of a human and a robot sending the social cues in a realistic environment (Study 5), social attention mechanisms were affected by both physical appearance and behavioral features, while mind perception ratings were mainly affected by physical appearance. This indicates that, in order to understand the effect of physical and behavioral features on social cognition, paradigms should be used that adequately simulate the lifelikeness of social interactions.

