Epistemic affordances in gestalt perception as well as in emotional facial expressions and gestures

2021 ◽  
Vol 43 (2) ◽  
pp. 179-198
Author(s):  
Klaus Schwarzfischer

Summary: Methodological problems often arise when a special case is confused with the general principle. You will find affordances only for 'artifacts' if you restrict the analysis to 'artifacts'. The general principle, however, is an 'invitation character' that triggers an action. Consequently, an action-theoretical approach, known in cognitive science as the 'pragmatic turn', is recommended. According to this approach, the human being is not a passive-receptive creature but actively produces those action effects that open up the world to us (through 'active inferences'). This 'ideomotor approach' focuses on so-called 'epistemic actions', which guide our perception as conscious and unconscious cognitions. Owing to 'embodied cognition', one's own body is assigned an indispensable role. The action-theoretical approach of 'enactive cognition' allows every form to be consistently processualized: each 'Gestalt' is understood as the process result of interlocking cognitions of 'forward modelling' (which produces anticipations and enables prognoses) and 'inverse modelling' (which generates hypotheses about genesis and causality). As can be shown, these cognitions are fed by previous experiences of real interaction, which later give way to mental trial action that is highly automated and can therefore take place unconsciously. Crucially, any object may have affordances that call for instrumental or epistemic action. In the simplest case, it is the body and the facial expressions of our counterpart that can be understood as a question and provoke an answer/reaction. Emotion is thus not only to be understood as expression/output according to the scheme 'input-processing-output', but itself acts as a provocative act/input. Consequently, artifacts are neither necessary nor sufficient conditions for affordances; rather, affordances exist in all areas of cognition, from enactive cognition to social cognition.

2021 ◽  
Vol 11 (9) ◽  
pp. 1203 ◽  
Author(s):  
Sara Borgomaneri ◽  
Francesca Vitale ◽  
Simone Battaglia ◽  
Alessio Avenanti

The ability to rapidly process others’ emotional signals is crucial for adaptive social interactions. However, to date it remains unclear how observing emotional facial expressions affects the reactivity of the human motor cortex. To provide insights into this issue, we employed single-pulse transcranial magnetic stimulation (TMS) to investigate corticospinal motor excitability. Healthy participants observed happy, fearful and neutral pictures of facial expressions while receiving TMS over the left or right motor cortex at 150 and 300 ms after picture onset. In the early phase (150 ms), we observed an enhancement of corticospinal excitability for the observation of happy and fearful emotional faces compared to neutral expressions, specifically in the right hemisphere. Interindividual differences in the disposition to experience aversive feelings (personal distress) in interpersonal emotional contexts predicted the early increase in corticospinal excitability for emotional faces. No differences in corticospinal excitability were observed at the later time point (300 ms) or in the left M1. These findings support the notion that emotion perception primes the body for action and highlight the role of the right hemisphere in implementing a rapid and transient facilitatory response to emotionally arousing stimuli, such as emotional facial expressions.


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
G. Ruggiero ◽  
M. Rapuano ◽  
A. Cartaud ◽  
Y. Coello ◽  
T. Iachini

Abstract: The space around the body serves a variety of crucial functions, first and foremost preserving one’s own safety and avoiding injury. Recent research has shown that emotional information, in particular threatening facial expressions, affects the regulation of peripersonal-reaching space (PPS, for action with objects) and interpersonal-comfort space (IPS, for social interaction). Here we explored whether emotional facial expressions affect the two spaces similarly or differently in terms of psychophysiological reactions (cardiac inter-beat intervals: IBIs, i.e. the inverse of heart rate; skin conductance response amplitude: SCR amplitude) and spatial distance. Using immersive virtual reality technology, participants determined reaching-distance (PPS) and comfort-distance (IPS) from virtual confederates exhibiting happy/angry/neutral facial expressions while being approached by them. During these interactions, spatial distance and psychophysiological reactions were recorded. Results revealed that when interacting with angry virtual confederates, the distance increased similarly in both comfort-social and reaching-action spaces. Moreover, interacting with virtual confederates exhibiting angry rather than happy or neutral expressions provoked similar psychophysiological activations (SCR amplitude, IBIs) in both spaces. Regression analyses showed that psychophysiological activations, particularly SCR amplitude in response to virtual confederates approaching with angry expressions, predicted the increase of PPS and IPS. These findings suggest that self-protection functions could be the expression of a common defensive mechanism shared by social and action spaces.


2020 ◽  
Author(s):  
Motonori Yamaguchi ◽  
Jack Dylan Moore ◽  
Sarah Hendry ◽  
Felicity Wolohan

The emotional basis of cognitive control has been investigated in the flanker task with various procedures and materials across different studies. The present study examined the issue with the same flanker task but with different types of emotional stimuli and designs. In seven experiments, the flanker effect and its sequential modulation according to the preceding trial type were assessed. Experiments 1 and 2 used affective pictures and emotional facial expressions as emotional stimuli, with positive and negative stimuli intermixed. There was little evidence that emotional stimuli influenced cognitive control. Experiments 3 and 4 used the same affective pictures and facial expressions, but positive and negative stimuli were separated between different participant groups. Emotional stimuli reduced the flanker effect as well as its sequential modulation, regardless of valence. Experiments 5 and 6 used affective pictures but manipulated arousal and valence of the stimuli orthogonally. The results did not replicate the reduced flanker effect or sequential modulation by valence, nor did they show consistent effects of arousal. Experiment 7 used a mood induction technique and showed that sequential modulation was positively correlated with valence rating (the higher, the more positive) but negatively correlated with arousal rating. These results are inconsistent with several previous findings and are difficult to reconcile within a single theoretical framework, confirming the elusive nature of the emotional basis of cognitive control in the flanker task.


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Ami Cohen ◽  
Kfir Asraf ◽  
Ivgeny Saveliev ◽  
Orrie Dan ◽  
Iris Haimov

Abstract: The ability to recognize emotions from facial expressions is essential to the development of complex social-cognitive behaviors, and impairments in this ability are associated with poor social competence. This study aimed to examine the effects of sleep deprivation on the processing of emotional facial expressions and nonfacial stimuli in young adults with and without attention-deficit/hyperactivity disorder (ADHD). Thirty-five men (mean age 25.4) with (n = 19) and without (n = 16) ADHD participated in the study. During the five days preceding the experimental session, the participants were required to sleep at least seven hours per night (23:00/24:00–7:00/9:00), and their sleep was monitored via actigraphy. On the morning of the experimental session, the participants completed a four-stimulus visual oddball task combining facial and nonfacial stimuli, and repeated it after 25 h of sustained wakefulness. At baseline, both study groups performed more poorly in response to facial than to nonfacial target stimuli on all indices of the oddball task, with no differences between the groups. Following sleep deprivation, rates of omission errors, commission errors and reaction time variability increased significantly in the ADHD group but not in the control group. Time and target type (face/non-face) did not have an interactive effect on any index of the oddball task. Young adults with ADHD are thus more sensitive to the negative effects of sleep deprivation on attentional processes, including those related to the processing of emotional facial expressions. As poor sleep and excessive daytime sleepiness are common in individuals with ADHD, it is plausible that poor sleep quality and quantity play an important role in the cognitive-functioning deficits, including deficits in the processing of emotional facial expressions, that are associated with ADHD.


2021 ◽  
pp. 174702182199299
Author(s):  
Mohamad El Haj ◽  
Emin Altintas ◽  
Ahmed A Moustafa ◽  
Abdel Halim Boudoukha

Future thinking, the ability to project oneself forward in time to pre-experience an event, is intimately associated with emotions. We investigated whether emotional future thinking can activate emotional facial expressions. We invited 43 participants to imagine future scenarios cued by the words “happy,” “sad,” and “city.” Future thinking was video recorded and analysed with facial analysis software to classify whether participants’ facial expressions (i.e., happy, sad, angry, surprised, scared, disgusted, or neutral) were neutral or emotional. The analysis demonstrated higher levels of happy facial expressions during future thinking cued by the word “happy” than by “sad” or “city.” In contrast, higher levels of sad facial expressions were observed during future thinking cued by the word “sad” than by “happy” or “city,” and higher levels of neutral facial expressions were observed during future thinking cued by the word “city” than by “happy” or “sad.” In all three conditions, levels of neutral facial expressions were high compared with happy and sad facial expressions. Together, emotional future thinking, at least for future scenarios cued by “happy” and “sad,” seems to trigger the corresponding facial expression. Our study provides an original physiological window into the subjective emotional experience during future thinking.


2017 ◽  
Vol 29 (5) ◽  
pp. 1749-1761 ◽  
Author(s):  
Johanna Bick ◽  
Rhiannon Luyster ◽  
Nathan A. Fox ◽  
Charles H. Zeanah ◽  
Charles A. Nelson

Abstract: We examined facial emotion recognition in 12-year-olds in a longitudinally followed sample of children with and without exposure to early-life psychosocial deprivation (institutional care). Half of the institutionally reared children were randomized into foster care homes during the first years of life. Facial emotion recognition was examined in a behavioral task using morphed images; the same task had been administered when the children were 8 years old. Neutral facial expressions were morphed with happy, sad, angry, and fearful emotional facial expressions, and children were asked to identify the emotion of each face, which varied in intensity. Consistent with our previous report, we show that some areas of emotion processing, involving the recognition of happy and fearful faces, are affected by early deprivation, whereas other areas, involving the recognition of sad and angry faces, appear to be unaffected. We also show that early intervention can have a lasting positive impact, normalizing developmental trajectories of processing negative emotions (fear) into the late-childhood/preadolescent period.


2017 ◽  
Vol 172 ◽  
pp. 10-18 ◽  
Author(s):  
Gerly Tamm ◽  
Kairi Kreegipuu ◽  
Jaanus Harro ◽  
Nelson Cowan
