neutral face
Recently Published Documents


TOTAL DOCUMENTS

84
(FIVE YEARS 37)

H-INDEX

13
(FIVE YEARS 2)

2022 ◽  
Vol 12 ◽  
Author(s):  
Marta F. Nudelman ◽  
Liana C. L. Portugal ◽  
Izabela Mocaiber ◽  
Isabel A. David ◽  
Beatriz S. Rodolpho ◽  
...  

Background: Evidence indicates that the processing of facial stimuli may be influenced by incidental factors, and these influences are particularly powerful when facial expressions are ambiguous, as neutral faces are. However, little research has investigated whether emotional contextual information presented in a preceding, unrelated experiment can carry over to modulate neutral face processing in a later experiment.
Objective: The present study investigates whether an emotional text presented in a first experiment can generate negative emotion toward neutral faces in a second experiment unrelated to the first.
Methods: Ninety-nine students (all women) were randomly assigned to read and evaluate either a negative text (negative context) or a neutral text (neutral context) in the first experiment. In the subsequent second experiment, the participants performed two tasks: (1) an attentional task in which neutral faces were presented as distractors and (2) an emotional-judgment task on neutral faces.
Results: Compared to the neutral context, participants in the negative context rated more faces as negative. No significant effect was found in the attentional task.
Conclusion: Our study demonstrates that incidental emotional information available in a previous experiment can increase participants' propensity to interpret neutral faces as more negative when emotional information is directly evaluated. The present study thus adds important evidence to the literature that our judgments and emotions are modulated by prior information acquired incidentally, with little awareness, much as occurs in everyday life.


2021 ◽  
Author(s):  
◽  
Sara C. Moshenrose

Previous research has shown that there may be an association between affect (negative vs. positive) and the vertical position (up vs. down) of stimuli. This research aimed to investigate whether individuals show spatial biases, either up or down, when asked to respond to neutral targets after seeing valenced faces, and what impact manipulating automatic facial mimicry responses would have on response times. The research was conducted over three experiments. In Experiment 1, participants responded to neutral targets in either high or low vertical positions on a computer screen, preceded by happy and sad schematic faces. There were two facial-manipulation conditions: one group held a straw between their lips to inhibit smiling, and another group held a straw between their teeth to facilitate smiling. A third group performed the response task without a straw (control condition). The procedure of Experiment 2 was identical to Experiment 1 except that the happy and sad schematic faces had additional internal facial features (noses, eyebrows) that varied across trials. In both Experiments 1 and 2, targets preceded by a happy face were responded to significantly faster. In Experiment 3, the procedure was identical to Experiments 1 and 2, except that photographic images of happy, neutral, and sad expressions were used. Participants were significantly faster to respond to targets in the high vertical position, and faster in the control (no straw) condition than in the two straw conditions. In the smiling-inhibition condition, participants were faster to respond to targets in the high vertical position than the low vertical position after seeing a happy or neutral face. These findings indicate that there may be an association between valenced faces and vertical selective attention that is consistent with orientational metaphors (positive = up), but further research is needed to clarify this.




Author(s):  
Tingji Chen ◽  
Yanting Sun ◽  
Chengzhi Feng ◽  
Wenfeng Feng

Abstract. Emotional signals from the face and body are normally perceived as an integrated whole in everyday life. Previous studies have revealed an incongruence effect: distinctive behavioral and neural responses to emotionally congruent versus incongruent face-body compounds. However, it remains unknown which kind of face-body compound drives this effect. In the present study, we added neutral face and neutral body stimuli to form new face-body compounds. Forty subjects with normal or corrected-to-normal vision participated in the experiment. By comparing face-body compounds with emotional conflict against compounds containing neutral stimuli, we could trace the source of the incongruence effect. For both behavioral and event-related potential (ERP) data, a 2 (bodily expression: happiness, fear) × 2 (congruence: congruent, incongruent) repeated-measures analysis of variance (ANOVA) was performed to re-examine the incongruence effect, and a 3 (facial expression: fearful, happy, neutral) × 3 (bodily expression: fearful, happy, neutral) repeated-measures ANOVA was performed to clarify its source. As expected, both the behavioral and the ERP results replicated the incongruence effect. Specifically, the behavioral data showed that emotionally congruent face-body compounds were recognized more accurately than incongruent ones (p < .05). The N2 component was modulated by the emotional congruence between the facial and bodily expressions: emotionally incongruent compounds elicited greater N2 amplitudes than congruent compounds (p < .05). No incongruence effect was found for the P1 or P3 components (p = .079 and p = .99, respectively). Furthermore, by comparing the emotionally incongruent pairs with the neutral baseline, the present study suggests that the source of the incongruence effect may be the happy face-fearful body compounds. We speculate that the emotion expressed by the fearful body was much more intense than that expressed by the happy body and thus interfered more strongly with judging the facial expressions.


2021 ◽  
Vol 4 (1) ◽  
Author(s):  
Mohammad Rafayet Ali ◽  
Taylor Myers ◽  
Ellen Wagner ◽  
Harshil Ratnu ◽  
E. Ray Dorsey ◽  
...  

Abstract. A prevalent symptom of Parkinson's disease (PD) is hypomimia, i.e., reduced facial expressiveness. In this paper, we present a method for diagnosing PD that utilizes the study of micro-expressions. We analyzed the facial action units (AUs) from 1812 videos of 604 individuals (61 with PD and 543 without, mean age 63.9 years, SD 7.8) collected online through a web-based tool (www.parktest.net). In these videos, participants were asked to make three facial expressions (a smiling, disgusted, and surprised face) followed by a neutral face. Using techniques from computer vision and machine learning, we objectively measured the variance of the facial muscle movements and used it to distinguish between individuals with and without PD. The prediction accuracy using the facial micro-expressions was comparable to that of methodologies that utilize motor symptoms. Logistic regression analysis revealed that participants with PD had less variance in AU6 (cheek raiser), AU12 (lip corner puller), and AU4 (brow lowerer) than non-PD individuals. An automated classifier using a support vector machine trained on these variances achieved 95.6% accuracy. Facial expressions could thus become a transformative digital biomarker for PD, particularly for patients in need of remote diagnosis due to physical separation (e.g., during COVID-19) or immobility.
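The pipeline described in the abstract above, per-AU variance features feeding a binary classifier, can be sketched in a few lines. This is an illustrative reconstruction under stated assumptions, not the authors' code: the AU intensity series would in practice come from a video-analysis toolkit such as OpenFace, and a dependency-free logistic regression stands in for the paper's support vector machine.

```python
import math

def au_variance_features(au_series):
    """Per-AU variance of intensity over a video's frames.
    au_series: dict mapping AU name -> list of per-frame intensities
    (in practice extracted by a tool such as OpenFace)."""
    feats = []
    for au in sorted(au_series):
        xs = au_series[au]
        m = sum(xs) / len(xs)
        feats.append(sum((x - m) ** 2 for x in xs) / len(xs))
    return feats

def train_logreg(X, y, lr=0.5, epochs=500):
    """Stochastic-gradient logistic regression; a stand-in for the
    paper's SVM so the sketch needs no external libraries."""
    w, b = [0.0] * len(X[0]), 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = b + sum(wj * xj for wj, xj in zip(w, xi))
            g = 1.0 / (1.0 + math.exp(-z)) - yi  # prediction error
            b -= lr * g
            w = [wj - lr * g * xj for wj, xj in zip(w, xi)]
    return w, b

def predict(w, b, x):
    """1 = PD-like (low AU variance), 0 = control."""
    return 1 if b + sum(wj * xj for wj, xj in zip(w, x)) > 0 else 0
```

With hypomimia, PD recordings should yield systematically smaller variances for AU4, AU6, and AU12, so even this linear stand-in separates the two groups on synthetic data; the reported 95.6% figure, of course, comes from the authors' SVM on real recordings.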


Children ◽  
2021 ◽  
Vol 8 (9) ◽  
pp. 774
Author(s):  
Malgorzata Kostecka ◽  
Joanna Kostecka-Jarecka ◽  
Mariola Kowal ◽  
Izabella Jackowska

Children develop food preferences by coming into direct contact with various food products through the senses of taste, touch, sight and smell. The aim of this study was to analyze the food preferences of children aged 4 to 6 years and to determine whether age and gender influence children's food preferences and whether the preference for sweet taste changes with age. The study involved a paper questionnaire containing images of 115 different food products and dishes; the respondents expressed their preferences by choosing the appropriate emoji (happy, sad or neutral face). The study was conducted between 2018 and 2020 and involved 684 children from 10 kindergartens. Girls chose a significantly higher number of foods and dishes they liked than boys did (p = 0.002), and 4-year-olds gave a higher number of "neutral" responses than 5- and 6-year-olds (p = 0.001). Dietary diversity increased with age, and younger children were familiar with fewer foods than 6-year-olds (p = 0.002). Children had a clear preference for sweet taste, regardless of age and gender. Young children (4-year-olds) were more likely to accept healthy foods despite being familiar with fewer products and dishes.


Leonardo ◽  
2021 ◽  
pp. 1-11
Author(s):  
Ana Jofre

Abstract. I present a method for simulating facial character development by accumulating an expressive history onto a face. The model analytically combines facial features from Paul Ekman's seven universal facial expressions using a simple Markov chain algorithm. The output is a series of 3D digital faces created in Blender with Python. The results show that systematically imprinting features from emotional expressions onto a neutral face transforms it into one with distinct character. The method could be applied to creative works that depend on character creation, ranging from figurative sculpture to game design, and allows the creator to incorporate chance into the creative process. I demonstrate the sculpture application in this paper with ceramic casts of the generated faces.
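The generative loop the abstract describes, walking a Markov chain over Ekman's seven expressions and imprinting each visited expression onto a neutral base face, can be sketched as follows. This is a hedged illustration, not the paper's Blender code: the uniform transition matrix, the 0.05 imprint weight, and the particular expression labels are all assumptions; in Blender, the accumulated offsets would drive shape keys on the neutral mesh.

```python
import random

EXPRESSIONS = ["anger", "contempt", "disgust", "fear",
               "happiness", "sadness", "surprise"]

def expressive_history(steps, rng):
    """Walk a Markov chain over the expressions. A uniform transition
    matrix is assumed here; the paper's matrix is not specified."""
    n = len(EXPRESSIONS)
    matrix = [[1.0 / n] * n for _ in range(n)]
    state = rng.randrange(n)
    history = []
    for _ in range(steps):
        history.append(EXPRESSIONS[state])
        state = rng.choices(range(n), weights=matrix[state])[0]
    return history

def imprint(history, weight=0.05):
    """Accumulate a small per-expression offset for each visit, capped
    at full expression. In Blender these offsets would set shape-key
    values on a neutral base face."""
    offsets = {e: 0.0 for e in EXPRESSIONS}
    for e in history:
        offsets[e] = min(1.0, offsets[e] + weight)
    return offsets
```

Each run of the chain yields a different accumulated "character", which is the chance element the abstract refers to.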


2021 ◽  
Vol 12 ◽  
Author(s):  
Marc A. Nordmann ◽  
Ralf Schäfer ◽  
Tobias Müller ◽  
Matthias Franz

Facial mimicry is the automatic tendency to imitate facial expressions of emotion. Alexithymia is associated with a reduced ability to mimic the affect expressions of adults, and there is evidence that the baby schema may influence this process. In this study it was tested experimentally whether the facial mimicry of an alexithymic group (AG) differs from that of a control group (CG) in response to dynamic facial affect expressions of children and adults. A multi-method approach (20-item Toronto Alexithymia Scale and Toronto Structured Interview for Alexithymia) was used to assess levels of alexithymia. From 3503 initial data sets, two groups of 38 high- and low-alexithymic individuals without relevant mental or physical diseases were matched for age, gender, and education. Facial mimicry was induced by presenting naturalistic affect-expressive video sequences (fear, sadness, disgust, anger, and joy) taken from validated face sets of adults (Averaged Karolinska Directed Emotional Faces) and children (Picture-Set of Young Children's Affective Facial Expressions). The videos started with a neutral face and reached maximum affect expression within 2 s. Group responses were measured as facial electromyographic (fEMG) activity of the corrugator supercilii and zygomaticus major muscles. Differences in the fEMG response (4000 ms) were tested in an analysis-of-variance model. There was one significant main effect for the factor emotion and four interaction effects: group × age, muscle × age, muscle × emotion, and the triple interaction muscle × age × emotion. The AG participants showed decreased fEMG activity in response to the presented faces of adults compared to the CG, but not to the faces of children. The affect-expressive faces of children induced enhanced zygomatic and reduced corrugator muscle activity in both groups. Despite existing deficits in the facial mimicry of alexithymic persons, affect-expressive faces of children thus seem to trigger a stronger positive emotional involvement even in the AG.


The present study consists of two separate experiments focusing on how labelling face images influences sexual attraction. In Study 1, 30 gender-neutral face photographs were shown to the participants; 407 participants took part, 278 (68%) women and 129 (32%) men. In Study 2, 75% feminine face photos were shown to male participants and 75% masculine face photos were shown to female participants, with 30 photos shown to each participant. A total of 282 participants, 151 (54%) women and 131 (46%) men, took part in the second study. In both studies, some participants were told that the photographs belonged to men, while others were told that they belonged to women, and the questions about the photographs from the first study were repeated. A Kruskal-Wallis H test was conducted to investigate whether the responses changed according to the gender label of the photos. The results show that the perception of attraction changes not only with physical features but also with the gender label of the body to which those features belong. The findings are discussed with reference to evolutionary and social constructionist approaches and the concept of sexual fluidity.
Keywords: Sexuality, Facial Attractiveness, Sexual Attractiveness, Sexual Fluidity
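The Kruskal-Wallis H test used above compares mean ranks across independent groups without assuming normality. A minimal stdlib sketch of the statistic (midranks for ties, but without the tie-correction term that packages such as scipy.stats.kruskal apply) might look like:

```python
def kruskal_wallis_h(*groups):
    """Kruskal-Wallis H statistic with midranks for ties
    (no tie-correction term; illustrative only)."""
    pooled = sorted(x for g in groups for x in g)
    n = len(pooled)
    ranks = {}  # value -> average rank (midrank for tied values)
    i = 0
    while i < n:
        j = i
        while j < n and pooled[j] == pooled[i]:
            j += 1
        ranks[pooled[i]] = (i + 1 + j) / 2  # mean of ranks i+1 .. j
        i = j
    # H = 12 / (N(N+1)) * sum(R_i^2 / n_i) - 3(N+1)
    total = sum(sum(ranks[x] for x in g) ** 2 / len(g) for g in groups)
    return 12.0 / (n * (n + 1)) * total - 3 * (n + 1)
```

Fully separated groups such as [1, 2, 3] and [4, 5, 6] give H of about 3.86, while identical groups give H = 0; the statistic is then referred to a chi-squared distribution with k - 1 degrees of freedom to obtain the p-value.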


Author(s):  
Daniela Ruzzante ◽  
Bianca Monachesi ◽  
Noemi Orabona ◽  
Jeroen Vaes

Abstract. Sexual objectification, perceiving or treating a woman as a sexual object, is a widespread phenomenon. Studies on sexual objectification and its consequences have grown dramatically over the last decades, covering multiple and diverse areas of research. However, research on sexual objectification may have limited internal and external validity due to the lack of a controlled, standardized picture database, and there is a need to extend this research to other fields, including the study of emotions. Therefore, in this paper we introduce the SOBEM Database, a free tool consisting of 280 high-resolution pictures depicting objectified and non-objectified female models expressing a neutral face and three different emotions (happiness, anger, and sadness) at different intensities. We report the validation of this dataset, analyzing the results of 134 participants who judged the pictures on the six basic emotions and on a range of social judgments related to sexual objectification. The results show that the SOBEM Database can serve as an appropriate instrument for studying both sexual objectification per se and its relation to emotions. This database could therefore improve experimental control in future studies on sexual objectification and create new links with different fields of research.

