Facial Expressions Depicting Compassionate and Critical Emotions: The Development and Validation of a New Emotional Face Stimulus Set

PLoS ONE ◽  
2014 ◽  
Vol 9 (2) ◽  
pp. e88783 ◽  
Author(s):  
Kirsten McEwan ◽  
Paul Gilbert ◽  
Stephane Dandeneau ◽  
Sigrid Lipka ◽  
Frances Maratos ◽  
...  
Author(s):  
Michela Balconi

Neuropsychological studies have highlighted distinct brain correlates dedicated to analyzing facial expressions of emotion. Some cerebral circuits have been observed to be specific for emotional face comprehension, as a function of conscious vs. unconscious processing of emotional information. Moreover, the emotional content of faces (i.e., positive vs. negative; more or less arousing) may activate specific cortical networks. Among other findings, recent studies have explained the contribution of the two hemispheres to face comprehension, as a function of the type of emotion (mainly the positive vs. negative distinction) and of the specific task (comprehending vs. producing facial expressions). Specifically, an overview of ERP (event-related potential) analyses is proposed in order to understand how a face may be processed by an observer and how the observer can make the face a meaningful construct even in the absence of awareness. Finally, brain oscillations are considered in order to explain the synchronization of neural populations in response to emotional faces when conscious vs. unconscious processing is activated.


Author(s):  
Walter Mahler ◽  
Sandra Reder

Twenty-one adults looked at emotional (sad, happy, fearful) or neutral faces. EEG measures showed that the emotional significance of the face (stimulus type) modulated EEG amplitude, especially theta- and delta-band power. In addition, emotional discrimination in the theta band was more distributed over posterior scalp sites for the emotional stimuli. This frequency-band variation could therefore reflect a complex set of cognitive processes whereby selective attention becomes focused on an emotionally relevant stimulus.
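The band-power measure analyzed in this study can be computed from a power spectral density. A minimal sketch, assuming synthetic single-channel EEG and an illustrative 250 Hz sampling rate (neither taken from the study), using SciPy's Welch estimator:

```python
import numpy as np
from scipy.signal import welch

fs = 250  # sampling rate in Hz (illustrative assumption)
rng = np.random.default_rng(1)
t = np.arange(0, 10, 1 / fs)
# Synthetic EEG: white noise plus a 6 Hz (theta-band) oscillation.
eeg = rng.standard_normal(t.size) + 2.0 * np.sin(2 * np.pi * 6 * t)

# Welch power spectral density with 2 s segments (0.5 Hz resolution).
freqs, psd = welch(eeg, fs=fs, nperseg=2 * fs)

def band_power(freqs, psd, lo, hi):
    """Approximate power in [lo, hi) Hz by summing PSD bins times bin width."""
    df = freqs[1] - freqs[0]
    mask = (freqs >= lo) & (freqs < hi)
    return psd[mask].sum() * df

theta = band_power(freqs, psd, 4, 8)    # theta band
delta = band_power(freqs, psd, 0.5, 4)  # delta band
print(theta > delta)  # the injected 6 Hz rhythm makes theta dominate
```

Comparing such band powers across stimulus types (emotional vs. neutral) and scalp sites is the kind of analysis the abstract describes.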


2018 ◽  
Vol 270 ◽  
pp. 1059-1067 ◽  
Author(s):  
May I. Conley ◽  
Danielle V. Dellarco ◽  
Estee Rubien-Thomas ◽  
Alexandra O. Cohen ◽  
Alessandra Cervera ◽  
...  

Pain ◽  
2014 ◽  
Vol 155 (11) ◽  
pp. 2282-2290 ◽  
Author(s):  
Joseph Walsh ◽  
Christopher Eccleston ◽  
Edmund Keogh

2014 ◽  
Vol 47 (2) ◽  
pp. 562-570 ◽  
Author(s):  
Adam Naples ◽  
Alyssa Nguyen-Phuc ◽  
Marika Coffman ◽  
Anna Kresse ◽  
Susan Faja ◽  
...  

2017 ◽  
Vol 26 (1) ◽  
pp. e1553 ◽  
Author(s):  
Nicole R. Giuliani ◽  
John C. Flournoy ◽  
Elizabeth J. Ivie ◽  
Arielle Von Hippel ◽  
Jennifer H. Pfeifer

2021 ◽  
Author(s):  
Kristina Safar

Experience is suggested to shape the development of emotion processing abilities in infancy. The current dissertation investigated the influence of familiarity with particular face types and emotional faces on emotional face processing within the first year of life using a variety of metrics. The first study examined whether experience with a particular face type (own- vs. other-race faces) affected 6- and 9-month-old infants’ attentional looking preference for fearful facial expressions in a visual paired-comparison (VPC) task. Six-month-old infants showed an attentional preference for fearful over happy facial expressions when expressed by own-race faces, but not other-race faces, whereas 9-month-old infants showed an attentional preference for fearful expressions when expressed by both own-race and other-race faces, suggesting that experience influences how infants deploy their attention to different facial expressions. Using a longitudinal design, the second study examined whether exposure to emotional faces via picture-book training at 3 months of age affected infants’ allocation of attention to fearful over happy facial expressions in both a VPC and an ERP task at 5 months of age. In the VPC task, 3- and 5-month-olds without exposure to emotional faces demonstrated greater allocation of attention to fearful facial expressions. Differential exposure to emotional faces revealed a potential effect of training: 5-month-old infants who experienced fearful faces showed an attenuated preference for fearful facial expressions compared to infants who experienced happy faces or no training. Three- and 5-month-old infants did not, however, show differential neural processing of happy and fearful facial expressions. The third study examined whether 5- and 7-month-old infants can match fearful and happy faces and voices in an intermodal preference task, and whether exposure to happy or fearful faces influences this ability. Neither 5- nor 7-month-old infants showed intermodal matching of happy or fearful facial expressions, regardless of exposure to emotional faces. Overall, results from this series of studies add to our understanding of how experience influences the development of emotional face processing in infancy.


2020 ◽  
Vol 34 (5) ◽  
pp. 585-594
Author(s):  
Shivangi Anthwal ◽  
Dinesh Ganotra

Facial expressions are the most preeminent means of conveying one’s emotions and play a significant role in interpersonal communication. Researchers are pursuing the goal of endowing machines with the ability to interpret emotions from facial expressions, as this will make human-computer interaction more efficient. With the objective of effective affect recognition from visual information, we present two dynamic descriptors that can recognise seven principal emotions. The variables of the appearance-based descriptor, FlowCorr, indicate intra-class similarity and inter-class difference by quantifying the degree of correlation between the optical flow associated with the image pair and each pre-designed template describing the motion pattern associated with a different expression. The second, shape-based descriptor, dyn-HOG, computes the HOG values of the difference image obtained by subtracting the neutral face from the emotional face, and is demonstrated to be more discriminative than previously used static HOG descriptors for classifying facial expressions. Recognition accuracies obtained with a multi-class support vector machine on the CK+ and KDEF-dyn datasets are competitive with the results of state-of-the-art techniques and with empirical analyses of human emotion cognition.
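The shape-based dyn-HOG idea above can be sketched as: compute HOG features on the difference image (emotional face minus neutral face) and feed them to a multi-class SVM. A minimal sketch using scikit-image and scikit-learn on synthetic stand-in images; the image size, cell/block parameters, and toy "motion pattern" are illustrative assumptions, not the authors' actual configuration:

```python
import numpy as np
from skimage.feature import hog
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def dyn_hog(neutral, emotional):
    """HOG descriptor of the difference image (emotional - neutral)."""
    diff = emotional.astype(float) - neutral.astype(float)
    return hog(diff, orientations=9, pixels_per_cell=(8, 8),
               cells_per_block=(2, 2))

# Synthetic "dataset": 7 emotion classes, a few samples each.
X, y = [], []
for label in range(7):
    for _ in range(5):
        neutral = rng.random((64, 64))
        # Add a class-dependent pattern standing in for expression motion.
        emotional = neutral + 0.5 * np.sin(
            np.linspace(0, (label + 1) * np.pi, 64))[None, :]
        X.append(dyn_hog(neutral, emotional))
        y.append(label)

# Multi-class SVM (one-vs-rest) on the dyn-HOG features.
clf = SVC(kernel="linear", decision_function_shape="ovr").fit(X, y)
print(clf.score(X, y))  # training accuracy on the toy data
```

On real data the neutral/emotional pairs would come from expression sequences (e.g., first vs. peak frame), and accuracy would be assessed on a held-out test split rather than the training set.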


2021 ◽  
Vol 112 ◽  
pp. 106643
Author(s):  
Richard J. Macatee ◽  
Meghan Carr ◽  
Kaveh Afshar ◽  
Thomas J. Preston
