emotional face
Recently Published Documents


TOTAL DOCUMENTS: 362 (FIVE YEARS: 109)
H-INDEX: 39 (FIVE YEARS: 5)

Assessment ◽ 2022 ◽ pp. 107319112110680
Author(s): Trevor F. Williams, Niko Vehabovic, Leonard J. Simms

Facial emotion recognition (FER) tasks often use digitally altered faces to vary expression intensity; however, such tasks have unknown psychometric properties. In these studies, an FER task was developed and validated—the Graded Emotional Face Task (GEFT)—which provided an opportunity to examine the psychometric properties of such tasks. Facial expressions were altered to produce five intensity levels for six emotions (e.g., 40% anger). In Study 1, 224 undergraduates viewed subsets of these faces and labeled the expressions. An item selection algorithm was used to maximize internal consistency and balance gender and ethnicity. In Study 2, 219 undergraduates completed the final GEFT and a multimethod battery of validity measures. Finally, in Study 3, 407 undergraduates oversampled for borderline personality disorder (BPD) completed the GEFT and a self-report BPD measure. Broad FER scales (e.g., overall anger) demonstrated evidence of reliability and validity; however, more specific subscales (e.g., 40% anger) had more variable psychometric properties. Notably, ceiling/floor effects appeared both to decrease internal consistency and to limit external validity correlations. The findings are discussed from the perspective of measurement issues in the social cognition literature.
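The item-selection step is only named in the abstract. As a rough, hypothetical sketch of how a greedy selection could maximize internal consistency (Cronbach's alpha), the Python below drops items one at a time; the function names and the greedy strategy are assumptions, and the gender/ethnicity balancing constraints described by the authors are omitted.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

def greedy_select(items: np.ndarray, n_keep: int) -> list[int]:
    """Illustrative only: repeatedly drop the item whose removal most
    improves alpha until n_keep items remain."""
    keep = list(range(items.shape[1]))
    while len(keep) > n_keep:
        alphas = [cronbach_alpha(items[:, [j for j in keep if j != i]]) for i in keep]
        del keep[int(np.argmax(alphas))]
    return keep
```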


Author(s): Tiana Borgers, Marla Kürten, Anna Kappelhoff, Verena Enneking, Anne Möllmann, ...

2021 ◽ pp. 114355
Author(s): Bastian Hillmann, Agnieszka Zuberer, Lena Obermeyer, Michael Erb, Klaus Scheffler, ...

2021 ◽ Vol 11 (9) ◽ pp. 1195
Author(s): Rosalind Hutchings, Romina Palermo, Jessica L. Hazelton, Olivier Piguet, Fiona Kumfor

Face processing relies on a network of occipito-temporal and frontal brain regions. Temporal regions are heavily involved in looking at and processing emotional faces; however, the contribution of each hemisphere to this process remains under debate. Semantic dementia (SD) is a rare neurodegenerative brain condition characterised by anterior temporal lobe atrophy, which is either predominantly left-lateralised (left-SD) or right-lateralised (right-SD). This syndrome therefore provides a unique lesion model to understand the role of laterality in emotional face processing. Here, we investigated facial scanning patterns in 10 left-SD and 6 right-SD patients, compared to 22 healthy controls. Eye tracking was recorded via a remote EyeLink 1000 system, while participants passively viewed fearful, happy, and neutral faces over 72 trials. Analyses revealed that right-SD patients had more fixations to the eyes than controls in the Fear condition only (p = 0.04). Right-SD patients also showed more fixations to the eyes than left-SD patients in all conditions: Fear (p = 0.01), Happy (p = 0.008), and Neutral (p = 0.04). In contrast, no differences between controls and left-SD patients were observed for any emotion. No group differences were observed for fixations to the mouth or to the whole face. This study is the first to examine patterns of facial scanning in left- versus right-SD, demonstrating a greater focus on the eyes in right-SD. Neuroimaging analyses showed that degradation of the right superior temporal sulcus was associated with increased fixations to the eyes. Together, these results suggest that right-lateralised brain regions of the face processing network are involved in the ability to efficiently utilise changeable cues from the face.
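As a rough illustration of the fixation analysis described above, the sketch below computes the proportion of fixations that fall inside a rectangular eye area of interest (AOI). The function name, AOI coordinates, and simulated fixation data are hypothetical; the study's actual EyeLink processing pipeline and statistical models are not reproduced here.

```python
import numpy as np

def prop_fixations_in_aoi(fixations: np.ndarray, aoi: tuple[float, float, float, float]) -> float:
    """Proportion of fixations whose (x, y) falls inside a rectangular AOI.

    fixations: array of shape (n_fixations, 2) with screen coordinates.
    aoi: (x_min, y_min, x_max, y_max) bounding box, e.g. around the eyes.
    """
    x_min, y_min, x_max, y_max = aoi
    inside = (
        (fixations[:, 0] >= x_min) & (fixations[:, 0] <= x_max)
        & (fixations[:, 1] >= y_min) & (fixations[:, 1] <= y_max)
    )
    return float(inside.mean()) if len(fixations) else 0.0

# Illustrative use with simulated fixation coordinates (not the study's data)
rng = np.random.default_rng(0)
fixations = rng.uniform(0, 800, size=(50, 2))   # 50 fixations on a hypothetical 800x800 face image
eye_aoi = (200.0, 250.0, 600.0, 350.0)          # hypothetical eye-region bounding box
print(f"Proportion of fixations on the eyes: {prop_fixations_in_aoi(fixations, eye_aoi):.2f}")
```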

