Affective Face Processing Modified by Different Tastes

2021 ◽  
Vol 12 ◽  
Author(s):  
Pei Liang ◽  
Jiayu Jiang ◽  
Jie Chen ◽  
Liuqing Wei

Facial emotion recognition is something we use often in our daily lives. How does the brain process the face search? Can taste modify such a process? This study employed two tastes (sweet and acidic) to investigate the cross-modal interaction between taste and emotional face recognition. Behavioral responses (reaction times and correct response ratios) and event-related potentials (ERPs) were analyzed to examine the interaction between taste and face processing. The behavioral data showed that when detecting a negative target face with a positive face as a distractor, participants performed the task faster with an acidic taste than with a sweet one. No interaction effect was observed in the correct response ratios. In the ERP results, the early (P1, N170) and mid-stage [early posterior negativity (EPN)] components showed that sweet and acidic tastes modified the affective face search process; no interaction effect was observed in the late-stage (LPP) component. Our data extend the understanding of the cross-modal mechanism and provide electrophysiological evidence that affective face processing can be influenced by sweet and acidic tastes.
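The ERP components discussed above (P1, N170, EPN, LPP) are obtained by averaging many stimulus-locked EEG epochs and then measuring amplitude and latency within a predefined time window. A minimal sketch of that averaging-and-peak-measurement procedure, using simulated single-trial data rather than any dataset from these studies (all numbers here are illustrative assumptions):

```python
import numpy as np

# Simulated epoch: -100 ms to 500 ms around stimulus onset, 1 kHz sampling
fs = 1000
t = np.arange(-0.1, 0.5, 1 / fs)

# Hypothetical single trials: a negative deflection peaking at 170 ms
# (an N170-like component) buried in Gaussian noise on each trial
rng = np.random.default_rng(0)
n_trials = 40
component = -4e-6 * np.exp(-((t - 0.170) ** 2) / (2 * 0.015 ** 2))
epochs = component + rng.normal(0, 2e-6, size=(n_trials, t.size))

# Averaging across trials cancels trial-by-trial noise, leaving the ERP
erp = epochs.mean(axis=0)

# Measure the component as the most negative peak in a 140-200 ms window,
# a common way to quantify N170 amplitude and latency
win = (t >= 0.140) & (t <= 0.200)
peak_idx = np.argmin(erp[win])
peak_latency = t[win][peak_idx]     # close to the simulated 170 ms peak
peak_amplitude = erp[win][peak_idx]  # negative, near the simulated -4 microvolts
```

Condition effects like those reported above (e.g., acidic vs. sweet taste) would then be tested by comparing such windowed amplitudes across conditions and participants.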

2005 ◽  
Vol 62 (1) ◽  
pp. 9-17
Author(s):  
Olga Yaqob

Media coverage of Iraq generally has overlooked the daily lives of ordinary Iraqis. In all the wars Iraq has endured since 1980, we have lost sight of human faces. Every nation is its people, not merely its geographic territory, and these people are all made in the image of God. The illustrations accompanying this article include both images of Iraq's geography (the land) and an image, in the shape of Iraq, formed out of the faces of many different ordinary Iraqi people, from all different religious and geographical areas of the country. In the center of this image is the face of Jesus on the cross. In the suffering of the Iraqi people, I have seen the face of God.


PLoS ONE ◽  
2020 ◽  
Vol 15 (12) ◽  
pp. e0243929
Author(s):  
Siyu Jiang ◽  
Ming Peng ◽  
Xiaohui Wang

It has been widely accepted that moral violations that involve impurity (such as spitting in public) induce the emotion of disgust, but there has been a debate about whether moral violations that do not involve impurity (such as swearing in public) induce the same emotion. The answer to this question may have implications for understanding where morality comes from and how people make moral judgments. This study aimed to compare the neural mechanisms underlying the two kinds of moral violation by using an affective priming task to test the effect of sentences depicting moral violations with and without physical impurity on the subsequent detection of disgusted faces in a visual search task. After reading each sentence, participants completed the face search task. Behavioral and electrophysiological (event-related potential, or ERP) indices of affective priming (P2, N400, LPP) and attention allocation (N2pc) were analyzed. The behavioral and ERP data showed that moral violations both with and without impurity promoted the detection of disgusted faces (RT, N2pc), whereas moral violations without impurity impeded the detection of neutral faces (N400). No priming effect was found on P2 or LPP. The results suggest that both types of moral violation influenced the processing of disgusted and neutral faces, but with different temporal patterns of neural activity.


2010 ◽  
Vol 121 ◽  
pp. S203
Author(s):  
M. Tada ◽  
K. Kirihara ◽  
T. Araki ◽  
Y. Kawakubo ◽  
T. Onitsuka ◽  
...  

2007 ◽  
Vol 19 (11) ◽  
pp. 1815-1826 ◽  
Author(s):  
Roxane J. Itier ◽  
Claude Alain ◽  
Katherine Sedore ◽  
Anthony R. McIntosh

Unlike most other objects, which are processed analytically, faces are processed configurally. This configural processing is reflected early in visual processing, following face inversion and contrast reversal, as an increase in the amplitude of the N170, a scalp-recorded event-related potential. Here, we show that these face-specific effects are mediated by the eye region: they occurred only when the eyes were present, but not when the eyes were removed from the face. The N170 recorded in response to inverted and contrast-reversed faces thus likely reflects the processing of the eyes. We propose a neural model of face processing in which face- and eye-selective neurons situated in the superior temporal sulcus region of the human brain respond differently to the face configuration and to the eyes depending on the face context. This dynamic response modulation accounts for the N170 variations reported in the literature. The eyes may be central to what makes faces so special.


2021 ◽  
Vol 11 (7) ◽  
pp. 942
Author(s):  
Antonio Maffei ◽  
Jennifer Goertzen ◽  
Fern Jaspers-Fayer ◽  
Killian Kleffner ◽  
Paola Sessa ◽  
...  

Behavioral and electrophysiological correlates of the influence of task demands on the processing of happy, sad, and fearful expressions were investigated in a within-subjects study that compared a perceptual distraction condition with task-irrelevant faces (i.e., a covert emotion task) to an emotion-relevant categorization condition (i.e., an overt emotion task). A state-of-the-art non-parametric mass univariate analysis method was used to address the limitations of previous studies. Behaviorally, participants responded faster to overtly categorized happy faces and were slower and less accurate when categorizing sad and fearful faces; there were no behavioral differences in the covert task. Event-related potential (ERP) responses to the emotional expressions included the N170 (140–180 ms), which was enhanced by emotion irrespective of task, with happy and sad expressions eliciting greater amplitudes than neutral expressions. EPN (200–400 ms) amplitude was modulated by task, with greater voltages in the overt condition, and by emotion; however, there was no interaction of emotion and task. ERP activity was modulated by emotion as a function of task only at a late processing stage, which included the LPP (500–800 ms), with fearful and sad faces showing greater amplitude enhancements than happy faces. This study reveals that affective content does not necessarily require attention in the early stages of face processing, supporting recent evidence that the core and extended parts of the face processing system act in parallel, rather than serially. The role of voluntary attention starts at an intermediate stage and fully modulates the response to emotional content in the final stage of processing.


2001 ◽  
Vol 13 (7) ◽  
pp. 937-951 ◽  
Author(s):  
Noam Sagiv ◽  
Shlomo Bentin

The range of specificity and the response properties of the extrastriate face area were investigated by comparing the N170 event-related potential (ERP) component elicited by photographs of natural faces, realistically painted portraits, sketches of faces, schematic faces, and by nonface meaningful and meaningless visual stimuli. Results showed that the N170 distinguished between faces and nonface stimuli when the concept of a face was clearly rendered by the visual stimulus, but it did not distinguish among different face types: Even a schematic face made from simple line fragments triggered the N170. However, in a second experiment, inversion seemed to have a different effect on natural faces, in which face components were available, and on the purely gestalt-based schematic faces: The N170 amplitude was enhanced when natural faces were presented upside down but reduced when schematic faces were inverted. Inversion delayed the N170 peak latency for both natural and schematic faces. Together, these results suggest that early face processing in the human brain is subserved by a multiple-component neural system in which both whole-face configurations and face parts are processed. The relative involvement of the two perceptual processes is probably determined by whether the physiognomic value of the stimuli depends upon the holistic configuration, or whether the individual components can be associated with faces even when presented outside the face context.


2002 ◽  
Vol 14 (2) ◽  
pp. 199-209 ◽  
Author(s):  
Michelle de Haan ◽  
Olivier Pascalis ◽  
Mark H. Johnson

Newborn infants respond preferentially to simple face-like patterns, raising the possibility that the face-specific regions identified in the adult cortex are functioning from birth. We sought to evaluate this hypothesis by characterizing the specificity of infants' electrocortical responses to faces in two ways: (1) comparing responses to faces of humans with those to faces of nonhuman primates; and (2) comparing responses to upright and inverted faces. Adults' face-responsive N170 event-related potential (ERP) component showed specificity to upright human faces that was not observable at any point in the ERPs of infants. A putative “infant N170” did show sensitivity to the species of the face, but the orientation of the face did not influence processing until a later stage. These findings suggest a process of gradual specialization of cortical face processing systems during postnatal development.

