Robot faces elicit responses intermediate to human faces and objects at face-sensitive ERP components

2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Allie R. Geiger ◽  
Benjamin Balas

Abstract: Face recognition is supported by selective neural mechanisms that are sensitive to various aspects of facial appearance. These include event-related potential (ERP) components like the P100 and the N170, which exhibit different patterns of selectivity for various aspects of facial appearance. Examining the boundary between faces and non-faces using these responses is one way to develop a more robust understanding of the representation of faces in extrastriate cortex and to determine what critical properties an image must possess to be considered face-like. Robot faces are a particularly interesting stimulus class to examine because they can differ markedly from human faces in terms of shape, surface properties, and the configuration of facial features, but are also interpreted as social agents in a range of settings. In the current study, we thus chose to investigate how ERP responses to robot faces may differ from the responses to human faces and non-face objects. In two experiments, we examined how the P100 and N170 responded to human faces, robot faces, and non-face objects (clocks). In Experiment 1, we found that robot faces elicit intermediate responses from face-sensitive components relative to non-face objects (clocks) and both real human faces and artificial human faces (computer-generated faces and dolls). These results suggest that while human-like inanimate faces (CG faces and dolls) are processed much like real faces, robot faces are dissimilar enough to human faces to be processed differently. In Experiment 2, we found that the face inversion effect was only partly evident in robot faces. We conclude that robot faces are an intermediate stimulus class that offers insight into the perceptual and cognitive factors that affect how social agents are identified and categorized.

2020 ◽  
Author(s):  
Allie R. Geiger ◽  
Benjamin Balas

Abstract: Face recognition is supported by selective neural mechanisms that are sensitive to various aspects of facial appearance. These include ERP components like the P100, N170, and P200, which exhibit different patterns of selectivity for various aspects of facial appearance. Examining the boundary between faces and non-faces using these responses is one way to develop a more robust understanding of the representation of faces in visual cortex and to determine what critical properties an image must possess to be considered face-like. Here, we probe this boundary by examining how face-sensitive ERP components respond to robot faces. Robot faces are an interesting stimulus class because they can differ markedly from human faces in terms of shape, surface properties, and the configuration of facial features, but are also interpreted as social agents in a range of settings. In two experiments, we examined how the P100 and N170 responded to human faces, robot faces, and non-face objects (clocks). We found that robot faces elicit intermediate responses from face-sensitive components relative to non-face objects and both real and artificial human faces (Exp. 1), and also that the face inversion effect was only partly evident in robot faces (Exp. 2). We conclude that robot faces are an intermediate stimulus class that offers insight into the perceptual and cognitive factors that affect how social agents are identified and categorized.


Author(s):  
R. L. Palmer ◽  
P. Helmholz ◽  
G. Baynam

Abstract. Facial appearance has long been understood to offer insight into a person’s health. To an experienced clinician, atypical facial features may signify the presence of an underlying rare or genetic disease. Clinicians use their knowledge of how disease affects facial appearance, along with the patient’s physiological and behavioural traits and their medical history, to determine a diagnosis. Specialist expertise and experience are needed to make a dysmorphological facial analysis. Key to this is accurately assessing how a face differs significantly in shape and/or growth from expected norms. Modern photogrammetric systems can acquire detailed 3D images of the face, which can be used to conduct a facial analysis in software with greater precision than can be obtained in person. Measurements from 3D facial images are already used as an alternative to direct measurement using instruments such as tape measures, rulers, or callipers. However, the ability to take accurate measurements – whether virtual or not – presupposes the assessor’s ability to accurately place the endpoints of the measuring tool at the positions of standardised anatomical facial landmarks. In this paper, we formally introduce Cliniface – a free and open-source application that uses a recently published, highly precise method of detecting facial landmarks from 3D facial images by non-rigidly transforming an anthropometric mask (AM) to the target face. Inter-landmark measurements are then used to automatically identify facial traits that may be of clinical significance. Herein, we show how non-experts with minimal guidance can use Cliniface to extract facial anthropometrics from a 3D facial image at a level of accuracy comparable to an expert. We further show that Cliniface itself is able to extract the same measurements at a similar level of accuracy – completely automatically.
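The inter-landmark measurements described above reduce to simple geometry once landmark coordinates are available. As a rough illustration only – the landmark names, coordinates, and normative values below are hypothetical, and this is not Cliniface’s actual API – a measurement-and-flagging step might look like:

```python
import numpy as np

# Hypothetical 3D landmark coordinates (mm), e.g. as detected on a facial mesh.
# Names follow common anthropometric convention; values are illustrative only.
landmarks = {
    "exocanthion_left":  np.array([-45.0, 30.0, 10.0]),
    "exocanthion_right": np.array([ 45.0, 30.0, 10.0]),
    "nasion":            np.array([  0.0, 35.0, 20.0]),
    "subnasale":         np.array([  0.0, -5.0, 25.0]),
}

def inter_landmark_distance(a, b):
    """Euclidean distance between two named 3D landmarks."""
    return float(np.linalg.norm(landmarks[a] - landmarks[b]))

def z_score(measurement, norm_mean, norm_sd):
    """How many SDs a measurement deviates from a matched normative value."""
    return (measurement - norm_mean) / norm_sd

# Example: outer-canthal width, flagged if outside +/- 2 SD of an assumed norm.
width = inter_landmark_distance("exocanthion_left", "exocanthion_right")
flagged = abs(z_score(width, norm_mean=88.0, norm_sd=3.5)) > 2.0
```

In practice the normative mean and SD would come from age- and sex-matched anthropometric reference data rather than the placeholder values used here.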


Author(s):  
Sarah Schroeder ◽  
Kurtis Goad ◽  
Nicole Rothner ◽  
Ali Momen ◽  
Eva Wiese

People process human faces configurally—as a Gestalt or integrated whole—but perceive objects in terms of their individual features. As a result, faces—but not objects—are more difficult to process when presented upside down versus upright. Previous research demonstrates that this inversion effect is not observed when recognizing previously seen android faces, suggesting they are processed more like objects, perhaps due to a lack of perceptual experience and/or motivation to recognize android faces. The current study aimed to determine whether negative emotions, particularly fear of androids, may lessen configural processing of android faces compared to human faces. While the current study replicated previous research showing a greater inversion effect for human compared to android faces, we did not find evidence that negative emotions—such as fear—towards androids influenced the face inversion effect. We discuss the implications of this study and opportunities for future research.


2020 ◽  
Author(s):  
Jordan Wehrman ◽  
Sidsel Sörensen ◽  
Peter de Lissa ◽  
Nicholas A. Badcock

Abstract: Low-cost, portable electroencephalographic (EEG) headsets have become commercially available in the last 10 years. One such system, Emotiv’s EPOC, has been modified to allow event-related potential (ERP) research. Because of these innovations, EEG research may become more widely available in non-traditional settings. Although the EPOC has previously been shown to provide data comparable to research-grade equipment and has been used in real-world settings, how the EPOC performs without the electrical shielding used in research-grade laboratories has yet to be systematically tested. In the current article we address this gap in the literature by asking participants to perform a simple EEG experiment in shielded and unshielded contexts. The experiment was the observation of human versus wristwatch faces, which were either inverted or non-inverted. This method elicited the face-sensitive N170 ERP. In both shielded and unshielded contexts, the N170 amplitude was larger when participants viewed human faces and peaked later when a human face was inverted. More importantly, Bayesian analysis showed no difference in the N170 measured in the shielded and unshielded contexts. Further, the signals recorded in both contexts were highly correlated. The EPOC appears to reliably record EEG signals without a purpose-built electrically shielded room or a laboratory-grade preamplifier.
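The shielded-versus-unshielded comparison rests on two simple quantities: the correlation between the two context waveforms and the N170 peak amplitude and latency in each. A minimal sketch with synthetic waveforms – not the authors’ data or analysis pipeline; the signal shape, noise level, and timing are assumptions – could be:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic grand-average waveforms (microvolts) over a 200-sample epoch:
# the "unshielded" trace is the same underlying signal plus independent noise.
t = np.linspace(-0.1, 0.5, 200)  # seconds relative to stimulus onset
n170 = -4.0 * np.exp(-((t - 0.17) ** 2) / (2 * 0.02 ** 2))  # negative peak ~170 ms
shielded = n170 + 0.3 * rng.standard_normal(t.size)
unshielded = n170 + 0.3 * rng.standard_normal(t.size)

# Pearson correlation between the two context waveforms.
r = np.corrcoef(shielded, unshielded)[0, 1]

def n170_peak(wave):
    """Peak amplitude and latency of the N170 (most negative sample)."""
    i = int(np.argmin(wave))
    return wave[i], t[i]
```

With real data the same logic would be applied per participant to averaged epochs, with the correlation and peak measures then compared across recording contexts.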


2019 ◽  
Author(s):  
Yasmin Allen-Davidian ◽  
Manuela Russo ◽  
Naohide Yamamoto ◽  
Jordy Kaufman ◽  
Alan J. Pegna ◽  
...  

Face Inversion Effects (FIEs) – differences in response to upside down faces compared to upright faces – occur for both behavioural and electrophysiological responses when people view face stimuli. In EEG, the inversion of a face is often reported to evoke an enhanced amplitude and delayed latency of the N170 event-related potential. This response has historically been attributed to the indexing of specialised face processing mechanisms within the brain. However, inspection of the literature revealed that while the N170 is consistently delayed to photographed, schematic, Mooney and line drawn face stimuli, only naturally photographed faces enhance the amplitude upon inversion. This raises the possibility that the increased N170 amplitudes to inverted faces may have other origins than the inversion of the face’s structural components. In line with previous research establishing the N170 as a prediction error signal, we hypothesise that the unique N170 amplitude response to inverted photographed faces stems from multiple expectation violations, over and above structural inversion. For instance, rotating an image of a face upside down not only violates the expectation that faces appear upright, but also lifelong priors that illumination comes from above and gravity pulls from below. To test this hypothesis, we recorded EEG whilst participants viewed face stimuli (upright versus inverted), where the faces were illuminated from above versus below, and where the models were photographed upright versus hanging upside down. The N170 amplitudes were found to be modulated by a complex interaction between orientation, lighting and gravity factors, with the amplitudes largest when faces consistently violated all three expectations and smallest when all these factors concurred with expectations. 
These results confirm our hypothesis that FIEs on N170 amplitudes are driven by a violation of the viewer’s expectations across several parameters that characterise faces, rather than a disruption in the configurational disposition of its features.


2002 ◽  
Vol 14 (2) ◽  
pp. 199-209 ◽  
Author(s):  
Michelle de Haan ◽  
Olivier Pascalis ◽  
Mark H. Johnson

Newborn infants respond preferentially to simple face-like patterns, raising the possibility that the face-specific regions identified in the adult cortex are functioning from birth. We sought to evaluate this hypothesis by characterizing the specificity of infants' electrocortical responses to faces in two ways: (1) comparing responses to faces of humans with those to faces of nonhuman primates; and (2) comparing responses to upright and inverted faces. Adults' face-responsive N170 event-related potential (ERP) component showed specificity to upright human faces that was not observable at any point in the ERPs of infants. A putative “infant N170” did show sensitivity to the species of the face, but the orientation of the face did not influence processing until a later stage. These findings suggest a process of gradual specialization of cortical face processing systems during postnatal development.


2010 ◽  
Vol 69 (3) ◽  
pp. 161-167 ◽  
Author(s):  
Jisien Yang ◽  
Adrian Schwaninger

Configural processing has been considered the major contributor to the face inversion effect (FIE) in face recognition. However, most researchers have only obtained the FIE with one specific ratio of configural alteration. It remains unclear whether the ratio of configural alteration itself can mediate the occurrence of the FIE. We aimed to clarify this issue by manipulating the configural information parametrically using six different ratios, ranging from 4% to 24%. Participants were asked to judge whether a pair of faces were entirely identical or different. The paired faces that were to be compared were presented either simultaneously (Experiment 1) or sequentially (Experiment 2). Both experiments revealed that the FIE was observed only when the ratio of configural alteration was in the intermediate range. These results indicate that even though the FIE has been frequently adopted as an index to examine the underlying mechanism of face processing, the emergence of the FIE is not robust with any configural alteration but dependent on the ratio of configural alteration.


2011 ◽  
Vol 5 (2) ◽  
pp. 297-332
Author(s):  
Kate Zebiri

This article aims to explore the Shaykh-murīd (disciple) or teacher-pupil relationship as portrayed in Western Sufi life writing in recent decades, observing elements of continuity and discontinuity with classical Sufism. Additionally, it traces the influence on the texts of certain developments in religiosity in contemporary Western societies, especially New Age understandings of religious authority. Studying these works will provide an insight into the diversity of expressions of contemporary Sufism, while shedding light on a phenomenon which seems to fly in the face of contemporary social and religious trends that deemphasize external authority and promote the authority of the self or individual autonomy.


2009 ◽  
Vol 8 (3) ◽  
pp. 887-897
Author(s):  
Vishal Paika ◽  
Er. Pankaj Bhambri

The face is the feature which distinguishes a person, and facial appearance is vital for human recognition. It has certain features – forehead, skin, eyes, ears, nose, cheeks, mouth, lips, teeth, etc. – which help us humans recognize a particular face among millions of faces, even after a long span of time and despite large changes in appearance due to ageing, expression, viewing conditions, and distractions such as disfigurement of the face, scars, a beard, or hairstyle. A face is not merely a set of facial features but is rather something meaningful in its form. In this paper, a system is designed to recognize faces based on these various facial features. Different edge detection techniques have been used to reveal the outlines of the face, eyes, ears, nose, teeth, etc. These features are extracted in terms of distances between important feature points. The resulting feature set is normalized and fed to artificial neural networks, which are trained for the recognition of facial images.
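The pipeline described – distances between detected feature points, normalized and fed to a neural network – can be sketched as follows. The feature points, coordinates, and network weights below are illustrative assumptions, not the paper’s actual values:

```python
import numpy as np

# Hypothetical facial feature points (pixel coordinates) as might be located
# with an edge detector; names and positions are illustrative only.
points = {
    "left_eye":  (60, 80),
    "right_eye": (140, 80),
    "nose_tip":  (100, 130),
    "mouth":     (100, 170),
}

def distance_features(pts):
    """Pairwise Euclidean distances between feature points, in a fixed order."""
    names = sorted(pts)
    feats = []
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            pa, pb = np.array(pts[a], float), np.array(pts[b], float)
            feats.append(np.linalg.norm(pa - pb))
    return np.array(feats)

def normalize(feats):
    """Scale distances by the largest one so the vector is size-invariant."""
    return feats / feats.max()

x = normalize(distance_features(points))

# The normalized vector would then be fed to a neural network trained on many
# labelled faces; here, untrained random weights stand in for a trained layer.
rng = np.random.default_rng(1)
W = rng.standard_normal((3, x.size))  # 3 hypothetical known identities
scores = W @ x                        # one score per identity
```

Normalizing by the largest distance is one simple choice for making the features invariant to image scale; the original system may use a different normalization.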

