Face Context Influences Local Part Processing: An ERP Study
Perception of face parts is thought to rely on featural information, whereas perception of whole faces relies more on configural information. Face context is also suggested to play an important role in face processing. To investigate how face context influences early-stage perception of local facial parts, we used an oddball paradigm that probed perceptual stages of face processing rather than recognition. We recorded event-related potentials (ERPs) elicited by whole faces and face parts presented in four conditions (upright-normal, upright-thatcherised, inverted-normal, and inverted-thatcherised), as well as ERPs elicited by non-face objects (whole houses and house parts) in the corresponding conditions. Face context significantly affected the N170, with increased amplitude and earlier peak latency for upright normal faces. Removing face context delayed the P1 latency but had little effect on P1 amplitude for both upright and inverted normal faces. Across all conditions, neither the N170 nor the P1 was modulated by house context. These changes in the N170 and P1 components indicate that face context influences local part processing at an early stage of face processing, and that this context effect may be specific to face perception. We further suggest that perception of whole faces and perception of face parts are functionally distinct.