Faculty Opinions recommendation of Abnormal FMRI adaptation to unfamiliar faces in a case of developmental prosopamnesia.

Author(s): Aina Puce

2007, Vol. 17 (14), pp. 1259-1264
Author(s): Mark A. Williams, Nadja Berberovic, Jason B. Mattingley

2014
Author(s): Douglas Martin, Rachel Swainson, Gillian Slessor, Jacqui Hutchison, Diana Marosi

2021, pp. 174702182110097
Author(s): Niamh Hunnisett, Simone Favelle

Unfamiliar face identification is concerningly error-prone, especially across changes in viewing conditions. Within-person variability has been shown to improve matching performance for unfamiliar faces, but this has only been demonstrated using front-view images. In this study, we test whether the advantage of within-person variability from front views extends to matching target images of a face rotated in view. Participants completed either a simultaneous matching task (Experiment 1) or a sequential matching task (Experiment 2) in which they were tested on their ability to match the identity of a face shown in an array of either one or three ambient front-view images with a target image shown in front, three-quarter, or profile view. Although the effect was stronger in Experiment 2, both experiments showed a consistent pattern in match trials: a multiple-image matching benefit for front, three-quarter, and profile-view targets. We found multiple-image effects for match trials only, indicating that providing observers with multiple ambient images confers an advantage for recognising different images of the same identity, but not for discriminating between images of different identities. Signal detection measures also indicate a multiple-image advantage, despite a more liberal response bias for multiple-image trials. Our results show that within-person variability information for unfamiliar faces can be generalised across views, and they provide insights into the initial processes involved in the representation of familiar faces.


2017, Vol. 7 (1)
Author(s): Matteo Visconti di Oleggio Castello, Yaroslav O. Halchenko, J. Swaroop Guntupalli, Jason D. Gors, M. Ida Gobbini

Author(s): Louise Neil, Essi Viding, Diana Armbruster‐Genc, Matteo Lisi, Isabelle Mareshal, ...

2010, Vol. 103 (3), pp. 1467-1477
Author(s): John C. Taylor, Alison J. Wiggett, Paul E. Downing

People readily perceive the human body across different viewpoints, but the neural mechanisms underpinning this ability are unclear. In three experiments, we used functional MRI (fMRI) adaptation to study the view invariance of representations in two cortical regions previously shown to be sensitive to visual depictions of the human body: the extrastriate and fusiform body areas (EBA and FBA). The BOLD response to sequentially presented pairs of bodies was treated as an index of view invariance. Specifically, we compared trials in which the bodies in each image held identical poses (seen from different views) with trials containing different poses. EBA and FBA adapted to identical views of the same pose, and both showed a progressive rebound from adaptation as a function of the angular difference between views, up to ∼30°. However, these adaptation effects were eliminated when the body stimuli were followed by a pattern mask. Delaying the mask onset increased the response (but not the adaptation effect) in EBA, leaving FBA unaffected. We interpret these masking effects as evidence that view-dependent fMRI adaptation is driven by later waves of neuronal responses in the regions of interest. Finally, in a whole-brain analysis, we identified an anterior region of the left inferior temporal sulcus (l-aITS) that responded linearly to stimulus rotation but showed no selectivity for bodies. Our results show that body-selective cortical areas exhibit a degree of view invariance similar to that of other object-selective areas, such as the lateral occipitotemporal area (LO) and posterior fusiform gyrus (pFs).
