Our Faces in the Dog's Brain: Functional Imaging Reveals Temporal Cortex Activation during Perception of Human Faces

PLoS ONE ◽  
2016 ◽  
Vol 11 (3) ◽  
pp. e0149431 ◽  
Author(s):  
Laura V. Cuaya ◽  
Raúl Hernández-Pérez ◽  
Luis Concha


2017 ◽  
Author(s):  
Raúl Hernández-Pérez ◽  
Luis Concha ◽  
Laura V. Cuaya

Abstract Dogs can interpret emotional human faces (especially those expressing happiness), yet the cerebral correlates of this process are unknown. Using functional magnetic resonance imaging (fMRI), we studied eight awake and unrestrained dogs. In Experiment 1, dogs observed happy and neutral human faces, and we found increased brain activity in the temporal cortex and caudate when they viewed happy human faces. In Experiment 2, the dogs were presented with human faces expressing happiness, anger, fear, or sadness. Using the resulting cluster from Experiment 1, we trained a linear support vector machine classifier to discriminate between pairs of emotions and found that it could only discriminate happiness from the other emotions. Finally, evaluating the whole-brain fMRI time courses with a similar classifier allowed us to predict the emotion being observed by the dogs. Our results show that human emotions are specifically represented in dogs’ brains, highlighting their importance for inter-species communication.
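The pairwise decoding step lends itself to a short illustration. Below is a minimal sketch, in Python with scikit-learn, of how a linear support vector machine might be cross-validated to discriminate two emotion conditions from region-of-interest voxel patterns; the data, trial counts, and voxel numbers are simulated placeholders, not the study's actual analysis pipeline.

```python
# Minimal sketch of pairwise emotion decoding from region-of-interest (ROI)
# voxel patterns. All data below are simulated placeholders; in the study,
# features would be fMRI responses from the cluster found in Experiment 1.
import numpy as np
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_trials_per_class, n_voxels = 24, 50      # hypothetical trial and voxel counts

def simulated_roi_patterns(mean_activation):
    """Fake voxel-pattern matrix (trials x voxels) for one emotion condition."""
    return rng.normal(loc=mean_activation, scale=1.0,
                      size=(n_trials_per_class, n_voxels))

# Pairwise classification, e.g. happiness vs. anger
X = np.vstack([simulated_roi_patterns(0.3), simulated_roi_patterns(0.0)])
y = np.array([1] * n_trials_per_class + [0] * n_trials_per_class)

clf = SVC(kernel="linear", C=1.0)
cv = StratifiedKFold(n_splits=6, shuffle=True, random_state=0)
scores = cross_val_score(clf, X, y, cv=cv)
print(f"cross-validated decoding accuracy: {scores.mean():.2f}")
```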


2006 ◽  
Vol 6 (3) ◽  
pp. 201-213 ◽  
Author(s):  
T. T. Rogers ◽  
J. Hocking ◽  
U. Noppeney ◽  
A. Mechelli ◽  
M. L. Gorno-Tempini ◽  
...  

2021 ◽  
Author(s):  
Diane Rekow ◽  
Jean-Yves Baudouin ◽  
Karine Durand ◽  
Arnaud Leleu

Visual categorization is the brain's ability to respond rapidly and automatically to widely variable visual inputs in a category-selective manner (i.e., distinct responses between categories and similar responses within categories). Whether category-selective neural responses are purely visual or can be influenced by other sensory modalities remains unclear. Here, we test whether odors modulate visual categorization, expecting that odors facilitate the neural categorization of congruent visual objects, especially when the visual category is ambiguous. Scalp electroencephalogram (EEG) was recorded while natural images depicting various objects were displayed in rapid 12-Hz streams (i.e., 12 images per second) and variable exemplars of a target category (either human faces, cars, or facelike objects in dedicated sequences) were interleaved as every 9th stimulus to tag category-selective responses at 12/9 = 1.33 Hz in the EEG frequency spectrum. During visual stimulation, participants (N = 26) were implicitly exposed to odor contexts (body, gasoline, or baseline odors) and performed an orthogonal cross-detection task. We identify clear category-selective responses to every category over the occipito-temporal cortex, with the largest response for human faces and the lowest for facelike objects. Critically, body odor boosts the response to the ambiguous facelike objects (i.e., perceived either as nonface objects or as faces) over the right hemisphere, especially for participants reporting their presence post-stimulation. By contrast, odors do not significantly modulate other category-selective responses, nor the general visual response recorded at 12 Hz, revealing a specific influence on the categorization of congruent ambiguous stimuli. Overall, these findings support the view that the brain actively uses cues from the different senses to readily categorize visual inputs, and that olfaction, which is generally considered poorly functional in humans, is well placed to disambiguate visual information.
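The frequency-tagging arithmetic (12 images per second, with the target category as every 9th image, yielding a category-selective signature at 12/9 ≈ 1.33 Hz) can be illustrated with a small simulation. The sketch below, assuming a hypothetical sampling rate and simulated EEG, shows how the general 12 Hz response and the 1.33 Hz category response would be read out from the amplitude spectrum; it is not the authors' analysis code.

```python
# Sketch of the frequency-tagging logic: base stimulation at 12 Hz with the
# target category every 9th image puts category-selective activity at
# 12 / 9 = 1.33 Hz. The "EEG" below is simulated; sampling rate and
# amplitudes are arbitrary assumptions.
import numpy as np

rng = np.random.default_rng(1)
fs, duration = 256.0, 60.0                  # Hz, seconds (hypothetical)
t = np.arange(0, duration, 1 / fs)
base_freq, cat_freq = 12.0, 12.0 / 9.0

eeg = (0.5 * np.sin(2 * np.pi * base_freq * t)     # general visual response
       + 0.2 * np.sin(2 * np.pi * cat_freq * t)    # category-selective response
       + rng.normal(0.0, 0.5, t.size))             # noise

amplitude = np.abs(np.fft.rfft(eeg)) / t.size
freqs = np.fft.rfftfreq(t.size, d=1 / fs)

def amplitude_at(f_hz):
    """Amplitude-spectrum value at the FFT bin closest to f_hz."""
    return amplitude[np.argmin(np.abs(freqs - f_hz))]

print(f"response at {base_freq:.2f} Hz (general visual): {amplitude_at(base_freq):.3f}")
print(f"response at {cat_freq:.2f} Hz (category-selective): {amplitude_at(cat_freq):.3f}")
```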


2015 ◽  
Author(s):  
Daniel D Dilks ◽  
Peter Cook ◽  
Samuel K Weiller ◽  
Helen P Berns ◽  
Mark H Spivak ◽  
...  

Recent behavioral evidence suggests that dogs, like humans and monkeys, are capable of visual face recognition. But do dogs also exhibit specialized cortical face regions similar to humans and monkeys? Using functional magnetic resonance imaging (fMRI) in six dogs trained to remain motionless during scanning without restraint or sedation, we found a region in the canine temporal lobe that responded significantly more to movies of human faces than to movies of everyday objects. Next, using a new stimulus set to investigate face selectivity in this predefined candidate dog face area, we found that this region responded similarly to images of human faces and dog faces, yet significantly more to both human and dog faces than to images of objects. Such face selectivity was not found in dog primary visual cortex. Taken together, these findings: 1) provide the first evidence for a face-selective region in the temporal cortex of dogs, which cannot be explained by simple low-level visual feature extraction; 2) reveal that neural machinery dedicated to face processing is not unique to primates; and 3) may help explain dogs’ exquisite sensitivity to human social cues.
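A hedged sketch of the region-of-interest logic described above: responses in a candidate face area (defined from an independent faces-versus-objects contrast) are compared across human faces, dog faces, and objects. The per-dog values below are simulated, and paired t-tests stand in for whatever statistics the study actually used.

```python
# Sketch of a face-selectivity test in a predefined ROI: per-dog mean responses
# to human faces, dog faces, and objects are compared with paired t-tests.
# The values are simulated "% signal change" for six hypothetical dogs.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n_dogs = 6
human_faces = rng.normal(1.0, 0.2, n_dogs)
dog_faces = rng.normal(0.9, 0.2, n_dogs)
objects = rng.normal(0.4, 0.2, n_dogs)

comparisons = [
    ("human faces vs. objects", human_faces, objects),
    ("dog faces vs. objects", dog_faces, objects),
    ("human faces vs. dog faces", human_faces, dog_faces),
]
for label, a, b in comparisons:
    t_val, p_val = stats.ttest_rel(a, b)
    print(f"{label}: t = {t_val:.2f}, p = {p_val:.3f}")
```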


2020 ◽  
Vol 31 (8) ◽  
pp. 1001-1012 ◽  
Author(s):  
Colin J. Palmer ◽  
Colin W. G. Clifford

Face pareidolia is the phenomenon of seeing facelike structures in everyday objects. Here, we tested the hypothesis that face pareidolia, rather than being limited to a cognitive or mnemonic association, reflects the activation of visual mechanisms that typically process human faces. We focused on sensory cues to social attention, which engage cell populations in temporal cortex that are susceptible to habituation effects. Repeated exposure to “pareidolia faces” that appear to have a specific direction of attention causes a systematic bias in the perception of where human faces are looking, indicating that overlapping sensory mechanisms are recruited when we view human faces and when we experience face pareidolia. These cross-adaptation effects are significantly reduced when pareidolia is abolished by removing facelike features from the objects. These results indicate that face pareidolia is essentially a perceptual phenomenon, occurring when sensory input is processed by visual mechanisms that have evolved to extract specific social content from human faces.
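One way to picture such a cross-adaptation effect is as a shift in reported gaze direction after adapting to pareidolia objects that appear to look leftward versus rightward. The sketch below quantifies that shift on simulated responses; the effect size, units, and response format are illustrative assumptions, not the study's psychophysical procedure.

```python
# Sketch of quantifying a gaze-adaptation aftereffect: the difference in
# reported gaze direction of test faces following leftward- versus
# rightward-"looking" pareidolia adaptors. Responses, effect size, and units
# (degrees) are simulated assumptions, not the study's procedure.
import numpy as np

rng = np.random.default_rng(3)
n_trials = 100
true_gaze = rng.uniform(-10, 10, n_trials)     # test-face gaze direction (deg)

# Hypothetical repulsive aftereffect: perception biased away from the adaptor
reports_after_left_adaptor = true_gaze + 2.0 + rng.normal(0, 1.5, n_trials)
reports_after_right_adaptor = true_gaze - 2.0 + rng.normal(0, 1.5, n_trials)

aftereffect = np.mean(reports_after_left_adaptor - reports_after_right_adaptor)
print(f"adaptation aftereffect: {aftereffect:.2f} deg")
```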


2005 ◽  
Vol 94 (2) ◽  
pp. 1587-1596 ◽  
Author(s):  
Roozbeh Kiani ◽  
Hossein Esteky ◽  
Keiji Tanaka

Neurons in the visual system respond to different visual stimuli with different onset latencies. However, it has remained unknown which stimulus features, aside from stimulus contrast, determine the onset latencies of responses. To examine the possibility that response onset latencies carry information about complex object images, we recorded single-cell responses in the inferior temporal cortex of alert monkeys, while they viewed >1,000 object stimuli. Many cells responded to human and non-primate animal faces with comparable magnitudes but responded significantly more quickly to human faces than to non-primate animal faces. Differences in onset latency may be used to increase the coding capacity or enhance or suppress information about particular object groups by time-dependent modulation.
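Onset latency estimation from spiking data can be illustrated with a simple threshold rule: take the first post-stimulus bin of a peri-stimulus time histogram that begins a run of bins exceeding a baseline-derived threshold. The sketch below uses simulated spike counts and an assumed 3-SD criterion with two consecutive bins; it is one common heuristic, not the specific latency measure used in the study.

```python
# Sketch of a threshold-based onset-latency estimate from a peri-stimulus
# time histogram (PSTH). Spike counts are simulated (Poisson) with a step
# from 10 Hz to 60 Hz at 120 ms; the 3-SD / two-consecutive-bin criterion is
# an assumed heuristic, not the study's latency measure.
import numpy as np

rng = np.random.default_rng(4)
bin_ms, n_trials = 5, 50
time = np.arange(-200, 400, bin_ms)                 # ms relative to stimulus onset
baseline_rate, peak_rate, true_latency = 10.0, 60.0, 120

rate = np.where(time >= true_latency, peak_rate, baseline_rate)      # Hz per bin
psth = rng.poisson(rate * (bin_ms / 1000.0) * n_trials)              # pooled counts

baseline = psth[time < 0]
threshold = baseline.mean() + 3 * baseline.std()

# Onset = first post-stimulus bin that starts a run of two supra-threshold bins
post_idx = np.where(time >= 0)[0][:-1]
onset = next((int(time[i]) for i in post_idx
              if psth[i] > threshold and psth[i + 1] > threshold), None)
print(f"estimated onset latency: {onset} ms (true step at {true_latency} ms)")
```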


2020 ◽  
pp. 1-8
Author(s):  
P. Fuentes-Claramonte ◽  
L. López-Araquistain ◽  
S. Sarró ◽  
B. Sans-Sansa ◽  
J. Ortiz-Gil ◽  
...  

Abstract Background: One hypothesis proposed to underlie formal thought disorder (FTD), the incoherent speech seen in some patients with schizophrenia, is that it reflects impairment in frontal/executive function. While this proposal has received support in neuropsychological studies, it has been relatively little tested using functional imaging. This study aimed to examine brain activations associated with FTD, and with its two main factor-analytically derived subsyndromes, during the performance of a working memory task. Methods: Seventy patients with schizophrenia showing a full range of FTD scores and 70 matched healthy controls underwent fMRI during performance of the 2-back version of the n-back task. Whole-brain corrected, voxel-based correlations with FTD scores were examined in the patient group. Results: During 2-back performance, the patients showed clusters of significant inverse correlation with FTD scores in the inferior frontal cortex and dorsolateral prefrontal cortex bilaterally, in the left temporal cortex, and subcortically in the basal ganglia and thalamus. Further analysis revealed that these correlations reflected an association only with ‘alogia’ (poverty of speech, poverty of content of speech, and perseveration) and not with the ‘fluent disorganization’ component of FTD. Conclusions: This study provides functional imaging support for the view that FTD in schizophrenia may involve impaired executive/frontal function. However, the relationship appears to be exclusively with alogia and not with the variables contributing to fluent disorganization.
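The voxel-wise brain-behaviour analysis boils down to correlating, across patients, each voxel's 2-back activation estimate with the FTD score. The sketch below shows that computation for a single simulated voxel; real analyses operate on whole-brain maps with multiple-comparison correction, and the numbers here are placeholders.

```python
# Sketch of a voxel-wise brain-behaviour correlation: across patients, a
# voxel's 2-back activation estimate is correlated with the FTD score.
# Scores and activations are simulated for a single hypothetical voxel.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
n_patients = 70
ftd_scores = rng.uniform(0, 20, n_patients)              # hypothetical FTD ratings

# Simulated activation with an inverse relationship to FTD severity plus noise
voxel_activation = 1.0 - 0.03 * ftd_scores + rng.normal(0, 0.3, n_patients)

r, p = stats.pearsonr(ftd_scores, voxel_activation)
print(f"voxel-FTD correlation: r = {r:.2f}, p = {p:.4f}")
```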


Author(s):  
Evelyne Mercure ◽  
Laura Kischkel

From the first days of life, babies appear to be naturally attracted to human faces and voices. These early biases maximize their social interaction and experience of social stimuli, leading to an impressive neurocognitive development of social perception in their first year. Recent advances in neuroimaging methods have revealed patterns of voice sensitivity in the infant’s temporal cortex. These patterns resemble those observed in adults, and this voice sensitivity increases rapidly in the first few months of life. Voice sensitivity in the infant temporal cortex is observed cross-culturally but seems to be altered in infants at risk of developing autism spectrum disorder. Activation of the infant social brain is also modulated by the emotional content of human vocalizations in regions involved in emotion processing in adulthood, such as the temporal voice-sensitive area, amygdala, and orbitofrontal cortex. Very young infants also appear to be sensitive to the integration of facial and vocal cues of gender, age, identity, emotion, and speech articulation. These skills are present from the first months of life and become increasingly sophisticated as infants approach their first birthday. This chapter discusses the most recent findings in the early development of voice processing in the infant brain, as well as the emergence of audiovisual integration in social perception.


2011 ◽  
Vol 23 (8) ◽  
pp. 1911-1920 ◽  
Author(s):  
Maria Ida Gobbini ◽  
Claudio Gentili ◽  
Emiliano Ricciardi ◽  
Claudia Bellucci ◽  
Pericle Salvini ◽  
...  

We designed an fMRI experiment comparing perception of human faces and robotic faces producing emotional expressions. The purpose of our experiment was to investigate engagement of different parts of the social brain by viewing these animate and inanimate agents. Both human and robotic face expressions evoked activity in face-responsive regions in the fusiform gyrus and STS and in the putative human mirror neuron system. These results suggest that these areas mediate perception of agency, independently of whether the agents are living or not. By contrast, the human faces evoked stronger activity than did robotic faces in the medial pFC and the anterior temporal cortex—areas associated with the representation of others' mental states (theory of mind), whereas robotic faces evoked stronger activity in areas associated with perception of objects and mechanical movements. Our data demonstrate that the representation of the distinction between animate and inanimate agents involves areas that participate in attribution of mental stance.

