Rapid neural categorization of facelike objects predicts the perceptual awareness of a face (face pareidolia)

2021 ◽  
Author(s):  
Diane Rekow ◽  
Jean-Yves Baudouin ◽  
Renaud Brochard ◽  
Bruno Rossion ◽  
Arnaud Leleu

Abstract
The human brain rapidly and automatically categorizes faces vs. other visual objects. However, whether face-selective neural activity predicts the subjective experience of a face – perceptual awareness – is debated. To clarify this issue, here we use face pareidolia, i.e., the illusory perception of a face, as a proxy to relate the neural categorization of a variety of facelike objects to conscious face perception. In Experiment 1, the scalp electroencephalogram (EEG) is recorded while pictures of human faces or facelike objects – in different stimulation sequences – are interleaved every second (i.e., at 1 Hz) in a rapid 6-Hz train of natural images of nonface objects. Participants do not perform any explicit face categorization task during stimulation, and report after stimulation whether they perceived illusory faces. A robust categorization response to facelike objects is identified at 1 Hz and its harmonics in the EEG frequency spectrum, with a facelike occipito-temporal topography. Across all individuals, the facelike categorization response is about 20% of the response to human faces, but more strongly right-lateralized. Critically, its amplitude is much larger in participants who report having perceived illusory faces. In Experiment 2, facelike or matched nonface objects from the same categories appear at 1 Hz in sequences of nonface objects presented at variable stimulation rates (60 Hz to 12 Hz), and participants explicitly report after each sequence whether they perceived illusory faces. The facelike categorization response already emerges at the shortest stimulus duration (i.e., 17 ms at 60 Hz) and predicts the behavioral report of conscious perception. Strikingly, neural facelike selectivity emerges exclusively when participants report illusory faces. Collectively, these experiments characterize a neural signature of face pareidolia in the context of rapid categorization, supporting the view that face-selective brain activity reliably predicts the subjective experience of a face from a single glance at a variety of stimuli.

Highlights
- EEG frequency-tagging measures the rapid categorization of facelike objects
- Facelike objects elicit a facelike neural categorization response
- Neural face categorization predicts conscious face perception across variable inputs
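The frequency-tagging readout described above lends itself to a compact illustration. The sketch below is not the authors' pipeline; under assumed data (one simulated occipito-temporal channel and invented parameters), it shows how a categorization response could be quantified as baseline-corrected amplitude summed over harmonics of the 1-Hz oddball rate while skipping harmonics of the 6-Hz base rate.

```python
# A minimal, illustrative sketch of a frequency-tagging readout (not the authors'
# exact pipeline). It sums baseline-corrected amplitude over harmonics of the
# 1 Hz oddball rate, skipping harmonics of the 6 Hz base rate. `eeg` is a
# simulated single channel; all parameters are assumptions for illustration.
import numpy as np

def oddball_response(eeg, sfreq, oddball=1.0, base=6.0, n_harmonics=12):
    n = len(eeg)
    amp = np.abs(np.fft.rfft(eeg)) / n            # amplitude spectrum
    freqs = np.fft.rfftfreq(n, d=1.0 / sfreq)
    total = 0.0
    for h in range(1, n_harmonics + 1):
        f = h * oddball
        if np.isclose(f % base, 0.0):             # skip base-rate harmonics (6 Hz, 12 Hz, ...)
            continue
        idx = int(np.argmin(np.abs(freqs - f)))
        # baseline: mean amplitude of neighbouring bins, excluding the bins
        # immediately adjacent to the target frequency
        neighbours = np.r_[amp[idx - 12:idx - 1], amp[idx + 2:idx + 13]]
        total += amp[idx] - neighbours.mean()
    return total

# Example with simulated data: 60 s of noise at 512 Hz plus a weak 1 Hz component.
sfreq = 512
t = np.arange(0, 60, 1 / sfreq)
eeg = np.random.randn(t.size) * 5 + 0.5 * np.sin(2 * np.pi * 1.0 * t)
print(oddball_response(eeg, sfreq))
```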

2017 ◽  
Author(s):  
Raúl Hernández-Pérez ◽  
Luis Concha ◽  
Laura V. Cuaya

Abstract
Dogs can interpret emotional human faces (especially those expressing happiness), yet the cerebral correlates of this process are unknown. Using functional magnetic resonance imaging (fMRI), we studied eight awake and unrestrained dogs. In Experiment 1, dogs observed happy and neutral human faces, and we found increased brain activity in temporal cortex and caudate when they viewed happy human faces. In Experiment 2, the dogs were presented with human faces expressing happiness, anger, fear, or sadness. Using the resulting cluster from Experiment 1, we trained a linear support vector machine classifier to discriminate between pairs of emotions and found that it could only discriminate happiness from the other emotions. Finally, evaluating the whole-brain fMRI time courses with a similar classifier allowed us to predict the emotion being observed by the dogs. Our results show that human emotions are specifically represented in dogs’ brains, highlighting their importance for inter-species communication.
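As a rough illustration of the pairwise decoding step described above, the following sketch trains a linear support vector machine with leave-one-run-out cross-validation on simulated voxel patterns; the data shapes, run structure, and labels are assumptions for demonstration, not the published analysis.

```python
# A schematic sketch of pairwise emotion decoding (assumed shapes and labels,
# not the published analysis): a linear SVM separates two emotions from voxel
# patterns in a region of interest, with leave-one-run-out cross-validation.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

rng = np.random.default_rng(0)
n_trials, n_voxels = 80, 150                      # hypothetical ROI size
X = rng.normal(size=(n_trials, n_voxels))         # trial-wise voxel patterns
y = rng.integers(0, 2, size=n_trials)             # 0 = happiness, 1 = anger (one pair)
runs = np.repeat(np.arange(8), 10)                # scanning run of each trial

clf = SVC(kernel="linear", C=1.0)
scores = cross_val_score(clf, X, y, groups=runs, cv=LeaveOneGroupOut())
print(f"happiness vs. anger decoding accuracy: {scores.mean():.2f}")
```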


2021 ◽  
Author(s):  
Maria Sancho ◽  
Nicholas R. Klug ◽  
Amreen Mughal ◽  
Thomas J. Heppner ◽  
David Hill-Eubanks ◽  
...  

SUMMARY
The dense network of capillaries composed of capillary endothelial cells (cECs) and pericytes lies in close proximity to all neurons, ideally positioning it to sense neuro/glial-derived compounds that regulate regional and global cerebral perfusion. The membrane potential (VM) of vascular cells serves as the essential output in this scenario, linking brain activity to vascular function. The ATP-sensitive K+ channel (KATP) is a key regulator of vascular VM in other beds, but whether brain capillaries possess functional KATP channels remains unknown. Here, we demonstrate that brain capillary ECs and pericytes express KATP channels that robustly control VM. We further show that the endogenous mediator adenosine acts through A2A receptors and the Gs/cAMP/PKA pathway to activate capillary KATP channels. Moreover, KATP channel stimulation in vivo causes vasodilation and increases cerebral blood flow (CBF). These findings establish the presence of KATP channels in cECs and pericytes and suggest their significant influence on CBF.

HIGHLIGHTS
- Capillary network cellular components—endothelial cells and pericytes—possess functional KATP channels.
- Activation of KATP channels causes profound hyperpolarization of capillary cell membranes.
- Capillary KATP channels are activated by exogenous adenosine via A2A receptors and cAMP-dependent protein kinase.
- KATP channel activation by adenosine or synthetic openers increases cerebral blood flow.


2021 ◽  
Author(s):  
Daniel Jenkins

Multisensory integration describes the cognitive processes by which information from various perceptual domains is combined to create coherent percepts. For consciously aware perception, multisensory integration can be inferred when information in one perceptual domain influences subjective experience in another. Yet the relationship between integration and awareness is not well understood. One current question is whether multisensory integration can occur in the absence of perceptual awareness. Because there is no subjective experience to report for unconscious perception, researchers have had to develop novel tasks to infer integration indirectly. For instance, Palmer and Ramsey (2012) presented auditory recordings of spoken syllables alongside videos of faces speaking either the same or different syllables, while masking the videos to prevent visual awareness. The conjunction of matching voices and faces predicted the location of a subsequent Gabor grating (target) on each trial. Participants indicated the location/orientation of the target more accurately when it appeared in the cued location (80% of trials), so the authors inferred that auditory and visual speech events were integrated in the absence of visual awareness. In this thesis, I investigated whether these findings generalise to the integration of auditory and visual expressions of emotion. In Experiment 1, I presented spatially informative cues in which congruent facial and vocal emotional expressions predicted the target location, with and without visual masking. I found no evidence of spatial cueing in either awareness condition. To investigate the lack of spatial cueing, in Experiment 2 I repeated the task with aware participants only, and had half of them explicitly report the emotional prosody. A significant spatial-cueing effect was found only when participants reported emotional prosody, suggesting that audiovisual congruence can cue spatial attention during aware perception. It remains unclear whether audiovisual congruence can cue spatial attention without awareness, and whether such effects genuinely imply multisensory integration.
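For readers unfamiliar with spatial-cueing analyses of this kind, a toy sketch follows: it compares target accuracy on cued versus uncued trials, separately for aware and masked conditions, using invented numbers; the actual thesis analyses will differ.

```python
# A toy sketch of a spatial-cueing analysis: accuracy on trials where the
# congruent audiovisual pair cued the target location versus uncued trials,
# split by awareness condition. Numbers are invented for illustration.
import pandas as pd

trials = pd.DataFrame({
    "awareness": ["aware"] * 4 + ["masked"] * 4,
    "cued":      [True, True, False, False] * 2,
    "correct":   [1, 1, 1, 0, 1, 0, 1, 0],
})

accuracy = (trials.groupby(["awareness", "cued"])["correct"]
                  .mean()
                  .unstack("cued"))
accuracy["cueing_effect"] = accuracy[True] - accuracy[False]
print(accuracy)
```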


2020 ◽  
Vol 32 (7) ◽  
pp. 1369-1380 ◽  
Author(s):  
Nicola Binetti ◽  
Alessandro Tomassini ◽  
Karl Friston ◽  
Sven Bestmann

Timing emerges from a hierarchy of computations ranging from early encoding of physical duration (time sensation) to abstract time representations (time perception) suitable for storage and decisional processes. However, the neural basis of the perceptual experience of time remains elusive. To address this, we dissociated brain activity uniquely related to lower-level sensory and higher-order perceptual timing operations using event-related fMRI. Participants compared subsecond (500 msec) sinusoidal gratings drifting with constant velocity (standard) against two probe stimuli: (1) control gratings drifting at constant velocity or (2) accelerating gratings, which induced illusory shortening of time. We tested two probe intervals: a 500-msec duration (Short) and a longer duration required for an accelerating probe to be perceived as long as the standard (Long—individually determined). On each trial, participants classified the probe as shorter or longer than the standard. This allowed comparison of trials with an “Objective” (physical) or “Subjective” (perceived) difference in duration, based on participant classifications. Objective duration revealed responses in bilateral early extrastriate areas, extending to higher visual areas in the fusiform gyrus (at more lenient thresholds). By contrast, Subjective duration was reflected in distributed cortical and subcortical responses, comprising the left superior frontal gyrus and the left cerebellum, together with a wider set of common timing areas including the BG, parietal cortex, and posterior cingulate cortex. These results suggest two functionally independent timing stages: early extraction of duration information in sensory cortices and the subjective experience of duration in higher-order cortical–subcortical timing areas.


2019 ◽  
Vol 9 (1) ◽  
Author(s):  
Charlotte Martial ◽  
Armand Mensen ◽  
Vanessa Charland-Verville ◽  
Audrey Vanhaudenhuyse ◽  
Daniel Rentmeister ◽  
...  

Abstract
The neurobiological basis of near-death experiences (NDEs) is unknown, but a few studies have attempted to investigate it by reproducing, in laboratory settings, phenomenological experiences that seem to closely resemble NDEs. So far, no study has induced NDE-like features via hypnotic modulation while simultaneously measuring changes in brain activity using high-density EEG. Five volunteers who had previously experienced a pleasant NDE were invited to re-experience the NDE memory and another pleasant autobiographical memory (dating to the same time period), in normal consciousness and under hypnosis. We compared the hypnosis-induced subjective experience with that of the memory of the genuine experience. Continuous high-density EEG was recorded throughout. At a phenomenological level, we succeeded in recreating NDE-like features without any adverse effects. Absorption and dissociation levels were reported as higher during all hypnosis conditions compared with normal consciousness conditions, suggesting that our hypnosis-based protocol increased the felt subjective experience during recall of both memories. The recall of NDE phenomenology was related to an increase of alpha activity in frontal and posterior regions. This study provides a proof-of-concept methodology for studying the phenomenon, enabling prospective exploration of NDE-like features and associated EEG changes in controlled settings.
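As a generic illustration of the alpha-band measure mentioned above (not the study's high-density EEG pipeline), the sketch below estimates 8–12 Hz power for one simulated channel with Welch's method; in practice this would be computed per electrode and compared between recall conditions.

```python
# A generic sketch of an alpha-band (8-12 Hz) power estimate using Welch's
# method on one simulated channel; not the study's pipeline, and all values
# are invented for illustration.
import numpy as np
from scipy.signal import welch

sfreq = 500
t = np.arange(0, 30, 1 / sfreq)
eeg = np.random.randn(t.size) + 2.0 * np.sin(2 * np.pi * 10 * t)   # noise + 10 Hz "alpha"

freqs, psd = welch(eeg, fs=sfreq, nperseg=2 * sfreq)
alpha = (freqs >= 8) & (freqs <= 12)
alpha_power = psd[alpha].sum() * (freqs[1] - freqs[0])              # integrate over the band
print(f"alpha-band power: {alpha_power:.2f}")
```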


2012 ◽  
Vol 37 (2) ◽  
pp. 95-99 ◽  
Author(s):  
Elisa Di Giorgio ◽  
David Méary ◽  
Olivier Pascalis ◽  
Francesca Simion

The current study aimed at investigating own- vs. other-species preferences in 3-month-old infants. The infants’ eye movements were recorded during a visual preference paradigm to assess whether they show a preference for own-species faces when these are contrasted with other-species faces. Human and monkey faces, equated for all low-level perceptual characteristics, were used. Our results demonstrated that 3-month-old infants preferred the human face, suggesting that the face perception system becomes species-specific after 3 months of visual experience with a specific class of faces. The eye-tracking results also show that fixations were concentrated on the eye area of human faces, supporting the importance of the eyes in holding visual attention.
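A toy calculation of the two looking measures implied above, using invented numbers purely for illustration:

```python
# Toy looking-preference measures with invented numbers: (1) proportion of
# total looking time to the human vs. monkey face, and (2) proportion of
# fixations falling on the eye area of the human face.
looking_ms = {"human_face": 4200.0, "monkey_face": 2800.0}
human_preference = looking_ms["human_face"] / sum(looking_ms.values())
print(f"proportion of looking time to the human face: {human_preference:.2f}")  # > 0.5 = preference

fixations_on_human = {"eyes": 14, "mouth": 4, "other": 6}
eye_proportion = fixations_on_human["eyes"] / sum(fixations_on_human.values())
print(f"proportion of fixations on the eye area: {eye_proportion:.2f}")
```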


2014 ◽  
Vol 26 (5) ◽  
pp. 955-969 ◽  
Author(s):  
Annelinde R. E. Vandenbroucke ◽  
Johannes J. Fahrenfort ◽  
Ilja G. Sligte ◽  
Victor A. F. Lamme

Every day, we experience a rich and complex visual world. Our brain constantly translates meaningless fragmented input into coherent objects and scenes. However, our attentional capabilities are limited, and we can only report the few items that we happen to attend to. So what happens to items that are not cognitively accessed? Do these remain fragmentary and meaningless? Or are they processed up to a level where perceptual inferences take place about image composition? To investigate this, we recorded brain activity using fMRI while participants viewed images containing a Kanizsa figure, an illusion in which an object is perceived by means of perceptual inference. Participants were presented with the Kanizsa figure and three matched nonillusory control figures while they were engaged in an attentionally demanding distractor task. After the task, one group of participants was unable to identify the Kanizsa figure in a forced-choice decision task; hence, they were “inattentionally blind.” A second group had no trouble identifying the Kanizsa figure. Interestingly, the neural signature that was unique to the processing of the Kanizsa figure was present in both groups. Moreover, within-subject multivoxel pattern analysis showed that the neural signature of unreported Kanizsa figures could be used to classify reported Kanizsa figures and that this cross-report classification worked better for the Kanizsa condition than for the control conditions. Together, these results suggest that stimuli that are not cognitively accessed are processed up to levels of perceptual interpretation.
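The cross-report classification logic can be sketched schematically: train a classifier on patterns from trials where the Kanizsa figure went unreported and test it on reported trials, so that above-chance transfer indicates a shared neural signature. Everything below (data, classifier choice, sizes) is an assumption for illustration, not the authors' code.

```python
# A schematic sketch of cross-report classification (assumed data, classifier,
# and sizes; not the authors' analysis): train on voxel patterns from
# unreported (inattentionally blind) Kanizsa vs. control trials, then test on
# reported trials; above-chance transfer suggests a shared neural signature.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n_trials, n_voxels = 60, 200                        # hypothetical sizes
X_unreported = rng.normal(size=(n_trials, n_voxels))
y_unreported = rng.integers(0, 2, size=n_trials)    # 1 = Kanizsa, 0 = control
X_reported = rng.normal(size=(n_trials, n_voxels))
y_reported = rng.integers(0, 2, size=n_trials)

clf = LogisticRegression(max_iter=1000).fit(X_unreported, y_unreported)
transfer_accuracy = clf.score(X_reported, y_reported)
print(f"cross-report classification accuracy: {transfer_accuracy:.2f}")
```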


2002 ◽  
Vol 13 (2) ◽  
pp. 190-193 ◽  
Author(s):  
Shlomo Bentin ◽  
Noam Sagiv ◽  
Axel Mecklinger ◽  
Angela Friederici ◽  
Yves D. von Cramon

Accumulated evidence from electrophysiology and neuroimaging suggests that face perception involves extrastriate visual mechanisms specialized in processing physiognomic features and building a perceptual representation that is categorically distinct and can be identified by face-recognition units. In the present experiment, we recorded event-related brain potentials in order to explore possible contextual influences on the activity of this perceptual mechanism. Subjects were first exposed to pairs of small shapes, which did not elicit any face-specific brain activity. The same stimuli, however, elicited face-specific brain activity after subjects saw them embedded in schematic faces, which probably primed the subjects to interpret the shapes as schematic eyes. No face-specific activity was observed when objects rather than faces were used to form the context. We conclude that the activity of face-specific extrastriate perceptual mechanisms can be modulated by contextual constraints that determine the significance of the visual input.


1972 ◽  
Vol 27 (5) ◽  
pp. 509-510 ◽  
Author(s):  
Zenon W. Pylyshyn

2014 ◽  
Vol 369 (1641) ◽  
pp. 20130534 ◽  
Author(s):  
Theofanis I. Panagiotaropoulos ◽  
Vishal Kapoor ◽  
Nikos K. Logothetis

The combination of electrophysiological recordings with ambiguous visual stimulation has made it possible to detect neurons that represent the content of subjective visual perception and perceptual suppression in multiple cortical and subcortical brain regions. These neuronal populations, commonly referred to as the neural correlates of consciousness, are more likely to be found in the temporal and prefrontal cortices as well as the pulvinar, indicating that the content of perceptual awareness is represented with higher fidelity in higher-order association areas of the cortical and thalamic hierarchy, reflecting the outcome of competitive interactions between conflicting sensory signals that are resolved in earlier stages. However, despite the significant insights into conscious perception gained through monitoring the activity of single neurons and small, local populations, the immense functional complexity of the brain, arising from correlations in the activity of its constituent parts, suggests that local, microscopic activity can only partially reveal the mechanisms involved in perceptual awareness. Rather, the dynamics of functional connectivity patterns at the mesoscopic and macroscopic level could be critical for conscious perception. Understanding these emergent spatio-temporal patterns could be informative not only for the stability of subjective perception but also for spontaneous perceptual transitions, which have been suggested to depend either on the dynamics of antagonistic ensembles or on global intrinsic activity fluctuations that may act upon explicit neural representations of sensory stimuli and induce perceptual reorganization. Here, we review the most recent results from local activity recordings and discuss the potential role of effective, correlated interactions during perceptual awareness.

