perceptual object
Recently Published Documents


TOTAL DOCUMENTS: 45 (FIVE YEARS: 7)
H-INDEX: 10 (FIVE YEARS: 0)

2021 · Vol 2021 (1)
Author(s): Lars Sandved-Smith, Casper Hesp, Jérémie Mattout, Karl Friston, Antoine Lutz, ...

Abstract: Meta-awareness refers to the capacity to explicitly notice the current content of consciousness and has been identified as a key component for the successful control of cognitive states, such as the deliberate direction of attention. This paper proposes a formal model of meta-awareness and attentional control using hierarchical active inference. To do so, we cast mental action as policy selection over higher-level cognitive states and add a further hierarchical level to model meta-awareness states that modulate the expected confidence (precision) in the mapping between observations and hidden cognitive states. We simulate the example of mind-wandering and its regulation during a task involving sustained selective attention on a perceptual object. This provides a computational case study for an inferential architecture that is apt to enable the emergence of these central components of human phenomenology, namely, the ability to access and control cognitive states. We propose that this approach can be generalized to other cognitive states, and hence, this paper provides the first steps towards the development of a computational phenomenology of mental action and more broadly of our ability to monitor and control our own cognitive states. Future steps of this work will focus on fitting the model with qualitative, behavioural, and neural data.
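The model described above casts meta-awareness as a modulator of the expected precision of the mapping between observations and hidden cognitive states. The snippet below is a minimal sketch of that single idea, not the authors' hierarchical active inference model: an illustrative precision parameter (here called gamma) scales how strongly an observation shifts the posterior over two hypothetical cognitive states (focused vs. mind-wandering). All state labels, probabilities, and variable names are assumptions for illustration only.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Illustrative likelihood mapping A[o, s] = P(observation o | cognitive state s).
# States: 0 = focused, 1 = mind-wandering; observations: 0 = "on-task", 1 = "distracted".
A = np.array([[0.8, 0.3],
              [0.2, 0.7]])

def posterior_over_states(obs, prior, gamma):
    """Posterior over hidden cognitive states, with the likelihood's contribution
    scaled by the precision gamma (higher gamma = more confidence in the
    observation-to-state mapping, as a high meta-awareness state would confer)."""
    log_like = gamma * np.log(A[obs] + 1e-16)
    return softmax(log_like + np.log(prior + 1e-16))

prior = np.array([0.5, 0.5])
obs = 1  # a "distracted" observation
for gamma in (0.5, 2.0):  # low vs. high expected precision
    print(f"gamma = {gamma}:", posterior_over_states(obs, prior, gamma))
```

With low precision the same observation barely moves the belief away from the prior; with high precision it strongly favours the mind-wandering state, which is the qualitative effect the abstract attributes to meta-awareness.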


2021 · Vol 12 (1)
Author(s): Amber M. Kline, Destinee A. Aponte, Hiroaki Tsukano, Andrea Giovannucci, Hiroyuki K. Kato

Abstract: Integration of multi-frequency sounds into a unified perceptual object is critical for recognizing syllables in speech. This “feature binding” relies on the precise synchrony of each component’s onset timing, but little is known regarding its neural correlates. We find that multi-frequency sounds prevalent in vocalizations, specifically harmonics, preferentially activate the mouse secondary auditory cortex (A2), whose response deteriorates with shifts in component onset timings. The temporal window for harmonics integration in A2 was broadened by inactivation of somatostatin-expressing interneurons (SOM cells), but not parvalbumin-expressing interneurons (PV cells). Importantly, A2 has functionally connected subnetworks of neurons preferentially encoding harmonic over inharmonic sounds. These subnetworks are stable across days and exist prior to experimental harmonics exposure, suggesting their formation during development. Furthermore, A2 inactivation impairs performance in a discrimination task for coincident harmonics. Together, we propose A2 as a locus for multi-frequency integration, which may form the circuit basis for vocal processing.
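The stimuli at issue are multi-frequency stacks whose components are either harmonically related (integer multiples of a fundamental) or inharmonic, presented with coincident or shifted component onsets. The sketch below is not the authors' stimulus code; it is a rough illustration, with assumed parameter values (a 4 kHz fundamental, 100 ms tones, 20 ms onset steps), of how such stacks could be synthesized.

```python
import numpy as np

def tone_stack(freqs_hz, onsets_ms, dur_s=0.1, fs=44100):
    """Sum of pure tones; each component starts at its own onset (in ms)."""
    n = int(fs * dur_s) + int(fs * max(onsets_ms) / 1000)
    sig = np.zeros(n)
    seg_t = np.arange(int(fs * dur_s)) / fs
    for f, onset in zip(freqs_hz, onsets_ms):
        start = int(fs * onset / 1000)
        sig[start:start + seg_t.size] += np.sin(2 * np.pi * f * seg_t)
    return sig / len(freqs_hz)

f0 = 4000.0                                   # illustrative fundamental (Hz)
harmonic   = [f0, 2 * f0, 3 * f0]             # integer multiples -> harmonic stack
inharmonic = [f0, 2.3 * f0, 3.7 * f0]         # non-integer ratios -> inharmonic stack

coincident = tone_stack(harmonic, onsets_ms=[0, 0, 0])    # synchronous component onsets
shifted    = tone_stack(harmonic, onsets_ms=[0, 20, 40])  # onset asynchrony of the kind that degrades A2 responses
```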


2021
Author(s): Amber M Kline, Destinee A Aponte, Hiroaki Tsukano, Andrea Giovannucci, Hiroyuki K Kato

Integration of multi-frequency sounds into a unified perceptual object is critical for recognizing syllables in speech. This "feature binding" relies on the precise synchrony of each component's onset timing, but little is known regarding its neural correlates. We find that multi-frequency sounds prevalent in vocalizations, specifically harmonics, preferentially activate the mouse secondary auditory cortex (A2), whose response deteriorates with shifts in component onset timings. The temporal window for harmonics integration in A2 was broadened by inactivation of somatostatin-expressing interneurons (SOM cells), but not parvalbumin-expressing interneurons (PV cells). Importantly, A2 has functionally connected subnetworks of neurons encoding harmonic, but not inharmonic sounds. These subnetworks are stable across days and exist prior to experimental harmonics exposure, suggesting their formation during development. Furthermore, A2 inactivation impairs performance in a discrimination task for coincident harmonics. Together, we propose A2 as a locus for harmonic integration, which may form the circuit basis for vocal processing.


2020
Author(s): Aaron Nidiffer, Ramnarayan Ramachandran, Mark Wallace

Our perceptual system is adept at exploiting sensory regularities to better extract information about our environment. One clear example is how the sensory and multisensory systems can use consistency to group sensory features into a perceptual object and to segregate objects from each other and from background noise. Leveraging tenets of object-based attention and multisensory binding, we asked whether this ability scales with the strength of that consistency. We presented participants with amplitude-modulated (AM) auditory and visual streams and asked them to detect embedded orthogonal, near-threshold frequency-modulation (FM) events. We modulated the correlation of the streams by varying the phase of the visual AM. In line with a previous report, we first observed that peak performance was shifted away from 0°. After accounting for this shift, we found that, across participants, discriminability of the FM event improved linearly with correlation. Additionally, we sought to answer a question left open by our previous report: what explains the phase shift? We found that the phase shift correlated with auditory-visual response time differences, but not with the point of subjective simultaneity, suggesting that differences in sensory processing times may account for the observed shift. These results suggest that our perceptual system can bind multisensory features across a spectrum of temporal correlations, a process necessary for multisensory binding in complex environments where unrelated signals may have small errant correlations.
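The key manipulation here is the phase of the visual amplitude modulation, which sets the correlation between the auditory and visual envelopes. The sketch below, using assumed values for sampling rate, duration, and AM rate, illustrates how that envelope correlation falls off as the visual phase offset grows; it is not the authors' stimulus code and omits the embedded FM events.

```python
import numpy as np

fs, dur, am_rate = 1000, 2.0, 5.0            # assumed sample rate (Hz), duration (s), AM rate (Hz)
t = np.arange(int(fs * dur)) / fs

def am_envelope(phase_deg):
    """Sinusoidal amplitude-modulation envelope with a phase offset (degrees)."""
    return 0.5 * (1 + np.sin(2 * np.pi * am_rate * t + np.deg2rad(phase_deg)))

aud = am_envelope(0.0)                        # auditory AM envelope, fixed phase
for phase in (0, 45, 90, 180):
    vis = am_envelope(phase)                  # visual AM envelope, shifted phase
    r = np.corrcoef(aud, vis)[0, 1]           # stream correlation shrinks as the offset grows
    print(f"visual AM phase {phase:3d} deg -> envelope correlation r = {r:+.2f}")
```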


2019
Author(s): Moritz F. Wurm, Katharine B. Porter, Alfonso Caramazza

Abstract: Object identification and enumeration rely on the ability to distinguish, or individuate, objects from the background. Does multiple object individuation operate only over bounded, separable objects, or does it operate equally over connected features within a single object? While previous fMRI experiments suggest that connectedness affects the processing and enumeration of objects, recent behavioral and EEG studies demonstrated that parallel individuation occurs over both object parts and distinct objects. However, it is unclear whether individuation of object parts and of distinct objects relies on common or independent neural mechanisms. Using fMRI-based multivariate pattern analyses, we here demonstrate that activity patterns in the inferior and superior intraparietal sulci (IPS) encode numerosity independently of whether the individuated items are connected parts of a single object or distinct objects. Lateral occipital cortex is more sensitive to perceptual aspects of the two stimulus types and to the targets of the stimuli, suggesting a dissociation between ventral and dorsal areas in representing perceptual object properties and more general information about numerosity, respectively. Our results suggest that objecthood is not a necessary prerequisite for parallel individuation in IPS. Rather, they point toward a common individuation mechanism that selects targets over a flexible object hierarchy, independently of whether the targets are distinct separable objects or parts of a single object.
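The decoding claim rests on fMRI multivariate pattern analysis: a classifier trained on voxel activity patterns that can read out numerosity regardless of stimulus type indicates a shared numerosity code. The sketch below is a generic, simulated-data illustration of that logic (it assumes scikit-learn and invents the voxel data), not the authors' analysis pipeline.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Simulated voxel patterns: 40 trials x 100 voxels; labels are the numerosity shown (2 or 4),
# standing in for patterns pooled over connected-parts and distinct-objects displays.
n_trials, n_voxels = 40, 100
numerosity = np.repeat([2, 4], n_trials // 2)
X = rng.normal(size=(n_trials, n_voxels)) + 0.1 * numerosity[:, None]

clf = LogisticRegression(max_iter=1000)
scores = cross_val_score(clf, X, numerosity, cv=5)   # above-chance accuracy implies decodable numerosity
print("mean cross-validated decoding accuracy:", round(scores.mean(), 2))
```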


2019 · Vol 19 (7) · pp. 12
Author(s): Ning Wei, Tiangang Zhou, Zihao Zhang, Yan Zhuo, Lin Chen

Author(s): Ann-Sophie Barwich

How much does stimulus input shape perception? The common-sense view is that our perceptions are representations of objects and their features, and that the stimulus structures the perceptual object. The problem for this view concerns perceptual biases, which are responsible for distortions and for the subjectivity of perceptual experience. Recent neuroscience increasingly studies these biases as constitutive factors of brain processes. In neural network models, the brain is said to cope with the plethora of sensory information by predicting stimulus regularities on the basis of previous experiences. Drawing on this development, this chapter analyses perceptions as processes. Looking at olfaction as a model system, it argues for the need to abandon a stimulus-centred perspective, in which smells are thought of as stable percepts computationally linked to external objects such as odorous molecules. Perception is instead presented as a measure of changing signal ratios in an environment, informed by expectancy effects from top-down processes.

