Cross-modal integration of reward value during oculomotor planning

2019 ◽  
Author(s):  
Felicia Pei-Hsin Cheng ◽  
Adem Saglam ◽  
Selina André ◽  
Arezoo Pooresmaeili

Abstract: Reward value guides goal-directed behavior and modulates early sensory processing. Rewarding stimuli are often multisensory, but it is not known how reward value is combined across sensory modalities. Here we show that the integration of reward value critically depends on whether the distinct sensory inputs are perceived to emanate from the same multisensory object. We systematically manipulated the congruency in monetary reward values and the relative spatial positions of co-occurring auditory and visual stimuli that served as bimodal distractors during an oculomotor task. The amount of interference induced by the distractors was used as an indicator of their perceptual salience. Our results across two experiments show that when reward value is linked to each modality separately, the value congruence between vision and audition determines the combined salience of the bimodal distractors. However, the reward value of vision wins over the value of audition if visual and auditory stimuli have been experienced as belonging to the same audiovisual object prior to the learning of the reward values. The perceived spatial alignment of auditory and visual stimuli is a prerequisite for the integration of their reward values, as no effect of reward value was observed when the two modalities were perceived to be misaligned. These results show that in a task that relies heavily on the processing of visual spatial information, the reward values from multiple sensory modalities are integrated with each other, each with their respective weights. This weighting depends on the congruency in reward values, exposure history, and spatial co-localization.


2016 ◽  
Author(s):  
Adrien Peyrache ◽  
Natalie Schieferstein ◽  
Gyorgy Buzsaki

Abstract: Animals integrate multiple sensory inputs to successfully navigate in their environments. Head direction (HD), boundary vector, grid, and place cells in the entorhinal-hippocampal system form the brain's navigational system, which allows the animal's current location to be identified, but how the functions of these specialized neuron types are acquired remains to be understood. Here we report that the activity of HD neurons is influenced by the ambulatory constraints imposed upon the animal by the boundaries of the explored environment, leading to spurious spatial information. However, in the post-subiculum, the main cortical stage of HD signal processing, HD neurons convey true spatial information in the form of border-modulated activity through the integration of additional sensory modalities relative to egocentric position, unlike their driving thalamic inputs. These findings demonstrate how the combination of HD and egocentric information can be transduced into a spatial code.



2021 ◽  
pp. 1-12
Author(s):  
Georg F. Striedter ◽  
R. Glenn Northcutt

Comparative neurobiologists have long wondered when and how the dorsal pallium (e.g., mammalian neocortex) evolved. For the last 50 years, the most widely accepted answer has been that this structure was already present in the earliest vertebrates and, therefore, homologous between the major vertebrate lineages. One challenge for this hypothesis is that the olfactory bulbs project throughout most of the pallium in the most basal vertebrate lineages (notably lampreys, hagfishes, and lungfishes) but do not project to the putative dorsal pallia in teleosts, cartilaginous fishes, and amniotes (i.e., reptiles, birds, and mammals). To make sense of these data, one may hypothesize that a dorsal pallium existed in the earliest vertebrates and received extensive olfactory input, which was subsequently lost in several lineages. However, the dorsal pallium is notoriously difficult to delineate in many vertebrates, and its homology between the various lineages is often based on little more than its topology. Therefore, we suspect that dorsal pallia evolved independently in teleosts, cartilaginous fishes, and amniotes. We further hypothesize that the emergence of these dorsal pallia was accompanied by the phylogenetic restriction of olfactory projections to the pallium and the expansion of inputs from other sensory modalities. We do not deny that the earliest vertebrates may have possessed nonolfactory sensory inputs to some parts of the pallium, but such projections alone do not define a dorsal pallium.



2021 ◽  
Vol 33 (3) ◽  
pp. 506-511
Author(s):  
Sheikh Mohd Saleem ◽  
Chaitnya Aggarwal ◽  
Om Prakash Bera ◽  
Radhika Rana ◽  
Gurmandeep Singh ◽  
...  

"Geographic information system (GIS) collects various kinds of data based on the geographic relationship across space." Data in GIS is stored so that it can be visualized, analyzed, and interpreted to learn about an area, an ongoing project, site planning, business, health economics, and health-related surveys and information. GIS has evolved from ancient disease maps to 3D digital maps and continues to grow even today. The visual-spatial mapping of data has given us insight into different diseases ranging from diarrhea and pneumonia to non-communicable diseases like diabetes mellitus, hypertension, and cardiovascular diseases, and risk factors like obesity and being overweight. Over time, this information has highlighted health-related issues and knowledge about them in a contemporary manner worldwide. Researchers, scientists, and administrators use GIS for research project planning, execution, and disease management. Cases of diseases in a specific area or region, the number of hospitals, roads, waterways, and health catchment areas are examples of spatially referenced data that can be captured and easily presented using GIS. Currently, we are facing an epidemic of non-communicable diseases, and a powerful tool like GIS can be used efficiently in such a situation. GIS can provide a powerful and robust framework for effectively monitoring and identifying the leading causes behind such diseases. GIS, which provides a spatial viewpoint regarding the disease spectrum, pattern, and distribution, is of particular importance in this area and helps better understand disease transmission dynamics and spatial determinants. The use of GIS in public health will be a practical approach for surveillance, monitoring, planning, optimization, and service delivery of health resources to the people at large. The GIS platform can link environmental and spatial information with the disease itself, which makes it an asset in disease control programs all over the globe.



Author(s):  
Silvia-Raluca Matei ◽  
Damian Mircea Totolan ◽  
Claudia Salceanu

Occupational therapy focuses on children's sensory processing and modulation. This chapter approaches specific interventions for children with ASD from several perspectives. OT is based on a sensory integrative approach when working with children with ASD: helping parents understand their child's behavior and helping children organize responses to sensory input. The sensory integrative approach is a formulated activity plan that helps people who have not been able to develop their own sensory recognition program. This plan allows a child to integrate many different sensory activities into their day so they can engage with and begin to work with a wide variety of sensory inputs. This provides a number of benefits. The child's focus and attention span increase because they are no longer overwhelmed by trying to process too much information, and the sensory integrative approach helps to rebuild and reform the child's nervous system, allowing them to physically handle more sensory input. As a result, OT has proven effective in working with children with ASD.



2020 ◽  
Vol 146 ◽  
pp. 107530
Author(s):  
Justin T. Fleming ◽  
Abigail L. Noyce ◽  
Barbara G. Shinn-Cunningham


2006 ◽  
Vol 96 (2) ◽  
pp. 813-825 ◽  
Author(s):  
Yoram Gutfreund ◽  
Eric I. Knudsen

Auditory neurons in the owl’s external nucleus of the inferior colliculus (ICX) integrate information across frequency channels to create a map of auditory space. This study describes a powerful, sound-driven adaptation of unit responsiveness in the ICX and explores the implications of this adaptation for sensory processing. Adaptation in the ICX was analyzed by presenting lightly anesthetized owls with sequential pairs of dichotic noise bursts. Adaptation occurred even in response to weak, threshold-level sounds and remained strong for more than 100 ms after stimulus offset. Stimulation by one range of sound frequencies caused adaptation that generalized across the entire broad range of frequencies to which these units responded. Identical stimuli were used to test adaptation in the lateral shell of the central nucleus of the inferior colliculus (ICCls), which provides input directly to the ICX. Compared with ICX adaptation, adaptation in the ICCls was substantially weaker, shorter lasting, and far more frequency specific, suggesting that part of the adaptation observed in the ICX was attributable to processes resident to the ICX. The sharp tuning of ICX neurons to space, along with their broad tuning to frequency, allows ICX adaptation to preserve a representation of stimulus location, regardless of the frequency content of the sound. The ICX is known to be a site of visually guided auditory map plasticity. ICX adaptation could play a role in this cross-modal plasticity by providing a short-term memory of the representation of auditory localization cues that could be compared with later-arriving, visual–spatial information from bimodal stimuli.



1997 ◽  
Vol 8 (3) ◽  
pp. 224-230 ◽  
Author(s):  
Rick O. Gilmore ◽  
Mark H. Johnson

The extent to which infants combine visual (i.e., retinal position) and nonvisual (eye or head position) spatial information in planning saccades relates to the issue of what spatial frame or frames of reference influence early visually guided action. We explored this question by testing infants from 4 to 6 months of age on the double-step saccade paradigm, which has shown that adults combine visual and eye position information into an egocentric (head- or trunk-centered) representation of saccade target locations. In contrast, our results imply that infants depend on a simple retinocentric representation at age 4 months, but by 6 months use egocentric representations more often to control saccade planning. Shifts in the representation of visual space for this simple sensorimotor behavior may index maturation in cortical circuitry devoted to visual spatial processing in general.



2008 ◽  
Vol 1230 ◽  
pp. 158-167 ◽  
Author(s):  
Günther Lehnert ◽  
Hubert D. Zimmer


2020 ◽  
Vol 34 (06) ◽  
pp. 10369-10376
Author(s):  
Peng Gao ◽  
Hao Zhang

Loop closure detection is a fundamental problem for simultaneous localization and mapping (SLAM) in robotics. Most previous methods consider only one type of information, based on either visual appearances or spatial relationships of landmarks. In this paper, we introduce a novel visual-spatial information preserving multi-order graph matching approach for long-term loop closure detection. Our approach constructs a graph representation of a place from an input image to integrate visual-spatial information, including visual appearances of the landmarks and the background environment, as well as the second- and third-order spatial relationships between two and three landmarks, respectively. Furthermore, we formulate loop closure detection as a multi-order graph matching problem that computes a similarity score directly from the graph representations of the query and template images, instead of performing conventional vector-based image matching. We evaluate the proposed multi-order graph matching approach on two public long-term loop closure detection benchmark datasets, the St. Lucia and CMU-VL datasets. Experimental results show that our approach is effective for long-term loop closure detection and outperforms previous state-of-the-art methods.
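The multi-order idea above can be illustrated with a minimal sketch. This is not the authors' implementation: the function names, the cosine term for first-order appearance similarity, the distance- and angle-consistency terms, and the weights are all assumptions chosen for illustration, and the sketch presumes landmark correspondences between the two places are already given (the paper's contribution is solving the matching itself).

```python
# Illustrative sketch of a multi-order place-similarity score:
# first order = landmark appearance, second order = pairwise distances,
# third order = angles among landmark triplets. Assumes matched landmarks.
import math
from itertools import combinations

def first_order(feats_a, feats_b):
    """Mean cosine similarity between matched landmark feature vectors."""
    def cos(u, v):
        dot = sum(x * y for x, y in zip(u, v))
        nu = math.sqrt(sum(x * x for x in u))
        nv = math.sqrt(sum(x * x for x in v))
        return dot / (nu * nv) if nu and nv else 0.0
    return sum(cos(u, v) for u, v in zip(feats_a, feats_b)) / len(feats_a)

def second_order(pos_a, pos_b):
    """Consistency of pairwise landmark distances between the two places."""
    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])
    scores = []
    for i, j in combinations(range(len(pos_a)), 2):
        da, db = dist(pos_a[i], pos_a[j]), dist(pos_b[i], pos_b[j])
        scores.append(1.0 - abs(da - db) / max(da, db, 1e-9))
    return sum(scores) / len(scores)

def third_order(pos_a, pos_b):
    """Consistency of the angle at the middle landmark of each triplet."""
    def angle(p, q, r):  # angle at vertex q
        a1 = math.atan2(p[1] - q[1], p[0] - q[0])
        a2 = math.atan2(r[1] - q[1], r[0] - q[0])
        return abs((a1 - a2 + math.pi) % (2 * math.pi) - math.pi)
    scores = []
    for i, j, k in combinations(range(len(pos_a)), 3):
        ga = angle(pos_a[i], pos_a[j], pos_a[k])
        gb = angle(pos_b[i], pos_b[j], pos_b[k])
        scores.append(1.0 - abs(ga - gb) / math.pi)
    return sum(scores) / len(scores)

def place_similarity(feats_a, pos_a, feats_b, pos_b, w=(0.4, 0.3, 0.3)):
    """Weighted combination of the three orders; higher suggests a loop closure."""
    return (w[0] * first_order(feats_a, feats_b)
            + w[1] * second_order(pos_a, pos_b)
            + w[2] * third_order(pos_a, pos_b))
```

A query place is accepted as a loop closure when its score against a template exceeds a threshold; combining appearance with geometric consistency is what makes the score robust to long-term appearance change.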


