multisensory information
Recently Published Documents

TOTAL DOCUMENTS: 118 (five years: 45)
H-INDEX: 18 (five years: 3)

eLife ◽  
2021 ◽  
Vol 10 ◽  
Author(s):  
Mayu Yamada ◽  
Hirono Ohashi ◽  
Koh Hosoda ◽  
Daisuke Kurabayashi ◽  
Shunsuke Shigaki

Most animals survive and thrive thanks to navigational behavior that brings them to their destinations. To navigate, animals must integrate information obtained from multisensory inputs and use it to modulate their behavior. In this study, using a virtual reality (VR) system for an insect, we investigated how the adult silkmoth integrates visual and wind direction information during female search behavior (olfactory behavior). In behavioral experiments with the VR system, the silkmoth had the highest navigational success rate when odor, vision, and wind information were all provided correctly. However, the success rate of the search was reduced when the wind direction information provided differed from the direction actually detected, indicating that it is important to acquire not only odor information but also wind direction information correctly. When the wind arrives from the same direction as the odor, the silkmoth exhibits positive, approach-oriented behavior; if odor is detected but the wind direction does not match, the silkmoth behaves more cautiously. This corresponds to modulating behavior according to the degree of complexity (turbulence) of the environment. We mathematically modeled this multisensory modulation of behavior and evaluated the model in simulation. The model not only reproduced the silkmoth's actual search behavior but also improved search success relative to a conventional odor-source search algorithm.
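The wind–odor coincidence gating described above can be pictured as a small state machine. The sketch below is not the authors' published model; the mode names, the angular tolerance `tol`, and the `angle_diff` helper are illustrative assumptions.

```python
import math

def angle_diff(a, b):
    """Smallest absolute difference between two angles (radians)."""
    d = (a - b + math.pi) % (2 * math.pi) - math.pi
    return abs(d)

def choose_behavior(odor_detected, odor_dir, wind_dir, tol=math.pi / 4):
    """Pick a locomotion mode from odor and wind-direction cues.

    When odor and wind direction agree, commit to an upwind surge;
    when they conflict (turbulent plume), fall back to cautious movement.
    """
    if not odor_detected:
        return "cast"          # no odor: local zigzag search
    if angle_diff(odor_dir, wind_dir) <= tol:
        return "surge"         # cues agree: move decisively upwind
    return "careful_surge"     # cues conflict: advance slowly, turn more

print(choose_behavior(True, 0.0, 0.1))        # surge
print(choose_behavior(True, 0.0, math.pi))    # careful_surge
print(choose_behavior(False, 0.0, 0.0))       # cast
```

Such a gate captures the abstract's central observation: odor alone triggers search, but the commitment to upwind movement is scaled by how well the wind cue corroborates the odor cue.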


2021 ◽  
Author(s):  
Anouk Keizer ◽  
Manja Engel

Anorexia nervosa (AN) is an eating disorder that mainly affects young women. One of its most striking symptoms is a distorted experience of body size and shape: patients are by definition underweight, yet experience and perceive their body as bigger than it really is. This body representation disturbance has fascinated scientists for many decades, producing a rich and diverse literature on the topic. Research shows that AN patients not only think their body is bigger than it is, and visually perceive it as such, but that other sensory modalities also play an important role in oversized body experiences. Patients, for example, have an altered (enlarged) size perception of tactile stimuli and move their body as if it were larger than it actually is. Moreover, patients with AN appear to process and integrate multisensory information differently from healthy individuals, especially in relation to body size. This leads to the conclusion that the brain's representation of body size is enlarged, which has important implications for the treatment of body representation disturbances in AN. Treatment of AN is traditionally very cognitive in nature; it is possible, however, that changed cognitions about body size do not translate into actual changes in the metric representations of body size stored in the brain. Recently, a few studies have found a multisensory treatment approach to be effective against this symptom of AN.


2021 ◽  
Vol 15 ◽  
Author(s):  
Klaudia Grechuta ◽  
Javier De La Torre Costa ◽  
Belén Rubio Ballester ◽  
Paul Verschure

The unique ability to identify one’s own body and experience it as one’s own is fundamental to goal-oriented behavior and survival. However, the mechanisms underlying so-called body ownership are not yet fully understood. Evidence from Rubber Hand Illusion (RHI) paradigms has demonstrated that body ownership is a product of the reception and integration of self- and externally generated multisensory information, feedforward and feedback processing of sensorimotor signals, and prior knowledge about the body. Crucially, however, these designs typically involve the processing of proximal modalities, while the contribution of distal sensory signals to the experience of ownership remains elusive. Here we propose that, like any robust percept, body ownership depends on integration and prediction across all sensory modalities, including distal signals pertaining to the environment. To test this hypothesis, we created an embodied goal-oriented Virtual Air Hockey Task in which participants had to hit a virtual puck into a goal. In two conditions, we manipulated the congruency of distal multisensory cues (auditory and visual) while keeping proximal and action-driven signals entirely predictable. Compared to a fully congruent condition, our results revealed a significant decrease in three dimensions of ownership evaluation when distal signals were incongruent: the subjective report as well as the physiological and kinematic responses to an unexpected threat. Together, these findings support the notion that how we represent our body is contingent upon all sensory stimuli, including distal and action-independent signals. The present data extend the current framework of body ownership and may also find applications in rehabilitation scenarios.


2021 ◽  
Vol 17 (9) ◽  
pp. e1009383
Author(s):  
Roman Goulard ◽  
Cornelia Buehlmann ◽  
Jeremy E. Niven ◽  
Paul Graham ◽  
Barbara Webb

Insects can navigate efficiently in both novel and familiar environments, which requires flexibility in how they are guided by sensory cues. A prominent landmark, for example, can elicit strong innate behaviours (attraction or menotaxis) but can also, after learning, serve as a specific directional cue within a navigation memory. However, the mechanisms that allow both pathways to co-exist, interact, or override each other are largely unknown. Here we propose a model for the behavioural integration of innate and learned guidance based on the neuroanatomy of the central complex (CX), adapted to control landmark-guided behaviours. We consider a reward signal, provided either by an innate attraction to landmarks or by a long-term visual memory in the mushroom bodies (MB), that modulates the formation of a local vector memory in the CX. Using an operant strategy for a simulated agent exploring a simple world containing a single visual cue, we show how the resulting short-term memory can support both innate and learned steering behaviour. In addition, we show that this architecture is consistent with the observed effects of unilateral MB lesions in ants, which cause a reversion to innate behaviour. We suggest that the formation of a directional memory in the CX can be interpreted as transforming rewarding (positive or negative) sensory signals into a map of the environment that describes its geometrical attractiveness (or repulsiveness). We discuss how this scheme might represent an ideal way to combine multisensory information gathered during the exploration of an environment and to support optimal cue integration.
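The core idea of turning signed reward signals into a directional memory can be illustrated with a minimal vector-summation sketch. This is a crude caricature, not the CX circuit model from the abstract: headings experienced while rewarded are summed as reward-weighted unit vectors, and the resultant vector points toward the attractive part of the world.

```python
import math

class VectorMemory:
    """Accumulate reward-weighted headings into a single goal vector.

    Rewarding (positive or negative) signals experienced while facing a
    given heading are summed as weighted unit vectors; the resultant's
    direction is the heading to steer toward (or, if driven by negative
    reward, away from), and its length reflects confidence.
    """
    def __init__(self):
        self.x = 0.0
        self.y = 0.0

    def update(self, heading, reward):
        self.x += reward * math.cos(heading)
        self.y += reward * math.sin(heading)

    def preferred_heading(self):
        return math.atan2(self.y, self.x)

mem = VectorMemory()
mem.update(0.0, 1.0)              # rewarded while facing east
mem.update(math.pi / 2, 1.0)      # rewarded while facing north
print(mem.preferred_heading())    # ~pi/4: steer northeast
```

In this toy form, the same accumulator serves innate guidance (reward supplied by landmark attraction) and learned guidance (reward supplied by an MB-like memory), echoing the shared pathway the model proposes.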


2021 ◽  
Author(s):  
Mayu Yamada ◽  
Hirono Ohashi ◽  
Koh Hosoda ◽  
Daisuke Kurabayashi ◽  
Shunsuke Shigaki

Most animals survive and thrive due to navigation behavior that takes them to their destinations. In order to navigate, it is important for animals to integrate information obtained from multisensory inputs and use it to modulate their behavior. In this study, using a virtual reality (VR) system for an insect, we investigated how an adult silkmoth integrates visual and wind direction information during female search behavior (olfactory behavior). In behavioral experiments using the VR system, the silkmoth had the highest navigation success rate when odor, vision, and wind information were correctly provided. However, we found that the success rate of the search was significantly reduced if the wind direction information provided differed from the direction actually detected. This indicates that it is important to acquire not only odor information but also wind direction information correctly. In other words, behavior was modulated by the degree of coincidence between the direction of arrival of the odor and that of the wind, and posture control (angular velocity control) was modulated by visual information. We mathematically modeled this multisensory modulation of behavior and evaluated it by simulation. As a result, the mathematical model not only reproduced the actual female search behavior of the silkmoth but also improved search success relative to the conventional odor source search algorithm.


2021 ◽  
pp. 1-29
Author(s):  
Jie Wu ◽  
Qitian Li ◽  
Qiufang Fu ◽  
Michael Rose ◽  
Liping Jing

Abstract
Although it has been demonstrated that multisensory information can facilitate object recognition and object memory, it remains unclear whether such a facilitation effect exists in category learning. To address this issue, comparable car images and sounds were first selected via a discrimination task in Experiment 1. These images and sounds were then used in a prototype category learning task in Experiments 2 and 3, in which participants were trained with auditory, visual, and audiovisual stimuli and were tested with trained or untrained stimuli from the same categories, presented alone or accompanied by a congruent or incongruent stimulus in the other modality. In Experiment 2, when low-distortion stimuli (more similar to the prototypes) were trained, accuracy was higher for audiovisual trials than for visual trials, with no significant difference between audiovisual and auditory trials. During testing, accuracy was significantly higher for congruent trials than for unisensory or incongruent trials, and the congruency effect was larger for untrained high-distortion stimuli than for trained low-distortion stimuli. In Experiment 3, when high-distortion stimuli (less similar to the prototypes) were trained, accuracy was higher for audiovisual trials than for visual or auditory trials, and during testing the congruency effect was larger for trained high-distortion stimuli than for untrained low-distortion stimuli. These findings demonstrate that a higher degree of stimulus distortion produces a more robust multisensory effect, and that the categorization of both trained and untrained stimuli in one modality can be influenced by an accompanying stimulus in the other modality.
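The prototype-distortion design behind these experiments can be sketched in a few lines: category exemplars are generated by jittering a prototype, and a stimulus is categorized by its nearest prototype, so higher distortion levels produce exemplars farther from their category center. The feature vectors, category names, and noise model below are illustrative assumptions, not the study's actual stimuli.

```python
import math
import random

def distort(prototype, level, rng):
    """Create an exemplar by adding Gaussian jitter to each feature."""
    return [p + rng.gauss(0.0, level) for p in prototype]

def classify(stimulus, prototypes):
    """Assign a stimulus to the category with the nearest prototype."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(prototypes, key=lambda c: dist(stimulus, prototypes[c]))

rng = random.Random(0)
prototypes = {"car_A": [0.0, 0.0], "car_B": [4.0, 4.0]}
low = distort(prototypes["car_A"], 0.5, rng)    # low distortion: near prototype
high = distort(prototypes["car_A"], 3.0, rng)   # high distortion: far from it
print(classify(low, prototypes))
```

In this toy geometry, high-distortion exemplars lie closer to the category boundary, which is one intuition for why an extra congruent cue in another modality helps them more.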


2021 ◽  
Author(s):  
Tobias Wibble ◽  
Tony Pansell ◽  
Sten Grillner ◽  
Juan Perez-Fernandez

Gaze stabilization compensates for movements of the head or of the external environment to minimize image blur, which is critical for visually guided behaviors. Multisensory information is used to stabilize the visual scene on the retina via the vestibulo-ocular (VOR) and optokinetic (OKR) reflexes. While the organization of the neuronal circuits underlying the VOR is well described across vertebrates, less is known about the contribution and evolutionary origin of the OKR circuits, and the integration of these two sensory modalities is still poorly understood. Here, we developed a novel experimental model, the isolated lamprey eye-brain-labyrinth preparation, to analyze the neuronal pathways underlying visuo-vestibular integration. This preparation allowed electrophysiological recordings during vestibular stimulation with a moving platform, coordinated with visual stimulation via two screens. We show that lampreys exhibit robust visuo-vestibular integration, with optokinetic information processed in the pretectum and integrated with vestibular inputs at several subcortical levels. The enhanced eye movement response to multimodal stimulation favored the vestibular response at higher velocities. The optokinetic signals can be downregulated from the tectum. Additionally, saccades are present in the form of nystagmus. Because the lamprey represents the oldest living group of vertebrates, these results indicate that all basic components of the visuo-vestibular control of gaze were already present at the dawn of vertebrate evolution.
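The complementary action of the two reflexes can be illustrated with a textbook-style gain model (a generic sketch, not the paper's circuit: the gain values and the sequential VOR-then-OKR approximation are assumptions): the VOR counter-rotates the eye against head motion, and the OKR then reduces the residual retinal slip.

```python
def eye_velocity(head_vel, world_vel, vor_gain=0.8, okr_gain=0.5):
    """Combine vestibular and optokinetic drives into one eye command.

    VOR counter-rotates the eye against head movement; OKR then acts on
    the remaining retinal slip (motion of the visual scene relative to
    gaze). Gains are illustrative, not fitted values.
    """
    vor = -vor_gain * head_vel
    retinal_slip = world_vel - (head_vel + vor)   # scene motion minus gaze motion
    okr = okr_gain * retinal_slip
    return vor + okr

# Head rotation in a lit, stationary world: OKR tops up an imperfect VOR.
print(eye_velocity(head_vel=10.0, world_vel=0.0))   # -9.0, closer to the ideal -10 than VOR alone
```

The same structure shows why multimodal stimulation helps most when one reflex alone is insufficient, and why the vestibular term dominates at high velocities if the visual (slip-driven) pathway saturates.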


2021 ◽  
Vol 12 (1) ◽  
Author(s):  
L. Godenzini ◽  
D. Alwis ◽  
R. Guzulaitis ◽  
S. Honnuraiah ◽  
G. J. Stuart ◽  
...  

Abstract
The capacity of the brain to encode multiple types of sensory input is key to survival. Yet how neurons integrate information from multiple sensory pathways, and to what extent this influences behavior, is largely unknown. Using two-photon Ca2+ imaging, optogenetics, and electrophysiology in vivo and in vitro, we report the influence of auditory input on sensory encoding in the somatosensory cortex and show its impact on goal-directed behavior. Monosynaptic input from the auditory cortex enhanced dendritic and somatic encoding of tactile stimulation in layer 2/3 (L2/3), but not layer 5 (L5), pyramidal neurons of forepaw somatosensory cortex (S1). During a tactile-based goal-directed task, auditory input increased dendritic activity and reduced reaction time, an effect abolished by photoinhibition of auditory cortex projections to forepaw S1. Taken together, these results indicate that dendrites of L2/3 pyramidal neurons encode multisensory information, leading to enhanced neuronal output and reduced response latency during goal-directed behavior.


2021 ◽  
Author(s):  
Célian Bimbard ◽  
Timothy PH Sit ◽  
Anna Lebedeva ◽  
Kenneth D Harris ◽  
Matteo Carandini

Sensory cortices are increasingly thought to encode multisensory information. For instance, primary visual cortex (V1) appears to be influenced by sounds. Here we show that sound-evoked responses in mouse V1 are low-dimensional, similar across neurons and across brains, and can be explained by highly stereotyped, uninstructed movements of the eyes and body. Thus, neural activity previously interpreted as sensory or multisensory may instead have a behavioral origin.


2021 ◽  
pp. 267-284
Author(s):  
Jasmine Ho ◽  
Bigna Lenggenhager

The sense of our body is fundamental to human self-consciousness. Many neurological and psychiatric disorders involve atypical corporeal awareness, with heterogeneous symptomatology affecting various aspects of the bodily self. A common dichotomy divides disorders of the bodily self into those predominantly affecting the body schema and those predominantly affecting the body image. Yet increasing evidence suggests that body schema and body image are mutually dependent, making a clear categorization of most disorders difficult. This interdependence is illustrated with examples of a few selected disorders that encompass an atypical sense of the bodily self, with a special focus on the underlying neural alterations in various body-related brain regions. While body schema-related disorders may be linked to a disruption in the integration of multisensory information into a coherent body representation, especially in premotor and posterior parietal areas, body image disturbances, particularly their affective and cognitive aspects, may be linked to a broader network centred around cortical midline structures that are crucially involved in self-referential processes.

