Using Multisensory Information About the Success of a Baseball Swing

2008 ◽  
Author(s):  
Rob Gray


2021 ◽  
Vol 11 (4) ◽  
pp. 518
Author(s):  
Sara De Angelis ◽  
Alessandro Antonio Princi ◽  
Fulvio Dal Farra ◽  
Giovanni Morone ◽  
Carlo Caltagirone ◽  
...  

Postural instability and fear of falling are two major causes of reduced mobility and quality of life in cerebrovascular and neurological diseases. In recent years, rehabilitation strategies have combined sensorimotor interventions with the active involvement of patients during rehabilitation sessions. Accordingly, new technological devices and paradigms have been developed to increase the effectiveness of rehabilitation by integrating multisensory information and augmented feedback, promoting the patient's cognitive engagement in neurorehabilitation. In this context, vibrotactile feedback (VF) could serve as a peripheral therapeutic input that provides spatial proprioceptive information to guide the patient during task-oriented exercises. The present systematic review and meta-analysis aimed to explore the effectiveness of VF for balance and gait rehabilitation in patients with neurological and cerebrovascular diseases. A total of 18 studies met the inclusion criteria and were included. Owing to the lack of high-quality studies and the heterogeneity of treatment protocols, clinical practice recommendations on the efficacy of VF cannot be made. The results show that VF-based interventions could be a safe complementary sensorimotor approach for balance and gait rehabilitation in patients with neurological and cerebrovascular diseases. More high-quality randomized controlled trials are needed.


2021 ◽  
Author(s):  
Mayu Yamada ◽  
Hirono Ohashi ◽  
Koh Hosoda ◽  
Daisuke Kurabayashi ◽  
Shunsuke Shigaki

Most animals survive and thrive thanks to navigation behavior that takes them to their destinations. To navigate, animals must integrate information obtained from multiple sensory inputs and use that information to modulate their behavior. In this study, using a virtual reality (VR) system for an insect, we investigated how an adult silkmoth integrates visual and wind direction information during female search behavior (olfactory behavior). In behavioral experiments with the VR system, the silkmoth had the highest navigation success rate when odor, vision, and wind information were provided correctly. However, the search success rate was significantly reduced when the wind direction information provided was inconsistent with the direction actually detected. This indicates that it is important to correctly acquire not only odor information but also wind direction information. Specifically, behavior was modulated by the degree of coincidence between the arrival direction of the odor and that of the wind, whereas posture control (angular velocity control) was modulated by visual information. We mathematically modeled this multisensory modulation of behavior and evaluated the model by simulation. The mathematical model not only reproduced the actual female search behavior of the silkmoth but also improved search success relative to a conventional odor-source search algorithm.
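
The behavior-modulation idea summarized above (gate the upwind surge by how well the odor-implied wind direction agrees with the perceived wind direction) can be illustrated with a toy simulation. The Python sketch below is an assumption-laden illustration of that idea, not the authors' actual model: the plume model, gains, noise levels, and start position are all invented for the example.

```python
# A minimal illustrative sketch (not the authors' model): a surge-and-cast odor
# search in which the upwind-surge gain is scaled by how well the odor-implied
# upwind direction agrees with the perceived wind direction.
import numpy as np

rng = np.random.default_rng(1)

SOURCE = np.array([0.0, 0.0])          # odor source at the origin
WIND = np.array([1.0, 0.0])            # true wind blows toward +x; the plume extends downwind (+x)
DT, SPEED = 0.1, 0.5                   # time step and walking speed (arbitrary units)

def odor_detected(pos):
    """Stochastic odor detection, likely only inside a narrow downwind cone."""
    rel = pos - SOURCE
    dist = np.linalg.norm(rel) + 1e-9
    in_cone = (rel @ WIND) / dist > 0.9            # within ~25 degrees of the plume axis
    return in_cone and rng.random() < np.exp(-dist / 5.0)

def step(pos, heading, perceived_upwind):
    """One control step: surge upwind on an odor hit, cast (random turn) otherwise."""
    if odor_detected(pos):
        true_upwind = -WIND / np.linalg.norm(WIND)
        # Agreement between the odor-implied upwind direction and the perceived
        # one gates the surge, mimicking behavior modulation by cue coincidence.
        coincidence = max(0.0, true_upwind @ perceived_upwind)
        target = np.arctan2(perceived_upwind[1], perceived_upwind[0])
        turn = np.angle(np.exp(1j * (target - heading)))    # shortest signed angle to target
        heading += coincidence * 0.8 * turn
    else:
        heading += rng.normal(0.0, 0.6)
    pos = pos + SPEED * DT * np.array([np.cos(heading), np.sin(heading)])
    return pos, heading

def run_trial(wind_error_deg=0.0, max_steps=4000):
    """Return True if the agent reaches the source given a (possibly corrupted) wind cue."""
    err = np.deg2rad(wind_error_deg)
    rot = np.array([[np.cos(err), -np.sin(err)], [np.sin(err), np.cos(err)]])
    perceived_upwind = rot @ (-WIND)
    pos, heading = np.array([8.0, 0.5]), np.pi           # start downwind of the source
    for _ in range(max_steps):
        pos, heading = step(pos, heading, perceived_upwind)
        if np.linalg.norm(pos - SOURCE) < 0.3:
            return True
    return False

for error in (0, 90, 180):
    wins = sum(run_trial(error) for _ in range(50))
    print(f"wind-cue error {error:3d} deg: {wins}/50 searches reached the source")
```

With a correct wind cue the agent surges upwind on each odor hit; when the wind cue is rotated, the coincidence term suppresses the surge and success drops, qualitatively matching the reported effect of incorrect wind direction information.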


Author(s):  
Marilyn C. Salzman ◽  
Chris Dede ◽  
R. Bowen Loftin ◽  
Debra Sprague

Understanding how to leverage the features of immersive, three-dimensional (3-D) multisensory virtual reality to meet user needs presents a challenge for human factors researchers. This paper describes our approach to evaluating this medium's potential as a tool for teaching abstract science. It describes some of our early research outcomes and discusses an evaluation comparing a 3-D VR microworld to an alternative 2-D computer-based microworld. Both are simulations in which students learn about electrostatics. The outcomes of the comparison study suggest that: 1) the immersive 3-D VR microworld facilitated conceptual and three-dimensional learning that the 2-D computer microworld did not, and 2) VR's multisensory information aided students who found the electrostatics concepts challenging. As a whole, our research suggests that VR's immersive representational abilities hold promise for teaching and for visualization. It also demonstrates that characteristics of the learning experience such as usability, motivation, and simulator sickness are an important part of assessing this medium's potential.


2013 ◽  
Vol 26 (4) ◽  
pp. 347-370 ◽  
Author(s):  
Marine Taffou ◽  
Rachid Guerchouche ◽  
George Drettakis ◽  
Isabelle Viaud-Delmon

In a natural environment, affective information is perceived via multiple senses, mostly audition and vision. However, the impact of multisensory information on affect remains relatively unexplored. In this study, we investigated whether auditory–visual presentation of aversive stimuli influences the experience of fear. We used the advantages of virtual reality to manipulate multisensory presentation and to display potentially fearful dog stimuli embedded in a natural context. We manipulated the affective reactions evoked by the dog stimuli by recruiting two groups of participants: dog-fearful and non-fearful participants. Sensitivity to dog fear was assessed psychometrically by questionnaire and also at the behavioral and subjective levels using a Behavioral Avoidance Test (BAT). Participants navigated virtual environments in which they encountered virtual dog stimuli presented through the auditory channel, the visual channel, or both. They were asked to report their fear using Subjective Units of Distress. We compared fear for unimodal (visual or auditory) and bimodal (auditory–visual) dog stimuli. Dog-fearful as well as non-fearful participants reported more fear in response to bimodal audiovisual presentation than to unimodal presentation of dog stimuli. These results suggest that fear is more intense when affective information is processed via multiple sensory pathways, which might be due to cross-modal potentiation. Our findings have implications for the field of virtual reality-based therapy for phobias. Therapies could be refined and improved by manipulating the multisensory presentation of the feared situations.


1985 ◽  
pp. 125-151 ◽  
Author(s):  
C. I. Howarth ◽  
W. D. A. Beggs

2020 ◽  
Vol 30 (8) ◽  
pp. 4410-4423
Author(s):  
You Li ◽  
Carol Seger ◽  
Qi Chen ◽  
Lei Mo

Abstract Humans are able to categorize things they encounter in the world (e.g., a cat) by integrating multisensory information from the auditory and visual modalities with ease and speed. However, how the brain learns multisensory categories remains elusive. The present study used functional magnetic resonance imaging to investigate, for the first time, the neural mechanisms underpinning multisensory information-integration (II) category learning. A sensory-modality-general network, including the left insula, right inferior frontal gyrus (IFG), supplementary motor area, left precentral gyrus, bilateral parietal cortex, and right caudate and globus pallidus, was recruited for II categorization, regardless of whether the information came from a single modality or from multiple modalities. Putamen activity was higher in correct categorization than incorrect categorization. Critically, the left IFG and left body and tail of the caudate were activated in multisensory II categorization but not in unisensory II categorization, which suggests this network plays a specific role in integrating multisensory information during category learning. The present results extend our understanding of the role of the left IFG in multisensory processing from the linguistic domain to a broader role in audiovisual learning.
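
For readers unfamiliar with the information-integration (II) paradigm referenced above: in an II category structure, the optimal boundary combines both stimulus dimensions predecisionally, so no single-dimension rule performs well. The sketch below illustrates such a structure with two invented audiovisual dimensions; the dimensions and distribution parameters are illustrative assumptions, not the stimuli used in the study.

```python
# Illustrative information-integration (II) category structure: the optimal
# boundary combines an auditory and a visual dimension, so a rule on either
# dimension alone underperforms. Dimensions and parameters are made up.
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Two category clusters elongated along the diagonal; the optimal boundary is
# "auditory value > visual value", which integrates both dimensions.
cov = [[100, 90], [90, 100]]
cat_a = rng.multivariate_normal([40, 60], cov, n)   # e.g., (tone pitch, bar tilt)
cat_b = rng.multivariate_normal([60, 40], cov, n)
X = np.vstack([cat_a, cat_b])
y = np.array([0] * n + [1] * n)                      # 0 = category A, 1 = category B

rule_based  = (X[:, 0] > 50).astype(int)             # verbalizable single-dimension rule
integration = (X[:, 0] > X[:, 1]).astype(int)        # predecisional integration of both cues

print(f"single-dimension rule accuracy: {(rule_based  == y).mean():.2f}")   # roughly 0.84 here
print(f"integration boundary accuracy:  {(integration == y).mean():.2f}")   # close to 1.00 here
```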


1994 ◽  
Vol 71 (1) ◽  
pp. 429-432 ◽  
Author(s):  
M. T. Wallace ◽  
B. E. Stein

1. The synthesis of information from different sensory modalities in the superior colliculus is an important precursor of attentive and orientation behavior. 2. This integration of multisensory information is critically dependent on inputs from a small area of association cortex, the anterior ectosylvian sulcus. Removal of these corticotectal influences can have a remarkably specific effect: it can eliminate multisensory integration in superior colliculus neurons while leaving their responses to unimodal cues intact. 3. Apparently, some of the associative functions of cortex are accomplished via its target neurons in the midbrain.
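
The integration effect described here is commonly quantified in the superior colliculus literature with a multisensory enhancement index: the percentage gain of the combined-cue response over the best unimodal response. The following is a minimal sketch of that index with invented firing rates, not data from this study.

```python
# Multisensory enhancement index as commonly used in the superior colliculus
# literature: percent change of the combined-cue response relative to the best
# unimodal response. The firing rates below are made-up illustrative values.

def enhancement_index(visual, auditory, combined):
    """Percent multisensory enhancement relative to the best single-cue response."""
    best_unimodal = max(visual, auditory)
    return 100.0 * (combined - best_unimodal) / best_unimodal

# Hypothetical SC neuron before vs. after removal of corticotectal (AES) input:
# unimodal responses stay similar, but the combined response loses its boost.
print(enhancement_index(visual=6.0, auditory=4.0, combined=14.0))  # intact: ~133% enhancement
print(enhancement_index(visual=6.0, auditory=4.0, combined=6.5))   # AES removed: ~8% enhancement
```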

