Analysis of Multisensory-Motor Integration in Olfactory Navigation of Silkmoth, Bombyx mori, using Virtual Reality System

2021 ◽  
Author(s):  
Mayu Yamada ◽  
Hirono Ohashi ◽  
Koh Hosoda ◽  
Daisuke Kurabayashi ◽  
Shunsuke Shigaki

Most animals survive and thrive due to navigation behavior to reach their destinations. In order to navigate, it is important for animals to integrate information obtained from multisensory inputs and use that information to modulate their behavior. In this study, by using a virtual reality (VR) system for an insect, we investigated how an adult silkmoth integrates visual and wind direction information during female search behavior (olfactory behavior). According to the behavioral experiments using the VR system, the silkmoth had the highest navigation success rate when odor, vision, and wind information were provided correctly. However, we found that the success rate of the search was significantly reduced if the wind direction information provided differed from the direction actually detected. This indicates that it is important to acquire not only odor information but also wind direction information correctly. In other words, behavior was modulated by the degree of coincidence between the direction of arrival of the odor and the direction of arrival of the wind, and posture control (angular velocity control) was modulated by visual information. We mathematically modeled this modulation of behavior by multisensory information and evaluated it by simulation. As a result, the mathematical model not only succeeded in reproducing the actual female search behavior of the silkmoth but also improved search success relative to the conventional odor source search algorithm.

eLife ◽  
2021 ◽  
Vol 10 ◽  
Author(s):  
Mayu Yamada ◽  
Hirono Ohashi ◽  
Koh Hosoda ◽  
Daisuke Kurabayashi ◽  
Shunsuke Shigaki

Most animals survive and thrive due to navigational behavior to reach their destinations. In order to navigate, it is important for animals to integrate information obtained from multisensory inputs and use that information to modulate their behavior. In this study, by using a virtual reality (VR) system for an insect, we investigated how the adult silkmoth integrates visual and wind direction information during female search behavior (olfactory behavior). According to the behavioral experiments using the VR system, the silkmoth had the highest navigational success rate when odor, vision, and wind information were provided correctly. However, the success rate of the search was reduced if the wind direction information provided differed from the direction actually detected. This indicates that it is important to acquire not only odor information but also wind direction information correctly. When the wind arrives from the same direction as the odor, the silkmoth searches assertively; if odor is detected but the wind does not arrive from the same direction, the silkmoth behaves more cautiously. This corresponds to a modulation of behavior according to the degree of complexity (turbulence) of the environment. We mathematically modeled this modulation of behavior by multisensory information and evaluated it using simulations. The mathematical model not only succeeded in reproducing the actual silkmoth search behavior but also improved search success relative to the conventional odor-source search algorithm.
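
The abstract does not give the model's equations, but the modulation it describes can be illustrated with a minimal sketch of the classic silkmoth surge-zigzag-loop search strategy, in which a hypothetical agreement term between the odor and wind arrival directions scales the turning commands. The function name, constants, and the form of the gain below are illustrative assumptions, not the authors' implementation.

import math

def silkmoth_step(odor_hit, odor_dir, wind_dir, t_since_hit):
    """One control step of a surge-zigzag-loop odor searcher.

    odor_dir and wind_dir are moth-centric arrival bearings in radians.
    Hypothetical modulation: when odor and wind arrive from the same
    direction the moth surges assertively; when they conflict, turning
    widens (more cautious casting), as the abstract describes.
    Returns (forward speed, angular velocity).
    """
    # Agreement in [0, 1]: 1 when odor and wind share an arrival direction.
    agreement = 0.5 * (1.0 + math.cos(odor_dir - wind_dir))
    turn_gain = 2.0 - agreement  # assumed form: conflict -> wider turns

    if odor_hit:
        return 1.0, 0.0                          # surge: straight upwind
    if t_since_hit < 2.0:                        # zigzag: alternating turns
        sign = 1.0 if int(t_since_hit * 2) % 2 == 0 else -1.0
        return 0.5, sign * turn_gain
    return 0.3, 1.5 * turn_gain                  # loop: persistent rotation

In a full simulation loop, t_since_hit would be reset on every odor detection, reproducing the programmed behavior sequence; a conventional, unmodulated searcher would run the same sequence with turn_gain held fixed at 1.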


2020 ◽  
Vol 5 (1) ◽  
pp. 40-47
Author(s):  
Ning Sa ◽  
Xiaojun (Jenny) Yuan

With the development of mobile technologies, voice search is becoming increasingly important in our daily lives. By investigating the general usage of voice search and user perception of voice search systems, this research aims to understand users' voice search behavior. We are particularly interested in how users perform voice search, their topics of interest, and their preference toward voice search. We elicited users' opinions by asking them to fill out an online survey. Results indicated that participants liked voice search because it was convenient. However, voice search was used much less frequently than keyboard search. The success rate of voice search was low, and participants usually gave up on voice search or switched to keyboard search. They tended to perform voice search when driving or walking, and mainly used it for simple tasks on mobile devices. The main reasons participants disliked voice search were system mistakes and the inability to modify queries.


2020 ◽  
Author(s):  
David Harris ◽  
Mark Wilson ◽  
Tim Holmes ◽  
Toby de Burgh ◽  
Samuel James Vine

Head-mounted eye tracking has been fundamental for developing an understanding of sporting expertise, as the way in which performers sample visual information from the environment is a major determinant of successful performance. There is, however, a long-running tension between the desire to study realistic, in-situ gaze behaviour and the difficulties of acquiring accurate ocular measurements in dynamic and fast-moving sporting tasks. Here, we describe how immersive technologies, such as virtual reality, offer an increasingly compelling approach for conducting eye movement research in sport. The possibility of studying gaze behaviour in representative and realistic environments, but with high levels of experimental control, could enable significant strides forward for eye tracking in sport and improve understanding of how eye movements underpin sporting skills. By providing a rationale for virtual reality as an optimal environment for eye tracking research, as well as outlining practical considerations related to hardware, software and data analysis, we hope to guide researchers and practitioners in the use of this approach.


Author(s):  
Ying Zhang ◽  
Xu Hao ◽  
Kelu Hou ◽  
Lei Hu ◽  
Jingyuan Shang ◽  
...  

Aims: To assess the impact of cytochrome P450 (CYP) 2C19 polymorphisms on the clinical efficacy and safety of voriconazole. Methods: We systematically searched PubMed, EMBASE, CENTRAL, ClinicalTrials.gov, and three Chinese databases from their inception to March 18, 2021, using a predefined search algorithm to identify relevant studies. Studies that reported on voriconazole-treated patients and included information on CYP2C19 polymorphisms were eligible. The efficacy outcome was success rate. The safety outcomes included overall adverse events, hepatotoxicity, and neurotoxicity. Results: A total of 20 studies were included. Intermediate metabolizers (IMs) and poor metabolizers (PMs) were associated with increased success rates compared with normal metabolizers (NMs) (risk ratio (RR): 1.18, 95% confidence interval (CI): 1.03–1.34, I² = 0%, p = 0.02; RR: 1.28, 95% CI: 1.06–1.54, I² = 0%, p = 0.01). PMs were at increased risk of overall adverse events in comparison with NMs and IMs (RR: 2.18, 95% CI: 1.35–3.53, I² = 0%, p = 0.001; RR: 1.80, 95% CI: 1.23–2.64, I² = 0%, p = 0.003). PMs showed a trend toward an increased incidence of hepatotoxicity compared with NMs (RR: 1.60, 95% CI: 0.94–2.74, I² = 27%, p = 0.08), although the difference was not statistically significant. In addition, there was no significant association between CYP2C19 polymorphisms and neurotoxicity. Conclusions: IMs and PMs had a significantly higher success rate than NMs. PMs were significantly associated with an increased incidence of overall adverse events compared with NMs and IMs. Further research is expected to confirm these findings. Additionally, the relationship between hepatotoxicity and CYP2C19 polymorphisms deserves clinical attention.
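
For readers unfamiliar with the statistics quoted above (RR, 95% CI, I²), the sketch below shows how a fixed-effect, inverse-variance pooled risk ratio and the Cochran's Q / I² heterogeneity measures are computed from per-study 2×2 counts. The counts in the example are placeholders for illustration, not data from this review.

import math

def pool_risk_ratios(studies):
    """Fixed-effect inverse-variance pooling of risk ratios.

    Each study is (events_exposed, n_exposed, events_control, n_control).
    """
    log_rrs, weights = [], []
    for e1, n1, e0, n0 in studies:
        log_rr = math.log((e1 / n1) / (e0 / n0))
        var = 1 / e1 - 1 / n1 + 1 / e0 - 1 / n0   # variance of log RR
        log_rrs.append(log_rr)
        weights.append(1.0 / var)

    w_sum = sum(weights)
    pooled = sum(w * x for w, x in zip(weights, log_rrs)) / w_sum
    se = math.sqrt(1.0 / w_sum)
    ci = (math.exp(pooled - 1.96 * se), math.exp(pooled + 1.96 * se))

    # Cochran's Q and I-squared for between-study heterogeneity
    q = sum(w * (x - pooled) ** 2 for w, x in zip(weights, log_rrs))
    df = len(studies) - 1
    i2 = max(0.0, 100.0 * (q - df) / q) if q > 0 else 0.0
    return math.exp(pooled), ci, i2

# Placeholder counts: three hypothetical studies comparing PMs vs. NMs.
rr, ci, i2 = pool_risk_ratios([(12, 30, 20, 80), (8, 25, 15, 70), (10, 28, 18, 75)])
print(f"RR = {rr:.2f}, 95% CI {ci[0]:.2f}-{ci[1]:.2f}, I^2 = {i2:.0f}%")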


2021 ◽  
Vol 2 ◽  
Author(s):  
Thirsa Huisman ◽  
Axel Ahrens ◽  
Ewen MacDonald

To reproduce realistic audio-visual scenarios in the laboratory, Ambisonics is often used to reproduce a sound field over loudspeakers, and virtual reality (VR) glasses are used to present visual information. Both technologies have been shown to be suitable for research. However, the combination of the two, Ambisonics and VR glasses, might affect the spatial cues for auditory localization and thus the localization percept. Here, we investigated how VR glasses affect the localization of virtual sound sources on the horizontal plane produced using either 1st-, 3rd-, 5th- or 11th-order Ambisonics, with and without visual information. Results showed that with 1st-order Ambisonics the localization error is larger than with the higher orders, while the differences across the higher orders were small. The physical presence of the VR glasses, a head-mounted display (HMD), without visual information increased the perceived lateralization of the auditory stimuli by on average about 2°, especially in the right hemisphere. Presenting visual information about the environment and potential sound sources reduced this HMD-induced shift; however, it could not fully compensate for it. While localization performance itself was affected by the Ambisonics order, there was no interaction between the Ambisonics order and the effect of the HMD. Thus, the presence of VR glasses can alter acoustic localization when using Ambisonics sound reproduction, but visual information can compensate for most of the effect. As such, most use cases for VR will be unaffected by these shifts in the perceived location of the auditory stimuli.
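
The order effect reported above can be given some intuition with a simple sketch of horizontal Ambisonics panning: encoding a plane wave into circular harmonics and decoding it to a uniform loudspeaker ring with a basic sampling decoder. Higher orders produce a narrower panning lobe, one reason 1st-order localization tends to be blurrier. The normalization and decoder below are simplified assumptions, not the study's reproduction setup.

import numpy as np

def encode_2d(azimuth, order):
    """Horizontal Ambisonics encoding of a plane wave at `azimuth` (rad).
    Channel ordering [W, cos(1a), sin(1a), ...]; normalization simplified."""
    ch = [1.0]
    for m in range(1, order + 1):
        ch += [np.cos(m * azimuth), np.sin(m * azimuth)]
    return np.array(ch)

def sampling_decoder_gains(src_azimuth, speaker_azimuths, order):
    """Project the encoded source onto each speaker direction (a basic
    'sampling' decoder for a uniform circular array)."""
    src = encode_2d(src_azimuth, order)
    gains = np.array([encode_2d(phi, order) @ src for phi in speaker_azimuths])
    return gains / len(speaker_azimuths)

# 24 speakers on a ring (enough for order 11); source at 30 degrees.
ring = np.linspace(0.0, 2.0 * np.pi, 24, endpoint=False)
for order in (1, 3, 5, 11):
    g = sampling_decoder_gains(np.deg2rad(30.0), ring, order)
    width = np.sum(np.abs(g) > 0.5 * np.abs(g).max())  # crude lobe width
    print(f"order {order:2d}: {width} speakers above half of peak gain")

With this crude measure, the 1st-order lobe spreads energy over roughly half the ring, while 11th order concentrates it on a few speakers, consistent with the intuition that low-order reproduction blurs spatial cues.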


Neurology ◽  
2018 ◽  
Vol 90 (11) ◽  
pp. e977-e984 ◽  
Author(s):  
Motoyasu Honma ◽  
Yuri Masaoka ◽  
Takeshi Kuroda ◽  
Akinori Futamura ◽  
Azusa Shiromaru ◽  
...  

Objective: To determine whether Parkinson disease (PD) affects cross-modal function of vision and olfaction, because PD is known to impair various cognitive functions, including olfaction. Methods: We conducted behavioral experiments to identify the influence of PD on cross-modal function by contrasting patient performance with that of age-matched normal controls (NCs). We assessed visual effects on odor strength and preference by manipulating semantic connections between picture/odorant pairs. In addition, we used brain imaging to identify the role of striatal presynaptic dopamine transporter (DaT) deficits. Results: We found that odor evaluation in participants with PD was unaffected by visual information, whereas NCs overestimated smell when sniffing an odorless liquid while viewing pleasant/unpleasant visual cues. Furthermore, the DaT deficit in the striatum, in the posterior putamen in particular, correlated with the reduced visual effects in participants with PD. Conclusions: These findings suggest that PD impairs cross-modal function of vision and olfaction as a result of a posterior putamen deficit. This cross-modal dysfunction may serve as the basis of a novel precursor assessment of PD.


2013 ◽  
Vol 26 (4) ◽  
pp. 347-370 ◽  
Author(s):  
Marine Taffou ◽  
Rachid Guerchouche ◽  
George Drettakis ◽  
Isabelle Viaud-Delmon

In a natural environment, affective information is perceived via multiple senses, mostly audition and vision. However, the impact of multisensory information on affect remains relatively unexplored. In this study, we investigated whether the auditory–visual presentation of aversive stimuli influences the experience of fear. We used the advantages of virtual reality to manipulate multisensory presentation and to display potentially fearful dog stimuli embedded in a natural context. We manipulated the affective reactions evoked by the dog stimuli by recruiting two groups of participants: dog-fearful and non-fearful participants. Sensitivity to dog fear was assessed psychometrically by a questionnaire and also at the behavioral and subjective levels using a Behavioral Avoidance Test (BAT). Participants navigated in virtual environments, in which they encountered virtual dog stimuli presented through the auditory channel, the visual channel, or both. They were asked to report their fear using Subjective Units of Distress. We compared fear responses to unimodal (visual or auditory) and bimodal (auditory–visual) dog stimuli. Both dog-fearful and non-fearful participants reported more fear in response to bimodal auditory–visual presentation than to unimodal presentation of dog stimuli. These results suggest that fear is more intense when affective information is processed via multiple sensory pathways, which might be due to cross-modal potentiation. Our findings have implications for the field of virtual reality-based therapy of phobias: therapies could be refined and improved by manipulating the multisensory presentation of the feared situations.

