Multisensory-motor integration in olfactory navigation of silkmoth, Bombyx mori, using virtual reality system

eLife ◽  
2021 ◽  
Vol 10 ◽  
Author(s):  
Mayu Yamada ◽  
Hirono Ohashi ◽  
Koh Hosoda ◽  
Daisuke Kurabayashi ◽  
Shunsuke Shigaki

Most animals survive and thrive because of navigational behavior that brings them to their destinations. To navigate, animals must integrate information obtained from multiple sensory inputs and use it to modulate their behavior. In this study, using a virtual reality (VR) system for an insect, we investigated how the adult silkmoth integrates visual and wind direction information during female search behavior (olfactory behavior). In behavioral experiments with the VR system, the silkmoth had the highest navigational success rate when odor, vision, and wind information were provided correctly. However, the success rate of the search dropped when the wind direction information provided differed from the direction actually detected, indicating that it is important to acquire not only odor information but also wind direction information correctly. When the wind arrives from the same direction as the odor, the silkmoth behaves assertively; when odor is detected but the wind does not arrive from the same direction, the silkmoth behaves more cautiously. This corresponds to a modulation of behavior according to the degree of complexity (turbulence) of the environment. We mathematically modeled this multisensory modulation of behavior and evaluated the model in simulations. The model not only reproduced the actual search behavior of the silkmoth but also improved search success relative to a conventional odor-source search algorithm.
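As a rough illustration of the kind of modulation rule described above, here is a minimal hypothetical sketch (not the authors' published model): the agent surges upwind when the odor and wind arrival directions roughly agree, turns more cautiously when they conflict, and casts when no odor is detected. All function names, thresholds, and gains are invented for demonstration.

```python
import numpy as np

# Hypothetical sketch of multisensory modulation in odor-source search.
# odor_dir and wind_dir are world-frame bearings (radians) FROM WHICH the
# odor and the wind arrive at the agent; all parameters are illustrative.
def search_step(position, heading, odor_detected, odor_dir, wind_dir, dt=0.1,
                surge_speed=20.0, cast_speed=8.0, agree_threshold=np.deg2rad(45)):
    if odor_detected:
        # Smallest angle between the odor-arrival and wind-arrival bearings.
        mismatch = abs(np.angle(np.exp(1j * (odor_dir - wind_dir))))
        if mismatch < agree_threshold:
            # Cues agree: surge upwind, straight toward the shared arrival bearing.
            heading = wind_dir
            speed = surge_speed
        else:
            # Conflicting cues: turn slowly toward the odor side at reduced speed.
            heading += np.deg2rad(30) * np.sign(np.sin(odor_dir - heading)) * dt
            speed = cast_speed
    else:
        # No odor: random turning as a crude stand-in for zigzag casting.
        heading += np.random.choice([-1.0, 1.0]) * np.deg2rad(90) * dt
        speed = cast_speed
    position = position + speed * dt * np.array([np.cos(heading), np.sin(heading)])
    return position, heading

# Example usage with arbitrary cue bearings.
pos, hdg = np.array([0.0, 0.0]), 0.0
pos, hdg = search_step(pos, hdg, odor_detected=True,
                       odor_dir=np.deg2rad(90), wind_dir=np.deg2rad(80))
```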


2020 ◽  
Vol 5 (1) ◽  
pp. 40-47
Author(s):  
Ning Sa ◽  
Xiaojun (Jenny) Yuan

With the development of mobile technologies, voice search is becoming increasingly important in our daily lives. By investigating the general usage of voice search and user perceptions of voice search systems, this research aims to understand users' voice search behavior. We are particularly interested in how users perform voice search, their topics of interest, and their preferences regarding voice search. We elicited users' opinions by asking them to fill out an online survey. Results indicated that participants liked voice search because it was convenient. However, voice search was used much less frequently than keyboard search. The success rate of voice search was low, and participants usually gave up on voice search or switched to keyboard search. They tended to perform voice search while driving or walking. Moreover, participants mainly used voice search for simple tasks on mobile devices. The main reasons participants disliked voice search were system mistakes and the inability to modify queries.


Author(s):  
Ying Zhang ◽  
Xu Hao ◽  
Kelu Hou ◽  
Lei Hu ◽  
Jingyuan Shang ◽  
...  

Aims: To assess the impact of cytochrome P450 (CYP) 2C19 polymorphisms on the clinical efficacy and safety of voriconazole. Methods: We systematically searched PubMed, EMBASE, CENTRAL, ClinicalTrials.gov, and three Chinese databases from their inception to March 18, 2021, using a predefined search strategy to identify relevant studies. Studies that reported voriconazole-treated patients and information on CYP2C19 polymorphisms were included. The efficacy outcome was success rate. The safety outcomes included overall adverse events, hepatotoxicity, and neurotoxicity. Results: A total of 20 studies were included. Intermediate metabolizers (IMs) and poor metabolizers (PMs) were associated with increased success rates compared with normal metabolizers (NMs) (risk ratio (RR): 1.18, 95% confidence interval (CI): 1.03–1.34, I² = 0%, p = 0.02; RR: 1.28, 95% CI: 1.06–1.54, I² = 0%, p = 0.01). PMs were at increased risk of overall adverse events in comparison with NMs and IMs (RR: 2.18, 95% CI: 1.35–3.53, I² = 0%, p = 0.001; RR: 1.80, 95% CI: 1.23–2.64, I² = 0%, p = 0.003). PMs showed a trend toward an increased incidence of hepatotoxicity compared with NMs (RR: 1.60, 95% CI: 0.94–2.74, I² = 27%, p = 0.08), although the difference was not statistically significant. In addition, there was no significant association between CYP2C19 polymorphisms and neurotoxicity. Conclusions: IMs and PMs had a significantly higher success rate than NMs. PMs were significantly associated with an increased incidence of overall adverse events compared with NMs and IMs. Further research is needed to confirm these findings. Additionally, the relationship between hepatotoxicity and CYP2C19 polymorphisms deserves clinical attention.
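As a worked illustration of the pooled statistics reported above (risk ratios with confidence intervals and I²), here is a minimal sketch of inverse-variance fixed-effect pooling. The per-study counts are invented for demonstration; this is not the paper's analysis code.

```python
import numpy as np
from scipy import stats

# Each tuple: (events in group 1, total in group 1, events in group 0, total in group 0).
# Counts below are made up purely to show the calculation.
studies = [
    (18, 25, 30, 50),
    (12, 20, 22, 45),
    (25, 30, 40, 60),
]

log_rr, weights = [], []
for e1, n1, e0, n0 in studies:
    rr = (e1 / n1) / (e0 / n0)
    var = 1 / e1 - 1 / n1 + 1 / e0 - 1 / n0      # variance of log(RR)
    log_rr.append(np.log(rr))
    weights.append(1 / var)

log_rr, weights = np.array(log_rr), np.array(weights)
pooled = np.sum(weights * log_rr) / np.sum(weights)
se = np.sqrt(1 / np.sum(weights))
ci = np.exp(pooled + np.array([-1.96, 1.96]) * se)

# Cochran's Q and the I² heterogeneity statistic.
q = np.sum(weights * (log_rr - pooled) ** 2)
df = len(studies) - 1
i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
p_q = 1 - stats.chi2.cdf(q, df=df)

print(f"pooled RR = {np.exp(pooled):.2f}, 95% CI {ci[0]:.2f}-{ci[1]:.2f}, I² = {i2:.0f}%")
```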


2013 ◽  
Vol 26 (4) ◽  
pp. 347-370 ◽  
Author(s):  
Marine Taffou ◽  
Rachid Guerchouche ◽  
George Drettakis ◽  
Isabelle Viaud-Delmon

In a natural environment, affective information is perceived via multiple senses, mostly audition and vision. However, the impact of multisensory information on affect remains relatively unexplored. In this study, we investigated whether the auditory–visual presentation of aversive stimuli influences the experience of fear. We used the advantages of virtual reality to manipulate multisensory presentation and to display potentially fearful dog stimuli embedded in a natural context. We manipulated the affective reactions evoked by the dog stimuli by recruiting two groups of participants: dog-fearful and non-fearful participants. Sensitivity to dog fear was assessed psychometrically with a questionnaire and also at the behavioral and subjective levels using a Behavioral Avoidance Test (BAT). Participants navigated in virtual environments, in which they encountered virtual dog stimuli presented through the auditory channel, the visual channel, or both. They were asked to report their fear using Subjective Units of Distress. We compared the fear for unimodal (visual or auditory) and bimodal (auditory–visual) dog stimuli. Dog-fearful as well as non-fearful participants reported more fear in response to bimodal audiovisual than to unimodal presentation of dog stimuli. These results suggest that fear is more intense when affective information is processed via multiple sensory pathways, which might be due to cross-modal potentiation. Our findings have implications for the field of virtual reality-based therapy of phobias. Therapies could be refined and improved by manipulating the multisensory presentation of feared situations.


2020 ◽  
Vol 31 (3) ◽  
pp. 675-691 ◽  
Author(s):  
Jella Pfeiffer ◽  
Thies Pfeiffer ◽  
Martin Meißner ◽  
Elisa Weiß

How can we tailor assistance systems, such as recommender systems or decision support systems, to consumers’ individual shopping motives? How can companies unobtrusively identify shopping motives without explicit user input? We demonstrate that eye movement data allow building reliable prediction models for identifying goal-directed and exploratory shopping motives. Our approach is validated in a real supermarket and in an immersive virtual reality supermarket. Several managerial implications of using gaze-based classification of information search behavior are discussed: First, the advent of virtual shopping environments makes using our approach straightforward as eye movement data are readily available in next-generation virtual reality devices. Virtual environments can be adapted to individual needs once shopping motives are identified and can be used to generate more emotionally engaging customer experiences. Second, identifying exploratory behavior offers opportunities for marketers to adapt marketing communication and interaction processes. Personalizing the shopping experience and profiling customers’ needs based on eye movement data promises to further increase conversion rates and customer satisfaction. Third, eye movement-based recommender systems do not need to interrupt consumers and thus do not take away attention from the purchase process. Finally, our paper outlines the technological basis of our approach and discusses the practical relevance of individual predictors.
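As a rough sketch of the kind of prediction model described above, the following hypothetical example trains a classifier on simple gaze features. The feature set, the synthetic data, and the labels are assumptions for illustration only, not the authors' pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Synthetic stand-in data: one row per shopping episode.
rng = np.random.default_rng(0)
n = 200
# Hypothetical gaze features: mean fixation duration (ms), fixations per second,
# mean saccade amplitude (deg), share of fixations on shelf labels.
X = rng.normal(loc=[250, 3.0, 5.0, 0.4], scale=[60, 0.8, 1.5, 0.15], size=(n, 4))
y = rng.integers(0, 2, size=n)  # 0 = exploratory, 1 = goal-directed (random labels here)

# Cross-validated classification of shopping motive from gaze features.
clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f} ± {scores.std():.2f}")
```

With real labeled episodes in place of the random data, the same structure would yield the goal-directed vs. exploratory prediction the abstract refers to.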


2018 ◽  
Author(s):  
Hannah Haberkern ◽  
Melanie A. Basnak ◽  
Biafra Ahanonu ◽  
David Schauder ◽  
Jeremy D. Cohen ◽  
...  

A navigating animal’s sensory experience is shaped not just by its surroundings, but by its movements within them, which in turn are influenced by its past experiences. Studying the intertwined roles of sensation, experience and directed action in navigation has been made easier by the development of virtual reality (VR) environments for head-fixed animals, which allow for quantitative measurements of behavior in well-controlled sensory conditions. VR has long featured in studies of Drosophila melanogaster, but these experiments have typically relied on one-dimensional (1D) VR, effectively allowing the fly to change only its heading in a visual scene, and not its position. Here we explore how flies navigate in a two-dimensional (2D) visual VR environment that more closely resembles their experience during free behavior. We show that flies’ interaction with landmarks in 2D environments cannot be automatically derived from their behavior in simpler 1D environments. Using a novel paradigm, we then demonstrate that flies in 2D VR adapt their behavior in a visual environment in response to optogenetically delivered appetitive and aversive stimuli. Much like free-walking flies after encounters with food, head-fixed flies respond to optogenetic activation of sugar-sensing neurons by initiating a local search behavior. Finally, by pairing optogenetic activation of heat-sensing cells to the flies’ presence near visual landmarks of specific shapes, we elicit selective learned avoidance of landmarks associated with aversive “virtual heat”. These head-fixed paradigms set the stage for an interrogation of fly brain circuitry underlying flexible navigation in complex visual environments.


Author(s):  
Helmy Widyantara ◽  
Muhammad Rivai ◽  
Djoko Purwanto

A wind direction sensor can be used in many applications, such as navigation, weather monitoring, and air pollution monitoring. In an odor tracking system, the wind plays an important role in carrying gas from its source. Therefore, a precise, low-cost, and effective wind direction sensor is required to trace the gas source. In this study, a new wind direction sensor was developed based on the thermal anemometer principle, with a positive temperature coefficient thermistor as its main component. Three anemometers, each oriented in a different direction, provide the inputs to a neural network that determines the wind direction automatically. The experimental results show that the wind sensor system is able to detect twelve wind directions. A mobile robot equipped with this sensor system can navigate to a wind source in the open air with a success rate of 80%. This system is expected to increase the success rate of mobile robots searching for dangerous gas leaks in the open air.
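The following hypothetical sketch illustrates the idea of mapping three directional thermal-anemometer readings to one of twelve wind direction classes with a small neural network. The sensor response model and training data are invented stand-ins, not the authors' implementation.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# Synthetic stand-in data: each heated thermistor cools most when facing the wind,
# modeled crudely here as a cosine of the angle between wind and sensor orientation.
rng = np.random.default_rng(1)
directions = np.arange(12) * 30.0                 # twelve class centres in degrees
angles = rng.choice(directions, size=600)         # sampled wind directions
mount = np.deg2rad([0.0, 120.0, 240.0])           # assumed orientations of the 3 sensors
X = np.cos(np.deg2rad(angles)[:, None] - mount[None, :]) \
    + rng.normal(0, 0.1, (600, 3))                # 3 noisy anemometer readings per sample
y = (angles / 30).astype(int)                     # class label 0..11

# Small multilayer perceptron mapping the three readings to a direction class.
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(X, y)
print("training accuracy:", clf.score(X, y))
```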


2020 ◽  
Vol 42 ◽  
pp. 34
Author(s):  
Eliezer Oliveira Cavalheiro ◽  
Cleiton Anderson Trindade De Carvalho ◽  
Glauber Rodrigues De Quadros ◽  
Silvana Maldaner

The windsock is a meteorological instrument that indicates wind direction. It is critical in maritime travel and aviation, where it can help prevent accidents in emergency situations such as storms driven by a particular wind direction. Thus, in this work we propose to develop an electronic windsock using a rotary encoder module and infrared LEDs. An Arduino microcontroller programmed in C++ was used to develop the wind direction sensor. The engineered hybrid system converts rotations into an electrical signal, and these signals were associated with east, west, north, and south orientations. The proposed sensor showed a lower probability of error in the wind direction reading than a windsock employing 5 mm infrared LEDs.


2020 ◽  
Vol 8 (9) ◽  
pp. 626
Author(s):  
Yong Wan ◽  
Xiaolei Shi ◽  
Yongshou Dai ◽  
Ligang Li ◽  
Xiaojun Qu ◽  
...  

Synthetic aperture radar (SAR) can extract sea surface wind speed information. To extract wind speed through a geophysical model function (GMF), the corresponding wind direction information must normally be input. This article introduces the concept of networked SAR satellites, which enable multiple SARs to observe the same sea surface at different incidence angles at the same time. For X-band networked SAR data with different incidence angles, a cost function is established using the GMF. By minimizing the cost function, accurate wind speed information can be extracted without inputting wind direction information. When the noise is small, introducing wind direction information further improves the accuracy of the extracted wind speed. When the noise is less than 1 dB and the incidence angle is greater than 30°, the root-mean-square error (RMSE) of the wind speed extracted by this method is mostly below 2 m/s.
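A conceptual sketch of the cost-function idea follows: with backscatter observed at several incidence angles, wind speed is retrieved by minimizing residuals against a GMF without supplying the wind direction. The GMF below is a made-up placeholder rather than the X-band model used in the paper, and all numbers are illustrative.

```python
import numpy as np

def gmf(wind_speed, incidence_deg, relative_dir_deg):
    """Toy backscatter model sigma0(v, theta, phi) with an incidence-dependent
    upwind/crosswind modulation; illustrative shape only, not a real GMF."""
    theta = np.deg2rad(incidence_deg)
    phi = np.deg2rad(relative_dir_deg)
    return 0.05 * wind_speed**0.8 * np.cos(theta)**2 * (1 + 0.3 * np.sin(theta) * np.cos(2 * phi))

incidences = np.array([25.0, 32.0, 40.0])          # assumed per-satellite incidence angles
true_speed, true_dir = 9.0, 60.0
observed = gmf(true_speed, incidences, true_dir)   # stands in for measured sigma0 values

def cost(v):
    # Marginalize the unknown wind direction by taking the best-fitting one.
    dirs = np.arange(0.0, 360.0, 5.0)
    return min(np.sum((observed - gmf(v, incidences, d)) ** 2) for d in dirs)

# Simple grid search over candidate wind speeds.
speeds = np.arange(0.5, 25.0, 0.1)
best = min(speeds, key=cost)
print(f"retrieved wind speed: {best:.1f} m/s (simulated truth: {true_speed} m/s)")
```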


2014 ◽  
Vol 573 ◽  
pp. 777-782 ◽  
Author(s):  
A.Jeya Saravanan ◽  
C.P. Karthikeyan ◽  
Anand A. Samuel

Uneven terrain in mountainous regions where wind turbines are to be erected raises concerns about siting, variation in wind direction, wake effects, frequent changes in wind velocity, and limitations on hub height, all exogenous variables that can reduce wind farm efficiency. This work analyzes the effect of these parameters on wind farm efficiency. Energy efficiency and exergy efficiency for a three-column wind farm are determined and compared. The mathematical model developed accounts for wake deficit losses, transmission losses, resource losses, losses due to changes in wind direction, an overall efficiency factor, and locational specifications. A new objective function is derived for a wind farm with multidirectional wind flow and solved using the Covariance Matrix Adaptation Evolution Strategy (CMA-ES) algorithm, which is used to maximize the wind farm's exergetic efficiency. The location specification is the main variable to optimize, while the other dimensionless variables remain the same. Exergy efficiency is improved compared with the reference layouts. The results will help wind farm developers utilize resources optimally to obtain maximum output.
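To illustrate the optimization step, here is a toy layout-optimization sketch. SciPy's differential evolution is used as a stand-in for the paper's CMA-ES, and the simple Jensen-style wake model and all constants below are assumptions, not the authors' exergy formulation.

```python
import numpy as np
from scipy.optimize import differential_evolution

ROTOR_D, WAKE_K, CT = 80.0, 0.075, 0.8   # rotor diameter (m), wake decay constant, thrust coeff.

def farm_power(xs):
    """Relative power of turbines placed at downwind positions xs (wind along +x)."""
    xs = np.sort(xs)
    power = 0.0
    for i, xi in enumerate(xs):
        deficit2 = 0.0
        for xj in xs[:i]:                    # only upstream turbines cast a wake
            dx = xi - xj
            dw = ROTOR_D + 2 * WAKE_K * dx   # wake diameter at distance dx
            deficit2 += ((1 - np.sqrt(1 - CT)) * (ROTOR_D / dw) ** 2) ** 2
        u_rel = 1.0 - np.sqrt(deficit2)      # combined deficit via root-sum-square
        power += max(u_rel, 0.0) ** 3        # power scales with the cube of wind speed
    return power

# Maximize power (minimize its negative) over three turbine positions in a 2 km strip.
res = differential_evolution(lambda xs: -farm_power(xs), bounds=[(0, 2000)] * 3, seed=0)
print("optimized positions (m):", np.round(np.sort(res.x)), "relative power:", -res.fun)
```

A full exergy-based objective with multidirectional wind would replace farm_power, but the optimization structure would stay the same.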

