Insect Navigation
Recently Published Documents


TOTAL DOCUMENTS: 50 (last five years: 17)

H-INDEX: 14 (last five years: 2)

2021
Author(s): Nirag Kadakia, Mahmut Demir, Brenden T. Michaelis, Matthew A. Reidenbach, Damon A. Clark, ...

Abstract: Insects can detect bilateral differences in odor concentration between their two antennae, enabling them to sense odor gradients. While gradients aid navigation in simple odor environments such as static ribbons, their role in navigating complex plumes remains unknown. Here, we use a virtual reality paradigm to show that Drosophila use bilateral sensing for a distinct computation: detecting the motion of odor signals. Such odor direction sensing is computationally equivalent to the algorithms underlying motion detection in vision. Simulations of natural plumes reveal that odor motion contains valuable directional information absent from the airflow, which Drosophila indeed exploit when navigating natural plumes. Olfactory studies dating back a century have stressed the critical role of wind sensing for insect navigation (Flügge, 1934; Kennedy and Marsh, 1974); we reveal an entirely orthogonal directional cue used by flies in natural environments, and give theoretical arguments suggesting that this cue may be of broad use across the animal kingdom.
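The classical formalisation of the visual motion detection the abstract refers to is the Hassenstein-Reichardt correlator. A minimal sketch of that correlator applied to two antennal concentration signals (illustrative only, not the authors' model; all parameters and signal shapes are assumptions):

import numpy as np

def hassenstein_reichardt(left, right, dt=0.01, tau=0.1):
    # Correlation-based motion detector: low-pass (delay) each input,
    # multiply with the opposite undelayed input, and subtract the two
    # mirror-symmetric arms. Positive output = motion from left to right.
    alpha = dt / (tau + dt)
    d_left, d_right = np.zeros_like(left), np.zeros_like(right)
    for t in range(1, len(left)):
        d_left[t] = d_left[t-1] + alpha * (left[t] - d_left[t-1])
        d_right[t] = d_right[t-1] + alpha * (right[t] - d_right[t-1])
    return d_left * right - d_right * left

t = np.arange(0, 1, 0.01)
left = np.exp(-((t - 0.40) / 0.05) ** 2)   # odor packet hits left antenna first...
right = np.exp(-((t - 0.45) / 0.05) ** 2)  # ...then the right antenna
print(hassenstein_reichardt(left, right).sum() > 0)  # True: rightward odor motion

The sign of the summed output reports the direction in which the odor packet swept across the two sensors, which is the directional cue the abstract argues flies extract.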


2021
Vol 15
Author(s): Benjamin H. Paffhausen, Julian Petrasch, Benjamin Wild, Thierry Meurers, Tobias Schülke, ...

Navigating animals combine multiple perceptual faculties, learn during exploration, retrieve multi-faceted memory contents, and exhibit goal-directedness as an expression of their current needs and motivations. Navigation in insects has been linked to a variety of underlying strategies, such as path integration, view familiarity, visual beaconing, and goal-directed orientation with respect to previously learned ground structures. Most studies, however, approach navigation either from a field perspective, analyzing purely behavioral observations, or combine computational models with neurophysiological evidence obtained from lab experiments. The honey bee (Apis mellifera) has long been a popular model in the search for neural correlates of complex behaviors and exhibits extraordinary navigational capabilities. However, the neural basis of bee navigation has not yet been explored under natural conditions. Here, we propose a novel methodology for recording from the brain of a copter-mounted honey bee. This way, the animal experiences natural multimodal sensory inputs in a natural environment that is familiar to her. We have developed a miniaturized electrophysiology recording system that is able to record spikes in the presence of time-varying electric noise from the copter's motors and rotors, and devised an experimental procedure to record from mushroom body extrinsic neurons (MBENs). We analyze the resulting electrophysiological data combined with a reconstruction of the animal's visual perception, and find that the neural activity of MBENs is linked to sharp turns, possibly related to the relative motion of visual features. This method is a significant technological step toward recording brain activity of navigating honey bees under natural conditions. By providing all system specifications in an online repository, we hope to close a methodological gap and stimulate further research informing future computational models of insect navigation.
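The recording hardware is only characterised above as tolerating time-varying motor and rotor noise; a generic sketch of the signal chain such a claim implies (bandpass filtering followed by robust threshold detection) is given below. The filter band, sampling rate, and threshold are assumptions, not the paper's specifications:

import numpy as np
from scipy.signal import butter, filtfilt

def detect_spikes(raw, fs=30000.0, band=(300.0, 5000.0), thresh_sd=4.0):
    # Bandpass the raw trace to suppress low-frequency motor noise,
    # then flag negative threshold crossings as candidate spikes.
    b, a = butter(3, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, raw)
    # Robust noise estimate (median absolute deviation), as is common
    # in extracellular spike detection:
    sigma = np.median(np.abs(filtered)) / 0.6745
    crossings = np.where(filtered < -thresh_sd * sigma)[0]
    if crossings.size == 0:
        return crossings
    # Keep only the first sample of each threshold crossing:
    return crossings[np.insert(np.diff(crossings) > 1, 0, True)]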


Author(s): Olivier J.N. Bertrand, Charlotte Doussot, Tim Siesenop, Sridhar Ravi, Martin Egelhaaf

One persistent question in animal navigation is how animals follow habitual routes between their home and a food source. Our current understanding of insect navigation suggests an interplay between visual memories, collision avoidance, and path integration, that is, the continuous integration of distance and direction travelled. However, these behavioural modules have to be continuously updated with instantaneous visual information. To alleviate this need, the insect could learn and replicate habitual movements ("movement memories") around objects (e.g. a bent trajectory around an object) to reach its destination. We investigated whether bumblebees, Bombus terrestris, learn and use movement memories en route to their home. Using a novel experimental paradigm, we habituated bumblebees to establish a habitual route in a flight tunnel containing "invisible" obstacles. We then confronted them with conflicting cues leading to different choice directions depending on whether they relied on movement or visual memories. The results suggest that they use movement memories to navigate but also rely on visual memories to resolve conflicting situations. We investigated whether the observed behaviour could be explained by other guidance systems, such as path integration or optic flow-based flight control, and found that neither was sufficient to explain the behaviour.
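Path integration, as defined above, is the running sum of displacement vectors into a "home vector"; a minimal sketch of that bookkeeping (headings in radians, distances in arbitrary units; all names illustrative):

import math

def integrate_path(steps):
    # Accumulate (heading, distance) steps; return the heading and
    # distance of the vector pointing from the endpoint back home.
    x = y = 0.0
    for heading, distance in steps:
        x += distance * math.cos(heading)
        y += distance * math.sin(heading)
    return math.atan2(-y, -x), math.hypot(x, y)

# Outbound: 3 units east, then 4 units north -> home lies 5 units away,
# back toward the southwest:
print(integrate_path([(0.0, 3.0), (math.pi / 2, 4.0)]))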


2020
Vol 223 (24), jeb228601
Author(s): Roman Goulard, Cornelia Buehlmann, Jeremy E. Niven, Paul Graham, Barbara Webb

Abstract: The natural scale of insect navigation during foraging makes it challenging to study under controlled conditions. Virtual reality and trackball setups have offered experimental control over visual environments while studying tethered insects, but the potential limitations and confounds introduced by tethering motivate the development of alternative, untethered solutions. In this paper, we validate the use of a motion compensator (or 'treadmill') to study the visually driven behaviour of freely moving wood ants (Formica rufa). We show how this setup allows naturalistic walking behaviour and preserves foraging motivation over long time frames. Furthermore, we show that ants are able to transfer associative and navigational memories from classical maze and arena contexts to our treadmill. Thus, we demonstrate the possibility of studying navigational behaviour over ecologically relevant durations (and virtual distances) in precisely controlled environments, bridging the gap between natural and highly controlled laboratory experiments.
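The essence of such a motion compensator is a closed loop that moves the treadmill surface so as to cancel the ant's own displacement, keeping her centred while the controller logs the path she intended to walk. A toy proportional controller shows the principle (the gain, time step, and speed are assumptions, not the paper's hardware):

def simulate(steps=100, dt=0.02, gain=10.0, ant_speed=5.0):
    # The ant walks at a constant speed; the surface is driven at a
    # velocity proportional to her offset from the centre. Her offset
    # settles at ant_speed / gain rather than growing without bound.
    x = 0.0
    for _ in range(steps):
        surface_velocity = -gain * x
        x += dt * (ant_speed + surface_velocity)
    return x

print(round(simulate(), 3))  # ~0.5 = ant_speed / gain: the ant stays near the centre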


2020
Vol 42, pp. 110-117
Author(s): Florent Le Moël, Antoine Wystrach

2020
Author(s): Le Zhu, Michael Mangan, Barbara Webb

Abstract: Insects, despite their relatively small brains, can perform complex navigation tasks such as memorising a visual route. The exact format of visual memory encoded by neural systems during route learning and following is still unclear. Here, we propose that interconnections between Kenyon cells in the Mushroom Body (MB) could encode a spatio-temporal memory of the visual motion experienced when moving along a route. In our implementation, visual motion is sensed using an event-based camera mounted on a robot and learned by a biologically constrained spiking neural network model, based on a simplified MB architecture and using modified leaky integrate-and-fire neurons. In contrast to previous image-matching models, in which all memories are stored in parallel, the continuous visual flow here is inherently sequential. Our results show that the model can distinguish learned from unlearned route segments, with some tolerance to internal and external noise, including small displacements. The neural response can also explain behaviour observed in ant experiments that has been taken to support sequential memory. However, obtaining robustness comparable to insect navigation might require the addition of biomimetic pre-processing of the input stream, and determination of the appropriate motor strategy to exploit the memory output.
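The model's units are described as modified leaky integrate-and-fire neurons; the standard LIF dynamics they build on are compact enough to state directly (parameters are illustrative, not the paper's):

import numpy as np

def lif(input_current, dt=1e-3, tau=20e-3, v_rest=-70e-3,
        v_thresh=-54e-3, v_reset=-70e-3):
    # Leaky integrate-and-fire: the membrane voltage decays toward rest,
    # integrates input, and on crossing threshold emits a spike and resets.
    v, spike_times = v_rest, []
    for t, i in enumerate(input_current):
        v += (dt / tau) * (v_rest - v) + dt * i
        if v >= v_thresh:
            spike_times.append(t * dt)
            v = v_reset
    return spike_times

# Constant drive strong enough to outrun the leak produces regular spiking:
print(len(lif(np.full(1000, 1.0))) > 0)  # True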


eLife
2020
Vol 9
Author(s): Xuelong Sun, Shigang Yue, Michael Mangan

Insect navigation arises from the coordinated action of concurrent guidance systems, but the neural mechanisms through which each functions, and through which they are coordinated, remain unknown. We propose that insects require distinct strategies to retrace familiar routes (route-following) and to return directly from novel to familiar terrain (homing), using different aspects of frequency-encoded views that are processed in different neural pathways. We also demonstrate how the Central Complex and Mushroom Body regions of the insect brain may work in tandem to coordinate the directional output of different guidance cues through a contextually switched ring attractor inspired by neural recordings. The resulting unified model of insect navigation reproduces behavioural data from a series of cue-conflict experiments in realistic animal environments and offers testable hypotheses of where and how insects process visual cues, utilise the different information that they provide, and coordinate their outputs to achieve the adaptive behaviours observed in the wild.
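The contextually switched ring attractor is not specified in the abstract, but the generic ring-attractor dynamics it builds on (local excitation and global inhibition over a ring of heading-tuned units, with activity settling into a single bump at the cued direction) can be sketched as follows (all parameters are illustrative):

import numpy as np

def ring_attractor(n=32, steps=200, cue_angle=1.0, cue_gain=0.5):
    # Ring of heading-tuned units with cosine connectivity (local
    # excitation, global inhibition); activity converges to one bump.
    theta = np.linspace(0, 2 * np.pi, n, endpoint=False)
    w = 0.9 * np.cos(theta[:, None] - theta[None, :]) / n
    r = np.random.default_rng(0).uniform(0, 0.1, n)         # initial activity
    cue = cue_gain * np.exp(np.cos(theta - cue_angle) - 1)  # directional input
    for _ in range(steps):
        r = np.maximum(0.0, r + 0.1 * (-r + w @ r + cue))
    return theta[np.argmax(r)]  # decoded heading

print(round(ring_attractor(), 2))  # ~1.0 rad: the bump sits at the cued direction

Cue switching in the paper's sense would then amount to swapping which guidance system supplies the cue input, while the bump dynamics remain unchanged.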


2020
Author(s): Roman Goulard, Cornelia Buehlmann, Jeremy E. Niven, Paul Graham, Barbara Webb

Abstract: The scale of natural insect navigation during foraging makes it challenging to study, in a controlled way, the navigation processes that an insect brain can support. Virtual reality and trackball setups have offered experimental control over visual environments while studying tethered insects, but the potential limitations and confounds introduced by tethering motivate the development of alternative, untethered solutions. In this paper, we validate the use of a motion compensator (or 'treadmill') to study the visually driven behaviour of freely moving wood ants (Formica rufa). We show how this setup allows naturalistic walking behaviour and motivation over long timeframes. Furthermore, we show that ants are able to transfer associative and navigational memories from classical maze and arena contexts to our treadmill. Thus, we demonstrate the possibility of studying navigational behaviour over ecologically relevant durations (and virtual distances) in precisely controlled environments, bridging the gap between natural and highly controlled laboratory experiments. Summary statement: We have developed and validated a motion-compensating treadmill for wood ants, which opens new perspectives for studying insect navigation behaviour in a fully controlled manner over ecologically relevant durations.

