Hierarchical Spatial Search Strategies in Drosophila

2020 ◽  
Author(s):  
Nicola Meda ◽  
Giulio M. Menti ◽  
Aram Megighian ◽  
Mauro A. Zordan

ABSTRACT
Animals rely on multiple sensory systems to make decisions. The integration of information stemming from these systems is believed to result in a precise behavioural output. To what degree a single sensory system may override the others is unknown, and evidence for a hierarchical use of different systems to guide navigation is lacking. We used Drosophila melanogaster to investigate whether, in order to relieve an unpleasant stimulation, fruit flies employ an idiothetically-based local search strategy before making use of visual information, or vice versa. Fruit flies appear to initially resort to idiothetic information and only later, if the first strategy proves unsuccessful in relieving the unpleasant stimulation, make use of other information, such as visual cues. By leveraging this innate preference for a hierarchical use of one strategy over another, we believe that in vivo recordings of brain activity during the navigation of fruit flies could provide mechanistic insights into how simultaneous information from multiple sensory modalities is evaluated and integrated, and how motor responses are elicited, thus shedding new light on the neural basis of decision-making.

2018 ◽  
Vol 5 (2) ◽  
pp. 171785 ◽  
Author(s):  
Martin F. Strube-Bloss ◽  
Wolfgang Rössler

Flowers attract pollinating insects like honeybees by sophisticated compositions of olfactory and visual cues. Using honeybees as a model to study olfactory–visual integration at the neuronal level, we focused on mushroom body (MB) output neurons (MBON). From a neuronal circuit perspective, MBONs represent a prominent level of sensory-modality convergence in the insect brain. We established an experimental design allowing electrophysiological characterization of olfactory, visual, as well as olfactory–visual induced activation of individual MBONs. Despite the obvious convergence of olfactory and visual pathways in the MB, we found numerous unimodal MBONs. However, a substantial proportion of MBONs (32%) responded to both modalities and thus integrated olfactory–visual information across MB input layers. In these neurons, representation of the olfactory–visual compound was significantly increased compared with that of single components, suggesting an additive, but nonlinear integration. Population analyses of olfactory–visual MBONs revealed three categories: (i) olfactory, (ii) visual and (iii) olfactory–visual compound stimuli. Interestingly, no significant differentiation was apparent regarding different stimulus qualities within these categories. We conclude that encoding of stimulus quality within a modality is largely completed at the level of MB input, and information at the MB output is integrated across modalities to efficiently categorize sensory information for downstream behavioural decision processing.


Author(s):  
Jose Adrian Vega Vermehren ◽  
Cornelia Buehlmann ◽  
Ana Sofia David Fernandes ◽  
Paul Graham

Abstract
Ants are excellent navigators, taking into account multimodal sensory information as they move through the world. To accurately localise the nest at the end of a foraging journey, visual cues, wind direction and also olfactory cues need to be learnt. Learning walks are performed at the start of an ant's foraging career or when the appearance of the nest surroundings has changed. Here we investigated whether the structure of such learning walks in the desert ant Cataglyphis fortis takes into account wind direction in conjunction with the learning of new visual information. Ants learnt to travel back and forth between their nest and a feeder, and we then introduced a black cylinder near their nest to induce learning walks in regular foragers. By doing this across days with different prevailing wind directions, we were able to probe how ants balance the influence of different sensory modalities. We found that (i) the ants' outward headings are influenced by the direction of the wind, with their routes deflected in such a way that they will arrive downwind of their nest when homing, (ii) a novel object along the route induces learning walks in experienced ants, and (iii) the structure of learning walks is shaped by the wind direction rather than by the position of the visual cue.


2019 ◽  
Author(s):  
Mohamed Abdelhack ◽  
Yukiyasu Kamitani

Abstract
Visual recognition involves integrating visual information with other sensory information and prior knowledge. Consistent with Bayesian inference, under conditions of unreliable visual input the brain relies on the prior as a source of information to achieve the inference process. This drives a top-down process that improves the neural representation of the visual input. However, the extent to which non-stimulus-driven top-down information affects processing in the ventral stream is still unclear. We conducted a perceptual decision-making task using blurred images while acquiring functional magnetic resonance imaging data. We then transformed brain activity into deep neural network features to distinguish bottom-up and top-down signals. We found that top-down information unrelated to the stimulus had a minimal effect on lower-level visual processes: the neural representations of degraded stimuli that were misrecognized were still correlated with the correct object category in the lower levels of processing. In contrast, activity in the higher cognitive areas was more strongly correlated with the recognition reported by the subjects. This discrepancy between processing at the lower and higher levels indicates the existence of a stimulus-independent top-down signal flowing back down the hierarchy. These findings suggest that integration of bottom-up and top-down information takes the form of competing evidence in higher visual areas between prior-driven top-down and stimulus-driven bottom-up signals, and could provide important insight into the different modes of integration of neural signals in the visual cortex that contribute to the visual inference process.
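The core analysis idea in this abstract, comparing a decoded feature vector against the feature vectors of candidate object categories by correlation, can be sketched in a few lines. This is an illustrative toy, not the authors' pipeline: the category features, noise level, and dimensions below are invented assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: each object category has a "true" DNN feature vector;
# the brain-decoded features are a noisy version of the presented category's.
n_categories, n_features = 5, 100
category_features = rng.standard_normal((n_categories, n_features))

presented = 2  # category actually shown (e.g. a degraded stimulus)
decoded = category_features[presented] + 0.8 * rng.standard_normal(n_features)

# Correlate the decoded features with every candidate category's features;
# the best-matching category is the one with the highest correlation.
corrs = [np.corrcoef(decoded, category_features[c])[0, 1]
         for c in range(n_categories)]
best = int(np.argmax(corrs))
print(best)
```

With the decoded vector dominated by the presented category's features, the correlation with that category stands well above the near-zero correlations with the others, which is the logic behind asking whether lower-level representations still "point at" the correct category even when the subject misrecognizes the stimulus.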


2019 ◽  
Vol 16 (154) ◽  
pp. 20180903
Author(s):  
Edward D. Lee ◽  
Edward Esposito ◽  
Itai Cohen

Swing in a crew boat, a good jazz riff, a fluid conversation: these tasks require extracting sensory information about how others flow in order to mimic and respond. To determine what factors influence coordination, we build an environment to manipulate incoming sensory information by combining virtual reality and motion capture. We study how people mirror the motion of a human avatar’s arm as we occlude the avatar. We efficiently map the transition from successful mirroring to failure using Gaussian process regression. Then, we determine the change in behaviour when we introduce audio cues with a frequency proportional to the speed of the avatar’s hand or train individuals with a practice session. Remarkably, audio cues extend the range of successful mirroring to regimes where visual information is sparse. Such cues could facilitate joint coordination when navigating visually occluded environments, improve reaction speed in human–computer interfaces or measure altered physiological states and disease.
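The mapping of the transition from successful mirroring to failure described above rests on Gaussian process regression. As a minimal sketch under invented assumptions (the occlusion values, success labels, and kernel parameters below are hypothetical, not the study's data), the GP posterior mean over a single occlusion axis can locate the transition point:

```python
import numpy as np

def rbf_kernel(a, b, length_scale=1.0):
    # Squared-exponential kernel between two sets of 1-D inputs.
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length_scale) ** 2)

def gp_posterior_mean(x_train, y_train, x_test, length_scale=1.0, noise=1e-2):
    # Standard GP regression posterior mean: K_s (K + noise*I)^-1 y.
    K = rbf_kernel(x_train, x_train, length_scale) + noise * np.eye(len(x_train))
    K_s = rbf_kernel(x_test, x_train, length_scale)
    return K_s @ np.linalg.solve(K, y_train)

# Hypothetical trials: mirroring success (1) or failure (0) at each
# occlusion level, with the true transition placed near 0.5.
occlusion = np.array([0.0, 0.1, 0.2, 0.3, 0.4, 0.6, 0.7, 0.8, 0.9, 1.0])
success = (occlusion < 0.5).astype(float)

grid = np.linspace(0.0, 1.0, 101)
mean = gp_posterior_mean(occlusion, success, grid, length_scale=0.15)

# The 0.5 crossing of the smoothed success rate estimates the transition.
transition = grid[np.argmin(np.abs(mean - 0.5))]
```

The appeal of the GP here, as in the study, is efficiency: the smooth posterior interpolates between sparse trials, so the success/failure boundary can be localised without exhaustively sampling the stimulus space.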


2020 ◽  
Author(s):  
Mireia Torralba ◽  
Alice Drew ◽  
Alba Sabaté San José ◽  
Luis Morís Fernández ◽  
Salvador Soto-Faraco

Abstract
Endogenous brain processes play a paramount role in shaping perceptual phenomenology, as illustrated by the alternations experienced by humans (and other animals) when watching perceptually ambiguous, static images. Here, we hypothesised that endogenous alpha fluctuations in the visual cortex pace the accumulation of sensory information leading to perceptual outcomes. We addressed this hypothesis using binocular rivalry combined with visual entrainment and electroencephalography in humans (42 female, 40 male). The results revealed a correlation between the individual frequency of alpha oscillations in the occipital cortex and the perceptual alternation rates experienced during binocular rivalry. In subsequent experiments we show that regulating endogenous brain activity via entrainment produced corresponding changes in perceptual alternation rate, which were observed only in the alpha range and not at lower entrainment frequencies. Overall, rhythmic alpha stimulation resulted in faster perceptual alternation rates compared to arrhythmic or no stimulation. These findings support the notion that visual information is accumulated via alpha cycles to promote the emergence of conscious perceptual representations. We suggest that models of binocular rivalry incorporating posterior alpha as a pacemaker can provide an important advance in the comprehension of the dynamics of visual awareness.
Significance statement
Mainstream theories in cognitive neuroscience agree that endogenous brain processes play a paramount role in shaping our perceptual experience of sensory inputs. In vision, endogenous fluctuations in the alpha rhythm have been proposed to regulate the flow of visual inputs to perception. In support of this hypothesis, here we used EEG recordings and visual entrainment to demonstrate that inter-individual differences in the speed of endogenous alpha fluctuations in the brain are causally related to the accrual of visual information to awareness. These findings provide, for the first time, evidence for alpha-gated regulation of the dynamics of alternations in conscious visual perception.
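The first analysis step this abstract relies on, estimating each participant's individual alpha frequency as the occipital spectral peak in the 8-13 Hz band, can be sketched on synthetic data. The sampling rate, recording length, and true alpha frequency below are illustrative assumptions, not the study's parameters:

```python
import numpy as np

fs = 250.0                       # assumed EEG sampling rate (Hz)
t = np.arange(0, 20, 1 / fs)     # 20 s of synthetic occipital "EEG"
rng = np.random.default_rng(1)

# Synthetic signal: a 10.5 Hz alpha rhythm buried in broadband noise.
alpha_true = 10.5
signal = np.sin(2 * np.pi * alpha_true * t) + rng.standard_normal(t.size)

# Individual alpha frequency = location of the power-spectral peak
# within the canonical 8-13 Hz alpha band.
power = np.abs(np.fft.rfft(signal)) ** 2
freqs = np.fft.rfftfreq(t.size, 1 / fs)
band = (freqs >= 8) & (freqs <= 13)
iaf = freqs[band][np.argmax(power[band])]
```

Per-participant estimates like `iaf` would then be correlated with binocular-rivalry alternation rates across individuals, which is the relationship the first experiment reports.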


2015 ◽  
Author(s):  
Gyorgy Lur ◽  
Martin A. Vinck ◽  
Lan Tang ◽  
Jessica A. Cardin ◽  
Michael J. Higley

Summary
Primary neocortical sensory areas act as central hubs, distributing afferent information to numerous cortical and subcortical structures. However, it remains unclear whether each downstream target receives distinct versions of sensory information. We used in vivo calcium imaging combined with retrograde tracing to monitor the visual response properties of three distinct subpopulations of projection neurons in primary visual cortex. While there is overlap across the groups, on average corticotectal (CT) cells exhibit lower contrast thresholds and broader tuning for orientation and spatial frequency than corticostriatal (CS) cells, while corticocortical (CC) cells have intermediate properties. Noise correlation analyses support the hypothesis that CT cells integrate information across diverse layer 5 populations, whereas CS and CC cells form more selectively interconnected groups. Overall, our findings demonstrate the existence of functional subnetworks within layer 5 that may differentially route visual information to behaviorally relevant downstream targets.


2020 ◽  
Author(s):  
Alice Tomassini ◽  
Eric Maris ◽  
Pauline Hilt ◽  
Luciano Fadiga ◽  
Alessandro D’Ausilio

Abstract
Movements overtly sample sensory information, making sensory analysis an active-sensing process. In this study, we show that visual information sampling is not just locked to the (overt) movement dynamics but is structured by the internal (covert) dynamics of cortico-motor control. We asked human participants to perform an isometric motor task, based on proprioceptive feedback, while detecting unrelated near-threshold visual stimuli. The motor output (force) shows zero-lag coherence with brain activity (recorded via electroencephalography) in the beta band, as previously reported. In contrast, cortical rhythms in the alpha band systematically precede the motor output by 200 ms. Importantly, visual detection is facilitated when cortico-motor alpha (not beta) synchronization is enhanced immediately before stimulus onset, namely at the optimal phase relationship for sensorimotor communication. These findings demonstrate an automatic gating of visual inputs by ongoing motor control processes, providing evidence of an internal, alpha-cycling visuomotor loop.


2019 ◽  
Author(s):  
Clément Vinauger ◽  
Floris Van Breugel ◽  
Lauren T. Locke ◽  
Kennedy K.S. Tobin ◽  
Michael H. Dickinson ◽  
...  

Summary
Mosquitoes rely on the integration of multiple sensory cues, including olfactory, visual, and thermal stimuli, to detect, identify and locate their hosts [1–4]. Although increasingly more is known about the role of chemosensory behaviours in mediating mosquito–host interactions [1], the role of visual cues remains comparatively less studied [3], and how olfactory and visual information is integrated in the mosquito brain remains unknown. In the present study, we used a tethered-flight LED arena, which allowed quantitative control over the stimuli, to show that CO2 exposure affects target-tracking responses, but not responses to large-field visual stimuli. In addition, we show that CO2 modulates behavioural responses to visual objects in a time-dependent manner. To gain insight into the neural basis of this olfactory–visual coupling, we conducted two-photon microscopy experiments in a new GCaMP6s-expressing mosquito line. Imaging revealed that the majority of regions of interest (ROIs) in the lobula region of the optic lobe exhibited strong responses to small-field stimuli but little response to a large-field stimulus. Approximately 20% of the neurons we imaged were modulated when an attractive odour preceded the visual stimulus; these same neurons also showed a small response when the odour was presented alone. By contrast, imaging in the antennal lobe revealed no modulation when visual stimuli were presented before or after the olfactory stimulus. Together, our results are the first to reveal the dynamics of olfactory modulation of visually evoked behaviours in mosquitoes, and suggest that the coupling between these sensory systems is asymmetrical and time-dependent.


2018 ◽  
Author(s):  
Xiaofeng Li ◽  
Ahmad Abou Tayoun ◽  
Zhuoyi Song ◽  
An Dau ◽  
Diana Rien ◽  
...  

Abstract
Ca2+-activated K+ channels (BK and SK) are ubiquitous in synaptic circuits, but their role in network adaptation and sensory perception remains largely unknown. Using electrophysiological and behavioral assays and biophysical modelling, we discover how visual information transfer in mutants lacking the BK channel (dSlo−), the SK channel (dSK−) or both (dSK−;;dSlo−) is shaped in the female fruit fly (Drosophila melanogaster) R1-R6 photoreceptor-LMC circuits (R-LMC-R system) through synaptic feedforward-feedback interactions and reduced R1-R6 Shaker and Shab K+ conductances. This homeostatic compensation is specific to each mutant, leading to distinctive adaptive dynamics. We show how these dynamics inescapably increase the energy cost of information and promote the mutants' distorted motion perception, determining the true price and limits of chronic homeostatic compensation in an in vivo genetic animal model. These results reveal why Ca2+-activated K+ channels reduce network excitability (energetics), improving neural adaptability for transmitting and perceiving sensory information.
Significance statement
In this study, we directly link in vivo and ex vivo experiments with detailed stochastically operating biophysical models to extract new mechanistic knowledge of how the Drosophila photoreceptor-interneuron-photoreceptor (R-LMC-R) circuitry homeostatically retains its information sampling and transmission capacity against chronic perturbations in its ion-channel composition, and of the cost of this compensation and its impact on optomotor behavior. We anticipate that this approach will provide a useful template for other model organisms and for computational neuroscience in general in dissecting fundamental mechanisms of homeostatic compensation and deepening our understanding of how biological neural networks work.


1999 ◽  
Vol 13 (2) ◽  
pp. 117-125 ◽  
Author(s):  
Laurence Casini ◽  
Françoise Macar ◽  
Marie-Hélène Giard

Abstract The experiment reported here was aimed at determining whether the level of brain activity can be related to performance in trained subjects. Two tasks were compared: a temporal and a linguistic task. An array of four letters appeared on a screen. In the temporal task, subjects had to decide whether the letters remained on the screen for a short or a long duration, as learned in a practice phase. In the linguistic task, they had to determine whether the four letters could form a word or not (anagram task). These tasks allowed us to compare the level of brain activity associated with correct and incorrect responses. The current density measures recorded over prefrontal areas showed a relationship between performance and the level of activity in the temporal task only: the level of activity obtained with correct responses was lower than that obtained with incorrect responses. This suggests that good temporal performance could be the result of an efficacious, but economical, information-processing mechanism in the brain. In addition, the absence of this relationship in the anagram task raises the question of whether it is specific to the processing of sensory information only.

