head direction cells
Recently Published Documents

TOTAL DOCUMENTS: 128 (FIVE YEARS: 18)
H-INDEX: 30 (FIVE YEARS: 2)

2021 ◽  
Author(s):  
Ningyu Zhang ◽  
Roddy M Grieves ◽  
Kate J Jeffery

A class of neurons showing bidirectional tuning in a two-compartment environment was recently discovered in dysgranular retrosplenial cortex (dRSC). We investigated here whether these neurons possess a more general environmental symmetry-encoding property, potentially useful in representing complex spatial structure. We report that directional tuning of dRSC neurons reflected environment symmetry in onefold-, twofold- and fourfold-symmetric environments: this was the case not just globally, but also locally within each sub-compartment. Thus, these cells use environmental cues to organize multiple directional tuning curves, which may sometimes combine via interaction with classic head direction cells. A consequence is that both local and global environmental symmetry are simultaneously encoded even within local sub-compartments, which may be important for cognitive mapping of the space beyond immediate perceptual reach.
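The environment symmetry described above can be quantified with a circular harmonic (Fourier) decomposition of the directional tuning curve: an n-fold-symmetric cell concentrates power at harmonic n. A minimal illustrative sketch of such an analysis (synthetic data; not the authors' method):

```python
import numpy as np

def symmetry_spectrum(tuning_curve):
    """Magnitude of each circular harmonic of a directional tuning curve;
    an n-fold-symmetric cell shows a peak at harmonic n."""
    centered = tuning_curve - tuning_curve.mean()
    return np.abs(np.fft.rfft(centered)) / len(tuning_curve)

# Synthetic fourfold-symmetric tuning curve over 360 one-degree bins
angles = np.deg2rad(np.arange(360))
rate = 5.0 + 4.0 * np.cos(4 * angles)  # four evenly spaced directional peaks
spec = symmetry_spectrum(rate)
dominant_harmonic = int(np.argmax(spec[1:]) + 1)  # strongest non-DC harmonic
```

For this synthetic cell the dominant harmonic is 4, matching its fourfold symmetry; a classic head direction cell would instead show a dominant first harmonic.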


2021 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Lijun Chao ◽  
Zhi Xiong ◽  
Jianye Liu ◽  
Chuang Yang ◽  
Yudi Chen

Purpose: To address the low intelligence and poor robustness of traditional navigation systems, this paper proposes a brain-inspired localization method for the unmanned aerial vehicle (UAV). Design/methodology/approach: First, the yaw angle of the UAV is obtained by modeling head direction cells with a one-dimensional continuous attractor neural network (1D-CANN) and is then input into the 3D grid cell model. After that, the motion information of the UAV is encoded as the firing of 3D grid cells using a 3D-CANN. Finally, the current position of the UAV is decoded from the neuron firing through the period-adic method. Findings: Simulation results suggest that continuous yaw and position information can be generated from the conjunctive model of head direction cells and grid cells. Originality/value: The proposed period-adic cell decoding method can provide a UAV with its 3D position, and is more intelligent and robust than traditional navigation methods.
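The 1D-CANN for yaw can be caricatured as a ring attractor whose activity bump is shifted by angular velocity and re-sharpened by recurrent dynamics. A minimal sketch under simplifying assumptions (integer-degree shift per timestep via an explicit roll, a Gaussian excitation kernel with squaring nonlinearity and divisive inhibition; not the paper's implementation):

```python
import numpy as np

N = 360  # one neuron per degree of heading
prefs = np.arange(N)  # preferred heading of each neuron, in degrees

# Gaussian local-excitation kernel; with the nonlinearity and divisive
# normalization below it keeps a single stable activity bump on the ring
kernel = np.exp(-0.5 * (np.arange(-10, 11) / 3.0) ** 2)

def step(activity, ang_vel_deg):
    """One timestep: path-integrate heading by shifting the bump, then let
    the attractor dynamics re-sharpen and renormalize it."""
    shifted = np.roll(activity, int(round(ang_vel_deg)))  # simplified shift
    padded = np.concatenate([shifted[-10:], shifted, shifted[:10]])  # ring wrap
    excited = np.convolve(padded, kernel, mode="valid")  # recurrent excitation
    excited = np.clip(excited, 0.0, None) ** 2  # nonlinearity sharpens the bump
    return excited / excited.sum()  # divisive (global) inhibition

def decode_heading(activity):
    """Population-vector decode of the bump position, in degrees."""
    ang = np.deg2rad(prefs)
    return np.rad2deg(np.arctan2((activity * np.sin(ang)).sum(),
                                 (activity * np.cos(ang)).sum())) % 360

# Start with the bump at 0 degrees and integrate 90 steps of +1 deg/step
activity = np.zeros(N)
activity[0] = 1.0
for _ in range(90):
    activity = step(activity, 1.0)
```

Decoding `activity` after the loop recovers a heading near 90 degrees. A biologically grounded CANN would move the bump with asymmetric, velocity-conjunctive connections rather than an explicit array roll; the roll stands in for that mechanism here.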


2021 ◽  
Author(s):  
Danil Akhtiamov ◽  
Anthony G. Cohn ◽  
Yuri Alexander Dabaghian

A common approach to interpreting spiking activity is based on identifying firing fields: regions in physical or configuration spaces that elicit responses of neurons. Common examples include hippocampal place cells that fire at preferred locations in the navigated environment, head direction cells that fire at preferred orientations of the animal's head, view cells that respond to preferred spots in the visual field, and so on. In all these cases, firing fields were discovered empirically, by trial and error. We argue that the existence and a number of properties of firing fields can be established theoretically, through topological analyses of the neuronal spiking activity.
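Empirical identification of firing fields typically amounts to thresholding a firing-rate map and taking contiguous supra-threshold regions. A sketch for a linearized (1D) rate map, assuming a 20%-of-peak threshold (a common but arbitrary criterion, not taken from this paper):

```python
import numpy as np

def firing_fields(rate_map, frac=0.2):
    """Return (start, end) bin ranges of firing fields, defined as runs of
    contiguous bins whose rate exceeds `frac` of the peak rate."""
    above = rate_map > frac * rate_map.max()
    fields, start = [], None
    for i, hi in enumerate(above):
        if hi and start is None:
            start = i
        elif not hi and start is not None:
            fields.append((start, i))
            start = None
    if start is not None:  # field touching the end of the track
        fields.append((start, len(above)))
    return fields

# Synthetic place cell on a 100-bin linear track with two Gaussian fields
x = np.arange(100)
rate = (8.0 * np.exp(-0.5 * ((x - 25) / 4.0) ** 2)
        + 6.0 * np.exp(-0.5 * ((x - 70) / 5.0) ** 2))
fields = firing_fields(rate)
```

The same thresholding logic extends to 2D rate maps (with connected-component labeling) and, for head direction cells, to circular tuning curves.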


2021 ◽  
Vol 31 (12) ◽  
pp. R781-R783
Author(s):  
Aylin Apostel ◽  
Jonas Rose

Information ◽  
2021 ◽  
Vol 12 (3) ◽  
pp. 100
Author(s):  
Simon Gay ◽  
Kévin Le Run ◽  
Edwige Pissaloux ◽  
Katerine Romeo ◽  
Christèle Lecomte

This paper presents a novel predictive model of visual navigation inspired by mammalian navigation. This model takes inspiration from specific types of neurons observed in the brain, namely place cells, grid cells and head direction cells. In the proposed model, place cells are structures that store and connect local representations of the explored environment, while grid cells and head direction cells make predictions based on these representations to define the position of the agent in a place cell's reference frame. This specific use of navigation cells has three advantages: first, the environment representations are stored by place cells and require only a few spatialized descriptors or elements, making this model suitable for integrating large-scale environments (indoor and outdoor); second, the grid cell modules act as an efficient visual and absolute odometry system; finally, the model provides sequential spatial tracking that can integrate and track an agent in redundant environments or environments with very few or no distinctive cues, while being very robust to environmental changes. This paper focuses on the formalization of the architecture and the main elements and properties of this model. The model has been successfully validated on basic functions: mapping, guidance, homing and finding shortcuts. The precision of the estimated position of the agent and the robustness to environmental changes during navigation were shown to be satisfactory. The proposed predictive model is intended to be used on autonomous platforms, but also to assist visually impaired people in their mobility.


Author(s):  
Xiaoyang Long ◽  
Sheng-Jia Zhang

Spatially selective firing of place cells, grid cells, boundary vector/border cells and head direction cells constitutes the basic building blocks of a canonical spatial navigation system centered on the hippocampal-entorhinal complex. While head direction cells can be found throughout the brain, spatial tuning outside the hippocampal formation is often non-specific or conjunctive with other representations such as reward. Although the precise mechanism of spatially selective firing activity is not understood, various studies show that sensory inputs, particularly vision, heavily modulate spatial representation in the hippocampal-entorhinal circuit. To better understand the contribution of other sensory inputs in shaping spatial representation in the brain, we performed recordings from the primary somatosensory cortex in foraging rats. To our surprise, we were able to detect the full complement of spatially selective firing patterns similar to those reported in the hippocampal-entorhinal network, namely place cells, head direction cells, boundary vector/border cells, grid cells and conjunctive cells, in the somatosensory cortex. These newly identified somatosensory spatial cells form a spatial map outside the hippocampal formation and support the hypothesis that location information modulates body representation in the somatosensory cortex. Our findings provide transformative insights into our understanding of how spatial information is processed and integrated in the brain, as well as functional operations of the somatosensory cortex in the context of rehabilitation with brain-machine interfaces.


IEEE Access ◽  
2021 ◽  
pp. 1-1
Author(s):  
Baozhong Li ◽  
Yanming Liu ◽  
Lei Lai

Author(s):  
Elhanan Ben-Yishay ◽  
Ksenia Krivoruchko ◽  
Shaked Ron ◽  
Nachum Ulanovsky ◽  
Dori Derdikman ◽  
...  

Birds strongly rely on spatial memory and navigation. However, it is unknown how space is represented in the avian brain. Here we used tetrodes to record neurons from the hippocampal formation (HPF) of Japanese quails – a migratory ground-dwelling species – while the quails roamed a 1×1-meter arena (>2,100 neurons from 21 birds). Whereas spatially-modulated cells (place-cells, border-cells, etc.) were generally not encountered, the firing-rate of 12% of the neurons was unimodally and significantly modulated by the head-azimuth – i.e. these were head-direction cells (HD cells, n=260). Typically, HD cells were maximally active at one preferred-direction and minimally at the opposite null-direction, with preferred-directions spanning all 360°. The HD tuning was relatively broad (mean= 130°), independent of the animal’s position and speed, and was stable during the recording-session. These findings support the existence of an allocentric head-direction representation in the quail HPF, and provide the first demonstration of head-direction cells in birds.
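Preferred direction and tuning strength of head-direction cells like those reported above are commonly estimated with circular statistics: the spike-weighted circular mean gives the preferred direction, and the mean resultant (Rayleigh) vector length measures how strongly firing is unimodally tuned. A sketch of this standard analysis on synthetic data (not necessarily the study's exact pipeline):

```python
import numpy as np

def hd_tuning(head_azimuth, spike_counts):
    """Preferred direction (radians) and mean resultant vector length of a
    putative head-direction cell, from per-time-bin head azimuth and spike
    counts, via the spike-weighted circular mean."""
    C = (spike_counts * np.cos(head_azimuth)).sum()
    S = (spike_counts * np.sin(head_azimuth)).sum()
    preferred = np.arctan2(S, C) % (2 * np.pi)
    vector_length = np.hypot(C, S) / spike_counts.sum()
    return preferred, vector_length

# Synthetic HD cell: von Mises tuning peaked at 120 degrees, Poisson spiking
rng = np.random.default_rng(0)
azimuth = rng.uniform(0.0, 2.0 * np.pi, 20000)  # head direction per time bin
mean_rate = np.exp(2.0 * np.cos(azimuth - np.deg2rad(120.0)))
spikes = rng.poisson(mean_rate)
pref, r = hd_tuning(azimuth, spikes)
```

A vector length near 0 indicates untuned firing, near 1 a very sharp preferred direction; in practice significance is assessed against shuffled spike trains rather than a fixed cutoff.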


2020 ◽  
Vol 123 (4) ◽  
pp. 1392-1406 ◽  
Author(s):  
Juan Ignacio Sanguinetti-Scheck ◽  
Michael Brecht

The home is a unique location in the life of humans and animals. In rats, home presents itself as a multicompartmental space that involves integrating navigation through subspaces. Here we embedded the laboratory rat’s home cage in the arena, while recording neurons in the animal’s parasubiculum and medial entorhinal cortex, two brain areas encoding the animal’s location and head direction. We found that head direction signals were unaffected by home cage presence or translocation. Head direction cells remain globally stable and have similar properties inside and outside the embedded home. We did not observe egocentric bearing encoding of the home cage. However, grid cells were distorted in the presence of the home cage. While they did not globally remap, single firing fields were translocated toward the home. These effects appeared to be geometrical in nature rather than a home-specific distortion and were not dependent on explicit behavioral use of the home cage during a hoarding task. Our work suggests that medial entorhinal cortex and parasubiculum do not remap after embedding the home, but local changes in grid cell activity overrepresent the embedded space location and might contribute to navigation in complex environments.

NEW & NOTEWORTHY: Neural findings in the field of spatial navigation come mostly from an abstract approach that separates the animal from even a minimally biological context. In this article we embed the home cage of the rat in the environment to address some of the complexities of natural navigation. We find no explicit home cage representation. While both head direction cells and grid cells remain globally stable, we find that embedded spaces locally distort grid cells.

