Visually and Tactually Guided Grasps Lead to Different Neuronal Activity in Non-human Primates

2021 ◽  
Vol 15 ◽  
Author(s):  
Daniela Buchwald ◽  
Hansjörg Scherberger

Movements are defining characteristics of all behaviors. Animals walk around, move their eyes to explore the world, or touch structures to learn more about them. So far, we have only a basic understanding of how the brain generates movements, especially of how different brain areas interact with each other to do so. In this study, we investigated the influence of sensory object information on grasp planning in four brain areas involved in vision, touch, movement planning, and movement generation: the parietal, somatosensory, premotor, and motor cortex. We trained one monkey to grasp objects that he either saw or touched beforehand while we continuously recorded neural spiking activity with chronically implanted floating multi-electrode arrays. The animal sat in the dark and either looked at a briefly illuminated object or reached out and explored the object with his hand in the dark before lifting it up. In a first analysis, we confirmed that the animal not only memorized the object in both tasks but also applied an object-specific grip type, independent of the sensory modality. In the neuronal population, we found a significant difference between sensory modalities in the number of tuned units during grasp planning that persisted into grasp execution. These differences were sufficient for a classifier to decode both the object and the sensory modality in single trials exclusively from neural population activity. These results give valuable insights into how different brain areas contribute to the preparation of grasp movements and how different sensory streams can lead to distinct neural activity while still resulting in the same executed action.
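
A minimal sketch of the kind of single-trial decoding analysis described above, assuming hypothetical trial counts, unit counts, and labels and an off-the-shelf linear classifier (the study's actual classifier and preprocessing are not specified here):

```python
# Hypothetical sketch: decode object identity and sensory modality from
# single-trial population activity. Trial/unit counts, labels, and the
# classifier are assumptions; synthetic Poisson counts stand in for the
# recorded planning-epoch firing rates.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_units = 240, 96

rates = rng.poisson(5.0, size=(n_trials, n_units)).astype(float)  # spike counts per unit and trial
objects = rng.integers(0, 6, size=n_trials)    # which of six objects was grasped
modality = rng.integers(0, 2, size=n_trials)   # 0 = visual cue, 1 = tactile cue

# Cross-validated linear classification of each label from the same activity.
for name, labels in [("object", objects), ("modality", modality)]:
    acc = cross_val_score(LinearDiscriminantAnalysis(), rates, labels, cv=5)
    print(f"{name}: mean single-trial decoding accuracy = {acc.mean():.2f}")
```

With real planning-epoch activity in place of the synthetic counts, above-chance accuracy for both labels would correspond to the population-level result reported above.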

2018 ◽  
Author(s):  
Chethan Pandarinath ◽  
K. Cora Ames ◽  
Abigail A Russo ◽  
Ali Farshchian ◽  
Lee E Miller ◽  
...  

In the fifty years since Evarts first recorded single neurons in motor cortex of behaving monkeys, great effort has been devoted to understanding their relation to movement. Yet these single neurons exist within a vast network, the nature of which has been largely inaccessible. With advances in recording technologies, algorithms, and computational power, the ability to study network-level phenomena is increasing exponentially. Recent experimental results suggest that the dynamical properties of these networks are critical to movement planning and execution. Here we discuss this dynamical systems perspective, and how it is reshaping our understanding of the motor cortices. Following an overview of key studies in motor cortex, we discuss techniques to uncover the “latent factors” underlying observed neural population activity. Finally, we discuss efforts to leverage these factors to improve the performance of brain-machine interfaces, promising to make these findings broadly relevant to neuroengineering as well as systems neuroscience.
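
As a concrete illustration of the "latent factors" idea, here is a minimal sketch that recovers a low-dimensional factor trajectory from binned population spike counts using plain factor analysis; the synthetic data, bin count, and dimensionality are assumptions, and dedicated methods such as GPFA or LFADS add temporal smoothing or explicit dynamics on top of this basic step:

```python
# Minimal sketch of extracting latent factors from binned population activity.
# All shapes and the synthetic data are illustrative assumptions.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(1)
n_bins, n_units, n_factors = 500, 120, 8

# Synthetic counts: a few shared latent signals drive many units, plus noise.
latents = rng.standard_normal((n_bins, n_factors))
loadings = rng.standard_normal((n_factors, n_units))
counts = latents @ loadings + 0.5 * rng.standard_normal((n_bins, n_units))

fa = FactorAnalysis(n_components=n_factors)
factors = fa.fit_transform(counts)   # (n_bins, n_factors) latent trajectory
print(factors.shape)
```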


eLife ◽  
2020 ◽  
Vol 9 ◽  
Author(s):  
Adrian Ponce-Alvarez ◽  
Gabriela Mochol ◽  
Ainhoa Hermoso-Mendizabal ◽  
Jaime de la Rocha ◽  
Gustavo Deco

Previous research showed that spontaneous neuronal activity presents sloppiness: the collective behavior is strongly determined by a small number of parameter combinations, defined as ‘stiff’ dimensions, while it is insensitive to many others (‘sloppy’ dimensions). Here, we analyzed neural population activity from the auditory cortex of anesthetized rats while the brain spontaneously transited through different synchronized and desynchronized states and intermittently received sensory inputs. We showed that cortical state transitions were determined by changes in stiff parameters associated with the activity of a core of neurons with low responses to stimuli and high centrality within the observed network. In contrast, stimulus-evoked responses evolved along sloppy dimensions associated with the activity of neurons with low centrality and displaying large ongoing and stimulus-evoked fluctuations without affecting the integrity of the network. Our results shed light on the interplay among stability, flexibility, and responsiveness of neuronal collective dynamics during intrinsic and induced activity.
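
The stiff/sloppy distinction can be made concrete with a Fisher-information spectrum. The sketch below assumes a pairwise maximum-entropy description of binarized population activity and synthetic data; it approximates the information matrix by the covariance of the model's sufficient statistics under the data and inspects how widely its eigenvalues spread. Directions with large eigenvalues are the stiff parameter combinations; the many directions with tiny eigenvalues are the sloppy ones.

```python
# Sketch of a sloppiness analysis for a pairwise maximum-entropy (Ising-like)
# model of binary "spike words". The covariance of the sufficient statistics
# (single-neuron activities and pairwise products) is used as a proxy for the
# Fisher information matrix at the fitted point; data are synthetic.
import numpy as np

rng = np.random.default_rng(2)
n_samples, n_neurons = 5000, 20

# Correlated binary activity: a shared drive induces pairwise correlations.
drive = rng.standard_normal((n_samples, 1))
x = (rng.standard_normal((n_samples, n_neurons)) + 0.8 * drive > 0.5).astype(float)

# Sufficient statistics per sample: x_i and x_i * x_j (i < j).
iu = np.triu_indices(n_neurons, k=1)
pairs = x[:, iu[0]] * x[:, iu[1]]
stats = np.hstack([x, pairs])                  # (n_samples, n_params)

fim = np.cov(stats, rowvar=False)              # proxy Fisher information matrix
eigvals = np.linalg.eigvalsh(fim)[::-1]        # descending: stiff first, sloppy last
span = np.log10(eigvals[0] / max(eigvals[-1], 1e-12))
print(f"eigenvalues span about {span:.1f} orders of magnitude")
```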


2014 ◽  
Vol 112 (9) ◽  
pp. 2290-2301 ◽  
Author(s):  
Jean Blouin ◽  
Anahid H. Saradjian ◽  
Nicolas Lebar ◽  
Alain Guillaume ◽  
Laurence Mouchnino

Behavioral studies have suggested that the brain uses a visual estimate of the hand to plan reaching movements toward visual targets and somatosensory inputs in the case of somatosensory targets. However, neural correlates for distinct coding of the hand according to the sensory modality of the target have not yet been identified. Here we tested the twofold hypothesis that the somatosensory input from the reaching hand is facilitated when planning movements toward somatosensory targets (unseen fingers) and inhibited when planning movements toward visual targets. The weight of the somatosensory inputs was assessed by measuring the amplitude of the somatosensory evoked potential (SEP) elicited by vibration of the reaching finger during movement planning. The target sensory modality had no significant effect on SEP amplitude. However, Spearman analyses revealed significant correlations between SEPs and reaching errors. When planning movements toward proprioceptive targets without visual feedback of the reaching hand, participants with larger SEPs produced smaller directional errors. Conversely, when planning movements toward visual targets with visual feedback of the reaching hand, participants with smaller SEPs produced smaller directional errors. No significant correlation was found between SEPs and radial or amplitude errors. Our results indicate that the sensory strategy for planning movements varies considerably across individuals, even within a given sensory context. Most importantly, they provide a neural basis for the suggestion that optimal movement planning requires the target and the reaching hand to both be represented in the same sensory modality.
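
A minimal sketch of the correlation analysis described above, relating per-participant SEP amplitudes during planning to directional reaching errors with a Spearman test; the values are invented for illustration:

```python
# Hypothetical per-participant data: SEP amplitude during movement planning
# versus mean directional reaching error. All numbers are made up.
import numpy as np
from scipy.stats import spearmanr

sep_amplitude = np.array([1.2, 0.8, 1.5, 0.6, 1.1, 0.9, 1.4, 0.7])        # a.u.
directional_error = np.array([2.1, 3.5, 1.6, 4.0, 2.4, 3.1, 1.8, 3.8])    # degrees

rho, p = spearmanr(sep_amplitude, directional_error)
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")  # negative rho: larger SEPs, smaller errors
```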


eLife ◽  
2015 ◽  
Vol 4 ◽  
Author(s):  
Matthew D Golub ◽  
Byron M Yu ◽  
Steven M Chase

To successfully guide limb movements, the brain takes in sensory information about the limb, internally tracks the state of the limb, and produces appropriate motor commands. It is widely believed that this process uses an internal model, which describes our prior beliefs about how the limb responds to motor commands. Here, we leveraged a brain-machine interface (BMI) paradigm in rhesus monkeys and novel statistical analyses of neural population activity to gain insight into moment-by-moment internal model computations. We discovered that a mismatch between subjects’ internal models and the actual BMI explains roughly 65% of movement errors, as well as long-standing deficiencies in BMI speed control. We then used the internal models to characterize how the neural population activity changes during BMI learning. More broadly, this work provides an approach for interpreting neural population activity in the context of how prior beliefs guide the transformation of sensory input to motor output.
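
A toy sketch of the core idea, under the assumption of arbitrary linear decoders (the study estimated the internal model from neural population activity; nothing here reproduces that estimation): when the subject selects neural activity that would produce the intended cursor velocity under a mismatched internal model, the actual BMI output deviates systematically.

```python
# Toy illustration of internal-model mismatch in a linear BMI. Both decoders
# and the mismatch magnitude are arbitrary assumptions.
import numpy as np

rng = np.random.default_rng(3)
n_units = 30

actual = rng.standard_normal((2, n_units))                     # decoder actually in use
internal = actual + 0.4 * rng.standard_normal((2, n_units))    # subject's mismatched internal model

angles = np.linspace(0, 2 * np.pi, 16, endpoint=False)
errors = []
for a in angles:
    v_intended = np.array([np.cos(a), np.sin(a)])
    # Neural activity that would yield v_intended under the internal model
    # (minimum-norm solution).
    u = np.linalg.pinv(internal) @ v_intended
    v_actual = actual @ u                                       # what the BMI actually outputs
    cosang = v_actual @ v_intended / (np.linalg.norm(v_actual) * np.linalg.norm(v_intended))
    errors.append(np.degrees(np.arccos(np.clip(cosang, -1, 1))))

print(f"mean angular error due to mismatch: {np.mean(errors):.1f} deg")
```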


2019 ◽  
Vol 116 (30) ◽  
pp. 15210-15215 ◽  
Author(s):  
Emily R. Oby ◽  
Matthew D. Golub ◽  
Jay A. Hennig ◽  
Alan D. Degenhart ◽  
Elizabeth C. Tyler-Kabara ◽  
...  

Learning has been associated with changes in the brain at every level of organization. However, it remains difficult to establish a causal link between specific changes in the brain and new behavioral abilities. Here, we establish that new neural activity patterns emerge with learning and demonstrate that these new patterns cause the new behavior. Thus, the formation of new patterns of neural population activity can underlie the learning of new skills.


Life ◽  
2021 ◽  
Vol 11 (4) ◽  
pp. 296
Author(s):  
Rodrigo Araneda ◽  
Sandra Silva Moura ◽  
Laurence Dricot ◽  
Anne G. De Volder

Using functional magnetic resonance imaging, we monitored brain activity in 12 early blind subjects and 12 blindfolded control subjects, matched for age, gender, and musical experience, during a beat-detection task. Subjects were required to discriminate regular ("beat") from irregular ("no beat") rhythmic sequences composed of sounds or vibrotactile stimuli. In both sensory modalities, the brain activity differences between the two groups involved heteromodal brain regions, including parietal and frontal cortical areas, as well as occipital brain areas that were recruited in the early blind group only. Accordingly, early blindness induced plasticity changes in the cerebral pathways involved in rhythm perception, with participation of the visually deprived occipital areas regardless of the input sensory modality. We conclude that the visually deprived cortex switches its input modality from vision to audition and the vibrotactile sense to perform this temporal processing task, supporting the concept of a metamodal, multisensory organization of this cortex.
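
For illustration, a small sketch of the two stimulus classes in the beat-detection task, assuming an arbitrary inter-onset interval and jitter (the study's actual stimulus parameters are not given here): regular sequences have isochronous onsets, irregular ones jitter each onset.

```python
# Hypothetical generation of "beat" (regular) vs "no beat" (irregular) onset
# sequences; event count, inter-onset interval, and jitter are assumptions.
import numpy as np

rng = np.random.default_rng(4)

def rhythm_onsets(n_events=8, ioi=0.5, jitter=0.0):
    """Onset times (s) of a rhythmic sequence; jitter=0 gives a regular beat."""
    onsets = np.arange(n_events) * ioi
    return onsets + rng.uniform(-jitter, jitter, size=n_events)

beat = rhythm_onsets(jitter=0.0)       # regular sequence
no_beat = rhythm_onsets(jitter=0.15)   # irregular sequence
print(np.round(np.diff(beat), 3), np.round(np.diff(no_beat), 3))
```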


2021 ◽  
Vol 15 ◽  
Author(s):  
Daniela Buchwald ◽  
Stefan Schaffelhofer ◽  
Matthias Dörge ◽  
Benjamin Dann ◽  
Hansjörg Scherberger

Grasping movements are among the most common movements primates make every day. They are important for social interactions as well as for picking up objects or food. Usually, these grasping movements are guided by vision, but proprioceptive and haptic inputs contribute greatly. Since grasping behaviors are common and easy to motivate, they represent an ideal task for understanding the role of different brain areas during the planning and execution of complex voluntary movements in primates. For experimental purposes, a stable and repeatable presentation of the same object, as well as the variation of objects, is important for understanding the neural control of movement generation. This is even more the case when investigating the role of different senses in movement planning, where objects need to be presented in specific sensory modalities. We developed a turntable setup for non-human primates (macaque monkeys) to investigate visually and tactually guided grasping movements, with an option to easily exchange objects. The setup consists of a turntable that holds six different objects and can be exchanged easily during the experiment to increase the number of presented objects. The turntable is connected to a stepper motor through a belt system to automate rotation and hence object presentation. By increasing the distance between the turntable and the stepper motor, the motor's metallic components are kept away from the actual recording setup, which allows the use of a magnetic-based data glove to track hand kinematics. During task execution, the animal sits in the dark and is instructed to grasp the object in front of it. A light above the object can be turned on for visual presentation of the objects, while the object can also remain in the dark for exclusively tactile exploration. A red LED projected onto the object by a one-way mirror serves as the grasp cue instructing the animal to start grasping the object. By comparing kinematic data from the magnetic-based data glove with simultaneously recorded neural signals, this setup enables the systematic investigation of the neural population activity involved in the control of hand grasping movements.
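
A hypothetical sketch of one trial of this task. The hardware interface below is a printing stub standing in for the real stepper-motor, light, LED, data-glove, and recording drivers, which are not specified in the text; step counts and timings are likewise assumptions.

```python
# Hypothetical trial controller for the turntable task; all hardware calls
# are stand-ins that print instead of driving real devices.
import time

class TurntableHW:
    """Printing stub for the real hardware drivers."""
    def rotate_turntable(self, steps): print(f"stepper: move {steps} steps")
    def set_object_light(self, on): print(f"object light {'on' if on else 'off'}")
    def set_go_led(self, on): print(f"go LED {'on' if on else 'off'}")
    def record_glove_and_neural(self, duration_s): print(f"recording {duration_s}s of kinematics + spikes")

STEPS_PER_SLOT = 200   # hypothetical: motor steps between adjacent object positions

def run_trial(hw, object_slot, modality):
    """Present one of six objects visually or tactually, then cue the grasp."""
    hw.rotate_turntable(object_slot * STEPS_PER_SLOT)   # bring the object in front of the animal
    if modality == "visual":
        hw.set_object_light(True); time.sleep(0.7); hw.set_object_light(False)  # brief illumination
    else:                                               # tactile: explore the object in the dark
        time.sleep(2.0)
    hw.set_go_led(True)                                 # red LED via the one-way mirror = go cue
    hw.record_glove_and_neural(duration_s=3.0)          # hand kinematics + spiking activity
    hw.set_go_led(False)

run_trial(TurntableHW(), object_slot=2, modality="tactile")
```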


2019 ◽  
pp. 301-336
Author(s):  
György Buzsáki

This chapter discusses the hypothesis that the strongly skewed nature of our perceptions and memories results from log-normal distributions of anatomical connectivity at both micro- and mesoscales, of synaptic weights, of firing rates, and of neuronal population activity. Nearly all anatomical and physiological features of the brain are part of a continuous but wide distribution, typically obeying a log-normal form. This organization implies that the interactions that give rise to such distributions involve multiplication or division of random factors, resulting in values that can span several orders of magnitude. Neuronal networks with such broad distributions are needed to maintain stability against competing needs, including a wide dynamic range, redundancy, resilience, homeostasis, and plasticity. These features of the brain may explain the Weber-Fechner law: for any sensory modality, perceptual intensity is a logarithmic function of physical intensity. Neuronal systems organized according to log rules form brain networks that can produce good-enough and fast decisions in most situations using only a subset of the brain's resources.
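
A short numerical illustration of the multiplicative argument: multiplying many independent random factors produces a right-skewed, approximately log-normal distribution, because the logarithms add and the central limit theorem applies in log space (the factor count and ranges below are arbitrary).

```python
# Multiplying independent random factors yields a right-skewed, roughly
# log-normal distribution; the specific factors here are arbitrary.
import numpy as np

rng = np.random.default_rng(5)
values = rng.uniform(0.5, 2.0, size=(100_000, 12)).prod(axis=1)  # 12 multiplicative factors

print(f"mean {values.mean():.2f} vs median {np.median(values):.2f} (right skew)")
logs = np.log(values)
print(f"log-values: mean {logs.mean():.2f}, std {logs.std():.2f} (approximately Gaussian)")
```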


2019 ◽  
Vol 2 (1) ◽  
Author(s):  
Christopher Heelan ◽  
Jihun Lee ◽  
Ronan O’Shea ◽  
Laurie Lynch ◽  
David M. Brandman ◽  
...  

Direct electronic communication with sensory areas of the neocortex is a challenging ambition for brain-computer interfaces. Here, we report the first successful neural decoding of English words with high intelligibility from intracortical spike-based neural population activity recorded from the secondary auditory cortex of macaques. We acquired 96-channel full-broadband population recordings using intracortical microelectrode arrays in the rostral and caudal parabelt regions of the superior temporal gyrus (STG). We leveraged a new neural processing toolkit to investigate the effects of decoding algorithm, neural preprocessing, audio representation, channel count, and array location on neural decoding performance. The presented spike-based machine learning neural decoding approach may further be useful in informing future encoding strategies to deliver direct auditory percepts to the brain as specific patterns of microstimulation.
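
A heavily simplified sketch in this spirit, assuming hypothetical data shapes and reducing the problem to word classification from binned spike counts; the actual study reconstructed intelligible audio with a more elaborate toolkit.

```python
# Hypothetical word classification from binned multi-channel spike counts.
# Shapes, word set, and classifier are assumptions; synthetic data only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(6)

n_trials, n_channels, n_bins, n_words = 300, 96, 20, 6   # 96-channel array, 20 bins per word
X = rng.poisson(3.0, size=(n_trials, n_channels * n_bins)).astype(float)  # flattened spike counts
y = rng.integers(0, n_words, size=n_trials)                               # heard-word label per trial

acc = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=5)
print(f"mean word-classification accuracy: {acc.mean():.2f}")
```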

