Coding of whisker motion across the mouse face

eLife ◽  
2019 ◽  
Vol 8 ◽  
Author(s):  
Kyle S Severson ◽  
Duo Xu ◽  
Hongdian Yang ◽  
Daniel H O'Connor

Haptic perception synthesizes touch with proprioception, the sense of body position. Humans and mice alike experience rich active touch of the face. Because most facial muscles lack proprioceptor endings, the sensory basis of facial proprioception remains unsolved. Facial proprioception may instead rely on mechanoreceptors that encode both touch and self-motion. In rodents, whisker mechanoreceptors provide a signal that informs the brain about whisker position. Whisking involves coordinated orofacial movements, so mechanoreceptors innervating facial regions other than whiskers could also provide information about whisking. To define all sources of sensory information about whisking available to the brain, we recorded spikes from mechanoreceptors innervating diverse parts of the face. Whisker motion was encoded best by whisker mechanoreceptors, but also by those innervating whisker pad hairy skin and supraorbital vibrissae. Redundant self-motion responses may provide the brain with a stable proprioceptive signal despite mechanical perturbations during active touch.
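
The encoding result summarized here can be illustrated with a simple analysis: regress a smoothed spike rate against the tracked whisker angle and compare fit quality across afferent types. Below is a minimal, hypothetical sketch assuming made-up spike times, a whisker-angle trace, and a 500 Hz sampling rate; none of the variable names or numbers come from the paper.

```python
import numpy as np

def angle_encoding_r2(spike_times, angle, fs=500.0, sigma_ms=20.0):
    """Crude measure of how well a unit's firing rate tracks whisker angle.

    spike_times : spike times in seconds
    angle       : whisker angle sampled at fs (degrees)
    Returns the R^2 of a linear fit of the smoothed rate onto angle.
    """
    t = np.arange(len(angle)) / fs
    # Bin spikes on the video timebase and smooth with a Gaussian kernel
    counts, _ = np.histogram(spike_times, bins=np.append(t, t[-1] + 1 / fs))
    sigma = sigma_ms / 1000.0 * fs
    k = np.exp(-0.5 * (np.arange(-3 * sigma, 3 * sigma + 1) / sigma) ** 2)
    rate = np.convolve(counts * fs, k / k.sum(), mode="same")
    # Linear fit: rate ~ a * angle + b
    a, b = np.polyfit(angle, rate, 1)
    pred = a * angle + b
    ss_res = np.sum((rate - pred) ** 2)
    ss_tot = np.sum((rate - rate.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

# Synthetic example: a unit whose rate rises with whisker protraction
fs = 500.0
t = np.arange(0, 10, 1 / fs)
angle = 15 * np.sin(2 * np.pi * 8 * t)          # 8 Hz whisking, +/- 15 degrees
lam = np.clip(20 + 1.5 * angle, 0, None) / fs   # spikes per bin
spikes = t[np.random.rand(len(t)) < lam]
print(f"angle-encoding R^2 = {angle_encoding_r2(spikes, angle, fs):.2f}")
```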

2018 ◽  
Author(s):  
Kyle S. Severson ◽  
Duo Xu ◽  
Hongdian Yang ◽  
Daniel H. O’Connor

Haptic perception synthesizes touch with proprioception, or sense of body position. Humans and mice alike experience rich active touch of the face. Because most facial muscles lack proprioceptor endings, the sensory basis of facial proprioception remains unsolved. Facial proprioception may instead rely on mechanoreceptors that encode both touch and self-motion. In rodents, whisker mechanoreceptors provide a signal that informs the brain about whisker position. Whisking involves coordinated orofacial movements, so mechanoreceptors innervating facial regions other than whiskers could also provide information about whisking. To define all sources of sensory information about whisking available to the brain, we recorded spikes from mechanoreceptors innervating diverse parts of the face. Whisker motion was encoded best by whisker mechanoreceptors, but also by those innervating whisker pad hairy skin and supraorbital vibrissae. Redundant self-motion responses may provide the brain with a stable proprioceptive signal despite mechanical perturbations such as whisker growth and active touch.


Author(s):  
Kathleen E. Cullen

As we go about our everyday activities, our brain computes accurate estimates both of our motion relative to the world and of our orientation relative to gravity. Essential to this computation is the information provided by the vestibular system; it detects the rotational velocity and linear acceleration of our heads relative to space, making a fundamental contribution to our perception of self-motion and spatial orientation. Additionally, in everyday life, our perception of self-motion depends on the integration of both vestibular and nonvestibular cues, including visual and proprioceptive information. Furthermore, the integration of motor-related information is also required for perceptual stability, so that the brain can distinguish whether the experienced sensory inflow was a result of active self-motion through the world or was instead externally generated. To date, understanding how the brain encodes and integrates sensory cues with motor signals for the perception of self-motion during natural behaviors remains a major goal in neuroscience. Recent experiments have (i) provided new insights into the neural code used to represent sensory information in vestibular pathways, (ii) established that vestibular pathways are inherently multimodal at the earliest stages of processing, and (iii) revealed that self-motion information processing is adjusted to meet the needs of specific tasks. This article reviews our current understanding of how the brain integrates sensory information and motor-related signals to encode self-motion and ensure perceptual stability during everyday activities.
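
One computation this review highlights, telling self-generated from externally applied motion, is commonly sketched as subtracting an efference-copy-based prediction from the vestibular afferent signal; whatever remains (exafference) is attributed to the external world. The snippet below is a schematic illustration of that generic idea, not the review's model; all signals, gains, and noise levels are invented.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 2, 400)

# Head velocity = voluntary (self-generated) + externally applied component
active_motion = 30 * np.sin(2 * np.pi * 1.0 * t)   # deg/s, commanded head turn
passive_motion = 10 * (t > 1.0)                     # deg/s, platform rotation
canal_afference = active_motion + passive_motion + rng.normal(0, 1, t.size)

# An efference copy of the motor command predicts the self-generated (reafferent) part
predicted_reafference = 30 * np.sin(2 * np.pi * 1.0 * t)

# Cancelling the prediction leaves an estimate of externally generated motion
exafference_estimate = canal_afference - predicted_reafference

print("mean estimated external motion after t = 1 s: "
      f"{exafference_estimate[t > 1.0].mean():.1f} deg/s (true value: 10.0)")
```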


2020 ◽  
Vol 30 (10) ◽  
pp. 5471-5483
Author(s):  
Y Yau ◽  
M Dadar ◽  
M Taylor ◽  
Y Zeighami ◽  
L K Fellows ◽  
...  

Current models of decision-making assume that the brain gradually accumulates evidence and drifts toward a threshold that, once crossed, results in a choice selection. These models have been especially successful in primate research; however, transposing them to human fMRI paradigms has proved challenging. Here, we exploit the face-selective visual system and test whether decoded emotional facial features from multivariate fMRI signals during a dynamic perceptual decision-making task are related to the parameters of computational models of decision-making. We show that trial-by-trial variations in the pattern of neural activity in the fusiform gyrus reflect facial emotional information and modulate drift rates during deliberation. We also observed an inverse-urgency signal based in the caudate nucleus that was independent of sensory information but appeared to slow decisions, particularly when information in the task was ambiguous. Taken together, our results characterize how decision parameters from a computational model (i.e., drift rate and urgency signal) are involved in perceptual decision-making and reflected in the activity of the human brain.
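
The two model parameters discussed here, drift rate and an urgency signal, can be illustrated with a toy sequential-sampling simulation in which noisy evidence accumulates toward a bound and a time-growing urgency term effectively collapses that bound. This is a generic illustration of this family of models, not the authors' fitted model; every parameter value below is arbitrary.

```python
import numpy as np

def simulate_trial(drift, urgency_gain, bound=1.0, noise=1.0,
                   dt=0.001, max_t=3.0, rng=None):
    """One trial of a toy accumulator with a linearly growing urgency signal.

    Urgency is implemented as a collapsing bound: bound - urgency_gain * t.
    Returns (choice, reaction_time); choice is +1/-1, or 0 if no decision.
    """
    rng = rng or np.random.default_rng()
    x, t = 0.0, 0.0
    while t < max_t:
        x += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
        t += dt
        eff_bound = max(bound - urgency_gain * t, 0.05)
        if abs(x) >= eff_bound:
            return (1 if x > 0 else -1), t
    return 0, max_t

rng = np.random.default_rng(1)
# Stronger evidence -> higher drift -> faster, more accurate choices;
# stronger urgency -> faster but less accurate choices on ambiguous trials.
for drift, urg in [(0.5, 0.0), (0.5, 0.3), (0.1, 0.3)]:
    trials = [simulate_trial(drift, urg, rng=rng) for _ in range(500)]
    acc = np.mean([c == 1 for c, _ in trials])
    rt = np.mean([t for _, t in trials])
    print(f"drift={drift:.1f} urgency={urg:.1f}: accuracy={acc:.2f}, mean RT={rt:.2f}s")
```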


Author(s):  
Donata Oertel ◽  
Xiao-Jie Cao ◽  
Alberto Recio-Spinoso

Plasticity in neuronal circuits is essential for optimizing connections as animals develop and for adapting to injuries and aging, but it can also distort processing and compromise the conveyance of ongoing sensory information. This chapter summarizes evidence from electrophysiological studies in slices and in vivo showing how remarkably robust signaling is in principal cells of the ventral cochlear nucleus. Even in the face of short-term plasticity, these neurons signal rapidly and with temporal precision. They can relay ongoing acoustic information from the cochlea to the brain largely independently of the sounds to which they were exposed previously.


PLoS Biology ◽  
2020 ◽  
Vol 18 (11) ◽  
pp. e3000882
Author(s):  
Jonathan Andrew Cheung ◽  
Phillip Maire ◽  
Jinho Kim ◽  
Kiana Lee ◽  
Garrett Flynn ◽  
...  

During active tactile exploration, the dynamic patterns of touch are transduced to electrical signals and transformed by the brain into a mental representation of the object under investigation. This transformation from sensation to perception is thought to be a major function of the mammalian cortex. In primary somatosensory cortex (S1) of mice, layer 5 (L5) pyramidal neurons are major outputs to downstream areas that influence perception, decision-making, and motor control. We investigated self-motion and touch representations in L5 of S1 with juxtacellular loose-seal patch recordings of optogenetically identified excitatory neurons. We found that during rhythmic whisker movement, 54 of 115 active neurons (47%) represented self-motion. This population was significantly more modulated by whisker angle than by phase. Upon active touch, a distinct pattern of activity was evoked across L5, which represented the whisker angle at the time of touch. Object location was decodable with submillimeter precision from the touch-evoked spike counts of a randomly sampled handful of these neurons. These representations of whisker angle during self-motion and touch were independent, both in the selection of which neurons were active and in the angle-tuning preference of coactive neurons. Thus, the output of S1 transiently shifts from a representation of self-motion to an independent representation of explored object location during active touch.
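
The decoding claim, recovering object location from the touch-evoked spike counts of a handful of neurons, can be sketched as a simple regression from a population count vector onto location. The following is a minimal sketch on synthetic data; the study's actual decoder, neuron counts, and error figures are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

n_neurons, n_touches = 10, 400
true_locations = rng.uniform(0, 10, n_touches)           # object position (mm)

# Each neuron's touch-evoked count depends linearly on location, plus Poisson noise
gains = rng.uniform(-1.5, 1.5, n_neurons)
offsets = rng.uniform(5, 15, n_neurons)
rates = np.clip(offsets + gains * true_locations[:, None], 0.1, None)
counts = rng.poisson(rates)                               # shape (n_touches, n_neurons)

# Train a least-squares decoder on half of the touches, test on the other half
half = n_touches // 2
X_train = np.c_[counts[:half], np.ones(half)]
w, *_ = np.linalg.lstsq(X_train, true_locations[:half], rcond=None)
X_test = np.c_[counts[half:], np.ones(n_touches - half)]
pred = X_test @ w

err = np.abs(pred - true_locations[half:])
print(f"median decoding error from {n_neurons} neurons: {np.median(err):.2f} mm")
```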


2019 ◽  
Author(s):  
Y. Yau ◽  
M. Dadar ◽  
M. Taylor ◽  
Y. Zeighami ◽  
L.K. Fellows ◽  
...  

Current models of decision-making assume that the brain gradually accumulates evidence and drifts towards a threshold which, once crossed, results in a choice selection. These models have been especially successful in primate research; however, transposing them to human fMRI paradigms has proved challenging. Here, we exploit the face-selective visual system and test whether decoded emotional facial features from multivariate fMRI signals during a dynamic perceptual decision-making task are related to the parameters of computational models of decision-making. We show that trial-by-trial variations in the pattern of neural activity in the fusiform gyrus reflect facial emotional information and modulate drift rates during deliberation. We also observed an inverse-urgency signal based in the caudate nucleus that was independent of sensory information but appeared to slow decisions, particularly when information in the task was ambiguous. Taken together, our results characterize how decision parameters from a computational model (i.e., drift rate and urgency signal) are involved in perceptual decision-making and reflected in the activity of the human brain.


1999 ◽  
Vol 13 (2) ◽  
pp. 117-125 ◽  
Author(s):  
Laurence Casini ◽  
Françoise Macar ◽  
Marie-Hélène Giard

The experiment reported here was aimed at determining whether the level of brain activity can be related to performance in trained subjects. Two tasks were compared: a temporal task and a linguistic task. An array of four letters appeared on a screen. In the temporal task, subjects had to decide whether the letters remained on the screen for a short or a long duration, as learned in a practice phase. In the linguistic task, they had to determine whether the four letters could form a word or not (anagram task). These tasks allowed us to compare the level of brain activity obtained for correct and incorrect responses. The current density measures recorded over prefrontal areas showed a relationship between performance and the level of activity in the temporal task only: the level of activity obtained with correct responses was lower than that obtained with incorrect responses. This suggests that good temporal performance could be the result of an efficient but economical information-processing mechanism in the brain. In addition, the absence of this relationship in the anagram task raises the question of whether it is specific to the processing of sensory information only.


1984 ◽  
Vol 29 (7) ◽  
pp. 567-568
Author(s):  
Gilles Kirouac
Keyword(s):  
The Face ◽  

Author(s):  
Ann-Sophie Barwich

How much does stimulus input shape perception? The common-sense view is that our perceptions are representations of objects and their features, and that the stimulus structures the perceptual object. The problem for this view is that perceptual biases are responsible for distortions and for the subjectivity of perceptual experience. These biases are increasingly studied as constitutive factors of brain processes in recent neuroscience. In neural network models, the brain is said to cope with the plethora of sensory information by predicting stimulus regularities on the basis of previous experiences. Drawing on this development, this chapter analyses perceptions as processes. Looking at olfaction as a model system, it argues for the need to abandon a stimulus-centred perspective in which smells are thought of as stable percepts computationally linked to external objects such as odorous molecules. Perception is instead presented as a measure of changing signal ratios in an environment, informed by expectancy effects from top-down processes.
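
The predictive framing described here, coping with sensory input by anticipating regularities learned from past experience, is often formalized as updating an internal estimate by a fraction of the prediction error, so that expectations shape what is perceived. The toy sketch below illustrates only that generic idea, not any model from the chapter; the signal and parameters are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

# A slowly drifting "odour concentration" signal observed through sensory noise
true_signal = np.cumsum(rng.normal(0, 0.05, 500)) + 1.0
observations = true_signal + rng.normal(0, 0.5, 500)

# Predictive update: the estimate is nudged toward each input by a learning rate,
# so expectations built from past samples shape the current percept.
learning_rate = 0.1
percept = np.zeros_like(observations)
estimate = observations[0]
for i, obs in enumerate(observations):
    prediction_error = obs - estimate
    estimate += learning_rate * prediction_error
    percept[i] = estimate

noise_ratio = np.std(observations - true_signal) / np.std(percept - true_signal)
print(f"noise std reduced by a factor of {noise_ratio:.1f} relative to raw input")
```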


2012 ◽  
Vol 2012 ◽  
pp. 1-12 ◽  
Author(s):  
Giulio Tononi ◽  
Chiara Cirelli

Sleep must serve an essential, universal function, one that offsets the risk of being disconnected from the environment. The synaptic homeostasis hypothesis (SHY) is an attempt to identify this essential function. Its core claim is that sleep is needed to reestablish synaptic homeostasis, which is challenged by the remarkable plasticity of the brain. In other words, sleep is “the price we pay for plasticity.” In this issue, M. G. Frank reviewed several aspects of the hypothesis and raised several issues. The comments below provide a brief summary of the motivations underlying SHY and clarify that SHY is a hypothesis not about specific mechanisms, but about a universal, essential function of sleep. This function is the preservation of synaptic homeostasis in the face of a systematic bias toward a net increase in synaptic strength—a challenge that is posed by learning during adult wake, and by massive synaptogenesis during development.
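
The core claim, a net increase in total synaptic strength during wake that sleep brings back toward baseline, can be shown with a toy wake/sleep cycle in which weights potentiate during wake and are renormalized during sleep while their relative differences are preserved. This is only a numerical cartoon of the claimed function, not a statement about mechanism (which SHY deliberately leaves open); all numbers are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
weights = rng.uniform(0.5, 1.0, size=1000)   # arbitrary synaptic strengths
baseline_total = weights.sum()

for day in range(3):
    # Wake: learning produces a systematic bias toward potentiation
    weights *= 1.0 + np.abs(rng.normal(0.05, 0.02, weights.size))
    wake_total = weights.sum()
    # Sleep: scale total strength back to baseline, preserving relative weights
    weights *= baseline_total / wake_total
    print(f"day {day + 1}: total after wake {wake_total:.0f}, "
          f"after sleep {weights.sum():.0f} (baseline {baseline_total:.0f})")
```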

