IV. There is more to taste than meets the tongue

2000 ◽  
Vol 278 (1) ◽  
pp. G6-G9 ◽  
Author(s):  
Donald B. Katz ◽  
Miguel A. L. Nicolelis ◽  
S. A. Simon

The tongue is the principal organ that provides sensory information about the quality and quantity of chemicals in food. Other information about the temperature and texture of food is also transduced on the tongue, via extragemmal receptors that form branches of the trigeminal, glossopharyngeal, and vagal nerves. These systems, together with information from the gastrointestinal (GI) system, interact to determine whether or not food is palatable. In this themes article, emphasis is placed on the integrative aspects of gustatory processing by showing the convergence of gustatory information with somatosensory, nociceptive, and visceral information (from the GI system) on the tongue and in the brain. Our thesis is that gustation should be thought of as an integral part of a distributed, interacting multimodal system in which information from other systems, including the GI system, can modulate the taste of food.

1999 ◽  
Vol 13 (2) ◽  
pp. 117-125 ◽  
Author(s):  
Laurence Casini ◽  
Françoise Macar ◽  
Marie-Hélène Giard

Abstract The experiment reported here was aimed at determining whether the level of brain activity can be related to performance in trained subjects. Two tasks were compared: a temporal task and a linguistic task. An array of four letters appeared on a screen. In the temporal task, subjects had to decide whether the letters remained on the screen for a short or a long duration, as learned in a practice phase. In the linguistic task, they had to determine whether the four letters could form a word (anagram task). These tasks allowed us to compare the level of brain activity obtained for correct and incorrect responses. The current density measures recorded over prefrontal areas showed a relationship between performance and the level of activity in the temporal task only: the level of activity obtained with correct responses was lower than that obtained with incorrect responses. This suggests that good temporal performance could be the result of an efficacious but economical information-processing mechanism in the brain. In addition, the absence of this relation in the anagram task raises the question of whether the relation is specific to the processing of sensory information.


Author(s):  
Ann-Sophie Barwich

How much does stimulus input shape perception? The common-sense view is that our perceptions are representations of objects and their features, and that the stimulus structures the perceptual object. The problem for this view is that perceptual biases are responsible for distortions and for the subjectivity of perceptual experience. Recent neuroscience increasingly studies these biases as constitutive factors of brain processes. In neural network models, the brain is said to cope with the plethora of sensory information by predicting stimulus regularities on the basis of previous experiences. Drawing on this development, this chapter analyses perceptions as processes. Looking at olfaction as a model system, it argues for the need to abandon a stimulus-centred perspective in which smells are thought of as stable percepts computationally linked to external objects such as odorous molecules. Perception is instead presented as a measure of changing signal ratios in an environment, informed by expectancy effects from top-down processes.


2004 ◽  
Vol 27 (3) ◽  
pp. 377-396 ◽  
Author(s):  
Rick Grush

The emulation theory of representation is developed and explored as a framework that can revealingly synthesize a wide variety of representational functions of the brain. The framework is based on constructs from control theory (forward models) and signal processing (Kalman filters). The idea is that in addition to simply engaging with the body and environment, the brain constructs neural circuits that act as models of the body and environment. During overt sensorimotor engagement, these models are driven by efference copies in parallel with the body and environment, in order to provide expectations of the sensory feedback, and to enhance and process sensory information. These models can also be run off-line in order to produce imagery, estimate outcomes of different actions, and evaluate and develop motor plans. The framework is initially developed within the context of motor control, where it has been shown that inner models running in parallel with the body can reduce the effects of feedback delay problems. The same mechanisms can account for motor imagery as the off-line driving of the emulator via efference copies. The framework is extended to account for visual imagery as the off-line driving of an emulator of the motor-visual loop. I also show how such systems can provide for amodal spatial imagery. Perception, including visual perception, results from such models being used to form expectations of, and to interpret, sensory input. I close by briefly outlining other cognitive functions that might also be synthesized within this framework, including reasoning, theory of mind phenomena, and language.
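The control-theoretic core of this framework can be made concrete with a minimal sketch. The one-dimensional Kalman-filter emulator below is an illustration of the general forward-model idea, not Grush's own implementation, and all parameters are invented: the internal model is driven by an efference copy of the motor command to predict sensory feedback, corrects its estimate when feedback arrives, and skipping the correction step corresponds to running the emulator 'off-line', as in motor imagery.

```python
import numpy as np

rng = np.random.default_rng(0)

# One-dimensional emulator: state = position, motor command = intended velocity.
A, B, H = 1.0, 1.0, 1.0   # state transition, control gain, observation gain
Q, R = 0.01, 0.25         # process noise and sensory noise variances

def emulator_step(x_est, P, u, z=None):
    # Predict: drive the internal model with the efference copy u.
    x_pred = A * x_est + B * u
    P_pred = A * P * A + Q
    if z is None:                  # off-line mode: imagery, no sensory feedback
        return x_pred, P_pred
    # Correct: combine the prediction with noisy sensory feedback z.
    K = P_pred * H / (H * P_pred * H + R)    # Kalman gain
    return x_pred + K * (z - H * x_pred), (1 - K * H) * P_pred

# Simulate a slow reach: the true position integrates the motor commands.
true_x, x_est, P = 0.0, 0.0, 1.0
for _ in range(50):
    u = 0.1                                    # constant motor command
    true_x += u + rng.normal(0.0, np.sqrt(Q))  # body + environment
    z = true_x + rng.normal(0.0, np.sqrt(R))   # noisy sensory feedback
    x_est, P = emulator_step(x_est, P, u, z)

print(abs(x_est - true_x), P)  # estimate tracks the true state; uncertainty P shrinks
```

Calling `emulator_step` with `z=None` runs the same prediction loop without measurements, which is how the framework treats imagery and the off-line evaluation of motor plans.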


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Helen Feigin ◽  
Shira Baror ◽  
Moshe Bar ◽  
Adam Zaidel

Abstract Perceptual decisions are biased by recent perceptual history, a phenomenon termed 'serial dependence'. Here, we investigated what aspects of perceptual decisions lead to serial dependence, and disambiguated the influences of low-level sensory information, prior choices and motor actions. Participants discriminated whether a brief visual stimulus lay to the left or right of the screen center. Following a series of biased 'prior' location discriminations, subsequent 'test' location discriminations were biased toward the prior choices, even when these were reported via different motor actions (using different keys), and when the prior and test stimuli differed in color. By contrast, prior discriminations about an irrelevant stimulus feature (color) did not substantially influence subsequent location discriminations, even though these were reported via the same motor actions. Additionally, when color (not location) was discriminated, a bias in prior stimulus locations no longer influenced subsequent location discriminations. Although low-level stimuli and motor actions did not trigger serial dependence on their own, similarity of these features across discriminations boosted the effect. These findings suggest that relevance across perceptual decisions is a key factor for serial dependence. Accordingly, serial dependence likely reflects a high-level mechanism by which the brain predicts and interprets new incoming sensory information in accordance with relevant prior choices.
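The choice-level bias can be illustrated with a toy simulation (invented parameters, not the study's paradigm or analysis): the previous choice, rather than the previous stimulus, feeds into the next decision, and the measurable signature is that decisions following a 'rightward' choice lean rightward.

```python
import numpy as np

rng = np.random.default_rng(2)

def conditional_choice_bias(choice_weight, n_trials=20000):
    """Mean of the next left(-1)/right(+1) choice, given the previous choice was 'right'.
    The previous *choice* (not the previous stimulus) feeds into the next decision."""
    prev_choice = 0
    next_choices_after_right = []
    for _ in range(n_trials):
        stimulus = rng.normal(0.0, 1.0)  # signed position relative to screen center
        evidence = stimulus + choice_weight * prev_choice + rng.normal(0.0, 1.0)
        choice = 1 if evidence > 0 else -1
        if prev_choice == 1:
            next_choices_after_right.append(choice)
        prev_choice = choice
    return float(np.mean(next_choices_after_right))

no_bias = conditional_choice_bias(0.0)    # independent decisions
with_bias = conditional_choice_bias(0.5)  # prior choice attracts the next one
print(no_bias, with_bias)
```

With `choice_weight = 0` the conditional mean stays near zero; with a positive weight it shifts toward the prior choice, the simulated analogue of serial dependence.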


2015 ◽  
Vol 370 (1668) ◽  
pp. 20140172 ◽  
Author(s):  
Marcus E. Raichle

Traditionally studies of brain function have focused on task-evoked responses. By their very nature such experiments tacitly encourage a reflexive view of brain function. While such an approach has been remarkably productive at all levels of neuroscience, it ignores the alternative possibility that brain functions are mainly intrinsic and ongoing, involving information processing for interpreting, responding to and predicting environmental demands. I suggest that the latter view best captures the essence of brain function, a position that accords well with the allocation of the brain's energy resources, its limited access to sensory information and a dynamic, intrinsic functional organization. The nature of this intrinsic activity, which exhibits a surprising level of organization with dimensions of both space and time, is revealed in the ongoing activity of the brain and its metabolism. As we look to the future, understanding the nature of this intrinsic activity will require integrating knowledge from cognitive and systems neuroscience with cellular and molecular neuroscience where ion channels, receptors, components of signal transduction and metabolic pathways are all in a constant state of flux. The reward for doing so will be a much better understanding of human behaviour in health and disease.


2019 ◽  
Author(s):  
Shigenori Inagaki ◽  
Ryo Iwata ◽  
Masakazu Iwamoto ◽  
Takeshi Imai

SUMMARY Sensory information is selectively or non-selectively inhibited and enhanced in the brain, but it remains unclear whether this occurs commonly at the peripheral stage. Here, we performed two-photon calcium imaging of mouse olfactory sensory neurons (OSNs) in vivo and found that odors produce not only excitatory but also inhibitory responses at their axon terminals. The inhibitory responses remained in mutant mice, in which all possible sources of presynaptic lateral inhibition were eliminated. Direct imaging of the olfactory epithelium revealed widespread inhibitory responses at OSN somata. The inhibition was in part due to inverse agonism toward the odorant receptor. We also found that responses to odor mixtures are often suppressed or enhanced in OSNs: Antagonism was dominant at higher odor concentrations, whereas synergy was more prominent at lower odor concentrations. Thus, odor responses are extensively tuned by inhibition, antagonism, and synergy, at the early peripheral stage, contributing to robust odor representations.
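The concentration dependence of mixture interactions can be illustrated with a generic Hill-type receptor-saturation sketch (invented parameters, not the authors' model, and it captures only the antagonistic side, since true synergy requires mechanisms beyond simple competition): two full agonists competing for the same receptor sum almost linearly at low concentrations, but the mixture response falls well below the linear sum near saturation.

```python
def hill_response(c, K=1.0):
    """Fractional receptor activation for a single agonist at concentration c."""
    return c / (K + c)

def mixture_response(c1, c2, K=1.0):
    """Two full agonists competing for the same receptor: occupancies share one denominator."""
    return (c1 + c2) / (K + c1 + c2)

for c in (0.01, 10.0):   # low vs. high concentration (arbitrary units)
    ratio = mixture_response(c, c) / (2 * hill_response(c))
    print(f"c = {c}: mixture / linear sum = {ratio:.2f}")
```

With K = 1 the ratio is about 0.99 at c = 0.01 (near-additive) but about 0.52 at c = 10 (suppressed), mirroring antagonism becoming dominant at higher concentrations.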


2010 ◽  
Author(s):  
Αικατερίνη Χαραλαμποπούλου

In this study I have attempted to present a linguistic investigation into the nature and structure of time, based on proposals developed in Evans (2004). Accordingly, as linguistic structure and particularly patterns of elaboration reflect conceptual structure conventionalized into a format encodable in language, this study presents an examination of the human conceptual system for time. Indeed, an examination of the ways in which language lexicalizes time provides important insights into the nature and organization of time. That is, given the widely held assumption that semantic structure derives from and reflects, at least partially, conceptual structure, language offers a direct way of investigating the human conceptual system. However, how time is realized at the conceptual level, that is, how we represent time as revealed by the way temporal concepts are encoded in language, does not tell the whole story if we are to uncover the nature and structure of time. Research in cognitive science suggests that phenomenological experience, and the nature of the external world of sensory experience to which subjective experience constitutes a response, give rise to our pre-conceptual experience of time. In other words, as Evans (2004) says, time is not restricted to one particular layer of experience but rather “constitutes a complex range of phenomena and processes which relate to different levels and kinds of experience” (ibid.: 5). Accordingly, while my focus in this study is on temporal structure, which is to say the organization and structuring of temporal concepts at the conceptual level, I have also attempted to examine the nature of temporal experience at the pre-conceptual level (prior to representation in conceptual structure). In this regard, I have examined the results of research from neuroscience, cognitive psychology and social psychology.
More specifically, and with respect to evidence from neuroscience, it is suggested that temporal experience is ultimately grounded in neurological mechanisms necessary for regulating and facilitating perception (e.g., Pöppel 1994). That is, perceptual processing is underpinned by the occurrence of neurologically instantiated temporal intervals, the perceptual moments, which facilitate the integration of sensory information into coherent percepts. As we have seen, there is no single place in the brain where perceptual input derived from different modalities, or even information from within the same modality, is integrated. In other words, there is no one place where the spatially distributed sensory information associated with the distinct perceptual processing areas of the brain is integrated in order to produce a coherent percept. Rather, what seems to be the case is that the integration of sensory information into coherent percepts is enabled by the phenomenon of periodic perceptual moments. Such a mechanism enables us to perceive, in that the nature of our percepts is in an important sense ‘constructed’. Put another way, perception is a kind of constructive process which updates the successive perceptual information to which an organism has access. The updating occurs by virtue of innate timing mechanisms, the perceptual moments, which occur at all levels of neurological processing and range from a fraction of a second up to an outer limit of about three seconds. It is these timing mechanisms which form the basis of our temporal experience. As Gell says, “perception is intrinsically time-perception, and conversely, time-perception, or internal time-consciousness, is just perception itself... That is to say, time is not something we encounter as a feature of contingent reality, as if it lay outside us, waiting to be perceived along with tables and chairs and the rest of the perceptible contents of the universe.
Instead, subjective time arises as an inescapable feature of the perceptual process itself, which enters into the perception of anything whatsoever” (1992: 231). In other words, our experience of time is a consequence of the various innate ‘timing mechanisms’ in the brain which give rise to a range of perceptual moments, which in turn underpin and are necessary for perceptual processing. In this way, time enters into the experience of everything, as it is fundamental to the way in which the perceptual process operates. […]


2019 ◽  
Author(s):  
David A. Tovar ◽  
Micah M. Murray ◽  
Mark T. Wallace

Abstract Objects are the fundamental building blocks of how we create a representation of the external world. One major distinction amongst objects is between those that are animate versus inanimate. Many objects are specified by more than a single sense, yet the nature by which multisensory objects are represented by the brain remains poorly understood. Using representational similarity analysis of human EEG signals, we show enhanced encoding of audiovisual objects when compared to their corresponding visual and auditory objects. Surprisingly, we discovered that the often-found processing advantage for animate objects was not evident in a multisensory context, owing to greater neural enhancement of inanimate objects, the more weakly encoded objects under unisensory conditions. Further analysis showed that the selective enhancement of inanimate audiovisual objects corresponded with an increase in shared representations across brain areas, suggesting that neural enhancement was mediated by multisensory integration. Moreover, a distance-to-bound analysis provided critical links between neural findings and behavior. Improvements in neural decoding at the individual exemplar level for audiovisual inanimate objects predicted reaction time differences between multisensory and unisensory presentations during a go/no-go animate categorization task. Interestingly, links between neural activity and behavioral measures were most prominent 100 to 200 ms and 350 to 500 ms after stimulus presentation, corresponding to time periods associated with sensory evidence accumulation and decision-making, respectively. Collectively, these findings provide key insights into a fundamental process the brain uses to maximize the information it captures across sensory systems to perform object recognition. Significance Statement Our world is filled with an ever-changing milieu of sensory information that we are able to seamlessly transform into meaningful perceptual experience.
We accomplish this feat by combining different features from our senses to construct objects. However, although our senses work not in isolation but in concert with each other, little is known about how the brain combines them to form object representations. Here, we used EEG and machine learning to study how the brain processes auditory, visual, and audiovisual objects. Surprisingly, we found that non-living objects, those that were more difficult to process with one sense alone, benefited the most from engaging multiple senses.
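The decoding logic behind these results can be sketched with a toy simulation (invented patterns and noise levels, not the study's EEG pipeline): each object evokes a characteristic response pattern, multisensory presentation is modelled simply as reduced pattern noise, and a correlation-based nearest-neighbour decoder recovers object identity more reliably from the less noisy patterns.

```python
import numpy as np

rng = np.random.default_rng(1)
n_objects, n_features = 8, 64

# Characteristic response pattern per object (e.g., a vector of channel amplitudes).
ground_truth = rng.normal(size=(n_objects, n_features))

def decoding_accuracy(noise_sd):
    """Correlation-based nearest-neighbour decoding across two noisy 'trials'."""
    trial1 = ground_truth + noise_sd * rng.normal(size=ground_truth.shape)
    trial2 = ground_truth + noise_sd * rng.normal(size=ground_truth.shape)
    correct = 0
    for i in range(n_objects):
        # Match object i's trial-1 pattern against every trial-2 pattern.
        corrs = [np.corrcoef(trial1[i], trial2[j])[0, 1] for j in range(n_objects)]
        correct += int(np.argmax(corrs) == i)
    return correct / n_objects

acc_unisensory = decoding_accuracy(noise_sd=3.0)    # noisy unisensory patterns
acc_multisensory = decoding_accuracy(noise_sd=0.3)  # multisensory noise reduction
print(acc_unisensory, acc_multisensory)
```

In this sketch the multisensory condition decodes better purely because its patterns are less noisy, a simplified stand-in for the neural enhancement the study reports.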


Author(s):  
Roy E. Ritzmann ◽  
Sasha N. Zill

This article discusses legged locomotion in insects. It describes the basic patterns of coordinated movement, both within each leg and among the various legs. The nervous system controls these actions through groups of joint pattern generators coupled through interneurons and interjoint reflexes, in a range of insect species. These local control systems within the thoracic ganglia rely on leg proprioceptors that monitor joint movement and cuticular strain, interacting with central pattern-generating interneurons. The local control systems can change quantitatively and qualitatively as needed to generate turns or more forceful movements. Dealing with substantial obstacles or changes in navigational movements requires more profound adjustments. These rely on sensory information processed in the brain, which projects to the multimodal sensorimotor neuropils collectively referred to as the central complex. The central complex influences descending commands that alter local control circuits to accomplish appropriately redirected movements.
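The idea of coupled leg pattern generators can be illustrated with a minimal phase-oscillator sketch (a generic Kuramoto-style model with invented parameters, not the circuits described in the article): two oscillators coupled to prefer antiphase settle into the alternating rhythm characteristic of paired legs.

```python
import numpy as np

# Two leg oscillators coupled to prefer antiphase stepping.
dt, omega, k = 0.001, 2 * np.pi, 5.0   # time step (s), intrinsic frequency, coupling
theta = np.array([0.2, 0.3])           # phases (rad), starting nearly in phase

for _ in range(20000):                 # simulate 20 s of stepping
    d0 = omega + k * np.sin(theta[1] - theta[0] - np.pi)   # each oscillator is pulled
    d1 = omega + k * np.sin(theta[0] - theta[1] - np.pi)   # toward antiphase with the other
    theta = theta + dt * np.array([d0, d1])

phase_diff = (theta[1] - theta[0]) % (2 * np.pi)
print(phase_diff)   # settles near pi: the two legs alternate
```

Sensory feedback and descending commands would, in this picture, act by perturbing or re-weighting the coupling terms rather than replacing the rhythm itself.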


Author(s):  
J. Eric Ahlskog

As a prelude to the treatment chapters that follow, we need to define and describe the types of problems and symptoms encountered in DLB and PDD. The clinical picture can be quite varied: problems encountered by one person may be quite different from those encountered by another, and symptoms that are problematic in one individual may be minimal in another. In these disorders, the Lewy neurodegenerative process potentially affects certain nervous system regions but spares others. Affected areas include thinking and memory circuits, as well as movement (motor) function and the autonomic nervous system, which regulates primary functions such as bladder, bowel, and blood pressure control. Many other brain regions, by contrast, are spared or minimally involved, such as vision and sensation. The brain and spinal cord constitute the central nervous system. The interface between the brain and the spinal cord is by way of the brain stem, as shown in Figure 4.1. Thought, memory, and reasoning are primarily organized in the thick layers of cortex overlying lower brain levels. Volitional movements, such as writing, throwing, or kicking, also emanate from the cortex and integrate with circuits just below, including those in the basal ganglia, shown in Figure 4.2. The basal ganglia include the striatum, globus pallidus, subthalamic nucleus, and substantia nigra, as illustrated in Figure 4.2. Movement information is integrated and modulated in these basal ganglia nuclei and then transmitted down the brain stem to the spinal cord. At the spinal cord level, the programmed sequence of muscle activation is executed: activated nerves from the appropriate regions of the spinal cord relay the signals to the proper muscles. Sensory information from the periphery (the limbs) travels in the opposite direction. How are these signals transmitted?
Brain cells called neurons have long, wire-like extensions that interface with other neurons, effectively making up circuits loosely analogous to computer circuits; this is illustrated in Figure 4.3. At the ends of these wire-like extensions are tiny enlargements (terminals) that contain specific biological chemicals called neurotransmitters. Neurotransmitters are released when an electrical signal travels down the neuron to the end of its wire-like process.

