The Independent Evolution of Dorsal Pallia in Multiple Vertebrate Lineages

2021 · pp. 1-12
Author(s): Georg F. Striedter, R. Glenn Northcutt

Comparative neurobiologists have long wondered when and how the dorsal pallium (e.g., mammalian neocortex) evolved. For the last 50 years, the most widely accepted answer has been that this structure was already present in the earliest vertebrates and, therefore, homologous between the major vertebrate lineages. One challenge for this hypothesis is that the olfactory bulbs project throughout most of the pallium in the most basal vertebrate lineages (notably lampreys, hagfishes, and lungfishes) but do not project to the putative dorsal pallia in teleosts, cartilaginous fishes, and amniotes (i.e., reptiles, birds, and mammals). To make sense of these data, one may hypothesize that a dorsal pallium existed in the earliest vertebrates and received extensive olfactory input, which was subsequently lost in several lineages. However, the dorsal pallium is notoriously difficult to delineate in many vertebrates, and its homology between the various lineages is often based on little more than its topology. Therefore, we suspect that dorsal pallia evolved independently in teleosts, cartilaginous fishes, and amniotes. We further hypothesize that the emergence of these dorsal pallia was accompanied by the phylogenetic restriction of olfactory projections to the pallium and the expansion of inputs from other sensory modalities. We do not deny that the earliest vertebrates may have possessed nonolfactory sensory inputs to some parts of the pallium, but such projections alone do not define a dorsal pallium.

2020 · Vol 45 (7) · pp. 523-531
Author(s): Sara Touj, Samie Cloutier, Amel Jemâa, Mathieu Piché, Gilles Bronchti, ...

Abstract
It is well established that early blindness results in enhancement of the remaining nonvisual sensory modalities, accompanied by functional and anatomical brain plasticity. While auditory and tactile functions have been investigated extensively, results regarding olfactory function remain less explored and less consistent. In the present study, we investigated olfactory function in blind mice using 3 tests: the buried food test, the olfactory threshold test, and the olfactory performance test. The results indicated better performance of blind mice in the buried food test and the odor performance test, while there was no difference in the olfactory threshold test. Using histological measurements, we also investigated whether there was anatomical plasticity in the olfactory bulbs (OB), the most salient site for olfactory processing. The results indicated a larger volume of the OB, driven by larger glomerular and granular layers, in blind mice compared with sighted mice. Structural plasticity in the OB may underlie the enhanced olfactory performance in blind mice.




2018
Author(s): Lindsey A. Czarnecki, Andrew H. Moberly, Cynthia D. Fast, Daniel J. Turkel, John P. McGann

Summary
The mammalian brain interprets sensory input based on prior multisensory knowledge of the external world, but it is unknown how this knowledge influences neural processing in individual sensory modalities. We found that GABAergic periglomerular interneuron populations in the olfactory bulb endogenously respond not only to odors but also to visual, auditory, and somatosensory stimuli in waking (but not anesthetized) mice. When these stimuli predict future odors, they evoke enhanced interneuron activity during the time odor normally occurs. When expectations are violated by omitting an expected “warning tone” before an odor, odor presentation evokes a burst of interneuron activity. The resulting GABA release presynaptically suppresses neurotransmitter release from the axon terminals of olfactory sensory neurons, the cells that transduce odor in the nasal epithelium and communicate this information to the brain. Expectations, even those evoked by cues in other sensory modalities, can thus affect the very first neurons in the olfactory system.


eLife · 2020 · Vol 9
Author(s): Feng Li, Jack W Lindsey, Elizabeth C Marin, Nils Otto, Marisa Dreher, ...

Making inferences about the computations performed by neuronal circuits from synapse-level connectivity maps is an emerging opportunity in neuroscience. The mushroom body (MB) is well positioned for developing and testing such an approach due to its conserved neuronal architecture, recently completed dense connectome, and extensive prior experimental studies of its roles in learning, memory and activity regulation. Here we identify new components of the MB circuit in Drosophila, including extensive visual input and MB output neurons (MBONs) with direct connections to descending neurons. We find unexpected structure in sensory inputs, in the transfer of information about different sensory modalities to MBONs, and in the modulation of that transfer by dopaminergic neurons (DANs). We provide insights into the circuitry used to integrate MB outputs, connectivity between the MB and the central complex and inputs to DANs, including feedback from MBONs. Our results provide a foundation for further theoretical and experimental work.
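A synapse-level connectome like the one analyzed above can be queried as a weighted adjacency map. The following minimal sketch (with invented neuron names and synapse counts, not actual hemibrain data) shows how direct connections from one cell class to another, such as MBONs onto descending neurons, can be extracted by thresholding synapse counts:

```python
# Toy connectome: (presynaptic, postsynaptic) -> synapse count.
# All names and counts are hypothetical illustrations.
synapses = {
    ("MBON-a", "DN-1"): 42,   # MBON with a direct descending-neuron target
    ("MBON-a", "FB-x"): 7,    # MBON output to another (non-DN) target
    ("MBON-b", "DN-2"): 3,    # weak connection, below threshold
    ("KC-1", "MBON-a"): 120,  # Kenyon cell input to an MBON
    ("DAN-1", "MBON-b"): 15,  # dopaminergic input to an MBON
}

def direct_targets(adjacency, source_prefix, target_prefix, min_synapses=5):
    """Return (pre, post, weight) edges from one cell class to another
    whose synapse count clears a reliability threshold."""
    return sorted(
        (pre, post, w)
        for (pre, post), w in adjacency.items()
        if pre.startswith(source_prefix)
        and post.startswith(target_prefix)
        and w >= min_synapses
    )

edges = direct_targets(synapses, "MBON", "DN")
# edges == [("MBON-a", "DN-1", 42)]
```

Thresholding by synapse count is a common heuristic for separating reliable connections from stray contacts; the cutoff of 5 used here is arbitrary.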


Crustaceana · 2004 · Vol 77 (1) · pp. 1-24

Abstract
Crayfish have well-developed sense organs, but clear-cut relationships between these organs and their behavioural use have hardly been established. We studied the possible use of vision, olfaction, and touch during agonistic behaviour, assuming that the outcome of agonistic interactions primarily depends on the input from one of these sensory organs. Agonistic encounters were studied in triads of crayfish, intact and with reversible blockage of vision, olfaction, or touch. Tension actions (threat, strike, fight, or avoidance), categorized as positive or negative depending on which crayfish initiated them, allowed us to identify the dominant and submissive animals. The contribution of sense organs to the outcome of interactions was tested by blocking them (one at a time) either after the establishment of a hierarchy (after-experiments) or before it (before-experiments). Under control conditions, a large number of contacts allowed animals to establish a dominance order on the first day of agonistic interactions, and the number of positive contacts between animals diminished in subsequent days. Visual or chemical blockage in after-experiments did not change the dominance order, but positive contacts decreased or increased, respectively. Blinded-before animals established a dominance order in the first 3 days of agonistic interactions, showing an elevated number of positive contacts throughout the observation period. A similar result occurred in anosmic-before animals. Results from crayfish whose antennae were immobilized were similar to those from controls. These results suggest that at least two sensory modalities are necessary to gather information about conspecifics; once the order is established (learned), any one of the senses is sufficient to maintain it. We speculate that if a chemical compound is involved in the maintenance of the dominance order, it is released after localization of a conspecific by vision or touch, a manoeuvre that could minimize expenditure of a costly resource.
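The scoring of agonistic contacts described above can be sketched as a small computation: tally each animal's initiated positive contacts against those it receives, and rank by the net score. The animal labels and counts below are invented for illustration, not data from the study:

```python
# Hypothetical tally of agonistic contacts in one triad:
# wins[a][b] = positive contacts initiated by animal a against animal b.
wins = {
    "A": {"B": 9, "C": 7},
    "B": {"A": 2, "C": 6},
    "C": {"A": 1, "B": 3},
}

def dominance_order(wins):
    """Rank animals by net positive contacts: initiated wins minus
    positive contacts received from the other animals."""
    animals = list(wins)
    score = {a: sum(wins[a].values())
                - sum(wins[b].get(a, 0) for b in animals if b != a)
             for a in animals}
    return sorted(animals, key=score.get, reverse=True)

# dominance_order(wins) == ["A", "B", "C"]
```

Net-contact scoring is only one simple way to linearize a hierarchy; the study itself identified dominant and submissive animals from the sign of each tension action.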


eLife · 2016 · Vol 5
Author(s): Andreas A Kardamakis, Juan Pérez-Fernández, Sten Grillner

Animals integrate the different senses to facilitate event detection for navigation in their environment. In vertebrates, the optic tectum (superior colliculus) commands gaze shifts by synaptic integration of different sensory modalities. Recent work suggests that the tectum can elaborate gaze-reorientation commands on its own, rather than merely acting as a relay from upstream forebrain circuits to downstream premotor centers. We show that tectal circuits can perform multisensory computations independently and, hence, configure final motor commands. Single tectal neurons receive converging visual and electrosensory inputs, as investigated in the lamprey, a phylogenetically conserved vertebrate. When these two sensory inputs overlap in space and time, response enhancement of output neurons occurs locally in the tectum, whereas surrounding areas and temporally misaligned inputs are inhibited. Retinal and electrosensory afferents elicit local monosynaptic excitation, quickly followed by inhibition via recruitment of GABAergic interneurons. Multisensory inputs can thus regulate event detection within the tectum through local inhibition, without forebrain control.
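The space-dependent enhancement described here can be illustrated with a toy firing-rate model. This is a minimal sketch with assumed parameters (Gaussian input bumps on a 1-D sheet, uniform surround inhibition), not the authors' circuit model: when the visual and electrosensory bumps align, the peak output exceeds that of the misaligned configuration.

```python
import numpy as np

def tectal_response(visual_pos, electro_pos, n=50, width=3.0, w_inh=0.6):
    """Toy 1-D sheet of tectal output neurons: two Gaussian input bumps
    (visual, electrosensory) summed, minus broad GABAergic-style inhibition
    proportional to total input, then rectified to firing rates."""
    x = np.arange(n)
    vis = np.exp(-0.5 * ((x - visual_pos) / width) ** 2)
    ele = np.exp(-0.5 * ((x - electro_pos) / width) ** 2)
    excitation = vis + ele
    inhibition = w_inh * excitation.mean()        # broad surround inhibition
    return np.maximum(excitation - inhibition, 0)  # rectified output rates

aligned = tectal_response(25, 25).max()
misaligned = tectal_response(25, 40).max()
# Spatially aligned inputs yield a larger peak response than misaligned ones.
```

Because the inhibition scales with total input while excitation only superposes locally, aligned bumps are enhanced at their common location and misaligned bumps are suppressed everywhere, a crude analogue of the local enhancement and surround inhibition reported in the paper.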


2021 · Vol 83 (6) · pp. 377-381
Author(s): Maureen E. Dunbar, Jacqueline J. Shade

In a traditional anatomy and physiology lab, the general senses – temperature, pain, touch, pressure, vibration, and proprioception – and the special senses – olfaction (smell), vision, gustation (taste), hearing, and equilibrium – are typically taught in isolation. In reality, information derived from these individual senses interacts to produce the complex sensory experience that constitutes perception. To introduce students to the concept of multisensory integration, a crossmodal perception lab was developed. In this lab, students explore how vision impacts olfaction and how vision and olfaction interact to impact flavor perception. Students are required to perform a series of multisensory tasks that focus on the interaction of multiple sensory inputs and their impact on flavor and scent perception. Additionally, students develop their own hypothesis as to which sensory modalities they believe will best assist them in correctly identifying the flavor of a candy: taste alone, taste paired with scent, or taste paired with vision. Together these experiments give students an appreciation for multisensory integration while also encouraging them to actively engage in the scientific method. They are then asked to hypothesize the possible outcome of one last experiment after collecting and assessing data from the prior tasks.


2017
Author(s): Yaelan Jung, Bart Larsen, Dirk B. Walther

Abstract
Natural environments convey information through multiple sensory modalities, all of which contribute to people’s percepts. Although it has been shown that visual or auditory content of scene categories can be decoded from brain activity, it remains unclear where and how humans integrate different sensory inputs and represent scene information beyond a specific sensory modality. To address this question, we investigated how categories of scene images and sounds are represented in several brain regions. A mixed-gender group of healthy human subjects participated in the present study; their brain activity was measured with fMRI while they viewed images or listened to sounds of different places. We found that both visual and auditory scene categories can be decoded not only from modality-specific areas but also from several brain regions in the temporal, parietal, and prefrontal cortex. Intriguingly, only in the prefrontal cortex, and in no other region, were categories of scene images and sounds represented in similar activation patterns, suggesting that scene representations there are modality-independent. Furthermore, the error patterns of neural decoders indicate that category-specific neural activity patterns in the middle and superior frontal gyri are tightly linked to categorization behavior. Our findings demonstrate that complex scene information is represented at an abstract level in the prefrontal cortex, regardless of the sensory modality of the stimulus.

Statement of Significance
Our experience in daily life requires the integration of multiple sensory inputs such as images, sounds, or scents from the environment. Here, for the first time, we investigated where and how in the brain information about the natural environment from multiple senses is merged to form modality-independent representations of scene categories. We show direct decoding of scene categories across sensory modalities from patterns of neural activity in the prefrontal cortex. We also conclusively tie these neural representations to human categorization behavior based on the errors from the neural decoder and behavior. Our findings suggest that the prefrontal cortex is a central hub for integrating sensory information and computing modality-independent representations of scene categories.
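Cross-modal decoding of this kind can be sketched with a toy nearest-centroid classifier: train on activity patterns evoked by one modality and test on patterns from the other. All data below are simulated under the assumption of a shared, modality-independent category code; none of it comes from the study's fMRI, and the category names are placeholders:

```python
import numpy as np

rng = np.random.default_rng(0)
n_voxels, n_trials = 20, 30

# Assumed shared category code: in a modality-independent region, images and
# sounds of the same category evoke noisy patterns around one prototype.
prototypes = {"beach": rng.normal(size=n_voxels),
              "city": rng.normal(size=n_voxels),
              "forest": rng.normal(size=n_voxels)}

def simulate(category, noise=0.8):
    """One noisy activity pattern around a category prototype."""
    return prototypes[category] + noise * rng.normal(size=n_voxels)

train = [(simulate(c), c) for c in prototypes for _ in range(n_trials)]  # "images"
test = [(simulate(c), c) for c in prototypes for _ in range(n_trials)]   # "sounds"

# Fit: one centroid per category from the training (image) patterns.
centroids = {c: np.mean([x for x, lab in train if lab == c], axis=0)
             for c in prototypes}

def decode(pattern):
    """Assign the test pattern to the nearest training centroid."""
    return min(centroids, key=lambda c: np.linalg.norm(pattern - centroids[c]))

accuracy = np.mean([decode(x) == c for x, c in test])
# Above-chance accuracy (chance = 1/3) indicates a shared code across modalities.
```

Training on one modality and testing on the other is the standard logic of cross-modal decoding: it succeeds only if the two modalities share an activity pattern per category, which is what distinguishes a modality-independent region from one that merely responds to both modalities.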


2018 · Vol 2018 · pp. 1-11
Author(s): Manuel Teichert, Jürgen Bolz

On our way through a town, the things we see can make us change the route we take; the things we hear can make us stop or walk on; and the things we feel can cause us to wear a warm jacket or just a t-shirt. All these behaviors are mediated by highly complex processing mechanisms in our brain and reflect responses to many important sensory inputs. The mammalian cerebral cortex, which processes this sensory information, consists of largely specialized sensory areas, each mainly receiving information from its corresponding sensory modality. The first cortical regions to receive input from the outer world are the so-called primary sensory cortices. Strikingly, there is convincing evidence that primary sensory cortices do not work in isolation but are substantially affected by other sensory modalities. Here, we review previous and current literature on this cross-modal interplay.


eLife · 2014 · Vol 3
Author(s): Jan Drugowitsch, Gregory C DeAngelis, Eliana M Klier, Dora E Angelaki, Alexandre Pouget

Humans and animals can integrate sensory evidence from various sources to make decisions in a statistically near-optimal manner, provided that the stimulus presentation time is fixed across trials. Little is known about whether optimality is preserved when subjects can choose when to make a decision (a reaction-time task), or when sensory inputs have time-varying reliability. Using a reaction-time version of a visual/vestibular heading discrimination task, we show that behavior is clearly sub-optimal when quantified with traditional optimality metrics that ignore reaction times. We created a computational model that accumulates evidence optimally across both cues and time, and trades off accuracy with decision speed. This model quantitatively explains subjects' choices and reaction times, supporting the hypothesis that subjects do, in fact, accumulate evidence optimally over time and across sensory modalities, even when the reaction time is under the subject's control.
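A model of this kind combines cues by reliability and accumulates the weighted evidence to a bound, yielding both a choice and a reaction time. The following is a minimal sketch with assumed parameters and inverse-variance cue weighting, not the authors' fitted model:

```python
import numpy as np

def diffuse_to_bound(mu, sigmas, bound=1.0, dt=0.005, max_t=10.0, seed=0):
    """Accumulate reliability-weighted evidence from several cues until a
    decision bound is crossed; returns (choice, reaction_time).

    mu     -- true drift (signed stimulus strength, shared across cues)
    sigmas -- per-cue noise levels; weights are inverse-variance, normalized
    """
    rng = np.random.default_rng(seed)
    w = np.array([1.0 / s**2 for s in sigmas])
    w /= w.sum()                                # inverse-variance weights
    x, t = 0.0, 0.0
    while abs(x) < bound and t < max_t:
        # each cue delivers drift plus its own noise; combine by reliability
        dx = sum(wi * (mu * dt + s * np.sqrt(dt) * rng.normal())
                 for wi, s in zip(w, sigmas))
        x += dx
        t += dt
    return (1 if x > 0 else -1), t

# e.g. a reliable "vestibular" cue (sigma 0.5) plus a noisy "visual" cue (1.5)
choice, rt = diffuse_to_bound(mu=0.5, sigmas=[0.5, 1.5])
```

Raising the bound trades speed for accuracy: decisions take longer but the accumulated evidence is larger, which is the speed-accuracy trade-off the paper's optimality metrics must account for. Time-varying reliability could be sketched by making `sigmas` functions of `t`.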

