A Theory for the Emergence of Neocortical Network Architecture

2020
Author(s):
Daniel Udvary
Philipp Harth
Jakob H. Macke
Hans-Christian Hege
Christiaan P.J. de Kock
...  

Developmental programs that guide neurons and their neurites into specific subvolumes of the mammalian neocortex give rise to lifelong constraints for the formation of synaptic connections. To what degree do these constraints affect cortical wiring diagrams? Here we introduce an inverse modeling approach to show how cortical networks would appear if they were solely due to the spatial distributions of neurons and neurites. We find that neurite packing density and morphological diversity will inevitably translate into non-random pairwise and higher-order connectivity statistics. More importantly, we show that these non-random wiring properties are not arbitrary, but instead reflect the specific structural organization of the underlying neuropil. Our predictions are consistent with the empirically observed wiring specificity from subcellular to network scales. Thus, independent from learning and genetically encoded wiring rules, many of the properties that define the neocortex's characteristic network architecture may emerge as a result of neuron and neurite development.
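The inverse-modeling idea above can be illustrated with a toy null model: if synapses form purely in proportion to the local overlap of axons and dendrites, pairwise connectivity is already non-uniform. A minimal sketch with made-up densities (the voxel grid, neuron count, and overlap rule are assumptions for illustration, not the authors' model):

```python
import random
from itertools import permutations

# Toy null model in the spirit of the approach above (all numbers made up):
# if synapses form purely in proportion to local axo-dendritic overlap,
# geometry alone already produces structured, non-uniform wiring statistics.
random.seed(1)
n_neurons, n_voxels = 5, 8

# axon[i][v], dend[j][v]: how much of neuron i's axon / neuron j's dendrite
# falls into subvolume (voxel) v
axon = [[random.random() for _ in range(n_voxels)] for _ in range(n_neurons)]
dend = [[random.random() for _ in range(n_voxels)] for _ in range(n_neurons)]

def expected_synapses(i, j):
    """Expected synapse count from i to j under pure structural overlap."""
    return sum(a * d for a, d in zip(axon[i], dend[j]))

counts = [expected_synapses(i, j) for i, j in permutations(range(n_neurons), 2)]
spread = max(counts) / min(counts)   # pairwise heterogeneity from geometry alone
```

Even with neurite densities drawn at random, the expected synapse counts differ across pairs, which is the sense in which spatial structure alone yields non-random connectivity statistics.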

2020
Vol 4 (1)
pp. 292-314
Author(s):
Max Nolte
Eyal Gal
Henry Markram
Michael W. Reimann

Synaptic connectivity between neocortical neurons is highly structured. The network structure of synaptic connectivity includes first-order properties that can be described by pairwise statistics, such as strengths of connections between different neuron types and distance-dependent connectivity, and higher-order properties, such as an abundance of cliques of all-to-all connected neurons. The relative impact of first- and higher-order structure on emergent cortical network activity is unknown. Here, we compare network structure and emergent activity in two neocortical microcircuit models with different synaptic connectivity. Both models have a similar first-order structure, but only one model includes higher-order structure arising from morphological diversity within neuronal types. We find that such morphological diversity leads to more heterogeneous degree distributions, increases the number of cliques, and contributes to a small-world topology. The increase in higher-order network structure is accompanied by more nuanced changes in neuronal firing patterns, such as an increased dependence of pairwise correlations on the positions of neurons in cliques. Our study shows that circuit models with very similar first-order structure of synaptic connectivity can have a drastically different higher-order network structure, and suggests that the higher-order structure imposed by morphological diversity within neuronal types has an impact on emergent cortical activity.
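The distinction between first- and higher-order structure can be illustrated with two toy graphs of matched mean degree but very different clique content. A minimal sketch (the graph sizes and wiring rules are arbitrary choices for illustration, not the microcircuit models):

```python
import random
from itertools import combinations

random.seed(0)
n = 60

# Graph 1: homogeneous random wiring (Erdos-Renyi-like), with edge
# probability chosen so that the mean degree matches graph 2.
er = {i: set() for i in range(n)}
for i, j in combinations(range(n), 2):
    if random.random() < 6 / (n - 1):
        er[i].add(j); er[j].add(i)

# Graph 2: ring lattice, each node wired to its 3 nearest neighbors per
# side -- same first-order statistics, far more cliques.
ring = {i: set() for i in range(n)}
for i in range(n):
    for d in (1, 2, 3):
        ring[i].add((i + d) % n)
        ring[(i + d) % n].add(i)

def mean_degree(adj):
    return sum(len(nb) for nb in adj.values()) / len(adj)

def triangle_count(adj):
    """Number of 3-cliques (all-to-all connected triples)."""
    return sum(1 for a, b, c in combinations(adj, 3)
               if b in adj[a] and c in adj[a] and c in adj[b])
```

Both graphs have mean degree near 6, yet the lattice contains far more triangles; morphological diversity plays an analogous role in separating the two microcircuit models described above.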


2020
Vol 8 (S1)
pp. S110-S144
Author(s):  
Jan Treur

In network models for real-world domains, network adaptation often has to be addressed by incorporating certain network adaptation principles. In some cases, higher-order adaptation also occurs: the adaptation principles themselves change over time. To model such multilevel adaptation processes, it is useful to have a generic architecture. Such an architecture should describe and distinguish the dynamics within the network (base level), the dynamics of the network itself as governed by certain adaptation principles (first-order adaptation level), the adaptation of these adaptation principles (second-order adaptation level), and possibly still more levels of higher-order adaptation. This paper introduces a multilevel network architecture for this purpose, based on the notion of network reification. Reification of a network occurs when a base network is extended by adding explicit states representing the characteristics of the structure of the base network. It is shown how this construction can be used to explicitly represent network adaptation principles within a network. When the reified network is itself reified in turn, second-order adaptation principles can also be explicitly represented. The multilevel network reification construction introduced here is illustrated for an adaptive adaptation principle from social science for bonding based on homophily and for one describing metaplasticity in cognitive neuroscience.
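The layered idea can be made concrete in a few lines: a base-level state, a reified connection weight that adapts (first order), and a reified speed factor for that adaptation (second order). A minimal numeric sketch with made-up parameters, not Treur's formal temporal-causal notation:

```python
# Minimal sketch (all parameters hypothetical): a two-neuron base network
# whose connection weight W is reified as a state (first-order adaptation),
# and whose adaptation speed H is itself reified (second-order adaptation).
dt = 0.1
x1, x2 = 1.0, 0.0   # base-level activations
W = 0.1             # first-order reified state: weight of x1 -> x2
H = 0.5             # second-order reified state: speed factor of W's adaptation

for _ in range(200):
    x2 += dt * (W * x1 - x2)           # base level: x2 driven by x1 via W
    W  += dt * H * x1 * x2 * (1 - W)   # first order: Hebbian-style growth of W
    H  += dt * 0.1 * (x1 * x2 - H)     # second order: the adaptation speed adapts
```

Here the weight strengthens toward its ceiling while the speed factor drifts toward the co-activation level, a much-simplified analogue of the metaplasticity example mentioned above.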


Author(s):  
R. M. Peleshchak
V. V. Lytvyn
O. I. Cherniak
I. R. Peleshchak
M. V. Doroshenko

Context. To reduce the computational resource time in problems of diagnosing and recognizing distorted images with a fully connected stochastic pseudospin neural network, it becomes necessary to thin out synaptic connections between neurons. This is achieved by diagonalizing the matrix of synaptic connections without losing the interaction between all neurons in the network. Objective. To create an architecture of a stochastic pseudospin neural network with diagonal synaptic connections, without losing the interaction between all the neurons in the layer, in order to reduce its learning time. Method. The paper uses the Householder method, a method of compressing input images based on the diagonalization of the matrix of synaptic connections, and the computer mathematics system MATLAB for converting a fully connected neural network into a tridiagonal form with hidden synaptic connections between all neurons. Results. We developed a model of a stochastic neural network architecture with sparse renormalized synaptic connections that takes deleted synaptic connections into account. Based on the transformation of the synaptic connection matrix of a fully connected neural network into a Hessenberg matrix with tridiagonal synaptic connections, we proposed a renormalized local Hebb rule. Using the computer mathematics system Wolfram Mathematica 11.3, we calculated, as a function of the number of neurons N, the tuning time of synaptic connections (per iteration) in a stochastic pseudospin neural network with a tridiagonal connection matrix, relative to the tuning time of synaptic connections (per iteration) in a fully connected synaptic neural network. Conclusions. We found that, as the number of neurons increases, this relative tuning time decreases according to a hyperbolic law. Depending on the orientation of the pseudospin neurons, we proposed a classification of the renormalized neural network into a ferromagnetic structure, an antiferromagnetic structure, and a dipole glass.
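The core reduction step can be sketched in a few lines: a sequence of Householder similarity transforms takes a symmetric weight matrix to tridiagonal form while preserving its spectrum, so the interactions among all neurons are retained implicitly in the renormalized couplings. A minimal sketch with random toy weights (not the paper's pseudospin model or its renormalized Hebb rule):

```python
import numpy as np

# Householder reduction of a symmetric weight matrix to tridiagonal form.
# Each similarity transform H @ A @ H preserves the spectrum, so pairwise
# interactions are not lost -- they are absorbed into the tridiagonal couplings.
def householder_tridiagonal(W):
    A = np.array(W, dtype=float)
    n = A.shape[0]
    Q = np.eye(n)
    for k in range(n - 2):
        x = A[k + 1:, k].copy()
        sign = 1.0 if x[0] >= 0 else -1.0
        alpha = -sign * np.linalg.norm(x)
        v = x.copy()
        v[0] -= alpha
        vnorm = np.linalg.norm(v)
        if vnorm < 1e-12:              # column already in tridiagonal form
            continue
        v /= vnorm
        H = np.eye(n)
        H[k + 1:, k + 1:] -= 2.0 * np.outer(v, v)
        A = H @ A @ H                  # similarity transform: spectrum unchanged
        Q = Q @ H
    return A, Q                        # A tridiagonal, Q orthogonal

rng = np.random.default_rng(0)
M = rng.standard_normal((6, 6))
W = (M + M.T) / 2                      # symmetric "fully connected" weight matrix
T, Q = householder_tridiagonal(W)      # T = Q.T @ W @ Q
```

Because T = Q.T @ W @ Q is an exact orthogonal similarity, the original fully connected network can be recovered from the tridiagonal one, which is the sense in which thinning the connections loses no interactions.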


2021
Author(s):
Miriam Bell
Padmini Rangamani

Synaptic plasticity involves the modification of both biochemical and structural components of neurons. Many studies have revealed that the change in the number density of the glutamatergic receptor AMPAR at the synapse is proportional to the synaptic weight update; an increase in AMPAR density corresponds to strengthening of synapses, while a decrease weakens synaptic connections. The dynamics of AMPAR are thought to be regulated by upstream signaling, primarily the calcium-CaMKII pathway, by trafficking to and from the synapse, and by influx from extrasynaptic sources. Here, we have developed a set of models using compartmental ordinary differential equations to systematically investigate the contributions of signaling and trafficking variations to AMPAR dynamics at the synaptic site. We find that model properties, including network architecture and parameters, significantly affect the integration of fast upstream species by slower downstream species. Furthermore, we predict that the model outcome, as measured by bound AMPAR at the synaptic site, depends on (a) the choice of signaling model (bistable or monostable CaMKII dynamics), (b) the relative contributions of trafficking versus influx, and (c) the frequency of the stimulus. Therefore, AMPAR dynamics can have unexpected dependencies when upstream signaling dynamics (such as CaMKII and PP1) are coupled with trafficking modalities.
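A compartmental-ODE model of this kind can be sketched in a few lines. All rate constants and pool sizes below are hypothetical, not the paper's fitted values: a fast upstream species (active CaMKII, driven by a calcium transient) gates trafficking of AMPAR from an extrasynaptic pool to the synaptic site, while slower removal returns receptors to the pool.

```python
# Toy compartmental ODE system, integrated with forward Euler.
dt = 0.01
ca, camkii = 0.0, 0.0
ampar_syn, ampar_extra = 1.0, 5.0           # synaptic / extrasynaptic pools

for step in range(2000):
    stim = 1.0 if step < 1000 else 0.0      # brief calcium stimulus
    ca     += dt * (stim - 0.5 * ca)        # fast calcium transient
    camkii += dt * (2.0 * ca - camkii)      # upstream kinase activation
    insert = 0.3 * camkii * ampar_extra     # CaMKII-gated insertion
    remove = 0.1 * ampar_syn                # constitutive removal
    ampar_syn   += dt * (insert - remove)
    ampar_extra += dt * (remove - insert)   # total receptor number conserved
```

Because insertion depends multiplicatively on the slowly decaying kinase, bound AMPAR outlasts the calcium transient; swapping the monostable kinase step for a bistable one, or changing the stimulus frequency, would change the outcome, which is the kind of dependence the abstract describes.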


2013
Vol 24 (10)
pp. 1507-1518
Author(s):
MacMillan Mbantenkhu
Sara Wierzbicki
Xiaowen Wang
Shangdong Guo
Stephan Wilkens
...  

Mgm101 is a Rad52-type single-stranded annealing protein (SSAP) required for mitochondrial DNA (mtDNA) repair and maintenance. Structurally, Mgm101 forms large oligomeric rings. Here we determine the function(s) of a 32-amino-acid carboxyl-terminal tail (Mgm101(238–269)) conserved in the Mgm101 family of proteins. Mutagenic analysis shows that Lys-253, Trp-257, Arg-259, and Tyr-268 are essential for mtDNA maintenance. Mutations in Lys-251, Arg-252, Lys-260, and Tyr-266 affect mtDNA stability at 37°C and under oxidative stress. The Y268A mutation severely affects single-stranded DNA (ssDNA) binding without altering the ring structure. Mutations in the Lys-251–Arg-252–Lys-253 positive triad also affect ssDNA binding. Moreover, the C-tail alone is sufficient to mediate ssDNA binding. Finally, we find that the W257A and R259A mutations dramatically affect the conformation and oligomeric state of Mgm101. These structural alterations correlate with protein degradation in vivo. The data thus indicate that the C-tail of Mgm101, likely displayed on the ring surface, is required for ssDNA binding, higher-order structural organization, and protein stability. We speculate that an initial electrostatic and base-stacking interaction with ssDNA could remodel ring organization. This may facilitate the formation of nucleoprotein filaments competent for mtDNA repair. These findings could have broad implications for understanding how SSAPs promote DNA repair and genome maintenance.


2008
Vol 20 (3)
pp. 668-708
Author(s):
Christopher DiMattina
Kechen Zhang

Identifying the optimal stimuli for a sensory neuron is often a difficult process involving trial and error. By analyzing the relationship between stimuli and responses in feedforward and stable recurrent neural network models, we find that the stimulus yielding the maximum firing rate response always lies on the topological boundary of the collection of all allowable stimuli, provided that individual neurons have increasing input-output relations or gain functions and that the synaptic connections are convergent between layers with nondegenerate weight matrices. This result suggests that in neurophysiological experiments under these conditions, only stimuli on the boundary need to be tested in order to maximize the response, thereby potentially reducing the number of trials needed for finding the most effective stimuli. Even when the gain functions allow firing rate cutoff or saturation, a peak still cannot exist in the stimulus-response relation in the sense that moving away from the optimum stimulus always reduces the response. We further demonstrate that the condition for nondegenerate synaptic connections also implies that proper stimuli can independently perturb the activities of all neurons in the same layer. One example of this type of manipulation is changing the activity of a single neuron in a given processing layer while keeping that of all others constant. Such stimulus perturbations might help experimentally isolate the interactions of selected neurons within a network.
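The boundary result can be checked numerically on a toy instance. The weights below are hypothetical, chosen only to satisfy the stated conditions (strictly increasing tanh gain functions, convergent connections with a nondegenerate weight matrix); scanning a grid over the allowed stimulus box [0,1]^2, the maximal response lands on the boundary of the box.

```python
import math
from itertools import product

# Small feedforward rate network: two inputs -> two hidden units -> one output.
W1 = [[1.2, -0.7], [0.4, 0.9]]    # nondegenerate input weight matrix (assumed)
w2 = [1.0, 0.8]                    # convergent output weights (assumed)

def response(s):
    """Output firing rate for stimulus s = (s1, s2)."""
    hidden = [math.tanh(sum(w * x for w, x in zip(row, s))) for row in W1]
    return math.tanh(sum(w * h for w, h in zip(w2, hidden)))

grid = [i / 20 for i in range(21)]                 # 21 x 21 grid over [0,1]^2
best = max(product(grid, grid), key=response)      # grid point of max response
on_boundary = any(coord in (0.0, 1.0) for coord in best)
```

With these particular weights the response is strictly increasing in the first stimulus component, so the optimum necessarily sits on the face s1 = 1; the paper's result says this boundary behavior holds in general under the stated conditions.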

