On the Asymptotic Information Storage Capacity of Neural Networks

1989 ◽  
pp. 271-280 ◽  
Author(s):  
G. Palm

2019 ◽  
Vol 369 ◽  
pp. 185-190 ◽  
Author(s):  
Masaki Kobayashi

2020 ◽  
Vol 11 (1) ◽  
Author(s):  
Jung Min Lee ◽  
Mo Beom Koo ◽  
Seul Woo Lee ◽  
Heelim Lee ◽  
Junho Kwon ◽  
...  

Abstract
The synthesis of a polymer composed of a large discrete number of chemically distinct monomers in an absolutely defined aperiodic sequence remains a challenge in polymer chemistry. Such syntheses have largely been restricted to oligomers with a small number of repeating units, owing to the difficulty of reaching high molecular weights by step-by-step addition of individual monomers. Here we report copolymers of α-hydroxy acids, poly(phenyllactic-co-lactic acid) (PcL), built via the cross-convergent method from four dyads of monomers as constituent units. Our method allows scalable synthesis of sequence-defined PcL in a minimal number of coupling steps from stoichiometric amounts of reagents. Digital information can be stored in the aperiodic sequence of PcL and fully retrieved as binary code by mass spectrometry sequencing. The information storage density (bit/Da) of PcL is 50% higher than that of DNA, and the storage capacity of PcL can also be increased by adjusting the molecular weight (~38 kDa).
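As a quick plausibility check of the bit/Da comparison, the sketch below recomputes both densities under simple assumptions that are not taken from the paper: one bit per α-hydroxy-acid residue in an equal monomer mix, and 2 bits per DNA nucleotide at an average residue mass of ~330 Da.

```python
# Back-of-the-envelope check of the bit/Da comparison (illustrative only;
# the paper's exact encoding scheme may differ).

PHENYLLACTIC_RESIDUE_DA = 148.2  # phenyllactic acid (166.2 Da) minus H2O
LACTIC_RESIDUE_DA = 72.1         # lactic acid (90.1 Da) minus H2O
DNA_NUCLEOTIDE_DA = 330.0        # average ssDNA nucleotide residue mass
DNA_BITS_PER_NT = 2.0            # 4 bases -> 2 bits per nucleotide

# Assumed: one bit per residue, equal mix of the two monomers.
pcl_bits_per_da = 1.0 / ((PHENYLLACTIC_RESIDUE_DA + LACTIC_RESIDUE_DA) / 2)
dna_bits_per_da = DNA_BITS_PER_NT / DNA_NUCLEOTIDE_DA

print(f"PcL: {pcl_bits_per_da:.5f} bit/Da")                # ~0.00908
print(f"DNA: {dna_bits_per_da:.5f} bit/Da")                # ~0.00606
print(f"ratio: {pcl_bits_per_da / dna_bits_per_da:.2f}")   # ~1.50
```

Under these assumptions the ratio comes out at almost exactly 1.5, consistent with the 50% figure quoted in the abstract.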


1994 ◽  
Vol 49 (2) ◽  
pp. 1690-1698 ◽  
Author(s):  
Mathias Schlüter ◽  
Friedrich Wagner

2021 ◽  
Vol 118 (45) ◽  
e2024890118 ◽  
Author(s):  
Shu Ho ◽  
Rebecca Lajaunie ◽  
Marion Lerat ◽  
Mickaël Le ◽  
Valérie Crépel ◽  
...  

Cerebellar Purkinje neurons integrate information transmitted at excitatory synapses formed by granule cells. Although these synapses are considered essential sites for learning, most of them appear not to transmit any detectable electrical information and have been defined as silent. It has been proposed that silent synapses are required to maximize information storage capacity and ensure its reliability, and hence to optimize cerebellar operation. Such optimization is expected to occur once the cerebellar circuitry is in place, during its maturation, as animal agility steadily improves. We therefore investigated whether the proportion of silent synapses varies over this period, from the third to the sixth postnatal week in mice. Selective expression of a calcium indicator in granule cells enabled quantitative mapping of presynaptic activity, while postsynaptic responses were recorded by patch clamp in acute slices. From these measurements, together with two anatomical features (the distance separating adjacent planar Purkinje dendritic trees and the synapse density), we determined the average excitatory postsynaptic potential per synapse. Its value was four to eight times smaller than the responses of detectable connections in paired recordings, consistent with over 70% of synapses being silent. These figures remained remarkably stable across maturation stages. According to the proposed role for silent synapses, our results suggest that information storage capacity and reliability are optimized early during cerebellar maturation. Alternatively, silent synapses may have roles other than adjusting information storage capacity and reliability.
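The step from a 4- to 8-fold reduction to "over 70% silent" follows from simple averaging: if only a fraction f of synapses is active, the per-synapse mean EPSP is f times the mean EPSP of the active ones, so f = 1/ratio. The snippet below works through this arithmetic; the ratio bounds come from the abstract, while the one-line model relating them is our reading, not necessarily the authors' exact estimator.

```python
# Silent-synapse fraction implied by the EPSP ratio (illustrative arithmetic).
# mean_per_synapse = f * mean_active  =>  ratio = mean_active / mean_per_synapse = 1/f
# so the silent fraction is 1 - 1/ratio.

for ratio in (4.0, 8.0):
    silent_fraction = 1.0 - 1.0 / ratio
    print(f"EPSP ratio {ratio:.0f}x -> {silent_fraction:.0%} silent")
# 4x -> 75% silent; 8x -> 88% silent, i.e. "over 70%" as stated.
```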


1989 ◽  
Vol 22 (9) ◽  
pp. L407-L411 ◽  
Author(s):  
M. Opper ◽  
J. Kleinz ◽  
H. Kohler ◽  
W. Kinzel

2002 ◽  
Vol 66 (6) ◽  
Author(s):  
Joaquín J. Torres ◽  
Lovorka Pantic ◽  
Hilbert J. Kappen

2008 ◽  
Vol 11 (03) ◽  
pp. 433-442 ◽  
Author(s):  
Guikun Wu ◽  
Hong Zhao

We show that delayed feedback neural networks for storing limit cycles can be trained with a global training algorithm. The storage capacity of these networks grows in proportion to the delay length, as it does in networks trained by correlation learning based on Hebb's rule, but is much higher; their generalization capacity is also higher. Another interesting finding is that spurious states (unwanted attractors) disappear entirely in networks trained by the global algorithm, provided the stored limit cycles are sufficiently long. We investigate the dynamics of the networks as a function of the length of the limit cycles.
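For reference, the sketch below implements the Hebbian correlation baseline that the abstract compares against — storing one limit cycle in a binary network with D delay taps — not the authors' global training algorithm. Network size, cycle length, delay, and noise level are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P, D = 200, 8, 3  # neurons, cycle length, delay taps (illustrative)

# One limit cycle of random +/-1 patterns: xi[0] -> xi[1] -> ... -> xi[0].
xi = rng.choice([-1.0, 1.0], size=(P, N))

# Delayed correlation (Hebb) rule: delay tap d associates the state from
# 1 + d steps back with the current pattern of the cycle.
W = np.zeros((D, N, N))
for d in range(D):
    for t in range(P):
        W[d] += np.outer(xi[t], xi[(t - 1 - d) % P]) / N

def flip(v, frac):
    """Corrupt a pattern by flipping a random fraction of its bits."""
    return np.where(rng.random(v.size) < frac, -v, v)

# Seed the delay line with noisy copies of consecutive cycle states
# (buf[d] holds the state from 1 + d steps in the past), then iterate.
buf = [flip(xi[(D - 1 - d) % P], 0.15) for d in range(D)]
for step in range(D, D + 40):
    s = np.sign(sum(W[d] @ buf[d] for d in range(D)))
    buf = [s] + buf[:-1]

overlap = s @ xi[step % P] / N
print(f"overlap with the stored cycle after retrieval: {overlap:+.2f}")  # ~ +1.00
```

With these sizes the network locks back onto the stored cycle despite 15% corrupted bits; pushing P up until retrieval fails gives a rough empirical capacity for the Hebbian baseline.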


1996 ◽  
Vol 07 (01) ◽  
pp. 19-32 ◽  
Author(s):  
A. Lamura ◽  
C. Marangi ◽  
G. Nardulli

In this paper we analyze replica symmetry breaking in attractor neural networks with a non-monotone activation function. We study the non-monotone version of the Edinburgh model, which allows control of the domains of attraction through the stability parameter K, and we compute, at one step of replica symmetry breaking, the storage capacity and, for the strongly diluted model, the domains of attraction of the stable fixed points.
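The sketch below illustrates only the role of the stability parameter K: a Gardner-style perceptron rule that trains every stored pattern to stability at least K, which is what widens the domains of attraction. The Edinburgh model's non-monotone activation and the replica computation itself are not reproduced here, and all sizes are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)
N, P, K = 100, 10, 1.0  # neurons, patterns, stability margin (illustrative)

xi = rng.choice([-1.0, 1.0], size=(P, N))  # random binary patterns
w = np.zeros((N, N))                       # row i: incoming weights of neuron i

# Perceptron-style learning: raise every stability
#   Delta_i^mu = xi_i^mu (w_i . xi^mu) / |w_i|
# above the margin K; larger K gives wider domains of attraction.
for _ in range(500):
    updated = False
    for mu in range(P):
        norm = np.linalg.norm(w, axis=1)
        norm[norm == 0.0] = 1.0            # avoid division by zero at start
        delta = xi[mu] * (w @ xi[mu]) / norm
        weak = delta < K
        if weak.any():
            w[weak] += np.outer(xi[mu][weak], xi[mu]) / N  # Hebbian correction
            np.fill_diagonal(w, 0.0)       # keep self-couplings at zero
            updated = True
    if not updated:
        break

stabilities = xi * (xi @ w.T) / np.linalg.norm(w, axis=1)
print(f"minimum stability: {stabilities.min():.2f} (target K = {K})")

# Retrieval from a corrupted pattern: 15% of the bits flipped.
probe = np.where(rng.random(N) < 0.15, -xi[0], xi[0])
for _ in range(5):
    probe = np.sign(w @ probe)
print(f"overlap with pattern 0: {probe @ xi[0] / N:+.2f}")  # ~ +1.00
```

At this low load (P/N = 0.1) the margin condition is easily satisfiable, and the enlarged basins let the network recover the pattern from substantial corruption.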

