Hopfield Networks
Recently Published Documents

TOTAL DOCUMENTS: 260 (past five years: 39)
H-INDEX: 21 (past five years: 2)

Author(s): Philipp Seidl, Philipp Renz, Natalia Dyubankova, Paulo Neves, Jonas Verhoeven, ...
Keyword(s):

Entropy, 2021, Vol 23 (11), pp. 1494
Author(s): Christopher Hillar, Tenzin Chan, Rachel Taubman, David Rolnick

In 1943, McCulloch and Pitts introduced a discrete recurrent neural network as a model for computation in brains. The work inspired breakthroughs such as the first computer design and the theory of finite automata. We focus on learning in Hopfield networks, a special case with symmetric weights and fixed-point attractor dynamics. Specifically, we explore minimum energy flow (MEF) as a scalable convex objective for determining network parameters. We catalog various properties of MEF, such as biological plausibility, and compare them to classical approaches in the theory of learning. Trained Hopfield networks can perform unsupervised clustering and define novel error-correcting coding schemes. They also efficiently find hidden structures (cliques) in graphs. We extend this known connection from graphs to hypergraphs and discover n-node networks with robust storage of 2^Ω(n^(1−ϵ)) memories for any ϵ > 0. In the case of graphs, we also determine a critical ratio of training samples at which networks generalize completely.
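The setting described above — symmetric weights and fixed-point attractor dynamics — can be sketched in a few lines. The example below stores one pattern with the classical Hebbian rule, not the paper's MEF objective; the pattern size, corruption level, and function names are illustrative assumptions.

```python
import numpy as np

# Minimal Hopfield network: symmetric Hebbian weights, zero diagonal,
# asynchronous sign updates that settle into a fixed-point attractor.
# NOTE: this uses the classical Hebbian rule, not the MEF objective.

def train_hebbian(patterns):
    """Symmetric weight matrix from rows of +/-1 patterns."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, state, max_sweeps=100):
    """Asynchronous updates until the state stops changing."""
    s = state.copy()
    for _ in range(max_sweeps):
        prev = s.copy()
        for i in np.random.permutation(len(s)):
            s[i] = 1 if W[i] @ s >= 0 else -1
        if np.array_equal(s, prev):
            break
    return s

rng = np.random.default_rng(0)
memory = rng.choice([-1, 1], size=(1, 64))   # one stored pattern
W = train_hebbian(memory)
noisy = memory[0].copy()
noisy[:8] *= -1                              # corrupt 8 of 64 bits
print(np.array_equal(recall(W, noisy), memory[0]))  # True: bits repaired
```

With a single stored pattern and a minority of corrupted bits, every asynchronous update moves the state toward the memory, so recall is exact.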


2021, pp. 281-318
Author(s): Abhijit S. Pandya, Robert B. Macy

2021
Author(s): Sepp Hochreiter
Keyword(s):

2021
Author(s): Oskar Schnaack, Luca Peliti, Armita Nourmohammad

Storing memory for molecular recognition is an efficient strategy for responding to external stimuli. Biological processes use different strategies to store memory. In the olfactory cortex, synaptic connections form when stimulated by an odor, and establish distributed memory that can be retrieved upon re-exposure. In contrast, the immune system encodes specialized memory by diverse receptors that recognize a multitude of evolving pathogens. Despite the mechanistic differences between olfactory and immune memory, these systems can still be viewed as different information encoding strategies. Here, we present a theoretical framework with artificial neural networks to characterize optimal memory strategies for both static and dynamic (evolving) patterns. Our approach is a generalization of the energy-based Hopfield model in which memory is stored as a network's energy minima. We find that while classical Hopfield networks with distributed memory can efficiently encode a memory of static patterns, they are inadequate against evolving patterns. To follow an evolving pattern, we show that a distributed network should use a higher learning rate, which, in turn, can distort the energy landscape associated with the stored memory attractors. Specifically, narrow connecting paths emerge between memory attractors, leading to misclassification of evolving patterns. We demonstrate that compartmentalized networks with specialized subnetworks are the optimal solution for memory storage of evolving patterns. We postulate that evolution of pathogens may be the reason for the immune system to encode a focused memory, in contrast to the distributed memory used in the olfactory cortex that interacts with mixtures of static odors.
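The energy-based framing above ("memory is stored as a network's energy minima") can be illustrated with a minimal check that a stored pattern sits at a local minimum of the classical Hopfield energy E(s) = -½ sᵀWs. This is a generic sketch, not the paper's static/dynamic-pattern framework; the pattern size and names are assumptions.

```python
import numpy as np

# Check that a Hebbian-stored pattern is a local minimum of the
# classical Hopfield energy E(s) = -1/2 * s^T W s.

def energy(W, s):
    return -0.5 * s @ W @ s

rng = np.random.default_rng(1)
pattern = rng.choice([-1, 1], size=32)
W = np.outer(pattern, pattern) / 32.0   # Hebbian weights for one memory
np.fill_diagonal(W, 0.0)

e_stored = energy(W, pattern)
# Flipping any single bit raises the energy: the pattern is a local minimum.
flipped_energies = []
for i in range(32):
    s = pattern.copy()
    s[i] *= -1
    flipped_energies.append(energy(W, s))
print(e_stored < min(flipped_energies))  # True
```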


Entropy, 2021, Vol 23 (4), pp. 456
Author(s): Xitong Xu, Shengbo Chen

Image encryption is a confidentiality strategy for keeping the information in digital images from being leaked. Due to their excellent chaotic dynamic behavior, self-feedbacked Hopfield networks have been used to design image ciphers. However, self-feedbacked Hopfield networks have complex structures, high computational cost, and fixed parameters; these properties limit their application. In this paper, a single neuronal dynamical system within a self-feedbacked Hopfield network is unveiled. The discrete form of this single neuronal dynamical system is derived from a self-feedbacked Hopfield network. Chaotic performance evaluation indicates that the system has good complexity, high sensitivity, and a large chaotic parameter range. The system is also incorporated into a framework to improve its chaotic performance. The results show that the system is well adapted to this type of framework, which means there is considerable room for improvement in the system. To investigate its applications in image encryption, an image encryption scheme is then designed. Simulation results and security analysis indicate that the proposed scheme is highly resistant to various attacks and competitive with some existing schemes.
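As a rough illustration of chaotic-map image encryption, the sketch below uses the well-known logistic map as a stand-in keystream generator, since the paper's single-neuron system is not specified here; the key, the map, and the image size are all assumptions, not the proposed scheme.

```python
import numpy as np

# Generic chaotic stream cipher sketch: iterate a chaotic map from a
# secret initial condition, quantize its orbit into bytes, and XOR
# those bytes with the image pixels. Decryption repeats the same XOR.

def logistic_keystream(n, r=3.99, x0=0.613):
    """Byte keystream from the logistic map x <- r*x*(1-x)."""
    x, out = x0, np.empty(n, dtype=np.uint8)
    for i in range(n):
        x = r * x * (1.0 - x)
        out[i] = int(x * 256) % 256
    return out

def xor_cipher(data, key):
    ks = logistic_keystream(data.size, *key)
    return data ^ ks

rng = np.random.default_rng(2)
image = rng.integers(0, 256, size=64 * 64, dtype=np.uint8)  # toy "image"
key = (3.99, 0.613)            # secret map parameter and seed
cipher = xor_cipher(image, key)
restored = xor_cipher(cipher, key)
print(np.array_equal(restored, image))  # True: XOR twice restores the image
```

Sensitivity to the key follows from chaos: a tiny change in `x0` yields a completely different keystream, so decryption with a near-miss key fails.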


Author(s): Amir Dembo, Reza Gheissari

Consider (X_i(t)) solving a system of N stochastic differential equations interacting through a random matrix J = (J_ij) with independent (not necessarily identically distributed) random coefficients. We show that the trajectories of averaged observables of (X_i(t)), initialized from some μ independent of J, are universal, i.e., only depend on the choice of the distribution of J through its first and second moments (assuming, e.g., sub-exponential tails). We take a general combinatorial approach to proving universality for dynamical systems with random coefficients, combining a stochastic Taylor expansion with a moment matching-type argument. Concrete settings for which our results imply universality include aging in the spherical SK spin glass, and Langevin dynamics and gradient flows for symmetric and asymmetric Hopfield networks.
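The universality statement can be explored numerically: simulate the same Langevin-type dynamics under two disorder laws with matching first and second moments (Gaussian vs. Rademacher) and compare an averaged observable. The confining -X drift, the choice σ = tanh, and all parameters below are illustrative assumptions, not the paper's exact model.

```python
import numpy as np

# Euler-Maruyama sketch of Langevin dynamics with random interactions:
#   dX_i = ( -X_i + (J @ tanh(X))_i / sqrt(N) ) dt + dB_i.

def simulate(J, x0, dt=0.01, steps=500, seed=0):
    rng = np.random.default_rng(seed)
    n = len(x0)
    x = x0.copy()
    traj = np.empty(steps)
    for t in range(steps):
        drift = -x + (J @ np.tanh(x)) / np.sqrt(n)
        x = x + drift * dt + np.sqrt(dt) * rng.standard_normal(n)
        traj[t] = x.mean()              # an averaged observable
    return traj

n = 200
rng = np.random.default_rng(3)
x0 = rng.standard_normal(n)
# Two disorder laws with the same first and second moments.
J_gauss = rng.standard_normal((n, n))
J_rad = rng.choice([-1.0, 1.0], size=(n, n))
m_gauss = simulate(J_gauss, x0)
m_rad = simulate(J_rad, x0)
# Universality predicts these averaged trajectories agree as N grows.
```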

