Stochastic Deformation of Invariant Tori in Neuron Model

Author(s):
L. B. Ryashko
E. S. Slepukhina

2016, Vol 136 (10), pp. 1424–1430

Author(s):
Yoshiki Sasaki
Katsutoshi Saeki
Yoshifumi Sekine

Author(s):
Peter Mann

This chapter examines the structure of the phase space of an integrable system, constructed from invariant tori via the Arnold–Liouville integrability theorem, and investigates periodic and ergodic flow using action-angle theory. Time-dependent mechanics is formulated by extending the symplectic structure to a contact structure on an extended phase space, before it is shown that mechanics has a natural setting on a jet bundle. The chapter then describes the phase space of integrable systems and how tori behave under time-dependent dynamics. Adiabatic invariance is discussed, as well as slow and fast Hamiltonian systems, the Hannay angle and counterdiabatic terms. In addition, the chapter discusses foliation, resonant and non-resonant tori, contact structures, Pfaffian forms, jet manifolds and Stokes's theorem.
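As a brief illustration of the action-angle construction the abstract refers to (a standard formulation, not a quotation from the chapter), for a one-degree-of-freedom integrable system the action variable is the phase-space area enclosed by an orbit, and the conjugate angle advances linearly in time:

```latex
J = \frac{1}{2\pi} \oint p \, \mathrm{d}q,
\qquad
\dot{\theta} = \frac{\partial H(J)}{\partial J} = \omega(J), \quad \dot{J} = 0 .
```

Under a slow variation of a parameter in the Hamiltonian, $J$ is approximately conserved, which is the adiabatic invariance mentioned above; level sets of the actions are exactly the invariant tori of the Arnold–Liouville theorem.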


2021, Vol 277, pp. 234–274

Author(s):
Xinyu Guan
Jianguo Si
Wen Si

Author(s):
Serkan Kiranyaz
Junaid Malik
Habib Ben Abdallah
Turker Ince
Alexandros Iosifidis
...

Abstract
The recently proposed network model, Operational Neural Networks (ONNs), generalizes conventional Convolutional Neural Networks (CNNs), which are homogeneous and rely on a linear neuron model alone. As a heterogeneous network model, ONNs are based on a generalized neuron model that can encapsulate any set of non-linear operators, boosting diversity and enabling highly complex, multi-modal functions or spaces to be learned with minimal network complexity and training data. However, the default method for finding optimal operators in ONNs, the so-called Greedy Iterative Search (GIS), usually takes several training sessions to find a single operator set per layer. This is not only computationally demanding but also limits network heterogeneity, since the same operator set is then used for all neurons in each layer. To address this deficiency and exploit a superior level of heterogeneity, this study focuses on searching for the best possible operator set(s) for the hidden neurons of the network based on the "Synaptic Plasticity" paradigm, which constitutes the essential learning theory in biological neurons. During training, each operator set in the library can be evaluated by its synaptic plasticity level and ranked from worst to best, and an "elite" ONN can then be configured using the top-ranked operator sets found for each hidden layer. Experimental results on highly challenging problems demonstrate that elite ONNs, even with few neurons and layers, achieve superior learning performance compared with GIS-based ONNs, further widening the performance gap over CNNs.
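The selection step described above (score each operator set by its synaptic plasticity level, rank, and keep the top set per hidden layer) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the operator-set names and plasticity scores are hypothetical placeholders.

```python
def rank_operator_sets(plasticity_scores):
    """Rank operator sets from best to worst by synaptic plasticity score.

    plasticity_scores: dict mapping operator-set name -> measured score.
    """
    return sorted(plasticity_scores, key=plasticity_scores.get, reverse=True)


def configure_elite_onn(layer_scores):
    """Pick the top-ranked operator set independently for each hidden layer."""
    return [rank_operator_sets(scores)[0] for scores in layer_scores]


# Hypothetical example: two hidden layers, each with plasticity scores
# measured during training for three candidate operator sets.
layer_scores = [
    {"mul-sum": 0.42, "exp-sum": 0.71, "sin-median": 0.55},
    {"mul-sum": 0.63, "exp-sum": 0.38, "sin-median": 0.80},
]
elite = configure_elite_onn(layer_scores)
print(elite)  # ['exp-sum', 'sin-median']
```

Because each layer is ranked independently, different layers can end up with different operator sets, which is exactly the extra heterogeneity the abstract contrasts with GIS.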


2014, Vol 41 (2), pp. 249–258

Author(s):
Erol Egrioglu
Ufuk Yolcu
Cagdas Hakan Aladag
Eren Bas

2005, Vol 25 (3), pp. 481–491

Author(s):
Yuzhen Bai
Deming Zhu
