Dimension reduction in recurrent networks by canonicalization

2021
Vol 0 (0)
pp. 0
Author(s):
Lyudmila Grigoryeva
Juan-Pablo Ortega

Many recurrent neural network machine learning paradigms can be formulated using state-space representations. The classical notion of canonical state-space realization is adapted in this paper to accommodate semi-infinite inputs, so that it can be used as a dimension reduction tool in the recurrent networks setup. The so-called input forgetting property is identified as the key hypothesis that guarantees the existence and uniqueness (up to system isomorphisms) of canonical realizations for causal and time-invariant input/output systems with semi-infinite inputs. Additionally, the notion of optimal reduction coming from the theory of symmetric Hamiltonian systems is implemented in our setup to construct canonical realizations out of input-forgetting but not necessarily canonical ones. These two procedures are studied in detail in the framework of linear fading memory input/output systems. Finally, the notion of implicit reduction using reproducing kernel Hilbert spaces (RKHS) is introduced, which allows dimension reduction to be achieved for systems with linear readouts without the need to explicitly compute the reduced spaces introduced in the first part of the paper.
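A minimal Python sketch of the simplest setting the abstract mentions, linear state-space systems: when only a proper subspace of the state space is reachable from the input, the system can be restricted to that subspace without changing its input/output behaviour. This only illustrates dimension reduction by restriction to a distinguished (here: reachable) subspace; it is not the paper's canonicalization construction, and all matrices, dimensions, and tolerances below are illustrative assumptions.

```python
# Sketch: reduce a linear state-space system x_{t+1} = A x_t + B u_t, y_t = C x_t
# by restricting it to its reachable subspace. Illustrative assumption: the system
# is built so that only part of the state is excited by the input.
import numpy as np

rng = np.random.default_rng(0)
n1, n2, m, p, T = 4, 6, 1, 1, 500          # reachable/unreachable dims, input/output dims, horizon

# Block upper-triangular A with B supported on the top block: the bottom n2 states
# start at zero and are never excited, so the reachable subspace has dimension <= n1.
A11 = 0.5 * rng.standard_normal((n1, n1)) / np.sqrt(n1)
A12 = rng.standard_normal((n1, n2))
A22 = 0.5 * rng.standard_normal((n2, n2)) / np.sqrt(n2)
A = np.block([[A11, A12], [np.zeros((n2, n1)), A22]])
B = np.vstack([rng.standard_normal((n1, m)), np.zeros((n2, m))])
C = rng.standard_normal((p, n1 + n2))
n = n1 + n2

def run(A, B, C, u):
    """Run the linear state-space system from the zero state and collect outputs."""
    x = np.zeros(A.shape[0])
    ys = []
    for u_t in u:
        x = A @ x + B @ u_t
        ys.append(C @ x)
    return np.array(ys)

# Reachability matrix [B, AB, ..., A^{n-1}B]; its column space is the reachable subspace.
R = np.hstack([np.linalg.matrix_power(A, k) @ B for k in range(n)])
U, s, _ = np.linalg.svd(R, full_matrices=False)
k = int(np.sum(s > 1e-8 * s[0]))           # numerical rank = reachable dimension
V = U[:, :k]                               # orthonormal basis of the reachable subspace

# Reduced realization obtained by restricting (A, B, C) to that subspace.
A_r, B_r, C_r = V.T @ A @ V, V.T @ B, C @ V

u = rng.standard_normal((T, m))
print(k, np.allclose(run(A, B, C, u), run(A_r, B_r, C_r, u), atol=1e-8))  # same I/O behaviour
```

The reduced system has dimension k (here at most n1) yet reproduces the full system's output exactly for zero initial state, which is the sense in which a smaller realization can represent the same input/output map.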

2013
Vol 11 (05)
pp. 1350020
Author(s):
Hongwei Sun
Qiang Wu

We study the asymptotic properties of an indefinite kernel network with coefficient regularization and dependent sampling. The framework under investigation differs from classical kernel learning: the kernel function is not required to be positive definite, and the samples are allowed to be weakly dependent, with the dependence measured by a strong mixing condition. By means of a new kernel decomposition technique introduced in [27], two reproducing kernel Hilbert spaces and their associated kernel integral operators are used to characterize the properties and learnability of the hypothesis function class. Capacity-independent error bounds and learning rates are deduced.
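A minimal Python sketch of the type of estimator this abstract refers to: a kernel network with coefficient (l2) regularization, which remains well-posed even when the kernel is indefinite because no RKHS norm is involved. The sigmoid kernel, sample size, regularization parameter, and i.i.d. toy data are illustrative assumptions (the paper itself allows strongly mixing, weakly dependent samples).

```python
# Sketch: coefficient-regularized regression with an indefinite (non-PSD) kernel.
import numpy as np

rng = np.random.default_rng(1)

def sigmoid_kernel(X, Z, a=1.0, b=-0.5):
    """tanh kernel: a classical example of an indefinite kernel."""
    return np.tanh(a * X @ Z.T + b)

# Toy regression data (i.i.d. here for simplicity).
m = 200
X = rng.uniform(-1, 1, size=(m, 1))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.standard_normal(m)

K = sigmoid_kernel(X, X)            # m x m kernel matrix, possibly indefinite
lam = 1e-2                          # regularization parameter (illustrative)

# Coefficient regularization: minimize ||y - K a||^2 + lam * m * ||a||^2.
# Normal equations: (K^T K + lam * m * I) a = K^T y, solvable even for indefinite K,
# since K^T K is always positive semi-definite.
alpha = np.linalg.solve(K.T @ K + lam * m * np.eye(m), K.T @ y)

def predict(X_new):
    """Evaluate the kernel network f(x) = sum_j alpha_j K(x, x_j)."""
    return sigmoid_kernel(X_new, X) @ alpha

X_test = np.linspace(-1, 1, 5).reshape(-1, 1)
print(predict(X_test))
```

The key design point is that the penalty acts on the coefficient vector rather than the RKHS norm, so positive definiteness of the kernel is never needed to make the problem well-posed.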


2014
Vol 9 (4)
pp. 827-931
Author(s):
Joseph A. Ball
Dmitry S. Kaliuzhnyi-Verbovetskyi
Cora Sadosky
Victor Vinnikov

2009
Vol 80 (3)
pp. 430-453
Author(s):
Josef Dick

Abstract: We give upper bounds on the Walsh coefficients of functions whose derivative of order at least one has bounded variation of fractional order. Further, we consider the Walsh coefficients of functions in periodic and nonperiodic reproducing kernel Hilbert spaces. A lower bound showing that our results are best possible is also given.
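A small Python sketch showing how the Walsh coefficients discussed in the abstract can be estimated numerically, so their decay can be inspected for a concrete function. The definitions follow the standard base-2 Walsh-Paley system; the test function, grid size, and bit depth are illustrative assumptions.

```python
# Sketch: numerically estimate Walsh coefficients of a function on [0, 1).
import numpy as np

def wal(k, x, nbits=20):
    """Walsh-Paley function wal_k(x) = (-1)^(k_1*x_1 + k_2*x_2 + ...),
    where k_i are the binary digits of k and x_i the binary digits of x in [0, 1)."""
    sign = np.zeros(x.shape, dtype=int)
    xi = np.array(x, dtype=float)
    for i in range(nbits):
        k_i = (k >> i) & 1
        xi = xi * 2.0
        x_bit = np.floor(xi).astype(int) & 1
        xi = xi - np.floor(xi)
        sign ^= k_i & x_bit
    return 1 - 2 * sign

def walsh_coefficient(f, k, n=2**16):
    """Estimate the k-th Walsh coefficient (integral of f(x)*wal_k(x) over [0, 1]) by a midpoint rule."""
    x = (np.arange(n) + 0.5) / n
    return np.mean(f(x) * wal(k, x))

f = lambda x: x * (1.0 - x)          # smooth test function (illustrative choice)
for k in [0, 1, 2, 3, 4, 8, 16]:
    print(k, walsh_coefficient(f, k))
```

Printing the estimated coefficients for increasing k gives a rough empirical view of the decay behaviour that the bounds in the paper quantify precisely.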


2017
Vol 87 (2)
pp. 225-244
Author(s):
Rani Kumari
Jaydeb Sarkar
Srijan Sarkar
Dan Timotin

2018
Vol 45 (2)
pp. 869-896
Author(s):
Parag Bobade
Suprotim Majumdar
Savio Pereira
Andrew J. Kurdila
John B. Ferris
