Nonlinear Parametric Transformation and Generation of Images Based on a Network with the CWNL Layer

2021, pp. 415-425
Author(s): Slawomir Golak
1996, Vol. 06 (04), pp. 725-735
Author(s): Alexander Yu. Loskutov, Valery M. Tereshko, Konstantin A. Vasiliev

We consider one-dimensional maps, the logistic map and an exponential map, on the sets of parameter values that correspond to their chaotic dynamics. It is proven that such dynamics may be stabilized by a certain cyclic parametric transformation operating strictly within the chaotic set. The stabilization results from the creation of stable periodic orbits in the initially chaotic maps, with periods that are multiples of the period of the cyclic transformation. It is shown that the stabilized behavior cannot be destroyed by a weak noise smearing of the required parameter values. The regions where stabilization takes place are estimated numerically, and the periods of the created stable periodic orbits are calculated.
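To make the mechanism concrete, the following is a minimal sketch in Python of driving the logistic map x_{n+1} = a_n x_n (1 - x_n) with a cyclic parameter sequence and testing whether the orbit settles onto a periodic cycle. The parameter values, tolerances, and window sizes here are hypothetical assumptions, not those of the paper; the stabilizing parameter regions themselves must be located numerically, as the paper does.

```python
import numpy as np

# Hypothetical cyclic parameter values inside the chaotic region of the
# logistic map (NOT taken from the paper; stabilizing windows must be
# searched for numerically).
A_CYCLE = [3.91, 3.96]
TRANSIENT = 10_000   # iterations discarded as transient
SAMPLE = 256         # settled iterations inspected for periodicity

def iterate(x0: float, n: int) -> list[float]:
    """Iterate the parametrically driven logistic map n times,
    cycling the parameter a_n through A_CYCLE."""
    x, orbit = x0, []
    for i in range(n):
        a = A_CYCLE[i % len(A_CYCLE)]   # cyclic parametric transformation
        x = a * x * (1.0 - x)
        orbit.append(x)
    return orbit

orbit = iterate(0.5, TRANSIENT + SAMPLE)[TRANSIENT:]

# Look for the smallest period p with x_n ~ x_{n+p} on the settled orbit.
# If stabilization occurs, p should be a multiple of len(A_CYCLE).
for p in range(1, SAMPLE // 2):
    if np.allclose(orbit[:-p], orbit[p:], atol=1e-6):
        print(f"stable orbit of period {p} "
              f"(driving period is {len(A_CYCLE)})")
        break
else:
    print("no low-period stable orbit detected for these parameters")
```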


2018, Vol. 2 (OOPSLA), pp. 1-28
Author(s): James Koppel, Varot Premtoon, Armando Solar-Lezama

Author(s): Dipan K. Pal, Marios Savvides

ConvNets, through their architecture, enforce invariance only to translation. In this paper, we introduce a new class of deep convolutional architectures called Non-Parametric Transformation Networks (NPTNs), which can learn general invariances and symmetries directly from data. NPTNs are a natural generalization of ConvNets and can be optimized directly using gradient descent. Unlike almost all previous work on deep architectures, they make no assumption about the structure of the invariances present in the data, and in that respect they are flexible and powerful. We also model ConvNets and NPTNs under a unified framework called Transformation Networks (TNs), which yields a better understanding of the connection between the two. We demonstrate the efficacy of NPTNs on MNIST with extreme transformations and on CIFAR10, where they outperform baselines, and they further outperform several recent algorithms on ETH-80, all while using the same number of parameters as the baselines. We also show that they are more effective than ConvNets at modelling symmetries and invariances from data, without explicit knowledge of the added arbitrary nuisance transformations. Finally, we replace ConvNets with NPTNs within Capsule Networks and show that this enables Capsule Nets to perform even better.
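For concreteness, below is a minimal PyTorch sketch of an NPTN-style layer built only from the description above: each (input channel, output channel) pair learns |G| filters, a max over the |G| responses provides the learned invariance, and a mean over input channels produces the output. The class name, hyperparameters, and pooling details are illustrative assumptions, not the authors' reference implementation.

```python
import torch
import torch.nn as nn

class NPTNLayer(nn.Module):
    """Sketch of a Non-Parametric Transformation Network layer: learns
    g filters per (input, output) channel pair, max-pools over the g
    transformation responses, then averages over input channels.
    Illustrative only; details may differ from the published code."""

    def __init__(self, in_ch: int, out_ch: int, g: int, kernel: int = 3):
        super().__init__()
        self.in_ch, self.out_ch, self.g = in_ch, out_ch, g
        # One grouped convolution holds all in_ch * out_ch * g filters:
        # each input channel is convolved with its own out_ch * g filters.
        self.conv = nn.Conv2d(in_ch, in_ch * out_ch * g,
                              kernel, padding=kernel // 2, groups=in_ch)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, _, h, w = x.shape
        y = self.conv(x)                                   # (B, M*N*G, H, W)
        y = y.view(b, self.in_ch, self.out_ch, self.g, h, w)
        y = y.max(dim=3).values        # invariance: max over the g responses
        return y.mean(dim=1)           # pool over input channels

layer = NPTNLayer(in_ch=3, out_ch=16, g=4)
out = layer(torch.randn(2, 3, 32, 32))   # -> shape (2, 16, 32, 32)
```

Note that with g = 1 the max is an identity and the layer reduces to an ordinary convolution up to a constant scaling (mean instead of sum over input channels), which reflects the claim that NPTNs generalize ConvNets.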

