Without Diagonal Nonlinear Requirements: The More General P-Critical Dynamical Analysis for UPPAM Recurrent Neural Networks

2013 ◽  
Vol 2013 ◽  
pp. 1-10
Author(s):  
Xi Chen ◽  
Huizhong Mao ◽  
Chen Qiao

Continuous-time recurrent neural networks (RNNs) play an important part in practical applications. Recently, because it guarantees convergence of equilibria lying on the boundary between stability and instability, the study of the critical dynamical behaviors of RNNs has drawn special attention. In this paper, a new asymptotic stability theorem and two corollaries are presented for the unified RNNs, that is, the UPPAM RNNs. The analysis results given in this paper hold under the generally P-critical conditions, which improve substantially upon the existing critical convergence and stability results, and, most importantly, the compulsory requirement of a diagonally nonlinear activation mapping imposed in most recent research is removed. As a result, the theory in this paper can be applied more generally.
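The following minimal sketch illustrates the kind of convergence behavior discussed above; it does not reproduce the UPPAM model or the P-critical conditions of the paper. It integrates a generic continuous-time RNN dx/dt = -Dx + Wg(x) + b with a non-diagonal activation mapping g(x) = tanh(Mx) and a deliberately small connection gain, and checks that two trajectories approach the same equilibrium. All parameter values are assumptions for illustration.

```python
# Illustrative only: a generic continuous-time RNN, not the paper's UPPAM model
# or its P-critical conditions. Parameters are chosen so that the decay term
# dominates the (non-diagonal) nonlinear feedback, yielding a unique attracting
# equilibrium.
import numpy as np
from scipy.integrate import solve_ivp

rng = np.random.default_rng(0)
n = 4
D = np.diag(rng.uniform(0.5, 1.5, n))             # positive decay rates (assumed)
M = np.linalg.qr(rng.standard_normal((n, n)))[0]  # orthogonal mixing: g is NOT diagonal
W = rng.standard_normal((n, n))
W *= 0.4 / np.linalg.norm(W, 2)                   # spectral norm below the smallest decay rate
b = rng.standard_normal(n)                        # constant external input (assumed)

def g(x):
    return np.tanh(M @ x)                         # non-diagonal, globally Lipschitz activation

def rnn(t, x):
    return -D @ x + W @ g(x) + b                  # dx/dt = -D x + W g(x) + b

# Two trajectories from very different initial states end up at the same equilibrium.
x1 = solve_ivp(rnn, (0.0, 50.0), rng.standard_normal(n)).y[:, -1]
x2 = solve_ivp(rnn, (0.0, 50.0), 10 * rng.standard_normal(n)).y[:, -1]
print("distance between trajectories at t = 50:", np.linalg.norm(x1 - x2))
```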

Actuators ◽  
2021 ◽  
Vol 10 (2) ◽  
pp. 30
Author(s):  
Pornthep Preechayasomboon ◽  
Eric Rombokas

Soft robotic actuators are now being used in practical applications; however, they are often limited to open-loop control that relies on the inherent compliance of the actuator. Achieving human-like manipulation and grasping with soft robotic actuators requires at least some form of sensing, which often comes at the cost of complex fabrication and purposefully built sensor structures. In this paper, we utilize the actuating fluid itself as a sensing medium to achieve high-fidelity proprioception in a soft actuator. As our sensors are somewhat unstructured, their readings are difficult to interpret using linear models. We therefore present a proof of concept of a method for deriving the pose of the soft actuator using recurrent neural networks. We present the experimental setup and our learned state estimator to show that our method is viable for achieving proprioception and is also robust to common sensor failures.
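As a hedged sketch of the general idea described above (not the authors' implementation), a small recurrent network can map a window of raw fluidic-sensor readings to a pose estimate. The sensor count, pose dimension, window length, and training details below are assumptions for illustration only.

```python
# Hypothetical GRU-based state estimator: raw sensor window in, pose estimate out.
import torch
import torch.nn as nn

class PoseEstimator(nn.Module):
    def __init__(self, n_sensors: int = 4, pose_dim: int = 3, hidden: int = 64):
        super().__init__()
        self.rnn = nn.GRU(n_sensors, hidden, batch_first=True)
        self.head = nn.Linear(hidden, pose_dim)

    def forward(self, x):                  # x: (batch, time, n_sensors)
        h, _ = self.rnn(x)
        return self.head(h[:, -1])         # pose estimate from the last time step

model = PoseEstimator()
optim = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Dummy batch standing in for (sensor window, ground-truth pose) training pairs.
readings = torch.randn(32, 100, 4)         # 32 windows, 100 samples, 4 sensor channels (assumed)
pose = torch.randn(32, 3)                  # e.g. tip position, assumed 3-D
loss = loss_fn(model(readings), pose)
loss.backward()
optim.step()
print("training loss on dummy batch:", float(loss))
```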


1999 ◽  
Vol 09 (02) ◽  
pp. 95-98 ◽  
Author(s):  
Anke Meyer-Bäse

This paper is concerned with the asymptotic hyperstability of recurrent neural networks. Based on these stability results, we derive necessary and sufficient conditions on the network parameters. The results are more general than those based on Lyapunov methods, since they impose milder constraints on the connection weights than the conventional results and do not assume symmetry of the weights.
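The paper's necessary and sufficient hyperstability conditions are not reproduced in the sketch below; it only checks a familiar sufficient condition for local asymptotic stability on a non-symmetric weight matrix, to illustrate that symmetry is not needed for stable behavior. All numbers are assumptions.

```python
# Illustrative only: a simple linearization check, not the paper's conditions.
# For dx/dt = -D x + W g(x) + b, the Jacobian at an equilibrium x* is
# -D + W diag(g'(x*)); if it is Hurwitz (all eigenvalues in the open left
# half-plane), x* is locally asymptotically stable, even for non-symmetric W.
import numpy as np

D = np.diag([1.0, 1.2, 0.8])               # positive decay rates (assumed)
W = np.array([[0.1, 0.4, -0.2],            # deliberately non-symmetric weights
              [-0.3, 0.0, 0.5],
              [0.2, -0.1, 0.1]])
slope = 1.0                                # activation slope at the equilibrium (assumed, e.g. tanh near 0)
J = -D + W * slope                         # Jacobian of the linearization
eigs = np.linalg.eigvals(J)
print("eigenvalue real parts:", eigs.real)
print("locally asymptotically stable:", bool(np.all(eigs.real < 0)))
```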

