A novel gradient-based neural network for solving convex second-order cone constrained variational inequality problems

2019, Vol. 347, pp. 343-356
Author(s): Alireza Nazemi, Atiye Sabeghi

2004, Vol. 16 (4), pp. 863-883
Author(s): Youshen Xia

Recently, the projection neural network has been shown to be a promising computational model for solving variational inequality problems with box constraints. This letter presents an extended projection neural network for solving monotone variational inequality problems with linear and nonlinear constraints; in particular, the proposed network includes the projection neural network as a special case. Compared with modified projection-type methods for constrained monotone variational inequality problems, the proposed neural network has lower complexity and is suitable for parallel implementation. Furthermore, it is proven to converge exponentially to an exact solution without requiring a Lipschitz condition. Illustrative examples show that the extended projection neural network can solve constrained monotone variational inequality problems.
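As a rough illustration of the box-constrained special case mentioned in the abstract, the sketch below integrates the classical projection neural network dynamics dx/dt = λ(P_Ω(x − αF(x)) − x) with forward Euler. The step sizes, gains, and the affine monotone mapping F are assumptions chosen for demonstration, not the letter's extended model.

```python
# Minimal sketch of a projection neural network for VI(F, Omega) with box
# constraints Omega = {x : lo <= x <= hi}.  All parameters and the example
# mapping F are illustrative assumptions.
import numpy as np

def project_box(x, lo, hi):
    """Projection onto the box [lo, hi] (componentwise clipping)."""
    return np.clip(x, lo, hi)

def projection_neural_network(F, x0, lo, hi, alpha=0.2, lam=1.0,
                              dt=1e-2, steps=20000, tol=1e-8):
    """Forward-Euler integration of dx/dt = lam*(P_Omega(x - alpha*F(x)) - x)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        dx = lam * (project_box(x - alpha * F(x), lo, hi) - x)
        if np.linalg.norm(dx) < tol:   # near an equilibrium, i.e. a VI solution
            break
        x = x + dt * dx
    return x

if __name__ == "__main__":
    # Monotone affine mapping F(x) = Mx + q with M positive definite (assumed example).
    M = np.array([[3.0, 1.0], [1.0, 2.0]])
    q = np.array([-4.0, 1.0])
    F = lambda x: M @ x + q
    sol = projection_neural_network(F, x0=np.zeros(2), lo=np.zeros(2), hi=5.0 * np.ones(2))
    print("equilibrium:", sol, " F(x*):", F(sol))
```

The equilibrium of the dynamics coincides with the fixed point x = P_Ω(x − αF(x)), which characterizes the VI solution; choosing α small relative to the strong-monotonicity and Lipschitz constants of F keeps the discretized flow stable.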


2019, Vol. 2019, pp. 1-18
Author(s): Juhe Sun, Xiao-Ren Wu, B. Saheya, Jein-Shan Chen, Chun-Hsu Ko

This paper focuses on solving quadratic programming problems with second-order cone constraints (SOCQP) and second-order cone constrained variational inequalities (SOCCVI) by using neural networks. More specifically, a neural network model based on two discrete-type families of SOC complementarity functions associated with the second-order cone is proposed to deal with the Karush-Kuhn-Tucker (KKT) conditions of SOCQP and SOCCVI. The two discrete-type families of SOC complementarity functions are newly explored. The neural network uses them to build two unconstrained minimization reformulations whose objectives are merit functions of the Karush-Kuhn-Tucker equations for SOCQP and SOCCVI. We show that these merit functions are Lyapunov functions and that the neural network is asymptotically stable. The main contribution of this paper lies in its simulation part, where we observe numerical performance different from that reported previously; in other words, for our two target problems we identify SOC complementarity functions that work more effectively with the proposed neural network.
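To make the merit-function idea concrete, here is a minimal sketch of a gradient-flow network for the KKT system of a small SOCQP (min ½xᵀQx + cᵀx subject to x in the second-order cone K). It substitutes the classical natural-residual SOC complementarity function φ_NR(x, y) = x − P_K(x − y) and a finite-difference gradient in place of the paper's discrete-type families and analytic construction, so the function choice, problem data, and parameters below are all illustrative assumptions.

```python
# Hedged sketch: gradient flow dz/dt = -grad Psi(z) on a merit function
# Psi(z) = 0.5*||Phi(z)||^2 built from the KKT residual of an SOCQP.
# The natural-residual C-function and the numerical gradient are stand-ins
# for the paper's construction.
import numpy as np

def proj_soc(z):
    """Projection onto the second-order cone {(z1, z2): z1 >= ||z2||}."""
    z1, z2 = z[0], z[1:]
    s = np.linalg.norm(z2)
    if z1 >= s:
        return z.copy()
    if z1 <= -s:
        return np.zeros_like(z)
    t = 0.5 * (z1 + s)
    return np.concatenate(([t], t * z2 / s))

def kkt_residual(z, Q, c):
    """Stacked KKT residual: Lagrangian stationarity and SOC complementarity."""
    n = len(c)
    x, y = z[:n], z[n:]
    stationarity = Q @ x + c - y             # grad of 0.5 x'Qx + c'x minus multiplier
    complementarity = x - proj_soc(x - y)    # natural-residual SOC C-function
    return np.concatenate([stationarity, complementarity])

def merit(z, Q, c):
    r = kkt_residual(z, Q, c)
    return 0.5 * r @ r

def gradient_flow(Q, c, steps=20000, dt=1e-2, h=1e-6, tol=1e-10):
    """Forward-Euler integration of dz/dt = -grad Psi(z), gradient by central differences."""
    n = len(c)
    z = np.zeros(2 * n)
    for _ in range(steps):
        if merit(z, Q, c) < tol:
            break
        g = np.array([(merit(z + h * e, Q, c) - merit(z - h * e, Q, c)) / (2 * h)
                      for e in np.eye(2 * n)])
        z = z - dt * g
    return z[:n], z[n:]

if __name__ == "__main__":
    Q = np.array([[4.0, 1.0, 0.0], [1.0, 3.0, 0.0], [0.0, 0.0, 2.0]])
    c = np.array([-1.0, 2.0, 1.0])
    x, y = gradient_flow(Q, c)
    z = np.concatenate([x, y])
    print("x* =", x, " multiplier y* =", y, " merit =", merit(z, Q, c))
```

A zero of the stacked residual is exactly a KKT point: the complementarity block vanishes iff x ∈ K, y ∈ K, and ⟨x, y⟩ = 0. The same gradient-flow template applies to SOCCVI KKT systems once the gradient of the quadratic objective is replaced by the variational inequality mapping.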

