A neural network based on the generalized FB function for nonlinear convex programs with second-order cone constraints

2016 ◽  
Vol 203 ◽  
pp. 62-72 ◽  
Author(s):  
Xinhe Miao ◽  
Jein-Shan Chen ◽  
Chun-Hsu Ko

Mathematics ◽
2018 ◽  
Vol 6 (11) ◽  
pp. 270 ◽
Author(s):  
Ali Sadeghi ◽  
Mansour Saraj ◽  
Nezam Amiri

In this article, a methodology is developed to solve interval and fractional interval programming problems with second-order cone constraints by converting them into non-interval form, where the objective function and constraints are interval-valued functions. We investigate the parametric and non-parametric forms of the interval-valued functions along with their convexity properties. Two approaches are developed to obtain efficient and properly efficient solutions. Furthermore, the efficient (Pareto optimal) solutions of fractional and non-fractional programming problems over $\mathbb{R}^n_+ \cup \{0\}$ are also discussed. The main idea of the present article is to introduce a new concept of efficiency, called the efficient space, induced by the lower and upper bounds of the respective intervals of the objective function and illustrated in several figures. Finally, some numerical examples are worked through to illustrate the methodology and confirm the validity of the obtained results.
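
A schematic of the problem class (the notation below is an illustrative assumption, not taken verbatim from the article): an interval-valued program over second-order cone constraints can be stated via lower- and upper-bound objectives $f^{L}$ and $f^{U}$,

\[
\min_{x \in \mathbb{R}^n} \; \big[\, f^{L}(x),\; f^{U}(x) \,\big]
\quad \text{subject to} \quad
\| A_i x + b_i \| \le c_i^{\top} x + d_i, \qquad i = 1, \dots, m,
\]

where each constraint asks $(c_i^{\top} x + d_i,\; A_i x + b_i)$ to lie in a second-order cone. A standard parametric form of the interval-valued objective is $f(x;t) = (1-t)\, f^{L}(x) + t\, f^{U}(x)$ with $t \in [0,1]$, and efficiency is judged with respect to the pair $(f^{L}, f^{U})$, which is the setting in which the efficient-space notion built from the interval bounds is introduced.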


2015 ◽  
Vol 44 (2) ◽  
pp. 457-469 ◽  
Author(s):  
Julio López ◽  
Sebastián Maldonado ◽  
Miguel Carrasco

2019 ◽  
Vol 2019 ◽  
pp. 1-18 ◽  
Author(s):  
Juhe Sun ◽  
Xiao-Ren Wu ◽  
B. Saheya ◽  
Jein-Shan Chen ◽  
Chun-Hsu Ko

This paper focuses on solving quadratic programming problems with second-order cone constraints (SOCQP) and the second-order cone constrained variational inequality (SOCCVI) by means of a neural network. More specifically, a neural network model based on two discrete-type families of SOC complementarity functions associated with the second-order cone is proposed to deal with the Karush-Kuhn-Tucker (KKT) conditions of SOCQP and SOCCVI. The two discrete-type families of SOC complementarity functions are newly explored. The neural network uses them to recast the Karush-Kuhn-Tucker equations of SOCQP and SOCCVI as two unconstrained minimizations of merit functions. We show that the merit functions for SOCQP and SOCCVI are Lyapunov functions and that the neural network is asymptotically stable. The main contribution of this paper lies in its simulation part, because we observe numerical performance different from that of existing approaches. In other words, for our two target problems, more effective SOC complementarity functions, which work well with the proposed neural network, are discovered.
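
For context, the sketch below evaluates the classical Fischer-Burmeister SOC complementarity function (the building block named in the first title above) through the Jordan-algebra square and square root associated with the second-order cone. It is a minimal NumPy illustration with hypothetical function names, not an implementation of the discrete-type families introduced in this paper.

import numpy as np

def jordan_square(x):
    # Jordan-algebra square x o x for x = (x1, x2) in R x R^{n-1}:
    # x o x = (||x||^2, 2*x1*x2).
    x1, x2 = x[0], x[1:]
    return np.concatenate(([x @ x], 2.0 * x1 * x2))

def soc_sqrt(x):
    # Square root in the Jordan algebra of the second-order cone, via the
    # spectral decomposition x = lam1*u1 + lam2*u2 (x is assumed to lie in
    # the cone; eigenvalues are clamped at 0 for numerical safety).
    x1, x2 = x[0], x[1:]
    nx2 = np.linalg.norm(x2)
    lam1, lam2 = x1 - nx2, x1 + nx2
    w = x2 / nx2 if nx2 > 0 else np.zeros_like(x2)  # any unit vector works when x2 = 0
    u1 = 0.5 * np.concatenate(([1.0], -w))
    u2 = 0.5 * np.concatenate(([1.0],  w))
    return np.sqrt(max(lam1, 0.0)) * u1 + np.sqrt(max(lam2, 0.0)) * u2

def phi_fb(x, y):
    # Fischer-Burmeister SOC complementarity function:
    # phi(x, y) = (x^2 + y^2)^{1/2} - (x + y),
    # which is zero iff x in K^n, y in K^n, and <x, y> = 0.
    return soc_sqrt(jordan_square(x) + jordan_square(y)) - (x + y)

# Quick check on a complementary pair on the boundary of K^3 with <x, y> = 0.
x = np.array([1.0,  1.0, 0.0])
y = np.array([1.0, -1.0, 0.0])
print(phi_fb(x, y))   # approximately [0, 0, 0]

Networks of this kind are typically realized as a gradient flow du/dt = -ρ ∇Ψ(u) of the merit function Ψ(u) = (1/2)‖Φ(u)‖², where Φ stacks the chosen complementarity function applied to the KKT system; the decrease of Ψ along trajectories is what underlies the Lyapunov and asymptotic-stability arguments mentioned above.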

