A Neural Network Model for Non-smooth Optimization over a Compact Convex Subset

Author(s): Guocheng Li, Shiji Song, Cheng Wu, Zifang Du
2018, Vol 62 (7), pp. 1061-1085

Author(s): Alireza Nazemi, Marziyeh Mortezaee

Abstract: In this paper, we describe a new neural network model for solving a class of non-smooth optimization problems with a min–max objective function. The basic idea is to replace the min–max function with a smooth approximation using an entropy function. With this smoothing technique, the non-smooth problem is converted into an equivalent differentiable convex programming problem. A neural network model is then constructed based on the Karush–Kuhn–Tucker optimality conditions. It is shown that the proposed neural network is stable in the sense of Lyapunov and converges to an exact optimal solution of the original problem. As an application in economics, we apply the proposed scheme to a min–max portfolio optimization problem. The effectiveness of the method is demonstrated by several numerical simulations.
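The smoothing step can be made concrete with a small sketch. The fragment below is a minimal illustration, not the authors' model: it smooths a finite max of affine functions with the entropy (log-sum-exp) function and integrates a simple projection-type gradient flow over a box. The problem data, the box constraint set, the smoothing parameter p, and the Euler integration are all illustrative assumptions.

```python
# Minimal sketch (not the paper's exact KKT-based model):
# entropy smoothing  max_i f_i(x) ~ (1/p) * log(sum_i exp(p * f_i(x))),
# followed by the projection dynamics  dx/dt = P_X(x - grad F_p(x)) - x.
import numpy as np

p = 50.0                                    # smoothing parameter; larger p = tighter approximation
A = np.array([[1.0, 2.0], [3.0, -1.0], [-2.0, 1.0]])
b = np.array([1.0, 0.5, -0.2])              # illustrative affine pieces f_i(x) = a_i . x + b_i
lo, hi = -1.0, 1.0                          # the compact convex set X is taken to be a box here

def smoothed_max(x):
    """Entropy (log-sum-exp) smoothing of max_i f_i(x)."""
    z = p * (A @ x + b)
    zmax = z.max()                          # shift for numerical stability
    return (zmax + np.log(np.exp(z - zmax).sum())) / p

def grad_smoothed_max(x):
    """Gradient of the smoothed objective: a softmax-weighted combination of the a_i."""
    z = p * (A @ x + b)
    w = np.exp(z - z.max())
    w /= w.sum()
    return A.T @ w

def project(x):
    """Projection onto the box [lo, hi]^n."""
    return np.clip(x, lo, hi)

# Forward-Euler integration of the projection neural network dynamics.
x = np.array([0.8, -0.6])
dt = 0.05
for _ in range(2000):
    x = x + dt * (project(x - grad_smoothed_max(x)) - x)

print("approximate minimizer:", x, "smoothed objective:", smoothed_max(x))
```

The state trajectory settles at a point of the box that minimizes the smoothed objective; as p grows, this point approaches a minimizer of the original min–max problem.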


Author(s): Seetharam K., Sharana Basava Gowda, Varadaraj

In software engineering, software metrics have a wide and deep scope. Many projects fail because of risks in software development [1]. Among the various risk factors, requirements creep is one. The paper discusses the approximate volume of creeping requirements that occur after completion of the nominal requirements phase, using software size measured in function points at four different levels, as in the sketch below. The major risk factors depend, both directly and indirectly, on the software size of the development. Hence it is possible to predict the risk due to requirements creep from size.
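The size-based estimate can be sketched as follows. This is not the paper's model: the constant monthly creep rate and the four project sizes are illustrative assumptions only, chosen to show how a function-point count drives the estimate.

```python
# Minimal sketch of a size-based creeping-requirements estimate.
# The monthly rate and the four size levels are assumed for illustration.

def creeping_requirements(nominal_fp: float, months: float, monthly_rate: float = 0.02) -> float:
    """Estimate the volume of creeping requirements (in function points)
    added after the nominal requirements phase, assuming a constant
    monthly growth rate applied to the nominal size."""
    return nominal_fp * monthly_rate * months

# Four illustrative project sizes (function points) over a 12-month build.
for size in (100, 1_000, 10_000, 100_000):
    creep = creeping_requirements(size, months=12)
    print(f"nominal size {size:>7} FP -> approx. {creep:>8.0f} FP of creeping requirements")
```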

