Data Processing with Combined Homotopy Methods for a Class of Nonconvex Optimization Problems

2014, Vol. 1046, pp. 403-406
Author(s):  
Yun Feng Gao ◽  
Ning Xu

Building on existing theoretical results, this paper studies the realization of combined homotopy methods for optimization problems over a specific class of nonconvex constrained regions. For this nonconvex constrained region, we give a construction method for the quasi-normal, prove that the chosen mappings on the constraint gradients are positively independent, and show that the feasible region of SLM satisfies the quasi-normal cone condition. We then construct the combined homotopy equation under the quasi-normal cone condition, and numerical examples with data processing yield favorable results.
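The combined homotopy equation itself depends on the quasi-normal cone construction, but the underlying continuation idea can be illustrated generically. The sketch below is an illustrative toy, not the authors' method: it follows the convex homotopy H(x, t) = t·g(x) + (1 − t)·(x − x0) from the trivial solution at t = 0 to a root of g at t = 1, with Newton correction at each step. The names `homotopy_root`, `g`, and `dg`, and the test equation, are all hypothetical choices for the example.

```python
import numpy as np

def homotopy_root(g, dg, x0, n_steps=50, newton_iters=5):
    """Follow H(x, t) = t*g(x) + (1 - t)*(x - x0) = 0 from t = 0
    (trivial solution x0) to t = 1 (a root of g), applying a few
    Newton corrections at each value of t."""
    x = float(x0)
    for t in np.linspace(0.0, 1.0, n_steps + 1)[1:]:
        for _ in range(newton_iters):
            H = t * g(x) + (1.0 - t) * (x - x0)       # homotopy value
            dH = t * dg(x) + (1.0 - t)                # its derivative in x
            x -= H / dH                               # Newton correction
    return x

# Classic test equation (Wallis): x^3 - 2x - 5 = 0
g = lambda x: x**3 - 2.0 * x - 5.0
dg = lambda x: 3.0 * x**2 - 2.0
root = homotopy_root(g, dg, x0=1.0)
```

Along this path dH stays positive, so the Newton corrector never hits a singular Jacobian; the combined homotopy methods in the abstract address precisely the harder case where constraints and nonconvexity make such path regularity nontrivial.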

2021
Author(s):  
Tianyi Liu ◽  
Zhehui Chen ◽  
Enlu Zhou ◽  
Tuo Zhao

The momentum stochastic gradient descent (MSGD) algorithm has been widely applied to many nonconvex optimization problems in machine learning (e.g., training deep neural networks, variational Bayesian inference, etc.). Despite its empirical success, there is still a lack of theoretical understanding of the convergence properties of MSGD. To fill this gap, we propose to analyze the algorithmic behavior of MSGD by diffusion approximations for nonconvex optimization problems with strict saddle points and isolated local optima. Our study shows that momentum helps escape from saddle points but hurts convergence within the neighborhood of optima (in the absence of step size or momentum annealing). Our theoretical discovery partially corroborates the empirical success of MSGD in training deep neural networks.
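As a point of reference for the update rule under analysis, here is a minimal momentum gradient descent sketch on a one-dimensional nonconvex objective f(x) = x⁴ − 2x² (an unstable stationary point at 0, optima at ±1). The objective, step size, and momentum coefficient are illustrative choices, not taken from the paper, and the gradient is deterministic for simplicity.

```python
import numpy as np

def msgd(grad, x0, lr=0.01, momentum=0.9, n_steps=500):
    """Momentum (heavy-ball) gradient descent:
    v <- mu * v - lr * grad(x);  x <- x + v."""
    x = np.asarray(x0, dtype=float)
    v = np.zeros_like(x)
    for _ in range(n_steps):
        v = momentum * v - lr * grad(x)
        x = x + v
    return x

# Nonconvex test objective f(x) = x^4 - 2x^2:
# unstable stationary point at x = 0, minima at x = +-1.
f_grad = lambda x: 4.0 * x**3 - 4.0 * x

# Starting slightly off the stationary point, momentum carries the
# iterate away from x = 0 and into the basin of the minimum at x = 1.
x_star = msgd(f_grad, np.array([0.1]))
```

The accumulated velocity is what pushes the iterate off the flat region quickly, and the same velocity is what causes the damped oscillation around x = 1 before settling, matching the escape-versus-local-convergence trade-off the abstract describes.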


Author(s):  
Abdelkrim El Mouatasim ◽  
Rachid Ellaia ◽  
Eduardo de Cursi

Random Perturbation of the Projected Variable Metric Method for Nonsmooth Nonconvex Optimization Problems with Linear Constraints

We present a random perturbation of the projected variable metric method for solving linearly constrained nonsmooth (i.e., nondifferentiable) nonconvex optimization problems, and we establish the convergence to a global minimum for a locally Lipschitz continuous objective function which may be nondifferentiable on a countable set of points. Numerical results show the effectiveness of the proposed approach.
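As a much simplified illustration of the perturbation idea only (identity metric rather than a variable metric, box rather than general linear constraints, and all names hypothetical), the sketch below adds a decaying Gaussian perturbation to a projected subgradient step on a nonsmooth nonconvex objective built as a pointwise minimum of two parabolas.

```python
import numpy as np

def f(x):
    # Nonsmooth, nonconvex objective: pointwise min of two parabolas
    # (kink at x = 1/3, local min near x = -1, global min at x = 2).
    return min((x + 1.0) ** 2, (x - 2.0) ** 2 - 1.0)

def subgrad(x):
    # A subgradient of f at x: gradient of the active branch.
    if (x + 1.0) ** 2 <= (x - 2.0) ** 2 - 1.0:
        return 2.0 * (x + 1.0)
    return 2.0 * (x - 2.0)

def perturbed_projected_descent(x0, lo=-3.0, hi=4.0, lr=0.05,
                                n_steps=2000, seed=0):
    """Projected subgradient step plus a decaying Gaussian perturbation.
    The perturbation lets the iterate explore early on; the projection
    (here a simple clip) keeps every iterate feasible for the box."""
    rng = np.random.default_rng(seed)
    x = float(np.clip(x0, lo, hi))
    x_best, f_best = x, f(x)
    for t in range(1, n_steps + 1):
        sigma = 2.0 / np.sqrt(t)                 # decaying perturbation scale
        x = x - lr * subgrad(x) + sigma * rng.normal()
        x = float(np.clip(x, lo, hi))            # projection onto the box
        if f(x) < f_best:                        # keep the best feasible point
            x_best, f_best = x, f(x)
    return x_best, f_best

x_best, f_best = perturbed_projected_descent(x0=0.0)
```

Tracking the best point visited is a common companion to random perturbation schemes: once the perturbation scale has decayed, the iterate behaves like plain projected descent, while early large perturbations give it a chance to leave the basin of a poor local minimum.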

