gradient projection algorithm
Recently Published Documents


TOTAL DOCUMENTS: 64 (FIVE YEARS: 15)

H-INDEX: 11 (FIVE YEARS: 2)

Author(s): Ihar Antonau, Majid Hojjat, Kai-Uwe Bletzinger

Abstract: In node-based shape optimization there is a vast number of design parameters, and the objectives, as well as the physical constraints, are non-linear in state and design, so robust optimization algorithms are required. The methods of feasible directions are widely used in practical optimization problems and are known to be quite robust. A subclass of these methods is the gradient projection method. It is an active-set method, it can be used with equality and inequality constraints, and it has gained significant popularity for its intuitive implementation. One significant efficiency issue is that the algorithm may suffer from zigzagging behavior while it follows non-linear design boundaries. In this work, we propose a modification of Rosen's gradient projection algorithm. It includes efficient techniques to damp the zigzagging behavior of the original algorithm while following non-linear design boundaries, thus improving the performance of the method.
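To make the core projection step concrete, here is a minimal sketch in Python (NumPy) for the linearly constrained case: the negative gradient is projected onto the null space of the active constraint normals, so a move along the returned direction keeps the active constraints satisfied. The function name is illustrative, N is assumed to have full column rank, and the zigzag damping proposed in this paper is not shown.

```python
import numpy as np

def rosen_projection_step(grad, N):
    """Projection step of a gradient projection method for linear
    constraints. N's columns are the gradients (normals) of the active
    constraints; the returned direction lies in their null space.

    Returns None when the projected gradient vanishes, i.e. the current
    point is stationary with respect to the active set.
    """
    if N.size == 0:
        d = -grad                          # no active constraints: steepest descent
    else:
        # P = I - N (N^T N)^{-1} N^T projects onto the tangent subspace
        P = np.eye(grad.shape[0]) - N @ np.linalg.solve(N.T @ N, N.T)
        d = -P @ grad
    return d if np.linalg.norm(d) > 1e-10 else None
```

A line search along this direction, followed by updating the active set as constraints become active or inactive, completes one iteration of the basic method; the zigzagging the paper addresses arises when this update alternates between nearby active sets on curved boundaries.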


Author(s): Jean Walrand

Abstract: Online learning algorithms update their estimates as additional observations are made. Section 12.1 explains a simple example: online linear regression. The stochastic gradient projection algorithm is a general technique for updating estimates based on additional observations; it is widely used in machine learning. Section 12.2 presents the theory behind that algorithm. When analyzing large amounts of data, one faces the problems of identifying the most relevant data and of using the available data efficiently. Section 12.3 explains three examples of how these questions are addressed: the LASSO algorithm, compressed sensing, and the matrix completion problem. Section 12.4 discusses deep neural networks, for which the stochastic gradient projection algorithm is easy to implement.
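A minimal sketch of stochastic gradient projection applied to online linear regression, in the spirit of Sections 12.1-12.2: each arriving sample triggers a gradient step on its squared error, followed by projection back onto a feasible set. The Euclidean ball, its radius, and the learning rate here are illustrative assumptions, not the chapter's exact setup.

```python
import numpy as np

def online_regression_sgd(stream, dim, radius=10.0, lr=0.01):
    """Online linear regression via stochastic gradient projection.

    For each pair (x, y), take a gradient step on the per-sample loss
    0.5 * (theta . x - y)**2, then project the estimate back onto the
    Euclidean ball {||theta|| <= radius}.
    """
    theta = np.zeros(dim)
    for x, y in stream:
        grad = (theta @ x - y) * x        # gradient of the per-sample loss
        theta = theta - lr * grad         # stochastic gradient step
        norm = np.linalg.norm(theta)
        if norm > radius:
            theta *= radius / norm        # projection onto the ball
    return theta

# Illustrative usage on a synthetic data stream
rng = np.random.default_rng(0)
true_theta = np.array([2.0, -1.0])
stream = ((x, x @ true_theta + 0.1 * rng.standard_normal())
          for x in rng.standard_normal((5000, 2)))
print(online_regression_sgd(stream, dim=2))
```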


2020, Vol. 2020, pp. 1-7
Author(s): Ye Li, Jun Sun, Biao Qu

The nonnegative sparsity-constrained optimization problem arises in many fields, such as the linear compressed sensing problem and the regularized logistic regression cost function. In this paper, we introduce a new stepsize rule and establish a gradient projection algorithm. We also obtain some convergence results under milder conditions.
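A minimal sketch of a gradient projection iteration for this problem class: the standard Euclidean projection onto the nonnegative s-sparse set clips negative entries and keeps the s largest remaining ones. The constant stepsize below is an assumption standing in for the paper's new stepsize rule, which is not reproduced here.

```python
import numpy as np

def project_nonneg_sparse(x, s):
    """Euclidean projection onto {x >= 0, ||x||_0 <= s}: clip negative
    entries to zero, then keep only the s largest entries."""
    y = np.maximum(x, 0.0)
    if np.count_nonzero(y) > s:
        smallest = np.argpartition(y, -s)[:-s]   # all but the s largest
        y[smallest] = 0.0
    return y

def gradient_projection(grad_f, x0, s, step=0.1, iters=500):
    """Gradient projection with a constant stepsize (illustrative;
    the paper's stepsize rule is not reproduced)."""
    x = project_nonneg_sparse(np.asarray(x0, dtype=float), s)
    for _ in range(iters):
        x = project_nonneg_sparse(x - step * grad_f(x), s)
    return x
```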

