Gradient-type method for minimization of nonsmooth penalty functions

Cybernetics, 1989, Vol. 24 (4), pp. 511-524
Author(s): Yu. M. Danilin



2019, Vol. 35 (3), pp. 371-378
Author(s): Porntip Promsinchai, Narin Petrot

In this paper, we consider convex constrained optimization problems with composite objective functions over the set of minimizers of another function. The main aim is to test numerically a new algorithm, a stochastic block-coordinate proximal-gradient algorithm with penalization, by comparing its iteration counts and CPU times with those of other well-known block-coordinate descent algorithms on randomly generated optimization problems with a regularization term.
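As a rough illustration of the scheme described above, the sketch below implements a stochastic block-coordinate proximal-gradient step with an increasing penalty parameter. It is a minimal sketch, not the authors' exact algorithm: the l1 regularizer (whose prox is soft-thresholding), the diminishing step size step0/k, the penalty schedule beta0*sqrt(k), and all function names are assumptions made for the example.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (componentwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def stochastic_bc_prox_grad_penalized(grad_f, grad_h, x0, lam, n_blocks,
                                      n_iters=1000, step0=1.0, beta0=1.0,
                                      rng=None):
    """One random block per iteration: a gradient step on the smooth part
    f + beta_k * h restricted to the block, followed by the prox of the
    (assumed) l1 regularizer on that block. The penalty parameter beta_k
    grows so the iterates are driven toward the minimizers of h."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float).copy()
    blocks = np.array_split(np.arange(x.size), n_blocks)
    for k in range(1, n_iters + 1):
        b = blocks[rng.integers(n_blocks)]       # pick a block at random
        alpha = step0 / k                        # diminishing step size
        beta = beta0 * np.sqrt(k)                # increasing penalty parameter
        g = grad_f(x)[b] + beta * grad_h(x)[b]   # block gradient of f + beta*h
        x[b] = soft_threshold(x[b] - alpha * g, alpha * lam)
    return x
```

Each iteration updates only one randomly chosen block, which is what makes block-coordinate schemes attractive in high dimensions; the sketch evaluates the full gradients merely for brevity.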



Author(s): Antonio André Novotny, Jan Sokołowski, Antoni Żochowski




2010, Vol. 59 (10), pp. 3301-3307
Author(s): Mahboubeh Farid, Wah June Leong, Malik Abu Hassan


2012, Vol. 2012, pp. 1-11
Author(s): Mahboubeh Farid, Wah June Leong, Lihong Zheng

This paper focuses on developing diagonal gradient-type methods that employ an accumulative approach in multistep diagonal updating to determine a better Hessian approximation at each step. An interpolating curve is used to derive a generalization of the weak secant equation that carries information about the local Hessian. The new parameterization of the interpolating curve in variable space is obtained through the accumulative approach, via a norm weighting defined by two positive definite weighting matrices. We also note that the storage needed for all computations of the proposed method is just O(n). Numerical results show that the proposed algorithm is efficient and compares favorably with some other gradient-type methods.
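The multistep, curve-based update is specific to the paper; the sketch below shows only the simpler one-step building block it generalizes: a diagonal Hessian approximation D updated by the least-change (Frobenius-norm) solution of the weak secant equation s^T D s = s^T y, which keeps storage at O(n). The step rule, positivity floor, and all names are assumptions for illustration.

```python
import numpy as np

def diagonal_gradient_method(grad, x0, n_iters=500, tol=1e-6):
    """Gradient-type iteration x+ = x - D^{-1} grad(x), where D is diagonal
    and is updated by the least-change solution of the weak secant equation
    s^T D s = s^T y. The paper's accumulative multistep variant replaces
    (s, y) with quantities derived from an interpolating curve."""
    x = np.asarray(x0, dtype=float).copy()
    d = np.ones_like(x)                      # diagonal of D_0 = I; O(n) storage
    g = grad(x)
    for _ in range(n_iters):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - g / d                    # scaled (diagonal) gradient step
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        s2 = s * s
        denom = np.dot(s2, s2)               # sum of s_i^4
        if denom > 1e-12:
            mu = (np.dot(s, y) - np.dot(s2, d)) / denom
            d = np.maximum(d + mu * s2, 1e-8)    # keep D positive definite
        x, g = x_new, g_new
    return x
```

The update solves min ||D_new - D||_F over diagonal matrices subject to s^T D_new s = s^T y, so each component changes in proportion to s_i^2.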



2013, Vol. 2013, pp. 1-6
Author(s): Can Li

We are concerned with optimization problems subject to nonnegativity constraints. It is well known that conjugate gradient methods are efficient for solving large-scale unconstrained optimization problems because of their simplicity and low storage requirements. Combining the modified Polak-Ribière-Polyak method proposed by Zhang, Zhou, and Li with the Zoutendijk feasible direction method, we propose a conjugate gradient-type method for solving nonnegativity-constrained optimization problems. If the current iterate is feasible, the direction generated by the proposed method is always a feasible descent direction at that iterate. Under appropriate conditions, we show that the proposed method is globally convergent. We also present numerical results demonstrating the efficiency of the proposed method.
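The following sketch, under stated assumptions, combines the modified Polak-Ribière-Polyak (PRP) direction of Zhang, Zhou, and Li (which satisfies d^T g = -||g||^2) with a simplified feasibility correction standing in for the Zoutendijk step: direction components that would push an active variable (x_i = 0) negative are zeroed, and the line search is capped at the largest feasible step. The callables f and grad, the Armijo constants, and the safeguard are assumptions, not the paper's exact method.

```python
import numpy as np

def mprp_feasible_cg(f, grad, x0, n_iters=1000, tol=1e-8):
    """CG-type iteration for min f(x) subject to x >= 0."""
    x = np.maximum(np.asarray(x0, dtype=float), 0.0)
    g = grad(x)
    d = -g
    for _ in range(n_iters):
        # feasibility correction: at active constraints (x_i = 0), drop
        # direction components that would leave the nonnegative orthant
        d = np.where((x <= 0.0) & (d < 0.0), 0.0, d)
        if np.dot(g, d) >= 0.0:              # safeguard: projected steepest descent
            d = np.where((x <= 0.0) & (g > 0.0), 0.0, -g)
        pg = np.where((x <= 0.0) & (g > 0.0), 0.0, g)   # projected gradient
        if np.linalg.norm(pg) < tol:
            break
        neg = d < 0.0                        # largest step that stays feasible
        t = min(1.0, np.min(-x[neg] / d[neg])) if neg.any() else 1.0
        while f(x + t * d) > f(x) + 1e-4 * t * np.dot(g, d) and t > 1e-12:
            t *= 0.5                         # Armijo backtracking
        x_new = np.maximum(x + t * d, 0.0)
        g_new = grad(x_new)
        y, gg = g_new - g, np.dot(g, g)
        beta = np.dot(g_new, y) / gg         # PRP coefficient
        theta = np.dot(g_new, d) / gg        # Zhang-Zhou-Li correction term
        d = -g_new + beta * d - theta * y    # modified PRP direction
        x, g = x_new, g_new
    return x
```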



Biometrika, 2020, Vol. 107 (2), pp. 397-414
Author(s): David E. Tyler, Mengxi Yi

Summary: The properties of penalized sample covariance matrices depend on the choice of the penalty function. In this paper, we introduce a class of nonsmooth penalty functions for the sample covariance matrix and demonstrate how their use results in a grouping of the estimated eigenvalues. We refer to the proposed method as lassoing eigenvalues, or the elasso.
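The paper's elasso penalty and algorithm are not reproduced here. Purely to illustrate how a nonsmooth penalty can group eigenvalues, the sketch below applies a fused-lasso (total-variation) penalty to the sorted sample eigenvalues, solved by a small ADMM loop, and rebuilds the covariance estimate with the original eigenvectors; the penalty choice, the ADMM parameters, and all names are assumptions made for this example.

```python
import numpy as np

def soft(v, t):
    """Componentwise soft-thresholding."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def group_eigenvalues(lams, gamma, rho=1.0, n_iters=200):
    """Solve min_theta 0.5*||theta - lams||^2 + gamma*sum_i |theta_{i+1} - theta_i|
    by ADMM. Adjacent eigenvalues whose difference is shrunk to exactly zero
    are tied together into one group."""
    p = lams.size
    D = np.diff(np.eye(p), axis=0)           # (p-1) x p first-difference matrix
    A = np.eye(p) + rho * D.T @ D
    theta, z, u = lams.copy(), np.zeros(p - 1), np.zeros(p - 1)
    for _ in range(n_iters):
        theta = np.linalg.solve(A, lams + rho * D.T @ (z - u))
        z = soft(D @ theta + u, gamma / rho)  # shrink eigenvalue gaps
        u += D @ theta - z                    # dual update
    return theta

def elasso_like_estimate(S, gamma):
    """Eigendecompose the sample covariance S, group its eigenvalues, and
    rebuild the estimate with the original eigenvectors."""
    lams, V = np.linalg.eigh(S)               # eigenvalues in ascending order
    theta = group_eigenvalues(lams, gamma)
    return V @ np.diag(theta) @ V.T
```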


