spectral gradient method
Recently Published Documents


TOTAL DOCUMENTS: 20 (FIVE YEARS 5)
H-INDEX: 5 (FIVE YEARS 0)

2021 ◽  
Vol 36 ◽  
pp. 04007
Author(s):  
Gillian Yi Han Woo ◽  
Hong Seng Sim ◽  
Yong Kheng Goh ◽  
Wah June Leong

In this paper, we propose to use the spectral proximal method to solve sparse optimization problems. Sparse optimization refers to an optimization problem involving the ℓ0-norm in the objective or constraints. Previous research showed that the spectral gradient method outperforms other standard unconstrained optimization methods, because it replaces the full-rank matrix with a diagonal matrix, reducing the memory requirement from O(n²) to O(n). Since the ℓ0-norm term is nonconvex and nonsmooth, the problem cannot be solved by standard optimization algorithms. We consider the ℓ0-norm problem with an underdetermined system as its constraint. Using the Lagrangian method, this problem is transformed into an unconstrained optimization problem. A new method called the spectral proximal method is proposed, which combines the proximal method and the spectral gradient method, and it is applied to the ℓ0-norm unconstrained optimization problem. The code is written in Python to compare the efficiency of the proposed method with some existing methods. The benchmarks of the comparison are the number of iterations, the number of function calls, and the computational time. Theoretically, the proposed method requires less storage and less computational time.
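As a rough illustration of the approach described in this abstract (not the authors' implementation), the following Python sketch applies a spectral proximal iteration to the Lagrangian form min_x 0.5‖Ax − b‖² + λ‖x‖₀ of the underdetermined ℓ0 problem: a Barzilai-Borwein spectral stepsize stands in for the diagonal matrix approximation, and the proximal step for the ℓ0 term is hard thresholding. The names A, b, lam, and max_iter are illustrative.

```python
# Minimal sketch of a spectral proximal iteration for
#   min_x  0.5*||A @ x - b||^2 + lam * ||x||_0
# (Lagrangian form of the underdetermined l0 problem). Assumed, simplified code.
import numpy as np

def hard_threshold(x, t):
    """Proximal operator of t*||.||_0: zero out entries with small magnitude."""
    y = x.copy()
    y[np.abs(y) <= np.sqrt(2.0 * t)] = 0.0
    return y

def spectral_proximal_l0(A, b, lam=0.1, max_iter=500, tol=1e-8):
    x = np.zeros(A.shape[1])
    grad = A.T @ (A @ x - b)
    alpha = 1.0                               # initial spectral stepsize
    for _ in range(max_iter):
        x_new = hard_threshold(x - alpha * grad, alpha * lam)
        grad_new = A.T @ (A @ x_new - b)
        s, y = x_new - x, grad_new - grad
        if np.linalg.norm(s) < tol:
            break
        # Barzilai-Borwein spectral stepsize (safeguarded), standing in for
        # the diagonal Hessian approximation of the spectral gradient method
        alpha = np.clip(s @ s / (s @ y + 1e-16), 1e-10, 1e10)
        x, grad = x_new, grad_new
    return x

# Example: recover a sparse vector from an underdetermined system
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 200))
x_true = np.zeros(200); x_true[rng.choice(200, 5, replace=False)] = 1.0
x_hat = spectral_proximal_l0(A, A @ x_true, lam=0.05)
```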


2021 ◽  
Vol 0 (0) ◽  
pp. 0
Author(s):  
Hong Seng Sim ◽  
Chuei Yee Chen ◽  
Wah June Leong ◽  
Jiao Li

This paper proposes a nonmonotone spectral gradient method for solving large-scale unconstrained optimization problems. The spectral parameter is derived from the eigenvalues of an optimally sized memoryless symmetric rank-one matrix obtained under the measure defined as a ratio of the determinant of the updating matrix over its largest eigenvalue. Coupled with a nonmonotone line search strategy where backtracking-type line search is applied selectively, the spectral parameter acts as a stepsize during iterations when no line search is performed and as a milder form of quasi-Newton update when backtracking line search is employed. Convergence properties of the proposed method are established for uniformly convex functions. Extensive numerical experiments are conducted and the results indicate that our proposed spectral gradient method outperforms some standard conjugate-gradient methods.
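A simplified sketch of such a nonmonotone spectral gradient loop is given below. The paper derives its spectral parameter from an optimally sized memoryless symmetric rank-one matrix; here a standard Barzilai-Borwein stepsize is used as a stand-in, with a max-of-recent-values nonmonotone backtracking line search applied only when the trial step does not already satisfy the acceptance test. All parameter names are illustrative.

```python
# Assumed, simplified sketch of a nonmonotone spectral gradient method.
import numpy as np

def nonmonotone_spectral_gradient(f, grad, x0, max_iter=1000, M=10,
                                  sigma=1e-4, tol=1e-6):
    x, g = x0.astype(float), grad(x0)
    alpha = 1.0
    f_hist = [f(x)]
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        d = -alpha * g                       # spectral step as search direction
        f_ref = max(f_hist[-M:])             # nonmonotone reference value
        t = 1.0
        while f(x + t * d) > f_ref + sigma * t * (g @ d):
            t *= 0.5                         # backtrack only when needed
        x_new = x + t * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        # Barzilai-Borwein stepsize as a stand-in for the SR1-derived parameter
        alpha = np.clip(s @ s / (s @ y + 1e-16), 1e-10, 1e10)
        x, g = x_new, g_new
        f_hist.append(f(x))
    return x

# Example: minimize a uniformly convex quadratic
Q = np.diag(np.arange(1.0, 11.0))
x_min = nonmonotone_spectral_gradient(lambda x: 0.5 * x @ Q @ x,
                                      lambda x: Q @ x,
                                      np.ones(10))
```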


2020 ◽  
Vol 2020 ◽  
pp. 1-7
Author(s):  
Yaping Hu ◽  
Liying Liu ◽  
Yujie Wang

The joint feature selection problem can be resolved by solving a matrix ℓ2,1-norm minimization problem. One of the most appealing features of ℓ2,1-norm regularization is that a shared sparsity structure can be exploited across multiple predictors. However, the nonsmooth nature of the regularizer poses significant challenges. In this paper, an alternating direction method of multipliers combined with the spectral gradient method is proposed for solving the matrix ℓ2,1-norm optimization problem arising in multitask feature learning. Numerical experiments show the effectiveness of the proposed algorithm.
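As a sketch of the kind of splitting involved (not the paper's exact algorithm), the following Python code applies an ADMM splitting to min_W 0.5‖XW − Y‖²_F + λ‖W‖_{2,1}: the smooth W-subproblem is handled with a few Barzilai-Borwein (spectral gradient) steps, and the ℓ2,1 proximal step is row-wise soft thresholding. The names X, Y, lam, and rho are illustrative.

```python
# Assumed sketch: ADMM with an inner spectral gradient solver for
#   min_W  0.5*||X @ W - Y||_F^2 + lam * ||W||_{2,1}
import numpy as np

def prox_l21(V, t):
    """Row-wise soft thresholding: proximal operator of t*||.||_{2,1}."""
    norms = np.linalg.norm(V, axis=1, keepdims=True)
    scale = np.maximum(0.0, 1.0 - t / np.maximum(norms, 1e-12))
    return scale * V

def admm_l21(X, Y, lam=0.1, rho=1.0, outer=100, inner=5):
    n_feat, n_task = X.shape[1], Y.shape[1]
    W = np.zeros((n_feat, n_task))
    Z = np.zeros_like(W)
    U = np.zeros_like(W)

    def grad(Wv):  # gradient of the smooth augmented W-subproblem
        return X.T @ (X @ Wv - Y) + rho * (Wv - Z + U)

    for _ in range(outer):
        G, alpha = grad(W), 1.0
        for _ in range(inner):               # a few spectral gradient steps
            W_new = W - alpha * G
            G_new = grad(W_new)
            S, Yg = W_new - W, G_new - G
            alpha = np.clip(np.sum(S * S) / (np.sum(S * Yg) + 1e-16),
                            1e-10, 1e10)
            W, G = W_new, G_new
        Z = prox_l21(W + U, lam / rho)       # proximal (Z) update
        U = U + W - Z                        # dual update
    return Z

# Example: shared row sparsity across two tasks
rng = np.random.default_rng(1)
X = rng.standard_normal((100, 30))
W_true = np.zeros((30, 2)); W_true[:5] = rng.standard_normal((5, 2))
W_hat = admm_l21(X, X @ W_true, lam=0.5)
```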


2020 ◽  
Vol 14 (5) ◽  
pp. 86
Author(s):  
Auwal Bala Abubakar ◽  
Kanikar Muangchoo ◽  
Auwal Muhammad ◽  
Abdulkarim Hassan Ibrahim

In this paper, a new spectral gradient direction is proposed to solve the ℓ1-regularized convex minimization problem. The spectral parameter of the proposed method is computed as a convex combination of two existing spectral parameters from conjugate gradient methods. Moreover, the spectral gradient method is applied to the resulting problem at each iteration without requiring the Jacobian matrix. Furthermore, the proposed method is shown to converge globally under some assumptions. Numerically, the proposed method is efficient and robust in terms of the quality of sparse signal reconstruction and its low computational cost compared with existing methods.
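A minimal sketch of this idea is shown below, assuming the ℓ1-regularized least-squares formulation min_x 0.5‖Ax − b‖² + λ‖x‖₁ common in sparse signal reconstruction. The spectral stepsize is taken as a convex combination θ·α₁ + (1 − θ)·α₂ of the two classical Barzilai-Borwein parameters, standing in for the paper's CG-derived parameters; A, b, lam, and theta are illustrative.

```python
# Assumed sketch of a spectral proximal-gradient iteration for
#   min_x  0.5*||A @ x - b||^2 + lam * ||x||_1
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of t*||.||_1."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def spectral_l1(A, b, lam=0.1, theta=0.5, max_iter=500, tol=1e-8):
    x = np.zeros(A.shape[1])
    g = A.T @ (A @ x - b)
    alpha = 1.0
    for _ in range(max_iter):
        x_new = soft_threshold(x - alpha * g, alpha * lam)
        g_new = A.T @ (A @ x_new - b)
        s, y = x_new - x, g_new - g
        if np.linalg.norm(s) < tol:
            break
        alpha1 = s @ s / (s @ y + 1e-16)      # BB1 spectral parameter
        alpha2 = (s @ y) / (y @ y + 1e-16)    # BB2 spectral parameter
        # convex combination of the two spectral parameters (safeguarded)
        alpha = np.clip(theta * alpha1 + (1 - theta) * alpha2, 1e-10, 1e10)
        x, g = x_new, g_new
    return x

# Example: reconstruct a sparse signal from compressed measurements
rng = np.random.default_rng(2)
A = rng.standard_normal((80, 256))
x_true = np.zeros(256); x_true[rng.choice(256, 8, replace=False)] = 1.0
x_hat = spectral_l1(A, A @ x_true, lam=0.05)
```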


2018 ◽  
Vol 73 (5) ◽  
pp. 425-430
Author(s):  
Zhang Mengxia ◽  
Yang Xiaomin

Abstract: Compatible pairs of Hamiltonian operators for the synthetical two-component model of Xia, Qiao, and Zhou are derived systematically by means of the spectral gradient method. A new two-component system, which is bi-Hamiltonian, is presented. For this new system, the construction of its peakon solutions is considered.


2015 ◽  
Vol 9 (3) ◽  
pp. 815-833 ◽  
Author(s):  
Donghui Li ◽  
Zixin Chen ◽  
Wanyou Cheng
