A proximal alternating direction method for $\ell_{2,1}$-norm least squares problem in multi-task feature learning
2012, Vol. 8 (4), pp. 1057-1069
Author(s): Yunhai Xiao, Soon-Yi Wu, Bing-Sheng He
...
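This entry lists only the title and authors. For orientation, below is a minimal NumPy sketch of the row-wise group soft-thresholding operator, i.e. the proximal map of $\tau\|W\|_{2,1}$ with $\|W\|_{2,1}=\sum_i\|W_{i,:}\|_2$, which methods of this kind apply to the standard multi-task model $\min_W \tfrac12\|XW-Y\|_F^2+\lambda\|W\|_{2,1}$. The single proximal-gradient step shown is only illustrative and is not the paper's proximal alternating direction scheme; all names and sizes are assumptions.

```python
import numpy as np

def prox_l21(V, tau):
    """Proximal map of tau * ||W||_{2,1}: shrink each row of V toward zero
    by its Euclidean norm (row-wise group soft-thresholding)."""
    row_norms = np.linalg.norm(V, axis=1, keepdims=True)
    scale = np.maximum(0.0, 1.0 - tau / np.maximum(row_norms, 1e-12))
    return scale * V

# Illustrative single proximal-gradient step on
#   min_W 0.5 * ||X W - Y||_F^2 + lam * ||W||_{2,1}
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 20))      # shared features for all tasks
Y = rng.standard_normal((50, 5))       # responses, one column per task
W = np.zeros((20, 5))                  # weight matrix, one column per task
lam = 0.1
step = 1.0 / np.linalg.norm(X, 2) ** 2  # 1 / Lipschitz constant of the smooth term
grad = X.T @ (X @ W - Y)                # gradient of the least-squares term
W = prox_l21(W - step * grad, step * lam)
```

Rows of $W$ whose norm falls below the threshold are set exactly to zero, which is what discards features shared across all tasks.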
2012, Vol. 2 (4), pp. 326-341
Author(s): Raymond H. Chan, Min Tao, Xiaoming Yuan

Abstract. The alternating direction method of multipliers (ADMM) is applied to a constrained linear least-squares problem, where the objective function is a sum of two least-squares terms and there are box constraints. The original problem is decomposed into two easier least-squares subproblems at each iteration, and, to speed up the inner iteration, we linearize the relevant subproblem whenever it has no known closed-form solution. We prove the convergence of the resulting algorithm and apply it to some image deblurring problems. Its efficiency is demonstrated in comparison with Newton-type methods.
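The abstract describes the splitting only at a high level. Below is a minimal NumPy sketch of one such scheme, assuming the model $\min_x \tfrac12\|Ax-b\|^2+\tfrac{\mu}{2}\|Bx-c\|^2$ subject to $l\le x\le u$, with the $x$-subproblem replaced by a single linearized (gradient) step and the box handled by projection. The function name, step-size rule, and penalty $\beta$ are assumptions, not the paper's exact algorithm.

```python
import numpy as np

def admm_box_ls(A, b, B, c, mu, lo, hi, beta=1.0, iters=500):
    """Sketch of ADMM for min 0.5||Ax-b||^2 + (mu/2)||Bx-c||^2  s.t.  lo <= x <= hi,
    via the splitting x = z with z constrained to the box."""
    n = A.shape[1]
    x = np.zeros(n)
    z = np.zeros(n)
    u = np.zeros(n)                                   # scaled multiplier for x = z
    # step size for the linearized x-update: reciprocal of a Lipschitz bound
    L = np.linalg.norm(A, 2) ** 2 + mu * np.linalg.norm(B, 2) ** 2 + beta
    tau = 1.0 / L
    for _ in range(iters):
        # x-update, linearized: one gradient step on the augmented Lagrangian in x
        grad = A.T @ (A @ x - b) + mu * B.T @ (B @ x - c) + beta * (x - z + u)
        x = x - tau * grad
        # z-update: projection onto the box [lo, hi]
        z = np.clip(x + u, lo, hi)
        # multiplier update
        u = u + x - z
    return x
```

The linearized $x$-update avoids solving a linear system with $A^\top A + \mu B^\top B$ at every iteration, which is the point of linearizing the subproblem when no closed-form solution is available.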


Mathematics, 2021, Vol. 9 (9), pp. 941
Author(s): Hai-Long Shen, Xu Tang

In this paper, a preconditioned and proximal alternating direction method of multipliers (PPADMM) is established for iteratively solving equality-constrained quadratic programming problems. Based on a strict matrix analysis, we prove that the method is asymptotically convergent. We also show the connection between this method and some existing methods, so that it combines their advantages. Finally, numerical examples show that the proposed algorithm is efficient, stable, and flexible for solving quadratic programming problems with equality constraints.
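The abstract does not reproduce the iteration. The sketch below shows the general shape of a proximal augmented-Lagrangian step for $\min_x \tfrac12 x^\top Qx+q^\top x$ subject to $Ax=b$, with an extra proximal/preconditioning matrix $P$ added to the $x$-subproblem. The choices of $P$ and $\beta$, and all names, are assumptions rather than the paper's PPADMM.

```python
import numpy as np

def proximal_al_qp(Q, q, A, b, beta=1.0, P=None, iters=500, tol=1e-10):
    """Sketch of a proximal augmented-Lagrangian iteration for
    min 0.5 x'Qx + q'x  s.t.  Ax = b."""
    n = Q.shape[0]
    if P is None:
        P = np.eye(n)                      # proximal / preconditioning matrix (assumed)
    x = np.zeros(n)
    lam = np.zeros(A.shape[0])             # multiplier for the equality constraint
    H = Q + beta * A.T @ A + P             # coefficient matrix of every x-update
    for _ in range(iters):
        # x-update: minimize the augmented Lagrangian plus 0.5*(x - x_k)' P (x - x_k)
        rhs = -q - A.T @ lam + beta * A.T @ b + P @ x
        x_new = np.linalg.solve(H, rhs)
        # multiplier update
        lam = lam + beta * (A @ x_new - b)
        if np.linalg.norm(x_new - x) < tol:
            x = x_new
            break
        x = x_new
    return x, lam
```

Because $H = Q + \beta A^\top A + P$ is the same in every iteration, its factorization can be computed once and reused, which is where a good choice of the preconditioning matrix pays off.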

