Fast Algorithms for LS and LAD-Collaborative Regression

Author(s): Jun Sun, Lingchen Kong, Mei Li

With the development of modern science and technology, it is easy to obtain a large number of high-dimensional datasets that are related yet distinct. Classical single-model analysis is unlikely to capture the potential links between the different datasets. Recently, a collaborative regression model based on the least squares (LS) method was proposed for this problem. In this paper, we propose a robust collaborative regression based on the least absolute deviation (LAD). We give statistical interpretations of the LS-collaborative regression and the LAD-collaborative regression. We then design an efficient symmetric Gauss–Seidel-based alternating direction method of multipliers algorithm to solve the two models; the algorithm enjoys global convergence and a Q-linear rate of convergence. Finally, we report numerical experiments illustrating the efficiency of the proposed methods.
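As a concrete illustration of the LAD building block only (not the authors' symmetric Gauss–Seidel ADMM for the collaborative model), the following minimal sketch solves plain LAD regression, min_beta ||X beta - y||_1, with a standard two-block ADMM using the splitting z = X beta - y. The function name lad_admm and the parameters rho and n_iter are illustrative choices, not from the paper.

```python
import numpy as np

def soft(v, t):
    """Componentwise soft-thresholding (prox of t*||.||_1)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def lad_admm(X, y, rho=1.0, n_iter=200):
    """ADMM for min_beta ||X beta - y||_1 via the splitting z = X beta - y."""
    n, p = X.shape
    beta = np.zeros(p)
    z = np.zeros(n)
    u = np.zeros(n)                      # scaled dual variable
    A = X.T @ X + 1e-8 * np.eye(p)       # tiny ridge for numerical safety
    for _ in range(n_iter):
        beta = np.linalg.solve(A, X.T @ (y + z - u))   # least-squares subproblem
        r = X @ beta - y
        z = soft(r + u, 1.0 / rho)                     # prox of the l1 term
        u = u + r - z                                  # dual ascent step
    return beta
```

The attraction of this splitting is that the beta-subproblem is an ordinary least-squares solve while the z-subproblem reduces to componentwise soft-thresholding, which is what makes ADMM-type schemes natural for LAD objectives.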

2013, Vol. 2013, pp. 1–11
Author(s): Si Wang, Ting-Zhu Huang, Xi-le Zhao, Jun Liu

A combined total variation and high-order total variation model is proposed to restore blurred images corrupted by impulse noise or by mixed Gaussian plus impulse noise. We solve the proposed model with an alternating direction method of multipliers (ADMM). Numerical experiments demonstrate the efficiency of the proposed method, whose performance is competitive with existing state-of-the-art methods.
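To make the ADMM splitting concrete, here is a minimal one-dimensional sketch of an L1-fidelity total variation model, min_x ||x - f||_1 + lam*||Dx||_1, the standard fidelity term for impulse noise. It omits the blur operator and the high-order TV term of the combined model; tv_l1_admm, rho, and n_iter are illustrative names.

```python
import numpy as np

def soft(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def tv_l1_admm(f, lam=1.0, rho=1.0, n_iter=300):
    """ADMM for min_x ||x - f||_1 + lam*||D x||_1 (1D TV, L1 data fidelity)."""
    n = f.size
    D = np.diff(np.eye(n), axis=0)        # forward-difference operator, (n-1) x n
    A = np.eye(n) + D.T @ D               # system matrix of the x-subproblem
    x = f.copy()
    z1 = np.zeros(n);     u1 = np.zeros(n)        # split z1 = x - f
    z2 = np.zeros(n - 1); u2 = np.zeros(n - 1)    # split z2 = D x
    for _ in range(n_iter):
        x = np.linalg.solve(A, (f + z1 - u1) + D.T @ (z2 - u2))
        z1 = soft(x - f + u1, 1.0 / rho)          # impulse-noise fidelity term
        z2 = soft(D @ x + u2, lam / rho)          # TV term
        u1 += x - f - z1
        u2 += D @ x - z2
    return x
```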


2014, Vol. 2014, pp. 1–23
Author(s): Liangtian He, Yilun Wang

We propose a new, effective algorithm for recovering a group-sparse signal from very limited observations or measured data. Since better reconstruction quality can be achieved by encoding structural information beyond sparsity, the commonly employed l2,1-regularization, which incorporates prior grouping information, performs better than plain l1-regularized models, as expected. In this paper we make further use of the prior grouping information, as well as other possible prior information, by considering a weighted l2,1 model. Specifically, we propose a multistage convex relaxation procedure that alternately estimates the weights and solves the resulting weighted problem. The weight-estimation step makes better use of the prior grouping information and is implemented via iterative support detection (Wang and Yin, 2010). Comprehensive numerical experiments show that our approach brings significant recovery improvements over the plain l2,1 model solved via the alternating direction method (ADM) (Deng et al., 2013), in both noiseless and noisy environments.
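A rough sketch of the two ingredients, under simplifying assumptions: block soft-thresholding (the proximal operator of the weighted l2,1 penalty) and a crude multistage loop that re-estimates the weights from the detected support. The inner solver below uses plain proximal gradient rather than the ADM of Deng et al. (2013), and weighted_l21, multistage, and thresh are illustrative names; groups is assumed to be a list of index arrays partitioning the coordinates.

```python
import numpy as np

def group_soft(v, t):
    """Block soft-thresholding: prox of t*||.||_2 on one group."""
    nrm = np.linalg.norm(v)
    return np.zeros_like(v) if nrm <= t else (1.0 - t / nrm) * v

def weighted_l21(A, b, groups, weights, lam=0.1, n_iter=500):
    """Proximal gradient for min 0.5*||A x - b||^2 + lam * sum_g w_g * ||x_g||_2."""
    x = np.zeros(A.shape[1])
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    for _ in range(n_iter):
        y = x - A.T @ (A @ x - b) / L      # gradient step
        for g, w in zip(groups, weights):
            x[g] = group_soft(y[g], lam * w / L)
    return x

def multistage(A, b, groups, lam=0.1, n_stage=3, thresh=1e-3):
    """Crude multistage loop: remove the penalty on groups detected as support."""
    weights = np.ones(len(groups))
    x = np.zeros(A.shape[1])
    for _ in range(n_stage):
        x = weighted_l21(A, b, groups, weights, lam)
        # Support detection: zero the weight of groups with nonnegligible norm.
        weights = np.array([0.0 if np.linalg.norm(x[g]) > thresh else 1.0
                            for g in groups])
    return x
```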


2020, Vol. 37 (04), pp. 2040010
Author(s): Qingsong Wang, Chengjing Wang, Peipei Tang, Dunbiao Niu

In this paper, we apply an inexact dual alternating direction method of multipliers (dADMM) to an image decomposition model. In this model, an image is split into two meaningful components: a cartoon part and a texture part. The dADMM yields not only the cartoon part and the texture part of an image but also the restored image (the cartoon plus texture part). We also establish the global convergence of the algorithm. Numerical experiments demonstrate the efficiency and robustness of the inexact dADMM. Furthermore, it achieves a higher signal-to-noise ratio (SNR) than other algorithms.
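The dual ADMM itself is not reproduced here; as a loosely related sketch, the snippet below performs a classical ROF-type cartoon–texture split by primal ADMM (anisotropic TV on the cartoon part, texture taken as the residual), with the TV subproblem solved by FFT under a periodic boundary assumption. The function rof_cartoon_texture and its parameters are illustrative and do not correspond to the paper's model or algorithm.

```python
import numpy as np

def rof_cartoon_texture(f, lam=0.1, rho=1.0, n_iter=100):
    """Primal ADMM for anisotropic ROF: f ~ u (cartoon) + (f - u) (texture).

    Assumes periodic boundary conditions so the u-subproblem is an FFT solve.
    """
    soft = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - t, 0.0)
    dx = lambda a: np.roll(a, -1, axis=1) - a      # forward difference, x
    dy = lambda a: np.roll(a, -1, axis=0) - a      # forward difference, y
    dxt = lambda a: np.roll(a, 1, axis=1) - a      # adjoint of dx (periodic)
    dyt = lambda a: np.roll(a, 1, axis=0) - a      # adjoint of dy (periodic)
    m, n = f.shape
    kx = np.zeros((m, n)); kx[0, 0] = -1.0; kx[0, 1] = 1.0
    ky = np.zeros((m, n)); ky[0, 0] = -1.0; ky[1, 0] = 1.0
    denom = 1.0 + rho * (np.abs(np.fft.fft2(kx)) ** 2 + np.abs(np.fft.fft2(ky)) ** 2)
    u = f.copy()
    zx = np.zeros_like(f); zy = np.zeros_like(f)   # splits zx = dx(u), zy = dy(u)
    bx = np.zeros_like(f); by = np.zeros_like(f)   # scaled dual variables
    for _ in range(n_iter):
        rhs = f + rho * (dxt(zx - bx) + dyt(zy - by))
        u = np.real(np.fft.ifft2(np.fft.fft2(rhs) / denom))
        gx, gy = dx(u), dy(u)
        zx = soft(gx + bx, lam / rho)
        zy = soft(gy + by, lam / rho)
        bx += gx - zx
        by += gy - zy
    return u, f - u                                 # cartoon, texture
```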


2021, Vol. 0 (0), pp. 0
Author(s): Xiao Ai, Guoxi Ni, Tieyong Zeng

In this paper, we propose a nonconvex regularization model for images degraded by Cauchy noise and blur. The model builds on the total variation approach of Sciacchitano, Dong, and Zeng [SIAM J. Imaging Sci. (2015)], which restores blurred images corrupted by Cauchy noise. Here we consider a nonconvex regularizer, namely a weighted difference of the l1-norm and the l2-norm coupled with a wavelet frame. The alternating direction method of multipliers is applied to the resulting minimization problem; we describe the algorithm in detail and prove its convergence. Numerical experiments with different levels of noise and blur show that our method denoises and deblurs images better.
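To illustrate how a weighted l1 - l2 difference can be handled, here is a minimal sketch that applies the difference-of-convex algorithm (DCA) to a generic sparse-recovery problem, min_x 0.5*||Ax - b||^2 + lam*(||x||_1 - alpha*||x||_2). It uses a Gaussian fidelity with no wavelet frame, and DCA with an ISTA inner solver rather than the paper's ADMM, so it should be read only as a sketch of the nonconvex penalty; all names are illustrative.

```python
import numpy as np

def soft(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def l1_minus_l2_dca(A, b, lam=0.1, alpha=1.0, n_outer=10, n_inner=200):
    """DCA for min_x 0.5*||Ax - b||^2 + lam*(||x||_1 - alpha*||x||_2).

    Each outer step linearizes the concave -alpha*||x||_2 term; the resulting
    convex l1 subproblem is solved by ISTA (proximal gradient).
    """
    n = A.shape[1]
    x = np.zeros(n)
    L = np.linalg.norm(A, 2) ** 2                 # Lipschitz constant of A^T A
    for _ in range(n_outer):
        nrm = np.linalg.norm(x)
        g = x / nrm if nrm > 0 else np.zeros(n)   # subgradient of ||x||_2 at x
        for _ in range(n_inner):
            grad = A.T @ (A @ x - b) - lam * alpha * g
            x = soft(x - grad / L, lam / L)
    return x
```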


2014, Vol. 26 (3), pp. 611–635
Author(s): Xinggang Wang, Zhengdong Zhang, Yi Ma, Xiang Bai, Wenyu Liu, ...

This letter examines the problem of robust subspace discovery from input data samples (instances) in the presence of overwhelming outliers and corruptions. A typical example is the case where we are given a set of images, each containing, say, a face at an unknown location and of an unknown size; our goal is to identify or detect the face in each image and simultaneously learn its model. We employ a simple generative subspace model and propose a new formulation that simultaneously infers the label information and learns the model using low-rank optimization. Solving this problem lets us simultaneously identify which instances belong to the subspace and learn the corresponding subspace model. We give an efficient and effective algorithm based on the alternating direction method of multipliers and provide extensive simulations and experiments to verify its effectiveness. The proposed scheme can also be used to tackle many related high-dimensional combinatorial selection problems.
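A closely related and standard low-rank formulation is robust PCA, min ||L||_* + lam*||S||_1 s.t. L + S = M, whose ADMM iterations alternate singular-value thresholding and elementwise soft-thresholding. The sketch below implements that generic model, not the instance-selection formulation of the letter; rpca_admm, rho, and the default lam = 1/sqrt(max(m, n)) are illustrative choices.

```python
import numpy as np

def svd_shrink(X, tau):
    """Singular-value soft-thresholding (prox of tau*||.||_*)."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def rpca_admm(M, lam=None, rho=1.0, n_iter=200):
    """ADMM for robust PCA: min ||L||_* + lam*||S||_1 s.t. L + S = M."""
    m, n = M.shape
    if lam is None:
        lam = 1.0 / np.sqrt(max(m, n))            # common default weighting
    soft = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - t, 0.0)
    L = np.zeros_like(M); S = np.zeros_like(M); Y = np.zeros_like(M)  # Y: scaled dual
    for _ in range(n_iter):
        L = svd_shrink(M - S + Y, 1.0 / rho)      # low-rank update
        S = soft(M - L + Y, lam / rho)            # sparse (outlier) update
        Y = Y + M - L - S                         # dual update on the residual
    return L, S
```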

