On the convergence rate of customized proximal point algorithm for convex optimization and saddle-point problem

2012 ◽  
Vol 42 (5) ◽  
pp. 515-525 ◽  
Author(s):  
BingSheng HE ◽  
Yuan SHEN

2013 ◽
Vol 2013 ◽  
pp. 1-10 ◽  
Author(s):  
Dali Chen ◽  
YangQuan Chen ◽  
Dingyu Xue

This paper proposes a fractional-order total variation image denoising algorithm based on the primal-dual method, which treats algorithm implementation, the ill-posed inverse problem, convergence rate, and the blocky effect in a more elegant and effective way. The fractional-order total variation model is introduced by generalizing the first-order model, and the corresponding saddle-point and dual formulations are constructed in theory. To guarantee an O(1/N^2) convergence rate, the primal-dual algorithm is used to solve the constructed saddle-point problem, and the final numerical procedure for image denoising is given. Finally, the experimental results demonstrate that the proposed method avoids the blocky effect, achieves state-of-the-art performance, and attains the O(1/N^2) convergence rate.
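
As a concrete illustration, below is a minimal sketch (in Python/NumPy, an assumption since the paper shows no code) of the first-order primal-dual (Chambolle-Pock) iteration for the standard first-order total variation model that the paper generalizes to fractional order. The step sizes, helper functions, and parameter names are illustrative, not the authors' exact procedure; the plain iteration attains O(1/N), and it is the accelerated variant with varying steps, applicable here because the quadratic data term is strongly convex, that attains the O(1/N^2) rate the abstract refers to.

import numpy as np

def grad(u):
    """Forward-difference gradient of a 2-D image (Neumann boundary)."""
    gx = np.zeros_like(u); gy = np.zeros_like(u)
    gx[:-1, :] = u[1:, :] - u[:-1, :]
    gy[:, :-1] = u[:, 1:] - u[:, :-1]
    return gx, gy

def div(px, py):
    """Divergence, the negative adjoint of grad above."""
    dx = np.zeros_like(px); dy = np.zeros_like(py)
    dx[0, :] = px[0, :]; dx[1:-1, :] = px[1:-1, :] - px[:-2, :]; dx[-1, :] = -px[-2, :]
    dy[:, 0] = py[:, 0]; dy[:, 1:-1] = py[:, 1:-1] - py[:, :-2]; dy[:, -1] = -py[:, -2]
    return dx + dy

def tv_denoise(f, lam=0.1, n_iter=200):
    """Chambolle-Pock iteration for min_u TV(u) + (lam/2)*||u - f||^2."""
    f = np.asarray(f, dtype=float)
    u = f.copy(); u_bar = f.copy()
    px = np.zeros_like(f); py = np.zeros_like(f)
    tau = sigma = 1.0 / np.sqrt(8.0)   # tau*sigma*L^2 <= 1, with L^2 <= 8 for grad
    for _ in range(n_iter):
        # Dual ascent step: project the dual variable onto the pointwise unit ball
        gx, gy = grad(u_bar)
        px += sigma * gx; py += sigma * gy
        norm = np.maximum(1.0, np.sqrt(px**2 + py**2))
        px /= norm; py /= norm
        # Primal descent step: proximal map of the quadratic data term
        u_old = u
        u = (u + tau * div(px, py) + tau * lam * f) / (1.0 + tau * lam)
        # Extrapolation (theta = 1); varying tau, sigma, theta gives the
        # accelerated O(1/N^2) variant for the strongly convex data term
        u_bar = 2.0 * u - u_old
    return u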


PAMM ◽  
2021 ◽  
Vol 20 (1) ◽  
Author(s):  
Ritukesh Bharali ◽  
Fredrik Larsson ◽  
Ralf Jänicke

2020 ◽  
Vol 60 (11) ◽  
pp. 1787-1809 ◽
Author(s):  
M. S. Alkousa ◽  
A. V. Gasnikov ◽  
D. M. Dvinskikh ◽  
D. A. Kovalev ◽  
F. S. Stonyakin

2013 ◽  
Vol 46 (3) ◽  
Author(s):  
Alicja Smoktunowicz ◽  
Felicja Okulicka-Dłużewska

The numerical stability of two main direct methods for solving the symmetric saddle-point problem is analyzed. The first is a generalization of Golub's method for the augmented system formulation (ASF) and uses the Householder QR decomposition. The second method is supported by the singular value decomposition (SVD). A numerical comparison of several direct methods is given.
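
For orientation, below is a hedged Python/NumPy sketch of the two solution routes for the symmetric saddle-point system [[A, B], [B^T, 0]] [x; y] = [f; g] (with B of size n x m, n > m): a null-space style solve built on a Householder QR factorization of B, in the spirit of Golub's method for the augmented system formulation, and a solve supported by the SVD of the full saddle-point matrix. The function names and the exact algebraic arrangement are assumptions for illustration, not the paper's algorithms.

import numpy as np

def saddle_qr(A, B, f, g):
    """Null-space style direct solve via Householder QR of B.
    Assumes B has full column rank and A is SPD on null(B^T)."""
    n, m = B.shape
    Q, R = np.linalg.qr(B, mode="complete")   # B = Q [R; 0], Q is n x n orthogonal
    Y, Z = Q[:, :m], Q[:, m:]                 # bases of range(B) and null(B^T)
    R = R[:m, :]                              # upper-triangular m x m block
    x_Y = np.linalg.solve(R.T, g)             # B^T x = g fixes the range component
    rhs = Z.T @ (f - A @ (Y @ x_Y))
    x_Z = np.linalg.solve(Z.T @ A @ Z, rhs)   # reduced system on the null space
    x = Y @ x_Y + Z @ x_Z
    y = np.linalg.solve(R, Y.T @ (f - A @ x)) # recover the multiplier y
    return x, y

def saddle_svd(A, B, f, g):
    """SVD-supported direct solve: invert the assembled symmetric
    indefinite matrix K through its full singular value decomposition."""
    n, m = B.shape
    K = np.block([[A, B], [B.T, np.zeros((m, m))]])
    U, s, Vt = np.linalg.svd(K)               # K = U diag(s) V^T, K nonsingular
    z = Vt.T @ ((U.T @ np.concatenate([f, g])) / s)
    return z[:n], z[n:]

The QR route never forms the indefinite matrix K and works with well-conditioned orthogonal transformations, which is the usual motivation for Golub-style methods; the SVD route is more expensive but gives direct access to the conditioning of K.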


Acta Numerica ◽  
2013 ◽  
Vol 22 ◽  
pp. 509-575 ◽  
Author(s):  
Yurii Nesterov ◽  
Arkadi Nemirovski

In the past decade, problems related to ℓ1/nuclear norm minimization have attracted much attention in the signal processing, machine learning and optimization communities. In this paper, devoted to ℓ1/nuclear norm minimization as 'optimization beasts', we give a detailed description of two attractive first-order optimization techniques for solving problems of this type. The first one, aimed primarily at lasso-type problems, comprises fast gradient methods applied to composite minimization formulations. The second approach, aimed at Dantzig-selector-type problems, utilizes saddle-point first-order algorithms and a reformulation of the problem of interest as a generalized bilinear saddle-point problem. For both approaches, we give complete and detailed complexity analyses and discuss the application domains.
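
As a pointer to the first approach, below is a minimal Python/NumPy sketch of a fast (Nesterov-accelerated) gradient method applied to the composite lasso formulation min_x 0.5*||Ax - b||^2 + lam*||x||_1, the kind of scheme the survey analyzes. The function names, iteration count, and step-size rule are illustrative assumptions, not the survey's exact algorithm.

import numpy as np

def soft_threshold(v, t):
    """Proximal map of t * ||.||_1 (coordinatewise shrinkage)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def fista_lasso(A, b, lam, n_iter=500):
    """Fast gradient (FISTA-type) method for the composite problem
    min_x 0.5 * ||A x - b||^2 + lam * ||x||_1."""
    L = np.linalg.norm(A, 2) ** 2      # Lipschitz constant of the smooth gradient
    x = np.zeros(A.shape[1]); z = x.copy(); t = 1.0
    for _ in range(n_iter):
        x_old = x
        # Gradient step on the smooth term, then the l1 proximal step
        x = soft_threshold(z - (A.T @ (A @ z - b)) / L, lam / L)
        # Nesterov momentum; yields the O(1/N^2) objective-error rate
        t_old = t
        t = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t_old ** 2))
        z = x + ((t_old - 1.0) / t) * (x - x_old)
    return x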

