Convex Minimization
Recently Published Documents


TOTAL DOCUMENTS: 313 (FIVE YEARS: 68)
H-INDEX: 31 (FIVE YEARS: 3)

Author(s):  
Yumin Ma ◽  
Ting Li ◽  
Yongzhong Song ◽  
Xingju Cai

In this paper, we consider nonseparable convex minimization models with quadratic coupling terms that arise in many practical applications. We use a majorized indefinite proximal alternating direction method of multipliers (iPADMM) to solve this model. The indefiniteness of the proximal matrices allows the function actually minimized in each subproblem to no longer be a majorization of the original function, yet convergence can still be guaranteed, and a larger step size is permitted, which can speed up convergence. For this model, we analyze the global convergence of the majorized iPADMM with two different techniques, as well as its sublinear convergence rate in the nonergodic sense. Numerical experiments illustrate the advantages of indefinite proximal matrices over positive definite or positive semidefinite ones.
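
To make the mechanics concrete, the following Python sketch applies proximal ADMM with an indefinite proximal term (P = τI with τ < 0) to a toy equality-constrained quadratic problem. The problem instance, parameter values, and variable names are our own illustrative assumptions, not the authors' setup.

```python
import numpy as np

# Toy instance:  min 0.5||x||^2 + 0.5||y||^2  s.t.  x + y = b,
# solved with a proximal ADMM whose proximal matrix P = tau*I is
# indefinite (tau < 0), loosely mimicking the iPADMM idea above.
# All parameter choices here are illustrative assumptions.
n = 5
rng = np.random.default_rng(0)
b = rng.standard_normal(n)

beta = 1.0                    # penalty parameter
tau = -0.3                    # tau < 0 makes P = tau*I indefinite
x = np.zeros(n); y = np.zeros(n); lam = np.zeros(n)

for k in range(500):
    # x-step: argmin 0.5||x||^2 + (beta/2)||x + y - b + lam/beta||^2
    #                           + (tau/2)||x - x_old||^2  (closed form)
    x = (beta * (b - y) - lam + tau * x) / (1.0 + beta + tau)
    # y-step: same structure, using the freshly updated x
    y = (beta * (b - x) - lam + tau * y) / (1.0 + beta + tau)
    # dual ascent on the multiplier of the coupling constraint
    lam += beta * (x + y - b)

print(np.linalg.norm(x + y - b))    # feasibility residual, should be ~0
print(np.linalg.norm(x - b / 2))    # distance to the solution x = y = b/2
```

Note that 1 + β + τ stays positive, so each subproblem remains well posed even though the proximal term alone is concave.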


Mathematics ◽  
2021 ◽  
Vol 9 (20) ◽  
pp. 2619
Author(s):  
Panadda Thongpaen ◽  
Rattanakorn Wattanataweekul

In this paper, we introduce a new iterative method using an inertial technique for approximating a common fixed point of an infinite family of nonexpansive mappings in a Hilbert space. A weak convergence theorem for the proposed method is established under suitable conditions. Furthermore, we apply our main results to solve convex minimization problems and image restoration problems.
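
As a minimal illustration of the inertial idea (for a single nonexpansive mapping rather than the infinite family treated in the paper), the Python sketch below runs an inertial Mann-type iteration; the mapping T, the extrapolation schedule, and all names are assumptions made for the example.

```python
import numpy as np

# T(x) = x - alpha * grad f(x) with f(x) = 0.5||Ax - c||^2 is nonexpansive
# for 0 < alpha <= 2/L, and its fixed points are the minimizers of f.
rng = np.random.default_rng(1)
A = rng.standard_normal((20, 5))
c = rng.standard_normal(20)
L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of grad f
alpha = 1.0 / L

def T(x):
    return x - alpha * A.T @ (A @ x - c)

x_prev = np.zeros(5)
x = np.zeros(5)
for n in range(1, 500):
    theta = (n - 1) / (n + 2)            # inertial extrapolation weight
    w = x + theta * (x - x_prev)         # inertial step
    x_prev, x = x, 0.5 * w + 0.5 * T(w)  # Mann averaging step

print(np.linalg.norm(A.T @ (A @ x - c)))  # gradient norm, should be ~0
```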


Author(s):  
Quoc Tran-Dinh ◽  
Ling Liang ◽  
Kim-Chuan Toh

This paper suggests two novel ideas for developing new proximal variable-metric methods to solve a class of composite convex optimization problems. The first idea is to utilize a new parameterization strategy of the optimality condition to design a class of homotopy proximal variable-metric algorithms that achieve linear convergence and finite global iteration-complexity bounds. We identify at least three subclasses of convex problems to which our approach applies to achieve linear convergence rates. The second idea is a new primal-dual-primal framework for implementing proximal Newton methods that has attractive computational features for a subclass of nonsmooth composite convex minimization problems. We specialize the proposed algorithm to a covariance estimation problem in order to demonstrate its computational advantages. Numerical experiments on four concrete applications illustrate the theoretical and computational advances of the new methods compared with other state-of-the-art algorithms.
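
One concrete flavor of a proximal variable-metric step is sketched below in Python for the composite problem min f(x) + μ‖x‖₁, using a diagonal metric H ⪰ ∇²f built from a Gershgorin bound so that the scaled prox reduces to coordinate-wise soft-thresholding. This is an illustrative simplification under our own assumptions, not the paper's homotopy parameterization or its primal-dual-primal framework.

```python
import numpy as np

# min_x 0.5||Ax - c||^2 + mu*||x||_1 with a diagonal variable metric H.
# H is a Gershgorin-type majorizer: diag(H) >= A^T A, so each scaled
# proximal step decreases the objective. Names/parameters are assumptions.
rng = np.random.default_rng(2)
A = rng.standard_normal((30, 10))
c = rng.standard_normal(30)
mu = 0.1

H = np.abs(A.T @ A).sum(axis=1)      # diagonal majorizer of the Hessian
x = np.zeros(10)
for k in range(300):
    grad = A.T @ (A @ x - c)
    z = x - grad / H                 # metric-scaled gradient step
    # scaled prox of mu*||.||_1: soft-thresholding at mu/H_j per coordinate
    x = np.sign(z) * np.maximum(np.abs(z) - mu / H, 0.0)

print(0.5 * np.sum((A @ x - c) ** 2) + mu * np.sum(np.abs(x)))  # objective
```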


2021 ◽  
Vol 1 (1) ◽  
pp. 19-33
Author(s):  
Sang B Mendy ◽  
John T Mendy ◽  
Alieu Jobe

The generalized viscosity implicit rules for asymptotically nonexpansive mappings in Hilbert spaces are considered. Strong convergence theorems for the rules are proved under certain assumptions imposed on the parameter sequences. An application to the convex minimization problem is considered. The results presented in this paper improve and extend some recent corresponding results in the literature.
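
For intuition, here is a minimal Python sketch of one common form of such rules, the viscosity implicit midpoint rule x_{n+1} = a_n f(x_n) + (1 - a_n) T((x_n + x_{n+1})/2), written for a plain nonexpansive mapping rather than the asymptotically nonexpansive setting of the paper; the maps, the inner solve, and the parameter schedule are illustrative assumptions.

```python
import numpy as np

# Viscosity implicit midpoint rule, with the implicit equation solved by
# an inner fixed-point loop. T is nonexpansive with Fix(T) = {0}, and f
# is a contraction serving as the "viscosity" term; all choices assumed.
rng = np.random.default_rng(3)
Q = rng.standard_normal((5, 5))
Q = Q @ Q.T
Q /= (1.1 * np.linalg.norm(Q, 2))   # spectral norm < 1: T is nonexpansive

def T(x):
    return Q @ x

def f(x):
    return 0.5 * x                  # 0.5-contraction

x = rng.standard_normal(5)
for n in range(1, 200):
    a = 1.0 / (n + 1)               # a_n -> 0, sum of a_n diverges
    z = x.copy()
    for _ in range(50):             # inner solve of the implicit step
        z = a * f(x) + (1 - a) * T(0.5 * (x + z))
    x = z

print(np.linalg.norm(x))            # approaches the fixed point 0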


2021 ◽  
Vol 2021 (1) ◽  
Author(s):  
Parin Chaipunya ◽  
Konrawut Khammahawong ◽  
Poom Kumam

The main purpose of this paper is to introduce a new iterative algorithm to solve inclusion problems in Hadamard manifolds. Moreover, applications to convex minimization problems and variational inequality problems are studied. A numerical example is also presented to support our main theorem.
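
Euclidean space is the simplest Hadamard manifold, and there the inclusion 0 ∈ A(x) + B(x) can be attacked by forward-backward splitting, x_{k+1} = J_{λA}(x_k - λB(x_k)). The Python sketch below shows only this Euclidean special case (with A = ∂(μ‖·‖₁), whose resolvent is soft-thresholding); the paper's manifold algorithm, which we do not reproduce, would replace these steps with exponential maps and resolvents on the manifold.

```python
import numpy as np

# Euclidean special case of 0 in A(x) + B(x):
#   A = subdifferential of mu*||.||_1 (resolvent J_A = soft-thresholding),
#   B(x) = x - c                      (monotone and 1-cocoercive).
# Forward-backward: x_{k+1} = J_A(x_k - lam*B(x_k)), with lam in (0, 2).
rng = np.random.default_rng(4)
c = rng.standard_normal(8)
mu, lam = 0.2, 1.0

def B(x):                    # single-valued monotone part
    return x - c

def J_A(x, t):               # resolvent of t*A, i.e. the l1 prox
    return np.sign(x) * np.maximum(np.abs(x) - t * mu, 0.0)

x = np.zeros(8)
for k in range(100):
    x = J_A(x - lam * B(x), lam)

# the inclusion's solution is soft-thresholding of c; check the gap
print(np.max(np.abs(x - np.sign(c) * np.maximum(np.abs(c) - mu, 0.0))))
```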


2021 ◽  
Vol 2021 (1) ◽  
Author(s):  
Panitarn Sarnmeta ◽  
Warunun Inthakon ◽  
Dawan Chumpungam ◽  
Suthep Suantai

In this work, we introduce a new accelerated algorithm using a linesearch technique for solving convex minimization problems in the form of a sum of two lower semicontinuous convex functions. Weak convergence of the proposed algorithm is established without assuming Lipschitz continuity of the gradient of the objective function. Moreover, the complexity of this algorithm is also analyzed. Some numerical experiments in machine learning, namely regression and classification problems, are also discussed. Furthermore, in our experiments we evaluate the convergence behavior of this new algorithm and compare it with various algorithms from the literature. It is found that our algorithm performs better than the others.
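
As a rough analogue, the Python sketch below runs a FISTA-style accelerated forward-backward method with a backtracking linesearch, which likewise avoids requiring a known Lipschitz constant for the gradient; it is a generic textbook scheme under our own assumptions, not the authors' algorithm.

```python
import numpy as np

# Composite problem: min_x F(x) = 0.5||Ax - c||^2 + mu*||x||_1.
rng = np.random.default_rng(5)
A = rng.standard_normal((40, 15))
c = rng.standard_normal(40)
mu = 0.05

f = lambda x: 0.5 * np.sum((A @ x - c) ** 2)
grad = lambda x: A.T @ (A @ x - c)
prox = lambda x, t: np.sign(x) * np.maximum(np.abs(x) - t * mu, 0.0)

x = np.zeros(15)
y = x.copy()
t, s = 1.0, 1.0
for k in range(200):
    g = grad(y)
    # backtrack until the quadratic upper-bound (majorization) test holds,
    # so no global Lipschitz constant is ever needed
    while True:
        z = prox(y - t * g, t)
        if f(z) <= f(y) + g @ (z - y) + np.sum((z - y) ** 2) / (2 * t):
            break
        t *= 0.5
    s_new = (1 + np.sqrt(1 + 4 * s * s)) / 2
    y = z + ((s - 1) / s_new) * (z - x)   # inertial (acceleration) step
    x, s = z, s_new

print(f(x) + mu * np.sum(np.abs(x)))      # final composite objective value
```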

