Convex Minimization Problems
Recently Published Documents


TOTAL DOCUMENTS: 105 (FIVE YEARS: 33)

H-INDEX: 16 (FIVE YEARS: 3)

Author(s): Wanna Sriprad, Somnuk Srisawat

The purpose of this paper is to study the convergence of an intermixed algorithm for finding a common element of the set of solutions of a split monotone variational inclusion problem (SMIV) and the solution sets of a finite family of variational inequality problems. Under suitable assumptions, a strong convergence theorem is proved in the framework of a real Hilbert space. In addition, using our result, we obtain some further results involving split convex minimization problems (SCMPs) and split feasibility problems (SFPs). We also give some numerical examples supporting our main theorem.
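For concreteness, the split feasibility problem (SFP) mentioned above asks for a point x in C with Ax in Q. Below is a minimal NumPy-style sketch of Byrne's classical CQ iteration for this setting, not the paper's intermixed algorithm; proj_C, proj_Q, and the step size gamma (taken in (0, 2/||A||^2)) are assumed to be supplied by the user.

```python
def cq_algorithm(A, proj_C, proj_Q, x0, gamma, n_iter=500):
    """Byrne's CQ iteration for the split feasibility problem:
    find x in C with A @ x in Q.  proj_C and proj_Q are the metric
    projections onto C and Q (problem-specific, assumed given);
    all arrays are NumPy arrays."""
    x = x0.copy()
    for _ in range(n_iter):
        Ax = A @ x
        # gradient step on 0.5*||(I - P_Q)(A x)||^2, then project back onto C
        x = proj_C(x - gamma * A.T @ (Ax - proj_Q(Ax)))
    return x
```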


Author(s): Pham Quy Muoi Pham

In [1], Nesterov introduced an optimal algorithm with the constant step-size 1/L, where L is the Lipschitz constant of the gradient of the objective function. The algorithm is proved to converge at the optimal rate O(1/k^2). In this paper, we propose a new algorithm that allows nonconstant step-sizes. We prove the convergence and the convergence rate of the new algorithm: it achieves the same O(1/k^2) rate as the original one. The advantage of our algorithm is that the nonconstant step-sizes give more freedom of choice while the convergence rate remains optimal, so it generalizes Nesterov's algorithm. We apply the new algorithm to the problem of finding an approximate solution to an integral equation.
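The paper's specific nonconstant step-size rule is not given in the abstract; for reference, here is a short sketch of the constant-step scheme it generalizes, Nesterov's accelerated gradient method with step-size 1/L, assuming a smooth convex objective with gradient grad_f and Lipschitz constant L supplied by the caller.

```python
import numpy as np

def nesterov_agd(grad_f, x0, L, n_iter=1000):
    """Nesterov's accelerated gradient method with constant step-size
    1/L; attains the optimal O(1/k^2) rate on smooth convex objectives."""
    x = y = x0.copy()
    t = 1.0
    for _ in range(n_iter):
        x_new = y - grad_f(y) / L                        # gradient step at the extrapolated point
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)    # momentum extrapolation
        x, t = x_new, t_new
    return x
```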


Mathematics, 2021, Vol. 9 (20), pp. 2619
Author(s): Panadda Thongpaen, Rattanakorn Wattanataweekul

In this paper, we introduce a new iterative method using an inertial technique for approximating a common fixed point of an infinite family of nonexpansive mappings in a Hilbert space. A weak convergence theorem for the proposed method is established under suitable conditions. Furthermore, we apply our main results to solve convex minimization problems and image restoration problems.
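As a rough illustration of the inertial technique (not the authors' scheme for an infinite family of mappings), a single nonexpansive mapping T can be iterated with an inertial Mann-type step; the parameters theta and alpha below are illustrative choices.

```python
def inertial_mann(T, x0, theta=0.3, alpha=0.5, n_iter=1000):
    """Inertial Mann-type iteration for a nonexpansive mapping T:
    w_k = x_k + theta*(x_k - x_{k-1})         (inertial extrapolation)
    x_{k+1} = (1 - alpha)*w_k + alpha*T(w_k)  (averaged step).
    Arrays are NumPy arrays."""
    x_prev = x0.copy()
    x = x0.copy()
    for _ in range(n_iter):
        w = x + theta * (x - x_prev)   # inertial step reusing the previous iterate
        x_prev, x = x, (1.0 - alpha) * w + alpha * T(w)
    return x
```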


Author(s): Thierno M. M. Sow

In this paper, a new iterative method for solving convex minimization problems over the set of common fixed points of quasi-nonexpansive and demicontractive mappings is constructed. Convergence theorems are proved in Hilbert spaces without any compactness assumption. As an application, we utilize our results to solve quadratic optimization problems involving a bounded linear operator. Our theorems significantly improve several important recent results.


Author(s): Quoc Tran-Dinh, Ling Liang, Kim-Chuan Toh

This paper suggests two novel ideas for developing new proximal variable-metric methods for solving a class of composite convex optimization problems. The first idea is a new parameterization strategy of the optimality condition, used to design a class of homotopy proximal variable-metric algorithms that achieve linear convergence and finite global iteration-complexity bounds. We identify at least three subclasses of convex problems in which our approach achieves linear convergence rates. The second idea is a new primal-dual-primal framework for implementing proximal Newton methods that has attractive computational features for a subclass of nonsmooth composite convex minimization problems. We specialize the proposed algorithm to a covariance estimation problem to demonstrate its computational advantages. Numerical experiments on four concrete applications illustrate the theoretical and computational advances of the new methods compared with other state-of-the-art algorithms.
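The homotopy and primal-dual-primal machinery is beyond an abstract-length sketch, but the shape of a variable-metric proximal gradient step for a composite problem min f(x) + lam*||x||_1 (the l1 term being the prox-friendly piece, as in sparse covariance estimation) can be illustrated with a scalar Barzilai-Borwein step standing in for the metric; this stand-in is an assumption for illustration, not the authors' update.

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau*||.||_1 (elementwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def vm_prox_grad(grad_f, x0, lam, n_iter=500):
    """Variable-metric proximal gradient sketch: a scalar
    Barzilai-Borwein step alpha_k plays the role of the metric."""
    x = x0.copy()
    g = grad_f(x)
    alpha = 1.0                                   # initial step size (illustrative)
    for _ in range(n_iter):
        x_new = soft_threshold(x - alpha * g, alpha * lam)
        g_new = grad_f(x_new)
        s, y = x_new - x, g_new - g
        sy = s @ y
        if sy > 1e-12:                            # BB1 step; guard against division by ~0
            alpha = (s @ s) / sy
        x, g = x_new, g_new
    return x
```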


2021, Vol. 2021 (1)
Author(s): Parin Chaipunya, Konrawut Khammahawong, Poom Kumam

The main purpose of this paper is to introduce a new iterative algorithm to solve inclusion problems in Hadamard manifolds. Moreover, applications to convex minimization problems and variational inequality problems are studied. A numerical example is also presented to support our main theorem.


2021, Vol. 2021 (1)
Author(s): Panitarn Sarnmeta, Warunun Inthakon, Dawan Chumpungam, Suthep Suantai

In this work, we introduce a new accelerated algorithm using a linesearch technique for solving convex minimization problems in the form of a sum of two lower semicontinuous convex functions. Weak convergence of the proposed algorithm is established without assuming Lipschitz continuity of the gradient of the objective function. Moreover, the complexity of this algorithm is also analyzed. Some numerical experiments in machine learning, namely regression and classification problems, are also discussed. Furthermore, in our experiments, we evaluate the convergence behavior of this new algorithm and compare it with various algorithms from the literature; our algorithm is found to perform better than the others.
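The key feature, dropping Lipschitz continuity of the gradient, is what the linesearch buys: the step size is found by backtracking instead of being fixed from a (possibly unknown) Lipschitz constant. A minimal sketch of forward-backward splitting with such a backtracking linesearch follows; prox_g, eta, and step0 are illustrative names, not the authors' linesearch rule.

```python
def fb_linesearch(f, grad_f, prox_g, x0, eta=0.5, step0=1.0, n_iter=300):
    """Forward-backward splitting for min f(x) + g(x) with backtracking:
    prox_g(v, s) should return prox_{s*g}(v).  No Lipschitz constant
    of grad_f is required; arrays are NumPy arrays."""
    x = x0.copy()
    s = step0
    for _ in range(n_iter):
        g = grad_f(x)
        fx = f(x)
        while True:
            z = prox_g(x - s * g, s)
            d = z - x
            # sufficient decrease: f(z) <= f(x) + <g, d> + ||d||^2 / (2s)
            if f(z) <= fx + g @ d + (d @ d) / (2.0 * s):
                break
            s *= eta                    # shrink the trial step and retry
        x = z
    return x
```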


2021, Vol. 37 (3), pp. 449-461
Author(s): Pachara Jailoka, Suthep Suantai, Adisak Hanjing, ...

The purpose of this paper is to devise an accelerated algorithm for the convex minimization problem that can be applied to the image restoration problem. Theoretically, we first introduce an algorithm based on the viscosity approximation method with an inertial technique for finding a common fixed point of a countable family of nonexpansive operators. Under suitable assumptions, a strong convergence theorem for the proposed algorithm is established. Subsequently, we utilize the proposed algorithm to solve a convex minimization problem of the sum of two convex functions. As an application, we apply and analyze our algorithm on image restoration problems. Moreover, we compare the convergence behavior and efficiency of our algorithm with well-known methods such as the forward-backward splitting algorithm and the fast iterative shrinkage-thresholding algorithm (FISTA). Using image quality metrics, numerical experiments show that our algorithm is more efficient than the aforementioned methods.
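For reference, the FISTA baseline the authors compare against can be sketched for min f(x) + lam*||x||_1, e.g. f(x) = 0.5*||Ax - b||^2 in image restoration, with L a Lipschitz constant of grad_f (L = ||A||^2 in that case); this is the standard method, not the paper's accelerated variant.

```python
import numpy as np

def soft_threshold(v, tau):
    """Prox of tau*||.||_1: elementwise soft-thresholding."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def fista(grad_f, x0, L, lam, n_iter=300):
    """FISTA (Beck & Teboulle) for min f(x) + lam*||x||_1."""
    x = y = x0.copy()
    t = 1.0
    for _ in range(n_iter):
        x_new = soft_threshold(y - grad_f(y) / L, lam / L)  # proximal-gradient step
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)       # Nesterov momentum
        x, t = x_new, t_new
    return x
```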

