AN INTERIOR POINT APPROACH FOR SEMIDEFINITE OPTIMIZATION USING NEW PROXIMITY FUNCTIONS

2009 ◽  
Vol 26 (03) ◽  
pp. 365-382 ◽  
Author(s):  
M. REZA PEYGHAMI

Kernel functions play an important role in interior point methods (IPMs) for solving linear optimization (LO) problems, where they are used to define new search directions. In this paper, we consider primal-dual algorithms for solving semidefinite optimization (SDO) problems based on a new class of kernel functions defined on the positive definite cone S^n_{++}. Using some appealing and mild conditions on the new class, we prove with a simple analysis that large-update primal-dual IPMs based on this class enjoy an [Formula: see text] iteration bound for solving SDO problems with a special choice of the parameters of the class.
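For orientation only (this is the standard kernel-function framework for SDO, not the specific class introduced in the paper), the centrality of an iterate is typically measured by applying a univariate kernel ψ to the eigenvalues of the Nesterov–Todd scaled variable V on the positive definite cone:

\[
\Psi(V) \;=\; \mathrm{Tr}\big(\psi(V)\big) \;=\; \sum_{i=1}^{n} \psi\big(\lambda_i(V)\big),
\]

and the new search direction is obtained by applying Newton's method to this barrier in the scaled space.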

2020 ◽  
Vol 28 (1) ◽  
pp. 27-41
Author(s):  
Benhadid Ayache ◽  
Saoudi Khaled

In this paper, we propose a large-update primal-dual interior point algorithm for linear optimization. The method is based on a new class of kernel functions that differs from existing kernel functions in that it has a double barrier term. Our analysis yields the best known iteration bound, O(√n log(n) log(n/ε)), for large-update algorithms with a special choice of the parameter m, and thus improves the iteration bound obtained in Bai et al. [2] for large-update algorithms.
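As a point of reference (this is the classical kernel, not the double-barrier kernel proposed in the paper), the logarithmic kernel and the proximity measure it induces for linear optimization are

\[
\psi_{\log}(t) \;=\; \frac{t^{2}-1}{2} \;-\; \log t,
\qquad
\Psi(v) \;=\; \sum_{i=1}^{n} \psi(v_i),
\qquad
v_i \;=\; \sqrt{\frac{x_i s_i}{\mu}},
\]

for which large-update methods only achieve an O(n log(n/ε)) bound; kernels with stronger barrier terms are what bring this down to O(√n log(n) log(n/ε)).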


2016 ◽  
Vol 09 (03) ◽  
pp. 1650059 ◽  
Author(s):  
Behrouz Kheirfam

In this paper, an improved and modified version of the full Nesterov–Todd step infeasible interior-point methods for symmetric optimization published in [A new infeasible interior-point method based on Darvay’s technique for symmetric optimization, Ann. Oper. Res. 211(1) (2013) 209–224; G. Gu, M. Zangiabadi and C. Roos, Full Nesterov–Todd step infeasible interior-point method for symmetric optimization, European J. Oper. Res. 214(3) (2011) 473–484; Simplified analysis of a full Nesterov–Todd step infeasible interior-point method for symmetric optimization, Asian-Eur. J. Math. 8(4) (2015) 1550071, 14 pp.] is considered. Each main iteration of our algorithm consists of only a feasibility step, whereas in the earlier versions each iteration is composed of one feasibility step and several (at most three) centering steps. The algorithm finds an ε-solution of the underlying problem in polynomial time, and its iteration bound improves the factor in the earlier bounds from [Formula: see text] and [Formula: see text] to [Formula: see text]. Moreover, our method unifies the analysis for linear optimization, second-order cone optimization and semidefinite optimization.
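For context (a standard construction in full NT-step methods, not something specific to this paper), the Nesterov–Todd scaling that defines the scaled variable in the SDO case is

\[
P := X^{1/2}\big(X^{1/2} S X^{1/2}\big)^{-1/2} X^{1/2},
\qquad
D := P^{1/2},
\qquad
V := \frac{1}{\sqrt{\mu}}\, D^{-1} X D^{-1} \;=\; \frac{1}{\sqrt{\mu}}\, D S D,
\]

and a full NT step means taking the unit step along the Newton direction computed in this scaled space, with no line search or damping.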


2014 ◽  
Vol 2014 ◽  
pp. 1-11 ◽  
Author(s):  
X. Z. Cai ◽  
G. Q. Wang ◽  
M. El Ghami ◽  
Y. J. Yue

We introduce a new parametric kernel function, which is a combination of the classic kernel function and a trigonometric barrier term, and present various properties of this new kernel function. A class of large- and small-update primal-dual interior-point methods for linear optimization based on this parametric kernel function is proposed. By utilizing the features of the parametric kernel function, we derive the iteration bounds for large-update methods, O(n^{2/3} log(n/ε)), and for small-update methods, O(√n log(n/ε)). These results match the currently best known iteration bounds for large- and small-update methods based on trigonometric kernel functions.
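As a rough numerical illustration of how the two bounds scale (constants and problem data are ignored, and this is not code from the paper), one can compare the two iteration-bound expressions directly:

import math

def large_update_bound(n, eps):
    # O(n^(2/3) * log(n / eps)), constant factor omitted
    return n ** (2.0 / 3.0) * math.log(n / eps)

def small_update_bound(n, eps):
    # O(sqrt(n) * log(n / eps)), constant factor omitted
    return math.sqrt(n) * math.log(n / eps)

for n in (10**2, 10**4, 10**6):
    print(n, round(large_update_bound(n, 1e-6)), round(small_update_bound(n, 1e-6)))

For large n the n^{2/3} factor dominates, which is why large-update methods carry the weaker theoretical bound despite being faster in practice.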


Filomat ◽  
2020 ◽  
Vol 34 (12) ◽  
pp. 3957-3969
Author(s):  
Imene Touil ◽  
Wided Chikouche

In this paper, we propose the first hyperbolic-logarithmic kernel function for semidefinite programming problems. Using simple analysis tools, we establish several properties of this kernel function and use them to compute the total number of iterations. We show that the worst-case iteration complexity of our algorithm for large-update methods improves the iteration bounds obtained with the hyperbolic kernel function of [24] as well as with classical kernel functions. For small-update methods, we derive the best known iteration bound.
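For reference (these are the standard benchmark bounds in the kernel-function literature, not results specific to this paper), the best known iteration bounds that large- and small-update methods are compared against are

\[
O\!\left(\sqrt{n}\,\log n\,\log\frac{n}{\epsilon}\right) \ \text{(large update)},
\qquad
O\!\left(\sqrt{n}\,\log\frac{n}{\epsilon}\right) \ \text{(small update)}.
\]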


2007 ◽  
Vol 49 (2) ◽  
pp. 259-270 ◽  
Author(s):  
Keyvan Amini ◽  
Arash Haseli

Interior-point methods (IPMs) are not only very effective in practice for solving linear optimization problems but also have polynomial-time complexity. Despite the practical efficiency of large-update algorithms, from a theoretical point of view these algorithms have a weaker iteration bound than small-update algorithms. In fact, there is a significant gap between theory and practice for large-update algorithms. By introducing self-regular barrier functions, Peng, Roos and Terlaky narrowed this gap to a factor of log n. However, verifying that a function is self-regular is not simple, and the proofs of theorems involving these functions are very complicated. Roos et al., by presenting a new class of barrier functions which are not necessarily self-regular, achieved very good results through much simpler theorems. In this paper we introduce a new kernel function in this class which yields the best known complexity bound, both for large-update and small-update methods.
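All of the kernel-based algorithms surveyed above share the same outer/inner loop structure; the sketch below is a generic schematic under that assumption, with hypothetical callbacks proximity and newton_step standing in for the kernel-specific parts (it also omits the dual variable y and the handling of the linear constraints):

import numpy as np

def large_update_ipm(x, s, mu, proximity, newton_step,
                     theta=0.5, tau=1.0, eps=1e-8):
    """Generic large-update kernel-function IPM loop (schematic only).

    proximity(x, s, mu)   -> value of the kernel-based measure Psi
    newton_step(x, s, mu) -> (dx, ds, alpha) for the chosen kernel
    Both callbacks are hypothetical placeholders, not a library API;
    x and s are NumPy arrays holding the primal and dual slack iterates.
    """
    while float(np.dot(x, s)) > eps:      # stop once the duality gap is small
        mu *= (1.0 - theta)               # large update: theta is a fixed O(1) fraction
        while proximity(x, s, mu) > tau:  # inner loop: damped Newton steps
            dx, ds, alpha = newton_step(x, s, mu)
            x = x + alpha * dx
            s = s + alpha * ds
    return x, s

The complexity analyses in these papers bound the number of inner iterations per outer iteration via properties of the kernel function, which is where the choice of barrier term enters the final bound.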

