The Adaptation of Interior Point Method for Solving the Quadratic Programming Problems Arising in the Assembly of Deformable Structures

Author(s): Maria Stefanova, Sergey Lupuleac

Author(s): Spyridon Pougkakiotis, Jacek Gondzio

Abstract: In this paper we combine an infeasible Interior Point Method (IPM) with the Proximal Method of Multipliers (PMM). The resulting algorithm (IP-PMM) is interpreted as a primal-dual regularized IPM, suitable for solving linearly constrained convex quadratic programming problems. We apply a few iterations of the interior point method to each sub-problem of the proximal method of multipliers. Once a satisfactory solution of the PMM sub-problem is found, we update the PMM parameters, form a new IPM neighbourhood, and repeat this process. Given this framework, we prove polynomial complexity of the algorithm under standard assumptions. To our knowledge, this is the first polynomial complexity result for a primal-dual regularized IPM. The algorithm is guided by the use of a single penalty parameter: that of the logarithmic barrier. In other words, we show that IP-PMM inherits the polynomial complexity of IPMs, as well as the strict convexity of the PMM sub-problems. The updates of the penalty parameter are controlled by the IPM, and hence are well tuned and do not depend on the problem being solved. Furthermore, we study the behaviour of the method when it is applied to an infeasible problem and identify a necessary condition for infeasibility. The latter is used to construct an infeasibility detection mechanism. Subsequently, we provide a robust implementation of the presented algorithm and test it on a set of small- to large-scale linear and convex quadratic programming problems. The numerical results demonstrate the benefits of using regularization in IPMs, as well as the reliability of the method.
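For readers who want a concrete picture of the nested structure described in this abstract, the following Python/NumPy sketch imitates it on a toy equality-constrained QP: an outer PMM loop keeps proximal primal/dual estimates, an inner loop takes a few interior point Newton steps on the regularized sub-problem, and the proximal estimates and regularization parameters are then refreshed. The function name ip_pmm_sketch, the fixed centring factor 0.1, the tying of the regularization parameters rho and delta to the barrier parameter mu, and the small demo problem are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def ip_pmm_sketch(Q, A, c, b, outer_iters=30, inner_iters=3, tol=1e-8):
    """Toy IP-PMM-style loop for  min 0.5 x'Qx + c'x  s.t.  Ax = b, x >= 0."""
    m, n = A.shape
    x, s, y = np.ones(n), np.ones(n), np.zeros(m)   # strictly interior starting point
    xk, yk = x.copy(), y.copy()                     # PMM proximal (primal/dual) estimates
    mu = x @ s / n                                  # barrier parameter
    rho = delta = mu                                # primal/dual regularization, tied to mu here

    def max_step(v, dv):
        # longest step keeping v + a*dv strictly positive (fraction to the boundary)
        neg = dv < 0
        return min(1.0, 0.995 * np.min(-v[neg] / dv[neg])) if neg.any() else 1.0

    for _ in range(outer_iters):
        for _ in range(inner_iters):                # a few IPM iterations per PMM sub-problem
            # residuals of the regularized (PMM sub-problem) KKT conditions
            rd = Q @ x + c - A.T @ y - s + rho * (x - xk)
            rp = A @ x - b + delta * (y - yk)
            rc = x * s - 0.1 * mu                   # centring target, sigma = 0.1 (assumed)
            # regularized augmented Newton system in (dx, dy); ds is eliminated
            K = np.block([[Q + np.diag(s / x) + rho * np.eye(n), -A.T],
                          [A, delta * np.eye(m)]])
            d = np.linalg.solve(K, np.concatenate([-(rd + rc / x), -rp]))
            dx, dy = d[:n], d[n:]
            ds = -(rc + s * dx) / x
            a = min(max_step(x, dx), max_step(s, ds))
            x, y, s = x + a * dx, y + a * dy, s + a * ds
            mu = x @ s / n
        # accept the sub-problem solution: move the proximal estimates and
        # shrink the regularization together with mu
        xk, yk = x.copy(), y.copy()
        rho = delta = mu
        if (np.linalg.norm(Q @ x + c - A.T @ y - s) < tol
                and np.linalg.norm(A @ x - b) < tol and mu < tol):
            break
    return x, y, s

# tiny demo: min 0.5*||x||^2 - x1  s.t.  x1 + x2 = 1,  x >= 0  (optimum near [1, 0])
Q = np.eye(2); c = np.array([-1.0, 0.0])
A = np.array([[1.0, 1.0]]); b = np.array([1.0])
print(ip_pmm_sketch(Q, A, c, b)[0])
```

The diagonal terms rho*I and delta*I in the augmented system are where the primal and dual regularization enter the Newton step; keeping that system well conditioned is the kind of practical benefit of regularization that the abstract alludes to.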


1993, Vol. 5 (2), pp. 182-191
Author(s): Tamra J. Carpenter, Irvin J. Lustig, John M. Mulvey, David F. Shanno

2014, pp. 116-124
Author(s): Di Zhao

The Support Vector Machine (SVM) is one of the most widely used statistical models for machine learning. The core of SVM training is an optimization problem, typically a Quadratic Programming (QP) problem. The Interior Point Method (IPM) is one of the mainstream approaches for solving QP problems. However, when a large-scale dataset is used in IPM-based SVM training, computational difficulties arise because of expensive matrix operations. Preconditioning, for example by Cholesky factorization (CF), incomplete Cholesky factorization, or Kronecker factorization, is an effective way to decrease the time complexity of IPM-based SVM training. In this paper, we reformulate SVM training as a saddle point problem. Motivated by this reformulation, and building on parallel GMRES and the recently developed Hermitian/Skew-Hermitian Splitting (HSS) preconditioner, we develop a fast solver, HSS-pGMRES-IPM, for the saddle point problem arising from SVM training. Computational results show that the fast solver HSS-pGMRES-IPM solves the saddle point problem from SVM training significantly faster than the conventional CF-based solver.
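To make the linear-algebra kernel concrete, here is a small Python/SciPy sketch that builds a toy saddle point system of the kind an IPM step for the dual SVM QP produces, forms an HSS preconditioner from the Hermitian and skew-Hermitian parts of the (deliberately nonsymmetrized) matrix, and solves the system with preconditioned GMRES. The toy data, the shift parameter a = 1.0, and the sequential GMRES call are assumptions made for the sketch; this is not the parallel HSS-pGMRES-IPM solver of the paper.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import splu, gmres, LinearOperator

rng = np.random.default_rng(0)

# Toy stand-in for one IPM step of a dual SVM QP: H is the (regularized) Hessian
# block, B the single equality constraint y' alpha = 0. Sizes are illustrative.
n = 200
X = rng.standard_normal((n, 10))
y_lab = np.sign(rng.standard_normal(n))
H = sp.csc_matrix((X * y_lab[:, None]) @ (X * y_lab[:, None]).T + 1e-2 * np.eye(n))
B = sp.csc_matrix(y_lab.reshape(1, n))

# Nonsymmetric saddle point form K = [[H, B'], [-B, 0]], a common device so that
# the Hermitian part of the HSS splitting is positive semidefinite.
K = sp.bmat([[H, B.T], [-B, None]], format="csc")
rhs = rng.standard_normal(n + 1)

# HSS splitting K = Hp + Sp; preconditioner M = (1/(2a)) (aI + Hp)(aI + Sp)
a = 1.0                                   # illustrative shift parameter
I = sp.identity(n + 1, format="csc")
Hp = (K + K.T) / 2                        # Hermitian part
Sp = (K - K.T) / 2                        # skew-Hermitian part
lu_H = splu((a * I + Hp).tocsc())
lu_S = splu((a * I + Sp).tocsc())

def apply_Minv(v):
    # M^{-1} v : solve (aI + Hp) t = v, then (aI + Sp) w = 2a t
    t = lu_H.solve(v)
    return lu_S.solve(2 * a * t)

M = LinearOperator(K.shape, matvec=apply_Minv)

x, info = gmres(K, rhs, M=M, restart=50)
print("GMRES converged" if info == 0 else f"info = {info}",
      "residual:", np.linalg.norm(K @ x - rhs))
```

With the second block row negated, the Hermitian part of K is block diagonal and positive semidefinite, which keeps the two shifted solves inside the preconditioner well posed.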

