Superlinear Convergence of a Modified Newton's Method for Convex Optimization Problems With Constraints
We consider the constrained optimization problem $$f(x^*) = \min_{x \in X} f(x), \eqno (1)$$ where the function $f : \mathbb{R}^{n} \to \mathbb{R}$ is convex on a closed, bounded, convex set $X \subset \mathbb{R}^{n}$. To solve problem (1), most methods transform it into an unconstrained problem, either by introducing Lagrange multipliers or by using a projection method. The purpose of this paper is to present a new method for solving certain constrained optimization problems, based on the construction of a descent direction and a step size that keep the iterates inside the convex domain $X$. A convergence theorem is proved, and the paper ends with some numerical examples.
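The abstract does not specify the paper's descent direction or step rule, so as an illustrative stand-in only, the following sketch uses a classical feasible-direction scheme (Frank-Wolfe) on a box-constrained convex quadratic. Like the method described above, it generates a descent direction and a step that keep every iterate inside the convex set $X$, without Lagrange multipliers or projections; the function `frank_wolfe_box` and its parameters are assumptions for this example, not the paper's algorithm.

```python
import numpy as np

def frank_wolfe_box(grad, x0, lo, hi, iters=2000):
    """Minimize a convex f over the box X = [lo, hi]^n.

    Illustrative feasible-direction sketch (Frank-Wolfe), NOT the
    paper's method: every iterate stays in X by convexity.

    grad : callable returning the gradient of f at x
    x0   : feasible starting point in X
    """
    x = np.asarray(x0, dtype=float)
    for k in range(iters):
        g = grad(x)
        # Linear subproblem: s = argmin_{s in X} <g, s>.
        # For a box, the minimizer picks lo or hi per coordinate.
        s = np.where(g > 0, lo, hi)
        # Diminishing step; x stays in X as a convex combination.
        gamma = 2.0 / (k + 2)
        x = x + gamma * (s - x)
    return x

# Example: f(x) = ||x - c||^2 with c partly outside the box [-1, 1]^2.
c = np.array([2.0, -0.5])
x_star = frank_wolfe_box(lambda x: 2 * (x - c), np.zeros(2), -1.0, 1.0)
# x_star ≈ [1.0, -0.5], the point of the box closest to c
```

Each step solves only a linear subproblem over $X$, which is why the iterates never leave the feasible set; this is the design choice that methods of this family share with the feasible descent direction described in the abstract.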
2013 · Vol 479-480 · pp. 861-864