Global and superlinear convergence of an algorithm for one-dimensional minimization of convex functions

1982, Vol 24 (1), pp. 241–256
Author(s): C. Lemarechal, R. Mifflin

2009, Vol 7 (2), pp. 167–186
Author(s): Aleksandra Čižmešija, Sabir Hussain, Josip Pečarić

We prove a new general one-dimensional inequality for convex functions and Hardy–Littlewood averages. We then apply this result to unify and refine the so-called Boas inequality and the strengthened Hardy–Knopp-type inequalities, deriving new refinements of them as special cases of the general relation obtained. In particular, we obtain new refinements of strengthened versions of the well-known Hardy and Pólya–Knopp inequalities.
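For context (these statements are not part of the abstract itself), the classical inequalities whose strengthened versions are refined read as follows, for a nonnegative measurable function $f$ on $(0,\infty)$ and $p > 1$:

```latex
% Hardy's inequality:
\int_0^\infty \left( \frac{1}{x} \int_0^x f(t)\,dt \right)^p dx
  \le \left( \frac{p}{p-1} \right)^p \int_0^\infty f(x)^p \, dx .

% Polya--Knopp's inequality (the limiting case, with the geometric mean):
\int_0^\infty \exp\!\left( \frac{1}{x} \int_0^x \ln f(t)\,dt \right) dx
  \le e \int_0^\infty f(x)\,dx .
```

In both cases the constants $(p/(p-1))^p$ and $e$ are sharp; the refinements in the paper tighten these relations via the general inequality for convex functions.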


2011, Vol 18 (9), pp. 1303–1309
Author(s): Zhaocheng Cui, Boying Wu

In this paper, we propose a new self-adaptive trust region method for unconstrained optimization problems and analyze its convergence. The algorithm uses information from the previous and current iterates to define a suitable trust region radius at each iteration. Global and superlinear convergence are established under reasonable assumptions. Preliminary numerical results show that the new method is efficient and attractive for solving unconstrained optimization problems.
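To illustrate the general framework the abstract refers to, the sketch below implements a standard textbook trust region iteration with an adaptive radius. It is a minimal generic scheme, not the specific self-adaptive radius rule of Cui and Wu: the subproblem solver (a Newton step truncated to the trust region), the acceptance threshold, and the update constants are all illustrative assumptions.

```python
import numpy as np

def trust_region_minimize(f, grad, hess, x0, delta0=1.0, max_iter=100, tol=1e-8):
    """Generic trust-region method with an adaptive radius.

    This is a textbook sketch (truncated-Newton subproblem, standard
    ratio-based radius update), not the authors' specific rule.
    """
    x = np.asarray(x0, dtype=float)
    delta = delta0
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        B = hess(x)
        # Approximate subproblem solution: full Newton step if the Hessian
        # is solvable, otherwise steepest descent; truncate to the radius.
        try:
            p = np.linalg.solve(B, -g)
        except np.linalg.LinAlgError:
            p = -g
        step_norm = np.linalg.norm(p)
        hit_boundary = step_norm > delta
        if hit_boundary:
            p = p * (delta / step_norm)
        # Ratio of actual to predicted reduction drives the radius update.
        pred = -(g @ p + 0.5 * p @ B @ p)
        actual = f(x) - f(x + p)
        rho = actual / pred if pred > 0 else 0.0
        if rho < 0.25:
            delta *= 0.25            # poor model fit: shrink the region
        elif rho > 0.75 and hit_boundary:
            delta = min(2.0 * delta, 1e6)  # good fit at the boundary: expand
        if rho > 0.1:                # accept the step only if it helps enough
            x = x + p
    return x
```

A self-adaptive variant like the one in the paper would replace the fixed shrink/expand constants with a radius computed from the previous and current iterate information.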

