A gradient-type algorithm for constrained optimization with application to microstructure optimization

2020 ◽  
Vol 25 (5) ◽  
pp. 1729-1755
Author(s):  
Cristian Barbarosie ◽  
Anca-Maria Toader ◽  
Sérgio Lopes

Author(s):  
Jamilu Sabi'u ◽  
Abdullah Shah

In this article, we propose two Conjugate Gradient (CG) parameters based on the modified Dai-Liao condition and a descent three-term CG search direction. Both parameters are combined with the projection technique for solving large-scale monotone nonlinear equations. Under Lipschitz continuity and monotonicity assumptions, the global convergence of both methods is proved. Finally, numerical results illustrate the robustness of the proposed methods.
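The projection technique referred to above can be sketched as follows. This is a minimal, generic Solodov-Svaiter-style hyperplane projection step with an illustrative Dai-Liao-type CG parameter; the function name `projection_cg`, the parameter `t`, and the safeguard are assumptions for illustration, not the authors' exact formulas:

```python
import numpy as np

def projection_cg(F, x0, t=0.1, sigma=1e-4, rho=0.5, tol=1e-8, max_iter=2000):
    """Projection-based CG sketch for monotone equations F(x) = 0.

    Illustrative Dai-Liao-type parameter and hyperplane projection;
    not the authors' exact formulas.
    """
    x = np.asarray(x0, dtype=float)
    Fx = F(x)
    d = -Fx
    for _ in range(max_iter):
        if np.linalg.norm(Fx) <= tol:
            break
        # Backtracking line search: find z = x + alpha*d with
        # -F(z)^T d >= sigma * alpha * ||d||^2.
        alpha = 1.0
        while True:
            z = x + alpha * d
            Fz = F(z)
            if -Fz @ d >= sigma * alpha * (d @ d) or alpha < 1e-12:
                break
            alpha *= rho
        if np.linalg.norm(Fz) <= tol:
            x, Fx = z, Fz
            break
        # Hyperplane projection onto the half-space separating x
        # from the solution set of the monotone equation.
        x_new = x - ((Fz @ (x - z)) / (Fz @ Fz)) * Fz
        F_new = F(x_new)
        # Dai-Liao-type CG parameter (illustrative choice of t).
        y = F_new - Fx
        denom = d @ y
        beta = (F_new @ y - t * (F_new @ (x_new - x))) / denom if abs(denom) > 1e-12 else 0.0
        d = -F_new + beta * d
        # Safeguard: fall back to the steepest-descent-like direction.
        if F_new @ d > -1e-10 * (F_new @ F_new):
            d = -F_new
        x, Fx = x_new, F_new
    return x, Fx
```

For example, with the componentwise monotone map F(x) = exp(x) - 1 the iterates approach the zero solution without any derivative information, which is what makes this class of methods attractive for large-scale problems.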


2019 ◽  
Vol 35 (3) ◽  
pp. 371-378
Author(s):  
Porntip Promsinchai ◽  
Narin Petrot

In this paper, we consider convex constrained optimization problems with composite objective functions over the set of minimizers of another function. The main aim is to test numerically a new algorithm, a stochastic block coordinate proximal-gradient algorithm with penalization, by comparing both the number of iterations and the CPU time of the introduced algorithm against well-known block coordinate descent algorithms on randomly generated optimization problems with a regularization term.
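A randomized block coordinate proximal-gradient step can be sketched on a standard l1-regularized least-squares problem. This is a generic illustration of the algorithmic template (the function name, block partition, and per-block step sizes are assumptions), not the paper's penalized scheme for bilevel problems:

```python
import numpy as np

def block_prox_grad_lasso(A, b, lam, n_blocks=4, n_iter=2000, seed=0):
    """Randomized block coordinate proximal-gradient sketch for
    min 0.5*||Ax - b||^2 + lam*||x||_1 (illustrative template)."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    x = np.zeros(n)
    blocks = np.array_split(np.arange(n), n_blocks)
    # Per-block Lipschitz constants of the partial gradients.
    L = [np.linalg.norm(A[:, blk], 2) ** 2 for blk in blocks]
    for _ in range(n_iter):
        i = rng.integers(n_blocks)          # pick a block uniformly at random
        blk = blocks[i]
        r = A @ x - b
        g = A[:, blk].T @ r                 # partial gradient on the block
        step = 1.0 / L[i]
        z = x[blk] - step * g
        # Proximal (soft-thresholding) step for the l1 term.
        x[blk] = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)
    return x
```

Because each block update is a proximal-gradient step with step size 1/L_i, the objective is non-increasing along the iterates, which is the property such numerical comparisons typically track against full block coordinate descent.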


2019 ◽  
Vol 84 (2) ◽  
pp. 485-512 ◽  
Author(s):  
Cristian Daniel Alecsa ◽  
Szilárd Csaba László ◽  
Adrian Viorel

2008 ◽  
Vol 429 (5-6) ◽  
pp. 1229-1242 ◽  
Author(s):  
Catherine Fraikin ◽  
Yurii Nesterov ◽  
Paul Van Dooren
1987 ◽  
Vol 18 (6) ◽  
pp. 1061-1078 ◽  
Author(s):  
Nadav Berman ◽  
Arie Feuer ◽  
Elias Wahnon

2007 ◽  
Vol 119 (1) ◽  
pp. 51-78 ◽  
Author(s):  
Kengy Barty ◽  
Jean-Sébastien Roy ◽  
Cyrille Strugarek

2006 ◽  
Vol 2006 ◽  
pp. 1-15 ◽  
Author(s):  
Vadim Azhmyakov

We present a technique for the asymptotic stability analysis of a class of differential inclusions. The technique is based on Lyapunov-type theorems: the construction of Lyapunov functions for differential inclusions is reduced to an auxiliary mathematical programming problem, namely, finding saddle points of a suitable function. The computational approach to this auxiliary problem uses a gradient-type algorithm for saddle-point problems. We also extend the main results to systems described by difference inclusions. The resulting numerical schemes are applied to illustrative examples.
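The saddle-point subproblem above can be attacked with a simple gradient descent-ascent iteration: descend in the convex variable, ascend in the concave one. The sketch below is a minimal illustration assuming a smooth convex-concave function L(u, v); the function names and step size are illustrative, not the paper's specific scheme:

```python
import numpy as np

def saddle_point_gda(grad_u, grad_v, u0, v0, step=0.05, n_iter=2000):
    """Gradient descent-ascent sketch for a saddle point of L(u, v):
    take gradient steps down in u and up in v."""
    u, v = np.asarray(u0, dtype=float), np.asarray(v0, dtype=float)
    for _ in range(n_iter):
        gu = grad_u(u, v)
        gv = grad_v(u, v)
        u = u - step * gu   # descend on the convex variable
        v = v + step * gv   # ascend on the concave variable
    return u, v
```

For instance, for L(u, v) = 0.5u^2 + uv - 0.5v^2 (strongly convex in u, strongly concave in v) the iteration contracts to the unique saddle point at the origin; for merely bilinear couplings, plain descent-ascent can cycle, which is why saddle-point variants of gradient methods matter.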

