On convergence and complexity analysis of an accelerated forward–backward algorithm with linesearch technique for convex minimization problems and applications to data prediction and classification
Abstract
In this work, we introduce a new accelerated algorithm with a linesearch technique for solving convex minimization problems posed as the sum of two lower semicontinuous convex functions. Weak convergence of the proposed algorithm is established without assuming Lipschitz continuity of the gradient of the objective function. Moreover, the complexity of the algorithm is analyzed. Some numerical experiments in machine learning, namely regression and classification problems, are also discussed. Furthermore, in these experiments we evaluate the convergence behavior of the new algorithm and compare it with several algorithms from the literature. It is found that our algorithm outperforms the others.
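The abstract does not give the authors' exact iteration, but the general idea it describes can be illustrated. Below is a minimal sketch of a plain forward–backward (proximal gradient) method with a backtracking linesearch, applied to the lasso problem min 0.5‖Ax − b‖² + λ‖x‖₁ as a concrete instance of "smooth convex + lower semicontinuous convex". The function names, the sufficient-decrease test, and the parameters `sigma` and `theta` are illustrative assumptions, not the accelerated algorithm of the paper; the point is only that the linesearch removes the need for a global Lipschitz constant of the gradient.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (the nonsmooth part g here)
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def fb_linesearch(A, b, lam, x0, sigma=1.0, theta=0.5, max_iter=500, tol=1e-8):
    """Illustrative forward-backward method with backtracking linesearch
    for min_x 0.5*||Ax - b||^2 + lam*||x||_1.  The step size starts at
    sigma and is multiplied by theta until a quadratic upper-bound test
    holds, so no Lipschitz constant of the gradient is assumed."""
    x = x0.copy()
    f = lambda z: 0.5 * np.sum((A @ z - b) ** 2)       # smooth part
    grad = lambda z: A.T @ (A @ z - b)                 # its gradient
    for _ in range(max_iter):
        g = grad(x)
        t = sigma
        while True:
            # Forward (gradient) step followed by backward (prox) step
            x_new = soft_threshold(x - t * g, t * lam)
            # Backtracking test: f bounded by its quadratic model at step t
            if f(x_new) <= f(x) + g @ (x_new - x) + np.sum((x_new - x) ** 2) / (2 * t):
                break
            t *= theta
        if np.linalg.norm(x_new - x) <= tol * max(1.0, np.linalg.norm(x)):
            return x_new
        x = x_new
    return x
```

As a sanity check, when A is the identity the minimizer is exactly the soft-thresholded data, which the iteration recovers in a few steps.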