A zeroth order method for stochastic weakly convex optimization

Author(s):  
V. Kungurtsev, F. Rinaldi

Abstract In this paper, we consider stochastic weakly convex optimization problems without assuming access to a stochastic subgradient oracle. We present a derivative-free algorithm that uses a two-point approximation to compute a gradient estimate of the smoothed function. We prove convergence at a rate similar to that of state-of-the-art methods, although with a larger constant, and report numerical results showing the effectiveness of the approach.
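The two-point gradient estimate the abstract refers to can be illustrated with the standard sphere-smoothing construction: query the zeroth-order oracle at x and at x + mu*u for a random unit direction u, using the same stochastic sample at both points. The sketch below is a minimal illustration of this idea, not the paper's exact algorithm; the names f, xi, mu, and lr, and the plain stochastic step in the usage function, are assumptions for the example.

```python
import numpy as np

def two_point_grad_estimate(f, x, xi, mu=1e-3, rng=None):
    """Two-point estimate of the gradient of the smoothed function
    f_mu(x) = E_v[f(x + mu*v, xi)] with v uniform in the unit ball.
    f(x, xi) is a noisy zeroth-order oracle evaluated at the sample xi."""
    rng = np.random.default_rng() if rng is None else rng
    u = rng.standard_normal(x.shape)
    u /= np.linalg.norm(u)                     # random direction on the unit sphere
    d = x.size
    # same sample xi at both evaluation points: the "two point" approximation
    return (d / mu) * (f(x + mu * u, xi) - f(x, xi)) * u

def zo_stochastic_step_loop(f, x0, sample, steps=1000, lr=1e-2, mu=1e-3):
    """Illustrative loop: plug the zeroth-order estimate into a plain
    stochastic (sub)gradient-style update."""
    x = x0.copy()
    for _ in range(steps):
        xi = sample()                          # draw one stochastic sample
        x = x - lr * two_point_grad_estimate(f, x, xi, mu)
    return x
```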

Author(s):  
А.В. Колосницын

A simplex embedding method adapted for solving convex optimization problems with a large number of constraints is considered. Two modifications are proposed to improve its performance. The first uses a more economical way of computing the constraint residuals, which significantly reduces the running time of the algorithm when the number of constraints is large. The second builds on an important feature of the simplex embedding method: its ability to identify inactive constraints of the problem. Numerical results obtained on quadratic and convex nondifferentiable test problems demonstrate the efficiency of the proposed modifications.
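For intuition only, the following sketch shows two generic devices of the kind the abstract mentions: reusing precomputed products so that residuals of linear constraints A x <= b are cheap to re-evaluate along a search direction, and pruning rows that have been certified inactive. The functions, tolerances, and the simple residual-based pruning rule are assumptions for illustration; the paper's actual residual update and its inactivity test, which rely on the simplex embedding geometry, are not reproduced here.

```python
import numpy as np

def residuals_along_direction(A, b, x, d):
    """Residuals r_i(alpha) = b_i - a_i^T (x + alpha*d) for constraints A x <= b.
    A @ x and A @ d are computed once; evaluating many values of alpha is cheap."""
    r0 = b - A @ x          # residuals at the current point
    slope = A @ d           # change of each residual per unit of alpha
    return lambda alpha: r0 - alpha * slope

def drop_certified_inactive(A, b, r, tol=1e-8):
    """Keep only constraints whose residual is not safely positive; rows
    flagged as inactive are excluded from subsequent iterations."""
    keep = r <= tol
    return A[keep], b[keep]
```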


2021
Author(s):  
Darina Dvinskikh, Alexander Gasnikov

Abstract We introduce primal and dual stochastic gradient oracle methods for decentralized convex optimization problems. For both primal and dual oracles, the proposed methods are optimal in terms of the number of communication steps. However, for all classes of objectives, optimality in terms of the number of oracle calls per node holds only up to a logarithmic factor and up to the notion of smoothness. By using a mini-batching technique, we show that the proposed methods with a stochastic oracle can be additionally parallelized at each node. The considered algorithms can be applied to many data science problems and inverse problems.
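The mini-batching claim can be made concrete with a generic decentralized stochastic gradient step: each node averages several independent stochastic gradients (these oracle calls can run in parallel on the node) and mixes its iterate with its neighbours' via a mixing matrix W. This is a minimal sketch of plain decentralized SGD with gossip averaging under these assumptions, not the optimal primal/dual methods of the paper; the names grad_sample, W, batches, and lr are illustrative.

```python
import numpy as np

def local_minibatch_grad(grad_sample, x, samples):
    """Average of B stochastic gradients at one node; the B oracle calls are
    independent and can be evaluated in parallel on that node."""
    return np.mean([grad_sample(x, xi) for xi in samples], axis=0)

def decentralized_step(X, W, grad_sample, batches, lr):
    """One synchronous iteration: every node mixes its iterate with its
    neighbours' iterates (its row of the mixing matrix W) and then takes a
    mini-batch stochastic gradient step.  X holds one iterate per row."""
    G = np.stack([local_minibatch_grad(grad_sample, X[i], batches[i])
                  for i in range(X.shape[0])])
    return W @ X - lr * G
```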

