Distributed Asymptotic Minimization of Sequences of Convex Functions by a Broadcast Adaptive Subgradient Method

2011 ◽  
Vol 5 (4) ◽  
pp. 739-753 ◽  
Author(s):  
Renato L. G. Cavalcante ◽  
Alex Rogers ◽  
Nicholas R. Jennings ◽  
Isao Yamada
2021 ◽  
Vol 0 (0) ◽  
pp. 0
Author(s):  
Gang Li ◽  
Minghua Li ◽  
Yaohua Hu

The feasibility problem is at the core of modeling many problems across mathematics and the physical sciences, and quasi-convex functions are widely applied in fields such as economics, finance, and management science. In this paper, we consider the stochastic quasi-convex feasibility problem (SQFP), which is to find a common point of infinitely many sublevel sets of quasi-convex functions. Inspired by the idea of a stochastic index scheme, we propose a stochastic quasi-subgradient method to solve the SQFP, in which the quasi-subgradients of a random (and finite) index set of component quasi-convex functions at the current iterate are used to construct the descent direction at each iteration. Moreover, we introduce a notion of Hölder-type error bound property relative to the random control sequence for the SQFP, and use it to establish a global convergence theorem and convergence rate theory for the stochastic quasi-subgradient method. The analysis reveals that the stochastic quasi-subgradient method enjoys both low computational cost per iteration and fast convergence.
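The iteration described above (a random index set, quasi-subgradient directions, a common point of sublevel sets) can be sketched on a toy instance. Everything below is an illustrative assumption, not the paper's construction: the components are taken as f_i(x) = |a_i·x − b_i|, which are convex and hence quasi-convex, and the step scaling is one simple choice among many.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy SQFP instance (illustrative, not the paper's setup): find a common
# point of the sublevel sets {x : f_i(x) <= 0} with f_i(x) = |a_i @ x - b_i|,
# which is convex and hence in particular quasi-convex.
n, m = 5, 20
a = rng.normal(size=(m, n))
a /= np.linalg.norm(a, axis=1, keepdims=True)    # unit rows, so quasi-subgradients are unit vectors
x_true = rng.normal(size=n)
b = a @ x_true                                   # consistent system: x_true is feasible

def quasi_subgradient(i, x):
    """A unit quasi-subgradient of f_i at x (a normal to the sublevel set)."""
    return np.sign(a[i] @ x - b[i]) * a[i]

x = np.zeros(n)
batch = 3                                        # size of the random index set per iteration
for k in range(2000):
    idx = rng.choice(m, size=batch, replace=False)            # stochastic index scheme
    active = [i for i in idx if abs(a[i] @ x - b[i]) > 1e-12]
    if not active:
        continue
    # Descent direction: average of quasi-subgradient steps scaled by f_i(x).
    d = np.mean([abs(a[i] @ x - b[i]) * quasi_subgradient(i, x)
                 for i in active], axis=0)
    x = x - d

max_violation = np.max(np.abs(a @ x - b))        # residual over all m constraints
```

With unit rows, each scaled quasi-subgradient step is exactly the projection onto the hyperplane {x : a_i·x = b_i}, so the averaged update behaves like a randomized parallel-projection scheme and drives the residual to zero on this consistent instance.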


Author(s):  
Axel Böhm ◽  
Stephen J. Wright

Abstract
We study minimization of a structured objective function, being the sum of a smooth function and a composition of a weakly convex function with a linear operator. Applications include image reconstruction problems with regularizers that introduce less bias than the standard convex regularizers. We develop a variable smoothing algorithm, based on the Moreau envelope with a decreasing sequence of smoothing parameters, and prove a complexity of $\mathcal{O}(\epsilon^{-3})$ to achieve an $\epsilon$-approximate solution. This bound interpolates between the $\mathcal{O}(\epsilon^{-2})$ bound for the smooth case and the $\mathcal{O}(\epsilon^{-4})$ bound for the subgradient method. Our complexity bound is in line with other works that deal with structured nonsmoothness of weakly convex functions.
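The variable-smoothing idea above (replace the nonsmooth term by its Moreau envelope, then shrink the smoothing parameter across iterations) can be sketched on a small instance. This is a minimal sketch under stated assumptions, not the paper's algorithm: h is taken as the l1 norm (convex, hence in particular weakly convex), the linear operator is the identity so the true minimizer is known in closed form, and the schedule μ_k ~ k^(−1/3) mirrors the decreasing-smoothing idea.

```python
import numpy as np

# Toy instance of min_x f(x) + h(Ax), with f(x) = 0.5 ||x - x0||^2 smooth and
# h = lam * ||.||_1. We take A = I so the minimizer is soft-thresholding of x0;
# a general A would only change the Lipschitz bound below. Names are illustrative.
lam = 0.5
x0 = np.array([2.0, -1.5, 0.2, 0.0, -0.1])

def prox_l1(y, t):
    """Proximal operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(y) * np.maximum(np.abs(y) - t, 0.0)

def moreau_grad(x, mu):
    """Gradient of the Moreau envelope of lam * ||.||_1 with parameter mu."""
    return (x - prox_l1(x, mu * lam)) / mu

x = np.zeros_like(x0)
for k in range(1, 3001):
    mu = k ** (-1.0 / 3.0)          # decreasing smoothing sequence, mu_k ~ k^(-1/3)
    L = 1.0 + 1.0 / mu              # Lipschitz bound L_f + ||A||^2 / mu (here ||A|| = 1)
    g = (x - x0) + moreau_grad(x, mu)
    x = x - g / L                   # gradient step on the smoothed objective

x_star = prox_l1(x0, lam)           # exact minimizer of this particular instance
```

The gradient of the Moreau envelope is computed from the prox, (y − prox_{μh}(y))/μ, which is (1/μ)-Lipschitz; that is what makes each smoothed subproblem amenable to a plain gradient step while μ_k decreases.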


2020 ◽  
Vol 4 (2) ◽  
pp. 1-14
Author(s):  
Pardeep Kaur ◽  
Sukhwinder Singh Billing

Filomat ◽  
2017 ◽  
Vol 31 (4) ◽  
pp. 1009-1016 ◽  
Author(s):  
Ahmet Akdemir ◽  
Özdemir Emin ◽  
Ardıç Avcı ◽  
Abdullatif Yalçın

In this paper, we first prove an integral identity from which several new equalities can be derived for special selections of n. Second, based on this identity, we establish more general integral inequalities for functions whose second derivatives in absolute value are GA-convex.
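For context, the standard definition of GA-convexity (geometric-arithmetic convexity) used in results of this type is stated below for reference; the paper's exact hypotheses may differ.

```latex
% f : I \subseteq (0, \infty) \to \mathbb{R} is GA-convex if, for all
% x, y \in I and t \in [0, 1],
f\bigl(x^{t} y^{1-t}\bigr) \le t\, f(x) + (1 - t)\, f(y).
```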

