From error bounds to the complexity of first-order descent methods for convex functions

2016 ◽  
Vol 165 (2) ◽  
pp. 471-507 ◽  
Author(s):  
Jérôme Bolte ◽  
Trong Phong Nguyen ◽  
Juan Peypouquet ◽  
Bruce W. Suter


Author(s):
Muhammad Uzair Awan ◽  
Muhammad Zakria Javed ◽  
Michael Th. Rassias ◽  
Muhammad Aslam Noor ◽  
Khalida Inayat Noor

Abstract: A new generalized integral identity involving first-order differentiable functions is obtained. Using this identity as an auxiliary result, we then obtain some new refinements of Simpson-type inequalities for a new class called strongly (s, m)-convex functions of higher order $$\sigma > 0$$. We also discuss some interesting applications of the obtained results in the theory of means. Finally, we present applications of the obtained results to Simpson-like quadrature formulas.
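For context (a classical result, not a statement from the article itself): the Simpson inequality that such refinements generalize states that, for a four-times continuously differentiable function $$f$$ on $$[a, b]$$,

$$\left| \int_a^b f(x)\,dx - \frac{b-a}{6}\left[ f(a) + 4 f\!\left(\tfrac{a+b}{2}\right) + f(b) \right] \right| \le \frac{(b-a)^5}{2880}\, \big\| f^{(4)} \big\|_{\infty}.$$

The article's results replace this smoothness assumption with convexity-type conditions on the first derivative, so the above serves only as the baseline estimate being refined.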


2002 ◽  
Vol 34 (03) ◽  
pp. 559-586 ◽  
Author(s):  
Haiyan Huang

Given a sequence S and a collection Ω of d words, it is of interest in many applications to characterize the multivariate distribution of the vector of counts U = (N(S, w_1), …, N(S, w_d)), where N(S, w) is the number of times a word w ∈ Ω appears in the sequence S. We obtain an explicit bound on the error made when approximating the multivariate distribution of U by the normal distribution, when the underlying sequence is i.i.d. or first-order stationary Markov over a finite alphabet. When the limiting covariance matrix of U is nonsingular, the error bounds decay at rate O((log n)/√n) in the i.i.d. case and O((log n)^3/√n) in the Markov case. In order for U to have a nondegenerate covariance matrix, it is necessary and sufficient that the counted word set Ω is not full, that is, that Ω is not the collection of all possible words of some length k over the given finite alphabet. To supply the bounds on the error, we use a version of Stein's method.
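The necessity of the "not full" condition can be seen from a simple counting identity (a standard observation, included here only for context): if Ω consists of all words of length k over the alphabet, then for every sequence S of length n each of the n − k + 1 positions contributes exactly one word of length k, so

$$\sum_{w \in \Omega} N(S, w) = n - k + 1.$$

The coordinates of U then satisfy a deterministic linear constraint, and the limiting covariance matrix is necessarily singular.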


2021 ◽  
Vol 2021 ◽  
pp. 1-16
Author(s):  
Yanliang Dong ◽  
Muhammad Zeb ◽  
Ghulam Farid ◽  
Sidra Bibi

In this paper, we present two versions of the Hadamard inequality for (α, m)-convex functions via Caputo fractional derivatives. Several related results are analyzed for convex and m-convex functions along with their refinements and generalizations. The error bounds of the Hadamard inequalities are established by applying some known identities.
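For context (classical definitions, not statements from the article): the Hadamard, or Hermite-Hadamard, inequality for a convex function $$f$$ on $$[a, b]$$ reads

$$f\!\left(\frac{a+b}{2}\right) \le \frac{1}{b-a} \int_a^b f(x)\,dx \le \frac{f(a) + f(b)}{2},$$

and the Caputo fractional derivative of order α, with n − 1 < α < n, is commonly defined by

$$\left({}^{C}D^{\alpha}_{a^{+}} f\right)(x) = \frac{1}{\Gamma(n-\alpha)} \int_a^x (x - t)^{\,n-\alpha-1} f^{(n)}(t)\, dt.$$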


Mathematics ◽  
2019 ◽  
Vol 7 (9) ◽  
pp. 807 ◽  
Author(s):  
Saima Rashid ◽  
Thabet Abdeljawad ◽  
Fahd Jarad ◽  
Muhammad Aslam Noor

In the present paper, we investigate some Hermite-Hadamard (HH) inequalities related to the generalized Riemann-Liouville fractional integral (GRLFI) via exponentially convex functions. We also establish a fundamental identity for the GRLFI involving the first-order derivative of a given exponentially convex function. Monotonicity and exponential convexity of functions are used together with some classical and straightforward inequalities. In the application part, we give examples and new inequalities for special means.
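For reference, the classical left-sided Riemann-Liouville fractional integral of order α > 0, of which the GRLFI is a generalization, is defined by (a standard definition, included only for context)

$$\left(I^{\alpha}_{a^{+}} f\right)(x) = \frac{1}{\Gamma(\alpha)} \int_a^x (x - t)^{\alpha - 1} f(t)\, dt, \qquad x > a.$$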

