approximation rate
Recently Published Documents

TOTAL DOCUMENTS: 28 (FIVE YEARS: 6)
H-INDEX: 6 (FIVE YEARS: 0)

2021 ◽  
pp. 1-32
Author(s):  
Zuowei Shen ◽  
Haizhao Yang ◽  
Shijun Zhang

A new network with super-approximation power is introduced. This network is built with either the Floor (⌊x⌋) or the ReLU (max{0, x}) activation function in each neuron; hence, we call such networks Floor-ReLU networks. For any hyperparameters N, L ∈ ℕ⁺, we show that Floor-ReLU networks with width max{d, 5N + 13} and depth 64dL + 3 can uniformly approximate a Hölder function f on [0, 1]^d with an approximation error 3λ d^(α/2) N^(−α√L), where α ∈ (0, 1] and λ > 0 are the Hölder order and constant, respectively. More generally, for an arbitrary continuous function f on [0, 1]^d with a modulus of continuity ω_f(·), the constructive approximation rate is ω_f(√d · N^(−√L)) + 2ω_f(√d) N^(−√L). As a consequence, this new class of networks overcomes the curse of dimensionality in approximation power when the variation of ω_f(r) as r → 0 is moderate (e.g., ω_f(r) ≲ r^α for Hölder continuous functions), since the major term to be considered in our approximation rate is essentially √d times a function of N and L independent of d within the modulus of continuity.
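The two activation functions the abstract names are elementary to state. As a hedged illustration (the weights, biases, and per-neuron activation assignment below are made up for demonstration, not taken from the paper's construction), a single Floor-ReLU layer can be sketched as:

```python
import math

def relu(x):
    """ReLU activation: max{0, x}."""
    return max(0.0, x)

def floor_act(x):
    """Floor activation: the greatest integer not exceeding x."""
    return math.floor(x)

def floor_relu_layer(x, weights, biases, acts):
    """One layer in which each neuron applies either Floor or ReLU."""
    return [act(w * x + b) for w, b, act in zip(weights, biases, acts)]

# Two neurons fed the same scalar input; one uses ReLU, one uses Floor.
out = floor_relu_layer(1.5, [2.0, 2.0], [0.25, 0.25], [relu, floor_act])
# out[0] = relu(3.25) = 3.25, out[1] = floor(3.25) = 3
```

The point of mixing the two activations is that Floor neurons can quantize intermediate values, which is what drives the N^(−α√L) rate in the paper's construction.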


2021 ◽  
Author(s):  
Yongqiang Suo

In this thesis, we mainly study properties of certain stochastic differential equations. The types of stochastic differential equations we are interested in are (i) stochastic differential equations driven by Brownian motion, (ii) stochastic functional differential equations driven by fractional Brownian motion, (iii) McKean-Vlasov stochastic differential equations driven by Brownian motion, and (iv) McKean-Vlasov stochastic differential equations driven by fractional Brownian motion. The properties we investigate include the weak approximation rate of the Euler-Maruyama scheme, and the central limit theorem and moderate deviation principle for McKean-Vlasov stochastic differential equations. Additionally, we investigate the existence and uniqueness of solutions to McKean-Vlasov stochastic differential equations driven by fractional Brownian motion, and we then obtain the Bismut formula for Lions derivatives for this model.

The crucial method we utilise to establish the weak approximation rate of the Euler-Maruyama scheme for stochastic equations with irregular drift is the Girsanov transformation. More precisely, given a reference stochastic equation, we construct equivalent expressions between the target stochastic equations and the associated numerical stochastic equations in another probability space by means of the Girsanov theorem.

For the McKean-Vlasov stochastic differential equation model, we first establish the moderate deviation principle for the law of the approximating stochastic differential equation via the weak convergence method. Subsequently, we show that the approximating stochastic equations and the McKean-Vlasov stochastic differential equations belong to the same exponentially equivalent family, and we then establish the moderate deviation principle for this model.

Based on the well-posedness result for McKean-Vlasov stochastic differential equations driven by fractional Brownian motion, and using Malliavin analysis, we first establish a general Bismut-type formula for the Lions derivative, and we then apply this result to the non-degenerate case of this model.
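As an illustration of the Euler-Maruyama scheme whose weak approximation rate the thesis studies (the drift, noise coefficient, horizon, and sample sizes below are arbitrary choices for the demonstration, not taken from the thesis), one can check weak convergence empirically on an Ornstein-Uhlenbeck equation, where the mean E[X_T] = x₀·e^(−T) is known in closed form:

```python
import math
import random

def euler_maruyama(x0, drift, sigma, T, n_steps, rng):
    """Simulate one Euler-Maruyama path of dX = drift(X) dt + sigma dW."""
    dt = T / n_steps
    x = x0
    for _ in range(n_steps):
        dw = rng.gauss(0.0, math.sqrt(dt))  # Brownian increment ~ N(0, dt)
        x = x + drift(x) * dt + sigma * dw
    return x

# Weak approximation concerns E[f(X_T)]; take f = identity for the
# OU equation dX = -X dt + 0.5 dW with X_0 = 1, so E[X_T] = e^{-T}.
rng = random.Random(0)
paths = [euler_maruyama(1.0, lambda x: -x, 0.5, 1.0, 100, rng)
         for _ in range(20000)]
mean_est = sum(paths) / len(paths)  # should be close to exp(-1) ~ 0.3679
```

The weak error here combines the O(dt) discretization bias with Monte Carlo noise; the thesis's contribution is establishing the discretization rate when the drift is irregular, via the Girsanov transformation rather than by direct Taylor-type expansions.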


Filomat ◽  
2021 ◽  
Vol 35 (4) ◽  
pp. 1191-1203
Author(s):  
Fengfeng Wang ◽  
Dansheng Yu

In the present paper, we introduce a new type of Bernstein-Durrmeyer operators preserving linear functions on a movable interval. The approximation rate of the new operators for continuous functions and a Voronovskaja-type asymptotic estimate are obtained.
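The paper's operators are not reproduced here; as a hedged sketch, the classical Bernstein operator on [0, 1], which the Bernstein-Durrmeyer family modifies, already exhibits the linear-function-preserving property the abstract highlights:

```python
from math import comb

def bernstein(f, n, x):
    """Classical Bernstein operator B_n(f)(x) on [0, 1]:
    B_n(f)(x) = sum_k C(n, k) x^k (1 - x)^(n - k) f(k / n)."""
    return sum(comb(n, k) * x**k * (1 - x)**(n - k) * f(k / n)
               for k in range(n + 1))

# Bernstein operators reproduce linear functions exactly:
# B_n(2t + 1)(0.3) = 2 * 0.3 + 1 = 1.6 for every n.
val = bernstein(lambda t: 2 * t + 1, 10, 0.3)
```

Preserving linear functions pins down the first two moments of the operator, which is what makes a Voronovskaja-type asymptotic expansion (a second-order error term) possible.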


2020 ◽  
pp. 870-874
Author(s):  
Hawraa Abbas Almurieb ◽  
Eman Samir Bhaya

Some researchers are interested in using the flexible and applicable properties of quadratic functions as activation functions for feedforward neural networks (FNNs). We study the essential approximation rate of any Lebesgue-integrable monotone function by a neural network with quadratic activation functions. The simultaneous degree of essential approximation is also studied. Both estimates are shown to be within the second-order modulus of smoothness.
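As a minimal sketch of a quadratic-activation network (the two-neuron architecture below is illustrative, not the construction from the paper), note that a linear combination of quadratic neurons can already reproduce the identity map, since ((x + 1)² − (x − 1)²) / 4 = x:

```python
def quad(x):
    """Quadratic activation function sigma(x) = x^2."""
    return x * x

def tiny_quad_net(x):
    # Hidden layer: two neurons with quadratic activation.
    h1 = quad(x + 1.0)
    h2 = quad(x - 1.0)
    # Linear output layer: recovers x exactly, since
    # ((x + 1)^2 - (x - 1)^2) / 4 = x.
    return 0.25 * h1 - 0.25 * h2

y = tiny_quad_net(3.7)  # equals 3.7
```

This polarization trick is why quadratic networks can emulate products and polynomials with few neurons, the flexibility the abstract refers to.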


2014 ◽  
Vol 2014 ◽  
pp. 1-10
Author(s):  
Gongqiang You

The works of Smale and Zhou (2003, 2007), Cucker and Smale (2002), and Cucker and Zhou (2007) indicate that approximation operators serve as cores of many machine learning algorithms. In this paper we study the Hermite-Fejér interpolation operator, which has this potential for applications. The interpolation is defined by the zeros of the Jacobi polynomials with parameters −1 < α, β < 0. An approximation rate is obtained for continuous functions. An asymptotic expression of the K-functional associated with the interpolation operators is given.
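For the Chebyshev case α = β = −1/2, which lies in the admissible range (−1, 0), the Hermite-Fejér interpolant has the classical closed form H_n(f, x) = Σ_k f(x_k)(1 − x·x_k)[T_n(x)/(n(x − x_k))]² at the nodes x_k = cos((2k − 1)π/(2n)). A minimal sketch (evaluation points must avoid the nodes themselves, where the formula has a removable singularity):

```python
import math

def hermite_fejer(f, n, x):
    """Hermite-Fejér interpolant at the n first-kind Chebyshev nodes,
    the Jacobi case alpha = beta = -1/2."""
    Tn = math.cos(n * math.acos(x))  # Chebyshev polynomial T_n(x)
    total = 0.0
    for k in range(1, n + 1):
        xk = math.cos((2 * k - 1) * math.pi / (2 * n))
        total += f(xk) * (1 - x * xk) * (Tn / (n * (x - xk))) ** 2
    return total

# The fundamental functions sum to 1, so constants are reproduced exactly.
val = hermite_fejer(lambda t: 1.0, 8, 0.3)  # equals 1
```

Because the weights (1 − x·x_k)[T_n(x)/(n(x − x_k))]² are nonnegative and sum to one, the operator is positive, which is the structural property behind its uniform convergence for continuous functions.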


PAMM ◽  
2013 ◽  
Vol 13 (1) ◽  
pp. 373-374 ◽  
Author(s):  
Alexander Vasilyev ◽  
Vladimir Vasilyev
