ESTIMATING THE APPROXIMATION ERROR IN LEARNING THEORY

2003 ◽  
Vol 01 (01) ◽  
pp. 17-41 ◽  
Author(s):  
STEVE SMALE ◽  
DING-XUAN ZHOU

Let B be a Banach space and (ℋ,‖·‖ℋ) be a dense, embedded subspace. For a ∈ B, its distance to the ball of ℋ with radius R (denoted I(a, R)) tends to zero as R tends to infinity. We are interested in the rate of this convergence. This approximation problem arose from the study of learning theory, where B is the L2 space and ℋ is a reproducing kernel Hilbert space. The class of elements having I(a, R) = O(R^{-r}) with r > 0 is an interpolation space of the couple (B, ℋ). The rate of convergence can often be realized by linear operators. In particular, this is the case when ℋ is the range of a compact, symmetric, and strictly positive definite linear operator on a separable Hilbert space B. For the kernel approximation studied in learning theory, the rate depends on the regularity of the kernel function. This yields error estimates for the approximation by reproducing kernel Hilbert spaces. When the kernel is smooth, the convergence is slow, and a logarithmic convergence rate is presented for analytic kernels in this paper. The purpose of our results is to provide some theoretical estimates, including the constants, for the approximation error required in learning theory.
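In symbols, the two quantities named in this abstract (written here as a restatement of the definitions above, with ‖·‖_B the ambient Banach norm) are

```latex
I(a, R) \;=\; \inf_{\substack{b \in \mathcal{H} \\ \|b\|_{\mathcal{H}} \le R}} \|a - b\|_{B},
\qquad
\bigl\{\, a \in B \;:\; I(a, R) = O(R^{-r}) \ \text{for some } r > 0 \,\bigr\}.
```

The second set is the class identified in the abstract as an interpolation space of the couple (B, ℋ).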

Filomat ◽  
2017 ◽  
Vol 31 (18) ◽  
pp. 5711-5717 ◽  
Author(s):  
Ulaş Yamancı ◽  
Mehmet Gürdal ◽  
Mubariz Garayev

By using the Hardy–Hilbert inequality, some power inequalities for the Berezin number of selfadjoint operators in reproducing kernel Hilbert spaces (RKHSs), with applications to convex functions, are given.


Author(s):  
Ulaş Yamancı ◽  
Mehmet Gürdal

A reproducing kernel Hilbert space (shortly, RKHS) H=H(Ω) on a set Ω is a Hilbert space of complex-valued functions on Ω such that for every λ∈Ω the linear functional (evaluation functional) f→f(λ) is bounded on H. If H is an RKHS on a set Ω, then by the classical Riesz representation theorem, for every λ∈Ω there is a unique element kH,λ∈H such that f(λ)=〈f,kH,λ〉 for all f∈H. The family {kH,λ:λ∈Ω} is called the reproducing kernel of the space H. The Berezin set and the Berezin number of an operator A were introduced by Karaev in [26] as Ber(A)={Ã(λ):λ∈Ω} and ber(A):=sup{|Ã(λ)|:λ∈Ω}, where Ã(λ)=〈Ak̂λ,k̂λ〉 is the Berezin symbol of A and k̂λ:=kH,λ/‖kH,λ‖ is the normalized reproducing kernel. In this chapter, the authors give Berezin number inequalities for an invertible operator and study some other related results. They also obtain some Slater-type inequalities for convex functions of selfadjoint operators in reproducing kernel Hilbert spaces and examine related results.
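The Berezin symbol is easy to compute concretely when Ω is a finite set, so an RKHS on Ω is just C^n with the inner product induced by an invertible kernel (Gram) matrix K. The following sketch is ours, not the chapter's; it assumes K is Hermitian positive definite, in which case k_λ = K[:, λ] and a short computation gives Ã(λ) = (AK)[λ, λ] / K[λ, λ].

```python
import numpy as np

def berezin(A, K):
    """Berezin symbol and Berezin number of A on the finite RKHS over
    Omega = {0, ..., n-1} with Hermitian positive definite kernel matrix K.

    Model: the RKHS inner product is <f, g> = g* K^{-1} f and the kernel
    function at lambda is k_lambda = K[:, lambda], so
        A~(lambda) = <A k_lambda, k_lambda> / <k_lambda, k_lambda>
                   = (A K)[lambda, lambda] / K[lambda, lambda].
    """
    sym = np.diag(A @ K) / np.diag(K)   # Berezin symbol on Omega
    return sym, np.max(np.abs(sym))     # Ber(A) values and ber(A)

# With K = I the symbol reduces to the diagonal of A.
A = np.array([[1.0, 2.0], [3.0, 4.0]])
sym, ber = berezin(A, np.eye(2))
print(sym, ber)   # [1. 4.] 4.0
```

Note that the identity operator always has Berezin symbol identically 1, a quick sanity check for any choice of K.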


2016 ◽  
Vol 3 (1) ◽  
Author(s):  
Nicola Arcozzi ◽  
Pavel Mozolyako ◽  
Karl-Mikael Perfekt ◽  
Stefan Richter ◽  
Giulia Sarfatti

We study the reproducing kernel Hilbert space with kernel k


2006 ◽  
Vol 11 (3) ◽  
pp. 331-346 ◽  
Author(s):  
S. B. Yakubovich

We study certain isometries between Hilbert spaces which are generated by the bilateral Laplace transform. In particular, we construct an analog of the Bargmann–Fock type reproducing kernel Hilbert space related to this transformation. It is shown that under some integrability conditions on f, the Laplace transform F(z), z = x + iy, is an entire function belonging to this space. The corresponding isometrical identities, representations of norms, and analogs of the Paley–Wiener and Plancherel theorems are established. As an application, this approach leads us to a different type of real inversion formulas for the bilateral Laplace transform in the mean convergence sense.
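The bilateral Laplace transform in question is F(z) = ∫_{-∞}^{∞} e^{-zt} f(t) dt. As a quick numerical illustration of the entirety claim (the Gaussian example and the truncation parameters below are our choices, not the paper's), for f(t) = e^{-t²} completing the square gives F(z) = √π · e^{z²/4}, which is entire:

```python
import numpy as np

def bilateral_laplace(f, z, T=10.0, n=200001):
    """Approximate F(z) = int_{-inf}^{inf} exp(-z*t) f(t) dt for real z
    by the trapezoidal rule on the truncated interval [-T, T]."""
    t = np.linspace(-T, T, n)
    y = np.exp(-z * t) * f(t)
    h = t[1] - t[0]
    return h * (y.sum() - 0.5 * (y[0] + y[-1]))

# For f(t) = exp(-t**2), completing the square in the exponent gives
# F(z) = sqrt(pi) * exp(z**2 / 4).
z = 1.5
numeric = bilateral_laplace(lambda t: np.exp(-t ** 2), z)
exact = np.sqrt(np.pi) * np.exp(z ** 2 / 4)
print(numeric, exact)
```

The rapid decay of the Gaussian makes the truncation to [-10, 10] harmless; heavier-tailed f would need a larger window and a strip of admissible z.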


2005 ◽  
Vol 17 (1) ◽  
pp. 177-204 ◽  
Author(s):  
Charles A. Micchelli ◽  
Massimiliano Pontil

In this letter, we provide a study of learning in a Hilbert space of vector-valued functions. We motivate the need for extending learning theory of scalar-valued functions by practical considerations and establish some basic results for learning vector-valued functions that should prove useful in applications. Specifically, we allow an output space Y to be a Hilbert space, and we consider a reproducing kernel Hilbert space of functions whose values lie in Y. In this setting, we derive the form of the minimal norm interpolant to a finite set of data and apply it to study some regularization functionals that are important in learning theory. We consider specific examples of such functionals corresponding to multiple-output regularization networks and support vector machines, for both regression and classification. Finally, we provide classes of operator-valued kernels of the dot product and translation-invariant type.
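The minimal norm interpolant described above can be sketched numerically in the simplest vector-valued setting: a separable operator-valued kernel K(x, x') = k(x, x')·B, with k a scalar Gaussian kernel and B a symmetric positive definite output matrix (this particular kernel choice is ours, for illustration only). The interpolant has the form f(x) = Σ_j k(x, x_j)·B·c_j, with coefficients obtained from a block linear system:

```python
import numpy as np

def fit_interpolant(X, Y, B, gamma=1.0):
    """Minimal-norm interpolant in a vector-valued RKHS with separable
    kernel K(x, x') = k(x, x') * B, where k(x, x') = exp(-gamma (x-x')^2)
    and B is symmetric positive definite (outputs in R^d).

    Returns coefficients c_j with f(x) = sum_j k(x, x_j) * B @ c_j.
    """
    n, d = Y.shape
    k = np.exp(-gamma * np.subtract.outer(X, X) ** 2)  # scalar Gram matrix
    G = np.kron(k, B)                                  # block (operator) Gram matrix
    c = np.linalg.solve(G, Y.ravel())                  # interpolation conditions
    return c.reshape(n, d)

def predict(x, X, C, B, gamma=1.0):
    kx = np.exp(-gamma * (x - X) ** 2)
    return sum(kx[j] * B @ C[j] for j in range(len(X)))

# Toy data: 3 scalar inputs, outputs in R^2.
X = np.array([0.0, 1.0, 2.0])
Y = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
B = np.array([[2.0, 0.5], [0.5, 1.0]])
C = fit_interpolant(X, Y, B)
print(predict(X[1], X, C, B))   # reproduces Y[1] = [0, 1] up to round-off
```

Since k is positive definite at distinct points and B is positive definite, the Kronecker product Gram matrix is invertible, so the interpolation conditions determine the coefficients uniquely.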


2017 ◽  
Vol 29 (05) ◽  
pp. 1750017
Author(s):  
K. Thirulogasanthar ◽  
S. Twareque Ali

A general theory of reproducing kernels and reproducing kernel Hilbert spaces on a right quaternionic Hilbert space is presented. Positive operator-valued measures and their connection to a class of generalized quaternionic coherent states are examined. A Naimark type extension theorem associated with the positive operator-valued measures is proved in a right quaternionic Hilbert space. As illustrative examples, real, complex and quaternionic reproducing kernels and reproducing kernel Hilbert spaces arising from Hermite and Laguerre polynomials are presented. In particular, in the Laguerre case, the Naimark type extension theorem on the associated quaternionic Hilbert space is indicated.


2019 ◽  
Vol 18 (03) ◽  
pp. 423-446
Author(s):  
Bao-Huai Sheng ◽  
Jian-Li Wang

K-functionals are used in the learning theory literature to study approximation errors in kernel-based regularization schemes. In this paper, we study the approximation error and K-functionals in L^p spaces with 1 ≤ p ≤ ∞. To this end, we give a new viewpoint on a reproducing kernel Hilbert space (RKHS) from a fractional derivative and treat powers of the induced integral operator as fractional derivatives of various orders. Then a generalized translation operator is defined by Fourier multipliers, with which a generalized modulus of smoothness is defined. Some general strong equivalent relations between the moduli of smoothness and the K-functionals are established. As applications, some strong equivalent relations between these two families of quantities on the unit sphere and the unit ball are provided explicitly.
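In textbook notation (a restatement of the standard Peetre definitions, not of this paper's specific spaces), the K-functional for a couple (L^p, W), with W a smoothness subspace carrying a seminorm |·|_W, and the strong equivalence with a modulus of smoothness ω_r read

```latex
K(f, t)_p \;=\; \inf_{g \in W} \bigl\{\, \|f - g\|_p + t\,|g|_W \,\bigr\},
\qquad
c_1\,\omega_r(f, t)_p \;\le\; K(f, t^r)_p \;\le\; c_2\,\omega_r(f, t)_p ,
```

with constants c_1, c_2 independent of f and t. Results of the second type are what the abstract calls "strong equivalent relations."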


2019 ◽  
Vol 52 (1) ◽  
pp. 467-474
Author(s):  
Srijanani Anurag Prasad

Reproducing Kernel Hilbert Spaces (RKHS) and their kernels are important tools which have proved useful in many areas such as machine learning, complex analysis, probability theory, group representation theory and the theory of integral operators. In the present paper, the space of Coalescence Hidden-variable Fractal Interpolation Functions (CHFIFs) is demonstrated to be an RKHS and its associated kernel is derived. This extends the possibility of using this new kernel function, which is partly self-affine and partly non-self-affine, in diverse fields wherein the structure is not always self-affine.

