Simultaneous estimations of optimal directions and optimal transformations for functional data

Author(s):  
Heng Chen ◽  
Wei Huang ◽  
Di-Rong Chen

Sliced inverse regression (SIR) is a powerful method for dealing with dimension reduction models. As is well known, SIR is equivalent to a transformation-based projection pursuit whose optimal directions are exactly the SIR directions. In this paper, we consider the simultaneous estimation of optimal directions and optimal transformations for functional data. We take a reproducing kernel Hilbert space approach: both the directions and the transformations are chosen from reproducing kernel Hilbert spaces. A learning rate is established for the estimators.
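The abstract works in the functional, RKHS-based setting; the classical multivariate form of SIR that it builds on can be sketched directly. The following is a minimal illustration on synthetic data (all names and parameters are illustrative, not the authors' estimator):

```python
import numpy as np

def sir_direction(X, y, n_slices=10):
    """Classical sliced inverse regression: estimate the leading direction
    beta such that y depends on X mainly through X @ beta."""
    n, p = X.shape
    # Standardize X: Z = (X - mean) @ Sigma^{-1/2}
    Xc = X - X.mean(axis=0)
    cov = Xc.T @ Xc / n
    evals, evecs = np.linalg.eigh(cov)
    inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = Xc @ inv_sqrt
    # Slice on sorted y and average the standardized covariates per slice
    order = np.argsort(y)
    M = np.zeros((p, p))
    for chunk in np.array_split(order, n_slices):
        m = Z[chunk].mean(axis=0)
        M += (len(chunk) / n) * np.outer(m, m)
    # Leading eigenvector of the slice-mean covariance, mapped back
    top = np.linalg.eigh(M)[1][:, -1]
    beta = inv_sqrt @ top
    return beta / np.linalg.norm(beta)

rng = np.random.default_rng(0)
X = rng.standard_normal((2000, 5))
beta_true = np.array([1.0, 2.0, 0.0, 0.0, -1.0])
beta_true /= np.linalg.norm(beta_true)
y = (X @ beta_true) ** 3 + 0.1 * rng.standard_normal(2000)
beta_hat = sir_direction(X, y)
print(abs(beta_hat @ beta_true))  # close to 1: direction recovered up to sign
```

The functional version replaces the covariate vector by a curve and searches for directions in an RKHS, but the slice-and-eigendecompose structure is the same.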

2019 ◽  
Vol 52 (1) ◽  
pp. 467-474
Author(s):  
Srijanani Anurag Prasad

Abstract. Reproducing kernel Hilbert spaces (RKHS) and their kernels are important tools that have proved useful in many areas, such as machine learning, complex analysis, probability theory, group representation theory, and the theory of integral operators. In the present paper, the space of Coalescence Hidden-variable Fractal Interpolation Functions (CHFIFs) is demonstrated to be an RKHS and its associated kernel is derived. This opens the possibility of using this new kernel function, which is partly self-affine and partly non-self-affine, in diverse fields where the structure is not always self-affine.


2013 ◽  
Vol 56 (2) ◽  
pp. 400-406
Author(s):  
Bebe Prunaru

Abstract. Let (X, B, μ) be a σ-finite measure space and let H ⊂ L2(X, μ) be a separable reproducing kernel Hilbert space on X. We show that the multiplier algebra of H has property (A1(1)).


2021 ◽  
Vol 15 (5) ◽  
Author(s):  
Monika Drewnik ◽  
Tomasz Miller ◽  
Zbigniew Pasternak-Winiarski

Abstract. The aim of the paper is to create a link between the theory of reproducing kernel Hilbert spaces (RKHS) and the notion of a unitary representation of a group or of a groupoid. More specifically, it is demonstrated on one hand how to construct a positive definite kernel and an RKHS for a given unitary representation of a group(oid), and on the other hand how to retrieve the unitary representation of a group or a groupoid from a positive definite kernel defined on that group(oid) with the help of the Moore–Aronszajn theorem. The kernel constructed from the group(oid) representation is inspired by the kernel defined in terms of the convolution of functions on a locally compact group. Several illustrative examples of reproducing kernels related to unitary representations of groupoids are discussed in detail. The paper concludes with a brief overview of possible applications of the proposed constructions.
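The representation-to-kernel direction can be demonstrated concretely in a toy case: take the cyclic group Z_n acting unitarily on R^2 by rotations, pick a vector v, and form K(g, h) = ⟨π(g)v, π(h)v⟩, which is positive semidefinite on the group (the setup below is illustrative; the paper works with general group(oid)s):

```python
import numpy as np

n = 6  # the cyclic group Z_6

def rep(j):
    """Unitary representation of Z_n on R^2: rotation by angle 2*pi*j/n."""
    t = 2 * np.pi * j / n
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, -s], [s, c]])

v = np.array([1.0, 0.0])  # an arbitrary fixed vector
vecs = [rep(j) @ v for j in range(n)]
# Kernel built from the representation: K(g, h) = <pi(g) v, pi(h) v>
K = np.array([[vecs[g] @ vecs[h] for h in range(n)] for g in range(n)])

# Positive semidefiniteness (Moore-Aronszajn then yields an RKHS on Z_n)
print(np.linalg.eigvalsh(K).min() >= -1e-10)  # True
# Invariance: K(g, h) depends only on g - h (mod n)
print(np.allclose(K[0, 1], K[2, 3]))          # True
```

The invariance check reflects the convolution-type structure mentioned in the abstract: for a unitary representation, K(g, h) is a function of the "difference" of the group elements.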


Author(s):  
Ali Akgül ◽  
Mir Sajjad Hashemi ◽  
Negar Seyfi

In this paper, we investigate nonlinear boundary value problems by the reproducing kernel Hilbert space technique. We construct some reproducing kernel Hilbert spaces and define a bounded linear operator to obtain the solutions of the problems. We demonstrate our numerical results in tables and compare them with results existing in the literature to show the efficiency of the proposed method.


2003 ◽  
Vol 01 (01) ◽  
pp. 17-41 ◽  
Author(s):  
STEVE SMALE ◽  
DING-XUAN ZHOU

Let B be a Banach space and (ℋ, ‖·‖ℋ) be a dense, embedded subspace. For a ∈ B, its distance to the ball of ℋ with radius R (denoted I(a, R)) tends to zero as R tends to infinity. We are interested in the rate of this convergence. This approximation problem arose from the study of learning theory, where B is the L2 space and ℋ is a reproducing kernel Hilbert space. The class of elements having I(a, R) = O(R^{-r}) with r > 0 is an interpolation space of the couple (B, ℋ). The rate of convergence can often be realized by linear operators. In particular, this is the case when ℋ is the range of a compact, symmetric, and strictly positive definite linear operator on a separable Hilbert space B. For the kernel approximation studied in learning theory, the rate depends on the regularity of the kernel function. This yields error estimates for the approximation by reproducing kernel Hilbert spaces. When the kernel is smooth, the convergence is slow, and a logarithmic convergence rate is presented for analytic kernels in this paper. The purpose of our results is to provide theoretical estimates, including the constants, for the approximation error required in learning theory.
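The central quantity of the abstract can be written out explicitly; the following is a sketch of the definitions as stated there (the label for the decay class is introduced here only for readability):

```latex
% Distance from a \in B to the ball of radius R in the dense subspace H:
\[
  I(a, R) \;=\; \inf_{\|f\|_{\mathcal{H}} \le R} \|a - f\|_{B},
  \qquad I(a, R) \longrightarrow 0 \quad (R \to \infty).
\]
% The class studied in the paper: elements with polynomial decay rate r,
\[
  \bigl\{\, a \in B \;:\; I(a, R) = O(R^{-r}) \,\bigr\}, \qquad r > 0,
\]
% which is characterized as an interpolation space of the couple (B, H).
```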


Filomat ◽  
2017 ◽  
Vol 31 (18) ◽  
pp. 5711-5717 ◽  
Author(s):  
Ulaş Yamancı ◽  
Mehmet Gürdal ◽  
Mubariz Garayev

By using Hardy–Hilbert's inequality, some power inequalities for the Berezin number of selfadjoint operators in reproducing kernel Hilbert spaces (RKHSs), with applications to convex functions, are given.


Author(s):  
Ulaş Yamancı ◽  
Mehmet Gürdal

A reproducing kernel Hilbert space (shortly, RKHS) H=H(Ω) on some set Ω is a Hilbert space of complex-valued functions on Ω such that for every λ∈Ω the linear functional (evaluation functional) f→f(λ) is bounded on H. If H is an RKHS on a set Ω, then, by the classical Riesz representation theorem, for every λ∈Ω there is a unique element kH,λ∈H such that f(λ)=〈f,kH,λ〉 for all f∈H. The family {kH,λ:λ∈Ω} is called the reproducing kernel of the space H. The Berezin symbol of an operator A is the function Ã(λ):=〈Ak̂H,λ,k̂H,λ〉, where k̂H,λ=kH,λ/‖kH,λ‖ is the normalized reproducing kernel. The Berezin set and the Berezin number of the operator A were introduced by Karaev in [26] as Ber(A)={Ã(λ):λ∈Ω} and ber(A):=sup{|Ã(λ)|:λ∈Ω}. In this chapter, the authors give Berezin number inequalities for an invertible operator and study some related results. They also obtain some Slater-type inequalities for convex functions of selfadjoint operators in reproducing kernel Hilbert spaces and examine related results.
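The reproducing property and the Berezin symbol can both be checked numerically in a finite-dimensional model of an RKHS: take Ω = {0, …, n−1}, identify functions on Ω with vectors in R^n, and let a positive definite Gram matrix play the role of the kernel (this toy setup is an illustration, not taken from the chapter):

```python
import numpy as np

# Model RKHS: <f, g>_H = g^T Kmat^{-1} f, so the columns
# k_lam = Kmat[:, lam] reproduce point evaluation: <f, k_lam>_H = f(lam).
rng = np.random.default_rng(1)
n = 5
pts = np.linspace(0, 1, n)
Kmat = np.exp(-(pts[:, None] - pts[None, :]) ** 2)  # Gaussian Gram matrix
Kinv = np.linalg.inv(Kmat)

# Reproducing property
f = rng.standard_normal(n)
lam = 2
k_lam = Kmat[:, lam]
print(np.isclose(k_lam @ Kinv @ f, f[lam]))  # True

# Berezin symbol of an operator A: A~(lam) = <A k^_lam, k^_lam>_H with
# the normalized kernel; here it reduces to (A Kmat)[lam, lam] / Kmat[lam, lam]
A = rng.standard_normal((n, n))
berezin = np.array([
    (Kmat[:, l] @ Kinv @ (A @ Kmat[:, l])) / Kmat[l, l] for l in range(n)
])
ber_A = np.abs(berezin).max()  # the Berezin number ber(A)

# Sanity check: the Berezin symbol of the identity is identically 1
eye_sym = np.array([
    (Kmat[:, l] @ Kinv @ Kmat[:, l]) / Kmat[l, l] for l in range(n)
])
print(np.allclose(eye_sym, 1.0))  # True
```

The inequalities studied in the chapter bound ber(A) in terms of other operator quantities; the computation above shows only what the objects are, not the inequalities themselves.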

