Reproducing kernel‐based functional linear expectile regression

Author(s):  
Meichen Liu ◽  
Matthew Pietrosanu ◽  
Peng Liu ◽  
Bei Jiang ◽  
Xingcai Zhou ◽  
...  

2020 ◽  
Vol 2020 (2) ◽  
pp. 76-84
Author(s):  
G.P. Ismatullaev ◽  
S.A. Bakhromov ◽  
R. Mirzakabilov

Author(s):  
Michael T Jury ◽  
Robert T W Martin

Abstract We extend the Lebesgue decomposition of positive measures with respect to Lebesgue measure on the complex unit circle to the non-commutative (NC) multi-variable setting of (positive) NC measures. These are positive linear functionals on a certain self-adjoint subspace of the Cuntz–Toeplitz $C^{\ast}$-algebra, the $C^{\ast}$-algebra of the left creation operators on the full Fock space. This theory is fundamentally connected to the representation theory of the Cuntz and Cuntz–Toeplitz $C^{\ast}$-algebras; any $\ast$-representation of the Cuntz–Toeplitz $C^{\ast}$-algebra is obtained, up to unitary equivalence, by applying a Gelfand–Naimark–Segal construction to a positive NC measure. Our approach combines the theory of Lebesgue decomposition of sesquilinear forms in Hilbert space, Lebesgue decomposition of row isometries, free semigroup algebra theory, NC reproducing kernel Hilbert space theory, and NC Hardy space theory.
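For context, the Gelfand–Naimark–Segal construction invoked in the abstract proceeds, in outline, as follows. This is the standard sketch for a positive linear functional on a $C^{\ast}$-algebra, not the specific NC-measure version developed in the paper:

```latex
% Given a positive linear functional \mu on a C*-algebra A,
% define a pre-inner product on A by
\langle a, b \rangle_\mu := \mu(b^{\ast} a), \qquad a, b \in A.
% Quotient by the null space N_\mu = \{ a \in A : \mu(a^{\ast} a) = 0 \}
% and complete to obtain a Hilbert space H_\mu.  Left multiplication,
\pi_\mu(a)\,[b]_{N_\mu} := [ab]_{N_\mu},
% extends to a bounded operator on H_\mu, and a \mapsto \pi_\mu(a)
% is the *-representation of A associated to \mu.
```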


2021 ◽  
Vol 14 (2) ◽  
pp. 201-214
Author(s):  
Danilo Croce ◽  
Giuseppe Castellucci ◽  
Roberto Basili

In recent years, Deep Learning methods have become very popular in classification tasks for Natural Language Processing (NLP); this is mainly due to their ability to reach high performance by relying on very simple input representations, i.e., raw tokens. One of the drawbacks of deep architectures is the large amount of annotated data required for effective training. Usually, in Machine Learning this problem is mitigated by the use of semi-supervised methods or, more recently, by using Transfer Learning in the context of deep architectures. One recent promising method to enable semi-supervised learning in deep architectures has been formalized within Semi-Supervised Generative Adversarial Networks (SS-GANs) in the context of Computer Vision. In this paper, we adopt the SS-GAN framework to enable semi-supervised learning in the context of NLP. We demonstrate how an SS-GAN can boost the performance of simple architectures when operating in expressive low-dimensional embeddings; these are derived by combining the unsupervised approximation of linguistic Reproducing Kernel Hilbert Spaces and the so-called Universal Sentence Encoders. We experimentally evaluate the proposed approach over a semantic classification task, i.e., Question Classification, by considering different sizes of training material and different numbers of target classes. By applying such an adversarial scheme to a simple Multi-Layer Perceptron, a classifier trained over a subset derived from 1% of the original training material achieves 92% accuracy. Moreover, when considering a complex classification scheme, e.g., involving 50 classes, the proposed method outperforms state-of-the-art alternatives such as BERT.
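The SS-GAN framework mentioned above trains a discriminator over $K+1$ classes: the $K$ real classes plus a "fake" class for generated samples. A minimal NumPy sketch of the resulting discriminator losses is below; the function names and the NumPy-only setting are illustrative assumptions, not the authors' implementation (which uses a trained MLP over sentence embeddings):

```python
import numpy as np

def log_sum_exp(logits):
    """Numerically stable log(sum_k exp(logit_k)) per row."""
    m = logits.max(axis=1, keepdims=True)
    return (m + np.log(np.exp(logits - m).sum(axis=1, keepdims=True))).squeeze(1)

def ssgan_discriminator_losses(logits_lab, labels, logits_unl, logits_fake):
    """SS-GAN (K+1)-class discriminator losses.

    Each `logits_*` array holds the K real-class logits; the logit of the
    extra "fake" class is fixed at 0, so the probability of being real is
    D(x) = Z(x) / (Z(x) + 1) with Z(x) = sum_k exp(logit_k).
    """
    # Supervised term: standard cross-entropy over the K real classes.
    z_lab = log_sum_exp(logits_lab)
    l_supervised = np.mean(z_lab - logits_lab[np.arange(len(labels)), labels])
    # Unsupervised term: real unlabeled data should look real
    # (-log D(x) = softplus(z) - z), generated data should look fake
    # (-log(1 - D(x)) = softplus(z)).
    z_unl, z_fake = log_sum_exp(logits_unl), log_sum_exp(logits_fake)
    l_unsupervised = np.mean(np.log1p(np.exp(z_unl)) - z_unl) \
                   + np.mean(np.log1p(np.exp(z_fake)))
    return l_supervised, l_unsupervised
```

The generator is then trained against this discriminator in the usual adversarial fashion; only the supervised term touches the (small) labeled subset, which is what makes the 1%-of-training-data regime workable.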


2021 ◽  
Author(s):  
Alexander Seipp ◽  
Verena Uslar ◽  
Dirk Weyhe ◽  
Antje Timmer ◽  
Fabian Otto‐Sobotka

Author(s):  
Wei Jiang ◽  
Zhong Chen ◽  
Ning Hu ◽  
Yali Chen

Abstract In recent years, the study of fractional differential equations has become an active area of research. Fractional differential equations with nonlocal boundary conditions are particularly difficult to solve. In this article, we propose a multiscale orthonormal bases collocation method for linear fractional-order nonlocal boundary value problems. In the algorithm's construction, the solution is expanded in the multiscale orthonormal bases of a reproducing kernel space. The nonlocal boundary conditions are transformed into operator equations, which enter the computation of the collocation coefficients as constraint conditions. In theory, the convergence order and stability analysis of the proposed method are presented rigorously. Finally, numerical examples show the stability, accuracy and effectiveness of the method.
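The overall structure of such a collocation scheme can be illustrated with a deliberately simplified toy: an integer-order equation in place of a fractional one, and a monomial basis in place of the paper's multiscale orthonormal reproducing-kernel bases. The point is only the mechanics: enforce the equation at collocation points, append the local and nonlocal conditions as extra rows, and solve for the expansion coefficients:

```python
import numpy as np

def solve_collocation(n_basis=8, n_colloc=20):
    """Toy collocation solver for u''(x) = 2 - 6x on [0, 1] with
    u(0) = 0 and the nonlocal condition int_0^1 u(s) ds = 1/12,
    whose exact solution is u(x) = x^2 - x^3.

    The solution is expanded as u(x) = sum_j c_j x^j; the ODE is
    collocated at n_colloc points, the boundary and nonlocal conditions
    are appended as extra rows, and the system is solved by least squares.
    """
    x = np.linspace(0.0, 1.0, n_colloc)
    j = np.arange(n_basis)
    # ODE rows: d^2/dx^2 x^j = j(j-1) x^(j-2)  (zero for j < 2).
    A_ode = np.where(j >= 2, j * (j - 1) * x[:, None] ** np.maximum(j - 2, 0), 0.0)
    b_ode = 2.0 - 6.0 * x
    # Local condition u(0) = 0 picks out the j = 0 coefficient.
    row_bc = np.where(j == 0, 1.0, 0.0)
    # Nonlocal condition: int_0^1 x^j dx = 1/(j+1).
    row_nl = 1.0 / (j + 1.0)
    A = np.vstack([A_ode, row_bc, row_nl])
    b = np.concatenate([b_ode, [0.0], [1.0 / 12.0]])
    coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)
    return coeffs

coeffs = solve_collocation()
u = lambda x: sum(c * x**k for k, c in enumerate(coeffs))
```

In the paper's setting the differentiation rows would instead apply a fractional-derivative operator to the basis, and the basis itself would be orthonormal in the reproducing kernel space; the assembly of equation rows plus constraint rows is the same.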


2021 ◽  
Vol 2021 (1) ◽  
Author(s):  
Mohammed Al-Smadi ◽  
Nadir Djeddi ◽  
Shaher Momani ◽  
Shrideh Al-Omari ◽  
Serkan Araci

Abstract Our aim in this paper is to present an attractive numerical approach that gives an accurate solution to the nonlinear fractional Abel differential equation, based on a reproducing kernel algorithm for a model endowed with the Caputo–Fabrizio fractional derivative. By means of this approach, we utilize the Gram–Schmidt orthogonalization process to create an orthonormal set of bases that leads to an appropriate solution in the Hilbert space $\mathcal{H}^{2}[a,b]$. We investigate and discuss the stability and convergence of the proposed method. The n-term series solution converges uniformly to the analytic solution. We present several numerical examples of potential interest to illustrate the reliability, efficacy, and performance of the method under the influence of the Caputo–Fabrizio derivative. The results obtained show the superiority of the reproducing kernel algorithm and its high accuracy, with minimal time and effort, in solving the fractional Abel-type model. Therefore, the proposed algorithm is an alternative and systematic tool for analyzing the behavior of many nonlinear temporal fractional differential equations emerging in the fields of engineering, physics, and the sciences.
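The Gram–Schmidt step mentioned above can be sketched numerically. The following is a toy stand-in that orthonormalizes functions with respect to a discretized $L^2$ inner product (trapezoidal rule); in the paper's algorithm, the inner product would instead be that of the reproducing kernel space $\mathcal{H}^{2}[a,b]$, applied to kernel-derived basis functions:

```python
import numpy as np

def gram_schmidt_l2(funcs, a=0.0, b=1.0, n=200):
    """Orthonormalize a list of functions on [a, b] with respect to a
    discretized L^2 inner product (trapezoidal quadrature weights).

    Returns the grid, the quadrature weights, and the orthonormalized
    basis sampled on the grid.
    """
    x = np.linspace(a, b, n)
    w = np.full(n, (b - a) / (n - 1))
    w[0] *= 0.5
    w[-1] *= 0.5  # trapezoidal rule: half weight at the endpoints
    inner = lambda u, v: np.sum(w * u * v)
    basis = []
    for f in funcs:
        v = np.asarray(f(x), dtype=float)
        # Subtract projections onto the already-orthonormalized functions.
        for q in basis:
            v = v - inner(v, q) * q
        basis.append(v / np.sqrt(inner(v, v)))
    return x, w, basis
```

Applied to $1, x, x^2$ on $[0,1]$, this reproduces (up to the quadrature) the first few shifted Legendre polynomials, normalized; the same loop with the RKHS inner product yields the orthonormal bases the abstract refers to.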

