kernel representation
Recently Published Documents

TOTAL DOCUMENTS: 55 (FIVE YEARS 8)
H-INDEX: 9 (FIVE YEARS 2)

2021 ◽  Vol 110 ◽  pp. 107593
Author(s): Hao Wang ◽  Qilong Wang ◽  Peihua Li ◽  Wangmeng Zuo

2020 ◽  Vol 22 (8) ◽  pp. 1985-1997
Author(s): Yongyong Chen ◽  Xiaolin Xiao ◽  Yicong Zhou

Mathematics ◽  2020 ◽  Vol 8 (4) ◽  pp. 535
Author(s): Savin Treanţă

In this paper, by using the characteristic system method, the kernel of a polynomial differential equation involving a derivation in R^n is described by solving the Cauchy problem for the corresponding first-order system of PDEs. Moreover, the kernel representation has a special significance on the space of solutions to the corresponding system of PDEs. As an important application, it is established that the mathematical framework developed in this work can be used to study certain second-order PDEs involving a finite set of derivations.
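The characteristic-system idea behind this kernel representation can be sketched as follows (notation ours, a hedged outline rather than the paper's exact statement):

```latex
% For a derivation D acting on smooth functions on R^n,
%   D = \sum_{i=1}^{n} a_i(x) \, \frac{\partial}{\partial x_i},
% the kernel consists of the first integrals of the characteristic system:
\ker D = \{ f : Df = 0 \}, \qquad
\frac{\mathrm{d}x_i}{\mathrm{d}t} = a_i(x), \quad i = 1, \dots, n.
```

A function f lies in the kernel exactly when it is constant along every solution curve of this ODE system, which is why solving the associated Cauchy problem describes the kernel.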


2019 ◽  Vol 18 (01) ◽  pp. 149-183
Author(s): Saverio Salzo ◽  Johan A. K. Suykens

In this paper, we study the variational problem associated with support vector regression in Banach function spaces. Using Fenchel–Rockafellar duality theory, we give an explicit formulation of the dual problem as well as of the related optimality conditions. Moreover, we provide a new computational framework for solving the problem which relies on a tensor-kernel representation. This analysis overcomes the typical difficulties connected to learning in Banach spaces. We finally present a large class of tensor kernels to which our theory fully applies: power series tensor kernels. This type of kernel describes Banach spaces of analytic functions and includes generalizations of the exponential and polynomial kernels as well as, in the complex case, generalizations of the Szegö and Bergman kernels.
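As a minimal sketch of the power-series idea (scalar dot-product kernels rather than the paper's tensor kernels, and with coefficients chosen by us for illustration): a kernel k(x, y) = Σ_m c_m ⟨x, y⟩^m recovers the exponential kernel when c_m = 1/m!.

```python
import math
import numpy as np

# Hedged sketch, not the paper's tensor-kernel construction: a power-series
# (dot-product) kernel k(x, y) = sum_m c_m <x, y>^m.
def power_series_kernel(x, y, coeffs):
    s = float(np.dot(x, y))
    return sum(c * s ** m for m, c in enumerate(coeffs))

x = np.array([0.3, -0.2])
y = np.array([0.1, 0.4])

# With c_m = 1/m!, the truncated series matches the exponential kernel.
coeffs = [1.0 / math.factorial(m) for m in range(20)]
print(abs(power_series_kernel(x, y, coeffs) - np.exp(np.dot(x, y))) < 1e-10)  # → True
```

Other coefficient sequences give the polynomial kernels (finitely many nonzero c_m) and, per the abstract, the Szegö and Bergman kernels in the complex case.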


2019 ◽  Vol 13 ◽  pp. 174830261987399
Author(s): Chu Li ◽  Xiao-Jun Wu

In the field of pattern recognition, representing an image set by a symmetric positive-definite (SPD) matrix has been widely studied, and sparse representation-based classification on the SPD matrix manifold has attracted great attention in recent years. However, existing kernel representation-based classification methods usually rely on the kernel trick with an implicit kernel to rewrite the optimization function, which brings certain limitations. To address them, a neighborhood-preserving explicit kernel representation-based classification method is proposed on the SPD manifold: the SPD matrices are embedded into a reproducing kernel Hilbert space through an explicit kernel map constructed with the Nyström method, so that the characteristics of the kernel space can be fully exploited. Experimental results demonstrate the superior performance of our method on the task of image set classification.
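The Nyström idea of an explicit kernel map can be sketched generically (an RBF kernel on plain vectors here, not the SPD-manifold kernel of the paper; all names are ours): eigendecompose the kernel matrix on a few landmark points, then map every sample to finite-dimensional features whose inner products approximate the kernel.

```python
import numpy as np

# Hedged, generic Nystrom sketch (RBF on vectors, not the paper's SPD kernel).
def rbf(X, Y, gamma=1.0):
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
landmarks = X[:10]                       # m landmark points

W = rbf(landmarks, landmarks)            # m x m kernel matrix on landmarks
vals, vecs = np.linalg.eigh(W)
vals = np.maximum(vals, 1e-12)           # guard against tiny eigenvalues
mapping = vecs / np.sqrt(vals)           # U * Lambda^{-1/2}

# Explicit n x m feature map: phi(x) = Lambda^{-1/2} U^T k(x, landmarks)
Phi = rbf(X, landmarks) @ mapping
K_approx = Phi @ Phi.T                   # approximates the full n x n kernel

# On the landmarks themselves the Nystrom map reproduces W exactly.
Phi_lm = W @ mapping
print(np.allclose(Phi_lm @ Phi_lm.T, W, atol=1e-6))
```

Once the features `Phi` are explicit, linear methods (such as the sparse coding the abstract mentions) can be applied directly in the embedded space instead of reasoning through the implicit kernel trick.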


2018 ◽  Vol 28 (02) ◽  pp. 1750040
Author(s): Xin Li ◽  Yanqin Bai ◽  Yaxin Peng ◽  Shaoyi Du ◽  Shihui Ying

Changing the metric on the data may change the data distribution; hence a good distance metric can promote the performance of a learning algorithm. In this paper, we address the semi-supervised distance metric learning (ML) problem to obtain the best nonlinear metric for the data. First, we describe the nonlinear metric by a multiple kernel representation. By this approach, we project the data into a high-dimensional space in which the data can be well represented by linear ML. Then, we reformulate linear ML as a minimization problem on the group of positive definite matrices. Finally, we develop a two-step algorithm for solving this model and design an intrinsic steepest descent algorithm to learn the positive definite metric matrix. Experimental results validate that our proposed method is effective and outperforms several state-of-the-art ML methods.
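The linear ML step can be illustrated with a deliberately simplified variant (Euclidean projected gradient with PSD projection, not the paper's intrinsic steepest descent on the positive-definite matrix group; the toy pairs are ours): learn M so that d_M(x, y)^2 = (x − y)^T M (x − y) shrinks for similar pairs and grows for dissimilar ones.

```python
import numpy as np

# Hedged toy sketch, not the paper's algorithm: projected-gradient
# Mahalanobis metric learning with a PSD projection.
def project_psd(M):
    # Clip negative eigenvalues to keep M positive semidefinite.
    vals, vecs = np.linalg.eigh((M + M.T) / 2)
    return (vecs * np.maximum(vals, 0.0)) @ vecs.T

def learn_metric(similar, dissimilar, dim, lr=0.1, steps=100):
    M = np.eye(dim)
    for _ in range(steps):
        grad = np.zeros((dim, dim))
        for x, y in similar:               # pull similar pairs together
            d = (x - y)[:, None]
            grad += d @ d.T
        for x, y in dissimilar:            # push dissimilar pairs apart
            d = (x - y)[:, None]
            grad -= d @ d.T
        M = project_psd(M - lr * grad)
    return M

# Toy data: similar pairs differ along axis 0, dissimilar along axis 1.
e1, e2 = np.eye(3)[0], np.eye(3)[1]
M = learn_metric([(e1, np.zeros(3))], [(e2, np.zeros(3))], dim=3)
print(np.diag(M).round(2))  # axis 0 is suppressed, axis 1 amplified
```

The learned M downweights the direction in which similar pairs vary and amplifies the direction separating dissimilar pairs; the paper's intrinsic approach instead moves along geodesics of the positive definite matrix group rather than projecting after each Euclidean step.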

