kernel methods
Recently Published Documents

TOTAL DOCUMENTS: 668 (FIVE YEARS 88)
H-INDEX: 45 (FIVE YEARS 4)

Sensors ◽ 2021 ◽ Vol 22 (1) ◽ pp. 241
Author(s): Qasem Abu Al-Haija ◽ Ahmad Al-Badawi

Network Intrusion Detection Systems (NIDSs) are indispensable defensive tools against various cyberattacks. Lightweight, multipurpose, anomaly-based NIDSs employ several methods to build profiles of normal and malicious behavior. In this paper, we design, implement, and evaluate the performance of machine-learning-based NIDSs in IoT networks. Specifically, we study six supervised learning methods that belong to three different classes: (1) ensemble methods, (2) neural network methods, and (3) kernel methods. To evaluate the developed NIDSs, we use the distilled-Kitsune-2018 and NSL-KDD datasets, both consisting of contemporary real-world IoT network traffic subjected to different network attacks. Standard performance evaluation metrics from the machine-learning literature are used to assess identification accuracy, error rates, and inference speed. Our empirical analysis indicates that ensemble methods provide better accuracy and lower error rates than neural network and kernel methods. On the other hand, neural network methods provide the highest inference speed, which makes them well suited for high-bandwidth networks. We also provide a comparison with state-of-the-art solutions and show that our best results improve on prior art by 1–20%.
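The three-way comparison described in this abstract can be sketched with scikit-learn. The distilled-Kitsune-2018 and NSL-KDD datasets are not reproduced here, so a synthetic dataset stands in, and each model family is represented by a single illustrative classifier; all model choices and parameters below are assumptions, not the authors' exact configurations:

```python
import time

from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

# Synthetic stand-in for labeled network-traffic feature vectors.
X, y = make_classification(n_samples=2000, n_features=20, n_informative=10,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# One representative per class: ensemble, neural network, kernel method.
models = {
    "ensemble (random forest)": RandomForestClassifier(n_estimators=100,
                                                       random_state=0),
    "neural network (MLP)": MLPClassifier(hidden_layer_sizes=(64,),
                                          max_iter=500, random_state=0),
    "kernel (RBF SVM)": SVC(kernel="rbf"),
}

# Record test accuracy and wall-clock inference time for each model.
results = {}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    start = time.perf_counter()
    pred = model.predict(X_te)
    infer_seconds = time.perf_counter() - start
    results[name] = (accuracy_score(y_te, pred), infer_seconds)
```

On real traffic data one would additionally report error rates (false positives/negatives), as the abstract notes, rather than accuracy alone.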


2021
Author(s): Ruslan Shaydulin ◽ Stefan Wild

Abstract Quantum kernel methods are considered a promising avenue for applying quantum computers to machine learning problems. However, recent results overlook the central role hyperparameters play in determining the performance of machine learning methods. In this work we identify the hyperparameter controlling the bandwidth of a quantum kernel and show that it controls the expressivity of the resulting model. We use extensive numerical experiments with multiple quantum kernels and classical datasets to show consistent change in the model behavior from underfitting (bandwidth too large) to overfitting (bandwidth too small), with optimal generalization in between. We draw a connection between the bandwidth of classical and quantum kernels and show analogous behavior in both cases. Furthermore, we show that optimizing the bandwidth can help mitigate the exponential decay of kernel values with qubit count, which is the cause behind recent observations that the performance of quantum kernel methods decreases with qubit count. We reproduce these negative results and show that if the kernel bandwidth is optimized, the performance instead improves with growing qubit count and becomes competitive with the best classical methods.
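Running a quantum kernel requires a quantum simulator, but the abstract itself draws a connection to the bandwidth of classical kernels, and that part is easy to reproduce. The sketch below (a classical analogue under assumed, illustrative parameters) sweeps the bandwidth sigma of a Gaussian kernel SVM and exhibits the same underfitting/overfitting transition:

```python
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Small two-class toy problem standing in for the paper's classical datasets.
X, y = make_moons(n_samples=400, noise=0.2, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Sweep the bandwidth sigma of k(x, x') = exp(-||x - x'||^2 / (2 sigma^2)).
scores = {}
for sigma in (100.0, 1.0, 0.01):  # too large -> underfit, too small -> overfit
    clf = SVC(kernel="rbf", gamma=1.0 / (2.0 * sigma**2)).fit(X_tr, y_tr)
    scores[sigma] = (clf.score(X_tr, y_tr), clf.score(X_te, y_te))
```

With sigma far too small, the kernel matrix approaches the identity (every point is "far" from every other), train accuracy saturates while test accuracy collapses; this mirrors the exponential decay of quantum kernel values with qubit count that the paper mitigates by optimizing the bandwidth.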


Author(s): Osval Antonio Montesinos-López ◽ José Cricelio Montesinos-López ◽ Abelardo Montesinos-Lopez ◽ Juan Manuel Ramírez-Alcaraz ◽ Jesse Poland ◽ ...

Abstract When multi-trait data are available, the preferred models are those that account for correlations between phenotypic traits, since moderate or large correlations increase genomic prediction accuracy. For this reason, in this paper we explore Bayesian multi-trait kernel methods for genomic prediction and illustrate the power of these models with three real datasets. The kernels under study were the linear, Gaussian, polynomial and sigmoid kernels; they were compared with the conventional Ridge regression and GBLUP multi-trait models. The results show that, in general, the Gaussian kernel method outperformed the conventional Bayesian Ridge and GBLUP multi-trait linear models by 2.2 to 17.45% (datasets 1 to 3) in terms of prediction performance based on the mean square error of prediction. This improvement can be attributed to the fact that the proposed model captures non-linear patterns more efficiently than linear multi-trait models. However, not all kernels perform well on the datasets used for evaluation, which is why more than one kernel should be evaluated in order to choose the best one.
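The core contrast in this abstract, a linear (GBLUP-like) kernel versus a Gaussian kernel on correlated traits, can be sketched without the full Bayesian machinery using multi-output kernel ridge regression. Everything below (the synthetic marker matrix, the two-trait signal, the kernel parameters) is an illustrative assumption, not the authors' model:

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical stand-in for marker data: 500 lines x 5 markers, with two
# correlated traits driven by the same non-linear genetic signal.
X = rng.standard_normal((500, 5))
signal = X[:, 0] ** 2 + X[:, 1]
Y = np.column_stack([
    signal + 0.1 * rng.standard_normal(500),
    0.8 * signal + 0.1 * rng.standard_normal(500),
])
X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, random_state=0)

# Linear kernel plays the role of the GBLUP-like baseline; the Gaussian
# (RBF) kernel can also capture the non-linear part of the signal.
linear = KernelRidge(kernel="linear", alpha=1.0).fit(X_tr, Y_tr)
gauss = KernelRidge(kernel="rbf", gamma=0.2, alpha=0.5).fit(X_tr, Y_tr)

mse_linear = mean_squared_error(Y_te, linear.predict(X_te))
mse_gauss = mean_squared_error(Y_te, gauss.predict(X_te))
```

On this toy problem the Gaussian kernel attains a lower mean square error of prediction than the linear baseline, for the reason the abstract gives: the target contains a non-linear pattern the linear model cannot represent.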


2021 ◽ Vol 7 ◽ pp. 442-450
Author(s): Guillermo Terrén-Serrano ◽ Manel Martínez-Ramón

Author(s): Alexe L. Haywood ◽ Joseph Redshaw ◽ Magnus W. D. Hanson-Heine ◽ Adam Taylor ◽ Alex Brown ◽ ...

2021 ◽ pp. 311-354
Author(s): Gabriele Santin ◽ Bernard Haasdonk

2021 ◽ Vol 2021 ◽ pp. 1-11
Author(s): Xi Liu ◽ Zengrong Zhan ◽ Guo Niu

Image recognition tasks involve an increasing amount of symmetric positive definite (SPD) matrix data. SPD manifolds exhibit nonlinear geometry, so Euclidean machine learning methods cannot be applied to them directly. The kernel trick on SPD manifolds is based on projecting the data into a reproducing kernel Hilbert space. Unfortunately, existing kernel methods do not consider the connection between SPD matrices and linear projections. This paper therefore proposes a framework that uses the correlation between SPD matrices and projections to model the kernel map. To this end, it formulates a Hilbert–Schmidt independence criterion (HSIC) regularization framework based on the kernel trick, where HSIC is commonly used to measure the dependence between two datasets. The proposed framework makes it possible to extend existing kernel methods to new HSIC-regularized kernel methods. Additionally, the paper proposes an algorithm called HSIC regularized graph discriminant analysis (HRGDA) for SPD manifolds based on this framework. Experimental results on several classification tasks show that the proposed HSIC regularization framework and HRGDA are accurate and effective.
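The HSIC quantity at the heart of this abstract is simple to compute empirically. The sketch below (plain NumPy, with an assumed Gaussian kernel and illustrative bandwidth; it is not the paper's HRGDA algorithm) evaluates the standard biased HSIC estimate and shows it is larger for dependent data than for independent data:

```python
import numpy as np

def rbf_kernel(X, gamma=0.1):
    """Gaussian kernel matrix k(x, x') = exp(-gamma * ||x - x'||^2)."""
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def hsic(K, L):
    """Biased empirical HSIC estimate trace(K H L H) / (n - 1)^2,
    where H = I - (1/n) 11^T is the centering matrix."""
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 3))
Y_dep = X + 0.1 * rng.standard_normal((200, 3))  # strongly dependent on X
Y_ind = rng.standard_normal((200, 3))            # independent of X

h_dep = hsic(rbf_kernel(X), rbf_kernel(Y_dep))
h_ind = hsic(rbf_kernel(X), rbf_kernel(Y_ind))
```

Using such a term as a regularizer, as the paper does, amounts to rewarding (or penalizing) statistical dependence between the learned projections and the SPD-matrix data.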


Algorithms ◽ 2021 ◽ Vol 14 (10) ◽ pp. 293
Author(s): Efthimios Providas

This article is concerned with the construction of approximate analytic solutions to linear Fredholm integral equations of the second kind with general continuous kernels. A unified treatment of some classes of analytical and numerical classical methods, such as the Direct Computational Method (DCM), the Degenerate Kernel Methods (DKM), the Quadrature Methods (QM) and the Projection Methods (PM), is proposed. The problem is formulated as an abstract equation in a Banach space and a solution formula is derived. Then, several approximating schemes are discussed. In all cases, the method yields an explicit, albeit approximate, solution. Several examples are solved to illustrate the performance of the technique.

