Importance of Kernel Bandwidth in Quantum Machine Learning

Author(s):  
Ruslan Shaydulin
Stefan Wild

Abstract: Quantum kernel methods are considered a promising avenue for applying quantum computers to machine learning problems. However, recent results overlook the central role hyperparameters play in determining the performance of machine learning methods. In this work we identify the hyperparameter controlling the bandwidth of a quantum kernel and show that it controls the expressivity of the resulting model. We use extensive numerical experiments with multiple quantum kernels and classical datasets to show consistent change in the model behavior from underfitting (bandwidth too large) to overfitting (bandwidth too small), with optimal generalization in between. We draw a connection between the bandwidth of classical and quantum kernels and show analogous behavior in both cases. Furthermore, we show that optimizing the bandwidth can help mitigate the exponential decay of kernel values with qubit count, which is the cause behind recent observations that the performance of quantum kernel methods decreases with qubit count. We reproduce these negative results and show that if the kernel bandwidth is optimized, the performance instead improves with growing qubit count and becomes competitive with the best classical methods.
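As a concrete illustration of the bandwidth idea, the sketch below classically simulates a simple product-state fidelity kernel and sweeps a scaling factor c applied to the data before encoding; here c acts as an inverse bandwidth (small c gives a wide, nearly constant kernel and underfitting, large c gives a narrow, nearly diagonal kernel and overfitting). The feature map, dataset, and SVM setup are illustrative assumptions, not the experiments from the paper.

```python
# Minimal sketch: bandwidth-scaled quantum fidelity kernel, simulated classically.
# Assumes a product feature map with one RY rotation per feature, for which the
# fidelity kernel factorizes as k(x, x') = prod_j cos^2(c * (x_j - x'_j) / 2).
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

def product_state_kernel(X1, X2, c=0.1):
    """Fidelity kernel of the assumed product RY feature map."""
    diffs = X1[:, None, :] - X2[None, :, :]            # shape (n1, n2, d)
    return np.prod(np.cos(c * diffs / 2.0) ** 2, axis=-1)

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))                          # 8 features ~ 8 qubits
y = (X[:, 0] * X[:, 1] > 0).astype(int)                # toy labels
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

for c in (0.05, 0.5, 5.0):                             # small c: wide kernel; large c: narrow kernel
    Ktr = product_state_kernel(Xtr, Xtr, c=c)
    Kte = product_state_kernel(Xte, Xtr, c=c)
    clf = SVC(kernel="precomputed").fit(Ktr, ytr)
    print(f"c={c:>4}: mean kernel {Ktr.mean():.3f}, "
          f"train acc {clf.score(Ktr, ytr):.2f}, test acc {clf.score(Kte, yte):.2f}")
```

Note how the mean off-diagonal kernel value shrinks as c (or the number of features) grows, which is the concentration effect the abstract refers to.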

Author(s):  
Bhanu Chander

The basic idea of artificial intelligence and machine learning is that machines can learn from data and previous experience and apply what they have learned to future tasks. In today's digitalized world of big data, established machine learning methods, given sufficient high-quality computational resources, perform well on numerous useful and realistic tasks. At the same time, quantum machine learning methods can work exponentially faster than their classical counterparts by making use of quantum mechanics. By taking advantage of quantum effects such as interference and entanglement, quantum computers can efficiently solve selected problems that are believed to be hard for classical machines. Quantum computation is also closely related to kernel methods in machine learning. Hence, this chapter covers quantum computation, advances in QML techniques, QML kernel spaces and optimization, and future directions for QML.


2021, Vol. 2021, pp. 1-11
Author(s):  
Xi Liu
Zengrong Zhan
Guo Niu

Image recognition tasks involve an increasingly large amount of symmetric positive definite (SPD) matrix data. SPD manifolds exhibit nonlinear geometry, so Euclidean machine learning methods cannot be applied to them directly. The kernel trick for SPD manifolds is based on projecting data onto a reproducing kernel Hilbert space. Unfortunately, existing kernel methods do not consider the connection between SPD matrices and linear projections. This paper therefore proposes a framework that uses the correlation between SPD matrices and projections to model the kernel map. To realize this, the paper formulates a Hilbert–Schmidt independence criterion (HSIC) regularization framework based on the kernel trick, where HSIC is commonly used to measure the dependence between two datasets. The proposed framework allows existing kernel methods to be extended to new HSIC-regularized kernel methods. Additionally, the paper proposes an algorithm called HSIC regularized graph discriminant analysis (HRGDA) for SPD manifolds based on the HSIC regularization framework. Experimental results on several classification tasks show that the proposed HSIC regularization framework and HRGDA are accurate and effective.
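For readers unfamiliar with HSIC, the sketch below computes the standard biased empirical estimator, trace(K H L H) / (n - 1)^2, between a kernel on toy SPD matrices and a kernel on their linear projections. The log-Euclidean vectorization, the RBF kernels, and the random projection are illustrative assumptions; the paper's exact regularization framework is not reproduced here.

```python
# Minimal sketch of the empirical HSIC estimator as a dependence measure between
# two views of the same samples (SPD-matrix features and a linear projection).
import numpy as np
from scipy.linalg import logm

def rbf_kernel(X, gamma=1.0):
    """Gaussian RBF kernel matrix on the rows of X."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def hsic(K, L):
    """Biased empirical HSIC: trace(K H L H) / (n - 1)^2."""
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

rng = np.random.default_rng(1)
# Toy SPD matrices A A^T + eps I, vectorized through the matrix logarithm.
spd = [a @ a.T + 1e-3 * np.eye(4) for a in rng.normal(size=(50, 4, 4))]
X = np.array([logm(S).ravel() for S in spd]).real      # log-Euclidean features
W = rng.normal(size=(16, 3))                           # a candidate linear projection
Z = X @ W                                              # projected features

print("HSIC(X, X @ W) =", hsic(rbf_kernel(X), rbf_kernel(Z)))
```

A regularizer of this form rewards (or penalizes) statistical dependence between the original SPD data and the learned projection, which is the role HSIC plays in the framework described above.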


2018, Vol. 16 (06), pp. 1850026
Author(s):  
Qiangrong Jiang
Jiajia Ma

Considering the classification of compounds as a nonlinear problem, kernel methods are a good choice. Graph kernels provide a convenient framework that combines machine learning methods with graph theory. The essence of a graph kernel is to compare the substructures of two graphs, so how to extract those substructures is a key question. In this paper, we propose a novel matrix-based graph kernel, named the local block kernel, which can compare the similarity of partial substructures containing any number of vertices. The paper finally tests the efficacy of this novel graph kernel against a number of published mainstream methods on two datasets, NCI1 and NCI109, for ease of comparison.
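To make the substructure-comparison recipe concrete, here is a toy graph kernel that represents each graph by counts of a few small substructures and compares the counts with a linear kernel. This is only a generic illustration of the substructure idea; it is not the local block kernel proposed in the paper.

```python
# Toy substructure-counting graph kernel (edges, wedges, triangles).
import numpy as np

def substructure_features(A):
    """Counts of simple substructures from an adjacency matrix A (no self-loops)."""
    A = np.asarray(A, dtype=float)
    deg = A.sum(axis=1)
    n_edges = A.sum() / 2.0
    n_wedges = np.sum(deg * (deg - 1)) / 2.0           # paths of length 2
    n_triangles = np.trace(A @ A @ A) / 6.0
    return np.array([n_edges, n_wedges, n_triangles])

def graph_kernel(A1, A2):
    """Linear kernel between substructure-count feature vectors."""
    return float(substructure_features(A1) @ substructure_features(A2))

# Two small example graphs: a triangle and a path on three vertices.
triangle = [[0, 1, 1], [1, 0, 1], [1, 1, 0]]
path = [[0, 1, 0], [1, 0, 1], [0, 1, 0]]
print(graph_kernel(triangle, triangle), graph_kernel(triangle, path))
```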


Author(s):  
Qing Xu
Xiaohua (Michael) Xuan

Abstract: In this paper, we consider a class of nonlinear regression problems without assuming that the data are independent and identically distributed. We propose a corresponding minimax problem for nonlinear regression and give a numerical algorithm. The algorithm can be applied to regression and machine learning problems and yields better results than traditional least squares and machine learning methods.
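One natural reading of the minimax idea is to fit a nonlinear model by minimizing the worst-case residual instead of the sum of squared residuals. The sketch below contrasts the two objectives on a toy exponential model; the model, data, and optimizer are illustrative assumptions, not the authors' formulation or algorithm.

```python
# Minimal sketch: least-squares vs. minimax (worst-case residual) nonlinear fit.
import numpy as np
from scipy.optimize import minimize

def model(theta, x):
    a, b, c = theta
    return a * np.exp(-b * x) + c                      # an assumed nonlinear model

rng = np.random.default_rng(2)
x = np.linspace(0.0, 3.0, 60)
y = 2.0 * np.exp(-1.5 * x) + 0.5 + rng.normal(scale=0.05, size=x.size)

theta0 = np.array([1.0, 1.0, 0.0])
ls = minimize(lambda t: np.sum((y - model(t, x)) ** 2), theta0)        # least squares
mm = minimize(lambda t: np.max(np.abs(y - model(t, x))), theta0,
              method="Nelder-Mead")                                    # minimax fit

print("least-squares params:", ls.x)
print("minimax params:      ", mm.x)
```

The minimax objective is non-smooth, so a derivative-free method such as Nelder-Mead is used here; a dedicated algorithm like the one the paper proposes would handle this more robustly.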


2020
Author(s):  
Shreya Reddy
Lisa Ewen
Pankti Patel
Prerak Patel
Ankit Kundal
...  

As bots become more prevalent and smarter in the modern age of the internet, it becomes ever more important that they be identified and removed. Recent research has indicated that machine learning methods are accurate and the gold standard of bot identification on social media. Unfortunately, machine learning models come with drawbacks such as lengthy training times, difficult feature selection, and overwhelming pre-processing tasks. To overcome these difficulties, we propose a blockchain framework for bot identification. At the current time, it is unknown how this method will perform, but it highlights a substantial gap in research in this area.

