polynomial kernels
Recently Published Documents


TOTAL DOCUMENTS: 83 (FIVE YEARS: 16)

H-INDEX: 13 (FIVE YEARS: 1)

2021 ◽  
Vol 10 (6) ◽  
pp. 3121-3126
Author(s):  
Zuherman Rustam ◽  
Fildzah Zhafarina ◽  
Jane Eva Aurelia ◽  
Yasirly Amalia

Machine learning technology is increasingly needed in the medical field; this research therefore applies machine learning to a medical problem. Many cases of colorectal cancer are diagnosed late, and by the time the cancer is detected it is usually already well advanced. Machine learning, an approach within artificial intelligence, can help detect colorectal cancer early. This study discusses colorectal cancer detection using the twin support vector machine (SVM) method with several kernel functions: linear, polynomial, RBF, and Gaussian kernels. By comparing accuracy and running time, we determine which kernel is better suited to classifying the colorectal cancer dataset obtained from Al-Islam Hospital, Bandung, Indonesia. The results show that the polynomial kernel gives the best accuracy and running time: twin SVM with a polynomial kernel reaches a maximum accuracy of 86% with a running time of 0.502 seconds.
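As a rough classical sketch of the kernel comparison described in this abstract, the snippet below uses scikit-learn's standard SVC as a stand-in for twin SVM (which scikit-learn does not provide) and a synthetic dataset in place of the hospital data; the sample sizes, kernel parameters, and split are illustrative only.

```python
# Hedged sketch: comparing SVM kernels by accuracy and training time.
# SVC stands in for the twin SVM of the abstract; data is synthetic.
import time
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

X, y = make_classification(n_samples=500, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

for kernel in ["linear", "poly", "rbf"]:  # "rbf" is the Gaussian kernel in scikit-learn
    clf = SVC(kernel=kernel, degree=3, gamma="scale")
    start = time.perf_counter()
    clf.fit(X_train, y_train)
    elapsed = time.perf_counter() - start
    acc = accuracy_score(y_test, clf.predict(X_test))
    print(f"{kernel:>6}: accuracy={acc:.3f}, training time={elapsed:.3f}s")
```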


2021 ◽  
Vol 26 (2) ◽  
pp. 304-317
Author(s):  
Andrej Liptaj

In this text, explicit forms of several higher-precision-order kernel functions (to be used in the differentiation-by-integration procedure) are given for several derivative orders. A system of linear equations is also formulated that allows kernels of arbitrary precision to be constructed for an arbitrary derivative order. A computer study is carried out, showing that numerical differentiation based on higher-precision-order kernels performs much better (with respect to errors) than the same procedure based on the usual Legendre-polynomial kernels. The presented results may have implications for numerical implementations of the differentiation-by-integration method.
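For readers unfamiliar with differentiation by integration, below is a minimal Python sketch of its lowest-order instance, the classical Lanczos derivative, whose kernel is the degree-one Legendre polynomial t; the higher-precision-order kernels discussed in the paper replace this weight with higher-degree polynomials. The function name, step size, and test point are illustrative, not taken from the paper.

```python
# Hedged sketch of the Lanczos derivative (lowest-order differentiation by integration):
#   f'(x)  ≈  (3 / (2 h^3)) * ∫_{-h}^{h} t f(x + t) dt
import numpy as np
from scipy.integrate import quad

def lanczos_derivative(f, x, h=1e-2):
    """Approximate f'(x) by integrating f against the kernel t on [-h, h]."""
    integral, _ = quad(lambda t: t * f(x + t), -h, h)
    return 3.0 / (2.0 * h**3) * integral

if __name__ == "__main__":
    # Should be close to cos(1.0) ≈ 0.5403, up to an O(h^2) error term.
    print(lanczos_derivative(np.sin, 1.0))
```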


2020 ◽  
Vol 18 (03) ◽  
pp. 2050006
Author(s):  
Arit Kumar Bishwas ◽  
Ashish Mani ◽  
Vasile Palade

The Gaussian kernel is a very popular kernel function used in many machine learning algorithms, especially in support vector machines (SVMs). It is used more often than polynomial kernels when learning from nonlinear datasets and is usually employed in formulating the classical SVM for nonlinear problems. Rebentrost et al. discussed an elegant quantum version of a least-squares support vector machine using quantum polynomial kernels, which is exponentially faster than its classical counterpart. This paper demonstrates a quantum version of the Gaussian kernel and analyzes its runtime complexity using quantum random access memory (QRAM) in the context of quantum SVM. Our analysis shows that the runtime computational complexity of the quantum Gaussian kernel is approximately [Formula: see text], and even [Formula: see text] when [Formula: see text] and the error [Formula: see text] are small enough to be ignored, where [Formula: see text] is the dimension of the training instances, [Formula: see text] is the accuracy, [Formula: see text] is the dot product of the two quantum states, and [Formula: see text] is the Taylor remainder error term. The runtime complexity of the quantum version of the Gaussian kernel therefore appears to be significantly lower than that of its classical counterpart.
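As a purely classical illustration of how a Gaussian kernel can be rebuilt from polynomial (dot-product) kernels, and of where a Taylor remainder term like the one mentioned above comes from, here is a small sketch; the function names, sigma, truncation order, and test vectors are assumptions for illustration and do not reproduce the quantum algorithm.

```python
# Hedged sketch: Gaussian kernel via a truncated Taylor expansion of exp(x·y / σ²),
# i.e. a weighted sum of polynomial (dot-product) kernels plus a remainder.
import numpy as np
from math import factorial

def gaussian_kernel(x, y, sigma=1.0):
    return np.exp(-np.sum((x - y) ** 2) / (2 * sigma**2))

def taylor_gaussian_kernel(x, y, sigma=1.0, order=8):
    # exp(-||x-y||²/2σ²) = exp(-(||x||²+||y||²)/2σ²) * exp(x·y/σ²);
    # the last factor is truncated at the given order.
    dot = np.dot(x, y) / sigma**2
    series = sum(dot**k / factorial(k) for k in range(order + 1))
    prefactor = np.exp(-(np.dot(x, x) + np.dot(y, y)) / (2 * sigma**2))
    return prefactor * series

x, y = np.array([0.3, 0.5]), np.array([0.1, 0.9])
exact = gaussian_kernel(x, y)
approx = taylor_gaussian_kernel(x, y, order=8)
print(f"exact={exact:.6f}, truncated={approx:.6f}, remainder={exact - approx:.2e}")
```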


Author(s):  
Thomas D. Ahle ◽  
Michael Kapralov ◽  
Jakob B. T. Knudsen ◽  
Rasmus Pagh ◽  
Ameya Velingker ◽  
...  
