Self-Adaptive Deep Multiple Kernel Learning Based on Rademacher Complexity

Symmetry ◽  
2019 ◽  
Vol 11 (3) ◽  
pp. 325 ◽  
Author(s):  
Shengbing Ren ◽  
Wangbo Shen ◽  
Chaudry Siddique ◽  
You Li

The deep multiple kernel learning (DMKL) method has attracted widespread attention because it achieves better results than shallow multiple kernel learning. However, existing DMKL methods use a fixed number of layers and fixed kernel types, so they adapt poorly to different data sets and make it difficult to find model parameters that improve test accuracy. In this paper, we propose a self-adaptive deep multiple kernel learning (SA-DMKL) method. SA-DMKL adapts the model by optimizing the parameters of each kernel function with a grid search and by changing the number and type of kernel functions in each layer according to a generalization bound evaluated with Rademacher chaos complexity. Experiments on three datasets from the University of California, Irvine (UCI) repository and on the Caltech 256 image dataset validate the effectiveness of the proposed method in three respects.
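The per-kernel grid search described above can be sketched as follows. This is a toy illustration only (synthetic data, a hypothetical parameter grid): it shows the search over kernel types and parameters, not the full SA-DMKL pipeline or its Rademacher-bound-based pruning.

```python
# Illustrative sketch: grid search over kernel type and kernel parameters
# for an SVM, the per-kernel model-selection step the abstract describes.
# The grid values below are hypothetical, not taken from the paper.
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

# Synthetic stand-in for a UCI-style tabular dataset.
X, y = make_classification(n_samples=200, n_features=10, random_state=0)

# Candidate kernel types and their parameters (hypothetical grid).
param_grid = [
    {"kernel": ["rbf"], "gamma": [0.01, 0.1, 1.0]},
    {"kernel": ["poly"], "degree": [2, 3]},
    {"kernel": ["sigmoid"], "gamma": [0.01, 0.1]},
]

# Cross-validated search picks the best (kernel, parameter) combination.
search = GridSearchCV(SVC(), param_grid, cv=3)
search.fit(X, y)
print(search.best_params_)
```

In SA-DMKL this selection would be repeated per kernel and per layer, with poorly generalizing kernels dropped according to the Rademacher-chaos-complexity bound.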

Author(s):  
Q. Wang ◽  
Y. Gu ◽  
T. Liu ◽  
H. Liu ◽  
X. Jin

In recent years, many studies on remote sensing image classification have shown that combining multiple features from different data sources can effectively improve classification accuracy. Multiple kernel learning (MKL), a powerful learning framework, can conveniently embed a variety of features. The conventional combined kernel learned by MKL can be regarded as a compromise among all basic kernels over all classes: it is the best overall, but not optimal for each specific class. To address this problem, this paper proposes a class-pair-guided MKL method to integrate heterogeneous features (HFs) from multispectral image (MSI) and light detection and ranging (LiDAR) data. In particular, the "one-against-one" strategy is adopted, which converts the multiclass classification problem into a set of two-class problems. We then select the best kernel from a pre-constructed set of basic kernels for each class pair by kernel alignment (KA) during classification. The advantage of the proposed method is that, for each pair of classes, only the kernel best suited to separating them is retained, which greatly enhances discriminability. Experiments on two real data sets show that the proposed method achieves the best classification accuracy in integrating the HFs when compared with several state-of-the-art algorithms.
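The kernel alignment (KA) criterion used above to pick the best basic kernel for each class pair admits a compact sketch. The data and candidate kernels below are toy placeholders (random features, linear kernels), not the MSI/LiDAR features from the paper; only the alignment formula and the selection step are faithful to the technique.

```python
# Minimal sketch of kernel alignment (KA) for per-class-pair kernel selection.
import numpy as np

def kernel_alignment(K1, K2):
    """Frobenius-inner-product alignment: <K1,K2>_F / (||K1||_F ||K2||_F)."""
    num = np.sum(K1 * K2)
    den = np.sqrt(np.sum(K1 * K1) * np.sum(K2 * K2))
    return num / den

# Ideal target kernel for one class pair: +1 within a class, -1 across.
y = np.array([1, 1, -1, -1])
K_target = np.outer(y, y).astype(float)

# Two candidate basic kernels (toy linear kernels on random features).
rng = np.random.default_rng(0)
X1, X2 = rng.normal(size=(4, 3)), rng.normal(size=(4, 3))
K1, K2 = X1 @ X1.T, X2 @ X2.T

# Retain the candidate whose alignment to the target is highest.
best = max((K1, K2), key=lambda K: kernel_alignment(K, K_target))
```

In the class-pair-guided method, this selection runs once per "one-against-one" class pair, so each binary subproblem keeps only its own best-aligned kernel.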


2021 ◽  
Vol 22 (S3) ◽  
Author(s):  
Yuqing Qian ◽  
Limin Jiang ◽  
Yijie Ding ◽  
Jijun Tang ◽  
Fei Guo

Abstract
Background: DNA-binding proteins (DBPs) play a pivotal role in biological systems, and a growing number of researchers are studying their mechanisms and detection methods. Traditional experimental methods for detecting DBPs are time-consuming and resource-intensive, so in recent years machine learning methods have been applied to the task. However, it is difficult to describe protein information adequately when predicting DNA-binding proteins. In this study, we extract six features from the protein sequence and use multiple kernel learning based on centered kernel alignment to integrate these features. The integrated feature is fed into a support vector machine to build the predictive model and detect new DBPs.
Results: The PDB1075 and PDB186 data sets are employed to test our method. Our model obtains better accuracy than other existing methods on PDB1075 (84.19%) and PDB186 (83.7%), respectively.
Conclusion: Multiple kernel learning can fuse the complementary information between different features. Compared with existing methods, our method achieves the best results on the benchmark data sets.
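The feature-fusion step can be sketched with centered kernel alignment (CKA). The sketch below weights each feature kernel by its alignment to the label kernel and sums them; this weighting-by-alignment heuristic is a common MKL choice and an assumption here, not necessarily the exact optimisation the paper solves, and the three random feature kernels stand in for the six protein-sequence features.

```python
# Sketch: fuse several feature kernels via centered kernel alignment (CKA).
import numpy as np

def center(K):
    """Double-center a kernel matrix: H K H with H = I - 11^T / n."""
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return H @ K @ H

def cka(K1, K2):
    """CKA between two kernel matrices (Frobenius alignment after centering)."""
    K1c, K2c = center(K1), center(K2)
    return np.sum(K1c * K2c) / (np.linalg.norm(K1c) * np.linalg.norm(K2c))

# Toy labels and their ideal target kernel.
y = np.array([1, 1, 1, -1, -1, -1])
K_target = np.outer(y, y).astype(float)

# Toy stand-ins for per-feature kernels (linear kernels on random features).
rng = np.random.default_rng(1)
feature_kernels = []
for _ in range(3):
    X = rng.normal(size=(6, 4))
    feature_kernels.append(X @ X.T)

# Weight each kernel by its alignment to the labels (floored to stay positive),
# normalise the weights, and form the combined kernel for the SVM.
w = np.array([max(cka(K, K_target), 1e-6) for K in feature_kernels])
w /= w.sum()
K_combined = sum(wi * Ki for wi, Ki in zip(w, feature_kernels))
```

The combined kernel `K_combined` is what would then be handed to a kernel SVM in place of any single feature's kernel.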


2021 ◽  
Author(s):  
Shervin Rahimzadeh Arashloo

The paper addresses the one-class classification (OCC) problem and advocates a one-class multiple kernel learning (MKL) approach for this purpose. To this aim, based on the Fisher null-space one-class classification principle, we present a multiple kernel learning algorithm with an $\ell_p$-norm constraint ($p\geq1$) on the kernel weights. We cast the proposed one-class MKL task as a min-max saddle-point Lagrangian optimisation problem and propose an efficient method to solve it. We also consider an extension of the proposed one-class MKL approach in which several related one-class MKL tasks are learned concurrently by constraining them to share common kernel weights. An extensive assessment of the proposed method on a range of data sets from different application domains confirms its merits against the baseline and several other algorithms.
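The $\ell_p$-norm constraint on the kernel weights can be illustrated with a generic $\ell_p$-MKL-style update: given nonnegative per-kernel scores $\gamma_m$, set $\beta_m \propto \gamma_m^{2/(p+1)}$ and rescale to the unit $\ell_p$ ball. This closed form is the standard $\ell_p$-MKL recipe, used here as an assumption for illustration; it is not the paper's Fisher null-space saddle-point solver.

```python
# Sketch of the l_p-norm constraint (p >= 1) on MKL kernel weights.
import numpy as np

def lp_normalise(beta, p):
    """Rescale a nonnegative weight vector onto the unit l_p ball."""
    return beta / np.linalg.norm(beta, ord=p)

def lp_mkl_step(gamma, p):
    """One closed-form weight update from per-kernel scores gamma_m >= 0
    (standard l_p-MKL form: beta_m proportional to gamma_m**(2/(p+1)))."""
    beta = gamma ** (2.0 / (p + 1.0))
    return lp_normalise(beta, p)

gamma = np.array([0.5, 2.0, 1.0])  # hypothetical per-kernel objective terms
for p in (1.0, 2.0):
    beta = lp_mkl_step(gamma, p)
    # The constraint holds exactly after rescaling; smaller p drives the
    # weight vector toward sparser solutions.
    assert abs(np.linalg.norm(beta, ord=p) - 1.0) < 1e-9
```

Varying $p$ between 1 and larger values trades off sparse kernel selection ($p=1$) against uniform kernel combination ($p\to\infty$).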




Author(s):  
Guo ◽  
Xiaoqian Zhang ◽  
Zhigui Liu ◽  
Xuqian Xue ◽  
Qian Wang ◽  
...  
