misclassification probability
Recently Published Documents


TOTAL DOCUMENTS: 25 (five years: 6)

H-INDEX: 9 (five years: 2)

Author(s): Tingting Wang, Yinju Bian, Qianli Yang, Mengyi Ren

ABSTRACT Classification of low-magnitude seismic events is a challenging issue for the Comprehensive Nuclear-Test-Ban Treaty. Path correction of the P/S amplitude ratio is key to discriminating between earthquakes and explosions. In this article, the Bayesian Kriging interpolation method is used to conduct path correction of P/S amplitude ratios and to recognize low-magnitude seismic events. Based on a total of 5677 small earthquakes and 1769 explosions in Beijing and its adjacent areas, the Bayesian Kriging method is used to establish the path correction surface and uncertainty surface of Pg/Lg amplitude ratios measured within different frequency bands at five seismic stations, and path correction of the amplitude ratios is conducted for all events. The results show that the correction surface is consistent with the observed amplitude ratios and can, to a certain extent, reflect differences in their propagation paths. After Kriging correction, the root mean square variation of the amplitude ratio is reduced by up to 30% and the misclassification probability by up to 8.5%. The high-frequency Pg/Lg amplitude ratios can effectively classify low-magnitude events: the misclassification probability at each station is less than 15% and 10% based on high-frequency Pg/Lg of >7 and >9 Hz, respectively. Of the five stations, BJT (Baijiatuan, Beijing) has the best recognition, with a misclassification probability lower than 5% after Kriging correction based on high-frequency Pg/Lg (>9 Hz). The classification ability of very high-frequency amplitude ratios (>15 Hz) is weakened by high-frequency noise. Bayesian Kriging correction reduces the variance of the amplitude ratio of low-magnitude seismic events and hence effectively improves the ability to classify small-magnitude events, which is of important reference value for regional seismic monitoring and identification.
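As a rough illustration of the correct-then-classify workflow this abstract describes: the sketch below is not Bayesian Kriging — it substitutes a simple Gaussian-distance-weighted interpolation of calibration residuals for the Kriging correction surface, and the coordinates, residual values, length scale, and decision threshold are all hypothetical.

```python
import math

def interpolate_correction(event_xy, calib_points, length_scale=1.0):
    """Crude distance-weighted stand-in for a Kriging correction surface:
    each calibration point holds the residual log(Pg/Lg) observed there."""
    num, den = 0.0, 0.0
    for (x, y), residual in calib_points:
        d2 = (event_xy[0] - x) ** 2 + (event_xy[1] - y) ** 2
        w = math.exp(-d2 / (2 * length_scale ** 2))
        num += w * residual
        den += w
    return num / den if den > 0 else 0.0

def classify(log_ratio, event_xy, calib_points, threshold=0.0):
    """Subtract the interpolated path correction, then apply a threshold:
    a corrected log(Pg/Lg) above the threshold is called an earthquake."""
    corrected = log_ratio - interpolate_correction(event_xy, calib_points)
    return "earthquake" if corrected > threshold else "explosion"

# Hypothetical calibration residuals and a test event
calib = [((0.0, 0.0), 0.4), ((1.0, 0.0), 0.2), ((0.0, 1.0), 0.3)]
print(classify(0.9, (0.5, 0.5), calib))  # → earthquake
```

In the paper's setting the correction surface additionally carries an uncertainty surface, which this toy interpolation does not model.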


2019, Vol. 49 (2), pp. 175-193
Author(s): Hiroki Watanabe, Masashi Hyodo, Yuki Yamada, Takashi Seo

Sensors, 2019, Vol. 19 (6), p. 1476
Author(s): Kewen Li, Guangyue Zhou, Jiannan Zhai, Fulai Li, Mingwen Shao

The Adaptive Boosting (AdaBoost) algorithm is a widely used ensemble learning framework that achieves good classification results on general datasets. However, it is difficult to apply AdaBoost directly to imbalanced data, since the algorithm is designed mainly around misclassified samples rather than samples of minority classes. To better handle imbalanced data, this paper introduces the Area Under Curve (AUC) indicator, which reflects the comprehensive performance of a model, and proposes an improved AdaBoost algorithm based on AUC (AdaBoost-A), which improves AdaBoost's error calculation by jointly considering the misclassification probability and the AUC. To prevent the redundant or useless weak classifiers that traditional AdaBoost generates from consuming excessive system resources, the paper further proposes an ensemble algorithm, PSOPD-AdaBoost-A, which can re-initialize parameters to avoid falling into local optima and optimizes the coefficients of the AdaBoost weak classifiers. Experimental results show that the proposed algorithm is effective on imbalanced data, especially on datasets with relatively high imbalance.
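The core idea — selecting weak learners by an error term that mixes weighted misclassification with (1 − AUC) — can be sketched on toy 1-D data with decision stumps. This is a hedged reading of the abstract, not the paper's algorithm: the mixing weight `beta` and the exact blending formula are assumptions, and the PSO-based coefficient optimization is omitted.

```python
import math

def stump_predict(x, t, polarity):
    """Decision stump: +polarity above threshold t, -polarity below."""
    return polarity if x > t else -polarity

def auc(scores, labels):
    """Rank-based AUC: probability a random positive outscores a random negative."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == -1]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def train_adaboost_auc(xs, ys, rounds=5, beta=0.7):
    """Toy AdaBoost on 1-D data whose stump selection minimizes a blend of
    the weighted error and (1 - AUC); beta is a hypothetical mixing weight."""
    n = len(xs)
    w = [1.0 / n] * n
    ensemble = []
    for _ in range(rounds):
        best = None
        for t in sorted(set(xs)):
            for pol in (1, -1):
                preds = [stump_predict(x, t, pol) for x in xs]
                err = sum(wi for wi, p, y in zip(w, preds, ys) if p != y)
                scores = [pol * (x - t) for x in xs]
                blended = beta * err + (1 - beta) * (1 - auc(scores, ys))
                if best is None or blended < best[0]:
                    best = (blended, t, pol, preds)
        blended, t, pol, preds = best
        blended = min(max(blended, 1e-10), 1 - 1e-10)  # keep log finite
        alpha = 0.5 * math.log((1 - blended) / blended)
        ensemble.append((alpha, t, pol))
        # Standard exponential reweighting, then normalize
        w = [wi * math.exp(-alpha * y * p) for wi, y, p in zip(w, ys, preds)]
        s = sum(w)
        w = [wi / s for wi in w]
    return ensemble

def predict(ensemble, x):
    return 1 if sum(a * stump_predict(x, t, p) for a, t, p in ensemble) > 0 else -1

# Imbalanced toy data: the minority class has only two samples
xs = [0.1, 0.2, 0.3, 0.4, 2.0, 2.1]
ys = [-1, -1, -1, -1, 1, 1]
model = train_adaboost_auc(xs, ys)
print([predict(model, x) for x in xs])
```

The AUC term rewards stumps that rank minority-class samples highly even when their contribution to the weighted error is small, which is the intuition the abstract appeals to.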


2018, Vol. 11 (1), p. 284
Author(s): Mufda Jameel Alrawashdeh, Taha Radwan Radwan, Kalid Abunawas Abunawas

This study aims to combine the new deterministic minimum covariance determinant (DetMCD) algorithm with linear discriminant analysis (LDA) and compare it with the fast minimum covariance determinant (FastMCD), fast consistent high breakdown (FCH), and robust FCH (RFCH) algorithms. LDA, which is widely used in multivariate statistical analysis, assigns new observations of unknown group membership to one of the known groups. The LDA mean and covariance matrix parameters are highly influenced by outliers. The DetMCD algorithm is highly robust and resistant to outliers and is constructed to overcome the outlier problem; it is used here to estimate location and scatter matrices. The DetMCD, FastMCD, FCH, and RFCH algorithms are applied to estimate the misclassification probability using robust LDA. All the algorithms are expected to improve the LDA model for classification purposes in banking, such as predicting bankruptcy and failure, and for distinguishing between Islamic and conventional banks. The performance of the estimators is investigated through simulation and on real data.
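Why a robust location/scatter estimate matters for LDA can be shown in one dimension. The sketch below is not DetMCD: it uses a crude MCD-flavored estimator (the most concentrated contiguous half of the sorted data) and the textbook equal-variance normal error rate Φ(−δ/2); the data are invented.

```python
import math

def mcd_like_1d(data, h=None):
    """Crude 1-D stand-in for an MCD-type estimator: among all contiguous
    windows of size h in the sorted data, keep the one with the smallest
    sample variance and return that window's mean and variance."""
    xs = sorted(data)
    n = len(xs)
    h = h or (n // 2 + 1)
    best = None
    for i in range(n - h + 1):
        sub = xs[i:i + h]
        m = sum(sub) / h
        v = sum((x - m) ** 2 for x in sub) / (h - 1)
        if best is None or v < best[0]:
            best = (v, m)
    return best[1], best[0]  # (location, scatter)

def misclassification_probability(m0, m1, pooled_var):
    """For two equal-variance normal groups split by the LDA midpoint rule,
    the error rate is Phi(-delta/2), where delta = |m1 - m0| / sigma."""
    delta = abs(m1 - m0) / math.sqrt(pooled_var)
    return 0.5 * math.erfc(delta / (2 * math.sqrt(2)))

# Group 0 carries two gross outliers that would wreck the classical mean
g0 = [0.0, 0.1, -0.1, 0.2, -0.2, 9.0, 10.0]
g1 = [3.0, 3.1, 2.9, 3.2, 2.8]
m0, v0 = mcd_like_1d(g0)
m1, v1 = mcd_like_1d(g1)
pooled = (v0 + v1) / 2.0
print(round(m0, 2), "vs classical", round(sum(g0) / len(g0), 2))
# robust location stays near 0 while the classical mean is dragged toward the outliers
```

With the classical mean of group 0 pulled to about 2.7 — almost on top of group 1 — the discriminant boundary would be meaningless; the trimmed estimate keeps the groups well separated.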


Author(s): Nicholas A. Nechval, Konstantin N. Nechval

A product acceptance process is an inspection process, used in statistical quality control and reliability testing, for deciding whether to accept or reject submitted lots of products. This process is important for the industrial and business purposes of quality management. To determine the optimal parameters of the product acceptance process under parametric uncertainty of the underlying lifetime models (in terms of misclassification probability), a new optimization technique is proposed. The most popular lifetime distribution used in the field of product acceptance is the two-parameter Weibull distribution, often with the assumption that the shape parameter is known. Such oversimplified assumptions can facilitate the follow-up analyses, but they may overlook the fact that the lifetime distribution can significantly affect the estimation of the failure rate of a product. Therefore, situations in which both Weibull distribution parameters are unknown are also considered. An illustrative numerical example is given.
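The two misclassification probabilities in a Weibull acceptance plan are the producer's risk (rejecting a good lot) and the consumer's risk (accepting a bad lot). The sketch below evaluates a standard time-truncated plan under known Weibull parameters — the simple textbook case, not the paper's optimization under parametric uncertainty — and the plan parameters n, c, t0 and the two lot qualities are hypothetical.

```python
import math

def weibull_cdf(t, shape, scale):
    """P(lifetime <= t) for a two-parameter Weibull distribution."""
    return 1.0 - math.exp(-((t / scale) ** shape))

def acceptance_probability(n, c, t0, shape, scale):
    """Put n units on test until time t0 and accept the lot if at most c
    units fail; the failure count is Binomial(n, F(t0))."""
    p = weibull_cdf(t0, shape, scale)
    return sum(math.comb(n, k) * p ** k * (1 - p) ** (n - k)
               for k in range(c + 1))

# Hypothetical plan: n = 20 units on test, accept with at most c = 2
# failures by t0 = 100 hours; shape assumed known here.
n, c, t0, shape = 20, 2, 100.0, 1.5
good_scale, bad_scale = 1000.0, 200.0  # acceptable vs. rejectable lot quality
producer_risk = 1 - acceptance_probability(n, c, t0, shape, good_scale)
consumer_risk = acceptance_probability(n, c, t0, shape, bad_scale)
print(round(producer_risk, 4), round(consumer_risk, 4))
```

Sweeping n and c to drive both risks below chosen limits recovers the classical plan-design problem; the paper's contribution is doing this when the Weibull parameters themselves must be estimated.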

