Nonnegative Dictionary-Learning Algorithm Based on L1 Norm with the Sparse Analysis Model

2015 ◽  
Vol 1 (4) ◽  
pp. 1-10
Author(s):  
Yujie Li ◽  
Shuxue Ding ◽  
Zhenni Li ◽  
Wuhui Chen
2021 ◽  
Vol 429 ◽  
pp. 89-100
Author(s):  
Zhenni Li ◽  
Chao Wan ◽  
Benying Tan ◽  
Zuyuan Yang ◽  
Shengli Xie

Author(s):  
Daniel Danso Essel ◽  
Ben-Bright Benuwa ◽  
Benjamin Ghansah

Sparse Representation (SR) and Dictionary Learning (DL) based classifiers have shown promising results in classification tasks, with impressive recognition rates on image data. In Video Semantic Analysis (VSA), however, the local structure of video data contains significant discriminative information required for classification. To the best of our knowledge, this has not been fully explored by recent DL-based approaches. Further, similar sparse codes are not guaranteed for video features belonging to the same video category. Based on the foregoing, a novel learning algorithm, Sparsity-based Locality-Sensitive Discriminative Dictionary Learning (SLSDDL), is proposed in this paper for VSA. In the proposed algorithm, a category-based discriminant loss function over the sparse coefficients is introduced into the structure of the Locality-Sensitive Dictionary Learning (LSDL) algorithm. Finally, the sparse coefficients for a test video feature sample are solved by the optimization method of SLSDDL, and the video semantic classification result is obtained by minimizing the error between the original and reconstructed samples. The experimental results show that the proposed SLSDDL significantly improves the performance of video semantic detection compared with state-of-the-art approaches. The proposed approach is also robust across diverse video environments, demonstrating its generality.
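The final classification step the abstract describes — reconstructing a test sample from each class's dictionary and picking the class with the smallest residual — can be sketched as follows. This is the generic residual-based decision rule shared by SR/DL classifiers, not the authors' SLSDDL optimization; a ridge-regularized least-squares solve stands in for the sparse coding solver, and the class names and dimensions are hypothetical.

```python
import numpy as np

def residual_classify(D_per_class, x, lam=0.1):
    """Classify x by minimal reconstruction error over per-class dictionaries.
    Ridge regression approximates the sparse coding step for illustration."""
    best_label, best_err = None, np.inf
    for label, D in D_per_class.items():
        # ridge solution: a = (D^T D + lam I)^{-1} D^T x
        a = np.linalg.solve(D.T @ D + lam * np.eye(D.shape[1]), D.T @ x)
        err = np.linalg.norm(x - D @ a)  # reconstruction residual for this class
        if err < best_err:
            best_label, best_err = label, err
    return best_label

rng = np.random.default_rng(0)
# Two toy class dictionaries: 20-dim features, 5 atoms each
D = {c: rng.standard_normal((20, 5)) for c in ("walk", "run")}
x = D["run"] @ rng.standard_normal(5)   # sample lying in the "run" subspace
print(residual_classify(D, x))           # → run
```

A discriminative dictionary learner like SLSDDL would additionally train the dictionaries so that same-category samples produce similar codes; the decision rule at test time remains this residual comparison.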


IEEE Access ◽  
2020 ◽  
Vol 8 ◽  
pp. 212456-212466
Author(s):  
Zhuoyun Miao ◽  
Hongjuan Zhang ◽  
Shuang Ma

2020 ◽  
Vol 29 ◽  
pp. 9220-9233
Author(s):  
Na Han ◽  
Jigang Wu ◽  
Xiaozhao Fang ◽  
Shaohua Teng ◽  
Guoxu Zhou ◽  
...  

2019 ◽  
Vol 19 (04) ◽  
pp. 1950026 ◽  
Author(s):  
Sinam Ajitkumar Singh ◽  
Swanirbhar Majumder

Obstructive sleep apnea (OSA) is a common and severe breathing disorder in which breathing repeatedly stops for longer than 10 s during sleep. Polysomnography (PSG) is the conventional approach for OSA detection, but it is a costly and cumbersome process. To overcome this complication, techniques for detecting sleep apnea from ECG recordings have been under development. ECG-based OSA analysis has been studied for many years; early work concentrated on hand-crafted features, which depend entirely on the experience of human specialists. This study analyzes a novel approach for predicting sleep apnea based on a convolutional neural network (CNN) using a pre-trained AlexNet model. After filtering each per-minute segment of the single-lead ECG recording, 2D scalogram images are generated with the continuous wavelet transform (CWT). Finally, a deep-learning-based CNN is adopted to enhance classification performance. The efficiency of the proposed model is compared with previous methods that used the same datasets. The proposed CNN-based method achieves 86.22% accuracy with 90% sensitivity in per-minute-segment OSA classification. For per-recording OSA diagnosis, it correctly classifies all abnormal apneic recordings, with 100% accuracy. The time-frequency scalogram model also shows excellent independent-validation performance compared with state-of-the-art OSA classification systems. Experimental results show that the proposed method delivers excellent performance with low cost and complexity.
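The CWT scalogram step — turning a one-minute 1D ECG segment into a 2D time-frequency image for the CNN — can be illustrated with a minimal numpy-only sketch. This is a generic complex-Morlet CWT, not the paper's exact preprocessing; the 100 Hz sampling rate, scale range, and the synthetic sine signal are all assumptions for illustration (real pipelines typically use a CWT library such as PyWavelets and resize the image to the CNN's input size).

```python
import numpy as np

def morlet_scalogram(sig, scales, w0=6.0):
    """Build a CWT scalogram (|coefficients|, scale x time) by correlating
    the signal with a complex Morlet wavelet at each scale."""
    rows = []
    for s in scales:
        t = np.arange(-4 * s, 4 * s + 1)              # wavelet support
        psi = np.exp(1j * w0 * t / s) * np.exp(-0.5 * (t / s) ** 2)
        psi /= np.sqrt(s)                              # scale normalization
        # correlation with conj(psi), via a flipped convolution
        rows.append(np.abs(np.convolve(sig, np.conj(psi)[::-1], mode="same")))
    return np.array(rows)                              # shape: (n_scales, n_samples)

fs = 100                              # hypothetical 100 Hz single-lead ECG
t = np.arange(0, 60, 1 / fs)          # one per-minute segment
sig = np.sin(2 * np.pi * 1.2 * t)     # toy 1.2 Hz rhythm, not real ECG
S = morlet_scalogram(sig, scales=np.arange(4, 64, 4))
print(S.shape)                        # → (15, 6000)
```

The resulting magnitude array is what gets rendered as an RGB scalogram image and resized to the pre-trained AlexNet input resolution before fine-tuning.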


2019 ◽  
Vol 11 (13) ◽  
pp. 3499 ◽  
Author(s):  
Se-Hoon Jung ◽  
Jun-Ho Huh

This study proposes a big data analysis and prediction model, based on deep reinforcement learning, for detecting outliers in transmission line tower data. The model automatically chooses the number of clusters K from unlabeled sensor big data. It also measures the distance between data points inside a cluster using the Q-value (the network output), modifying the conventional Deep Q-Network clustering of transmission line tower big data that contains outliers. Specifically, this study performed principal component analysis to categorize transmission line tower data and proposed an automatic initial-centroid selection approach based on the standard normal distribution. It also proposed the A-Deep Q-Learning algorithm, a modification of deep Q-learning that explores policies based on the experience of learning from clustered data; it performs outlier learning on transmission line tower data based on the distance of data within a cluster. The performance evaluation results show that the proposed model achieved an approximately 2.29%–4.19% higher prediction rate and a roughly 0.8%–4.3% higher accuracy than the previous transmission line tower big data analysis model.
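The pre-processing pipeline the abstract describes — PCA projection, clustering, then judging points by their distance to the cluster centroid — can be sketched in a toy numpy-only form. This reproduces only the distance-within-cluster idea, not the A-Deep Q-Learning policy; the 2-component PCA, the fixed initial centroids, the z-score threshold, and the synthetic sensor data are all illustrative assumptions.

```python
import numpy as np

def pca_cluster_outliers(X, centers_init, k_iters=10, z_thresh=3.0):
    """Project data with PCA, run k-means (Lloyd iterations), and flag points
    whose distance to their centroid exceeds mean + z_thresh * std."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    Z = Xc @ Vt[:2].T                                  # 2-component PCA scores
    C = np.asarray(centers_init, dtype=float)
    for _ in range(k_iters):
        d = np.linalg.norm(Z[:, None] - C[None], axis=2)
        labels = d.argmin(axis=1)                      # assign to nearest centroid
        for j in range(len(C)):
            if np.any(labels == j):
                C[j] = Z[labels == j].mean(axis=0)     # recompute centroids
    dist = np.linalg.norm(Z - C[labels], axis=1)       # within-cluster distances
    return dist > dist.mean() + z_thresh * dist.std()

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.3, (50, 5)),            # normal sensor readings
               rng.normal(5, 0.3, (50, 5)),            # second operating regime
               np.full((1, 5), 30.0)])                 # one injected outlier
flags = pca_cluster_outliers(X, centers_init=[[0, 0], [5, 5]])
print(flags.sum())                                     # → 1 (only the outlier)
```

In the paper's model this within-cluster distance feeds the Q-value-driven policy rather than a fixed z-score rule, so the threshold here is only a stand-in for that learned decision.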

