Intra-Pulse Modulation Classification of Radar Emitter Signals Based on a 1-D Selective Kernel Convolutional Neural Network

2021 · Vol. 13 (14) · pp. 2799
Author(s): Shibo Yuan, Bin Wu, Peng Li

The intra-pulse modulation of radar emitter signals is a key feature for analyzing radar systems. Traditional methods, which require a tremendous amount of prior knowledge, are insufficient to classify intra-pulse modulations accurately. Recently, deep learning-based methods, especially convolutional neural networks (CNNs), have been used to classify the intra-pulse modulation of radar emitter signals. However, two-dimensional CNN-based methods, which require a dimensional transformation of the original sampled signals during data preprocessing, are resource-intensive and impractical. To solve these problems, we propose a one-dimensional selective kernel convolutional neural network (1-D SKCNN) to accurately classify the intra-pulse modulation of radar emitter signals. Compared with previous methods described in the literature, the data preprocessing of the proposed method consists merely of zero-padding, the fast Fourier transform (FFT) and amplitude normalization, which is much faster and easier to implement. The experimental results indicate that the proposed method achieves faster data preprocessing and higher accuracy in intra-pulse modulation classification of radar emitter signals.
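The preprocessing chain named in the abstract is simple enough to sketch directly. Below is a minimal NumPy sketch, assuming the sampled pulse arrives as a complex one-dimensional array; the function name `preprocess_pulse` and the padded length `target_len` are illustrative choices, not values from the paper.

```python
import numpy as np

def preprocess_pulse(iq: np.ndarray, target_len: int = 4096) -> np.ndarray:
    """Zero-pad, FFT, and amplitude-normalize one sampled radar pulse.

    A sketch of the preprocessing described in the abstract; target_len
    is an assumed hyperparameter, not a value taken from the paper.
    """
    # Zero-padding: extend (or truncate) the raw samples to a fixed length
    # so every pulse yields an input vector of the same size.
    padded = np.zeros(target_len, dtype=complex)
    n = min(len(iq), target_len)
    padded[:n] = iq[:n]

    # Fast Fourier transform: move to the frequency domain, where the
    # intra-pulse modulation types separate more cleanly.
    spectrum = np.abs(np.fft.fft(padded))

    # Amplitude normalization: scale to [0, 1] so the network input is
    # independent of received signal power.
    return spectrum / (spectrum.max() + 1e-12)

# Example: a linear-FM (chirp) pulse as input.
t = np.arange(1000) / 1e6
pulse = np.exp(1j * 2 * np.pi * (1e3 * t + 5e7 * t ** 2))
x = preprocess_pulse(pulse)   # shape (4096,), ready for a 1-D CNN
```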

2020 · Vol. 2020 · pp. 1-12
Author(s): Feng Wang, Shanshan Huang, Chao Liang

Sensing the external, complex electromagnetic environment is an important function of cognitive radar, and the concept of cognition has attracted wide attention in the radar field since it was proposed. In this paper, a novel method based on a multidimensional feature map and a convolutional neural network (CNN) is proposed to automatically classify the jamming entering a cognitive radar system. The multidimensional feature map consists of two envelope maps, taken before and after pulse compression, and a time-frequency map of the receive-beam signal. By drawing each one-dimensional envelope in a two-dimensional plane and quantizing the time-frequency data onto a two-dimensional plane, the combination of the three planes (the multidimensional feature map) is treated as a single picture. A CNN-based algorithm with a linear kernel that senses the three planes simultaneously is selected to accomplish jamming classification. The classification of jamming types such as noise frequency modulation jamming, noise amplitude modulation jamming, slice jamming, and dense repeat jamming is validated by computer simulation. A performance comparison of convolutional kernels of different sizes demonstrates the advantage of selecting the linear kernel.
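The construction of the multidimensional feature map can be sketched as follows. This is a minimal NumPy/SciPy illustration of the idea, not the authors' implementation: `curve_to_plane` renders a one-dimensional envelope as a binary image, a matched filter stands in for pulse compression, and an STFT magnitude serves as the time-frequency map; all names, sizes, and the 64x64 grid are assumptions.

```python
import numpy as np
from scipy.signal import stft

H, W = 64, 64  # assumed size of each plane

def curve_to_plane(x: np.ndarray) -> np.ndarray:
    """Draw a 1-D envelope as a binary curve in a 2-D plane."""
    x = np.interp(np.linspace(0, len(x) - 1, W), np.arange(len(x)), x)
    rows = ((1.0 - x / (x.max() + 1e-12)) * (H - 1)).astype(int)
    plane = np.zeros((H, W), dtype=np.float32)
    plane[rows, np.arange(W)] = 1.0
    return plane

def quantize_plane(a: np.ndarray) -> np.ndarray:
    """Nearest-neighbour resample of a 2-D map onto the H x W grid."""
    ri = np.linspace(0, a.shape[0] - 1, H).astype(int)
    ci = np.linspace(0, a.shape[1] - 1, W).astype(int)
    return a[np.ix_(ri, ci)].astype(np.float32)

def feature_map(rx: np.ndarray, ref: np.ndarray, fs: float) -> np.ndarray:
    """Stack the three planes described in the abstract as image channels."""
    env_before = np.abs(rx)                       # envelope before pulse compression
    matched = np.convolve(rx, np.conj(ref[::-1]), mode="same")
    env_after = np.abs(matched)                   # envelope after pulse compression
    _, _, z = stft(rx, fs=fs, nperseg=64, return_onesided=False)
    tf = np.abs(z)                                # time-frequency map
    return np.stack([curve_to_plane(env_before),
                     curve_to_plane(env_after),
                     quantize_plane(tf / (tf.max() + 1e-12))])

# Example: a noisy copy of the transmitted waveform as the received signal.
fs = 1e6
t = np.arange(512) / fs
ref = np.exp(2j * np.pi * 1e5 * t)                # transmitted pulse
rx = ref + 0.5 * np.random.randn(512)             # received beam signal
picture = feature_map(rx, ref, fs)                # shape (3, 64, 64), one "picture"
```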


IEEE Access · 2020 · Vol. 8 · pp. 176717-176727
Author(s): Jinwook Kim, Seongwook Lee, Yong-Hwa Kim, Seong-Cheol Kim

2020 · Vol. 2020 (4) · pp. 4-14
Author(s): Vladimir Budak, Ekaterina Ilyina

The article proposes a classification of lenses with different symmetrical beam angles and offers a scale serving as a spotlight palette. A collection of spotlight images was created and classified according to the proposed scale. An analysis of 788 existing lenses and reflectors with different LEDs and COBs was carried out, and the dependence of the axial luminous intensity on the beam angle was obtained. Transfer learning of a new deep convolutional neural network (CNN) based on the pre-trained GoogLeNet was performed using this collection. Grad-CAM analysis showed that the trained network correctly identifies the features of the objects. This work makes it possible to classify arbitrary spotlights with an accuracy of about 80%. Thus, a lighting designer can use this new CNN-based model to determine the class of a spotlight and the corresponding type of lens, together with its technical parameters.
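Transfer learning from a pre-trained GoogLeNet reduces, in code, to swapping the final classifier and training only the new head. A minimal PyTorch/torchvision sketch follows, assuming a recent torchvision; the number of beam-angle classes `NUM_CLASSES` and the freezing strategy are illustrative assumptions, not details taken from the article.

```python
import torch
import torch.nn as nn
from torchvision import models

NUM_CLASSES = 6  # assumed number of classes on the proposed beam-angle scale

# Start from GoogLeNet pre-trained on ImageNet and replace the classifier.
model = models.googlenet(weights=models.GoogLeNet_Weights.IMAGENET1K_V1)
for p in model.parameters():
    p.requires_grad = False                                   # freeze the feature extractor
model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)       # new, trainable head

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

# One illustrative training step on a dummy batch of spotlight images.
images = torch.randn(8, 3, 224, 224)              # stand-in for the image collection
labels = torch.randint(0, NUM_CLASSES, (8,))
model.train()
optimizer.zero_grad()
out = model(images)
logits = out.logits if isinstance(out, tuple) else out  # train mode also returns aux heads
loss = criterion(logits, labels)
loss.backward()
optimizer.step()
```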


Author(s): P.L. Nikolaev

This article deals with a method for the binary classification of images containing small text. The classification is based on the fact that the text can have one of two orientations: it is either positioned horizontally and read from left to right, or it is rotated 180 degrees, so that the image must be rotated before the text can be read. Such text can be found on the covers of a variety of books, so when recognizing covers it is necessary to determine the orientation of the text before recognizing the text itself. The article describes the development of a deep neural network for determining text orientation in the context of book cover recognition. The results of training and testing a convolutional neural network on synthetic data, as well as examples of the network operating on real data, are presented.
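The synthetic-data idea is convenient because rotating an upright crop by 180 degrees produces a training example with a known label for free. The sketch below illustrates this with a deliberately small PyTorch CNN; the architecture, input size, and label convention are assumptions, not the author's network.

```python
import torch
import torch.nn as nn

class OrientationNet(nn.Module):
    """Tiny CNN that predicts whether text is upright or rotated 180 degrees.

    A minimal sketch of the idea in the abstract; the input is assumed to
    be a 1 x 64 x 256 grayscale crop.
    """
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(64, 2)   # 0 = upright, 1 = rotated 180 degrees

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

# Synthetic training pairs: flipping an upright crop along both spatial
# axes is a 180-degree rotation, so labels come for free.
upright = torch.rand(8, 1, 64, 256)            # stand-in for rendered text crops
rotated = torch.flip(upright, dims=[2, 3])     # 180-degree rotation
x = torch.cat([upright, rotated])
y = torch.cat([torch.zeros(8, dtype=torch.long), torch.ones(8, dtype=torch.long)])

model = OrientationNet()
loss = nn.CrossEntropyLoss()(model(x), y)      # one illustrative training step
loss.backward()
```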


2020 · Vol. 14
Author(s): Lahari Tipirneni, Rizwan Patan

Abstract: Breast cancer causes millions of deaths all over the world every year and has become the most common type of cancer in women. Early detection helps achieve a better prognosis and increases the chance of survival. Automating the classification with Computer-Aided Diagnosis (CAD) systems can make the diagnosis less prone to errors. Both multi-class and binary classification of breast cancer are challenging problems. A single convolutional neural network architecture extracts specific feature descriptors from images, which cannot represent all the different types of breast cancer; this leads to false positives in classification, which is undesirable in disease diagnosis. The present paper proposes an ensemble convolutional neural network for the multi-class and binary classification of breast cancer, in which the feature descriptors from each network are combined to produce the final classification. Histopathological images are taken from the publicly available BreakHis dataset and classified into eight classes. The proposed ensemble model performs better than the methods reported in the literature, and the results show that it is a viable approach for breast cancer classification.
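A feature-level ensemble of this kind can be sketched in PyTorch as two backbones whose descriptors are concatenated before a shared classification head. The abstract does not specify the member networks or the fusion rule, so the ResNet-18/DenseNet-121 pairing and the simple concatenation below are illustrative assumptions (a recent torchvision is assumed for the weight enums).

```python
import torch
import torch.nn as nn
from torchvision import models

class EnsembleCNN(nn.Module):
    """Feature-level ensemble of two CNN backbones for histopathology images.

    A sketch of the general idea: each backbone extracts its own feature
    descriptor, the descriptors are concatenated, and a shared head makes
    the final 8-class (or binary) prediction.
    """
    def __init__(self, num_classes: int = 8):
        super().__init__()
        r = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)
        d = models.densenet121(weights=models.DenseNet121_Weights.IMAGENET1K_V1)
        # ResNet-18 without its final fc layer -> 512-d descriptor.
        self.backbone_a = nn.Sequential(*list(r.children())[:-1])
        # DenseNet-121 convolutional features + pooling -> 1024-d descriptor.
        self.backbone_b = nn.Sequential(d.features, nn.ReLU(),
                                        nn.AdaptiveAvgPool2d(1))
        self.head = nn.Linear(512 + 1024, num_classes)

    def forward(self, x):
        fa = self.backbone_a(x).flatten(1)
        fb = self.backbone_b(x).flatten(1)
        return self.head(torch.cat([fa, fb], dim=1))  # fused descriptor -> classes

model = EnsembleCNN(num_classes=8)                    # eight BreakHis classes
logits = model(torch.randn(4, 3, 224, 224))           # stand-in image batch
```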

