Robust clutter suppression in heterogeneous environments based on multiple frames and similarities

2020
Author(s): Yifeng Wu, Yufeng Cheng, Jun Tang, Jia Duan, Xiaobo Deng

Abstract A method of robust clutter suppression with space-time adaptive processing (STAP) for airborne radar in heterogeneous environments is proposed, based on multiple frames and the similarity between the cell under test and each training sample. The method addresses covariance matrix estimation for radar signal processing and offers a way to overcome both the performance degradation of STAP in heterogeneous environments and the limited supply of training samples. First, the method expands the set of training samples by selecting training frames from past frames. Second, initial training samples are selected from the expanded training sample set, which is composed of samples from the current frame and past frames. Third, the generalized inner product (GIP) method is applied to discard heterogeneous samples. Fourth, the similarities between the cell under test and the remaining training samples are estimated, and samples more similar to the cell under test receive higher weights in the estimate of the clutter covariance matrix. The accuracy of the estimated clutter characteristics improves significantly, and the clutter-suppression performance improves with it. Experimental results on measured data demonstrate the performance of the proposed method. A sketch of the screening and weighting stages follows.
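The sketch below illustrates the two stages in Python: GIP-based rejection of heterogeneous samples, then a similarity-weighted covariance estimate. The similarity metric (normalized correlation magnitude), the diagonal loading, and the number of retained samples are illustrative assumptions, not the paper's exact choices.

```python
import numpy as np

def gip_screen(samples, n_keep):
    """Discard heterogeneous training samples via the generalized inner
    product (GIP): samples with the largest GIP values are treated as
    outliers and dropped. samples: (K, N) complex array, one snapshot per row."""
    K, N = samples.shape
    # Initial covariance estimate from all candidates, with diagonal loading
    R0 = samples.conj().T @ samples / K + 1e-6 * np.eye(N)
    R0_inv = np.linalg.inv(R0)
    # GIP value x^H R^{-1} x for each sample
    gip = np.real(np.einsum('kn,nm,km->k', samples.conj(), R0_inv, samples))
    keep = np.argsort(gip)[:n_keep]
    return samples[keep]

def similarity_weighted_covariance(cut, samples):
    """Clutter covariance estimate in which samples more similar to the
    cell under test (cut) get larger weights. The similarity metric here
    (normalized correlation magnitude) is an illustrative assumption."""
    sims = np.abs(samples @ cut.conj()) / (
        np.linalg.norm(samples, axis=1) * np.linalg.norm(cut) + 1e-12)
    w = sims / sims.sum()
    # Weighted sum of rank-one outer products x_k x_k^H
    return np.einsum('k,kn,km->nm', w, samples, samples.conj())

# Toy usage: 64 snapshots of dimension 16 (e.g., pooled from several frames)
rng = np.random.default_rng(0)
pool = rng.normal(size=(64, 16)) + 1j * rng.normal(size=(64, 16))
cut = rng.normal(size=16) + 1j * rng.normal(size=16)
R = similarity_weighted_covariance(cut, gip_screen(pool, n_keep=32))
```

Pooling snapshots across frames before screening is what lets the weighting stage matter: with more candidates than needed, the estimator can afford to down-weight dissimilar ones instead of using every sample equally.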

2017, Vol. 2017, pp. 1-9
Author(s): Qiang Wang, Yongshun Zhang, Hanwei Liu, Yiduo Guo

Training samples contaminated by target-like signals are one of the major causes of an inhomogeneous clutter environment. In such an environment, the clutter covariance matrix in STAP (space-time adaptive processing) is estimated inaccurately, which ultimately degrades detection performance. To address this problem, a STAP interference-detection method based on a simplified TT (time-time) transform is proposed in this letter. Exploiting the sparsity of clutter in the space-time plane, the data in each range cell are first converted into a discrete slow-time series. Second, the expression for the simplified TT transform of the sample data is derived step by step. Third, the energy of each training sample is concentrated and extracted by the simplified TT transform, using the energy difference between the unpolluted and polluted stages, and the physical rationale for discarding contaminated samples is analyzed. Finally, the contaminated samples are picked out according to differences in the simplified TT-transform spectrum. Monte Carlo simulation results indicate that when training samples are contaminated by high-power target-like signals, the proposed method removes the contaminated samples more effectively, reduces computational complexity significantly, and improves target-detection performance compared with the GIP (generalized inner product) method.
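A minimal sketch of the screening idea, using a sliding-Gaussian localization as a stand-in for the paper's simplified TT transform and a median-based energy threshold; the exact transform expression, window, and threshold are not reproduced here.

```python
import numpy as np

def tt_energy(x, width=8.0):
    """Per-sample energy of a slow-time series after sliding-Gaussian
    localization, standing in for the simplified TT transform.
    The window width is an illustrative choice."""
    n = len(x)
    t = np.arange(n)
    total = 0.0
    for tau in range(n):
        win = np.exp(-0.5 * ((t - tau) / width) ** 2)  # window centered at tau
        total += np.sum(np.abs(x * win) ** 2)
    return total / n

def screen_samples(training, factor=3.0):
    """Keep samples whose localized energy stays near the median; a sample
    carrying a strong target-like signal shows a large energy excess."""
    e = np.array([tt_energy(row) for row in training])
    return e < factor * np.median(e)  # boolean mask: True = keep

# Toy usage: 20 clutter-like series, two of them carrying a strong tone
rng = np.random.default_rng(0)
data = rng.normal(size=(20, 128)) + 1j * rng.normal(size=(20, 128))
tone = 10.0 * np.exp(1j * 2 * np.pi * 0.2 * np.arange(128))
data[3] += tone
data[11] += tone
print(screen_samples(data))  # entries 3 and 11 come out False
```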


1997, Vol. 08 (05n06), pp. 509-515
Author(s): Yan Li, A. B. Rad

A new structure and training method for multilayer neural networks is presented. The proposed method is based on cascade training of subnetworks, optimizing the weights layer by layer. The training procedure has two steps. First, a subnetwork with m inputs and n outputs, matching the format of the training samples, is trained on the training samples. Second, another subnetwork with n inputs and n outputs is trained, taking the outputs of the first subnetwork as its inputs and the desired outputs of the training samples as its targets. Finally, the two trained subnetworks are connected to form a trained multilayer neural network. Numerical simulation results based on both the linear least-squares back-propagation (LSB) and the traditional back-propagation (BP) algorithms demonstrate the efficiency of the proposed method.
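A minimal numpy sketch of the cascade procedure, assuming plain gradient-descent back-propagation for each single-layer subnetwork (the paper also describes an LSB variant); the dimensions, learning rate, and tanh activation are illustrative choices.

```python
import numpy as np

def train_layer(X, Y, lr=0.01, epochs=500, seed=0):
    """Train one single-layer tanh subnetwork by gradient descent.
    Returns weights W and biases b; hyperparameters are illustrative."""
    rng = np.random.default_rng(seed)
    W = rng.normal(0, 0.1, (X.shape[1], Y.shape[1]))
    b = np.zeros(Y.shape[1])
    for _ in range(epochs):
        H = np.tanh(X @ W + b)
        grad = (H - Y) * (1 - H ** 2)  # dLoss/d(pre-activation), squared loss
        W -= lr * X.T @ grad / len(X)
        b -= lr * grad.mean(axis=0)
    return W, b

# Cascade: subnetwork 1 maps m inputs -> n outputs;
# subnetwork 2 refines those n outputs toward the n desired outputs.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 4))             # m = 4 inputs
Y = np.tanh(X @ rng.normal(size=(4, 2)))  # n = 2 desired outputs
W1, b1 = train_layer(X, Y)                # step 1: train first subnetwork
H = np.tanh(X @ W1 + b1)                  # outputs of subnetwork 1
W2, b2 = train_layer(H, Y)                # step 2: train on H -> Y
pred = np.tanh(H @ W2 + b2)               # the connected two-stage network
```

Because each subnetwork is trained against the same desired outputs, each stage only has to correct the residual error of the stage before it, which is what makes the layer-by-layer optimization tractable.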


2018, Vol. 37 (9), pp. 4136-4149
Author(s): Yan Zhou, Lin Wang, Xiaoxuan Chen, Cai Wen, Bo Jiang, ...
