Information-Theoretic Radar Waveform Design under the SINR Constraint

Entropy ◽  
2020 ◽  
Vol 22 (10) ◽  
pp. 1182
Author(s):  
Yu Xiao ◽  
Zhenghong Deng ◽  
Tao Wu

This study investigates the information-theoretic waveform design problem to improve radar performance in signal-dependent clutter environments. The goal is to study waveform energy allocation strategies and to provide guidance for radar waveform design through the trade-off between the information-theoretic criteria and the signal-to-interference-plus-noise ratio (SINR) criterion. To this end, a model of the constraint relationship among the mutual information (MI), the Kullback–Leibler divergence (KLD), and the SINR is established in the frequency domain. The effects of the SINR value range on maximizing the MI and the KLD under the energy constraint are derived. Under the constraints of energy and SINR, an optimal radar waveform method based on maximizing the MI is proposed for radar estimation, and another method based on maximizing the KLD is proposed for radar detection. The maximum MI value range is bounded by the SINR, and the maximum KLD value range lies between 0 and the Jensen–Shannon divergence (J-divergence) value. Simulation results show that, under the SINR constraint, the MI-based optimal waveform makes full use of the transmitted energy for target information extraction and places the signal energy in the frequency bins where the target spectrum exceeds the clutter spectrum. The KLD-based optimal waveform, in contrast, makes full use of the transmitted energy for target detection and places the signal energy in the frequency bin with the maximum target spectrum.
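
As an illustration of the energy-allocation behaviour described above, the sketch below water-fills transmit energy across frequency bins to maximize an MI-like objective under an energy constraint only. It is a minimal sketch assuming a discretized frequency-domain model with hypothetical target and noise power spectral densities (target_psd, noise_psd); the signal-dependent clutter term and the SINR constraint from the paper are omitted for simplicity.

```python
import numpy as np

def waterfilling_mi(target_psd, noise_psd, total_energy, iters=100):
    """Allocate transmit energy E_k across frequency bins to maximize
    sum_k log(1 + E_k * target_psd[k] / noise_psd[k])  s.t.  sum_k E_k <= total_energy.
    Classical water-filling, solved by bisection on the water level."""
    ratio = noise_psd / target_psd          # per-bin "floor" height
    lo, hi = 0.0, ratio.max() + total_energy
    for _ in range(iters):
        level = 0.5 * (lo + hi)
        energy = np.maximum(level - ratio, 0.0)
        if energy.sum() > total_energy:
            hi = level
        else:
            lo = level
    return np.maximum(lo - ratio, 0.0)

# toy example: energy concentrates where the target spectrum dominates the noise
target_psd = np.array([4.0, 1.0, 0.2, 3.0])
noise_psd = np.ones(4)
print(waterfilling_mi(target_psd, noise_psd, total_energy=2.0))
```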

Author(s):  
Zhenghan Zhu ◽  
Steven Kay ◽  
R. S. Raghavan

Radar transmit signal design is a critical factor for radar performance. In this paper, we investigate the problem of radar waveform design under small signal power conditions for detecting a doubly spread target, whose impulse response can be modeled as a random process, in a colored noise environment. The doubly spread target spans multiple range bins (range-spread) and its impulse response is time-varying due to fluctuation (hence also Doppler-spread), such that the target impulse response is both time-selective and frequency-selective. Instead of adopting the conventional assumption that the target is wide-sense stationary with uncorrelated scattering, we assume that the target impulse response is wide-sense stationary both in range and in time, to account for the possible correlation between the impulse responses corresponding to close range intervals. The locally most powerful detector, which is asymptotically optimal for small signals, is then derived for detecting such targets. The signal waveform is optimized to maximize the detection performance of this detector or, equivalently, to maximize the Kullback-Leibler divergence. Numerical simulations validate the effectiveness of the proposed waveform design under small signal power conditions, and the performance of the optimum waveform is compared against a frequency-modulated waveform.
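
For a sense of the design metric, the snippet below evaluates the Kullback-Leibler divergence between two zero-mean multivariate Gaussians, which is the closed form such KLD-based detection criteria reduce to under a Gaussian echo model. This is a generic formula sketch, not the paper's detector or waveform optimization; the covariance matrices cov1 and cov0 are placeholders for the echo covariances under the two hypotheses.

```python
import numpy as np

def kld_zero_mean_gaussians(cov1, cov0):
    """D( N(0, cov1) || N(0, cov0) ) =
       0.5 * ( tr(cov0^{-1} cov1) - k + ln det(cov0) - ln det(cov1) )."""
    k = cov0.shape[0]
    inv0 = np.linalg.inv(cov0)
    _, logdet0 = np.linalg.slogdet(cov0)
    _, logdet1 = np.linalg.slogdet(cov1)
    return 0.5 * (np.trace(inv0 @ cov1) - k + logdet0 - logdet1)

# sanity check: identical covariances give zero divergence
c = np.array([[2.0, 0.3], [0.3, 1.0]])
print(kld_zero_mean_gaussians(c, c))   # ~0.0
```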


Author(s):  
Ryan Ka Yau Lai ◽  
Youngah Do

This article explores a method of creating confidence bounds for information-theoretic measures in linguistics, such as entropy, Kullback-Leibler Divergence (KLD), and mutual information. We show that a useful measure of uncertainty can be derived from simple statistical principles, namely the asymptotic distribution of the maximum likelihood estimator (MLE) and the delta method. Three case studies from phonology and corpus linguistics are used to demonstrate how to apply it and examine its robustness against common violations of its assumptions in linguistics, such as insufficient sample size and non-independence of data points.
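
A minimal sketch of the kind of interval the article describes, assuming a simple multinomial model: the plug-in (MLE) entropy of observed category counts together with a delta-method standard error, where Var(H_hat) is approximately Var_p(ln p)/n. The function name and the 95% z-value are illustrative choices, not taken from the article.

```python
import numpy as np

def entropy_with_ci(counts, z=1.96):
    """Plug-in (MLE) entropy of a multinomial sample with a delta-method
    confidence interval: Var(H_hat) ~= Var_p(ln p) / n."""
    counts = np.asarray(counts, dtype=float)
    n = counts.sum()
    p = counts[counts > 0] / n
    log_p = np.log(p)
    h = -(p * log_p).sum()
    var = (p * log_p**2).sum() - (p * log_p).sum()**2   # Var_p(ln p)
    se = np.sqrt(var / n)
    return h, (h - z * se, h + z * se)

print(entropy_with_ci([50, 30, 15, 5]))   # entropy in nats with ~95% bounds
```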


Entropy ◽  
2019 ◽  
Vol 21 (1) ◽  
pp. 33 ◽  
Author(s):  
Bin Wang ◽  
Xu Chen ◽  
Fengming Xin ◽  
Xin Song

Due to the uncertainty of radar target prior information in actual scenes, waveforms designed from such prior information cannot meet the requirements of detection and parameter estimation performance. In this paper, optimal waveform design techniques under energy constraints are considered for different tasks. To improve the detection performance of radar systems, a novel waveform design method that maximizes the signal-to-interference-plus-noise ratio (SINR) for known and random extended targets is proposed. To improve parameter estimation performance, another waveform design method that maximizes the mutual information (MI) between the radar echo and the random target spectrum response is also considered. Most previous waveform design studies assumed that the prior information on the target spectrum is completely known. However, in actual scenes, the true target spectrum cannot be captured accurately. To model this scenario, the true target spectrum is assumed to lie within an uncertainty range with known upper and lower bounds. SINR- and MI-based maximin robust waveforms are then designed, which optimize the performance under the most unfavorable conditions. The simulation results show that the optimal waveforms designed under these two criteria differ, which provides useful guidance for waveform energy allocation in different transmission tasks. Under the constraint of limited energy, we also found that the worst-case improvement in SINR or MI for single targets is less significant than that for multiple targets.
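
To illustrate the maximin idea, the sketch below exploits the fact that, per frequency bin, an MI-style objective increases with the target spectrum, so the worst case over a bounded uncertainty band is attained at the lower bound; the robust allocation then water-fills against that lower bound. This is a simplified sketch with hypothetical arrays (target_lo, target_hi, noise_psd) and no clutter term, not the paper's exact formulation.

```python
import numpy as np

def worst_case_mi(energy, target_lo, noise_psd):
    """MI is increasing in the target spectrum in each bin, so the worst case
    within the band [target_lo, target_hi] is attained at target_lo."""
    return float(np.sum(np.log1p(energy * target_lo / noise_psd)))

def robust_mi_waveform(target_lo, noise_psd, total_energy, iters=100):
    """Maximin MI sketch: water-fill against the lower bound of the
    target-spectrum uncertainty band."""
    ratio = noise_psd / target_lo
    lo, hi = 0.0, ratio.max() + total_energy
    for _ in range(iters):
        level = 0.5 * (lo + hi)
        if np.maximum(level - ratio, 0.0).sum() > total_energy:
            hi = level
        else:
            lo = level
    return np.maximum(lo - ratio, 0.0)

target_lo = np.array([0.5, 0.2, 2.0, 1.0])   # lower bound of the uncertainty band
noise = np.ones(4)
e = robust_mi_waveform(target_lo, noise, total_energy=2.0)
print(e, worst_case_mi(e, target_lo, noise))
```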


2011 ◽  
Vol 11 (2-3) ◽  
pp. 263-296 ◽  
Author(s):  
SHAY B. COHEN ◽  
ROBERT J. SIMMONS ◽  
NOAH A. SMITH

Weighted logic programming, a generalization of bottom-up logic programming, is a well-suited framework for specifying dynamic programming algorithms. In this setting, proofs correspond to the algorithm's output space, such as a path through a graph or a grammatical derivation, and are given a real-valued score (often interpreted as a probability) that depends on the real weights of the base axioms used in the proof. The desired output is a function over all possible proofs, such as a sum of scores or an optimal score. We describe the product transformation, which can merge two weighted logic programs into a new one. The resulting program optimizes a product of proof scores from the original programs, constituting a scoring function known in machine learning as a “product of experts.” Through the addition of intuitive constraining side conditions, we show that several important dynamic programming algorithms can be derived by applying the product transformation to simpler weighted logic programs. In addition, we show how the computation of Kullback–Leibler divergence, an information-theoretic measure, can be interpreted using the product transformation.
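
As a toy illustration of the product-of-experts idea (not the paper's formal product transformation over weighted logic programs), the sketch below scores each path in a small DAG by the product of two experts' edge probabilities and finds the best-scoring path with a single Viterbi-style pass; the graph and weights are made up for the example.

```python
from collections import defaultdict

# Each expert assigns a probability to every edge; a path's combined score is
# the product of both experts' path scores. Because both experts score the
# same path structure, one dynamic-programming pass with multiplied edge
# weights suffices.
edges = {  # node -> list of (next_node, expert1_weight, expert2_weight)
    "s": [("a", 0.6, 0.9), ("b", 0.4, 0.1)],
    "a": [("t", 0.5, 0.7)],
    "b": [("t", 0.9, 0.8)],
}

def best_product_path(edges, source, target, topo_order):
    best = defaultdict(float)
    back = {}
    best[source] = 1.0
    for u in topo_order:
        for v, w1, w2 in edges.get(u, []):
            score = best[u] * (w1 * w2)    # product of the two experts
            if score > best[v]:
                best[v], back[v] = score, u
    path, node = [target], target
    while node != source:
        node = back[node]
        path.append(node)
    return best[target], path[::-1]

print(best_product_path(edges, "s", "t", topo_order=("s", "a", "b")))
```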


2013 ◽  
Vol 427-429 ◽  
pp. 1537-1543 ◽  
Author(s):  
Ya Fen Wang ◽  
Feng Zhen Zhang ◽  
Shan Jian Liu ◽  
Meng Huang

In this paper, we study an information-theoretic approach to image similarity measurement for content-based image retrieval. In this scheme, similarity is measured by the amount of information the images contain about one another, i.e., their mutual information (MI). The approach is based on the premise that two similar images should have high mutual information, or equivalently, that the query image should convey much information about images similar to it. The method first generates a set of statistically representative visual patterns and uses the distributions of these patterns as image content descriptors. To measure the similarity of two images, we develop a method to compute the mutual information between their content descriptors. Two images with larger descriptor mutual information are regarded as more similar. We present experimental results demonstrating that mutual information is a more effective image similarity measure than those previously used in the literature, such as the Kullback-Leibler divergence and the L2 norm.
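
A minimal sketch of one way such a descriptor MI could be computed, assuming (hypothetically) that each image has been quantized into a sequence of visual-pattern labels over corresponding blocks: the joint label histogram of the two images yields an MI estimate. The exact descriptor construction and MI computation in the paper may differ.

```python
import numpy as np

def mutual_information(labels_x, labels_y, num_patterns):
    """MI (in nats) between two images' visual-pattern labels, estimated from
    the joint histogram of labels assigned to corresponding blocks."""
    joint = np.zeros((num_patterns, num_patterns))
    for x, y in zip(labels_x, labels_y):
        joint[x, y] += 1.0
    joint /= joint.sum()
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    mask = joint > 0
    return float((joint[mask] * np.log(joint[mask] / (px @ py)[mask])).sum())

# toy example: identical label sequences give maximal MI for this pair
a = [0, 1, 2, 1, 0, 2, 2, 1]
b = [0, 1, 2, 1, 0, 2, 2, 1]
print(mutual_information(a, b, num_patterns=3))
```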


2010 ◽  
Vol 2010 ◽  
pp. 1-18 ◽  
Author(s):  
Mostafa Afgani ◽  
Sinan Sinanović ◽  
Harald Haas

Efficient utilisation and sharing of limited spectrum resources in an autonomous fashion is one of the primary goals of cognitive radio. However, decentralised spectrum sharing can lead to interference scenarios that must be detected and characterised to help achieve the other goal of cognitive radio: reliable service for the end user. Interference events can be treated as unusual, so anomaly detection algorithms can be applied to detect them. Two complementary algorithms based on information-theoretic measures of statistical distribution divergence and information content are proposed. The first method is applicable to signals with periodic structures and is based on the analysis of the Kullback-Leibler divergence. The second utilises information content analysis to detect unusual events. Results from software and hardware implementations show that the proposed algorithms are effective, simple, and capable of processing high-speed signals in real time. Additionally, neither algorithm requires demodulation of the signal.
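
A rough sketch of the divergence-based idea behind the first algorithm, under assumed parameters: each window of the signal is histogrammed and compared, via the discrete Kullback-Leibler divergence, against a reference histogram built from interference-free data; windows exceeding a threshold are flagged. The window size, bin count, and threshold below are illustrative, not values from the paper.

```python
import numpy as np

def kld_hist(p_counts, q_counts, eps=1e-12):
    """Discrete KL divergence D(P || Q) between two count histograms."""
    p = p_counts / p_counts.sum()
    q = q_counts / q_counts.sum()
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

def detect_anomalies(signal, reference, window=256, bins=32, threshold=0.5):
    """Flag windows whose amplitude histogram diverges from the reference."""
    edges = np.histogram_bin_edges(reference, bins=bins)
    ref_hist, _ = np.histogram(reference, bins=edges)
    flags = []
    for start in range(0, len(signal) - window + 1, window):
        hist, _ = np.histogram(signal[start:start + window], bins=edges)
        flags.append(kld_hist(hist.astype(float), ref_hist.astype(float)) > threshold)
    return flags

rng = np.random.default_rng(0)
clean = rng.normal(0, 1, 1024)
corrupted = np.concatenate([rng.normal(0, 1, 512), rng.normal(0, 3, 512)])
print(detect_anomalies(corrupted, reference=clean))   # later windows flagged
```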


2016 ◽  
Vol 2 ◽  
pp. e71
Author(s):  
Justin Bedo ◽  
Benjamin Goudey ◽  
Jeremy Wazny ◽  
Zeyu Zhou

While traditional methods for calling variants across whole genome sequence data rely on alignment to an appropriate reference sequence, alternative techniques are needed when a suitable reference does not exist. We present a novel alignment- and assembly-free variant calling method based on information-theoretic principles, designed to detect variants that have strong statistical evidence for their ability to segregate samples in a given dataset. Our method uses the context surrounding a particular nucleotide to define variants. Given a set of reads, we model the probability of observing a given nucleotide conditioned on the surrounding prefix and suffix of length k as a multinomial distribution. We then estimate which of these contexts are stable intra-sample and varying inter-sample using a statistic based on the Kullback–Leibler divergence. The utility of the variant calling method was evaluated through analysis of a pair of bacterial datasets and a mouse dataset. We found that our variants are highly informative for supervised learning tasks, with performance similar to standard reference-based calls and another reference-free method (DiscoSNP++). Comparisons against reference-based calls showed our method was able to capture very similar population structure on the bacterial dataset. The algorithm's focus on discriminatory variants makes it suitable for many common analysis tasks for organisms that are too diverse to be mapped back to a single reference sequence.
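
A simplified sketch of a KL-based context statistic, under assumptions: for one fixed context, per-sample nucleotide counts are compared against the pooled distribution, and contexts whose samples diverge strongly from the pool are candidates for segregating variants. This is a hypothetical simplification, not the paper's exact statistic.

```python
import numpy as np

def context_divergence(counts_per_sample, eps=1e-9):
    """Heterogeneity score for one sequence context. counts_per_sample is an
    (S x 4) array of per-sample nucleotide counts observed after that context.
    Returns sum_s n_s * KL(p_s || p_pooled); large values suggest the context
    segregates samples."""
    counts = np.asarray(counts_per_sample, dtype=float)
    pooled = counts.sum(axis=0)
    p_pool = pooled / pooled.sum()
    score = 0.0
    for row in counts:
        n_s = row.sum()
        if n_s == 0:
            continue
        p_s = row / n_s
        score += n_s * np.sum(p_s * np.log((p_s + eps) / (p_pool + eps)))
    return score

# two samples that disagree on the dominant nucleotide score highly
print(context_divergence([[40, 2, 1, 0], [3, 38, 1, 1]]))
# two concordant samples score much lower
print(context_divergence([[40, 2, 1, 0], [41, 3, 0, 1]]))
```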


Author(s):  
Bertrand Charpentier ◽  
Thomas Bonald

We introduce the tree sampling divergence (TSD), an information-theoretic metric for assessing the quality of the hierarchical clustering of a graph. Any hierarchical clustering of a graph can be represented as a tree whose nodes correspond to clusters of the graph. The TSD is the Kullback-Leibler divergence between two probability distributions over the nodes of this tree: those induced by sampling edges and node pairs of the graph at random, respectively. A fundamental property of the proposed metric is that it is interpretable in terms of graph reconstruction: it quantifies the ability to reconstruct the graph from the tree in terms of information loss. In particular, the TSD is maximal when perfect reconstruction is feasible, i.e., when the graph has a complete hierarchical structure. Another key property of the TSD is that it applies to any tree, not necessarily a binary one. In particular, the TSD can be used to compress a binary tree while minimizing the information loss in terms of graph reconstruction, so as to obtain a compact representation of the hierarchical structure of a graph. We illustrate the behavior of the TSD compared to existing metrics in experiments on both synthetic and real datasets.
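
A small sketch of a TSD-style computation for an unweighted graph, under stated simplifications: p puts mass on the lowest common ancestor (LCA) of the endpoints of a uniformly sampled edge, q on the LCA of two leaves sampled independently in proportion to their degrees, and the score is the KL divergence between them. Node-pair sampling details (for example, self-pairs) follow a simplified convention rather than the paper's exact definition.

```python
import numpy as np
from itertools import product

def tsd(edges, parent, num_leaves):
    """KL( p || q ) over tree nodes. `parent` maps every tree node to its
    parent (root maps to itself); leaves are 0..num_leaves-1."""
    def ancestors(x):
        path = [x]
        while parent[x] != x:
            x = parent[x]
            path.append(x)
        return path

    def lca(u, v):
        au = set(ancestors(u))
        for a in ancestors(v):
            if a in au:
                return a

    deg = np.zeros(num_leaves)
    p = {}
    for u, v in edges:                      # edge sampling distribution
        deg[u] += 1
        deg[v] += 1
        a = lca(u, v)
        p[a] = p.get(a, 0.0) + 1.0 / len(edges)

    q = {}
    w = deg / deg.sum()
    for i, j in product(range(num_leaves), repeat=2):   # degree-weighted pairs
        a = lca(i, j)
        q[a] = q.get(a, 0.0) + w[i] * w[j]

    return sum(pi * np.log(pi / q[a]) for a, pi in p.items())

# toy graph with two clusters {0,1} and {2,3}, and the matching tree
edges = [(0, 1), (2, 3), (1, 2)]
parent = {0: 4, 1: 4, 2: 5, 3: 5, 4: 6, 5: 6, 6: 6}   # 4,5 clusters; 6 root
print(tsd(edges, parent, num_leaves=4))
```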

