Optimizing electromagnetic sensors for unexploded ordnance detection

Geophysics ◽  
2017 ◽  
Vol 82 (3) ◽  
pp. EN25-EN31
Author(s):  
Stephen Billings ◽  
Laurens Beran

Time-domain electromagnetic (TEM) instruments are the predominant geophysical sensor for detection of buried unexploded ordnance (UXO). Detection surveys commonly use towed TEM sensor arrays to acquire a digital map for target detection. We use a dipolar model to predict a detection threshold for a UXO at a specified clearance depth, given an arbitrary sensor geometry. In general, the minimum target response is obtained for a horizontally oriented target. We find that for multistatic sensors, the minimum response can also depend on the azimuth of the target. By considering the statistics of the target response, we find that the detection threshold can be raised slightly while still ensuring a high probability of detection of UXO at depth. This increase in the detection threshold can have a significant effect on the number of false alarms that must be investigated and hence on the cost of clearance. We also use Monte Carlo simulation to investigate how array geometry and height affect clutter rejection.
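
A minimal numeric sketch of the kind of dipolar forward model described above (not the authors' code; the vertical-dipole transmitter, the single receiver offset, and the polarizability values are illustrative assumptions). It scans target orientations at a fixed clearance depth and takes the smallest response amplitude as the detection threshold:

```python
import numpy as np

def dipole_field(r, m):
    # Field of a point magnetic dipole with moment m at displacement r
    # (constant factors dropped; only relative amplitudes matter here).
    rn = np.linalg.norm(r)
    rhat = r / rn
    return (3.0 * rhat * np.dot(m, rhat) - m) / rn**3

def response(tx, rx, tgt, pol):
    # Primary field at the target (vertical-dipole transmitter assumed),
    # induced moment via the polarizability tensor, then the vertical
    # component of the secondary field at the receiver.
    bp = dipole_field(tgt - tx, np.array([0.0, 0.0, 1.0]))
    return dipole_field(rx - tgt, pol @ bp)[2]

def rotation(yaw, pitch):
    cy, sy, cp, sp = np.cos(yaw), np.sin(yaw), np.cos(pitch), np.sin(pitch)
    rz = np.array([[cy, -sy, 0.0], [sy, cy, 0.0], [0.0, 0.0, 1.0]])
    ry = np.array([[cp, 0.0, sp], [0.0, 1.0, 0.0], [-sp, 0.0, cp]])
    return rz @ ry

# Axisymmetric target: one axial and two equal transverse polarizabilities.
P0 = np.diag([0.3, 0.3, 1.0])
tx = np.array([0.0, 0.0, 0.0])           # transmitter
rx = np.array([0.4, 0.0, 0.0])           # one receiver of the array
tgt = np.array([0.0, 0.0, -1.0])         # target at 1 m clearance depth

# The minimum |response| over orientation fixes the detection threshold;
# it typically occurs for a horizontal target, as noted in the abstract.
amps = [abs(response(tx, rx, tgt, rotation(y, p) @ P0 @ rotation(y, p).T))
        for p in np.linspace(0.0, np.pi / 2, 19)
        for y in np.linspace(0.0, np.pi, 37)]
print("detection threshold ~", min(amps))
```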

Author(s):  
I.F. Lozovskiy

The use of broadband sounding signals in radar, which has become practical in recent years, greatly reduces the size of resolution elements in range and, accordingly, the size of the window in which the training sample is formed for adapting the detection threshold in constant false alarm rate (CFAR) detection algorithms. In existing radars, such a small window would lead to large losses. The purpose of this work was to study the most rational options for constructing CFAR detectors in radars with broadband sounding signals. The problem was solved for a Rayleigh distribution of the noise envelope and for two non-Rayleigh laws, Weibull and lognormal, whose appearance is associated with a decrease in the number of reflecting elements in the resolution volume. For Rayleigh interference, an algorithm is proposed with multichannel incoherent accumulation of signal amplitudes in range and normalization to the larger of the two interference-power estimates taken from adjacent range segments. Its detection threshold adapts not only to the interference power but also to the magnitude of the "power jump" in range, which reduces the number of false alarms during sudden changes in interference power: the increase in the probability of false alarm did not exceed one order of magnitude. This algorithm incurs some additional loss associated with the incoherent accumulation of signals reflected from target elements; the loss can be reduced by increasing the size of the range segments that make up the window. Algorithms for detecting broadband signals against interference with non-Rayleigh envelope distributions (Weibull and lognormal) are also studied; they augment the detection algorithm with a nonlinear transformation of the sample counts into counts with a Rayleigh distribution, leaving the structure of the detector practically unchanged. Variants for both narrowband and broadband signals are considered. It was found that, in contrast to algorithms designed for the Rayleigh distribution, these algorithms provide a stable false alarm rate regardless of the parameter values of the non-Rayleigh interference. To reduce losses when the interference amplitudes are in fact Rayleigh-distributed, two-channel detectors are used, in which one channel is tuned for Rayleigh-distributed interference and the other for lognormal or Weibull interference; the channels are switched by special distribution-type recognition algorithms. In such detectors, however, the probability of false alarm rises somewhat in a rather narrow range of non-Rayleigh interference parameters where their distribution approaches the Rayleigh one. It is shown that with broadband signals there is a noticeable decrease in detection losses in non-Rayleigh noise, owing to the lower detection thresholds enabled by incoherent accumulation of signal amplitudes in range.
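
A compact sketch of the greatest-of normalization described above (an illustration of the scheme, not the paper's algorithm; the window sizes and scale factor are arbitrary). The cell under test is compared against the larger of the two interference-power estimates from the leading and lagging range segments, so the threshold tracks sudden power jumps. For a Weibull envelope with known shape k, the transform x → x^k yields exponentially distributed (Rayleigh-power) samples, after which the same detector structure applies:

```python
import numpy as np

def go_cfar(power, guard=2, window=16, scale=10.0):
    # Greatest-of CFAR: normalize the cell under test to the larger of
    # the two mean-power estimates from adjacent range segments.
    hits = np.zeros(len(power), dtype=bool)
    for i in range(window + guard, len(power) - window - guard):
        lead = power[i - guard - window : i - guard].mean()
        lag = power[i + guard + 1 : i + guard + 1 + window].mean()
        hits[i] = power[i] > scale * max(lead, lag)
    return hits

rng = np.random.default_rng(0)
noise = rng.exponential(1.0, 2000)     # Rayleigh envelope -> exponential power
noise[1000:] *= 10.0                   # sudden jump in interference power
noise[500] += 40.0                     # target echo
print(np.flatnonzero(go_cfar(noise)))  # flags the target; the jump edge
                                       # produces few, if any, extra hits
```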


2021 ◽  
Vol 8 (1) ◽  
Author(s):  
Tian J. Ma

Abstract Big Data in the area of remote sensing has been growing rapidly. Remote sensors are used in surveillance, security, traffic, environmental monitoring, and autonomous sensing. Real-time detection of small moving targets using a remote sensor is an ongoing, challenging problem. Because the object is located far from the sensor, it often appears very small, and its signal-to-noise ratio (SNR) is often very low. Camera motion, moving backgrounds (e.g., rustling leaves), and the low contrast and resolution of foreground objects make it difficult to segment out the targeted moving objects of interest. Owing to the limited appearance of the target, it is hard to obtain characteristics such as its shape and texture; without these characteristics, filtering out false detections can be a difficult task. Detecting these targets often requires the detector to operate under a low detection threshold, but lowering the detection threshold can lead to an increase in false alarms. In this paper, the author introduces a new method that improves the probability of detecting low-SNR objects while decreasing the number of false alarms compared to the traditional baseline detection technique.


Author(s):  
S. M. Kostromitsky ◽  
V. M. Artemiev ◽  
D. S. Nefedov

The problem of radar detection of small-sized targets using traditional methods of selecting signals embedded in background noise is considered. It is shown that for a false alarm rate of 10⁻⁵, which corresponds to 1–2 false alarms over the entire coverage of a modern 3D radar, the probability of detecting a small-sized target becomes unacceptably low. Substantially decreasing the threshold can provide an acceptable detection probability at ultra-low signal-to-noise ratio (SNR) values, but it also results in an unacceptable increase of the false alarm rate. A new target detection procedure using the "track before detect" (TBD) method is proposed. In the TBD procedure, the target is considered detected when two conditions are met: the signal exceeds a definite threshold at least once, and the target is detected within a strictly defined observation area (acquisition or tracking gate). For low SNR values in the range of 3–8 dB and an equal false alarm rate, the detection probability increases by 20–50 % compared to the traditional detection method. The simulation results showed a strong dependence of the efficacy of the TBD algorithm on the threshold value and the decision rule. The possibility is noted of adaptive threshold control using the detection results of preceding scanning cycles, as well as of the introduction of matrix radar surveillance not only over the target coordinates and parameters, but also over the detection threshold, decision rules, etc. Examination of these issues is the subject of further research.
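
An illustrative sketch of the two-condition TBD rule described above (not the authors' implementation; the last-position predictor, gate size, and confirmation count are assumptions). Low-threshold exceedances are confirmed as a target only if they keep falling inside the gate predicted from the previous scans:

```python
import numpy as np

def tbd_confirm(scans, gate=2.0, needed=3):
    # scans: per-scan arrays of positions exceeding the low threshold.
    track = None
    for scan in scans:
        if track is None:
            track = [scan[0]] if len(scan) else None  # start a tentative track
            continue
        inside = scan[np.abs(scan - track[-1]) < gate]  # gate around last position
        if len(inside):
            track.append(inside[0])
            if len(track) >= needed:
                return True                # second condition met: confirmed
        else:
            track = None                   # gate empty: drop the tentative track
    return False

rng = np.random.default_rng(1)
target = 10.0
scans = [np.concatenate(([target + 0.3 * k], rng.uniform(0, 100, 5)))
         for k in range(6)]               # slowly moving target plus clutter
print(tbd_confirm(scans))                 # True: exceedances persist in the gate
```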


2020 ◽  
Author(s):  
Geraldo Moura Ramos Filho ◽  
Victor Hugo Rabelo Coelho ◽  
Emerson da Silva Freitas ◽  
Yunqing Xuan ◽  
Cristiano das Neves Almeida

Abstract This paper presents an improved method of using thresholds of peak rainfall intensity for robust flood/flash-flood evaluation and warnings in the state of São Paulo, Brazil. The improvements involve the use of two tolerance levels and the delineation of an intermediate threshold through an exponential curve relating rainfall intensity to the Antecedent Precipitation Index (API). Applying the tolerance levels yields an average increase of 14% in the Probability of Detection (POD) of flood and flash-flood occurrences above the upper threshold. Moreover, a considerable exclusion (63%) of non-occurrences of floods and flash floods between the two thresholds significantly reduces the number of false alarms. The intermediate threshold based on the exponential curves also shows improvements for almost all time steps of both hydrological hazards, with the best results found for floods when correlating 8 h peak intensity with 8-day API, giving POD and Positive Predictive Value (PPV) of 81% and 82%, respectively. This study provides strong indications that the proposed rainfall threshold-based approach can help reduce the uncertainties in predicting the occurrence of floods and flash floods.
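
A minimal sketch of the intermediate-threshold idea (the recursive API form is standard; the decay constant and the curve coefficients below are purely illustrative, not the paper's fitted values):

```python
import numpy as np

def antecedent_precipitation_index(daily_rain_mm, k=0.9):
    # Recursive API over the preceding days: API_t = k * API_{t-1} + P_t
    api = 0.0
    for p in daily_rain_mm:
        api = k * api + p
    return api

def intensity_threshold(api, a=30.0, b=0.05, floor=8.0):
    # Exponential curve relating the warning intensity to antecedent
    # wetness: wetter catchments (higher API) warn at lower intensities.
    return max(floor, a * np.exp(-b * api))

api_8d = antecedent_precipitation_index([0, 2, 0, 5, 12, 0, 1, 9])  # last 8 days
peak_8h = 25.0                      # hypothetical 8 h peak intensity, mm/h
print(peak_8h > intensity_threshold(api_8d))  # issue an intermediate warning?
```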


2020 ◽  
Vol 13 (11) ◽  
pp. 5779-5797
Author(s):  
Giacomo Roversi ◽  
Pier Paolo Alberoni ◽  
Anna Fornasiero ◽  
Federico Porcù

Abstract. There is a growing interest in emerging opportunistic sensors for precipitation, motivated by the need to improve its quantitative estimates at the ground. The scope of this work is to present a preliminary assessment of the accuracy of commercial microwave link (CML) retrieved rainfall rates in Northern Italy. The CML product, obtained by the open-source RAINLINK software package, is evaluated on different scales (single link, 5 km×5 km grid, river basin) against the precipitation products operationally used at Arpae-SIMC, the regional weather service of Emilia-Romagna, in Northern Italy. The results of the 15 min single-link validation with nearby rain gauges show high variability, which can be caused by the complex physiography and precipitation patterns. Known sources of errors (e.g. the attenuation caused by the wetting of the antennas or random fluctuations in the baseline) are particularly hard to mitigate in these conditions without a specific calibration, which has not been implemented. However, hourly cumulated spatially interpolated CML rainfall maps, validated with respect to the established regional gauge-based reference, show similar performance (R² of 0.46 and coefficient of variation, CV, of 0.78) to adjusted radar-based precipitation gridded products and better performance than satellite-based ones. Performance improves when basin-scale total precipitation amounts are considered (R² of 0.83 and CV of 0.48). Avoiding regional-specific calibration therefore does not preclude the algorithm from working but has some limitations in probability of detection (POD) and accuracy. A widespread underestimation is evident at both the grid box scale (mean error of −0.26) and the basin scale (multiplicative bias of 0.7), while the number of false alarms is generally low and becomes even lower as link coverage increases. Also taking into account delays in the availability of the data (latency of 0.33 h for CML against 1 h for the adjusted radar and 24 h for the quality-controlled rain gauges), CML appears as a valuable data source in particular from a local operational framework perspective. Finally, results show complementary strengths for CMLs and radars, encouraging joint exploitation.
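
The core of a CML rainfall retrieval can be sketched as follows (a simplified illustration, not RAINLINK itself; the power-law coefficients and the fixed wet-antenna correction are assumptions, since in reality they depend on link frequency and are part of the calibration this study deliberately avoided):

```python
def cml_rain_rate(p_ref_dbm, p_rx_dbm, length_km, a=0.33, b=1.1, wet_ant_db=1.5):
    # Path-averaged rain rate from a commercial microwave link:
    # attenuation above the dry baseline, minus a fixed wet-antenna
    # correction, converted through the k-R power law R = a * k**b.
    excess_db = max(0.0, p_ref_dbm - p_rx_dbm - wet_ant_db)
    k = excess_db / length_km        # specific attenuation, dB/km
    return a * k ** b                # rain rate, mm/h

# A 10 km link whose received power drops 8 dB below its dry baseline:
print(cml_rain_rate(p_ref_dbm=-40.0, p_rx_dbm=-48.0, length_km=10.0))
```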


Author(s):  
Evan S. Bentley ◽  
Richard L. Thompson ◽  
Barry R. Bowers ◽  
Justin G. Gibbs ◽  
Steven E. Nelson

Abstract Previous work has considered tornado occurrence with respect to radar data, both WSR-88D and mobile research radars, and a few studies have examined techniques to potentially improve tornado warning performance. To date, though, there has been little work focusing on systematic, large-sample evaluation of National Weather Service (NWS) tornado warnings with respect to radar-observable quantities and the near-storm environment. In this work, three full years (2016–2018) of NWS tornado warnings across the contiguous United States were examined, in conjunction with supporting data in the few minutes preceding warning issuance, or tornado formation in the case of missed events. The investigation herein examines WSR-88D and Storm Prediction Center (SPC) mesoanalysis data associated with these tornado warnings with comparisons made to the current Warning Decision Training Division (WDTD) guidance. Combining low-level rotational velocity and the significant tornado parameter (STP), as used in prior work, shows promise as a means to estimate tornado warning performance, as well as relative changes in performance as criteria thresholds vary. For example, low-level rotational velocity peaking in excess of 30 kt (15 m s⁻¹), in a near-storm environment which is not prohibitive for tornadoes (STP > 0), results in an increased probability of detection and reduced false alarms compared to observed NWS tornado warning metrics. Tornado warning false alarms can also be reduced through limiting warnings with weak (<30 kt), broad (>1 nm) circulations in a poor (STP = 0) environment, careful elimination of velocity data artifacts like sidelobe contamination, and through greater scrutiny of human-based tornado reports in otherwise questionable scenarios.
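
The threshold combinations suggested above can be expressed as a simple decision sketch (illustrative only, not an operational NWS rule; the "defer" branch stands in for everything the abstract leaves to forecaster judgment):

```python
def tornado_warning_guidance(v_rot_kt, circ_width_nm, stp):
    # Strong low-level rotation in a non-prohibitive environment:
    # higher POD and fewer false alarms than observed warning metrics.
    if v_rot_kt > 30 and stp > 0:
        return "warn"
    # Weak, broad circulation in a poor environment: withholding the
    # warning here is where the false-alarm reduction comes from.
    if v_rot_kt < 30 and circ_width_nm > 1 and stp == 0:
        return "no warn"
    return "defer to forecaster judgment"

print(tornado_warning_guidance(v_rot_kt=42, circ_width_nm=0.5, stp=2.1))
```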


2018 ◽  
Vol 33 (6) ◽  
pp. 1501-1511 ◽  
Author(s):  
Harold E. Brooks ◽  
James Correia

Abstract Tornado warnings are one of the flagship products of the National Weather Service. We update the time series of various metrics of performance in order to provide baselines over the 1986–2016 period for lead time, probability of detection, false alarm ratio, and warning duration. We have used metrics (mean lead time for tornadoes warned in advance, fraction of tornadoes warned in advance) that work in a consistent way across the official changes in policy for warning issuance, as well as across points in time when unofficial changes took place. The mean lead time for tornadoes warned in advance was relatively constant from 1986 to 2011, while the fraction of tornadoes warned in advance increased through about 2006, and the false alarm ratio slowly decreased. The largest changes in performance take place in 2012 when the default warning duration decreased, and there is an apparent increased emphasis on reducing false alarms. As a result, the lead time, probability of detection, and false alarm ratio all decrease in 2012. Our analysis is based, in large part, on signal detection theory, which separates the quality of the warning system from the threshold for issuing warnings. Threshold changes lead to trade-offs between false alarms and missed detections. Such changes provide further evidence for changes in what the warning system as a whole considers important, as well as highlighting the limitations of measuring performance by looking at metrics independently.
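
For reference, the metrics tracked in this time series are straightforward to compute from a warning-event table; a brief sketch with assumed input conventions (positive lead times mark tornadoes warned in advance):

```python
import numpy as np

def warning_metrics(hits, misses, false_alarms, lead_times_min):
    pod = hits / (hits + misses)                 # probability of detection
    far = false_alarms / (hits + false_alarms)   # false alarm ratio
    lt = np.asarray(lead_times_min)
    mean_lead = lt[lt > 0].mean()      # mean lead time, warned tornadoes only
    frac_warned = (lt > 0).mean()      # fraction of tornadoes warned in advance
    return pod, far, mean_lead, frac_warned

print(warning_metrics(hits=120, misses=40, false_alarms=180,
                      lead_times_min=[0, 5, 12, 0, 18, 9]))
```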


1980 ◽  
Vol 33 (3) ◽  
pp. 475-481
Author(s):  
P. Bertolazzi ◽  
M. Lucertini

The major purpose of an air traffic control system is to ensure the separation of two or more aircraft flying in the same airspace, with an efficiency that can be expressed in terms of capacity and cost. As air traffic grows in numbers it becomes necessary to reduce the workload of the controllers by relieving them of many monitoring tasks, and eventually some decision-making tasks, through computerized automation. In this context many developments tend to build up an efficient conflict-alert subsystem. The problem of conflict alert in the air needs strategic tools, to make collision unlikely or even impossible, and tactical tools, to detect impending collisions. The latter detect potentially hazardous aircraft encounters and alert the controller in time to warn the pilots (if necessary); they should obviously provide this capability with a minimal number of false alarms and no increase in workload.


2017 ◽  
Vol 14 ◽  
pp. 187-194 ◽  
Author(s):  
Stefano Federico ◽  
Marco Petracca ◽  
Giulia Panegrossi ◽  
Claudio Transerici ◽  
Stefano Dietrich

Abstract. This study investigates the impact of the assimilation of total lightning data on the precipitation forecast of a numerical weather prediction (NWP) model. The impact of the lightning data assimilation, which uses water vapour substitution, is investigated at different forecast time ranges, namely 3, 6, 12, and 24 h, to determine how long and to what extent the assimilation affects the precipitation forecast of long-lasting rainfall events (> 24 h). The methodology developed in a previous study is slightly modified here, and is applied to twenty case studies that occurred over Italy, using a mesoscale model run at convection-permitting horizontal resolution (4 km). The performance is quantified by dichotomous statistical scores computed using a dense rain gauge network over Italy. Results show the important impact of the lightning assimilation on the precipitation forecast, especially for the 3 and 6 h forecasts. The probability of detection (POD), for example, increases by 10 % for the 3 h forecast using the assimilation of lightning data compared to the simulation without lightning assimilation for all precipitation thresholds considered. The Equitable Threat Score (ETS) is also improved by the lightning assimilation, especially for thresholds below 40 mm day⁻¹. Results show that the forecast time range is very important because the performance decreases steadily and substantially with the forecast time. The POD, for example, is improved by only 1–2 % for the 24 h forecast using lightning data assimilation, compared to 10 % for the 3 h forecast. The impact of the false alarms on the model performance is also evidenced by this study.
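
The dichotomous scores used here follow the standard contingency-table definitions, computed per precipitation threshold; a brief sketch (the counts in the example are assumed, not the study's values):

```python
def probability_of_detection(hits, misses):
    return hits / (hits + misses)

def equitable_threat_score(hits, misses, false_alarms, total):
    # Threat score corrected for the hits expected by random chance.
    chance = (hits + misses) * (hits + false_alarms) / total
    return (hits - chance) / (hits + misses + false_alarms - chance)

# Rain-gauge verification of one forecast at one threshold:
print(probability_of_detection(hits=310, misses=90))
print(equitable_threat_score(hits=310, misses=90, false_alarms=140, total=2000))
```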


Author(s):  
V. M. Artemiev ◽  
S. M. Kostromitsky ◽  
A. O. Naumov

To increase the efficiency of detecting moving objects in radiolocation, additional features associated with the characteristics of trajectories are used. The authors assume that target trajectories are correlated, which allows extrapolation of the coordinate values taking into account their increments over the scanning period. The detection procedure consists of two stages. In the first, detection is carried out by the classical threshold method with a low threshold level, which provides a high probability of detection along with a high probability of false alarm. At the same time, uncertainty arises in distinguishing the object trajectory from the false trajectories in which it is embedded. Because the coordinates of the false trajectories are statistically independent, whereas the coordinates of the object are correlated, the average duration of the false trajectories is less than that of the object trajectory. This difference is used to solve the detection problem in the second stage by means of a time-selection method. The obtained results allow estimation of the gain in detection probability achieved by the proposed method.
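
The duration statistic that separates the two stages can be illustrated with a small Monte Carlo experiment (a toy model of the reasoning above, not the authors' simulation; the gate size, step noise, and surveillance extent are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)

def track_duration(correlated, gate=1.0, scans=50, step_sigma=0.3, span=100.0):
    # Number of consecutive scans for which successive detections stay
    # inside the gate: a correlated (target) trajectory drifts smoothly,
    # while false-alarm positions are independent draws over the whole span.
    prev = 0.0
    for n in range(scans):
        if correlated:
            pos = prev + rng.normal(0.0, step_sigma)
        else:
            pos = rng.uniform(-span, span)
        if abs(pos - prev) > gate:
            return n
        prev = pos
    return scans

print("mean target track duration:",
      np.mean([track_duration(True) for _ in range(1000)]))
print("mean false track duration:",
      np.mean([track_duration(False) for _ in range(1000)]))
```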

