Detection of signals of moving objects based on the time selection method

Author(s):  
V. M. Artemiev ◽  
S. M. Kostromitsky ◽  
A. O. Naumov

To increase the efficiency of detecting moving objects in radiolocation, additional features associated with the characteristics of trajectories are used. The authors assume that trajectories are correlated, which allows extrapolation of coordinate values taking into account their increments over the scanning period. The detection procedure consists of two stages. At the first stage, detection is carried out by the classical threshold method with a low threshold level, which provides a high probability of detection at the cost of a high probability of false alarms. At the same time, uncertainty arises in selecting the object trajectory from among the false trajectories. Because the coordinates of false trajectories are statistically independent, whereas the coordinates of the object are correlated, the average duration of the former is less than that of the latter. This difference is used to solve the detection problem at the second stage by the time-selection method. The results obtained allow estimation of the gain in detection probability achieved by the proposed method.
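The two-stage logic of the abstract can be illustrated with a minimal sketch (illustrative names and durations only, not the authors' implementation): stage one applies a deliberately low amplitude threshold, and stage two rejects tracks that do not persist long enough over successive scans.

```python
def stage1_detect(samples, threshold):
    """Stage 1: classical threshold detection with a low threshold,
    so true targets are rarely missed at the cost of many false alarms."""
    return [s > threshold for s in samples]

def stage2_time_select(track_lengths, min_duration):
    """Stage 2: time selection. Correlated object trajectories persist
    across scans, while statistically independent false tracks die out
    quickly, so a minimum-duration test separates the two."""
    return [n >= min_duration for n in track_lengths]

# Hypothetical track durations (in scans): false tracks are short-lived.
object_tracks = [12, 9, 15]
false_tracks = [1, 2, 1, 3, 1]
kept = stage2_time_select(object_tracks + false_tracks, min_duration=5)
print(kept)  # [True, True, True, False, False, False, False, False]
```

The `min_duration` value trades off detection delay against false-track rejection, mirroring the threshold trade-off of stage one.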

2008 ◽  
Vol 16 (3) ◽  
Author(s):  
M. Kastek ◽  
T. Sosnowski ◽  
T. Piątkowski

Abstract The paper presents the construction and principle of operation of passive IR detectors (PIR detectors) with a large detection range. An important virtue of these detectors is highly efficient detection of slowly moving or crawling people: the PIR detector described here detects crawling people at a distance of 140 m. To ensure a high probability of detection of slowly moving objects, a new method of signal analysis was applied. On the basis of real-time measurements, both the probability of detection and the false alarm rate were determined.


Author(s):  
Evan S. Bentley ◽  
Richard L. Thompson ◽  
Barry R. Bowers ◽  
Justin G. Gibbs ◽  
Steven E. Nelson

Abstract Previous work has considered tornado occurrence with respect to radar data, both WSR-88D and mobile research radars, and a few studies have examined techniques to potentially improve tornado warning performance. To date, though, there has been little work focusing on systematic, large-sample evaluation of National Weather Service (NWS) tornado warnings with respect to radar-observable quantities and the near-storm environment. In this work, three full years (2016–2018) of NWS tornado warnings across the contiguous United States were examined, in conjunction with supporting data in the few minutes preceding warning issuance, or tornado formation in the case of missed events. The investigation herein examines WSR-88D and Storm Prediction Center (SPC) mesoanalysis data associated with these tornado warnings, with comparisons made to the current Warning Decision Training Division (WDTD) guidance.

Combining low-level rotational velocity and the significant tornado parameter (STP), as used in prior work, shows promise as a means to estimate tornado warning performance, as well as relative changes in performance as criteria thresholds vary. For example, low-level rotational velocity peaking in excess of 30 kt (15 m s−1), in a near-storm environment that is not prohibitive for tornadoes (STP > 0), results in an increased probability of detection and reduced false alarms compared to observed NWS tornado warning metrics. Tornado warning false alarms can also be reduced by limiting warnings with weak (<30 kt), broad (>1 n mi) circulations in a poor (STP = 0) environment, by careful elimination of velocity data artifacts like sidelobe contamination, and through greater scrutiny of human-based tornado reports in otherwise questionable scenarios.
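The threshold combination discussed above can be sketched as a toy decision rule (illustrative only, not the NWS warning process): warn when low-level rotational velocity exceeds 30 kt in an environment not prohibitive for tornadoes, and suppress weak, broad circulations in a poor environment.

```python
def warn_candidate(vrot_kt, stp, is_broad=False):
    """Toy warning criterion built from the thresholds in the abstract:
    vrot >= 30 kt and STP > 0 favor warning; a weak, broad circulation
    in an STP = 0 environment is suppressed."""
    if vrot_kt < 30 and is_broad and stp == 0:
        return False  # weak, broad circulation in a poor environment
    return vrot_kt >= 30 and stp > 0

print(warn_candidate(35, 1.2))              # True
print(warn_candidate(25, 0, is_broad=True)) # False
```

Varying the 30 kt and STP cutoffs traces out the probability-of-detection versus false-alarm trade-off the study quantifies.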


2018 ◽  
Vol 33 (6) ◽  
pp. 1501-1511 ◽  
Author(s):  
Harold E. Brooks ◽  
James Correia

Abstract Tornado warnings are one of the flagship products of the National Weather Service. We update the time series of various metrics of performance in order to provide baselines over the 1986–2016 period for lead time, probability of detection, false alarm ratio, and warning duration. We have used metrics (mean lead time for tornadoes warned in advance, fraction of tornadoes warned in advance) that work in a consistent way across the official changes in policy for warning issuance, as well as across points in time when unofficial changes took place. The mean lead time for tornadoes warned in advance was relatively constant from 1986 to 2011, while the fraction of tornadoes warned in advance increased through about 2006, and the false alarm ratio slowly decreased. The largest changes in performance take place in 2012 when the default warning duration decreased, and there is an apparent increased emphasis on reducing false alarms. As a result, the lead time, probability of detection, and false alarm ratio all decrease in 2012. Our analysis is based, in large part, on signal detection theory, which separates the quality of the warning system from the threshold for issuing warnings. Threshold changes lead to trade-offs between false alarms and missed detections. Such changes provide further evidence for changes in what the warning system as a whole considers important, as well as highlighting the limitations of measuring performance by looking at metrics independently.
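The verification metrics tracked in the time series above follow from a standard 2x2 contingency table; a minimal sketch with hypothetical counts:

```python
def warning_metrics(hits, misses, false_alarms):
    """Probability of detection (POD) and false alarm ratio (FAR)
    from hit/miss/false-alarm counts."""
    pod = hits / (hits + misses)
    far = false_alarms / (hits + false_alarms)
    return pod, far

# Hypothetical counts for one year of warnings (not the study's data).
pod, far = warning_metrics(hits=700, misses=300, false_alarms=1800)
print(round(pod, 2), round(far, 2))  # 0.7 0.72
```

Signal detection theory treats POD and FAR as jointly determined by a single decision threshold, which is why lowering the threshold for issuing warnings moves both metrics together rather than improving one for free.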


2017 ◽  
Vol 14 ◽  
pp. 187-194 ◽  
Author(s):  
Stefano Federico ◽  
Marco Petracca ◽  
Giulia Panegrossi ◽  
Claudio Transerici ◽  
Stefano Dietrich

Abstract. This study investigates the impact of the assimilation of total lightning data on the precipitation forecast of a numerical weather prediction (NWP) model. The impact of the lightning data assimilation, which uses water vapour substitution, is investigated at different forecast time ranges, namely 3, 6, 12, and 24 h, to determine how long and to what extent the assimilation affects the precipitation forecast of long-lasting rainfall events (> 24 h). The methodology developed in a previous study is slightly modified here and is applied to twenty case studies that occurred over Italy, using a mesoscale model run at convection-permitting horizontal resolution (4 km). The performance is quantified by dichotomous statistical scores computed using a dense rain gauge network over Italy. Results show the important impact of the lightning assimilation on the precipitation forecast, especially for the 3 and 6 h forecasts. The probability of detection (POD), for example, increases by 10 % for the 3 h forecast using the assimilation of lightning data compared to the simulation without lightning assimilation, for all precipitation thresholds considered. The Equitable Threat Score (ETS) is also improved by the lightning assimilation, especially for thresholds below 40 mm day−1. Results show that the forecast time range is very important because the performance decreases steadily and substantially with forecast time. The POD, for example, is improved by only 1–2 % for the 24 h forecast using lightning data assimilation, compared to 10 % for the 3 h forecast. The impact of false alarms on the model performance is also evidenced by this study.
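The dichotomous scores named above (POD and ETS) are computed from a 2x2 contingency table of forecast versus observed exceedances of a rain threshold; a short sketch with hypothetical counts:

```python
def pod_ets(hits, misses, false_alarms, correct_negatives):
    """Probability of detection (POD) and Equitable Threat Score (ETS).
    ETS discounts the hits expected by random chance, so a no-skill
    forecast scores 0 and a perfect forecast scores 1."""
    total = hits + misses + false_alarms + correct_negatives
    pod = hits / (hits + misses)
    hits_random = (hits + misses) * (hits + false_alarms) / total
    ets = (hits - hits_random) / (hits + misses + false_alarms - hits_random)
    return pod, ets

# Hypothetical gauge-vs-forecast counts for one daily rain threshold.
pod, ets = pod_ets(hits=80, misses=20, false_alarms=40, correct_negatives=860)
print(round(pod, 2), round(ets, 3))  # 0.8 0.531
```

Because `hits_random` grows with the number of forecast exceedances, ETS penalizes over-forecasting in a way POD does not, which is why the study reports both.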


2020 ◽  
Vol 148 (11) ◽  
pp. 4453-4466
Author(s):  
Cong Wang ◽  
Ping Wang ◽  
Di Wang ◽  
Jinyi Hou ◽  
Bing Xue

Abstract Short-term intense precipitation (SIP; i.e., convective precipitation exceeding 20 mm h−1) nowcasting is important for urban flood warning and natural hazards management. This paper presents an algorithm that couples automatic weather station data and single-polarization S-band radar data with a graph model and a random forest for the nowcasting of SIP. Unlike pixel-by-pixel precipitation nowcasting algorithms, this algorithm takes convective cells as the basic units so as to consider their interactions, and focuses on multicell convective systems. In particular, the following question can be addressed: will a multicell convective system cause SIP events in the next hour? First, a method based on spatiotemporal superposition between cells is proposed for multicell system identification. Then, the graph model is used to represent cell physical attributes and the spatial distribution of the entire system. For each graph model, a fusion operation is used to form a 42-dimensional graph feature vector. Finally, a random forest classifier is trained on the graph feature vectors to predict the precipitation. In the experiments, this algorithm achieves a probability of detection (POD) of 79.2% and a critical success index (CSI) of 68.3% on data between 2015 and 2016 in North China. Compared with other precipitation nowcasting algorithms, the graph model and random forest predict SIP events more accurately and produce fewer false alarms.
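The fusion step can be sketched as attribute-wise pooling (an assumed scheme for illustration; the paper's actual fusion produces a 42-dimensional vector): each cell contributes an attribute vector, and pooling over cells yields one fixed-length feature vector for the whole system, regardless of how many cells it contains.

```python
import statistics

def fuse_graph_features(cell_attrs):
    """Pool each attribute over all cells of a multicell system
    (mean, max, min) into a fixed-length feature vector suitable
    for a classifier such as a random forest."""
    n_attrs = len(cell_attrs[0])
    features = []
    for i in range(n_attrs):
        column = [cell[i] for cell in cell_attrs]
        features += [statistics.mean(column), max(column), min(column)]
    return features

# Three hypothetical cells, two attributes each (reflectivity dBZ, area km^2).
cells = [(55.0, 30.0), (48.0, 12.0), (62.0, 45.0)]
print(fuse_graph_features(cells))  # [55.0, 62.0, 48.0, 29.0, 45.0, 12.0]
```

Pooling makes the feature vector invariant to cell ordering and count, which is what lets a single classifier handle systems of different sizes.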


Author(s):  
Wenfei Li ◽  
Rama K. Yedavalli

It is challenging to design a fault diagnostic scheme that can distinguish model uncertainties from the occurrence of faults, which helps reduce false alarms and missed detections. In this paper, a dynamic threshold algorithm is developed for aircraft engine sensor fault diagnosis that accommodates parametric uncertainties. Using the robustness analysis of parametric uncertain systems, upper and lower bound trajectories are generated for the dynamic threshold. The extent of parametric uncertainty is assumed to be such that the perturbed eigenvalues reside in a set of distinct circular regions. A dedicated observer scheme is used for the engine sensor fault diagnosis design. The residuals are the errors between state variables estimated by a bank of Kalman filters. With this design approach, a residual crossing the upper or lower bound of the dynamic threshold indicates the occurrence of a fault. Application to an aircraft gas turbine engine Component Level Model clearly illustrates the improved performance of the proposed method.
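The core decision logic above reduces to an envelope test on the residual; a minimal sketch with hypothetical residuals and time-varying bounds:

```python
def fault_detected(residual, lower, upper):
    """Dynamic-threshold test: a fault is declared only when the residual
    leaves the uncertainty envelope, so normal parametric variation does
    not raise a false alarm."""
    return residual < lower or residual > upper

# Hypothetical residual trace with bounds that widen and narrow over time.
residuals = [0.1, 0.3, 0.2, 1.5]
lowers    = [-0.5, -0.4, -0.4, -0.3]
uppers    = [0.5, 0.6, 0.5, 0.4]
flags = [fault_detected(r, lo, hi) for r, lo, hi in zip(residuals, lowers, uppers)]
print(flags)  # [False, False, False, True] — only the last sample exceeds the envelope
```

A fixed threshold would have to be wide enough for the worst-case uncertainty at all times; letting the bounds track the model's uncertainty is what reduces both false alarms and missed detections.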


Thefts have increased in recent years, creating an environment in which people live in fear. Home security in the modern world is therefore a cause for concern. Conventional intruder detection systems are highly expensive, and there is a possibility of false alarms. This problem is addressed by building a home intruder detection system, using LabVIEW and Python, that accurately detects a human intruder while filtering out movements caused by other moving objects. Here LabVIEW works as the server and Python works as the client. At the client, video is acquired continuously and converted into images. The acquired images are converted to grayscale, compared frame by frame to detect change, and then filtered through a series of image-refining VIs that enhance the change-detection results; the processed information is sent to the server. The server displays the status of the intruder with date and time. If an intruder is present, the system compares the intruder's data with the data in the system. If the data match, processing stops; if not, the system alerts the user through SMS or email and sends the image to the user's app, through which an alarm can be raised.
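The frame-comparison step described above can be sketched in pure Python (illustrative only; the system itself uses LabVIEW VIs and a Python client): grayscale pixels whose intensity changes by more than a threshold are counted, and motion is declared when any pixel changes significantly.

```python
def frame_difference(prev, curr, threshold=25):
    """Count pixels whose grayscale intensity changed by more than
    `threshold` between two frames (frames as 2-D lists of 0-255 values).
    A nonzero count indicates candidate motion for further filtering."""
    return sum(
        1
        for row_p, row_c in zip(prev, curr)
        for p, c in zip(row_p, row_c)
        if abs(p - c) > threshold
    )

# Two tiny hypothetical grayscale frames: one pixel changes sharply.
prev = [[10, 10], [10, 10]]
curr = [[10, 200], [10, 10]]
print(frame_difference(prev, curr) > 0)  # True: motion detected
```

In practice the changed-pixel count would be compared against a minimum area, and the resulting mask refined further, to filter out small movements from non-human objects.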

