Detection of AIS Closing Behavior and MMSI Spoofing Behavior of Ships Based on Spatiotemporal Data

2020 ◽  
Vol 12 (4) ◽  
pp. 702
Author(s):  
Tao Zhang ◽  
Shuai Zhao ◽  
Bo Cheng ◽  
Junliang Chen

In marine transportation, many ships are equipped with AIS devices. The AIS data sent by these devices help maritime authorities and other ships obtain a ship's navigation status, thereby ensuring safety during navigation. However, when a ship is involved in illegal activities, the crew may switch off the AIS device or tamper with the Maritime Mobile Service Identity (MMSI) in the AIS data. To detect these two kinds of behavior, this paper designs an AIS closing detection algorithm and an MMSI spoofing detection algorithm based on the spatiotemporal data provided by AIS and radar. Because radar data do not include the ship's identity, the association between radar data and AIS data is difficult to determine in multi-ship scenarios. To solve this problem, the D-TRAP is defined in this paper and applied when searching for the AIS points associated with radar trajectory points; when the number of effective AIS points is reduced by the two behaviors above, the association method used in the paper performs better. In addition, real data and simulation data are used to verify the two algorithms. The verification results show that when a ship is simultaneously monitored by radar and AIS, and the monitoring continues for a period of time, the AIS closing detection algorithm performs well. When a ship is monitored by AIS and radar both before and after MMSI spoofing, and both monitoring processes continue for a period of time, the MMSI spoofing detection algorithm performs well.
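The abstract does not spell out how the D-TRAP associates radar points with AIS points, so the following Python sketch only illustrates the generic association step it relies on: for each radar trajectory point, search for a nearby AIS point in time and space, and treat the absence of a candidate as a hint that AIS reports are missing. The Point class, the distance_m helper, and the 30 s / 500 m thresholds are illustrative assumptions, not values from the paper.

```python
# Illustrative sketch only: generic spatiotemporal association between radar
# trajectory points and candidate AIS points (thresholds are assumptions).
from dataclasses import dataclass
from typing import List, Optional
import math

@dataclass
class Point:
    t: float                      # timestamp in seconds
    lat: float
    lon: float
    mmsi: Optional[int] = None    # radar points carry no ship identity

def distance_m(a: Point, b: Point) -> float:
    """Approximate ground distance in metres (equirectangular approximation)."""
    r = 6_371_000.0
    x = math.radians(b.lon - a.lon) * math.cos(math.radians((a.lat + b.lat) / 2))
    y = math.radians(b.lat - a.lat)
    return r * math.hypot(x, y)

def associate(radar_pt: Point, ais_pts: List[Point],
              max_dt: float = 30.0, max_dist: float = 500.0) -> Optional[Point]:
    """Return the AIS point closest in space to the radar point within a small
    time window; None means no association could be made."""
    candidates = [p for p in ais_pts if abs(p.t - radar_pt.t) <= max_dt]
    if not candidates:
        return None
    best = min(candidates, key=lambda p: distance_m(radar_pt, p))
    return best if distance_m(radar_pt, best) <= max_dist else None
```

In such a scheme, a sustained run of radar points with no associated AIS point would feed an AIS-closing check, while a change in the MMSI of the associated points would feed a spoofing check.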

2021 ◽  
Author(s):  
Tao Zhang ◽  
Chuanchang Liu ◽  
Bodong Wen

Abstract In marine transportation, most ships are equipped with AIS devices. The AIS data sent by these devices help maritime authorities manage ships in the relevant sea areas. However, AIS is a self-reporting system: when a ship is engaged in illegal activities, the AIS device may be turned off. After the AIS is closed, if the ship's behavior during a certain period differs from its behavior before the closure, the different behavior is likely to indicate that the ship is conducting illegal activities. This behavior is considered abnormal and needs to be detected in time. Based on radar trajectory data, the detection of abnormal ship behavior is studied from two aspects: speed and direction. To improve the level of intelligence of abnormal ship behavior detection, an abnormal speed behavior detection algorithm combining rules and clustering (ASBD-RC) and an abnormal direction behavior detection algorithm combining partitioning and the earth mover's distance (ADBD-PE) are proposed. The ASBD-RC algorithm reduces the influence of noise and sea clutter on abnormal speed behavior detection. The ADBD-PE algorithm effectively partitions trajectories and identifies segments with abnormal direction. In the experiments, abnormal ship behaviors under different scenarios are generated from real and simulated radar trajectories. The experimental results show that in most scenarios, the ASBD-RC and ADBD-PE algorithms can effectively detect abnormal ship behavior and, compared with other algorithms, produce better and more stable detection results.
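As a rough illustration of the rules-plus-clustering idea behind ASBD-RC (the published algorithm is not detailed in this abstract), the Python sketch below flags speeds that violate a plausibility rule and keeps only those that cluster in time, so that isolated noise or sea-clutter spikes are discarded. The function name, the 30-knot rule, and the DBSCAN parameters are assumptions.

```python
# Hedged sketch: combine a hard speed rule with temporal clustering so that
# single spurious spikes (noise, sea clutter) are not reported as abnormal.
import numpy as np
from sklearn.cluster import DBSCAN

def abnormal_speed_segments(times, speeds, max_speed=30.0,
                            eps=60.0, min_samples=3):
    """times in seconds, speeds in knots; returns (start, end) time segments."""
    times = np.asarray(times, dtype=float)
    speeds = np.asarray(speeds, dtype=float)
    candidates = np.where(speeds > max_speed)[0]      # rule stage
    if candidates.size == 0:
        return []
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(
        times[candidates].reshape(-1, 1))              # clustering stage
    segments = []
    for lab in set(labels) - {-1}:                     # -1 marks clustering noise
        idx = candidates[labels == lab]
        segments.append((times[idx].min(), times[idx].max()))
    return segments
```

A direction counterpart in the spirit of ADBD-PE could analogously compare direction histograms of trajectory partitions with an earth mover's distance such as scipy.stats.wasserstein_distance.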


2010 ◽  
Vol 27 (11) ◽  
pp. 1868-1880 ◽  
Author(s):  
Kenta Hood ◽  
Sebastián Torres ◽  
Robert Palmer

Abstract Wind turbines cause contamination of weather radar signals that is often detrimental and difficult to distinguish from cloud returns. Because the turbines are always at the same location, it would seem simple to identify where wind turbine clutter (WTC) contaminates the weather radar data. However, under certain atmospheric conditions, anomalous propagation of the radar beam can occur such that WTC corrupts weather data on constantly evolving locations, or WTC can be relatively weak such that contamination on predetermined locations does not occur. Because of the deficiency of using turbine locations as a proxy for WTC, an effective detection algorithm is proposed to perform automatic flagging of contaminated weather radar data, which can then be censored or filtered. Thus, harmful effects can be reduced that may propagate to automatic algorithms or may hamper the forecaster’s ability to issue timely warnings. In this work, temporal and spectral features related to WTC signatures are combined in a fuzzy logic algorithm to classify the radar return as being contaminated by WTC or not. The performance of the algorithm is quantified using simulations and the algorithm is applied to a real data case from the radar facility in Dodge City, Kansas (KDDC). The results illustrate that WTC contamination can be detected automatically, thereby improving the quality control of weather radar data.
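As a minimal sketch of the fuzzy-logic combination step described above (not the authors' feature set or membership functions), the following Python code maps each feature to a [0, 1] membership value with a trapezoidal function, averages the memberships, and thresholds the result; all feature names and breakpoints are placeholders.

```python
# Minimal fuzzy-logic sketch: per-feature membership values aggregated into a
# single WTC score; breakpoints below are illustrative, not published values.
import numpy as np

def trapezoid(x, a, b, c, d):
    """Trapezoidal membership: rises on [a, b], flat on [b, c], falls on [c, d]."""
    x = np.asarray(x, dtype=float)
    rise = np.clip((x - a) / max(b - a, 1e-9), 0.0, 1.0)
    fall = np.clip((d - x) / max(d - c, 1e-9), 0.0, 1.0)
    return np.minimum(rise, fall)

def wtc_membership(spectrum_width, temporal_variance, clutter_ratio):
    """Aggregate assumed feature memberships (simple average)."""
    m = np.array([
        trapezoid(spectrum_width,    2.0, 4.0, 12.0, 15.0),
        trapezoid(temporal_variance, 1.0, 3.0, 10.0, 12.0),
        trapezoid(clutter_ratio,     0.2, 0.4, 1.0, 1.1),
    ])
    return m.mean(axis=0)

def flag_wtc(features, threshold=0.5):
    """Classify a radar gate as WTC-contaminated when the score is high."""
    return wtc_membership(*features) >= threshold
```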


Author(s):  
Berend Terluin ◽  
Ewa M. Roos ◽  
Caroline B. Terwee ◽  
Jonas B. Thorlund ◽  
Lina H. Ingelsrud

Abstract Purpose The minimal important change (MIC) of a patient-reported outcome measure (PROM) is often suspected to be baseline dependent, typically in the sense that patients who are in a poorer baseline health condition need greater improvement to qualify as minimally important. Testing MIC baseline dependency is commonly performed by creating two or more subgroups, stratified on the baseline PROM score. This study’s purpose was to show that this practice produces biased subgroup MIC estimates resulting in spurious MIC baseline dependency, and to develop alternative methods to evaluate MIC baseline dependency. Methods Datasets with PROM baseline and follow-up scores and transition ratings were simulated with and without MIC baseline dependency. Mean change MICs, ROC-based MICs, predictive MICs, and adjusted MICs were estimated before and after stratification on the baseline score. Three alternative methods were developed and evaluated. The methods were applied in a real data example for illustration. Results Baseline stratification resulted in biased subgroup MIC estimates and the false impression of MIC baseline dependency, due to redistribution of measurement error. Two of the alternative methods require a second baseline measurement with the same PROM or another correlated PROM. The third method involves the construction of two parallel tests based on splitting the PROM’s item set. Two methods could be applied to the real data. Conclusion MIC baseline dependency should not be tested in subgroups based on stratification on the baseline PROM score. Instead, one or more of the suggested alternative methods should be used.
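A small simulation, with assumed parameter values, makes the measurement-error argument concrete: all patients are generated with an identical change distribution, yet stratifying on the error-prone observed baseline yields different mean observed changes in the two subgroups.

```python
# Illustrative simulation (assumed parameters): stratifying on an error-prone
# baseline score redistributes measurement error between subgroups, which can
# mimic MIC baseline dependency even when none exists.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
true_baseline = rng.normal(50, 10, n)            # latent health status
error_sd = 5.0
obs_baseline = true_baseline + rng.normal(0, error_sd, n)
true_change = rng.normal(8, 4, n)                # same change distribution for all
obs_follow_up = true_baseline + true_change + rng.normal(0, error_sd, n)
obs_change = obs_follow_up - obs_baseline

low = obs_baseline < np.median(obs_baseline)     # stratify on observed baseline
print("mean observed change, low-baseline subgroup :", obs_change[low].mean())
print("mean observed change, high-baseline subgroup:", obs_change[~low].mean())
# The subgroup means differ although every patient was simulated with the same
# change distribution: the difference is driven purely by measurement error.
```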


2013 ◽  
Vol 2013 ◽  
pp. 1-13 ◽  
Author(s):  
Yuan Jiang ◽  
Qin Xu ◽  
Pengfei Zhang ◽  
Kang Nai ◽  
Liping Liu

As an important part of Doppler velocity data quality control for radar data assimilation and other quantitative applications, an automated technique is developed to identify and remove velocities contaminated by birds, especially migrating birds. This technique builds upon the existing hydrometeor classification algorithm (HCA) for dual-polarimetric WSR-88D radars developed at the National Severe Storms Laboratory, and it performs two steps. In the first step, the fuzzy-logic method in the HCA is simplified and used to identify biological echoes (mainly from birds and insects). In the second step, another simple fuzzy-logic method is developed to detect bird echoes among the biological echoes identified in the first step and thus remove bird-contaminated velocities. The membership functions used by the fuzzy-logic method in the second step are extracted from normalized histograms of differential reflectivity and differential phase for birds and insects, respectively; the histograms are constructed from polarimetric data collected during the 2012 fall migration season and sorted into bird and insect categories. The performance and effectiveness of the technique are demonstrated with real-data examples.
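To illustrate the second step (membership functions taken from normalized histograms), the Python sketch below builds a lookup-table membership from labelled training samples and averages the differential-reflectivity and differential-phase memberships into a bird score; the bin choices and the 0.5 decision threshold are assumptions rather than values from the study.

```python
# Sketch under assumptions: histogram-derived membership functions combined by
# simple averaging into a per-gate "bird" score.
import numpy as np

def histogram_membership(training_values, bins):
    """Turn labelled training samples into a lookup-table membership function."""
    hist, edges = np.histogram(training_values, bins=bins, density=True)
    hist = hist / hist.max()                       # normalise peak to 1
    def membership(x):
        idx = np.clip(np.digitize(x, edges) - 1, 0, len(hist) - 1)
        return hist[idx]
    return membership

def bird_score(zdr, phidp, m_zdr_bird, m_phidp_bird):
    """Average the per-variable memberships for one radar gate."""
    return 0.5 * (m_zdr_bird(zdr) + m_phidp_bird(phidp))

# usage: fit the membership functions on gates already labelled as birds, then
# flag gates with bird_score(...) >= 0.5 and discard their Doppler velocities.
```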


2021 ◽  
Vol 8 ◽  
Author(s):  
Tianshu Gu ◽  
Lishi Wang ◽  
Ning Xie ◽  
Xia Meng ◽  
Zhijun Li ◽  
...  

The complexity of COVID-19 and variations in control measures and containment efforts in different countries have made prediction and modeling of the COVID-19 pandemic difficult. We attempted to predict the scale of the latter half of the pandemic based on real data, using the ratio between the early and latter halves from countries where the pandemic was largely over. We collected daily pandemic data from China, South Korea, and Switzerland and calculated the ratio of pandemic data before and after the disease apex day of COVID-19. Using these ratios, we created multiple regression models for the relationship between the periods before and after the apex day. We then tested our models using data from the first wave of the disease in 14 countries in Europe and the US, and subsequently using data from these countries for the entire pandemic up to March 30, 2021. The results indicate that the actual numbers of cases from these countries during the first wave mostly fall within the ranges predicted by linear regression, except for Spain and Russia. Similarly, the actual deaths in these countries mostly fall within the range of predicted data. Using the accumulated data up to the apex day and the total accumulated data up to March 30, 2021, the case numbers in these countries fall within the range of predicted data, except for Brazil. The actual numbers of deaths in all the countries are at or below the predicted data. In conclusion, a linear regression model built with real data from countries or regions with early pandemics can predict the pandemic scale of countries where the pandemic occurs later. Such a prediction with a high degree of accuracy provides valuable information for governments and the public.
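A minimal sketch of the regression idea, with placeholder numbers rather than the paper's data: fit a linear relationship between the cumulative count up to the apex day and the cumulative count after it in countries whose wave is over, then apply it to a country whose wave has just peaked.

```python
# Hedged sketch: ordinary least-squares fit of post-apex totals against
# pre-apex totals from "finished" reference waves; all numbers are placeholders.
import numpy as np

before = np.array([80_000., 10_000., 15_000.])   # cumulative cases up to apex
after  = np.array([5_000.,  1_500.,  18_000.])   # cumulative cases after apex

slope, intercept = np.polyfit(before, after, deg=1)

def predict_after_apex(cases_before_apex):
    """Predict the post-apex cumulative count for a country at its apex."""
    return slope * cases_before_apex + intercept

print(predict_after_apex(50_000.))
```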


Geophysics ◽  
2021 ◽  
pp. 1-73
Author(s):  
Bastien Dupuy ◽  
Anouar Romdhane ◽  
Pierre-Louis Nordmann ◽  
Peder Eliasson ◽  
Joonsang Park

Risk assessment of CO2 storage requires the use of geophysical monitoring techniques to quantify changes in selected reservoir properties such as CO2 saturation, pore pressure, and porosity. Conformance monitoring and the associated decision-making rest upon the quantified properties derived from geophysical data, with uncertainty assessment. A general framework combining seismic and controlled-source electromagnetic (CSEM) inversions with rock physics inversion is proposed, with fully Bayesian formulations for proper quantification of uncertainty. The Bayesian rock physics inversion rests upon two stages. First, a search stage explores the model space and derives models with an associated probability density function (PDF). Second, an appraisal or importance-sampling stage is used as a "correction" step to ensure that the full model space is explored and that the estimated posterior PDF can be used to derive quantities such as marginal probability densities. Both steps are based on the neighbourhood algorithm. The approach does not require any linearization of the rock physics model or assumptions about the distribution of the model parameters. After describing the CO2 storage context, the available data at the Sleipner field before and after CO2 injection (baseline and monitor), and the rock physics models, we perform an extended sensitivity study. We show that prior information is crucial, especially in the monitor case. We demonstrate that joint inversion of seismic and CSEM data is also key to quantifying CO2 saturations properly. We finally apply the full inversion strategy to real data from Sleipner. We obtain rock frame moduli, porosity, saturation, and patchiness exponent distributions and associated uncertainties along a 1D profile before and after injection. The results are consistent with geological knowledge and reservoir simulations, i.e., the CO2 saturations are larger under the caprock, confirming upward migration of CO2 by buoyancy. The estimates of the patchiness exponent have a larger uncertainty, suggesting semi-patchy mixing behaviour.
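The following toy Python sketch illustrates the search-then-appraise idea on a single parameter (CO2 saturation): sample the model space, weight each model by its data misfit, and normalise to obtain a posterior PDF. The forward relation, observed velocity, and noise level are invented for illustration and are unrelated to the Sleipner rock physics models or to the neighbourhood algorithm itself.

```python
# Toy sketch (not the Sleipner workflow): brute-force Bayesian search over a
# one-parameter model space, followed by normalisation of the posterior PDF.
import numpy as np

def forward_vp(co2_saturation):
    """Placeholder forward model: P-wave velocity (km/s) decreasing with saturation."""
    return 2.05 - 0.35 * co2_saturation

observed_vp, sigma = 1.90, 0.03             # assumed observation and data error
saturations = np.linspace(0.0, 1.0, 201)    # exhaustive "search" stage
misfit = (forward_vp(saturations) - observed_vp) ** 2 / (2 * sigma ** 2)
posterior = np.exp(-misfit)
posterior /= np.trapz(posterior, saturations)   # "appraisal": normalise the PDF

mean_sat = np.trapz(saturations * posterior, saturations)
print(f"posterior mean CO2 saturation ~ {mean_sat:.2f}")
```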


2018 ◽  
Vol 10 (11) ◽  
pp. 3990 ◽  
Author(s):  
Zhibin Huang ◽  
Min Xu ◽  
Wei Chen ◽  
Xiaojuan Lin ◽  
Chunxiang Cao ◽  
...  

Using Landsat remote-sensing data combined with geological information extracted from ALOS and Sentinel-1A radar data, the ecological environment was evaluated for the years 2007, 2008, 2013, and 2017 through gray correlation analysis based on a pressure-state-response (PSR) model. The main objective of this research was to assess ecological environment changes in Wenchuan County before and after the earthquake and to provide a reference for future social development and policy implementation. A grading map of the ecological environment was obtained for each year, and the ecological restoration status of Wenchuan County after the earthquake was evaluated. The results showed that the area at a "safe" ecological level peaked at over 46.4% in 2007. After the 2008 earthquake, the proportion of "unsafe" and "very unsafe" ecological levels was 40.0%, especially around the Lancang River and the western mountain area of Wenchuan County. After five years of restoration, ecological conditions improved, reaching up to 48.0% of the region. The area at "critically safe" or better recovered to 85.5% in 2017, nine years after the deadly Wenchuan earthquake of May 12, 2008. In this paper, we discuss the results of a detailed analysis of ecological improvements and their correlation with the pressure, state, and response layers of the PSR model.
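As a sketch of the gray (grey) correlation analysis step mentioned above, the Python function below computes grey relational grades between normalised indicator sequences and an ideal reference; the indicator values and the resolution coefficient rho = 0.5 are illustrative, and the mapping of indicators to PSR layers is not specified here.

```python
# Sketch of grey relational analysis (GRA); all inputs are placeholders.
import numpy as np

def grey_relational_grade(reference, comparisons, rho=0.5):
    """reference: (n,) ideal sequence; comparisons: (m, n) indicator sequences,
    both already normalised to [0, 1]. Returns one grade per comparison row."""
    delta = np.abs(comparisons - reference)            # absolute differences
    d_min, d_max = delta.min(), delta.max()
    coeff = (d_min + rho * d_max) / (delta + rho * d_max)
    return coeff.mean(axis=1)                          # average over indicators

reference = np.array([1.0, 1.0, 1.0])                  # ideal "safe" state
pixels = np.array([[0.9, 0.8, 0.7],                    # placeholder indicator values
                   [0.3, 0.4, 0.2]])
print(grey_relational_grade(reference, pixels))
```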


Electronics ◽  
2020 ◽  
Vol 9 (3) ◽  
pp. 541 ◽  
Author(s):  
Željko Bugarinović ◽  
Lara Pajewski ◽  
Aleksandar Ristić ◽  
Milan Vrtunski ◽  
Miro Govedarica ◽  
...  

This paper focuses on the use of the Canny edge detector as the first step of an advanced imaging algorithm for the automated detection of hyperbolic reflections in ground-penetrating radar (GPR) data. Since the imaging algorithm aims to work in real time, particular attention is paid to its computational efficiency. Various alternative criteria are designed and examined to speed up the procedure by eliminating unnecessary edge pixels from Canny-processed data before such data go through the subsequent steps of the detection algorithm. The effectiveness and reliability of the proposed methodology are tested on a wide set of synthetic and experimental radargrams with promising results. The finite-difference time-domain simulator gprMax is used to generate synthetic radargrams for the tests, while the real radargrams come from GPR surveys carried out by the authors in urban areas. The imaging algorithm is implemented in MATLAB.
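Although the authors implement their algorithm in MATLAB, a Python/OpenCV sketch conveys the same first step: run Canny on a radargram and then prune edge pixels by one simple criterion (discarding very short connected components). The thresholds and the pruning rule are assumptions, not the criteria designed in the paper.

```python
# Minimal sketch, assuming OpenCV is available: Canny edges on a GPR B-scan,
# followed by removal of edge components too small to belong to a hyperbola.
import cv2
import numpy as np

def canny_edges_pruned(radargram, low=50, high=150, min_pixels=20):
    """radargram: 2D array of GPR amplitudes (one B-scan); returns an edge mask."""
    img = cv2.normalize(radargram.astype(np.float32), None, 0, 255,
                        cv2.NORM_MINMAX).astype(np.uint8)
    edges = cv2.Canny(img, low, high)
    n, labels, stats, _ = cv2.connectedComponentsWithStats(edges, connectivity=8)
    keep = np.zeros_like(edges)
    for i in range(1, n):                              # label 0 is the background
        if stats[i, cv2.CC_STAT_AREA] >= min_pixels:
            keep[labels == i] = 255
    return keep
```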


2018 ◽  
Vol 10 (8) ◽  
pp. 1272 ◽  
Author(s):  
Stephanie Olen ◽  
Bodo Bookhagen

The emergence of the Sentinel-1A and 1B satellites now offers freely available and widely accessible Synthetic Aperture Radar (SAR) data. Near-global coverage and a rapid repeat time (6–12 days) give Sentinel-1 data the potential to be widely used for monitoring the Earth's surface. Subtle land-cover and land-surface changes can affect the phase and amplitude of the C-band SAR signal, and thus the coherence between two images collected before and after such changes. Analysis of SAR coherence therefore serves as a rapidly deployable and powerful tool to track both seasonal changes and rapid surface disturbances following natural disasters. An advantage of using Sentinel-1 C-band radar data is the ability to easily construct time series of coherence for a region of interest at low cost. In this paper, we propose a new method for Potentially Affected Area (PAA) detection following a natural hazard event. Based on the coherence time series, the proposed method (1) determines the natural variability of coherence within each pixel in the region of interest, accounting for factors such as seasonality and the inherent noise of variable surfaces; and (2) compares pixel-by-pixel syn-event coherence to the temporal coherence distributions to determine where statistically significant coherence loss has occurred. The user can determine to what degree the syn-event coherence value (e.g., below the 1st or 5th percentile of the pre-event distribution) constitutes a PAA, and integrate pertinent regional data, such as population density, to rank and prioritise PAAs. We apply the method to two case studies, Sarpol-e, Iran, following the 2017 Iran-Iraq earthquake, and a landslide-prone region of NW Argentina, to demonstrate how rapid identification and interpretation of potentially affected areas can be performed shortly after a natural hazard event.
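A minimal sketch, assuming the coherence images are already co-registered into a NumPy stack: compute, per pixel, a low percentile of the pre-event coherence distribution and flag pixels whose syn-event coherence falls below it (the 5th percentile follows the example quoted in the abstract).

```python
# Hedged sketch of the percentile-based PAA flagging step.
import numpy as np

def potentially_affected_area(pre_event_stack, syn_event, percentile=5.0):
    """pre_event_stack: (t, rows, cols) coherence time series;
    syn_event: (rows, cols) coherence spanning the event.
    Returns a boolean mask of potentially affected pixels."""
    threshold = np.percentile(pre_event_stack, percentile, axis=0)
    return syn_event < threshold
```

A population-density raster could then be multiplied into the resulting mask to rank PAAs, as the abstract suggests for prioritisation.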


2013 ◽  
Vol 52 (1) ◽  
pp. 169-185 ◽  
Author(s):  
Qing Cao ◽  
Guifu Zhang ◽  
Ming Xue

Abstract This study presents a two-dimensional variational approach to retrieving raindrop size distributions (DSDs) from polarimetric radar data in the presence of attenuation. A two-parameter DSD model, the constrained-gamma model, is used to represent rain DSDs. Three polarimetric radar measurements—reflectivity Z_H, differential reflectivity Z_DR, and specific differential phase K_DP—are optimally used to correct for the attenuation and retrieve DSDs by taking into account measurement error effects. Retrieval results with simulated data demonstrate that the proposed algorithm performs well. Applications to real data collected by the X-band Center for Collaborative Adaptive Sensing of the Atmosphere (CASA) radars and the C-band University of Oklahoma–Polarimetric Radar for Innovations in Meteorology and Engineering (OU-PRIME) also demonstrate the efficacy of this approach.
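As a toy illustration of the variational retrieval idea (not the constrained-gamma relations or forward operators used in the study), the Python sketch below finds DSD parameters that minimise an error-weighted misfit between simulated and observed Z_H, Z_DR and K_DP; the forward relations, observations and error variances are placeholders.

```python
# Toy variational sketch: error-weighted least-squares fit of two DSD parameters.
import numpy as np
from scipy.optimize import minimize

def forward(params):
    """Map DSD parameters (log10 of intercept Nw, median drop size D0 in mm) to
    simulated (Z_H [dBZ], Z_DR [dB], K_DP [deg/km]); purely illustrative relations."""
    log_nw, d0 = params
    zh = 10.0 * log_nw + 18.0 * d0 - 10.0
    zdr = 1.6 * d0 - 0.4
    kdp = 0.2 * 10 ** (log_nw - 3.0) * d0 ** 4
    return np.array([zh, zdr, kdp])

obs = np.array([45.0, 1.5, 0.8])           # assumed measurements
sigma = np.array([1.0, 0.2, 0.3])          # assumed measurement errors

cost = lambda p: np.sum(((forward(p) - obs) / sigma) ** 2)
result = minimize(cost, x0=[3.0, 1.0], method="Nelder-Mead")
print("retrieved (log10 Nw, D0):", result.x)
```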

