An Algorithm Using DBSCAN to Solve the Velocity Dealiasing Problem

2021 ◽  
Vol 2021 ◽  
pp. 1-9
Author(s):  
Wei Zhao ◽  
Qinglan Li ◽  
Kuifeng Jin

Velocity dealiasing is an essential task for correcting the radial velocity data collected by Doppler radar. To improve the accuracy of velocity dealiasing, traditional dealiasing algorithms usually set a series of empirical thresholds, combine three- or four-dimensional data, or introduce other observation data as a reference. In this study, we transform the velocity dealiasing problem into a clustering problem and solve it using the density-based spatial clustering of applications with noise (DBSCAN) method. This algorithm is verified with a case study involving radar data from Tropical Cyclone Mangkhut in 2018. The results show that the accuracy of the proposed algorithm is close to that of the four-dimensional dealiasing (4DD) method proposed by James and Houze; yet it requires only two-dimensional velocity data and eliminates the need for other reference data. The results of the case study also show that the 4DD algorithm filters out many observation gates close to missing data or the radar center, whereas the proposed algorithm tends to retain and correct these gates.
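The clustering idea can be illustrated with a minimal pure-Python DBSCAN: gates whose true velocities differ by roughly twice the Nyquist velocity form dense, well-separated groups in a (range, azimuth, velocity) feature space. The parameters (eps, min_pts), the feature layout, and the toy data below are illustrative assumptions, not the paper's settings.

```python
# Minimal DBSCAN sketch (assumed feature space: range gate, azimuth
# gate, radial velocity in m/s). Not the paper's implementation.

def region_query(points, i, eps):
    """Indices of all points within eps of point i (Euclidean)."""
    p = points[i]
    return [j for j, q in enumerate(points)
            if sum((a - b) ** 2 for a, b in zip(p, q)) <= eps ** 2]

def dbscan(points, eps, min_pts):
    """Return one cluster label per point (-1 = noise)."""
    labels = [None] * len(points)
    cluster = -1
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        neighbors = region_query(points, i, eps)
        if len(neighbors) < min_pts:
            labels[i] = -1               # noise (may become a border point)
            continue
        cluster += 1
        labels[i] = cluster
        seeds = list(neighbors)
        while seeds:
            j = seeds.pop()
            if labels[j] == -1:
                labels[j] = cluster      # claim border point, don't expand
            if labels[j] is not None:
                continue
            labels[j] = cluster
            jn = region_query(points, j, eps)
            if len(jn) >= min_pts:       # core point: keep expanding
                seeds.extend(jn)
    return labels

# Two populations: a smooth field near +20 m/s and an "aliased" patch
# wrapped to about -40 m/s across a Nyquist fold.
gates = [(r, a, 20.0) for r in range(5) for a in range(5)]
gates += [(r, a, -40.0) for r in range(5) for a in range(10, 15)]
labels = dbscan(gates, eps=3.0, min_pts=4)
print(sorted(set(labels)))  # -> [0, 1]: two clusters, no noise
```

Once the aliased cluster is isolated, correcting it amounts to shifting its velocities by a multiple of twice the Nyquist velocity.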

2008 ◽  
Vol 136 (3) ◽  
pp. 945-963 ◽  
Author(s):  
Jidong Gao ◽  
Ming Xue

Abstract A new efficient dual-resolution (DR) data assimilation algorithm is developed based on the ensemble Kalman filter (EnKF) method and tested using simulated radar radial velocity data for a supercell storm. Radar observations are assimilated on both high-resolution and lower-resolution grids using the EnKF algorithm with flow-dependent background error covariances estimated from the lower-resolution ensemble. It is shown that the flow-dependent and dynamically evolved background error covariances thus estimated are effective in producing quality analyses on the high-resolution grid. The DR method has the advantage of significantly reducing the computational cost of the EnKF analysis. In the system, the lower-resolution ensemble provides the flow-dependent background error covariance, while the single high-resolution forecast and analysis provide the benefit of higher resolution, which is important for resolving the internal structures of thunderstorms. The relative smoothness of the covariance obtained from the lower 4-km-resolution ensemble does not appear to significantly degrade the quality of the analysis. This is because the cross covariance among different variables is of first-order importance for “retrieving” unobserved variables from the radar radial velocity data. For the DR analysis, an ensemble size of 40 appears to be a reasonable choice with the use of a 4-km horizontal resolution in the ensemble and a 1-km resolution in the high-resolution analysis. Several sensitivity tests show that the DR EnKF system is quite robust to different observation errors. A 4-km thinned data resolution is a compromise that is acceptable under the constraint of real-time applications. A data density of 8 km leads to a significant degradation in the analysis.
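The core of the scheme described above is the standard stochastic EnKF update, with the background error covariance estimated from the N-member lower-resolution ensemble and the resulting gain applied on the high-resolution grid. In standard notation (a sketch; the paper's operators and perturbation details may differ):

```latex
\mathbf{P}^{b} \approx \frac{1}{N-1}\sum_{i=1}^{N}
  \left(\mathbf{x}_i^{b}-\overline{\mathbf{x}^{b}}\right)
  \left(\mathbf{x}_i^{b}-\overline{\mathbf{x}^{b}}\right)^{\mathrm{T}},
\qquad
\mathbf{K} = \mathbf{P}^{b}\mathbf{H}^{\mathrm{T}}
  \left(\mathbf{H}\mathbf{P}^{b}\mathbf{H}^{\mathrm{T}}+\mathbf{R}\right)^{-1},
\qquad
\mathbf{x}_i^{a} = \mathbf{x}_i^{b}
  + \mathbf{K}\left(\mathbf{y}_i - \mathbf{H}\mathbf{x}_i^{b}\right),
```

where $\mathbf{H}$ maps model state to radial velocity observations $\mathbf{y}_i$ and $\mathbf{R}$ is the observation error covariance. The cross covariances in $\mathbf{P}^{b}$ are what allow unobserved variables (e.g., temperature, the unobserved wind components) to be retrieved from radial velocity alone.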


2020 ◽  
Vol 2020 ◽  
pp. 1-12
Author(s):  
Yue Yuan ◽  
Ping Wang ◽  
Di Wang ◽  
Junzhi Shi

Velocity dealiasing is essential for automatic weather phenomenon identification, nowcasting, and disaster monitoring based on radial velocity data. Noisy data, strong wind shear, and isolated echo regions in Doppler radar radial velocity data severely interfere with velocity dealiasing algorithms. This paper proposes a two-step velocity dealiasing algorithm based on the minimization of velocity differences between regions to solve this problem. The first step corrects aliased velocities by minimizing the sum of gradients within every region, eliminating abnormal velocity gradients between points. The interference of noisy data and strong wind shear can be reduced by minimizing the total gradient within a region. The second step dealiases velocities using the velocity differences between different isolated regions. The velocity of an unknown isolated region is determined by the velocities of all known regions. This step improves the dealiasing results of isolated regions. In this paper, 604 volume scan samples, including typhoons, squall lines, and heavy precipitation, were used to test the algorithm. The statistical results and analysis show that the proposed algorithm can dealias the velocity field with a high probability of detection and a low false alarm rate.
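The gradient-minimization idea can be reduced to a one-dimensional sketch: each gate's Nyquist fold number n is chosen so that v + 2·n·Vn minimizes the difference (gradient) to the previously corrected gate. This is a simplification of the paper's region-wide minimization, written for a single radial with an assumed trustworthy first gate; the fold-number search range and the example Nyquist velocity are assumptions.

```python
# Hedged 1-D sketch of gradient minimization along one radial.
# Assumes the first gate is unaliased; the paper minimizes the sum of
# gradients over whole regions instead of gate by gate.

def dealias_radial(velocities, v_nyq):
    """Return velocities with 2*Vn folds removed along one radial."""
    corrected = [velocities[0]]
    for v in velocities[1:]:
        prev = corrected[-1]
        # candidate fold numbers; |n| > 2 is rare for weather radars
        best = min((v + 2 * n * v_nyq for n in range(-2, 3)),
                   key=lambda cand: abs(cand - prev))
        corrected.append(best)
    return corrected

v_nyq = 26.5                    # example Nyquist velocity, m/s
# True winds rise smoothly through +30 m/s; the radar reports the
# aliased value 30 - 2*26.5 = -23 m/s for the folded gates.
observed = [20.0, 24.0, -23.0, -21.0, 25.0]
print(dealias_radial(observed, v_nyq))  # -> [20.0, 24.0, 30.0, 32.0, 25.0]
```

The second inter-region step would then apply the same fold-number choice once per isolated region, using already-dealiased regions as the reference.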


2006 ◽  
Vol 23 (9) ◽  
pp. 1239-1248 ◽  
Author(s):  
Jian Zhang ◽  
Shunxin Wang

Abstract An automated 2D multipass velocity dealiasing scheme has been developed to correct velocity fields when wind velocities are very large compared to the Nyquist velocity of the weather Doppler radars. The new velocity dealiasing algorithm is based on the horizontal continuity of velocity fields. The algorithm first determines a set of reference radials and gates by finding the weakest wind region. Then, starting from these reference radials and gates, the scheme checks continuity among adjacent gates and corrects velocity values whose differences are close to twice the Nyquist velocity. Multiple passes of unfolding are performed, and velocities identified as “folded” with low confidence in an earlier pass are not unfolded until a discontinuity is detected with high confidence in a subsequent pass. The new velocity dealiasing scheme does not need external reference velocity data as many existing algorithms do, making it more easily applicable. Over 1000 radar volume scans that include tornadoes, hurricanes, and typhoons were selected to test and evaluate the new algorithm. The results show that the new algorithm is very robust and computationally efficient. In cases with many data voids, the new algorithm shows improvements over the current WSR-88D operational velocity dealiasing scheme. The new dealiasing algorithm is a simple, stand-alone program that can be a very useful tool for various Doppler radar data users.
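The reference-gate idea can be sketched in one dimension: start from the weakest-wind gate (least likely to be aliased) and unfold outward in both directions, correcting any gate-to-gate jump larger than the Nyquist velocity. The names and the single-radial layout are illustrative; the operational scheme works on full 2-D fields with multiple passes and confidence checks.

```python
# Hedged 1-D sketch of continuity-based unfolding from a weak-wind
# reference gate. Not the operational 2-D multipass implementation.

def unfold_from_reference(radial, v_nyq):
    """Unfold one radial outward from its weakest-wind gate."""
    ref = min(range(len(radial)), key=lambda i: abs(radial[i]))
    out = list(radial)
    for step in (1, -1):                  # outward in both directions
        i = ref + step
        while 0 <= i < len(out):
            diff = out[i] - out[i - step]
            if abs(diff) > v_nyq:         # jump larger than Nyquist
                # remove the nearest multiple of 2*Vn
                out[i] -= 2 * v_nyq * round(diff / (2 * v_nyq))
            i += step
    return out

# Last gate is folded: true +23 m/s reported as 23 - 2*25 = -27 m/s.
radial = [-24.0, -10.0, -2.0, 8.0, 22.0, -27.0]
print(unfold_from_reference(radial, 25.0))
# -> [-24.0, -10.0, -2.0, 8.0, 22.0, 23.0]
```

Starting from the weakest winds is what removes the need for an external reference velocity field.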


2013 ◽  
Vol 28 (1) ◽  
pp. 194-211 ◽  
Author(s):  
Jennifer F. Newman ◽  
Valliappa Lakshmanan ◽  
Pamela L. Heinselman ◽  
Michael B. Richman ◽  
Travis M. Smith

Abstract The current tornado detection algorithm (TDA) used by the National Weather Service produces a large number of false detections, primarily because it calculates azimuthal shear in a manner that is adversely impacted by noisy velocity data and range-degraded velocity signatures. Coincident with the advent of new radar-derived products and ongoing research involving new weather radar systems, the National Severe Storms Laboratory is developing an improved TDA. A primary component of this algorithm is the local, linear least squares derivatives (LLSD) azimuthal shear field. The LLSD method incorporates rotational derivatives of the velocity field and is affected less strongly by noisy velocity data in comparison with traditional “peak to peak” azimuthal shear calculations. LLSD shear is generally less range dependent than peak-to-peak shear, although some range dependency is unavoidable. The relationship between range and the LLSD shear values of simulated circulations was examined to develop a range correction for LLSD shear. A linear regression and artificial neural networks (ANNs) were investigated as range-correction models. Both methods were used to produce fits for the simulated shear data, although the ANN excelled as it could capture the nonlinear nature of the data. The range-correction methods were applied to real radar data from tornadic and nontornadic events to measure the capacity of the corrected shear to discriminate between tornadic and nontornadic circulations. The findings presented herein suggest that both methods increased shear values during tornadic periods by nearly an order of magnitude, facilitating differentiation between tornadic and nontornadic scans in tornadic events.
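The contrast between LLSD and peak-to-peak shear can be shown with a one-dimensional sketch: fit a line to velocity versus azimuthal arc distance over a local window and take the slope, rather than dividing the extreme velocity difference by the window width. The window layout and toy data are assumptions for illustration; the actual LLSD field uses a 2-D kernel over range and azimuth.

```python
# Hedged sketch: least-squares slope vs. peak-to-peak shear estimate
# over one local azimuthal window (arc distance s in m, velocity in m/s).

def llsd_shear(arc_dist, velocities):
    """Least-squares slope dv/ds over one window (closed form)."""
    n = len(arc_dist)
    s_mean = sum(arc_dist) / n
    v_mean = sum(velocities) / n
    num = sum((s - s_mean) * (v - v_mean)
              for s, v in zip(arc_dist, velocities))
    den = sum((s - s_mean) ** 2 for s in arc_dist)
    return num / den

def peak_to_peak_shear(arc_dist, velocities):
    """Extreme velocity difference divided by the window width."""
    return (max(velocities) - min(velocities)) / (max(arc_dist) - min(arc_dist))

# Gates every 250 m of arc length; a true 0.01 s^-1 shear plus one
# noisy velocity estimate near the window edge.
s = [i * 250.0 for i in range(9)]
v = [0.01 * si for si in s]
v[7] += 15.0                    # single noisy gate
print(llsd_shear(s, v), peak_to_peak_shear(s, v))
```

Here the noisy gate shifts the least-squares slope from 0.010 to 0.013 s^-1, while the peak-to-peak estimate jumps to 0.01625 s^-1: averaging over the whole window damps the influence of any single bad velocity estimate.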


2008 ◽  
Vol 25 (10) ◽  
pp. 1845-1858 ◽  
Author(s):  
Mario Majcen ◽  
Paul Markowski ◽  
Yvette Richardson ◽  
David Dowell ◽  
Joshua Wurman

Abstract This note assesses the improvements in dual-Doppler wind syntheses by employing a multipass Barnes objective analysis in the interpolation of radial velocities to a Cartesian grid, as opposed to a more typical single-pass Barnes objective analysis. Steeper response functions can be obtained by multipass objective analyses; that is, multipass objective analyses are less damping at well-resolved wavelengths (e.g., 8–20Δ, where Δ is the data spacing) than single-pass objective analyses, while still suppressing small-scale (<4Δ) noise. Synthetic dual-Doppler data were generated from a three-dimensional numerical simulation of a supercell thunderstorm in a way that emulates the data collection by two mobile radars. The synthetic radial velocity data from a pair of simulated radars were objectively analyzed to a grid, after which the three-dimensional wind field was retrieved by iteratively computing the horizontal divergence and integrating the anelastic mass continuity equation. Experiments with two passes and three passes of the Barnes filter were performed, in addition to a single-pass objective analysis. Comparison of the analyzed three-dimensional wind fields to the model wind fields suggests that multipass objective analysis of radial velocity data prior to dual-Doppler wind synthesis is probably worth the added computational cost. The improvements in the wind syntheses derived from multipass objective analyses are even more apparent for higher-order fields such as vorticity and divergence, and for trajectory calculations and pressure/buoyancy retrievals.
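The multipass idea can be sketched in one dimension: pass 1 is a Gaussian-weighted average of the observations, and pass 2 re-analyzes the observation-space residuals with a sharper weight function (kappa scaled by gamma < 1), restoring amplitude at well-resolved wavelengths while leaving small-scale noise suppressed. The values of kappa and gamma and the grid layout below are assumptions for illustration.

```python
import math

# Hedged 1-D two-pass Barnes analysis sketch; real dual-Doppler
# syntheses interpolate radial velocities to a 3-D Cartesian grid.

def barnes_pass(grid, obs_x, obs_v, kappa):
    """One Barnes pass: Gaussian-weighted average at each grid point."""
    out = []
    for x in grid:
        w = [math.exp(-(x - xo) ** 2 / kappa) for xo in obs_x]
        out.append(sum(wi * vi for wi, vi in zip(w, obs_v)) / sum(w))
    return out

def barnes_two_pass(grid, obs_x, obs_v, kappa, gamma=0.3):
    """Second pass re-analyzes residuals with a sharper weight."""
    first = barnes_pass(grid, obs_x, obs_v, kappa)
    at_obs = barnes_pass(obs_x, obs_x, obs_v, kappa)
    resid = [v - a for v, a in zip(obs_v, at_obs)]
    corr = barnes_pass(grid, obs_x, resid, kappa * gamma)
    return [f + c for f, c in zip(first, corr)]

# A sine wave of wavelength 8*delta sampled at unit spacing: the
# single pass damps its crest amplitude, the second pass restores much
# of it.
obs_x = [float(i) for i in range(21)]
obs_v = [math.sin(2 * math.pi * x / 8.0) for x in obs_x]
one = barnes_pass(obs_x, obs_x, obs_v, kappa=4.0)
two = barnes_two_pass(obs_x, obs_x, obs_v, kappa=4.0)
print(one[10], two[10])         # crest value: pass 2 is closer to 1.0
```

The same mechanism explains why the gains are largest for derivative fields such as vorticity and divergence: amplitude lost at 8–20Δ wavelengths is amplified when differentiated.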


2013 ◽  
Vol 726-731 ◽  
pp. 4541-4546 ◽  
Author(s):  
Li Li Yang ◽  
Yi Yang ◽  
You Cun Qi ◽  
Xue Xing Qiu ◽  
Zhong Qiang Gong

Convective and stratiform precipitation have different formation mechanisms, so different reflectivity-rainfall rate (Z-R) relations should be used for them. A heavy precipitation event on 22 July 2009 (UTC) in Anhui Province is analyzed with Hefei Doppler radar data and 269 rain gauges. First, the precipitation type is obtained by a fuzzy logic algorithm applied to the radar data. Then the reflectivity values are converted to rainfall rates using an adaptive Z-R relation according to rain type. Tested on this case against gauge observations, the adaptive approach showed significant improvements over the current operational Z-R QPE. Results also show that the precipitation event comprised both stratiform and convective precipitation, and the rain estimated from radar corresponds well with the cloud classification.
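The type-dependent conversion can be sketched as follows. The coefficient pairs below are common textbook values (a Marshall-Palmer-like pair for stratiform rain and a convective pair used operationally for the WSR-88D), assumed here for illustration; the paper's adaptive relations may differ.

```python
# Hedged sketch of rain-type-dependent Z-R conversion, Z = a * R^b.
# Coefficients are assumed illustrative values, not the paper's.

Z_R_COEFFS = {
    "stratiform": (200.0, 1.6),   # Z = 200 * R^1.6
    "convective": (300.0, 1.4),   # Z = 300 * R^1.4
}

def rain_rate(dbz, rain_type):
    """Convert reflectivity (dBZ) to rain rate (mm/h)."""
    a, b = Z_R_COEFFS[rain_type]
    z_linear = 10.0 ** (dbz / 10.0)   # dBZ -> Z in mm^6 m^-3
    return (z_linear / a) ** (1.0 / b)

# The same 50-dBZ echo implies a heavier rate when classified as
# convective (~63 mm/h) than as stratiform (~49 mm/h).
print(rain_rate(50.0, "stratiform"), rain_rate(50.0, "convective"))
```

This is why a misclassified echo biases the QPE: the two relations diverge increasingly at high reflectivity.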


2017 ◽  
Vol 145 (7) ◽  
pp. 2611-2633 ◽  
Author(s):  
Rui Qin ◽  
Mingxuan Chen

A case study is presented of convection initiation (CI) resulting from the merger of a cold front with a dryline in southwestern Beijing, China, on the afternoon of 11 June 2011. This process is analyzed with S-band Doppler radar data, surface automatic weather station data, and mesoscale numerical simulation results. The formation of this dryline is analogous to that on the Great Plains of the United States, and it is conducive to CI with mesoscale updrafts generated from the baroclinic frontogenesis, and with favorable instability immediately on the moist side. Prior to the front–dryline merger, as the cold front approached the observed boundary layer convergence line, or the simulated meso-γ-scale secondary dryline, CI occurred ahead of the cold front with little contribution from frontogenetic baroclinity of the dryline. The cold front then merged with the dryline, and the baroclinity of the dryline was enhanced by the associated convergence, to a degree comparable to that caused by frontogenesis of the dryline itself, thus leading to more CI. During the front–dryline merger, meso-γ-scale discrete cold pools associated with the cold front led to a diverse distribution of CI.


Author(s):  
Ying Wang ◽  
Zhaoxia Pu

Abstract The benefits of assimilating NEXRAD (Next Generation Weather Radar) radial velocity data for convective systems have been demonstrated in previous studies. However, the impacts of assimilating such high spatial and temporal resolution observations on hurricane forecasts have not been demonstrated with the NCEP (National Centers for Environmental Prediction) HWRF (Hurricane Weather Research and Forecasting) system. This study investigates the impacts of NEXRAD radial velocity data on forecasts of the evolution of landfalling hurricanes with different configurations of data assimilation. The sensitivity of data assimilation results to influencing parameters within the data assimilation system, such as the maximum range of the radar data, super-observations, horizontal and vertical localization correlation length scales, and the weight of background error covariances, is examined. Two hurricane cases, Florence and Michael, which occurred in the summer of 2018, are chosen for a series of experiments. Results show that hurricane intensity, the asymmetric structure of inland wind and precipitation, and quantitative precipitation forecasting are improved. Suggestions for implementation of operational configurations are provided.
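The "super-observation" thinning mentioned above can be sketched simply: raw radial-velocity gates are averaged into coarse range-azimuth bins, reducing data volume and correlated observation error before assimilation. The bin sizes and the dict-based layout are assumptions for illustration, not the HWRF/GSI implementation.

```python
# Hedged sketch of super-observation binning for radial velocity.
# Bin widths (5 km in range, 5 degrees in azimuth) are assumed values.

def superob(gates, d_range=5000.0, d_az=5.0):
    """gates: iterable of (range_m, azimuth_deg, radial_velocity).
    Returns one averaged observation per (range, azimuth) bin."""
    bins = {}
    for rng, az, v in gates:
        key = (int(rng // d_range), int(az // d_az))
        bins.setdefault(key, []).append(v)
    return {key: sum(vs) / len(vs) for key, vs in bins.items()}

raw = [(1200.0, 31.0, 10.0), (2400.0, 33.5, 12.0),   # same bin
       (7300.0, 31.0, 20.0)]                          # farther bin
print(superob(raw))   # -> {(0, 6): 11.0, (1, 6): 20.0}
```

Varying the bin sizes is one of the sensitivity knobs the study examines, alongside localization length scales and background error weights.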

