COMPREHENSIVE RADAR DATA FOR THE CONTIGUOUS UNITED STATES: MULTI-YEAR REANALYSIS OF REMOTELY SENSED STORMS

Abstract The Multi-Year Reanalysis of Remotely Sensed Storms (MYRORSS) data set blends radar data from the WSR-88D network and Near-Storm Environmental (NSE) model analyses using the Multi-Radar Multi-Sensor (MRMS) framework. The data set spans the WSR-88D archive from 1998 through 2011, processing all valid single-radar volumes to produce a seamless three-dimensional reflectivity volume over the entire contiguous United States with an approximate 5-min update frequency. The three-dimensional grid has an approximate 1-km by 1-km horizontal spacing and a stretched vertical grid that extends to 20 km MSL with a maximum vertical spacing of 1 km. Several reflectivity-derived, severe-storm-related products are also produced, which leverage the ability to merge the MRMS and NSE data. Two Doppler-velocity-derived azimuthal shear layer-maximum products are produced at a higher horizontal resolution of approximately 0.5 km by 0.5 km. The data set underwent intensive manual quality control to ensure that all available and valid data were included while excluding highly problematic radar volumes; these volumes were a negligible percentage of the overall data set but caused large data errors in some cases. The data set has applications to radar-based climatologies, post-event analysis, machine learning, model verification, and warning improvement. Details of the manual quality control process are included, and examples of some of these applications are presented.
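The abstract constrains the vertical grid only loosely: it must reach 20 km MSL with vertical spacing never exceeding 1 km. A minimal sketch of generating such a stretched set of levels is shown below; the starting level, initial spacing, and stretch factor are hypothetical choices, not the actual MYRORSS specification.

```python
# Illustrative stretched vertical grid for a 3-D reflectivity mosaic.
# The exact MYRORSS level spec is not given in the abstract; the values
# here are assumptions chosen only to satisfy the stated constraints:
# the grid tops out at 20 km MSL and spacing never exceeds 1 km.
def stretched_levels():
    levels = [0.5]          # hypothetical lowest level (km MSL)
    spacing = 0.25          # hypothetical near-surface spacing (km)
    while levels[-1] < 20.0:
        levels.append(min(levels[-1] + spacing, 20.0))
        spacing = min(spacing * 1.25, 1.0)   # stretch spacing, capped at 1 km
    return levels

levels = stretched_levels()
```

Fine spacing near the surface resolves low-level reflectivity structure, while the 1-km cap keeps the total number of levels manageable for a CONUS-wide grid.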

2021 ◽  
Vol 7 (1) ◽  
pp. 47-58
Author(s):  
Roman Fedorov ◽  
Oleg Berngardt

The paper considers the implementation of algorithms for the automatic search for signals scattered by meteor trails in EKB ISTP SB RAS radar data. In general, the algorithm is similar to the algorithms adopted in specialized meteor systems. The algorithm is divided into two stages: detecting a meteor echo and determining its parameters. We show that on the day of the Geminid shower maximum, December 13, 2016, the scattered signals detected by the algorithm are foreshortening and correspond to scattering by irregularities extended in the direction of the meteor shower radiant. This confirms that the source of the signals detected by the algorithm is meteor trails. We implement an additional program for indirect trail height determination. It uses the echo decay time and the NRLMSIS-00 atmosphere model to estimate the trail height. The data set from 2017 to 2019 is used for further testing of the algorithm. We demonstrate a correlation between the Doppler velocities calculated by the new algorithm and by FitACF. We present a solution, obtained by the weighted least squares method, of the inverse problem of reconstructing the neutral wind velocity vector from the data. We compare the calculated speeds and directions of horizontal neutral winds from the three-dimensional wind model with those from the HWM-14 horizontal wind model. The algorithm allows real-time processing of scattered signals and has been put into continuous operation at the EKB ISTP SB RAS radar.
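The weighted-least-squares step can be sketched as follows: each meteor echo contributes one line-of-sight Doppler velocity at a known azimuth, and the horizontal wind is the vector that best fits all projections. The geometry below (horizontal components only, unit elevation factor) and the uniform weights are illustrative assumptions; the paper's exact formulation may differ.

```python
import math

# Recover a horizontal neutral wind (u east, v north) from line-of-sight
# Doppler velocities of meteor echoes seen at different azimuths, by
# weighted least squares. Model: vlos_i = u*sin(az_i) + v*cos(az_i).
def wls_wind(azimuths_deg, vlos, weights):
    Suu = Suv = Svv = bu = bv = 0.0
    for az, vr, w in zip(azimuths_deg, vlos, weights):
        s, c = math.sin(math.radians(az)), math.cos(math.radians(az))
        Suu += w * s * s; Suv += w * s * c; Svv += w * c * c
        bu += w * s * vr; bv += w * c * vr
    det = Suu * Svv - Suv * Suv        # 2x2 normal-equation determinant
    u = (Svv * bu - Suv * bv) / det
    v = (Suu * bv - Suv * bu) / det
    return u, v

# Synthetic check: a known wind projected onto six beam directions.
az = [0, 45, 90, 135, 200, 280]
vr = [30.0 * math.sin(math.radians(a)) - 10.0 * math.cos(math.radians(a)) for a in az]
u, v = wls_wind(az, vr, [1.0] * len(az))
```

With noise-free synthetic projections the true wind is recovered exactly; with real echoes the weights would down-weight low-quality detections.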


2013 ◽  
Vol 71 (1) ◽  
pp. 332-348 ◽  
Author(s):  
Cameron R. Homeyer

Abstract The mechanism responsible for the formation of the enhanced-V infrared cloud-top feature observed above tropopause-penetrating thunderstorms is not well understood. A new method for combining volumetric radar reflectivity from individual radars into three-dimensional composites with high vertical resolution (1 km) is introduced and used to test various formation mechanisms proposed in the literature. For analysis, a set of 89 enhanced-V storms over the eastern continental United States is identified in the 10-yr period from 2001 to 2010 using geostationary satellite data. The background atmospheric state for each storm is determined using the Interim ECMWF Re-Analysis (ERA-Interim) and radiosonde observations. In conjunction with the infrared temperature fields, analysis of the radar data in coordinates relative to the location of the overshooting convective top and at altitudes relative to the tropopause suggests that above-anvil (stratospheric) cirrus clouds are the most likely mechanism for the formation of the enhanced V.
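The tropopause-relative compositing idea can be sketched simply: echo altitudes (km MSL) are re-expressed relative to the local tropopause height so storms in different environments line up, and tropopause-penetrating (overshooting) echoes fall above zero. The input layout is an assumption for illustration only.

```python
# Re-express echo-top altitudes relative to the local tropopause and flag
# tropopause-penetrating (overshooting) tops. Inputs are parallel lists of
# echo-top height and tropopause height in km MSL (illustrative layout).
def tropopause_relative(echo_tops_km, tropopause_km):
    rel = [z - zt for z, zt in zip(echo_tops_km, tropopause_km)]
    overshooting = [r > 0.0 for r in rel]
    return rel, overshooting

rel, over = tropopause_relative([14.2, 11.0, 16.5], [12.0, 12.5, 15.0])
```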


2015 ◽  
Vol 7 (2) ◽  
pp. 289-297 ◽  
Author(s):  
L. Holinde ◽  
T. H. Badewien ◽  
J. A. Freund ◽  
E. V. Stanev ◽  
O. Zielinski

Abstract. The quality of water level time series strongly varies between periods of high- and low-quality sensor data. In this paper we present the processing steps used to generate high-quality water level data from water pressure measured at the Time Series Station (TSS) Spiekeroog. The TSS is positioned in a tidal inlet between the islands of Spiekeroog and Langeoog in the East Frisian Wadden Sea (southern North Sea). The processing steps cover sensor drift, outlier identification, interpolation of data gaps, and quality control. A central step is the removal of outliers. For this process an absolute threshold of 0.25 m (10 min)−1 was selected, which still preserves the water level rise and fall during extreme events, as shown during the quality control process. A second important feature of the data processing is the interpolation of data gaps, which is accomplished with high confidence in generating trustworthy data. Applying these methods, a 10-year data set (December 2002–December 2012) of water level information at the TSS was processed, resulting in a 7-year time series (2005–2011). Supplementary data are available at doi:10.1594/PANGAEA.843740.
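The outlier and gap-filling steps described above can be sketched as follows: any 10-min sample that jumps more than the 0.25 m absolute threshold from the last good value is marked as a gap, and interior gaps are then linearly interpolated. The simple linear fill and variable names are assumptions, not the actual TSS processing code.

```python
THRESHOLD = 0.25  # m per 10-min sample, the absolute threshold from the text

def despike_and_fill(levels):
    """Mark jumps > THRESHOLD between consecutive samples as gaps (None),
    then linearly interpolate interior gaps."""
    clean = list(levels)
    last_good = clean[0]
    for i in range(1, len(clean)):
        if clean[i] is None:
            continue
        if last_good is not None and abs(clean[i] - last_good) > THRESHOLD:
            clean[i] = None          # flagged as an outlier, becomes a gap
        else:
            last_good = clean[i]
    i = 0
    while i < len(clean):            # linear interpolation of interior gaps
        if clean[i] is None:
            j = i
            while j < len(clean) and clean[j] is None:
                j += 1
            if 0 < i and j < len(clean):
                lo, hi = clean[i - 1], clean[j]
                for k in range(i, j):
                    clean[k] = lo + (hi - lo) * (k - i + 1) / (j - i + 1)
            i = j
        else:
            i += 1
    return clean

cleaned = despike_and_fill([1.00, 1.05, 1.80, 1.10, None, 1.20])
```

Note that a slow rise during a storm surge passes the test sample-by-sample, which is why an absolute per-step threshold preserves extreme events.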


2007 ◽  
Vol 46 (8) ◽  
pp. 1196-1213 ◽  
Author(s):  
Brenda A. Dolan ◽  
Steven A. Rutledge

Abstract Polarimetric Doppler radars provide valuable information about the kinematic and microphysical structure of storms. However, in-depth analysis using radar products, such as Doppler-derived wind vectors and hydrometeor identification, has been difficult to achieve in (near) real time, mainly because of the large volumes of data generated by these radars, the lack of quick access to these data, and the challenge of applying quality-control measures in real time. This study focuses on modifying and automating several radar-analysis and quality-control algorithms currently used in postprocessing, and on merging the resulting data from several radars into an integrated analysis and display in (near) real time. Although the method was developed for a specific network of four Doppler radars, two Weather Surveillance Radar-1988 Doppler (WSR-88D) radars (KFTG and KCYS) and two Colorado State University (CSU) research radars [Pawnee and CSU–University of Chicago–Illinois State Water Survey (CSU–CHILL)], the software is easily adaptable to any radar platform or network of radars. The software includes code to synthesize radial velocities into three-dimensional wind vectors and algorithms for automatic quality control of the raw polarimetric data, hydrometeor identification, and rainfall rate. The software was successfully tested during the summers of 2004 and 2005 at the CSU–CHILL radar facility, ingesting data from the four-radar network. The display software allows users to view mosaics of reflectivity, wind vectors, and rain rates; to zoom in and out of radar features easily; to create vertical cross sections; to contour data; and to archive data in real time. Despite a lag time of approximately 10 min, the software proved invaluable for diagnosing areas of intense rainfall, hail, strong updrafts, and other features such as mesocyclones and convergence lines. A case study is presented to demonstrate the utility of the software.
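The core of radial-velocity synthesis can be sketched with the classic two-radar case: each radar measures the wind's projection onto its beam, and solving the resulting 2x2 system at a shared grid point recovers the horizontal wind. Elevation-angle and hydrometeor fall-speed terms are ignored here for clarity; this is a geometry sketch, not the study's software.

```python
import math

# Dual-Doppler synthesis at one grid point: two radial velocities measured
# along different azimuths determine the horizontal wind (u east, v north).
# Model per radar: vr = u*sin(az) + v*cos(az).
def dual_doppler(az1_deg, vr1, az2_deg, vr2):
    s1, c1 = math.sin(math.radians(az1_deg)), math.cos(math.radians(az1_deg))
    s2, c2 = math.sin(math.radians(az2_deg)), math.cos(math.radians(az2_deg))
    det = s1 * c2 - s2 * c1   # ill-conditioned when beams are near-parallel
    u = (vr1 * c2 - vr2 * c1) / det
    v = (s1 * vr2 - s2 * vr1) / det
    return u, v

# Beams crossing at right angles give the best-conditioned solution.
u, v = dual_doppler(0.0, 5.0, 90.0, 10.0)
```

The `det` term explains why multi-radar networks are laid out so beams cross at wide angles: nearly parallel beams make the retrieval numerically unstable.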


2019 ◽  
Vol 100 (9) ◽  
pp. 1739-1752 ◽  
Author(s):  
Elena Saltikoff ◽  
Katja Friedrich ◽  
Joshua Soderholm ◽  
Katharina Lengfeld ◽  
Brian Nelson ◽  
...  

Abstract Weather radars have been widely used to detect and quantify precipitation and to nowcast severe weather for more than 50 years. Operational weather radars generate huge three-dimensional datasets that can accumulate to terabytes per day. It is therefore essential to review what can be done with the vast amounts of existing data and how present datasets should be managed for future climatologists. All weather radars provide the reflectivity factor, and this is the main parameter to be archived. Saving reflectivity as volumetric data in the original spherical coordinates allows for studies of the three-dimensional structure of precipitation, which can be applied to understand a number of processes, for example, analyzing hail or thunderstorm modes. Doppler velocity and polarimetric moments also have numerous applications for climate studies, for example, quality improvement of reflectivity and rain-rate retrievals, and for interrogating microphysical and dynamical processes. However, observational data alone are not useful if they are not accompanied by sufficient metadata. Since the lifetime of a radar ranges between 10 and 20 years, instruments are typically replaced or upgraded during climatologically relevant time periods. As a result, present metadata often do not apply to past data. This paper outlines the work of the Radar Task Team set up by the Atmospheric Observation Panel for Climate (AOPC) and summarizes results from a recent survey on the existence and availability of long time series. We also provide recommendations for archiving current and future data, and examples of climatological studies in which radar data have already been used.
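One way to make metadata survive instrument replacements, in the spirit of the recommendations above, is to attach a validity period to each hardware/processing configuration so past data can always be matched to the configuration that produced it. The field names below are hypothetical, not a standard.

```python
# Illustrative metadata record for an archived radar time series: each
# hardware/processing configuration carries its own validity period, so
# metadata for past data remains resolvable after upgrades.
site_metadata = {
    "site_id": "XYZ",            # hypothetical radar site identifier
    "configurations": [
        {"valid_from": "1998-01-01", "valid_to": "2007-06-30",
         "band": "C", "beam_width_deg": 1.0, "doppler": False},
        {"valid_from": "2007-07-01", "valid_to": None,   # still current
         "band": "C", "beam_width_deg": 0.95, "doppler": True},
    ],
}

def config_for(date, meta):
    """Return the configuration valid on a given ISO date (string compare
    works because ISO dates sort lexicographically)."""
    for cfg in meta["configurations"]:
        if cfg["valid_from"] <= date and (cfg["valid_to"] is None or date <= cfg["valid_to"]):
            return cfg
    return None
```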


2006 ◽  
Vol 23 (7) ◽  
pp. 865-887 ◽  
Author(s):  
Katja Friedrich ◽  
Martin Hagen ◽  
Thomas Einfalt

Abstract Over the last few years the use of weather radar data has become a fundamental part of various applications such as rain-rate estimation, nowcasting of severe weather events, and assimilation into numerical weather prediction models. The increasing demand for radar data necessitates automated, flexible, and modular quality control. In this paper a quality control procedure is developed for radar reflectivity factors, polarimetric parameters, and Doppler velocity. It consists of several modules that can be extended, modified, and omitted depending on the user requirements, weather situation, and radar characteristics. Data quality is quantified on a pixel-by-pixel basis and encoded into a quality-index field that can be easily interpreted by an untrained end user or by an automated scheme that generates radar products. The quality-index algorithms detect and quantify the influence of beam broadening, the height of the first radar echo, ground clutter contamination, returns from non-weather-related objects, and the attenuation of electromagnetic energy by hydrometeors on the quality of the radar measurement. The quality-index field is transferred together with the radar data to the end user, who chooses the amount of data and the level of quality used for further processing. The calculation of quality-index fields is based on data measured by the polarimetric C-band Doppler radar (POLDIRAD) located in the Alpine foreland in southern Germany.
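A pixel-wise quality index of this kind can be sketched as modules that each return a quality factor in [0, 1], combined into one number. The factor formulas and the multiplicative combination below are illustrative assumptions, not the POLDIRAD scheme.

```python
# Pixel-wise quality index in the spirit described above: each module
# maps one degrading influence to a factor in [0, 1]; the product is the
# combined index. All scale constants are illustrative assumptions.
def quality_index(beam_width_km, first_echo_height_km, clutter_prob, attenuation_db):
    q_beam = max(0.0, 1.0 - beam_width_km / 4.0)          # beam broadening
    q_height = max(0.0, 1.0 - first_echo_height_km / 10.0)  # first-echo height
    q_clutter = 1.0 - clutter_prob                          # clutter likelihood
    q_atten = max(0.0, 1.0 - attenuation_db / 10.0)         # path attenuation
    return q_beam * q_height * q_clutter * q_atten
```

A multiplicative combination means any single module can veto a pixel (factor 0), which suits a scheme where end users filter by a minimum acceptable quality.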


Author(s):  
J. K. Samarabandu ◽  
R. Acharya ◽  
D. R. Pareddy ◽  
P. C. Cheng

In the study of cell organization in a maize meristem, direct viewing of confocal optical sections in 3D (by means of 3D projection of the volumetric data set, Figure 1) becomes very difficult and confusing because of the large number of nuclei involved. Numerical description of the cellular organization (e.g., the position, size, and orientation of each structure) and computer graphic presentation are some of the solutions for effectively studying the structure of such a complex system. An attempt at data reduction by means of manually contouring cell nuclei in 3D was reported (Summers et al., 1990). Apart from being labour intensive, this 3D digitization technique suffers from the inaccuracies of manual 3D tracing related to the depth perception of the operator. However, it does demonstrate that reducing a stack of confocal images to a 3D graphic representation helps to visualize and analyze complex tissues (Figure 2). This procedure also significantly reduces the computational burden in an interactive operation.
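The "numerical description" idea can be sketched as follows: given a labeled 3-D volume (one integer label per nucleus, 0 = background), report each nucleus's position (centroid) and size (voxel count). This pure-Python sketch assumes segmentation has already been done; it is not the authors' pipeline.

```python
# Reduce a labeled 3-D volume to per-nucleus statistics: voxel count (size)
# and centroid (position). volume is a nested list indexed [z][y][x].
def describe_nuclei(volume):
    stats = {}
    for z, plane in enumerate(volume):
        for y, row in enumerate(plane):
            for x, label in enumerate(row):
                if label == 0:
                    continue                      # background voxel
                n, sz, sy, sx = stats.get(label, (0, 0, 0, 0))
                stats[label] = (n + 1, sz + z, sy + y, sx + x)
    return {lab: {"size": n, "centroid": (sz / n, sy / n, sx / n)}
            for lab, (n, sz, sy, sx) in stats.items()}

# Tiny 2x2x2 volume with two labeled "nuclei".
vol = [[[0, 1], [1, 1]], [[0, 0], [2, 2]]]
desc = describe_nuclei(vol)
```

Extending the accumulator with second moments would also yield each nucleus's orientation, the third descriptor mentioned above.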


Author(s):  
Weiping Liu ◽  
John W. Sedat ◽  
David A. Agard

Any real-world object is three-dimensional. The principle of tomography, which reconstructs the 3-D structure of an object from its 2-D projections at different view angles, has found application in many disciplines. Electron microscopic (EM) tomography of non-ordered structures (e.g., subcellular structures in biology and non-crystalline structures in material science) has been exercised sporadically over the last twenty years or so. As vital as the 3-D structural information is, and with no existing alternative 3-D imaging technique to compete in its high-resolution range, the technique to date remains the domain of a brave few. Its tedious tasks have prevented it from becoming a routine tool. One keyword in promoting its popularity is automation: data collection has been automated in our lab, which can routinely yield a data set of over 100 projections in a matter of a few hours. Now the image processing part is also automated. Such automation makes the job easier, faster, and better.
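The tomography principle stated above can be illustrated at toy scale: recover an image from its projections. Here a 2x2 "object" is reconstructed from its row and column sums by repeated Kaczmarz/ART-style sweeps. Real EM tomography uses many tilt angles and far larger volumes; this is only a sketch of the idea.

```python
# Toy algebraic reconstruction: recover a 2x2 image from its row sums and
# column sums by alternately enforcing each set of projection constraints.
def art_2x2(row_sums, col_sums, iters=10):
    img = [[0.0, 0.0], [0.0, 0.0]]
    for _ in range(iters):
        for r in range(2):                      # enforce row sums
            corr = (row_sums[r] - img[r][0] - img[r][1]) / 2
            img[r][0] += corr; img[r][1] += corr
        for c in range(2):                      # enforce column sums
            corr = (col_sums[c] - img[0][c] - img[1][c]) / 2
            img[0][c] += corr; img[1][c] += corr
    return img

# Projections of the object [[1, 2], [3, 4]]: rows sum to [3, 7], columns to [4, 6].
recon = art_2x2([3.0, 7.0], [4.0, 6.0])
```

Starting from zeros, the sweeps converge to the minimum-norm image consistent with the projections; with only two projection directions the solution is generally non-unique, which is exactly why real tomography collects 100+ views.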


1966 ◽  
Vol 05 (02) ◽  
pp. 67-74 ◽  
Author(s):  
W. I. Lourie ◽  
W. Haenszel

Quality control of data collected in the United States by the Cancer End Results Program, utilizing punchcards prepared by participating registries in accordance with a Uniform Punchcard Code, is discussed. Existing arrangements decentralize responsibility for editing and related data processing to the local registries, with centralization of tabulating and statistical services in the End Results Section, National Cancer Institute. The most recent deck of punchcards represented over 600,000 cancer patients; approximately 50,000 newly diagnosed cases are added annually. Mechanical editing and inspection of punchcards and field audits are the principal tools for quality control. Mechanical editing of the punchcards includes testing for blank entries and detection of inadmissible or inconsistent codes. Highly improbable codes are subjected to special scrutiny. Field audits include the drawing of a 1-10 percent random sample of punchcards submitted by a registry; the charts are then reabstracted and recoded by an NCI staff member, and differences between the punchcard and the results of independent review are noted.
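The two quality-control tools described above can be sketched as a "mechanical edit" that flags blank or inadmissible codes on each record, plus a random sample drawn for field audit. The field name and the admissible code set are hypothetical; the Uniform Punchcard Code itself is not reproduced here.

```python
import random

# Mechanical edit: flag blank entries and inadmissible codes on a record.
ADMISSIBLE_SITE_CODES = {"01", "02", "03"}   # hypothetical code set

def mechanical_edit(record):
    errors = []
    if not record.get("site_code"):
        errors.append("blank site_code")
    elif record["site_code"] not in ADMISSIBLE_SITE_CODES:
        errors.append("inadmissible site_code")
    return errors

# Field audit: draw a 1-10 percent random sample of a registry's records
# for independent reabstracting and recoding.
def audit_sample(records, fraction=0.05, seed=0):
    rng = random.Random(seed)
    k = max(1, round(fraction * len(records)))
    return rng.sample(records, k)
```

A full edit would also cross-check fields for consistency (the "inconsistent codes" test), which requires rules spanning multiple columns of the punchcard.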

