Simultaneous Radar and Satellite Data Storm-Scale Assimilation Using an Ensemble Kalman Filter Approach for 24 May 2011

2015
Vol 143 (1)
pp. 165-194
Author(s):
Thomas A. Jones
David Stensrud
Louis Wicker
Patrick Minnis
Rabindra Palikonda

Abstract Assimilating high-resolution radar reflectivity and radial velocity into convection-permitting numerical weather prediction models has proven to be an important tool for improving forecast skill of convection. The use of satellite data for this application is much less well understood and has only recently received significant attention. Since radar and satellite data provide independent information, combining these two sources of data in a robust manner potentially represents the future of high-resolution data assimilation. This research combines Geostationary Operational Environmental Satellite 13 (GOES-13) cloud water path (CWP) retrievals with Weather Surveillance Radar-1988 Doppler (WSR-88D) reflectivity and radial velocity to examine the impacts of assimilating each for a severe weather event that occurred in Oklahoma on 24 May 2011. Data are assimilated into a 3-km model using an ensemble adjustment Kalman filter approach with 36 members over a 2-h assimilation window between 1800 and 2000 UTC. Forecasts are then generated for 90 min at 5-min intervals starting at 1930 and 2000 UTC. Results show that both satellite and radar data are able to initiate convection, but that assimilating both spins up a storm much faster. Assimilating CWP also performs well at suppressing spurious precipitation and cloud cover in the model and at capturing the anvil characteristics of developed storms, while radar data are most effective at resolving the three-dimensional characteristics of the core convection. Assimilating both satellite and radar data generally results in the best model analysis and the most skillful forecast for this event.
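As a rough illustration of the kind of serial, single-observation update an ensemble adjustment Kalman filter performs when folding a radar observation or satellite retrieval into a model ensemble, consider the sketch below. It is a generic deterministic (square-root) EnKF update under assumed names (serial_eakf_update, obs_value, obs_err_var, h), not the assimilation system used in the paper; real systems also apply covariance localization and inflation and loop over many observations.

```python
import numpy as np

def serial_eakf_update(ens, obs_value, obs_err_var, h):
    """Assimilate one scalar observation (e.g., a reflectivity value or a
    CWP retrieval) into an ensemble of model states.

    ens         : (n_members, n_state) array of state vectors
    obs_value   : observed value
    obs_err_var : observation-error variance
    h           : observation operator mapping a state vector to a scalar
    """
    n_members = ens.shape[0]

    # Prior ensemble in observation space
    hx = np.array([h(member) for member in ens])
    hx_mean, hx_pert = hx.mean(), hx - hx.mean()
    var_hx = hx_pert @ hx_pert / (n_members - 1)

    # Regression of each state variable onto the observed quantity
    x_pert = ens - ens.mean(axis=0)
    cov_x_hx = x_pert.T @ hx_pert / (n_members - 1)
    reg = cov_x_hx / (var_hx + 1e-12)   # small constant guards against zero variance

    # Observation-space increments: shift the mean toward the observation
    # and contract the perturbations by the square-root factor alpha
    alpha = np.sqrt(obs_err_var / (var_hx + obs_err_var))
    obs_incr = (var_hx / (var_hx + obs_err_var)) * (obs_value - hx_mean) \
               + (alpha - 1.0) * hx_pert

    # Regress the observation-space increments back onto the state
    return ens + np.outer(obs_incr, reg)
```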

2011
Vol 139 (11)
pp. 3446-3468
Author(s):
Nathan Snook
Ming Xue
Youngsun Jung

Abstract One of the goals of the National Science Foundation Engineering Research Center (ERC) for Collaborative Adaptive Sensing of the Atmosphere (CASA) is to improve storm-scale numerical weather prediction (NWP) by collecting data with a dense X-band radar network that provides high-resolution low-level coverage, and by assimilating such data into NWP models. During the first spring storm season after the deployment of four radars in the CASA Integrated Project-1 (IP-1) network in southwest Oklahoma, a tornadic mesoscale convective system (MCS) was captured by CASA and surrounding Weather Surveillance Radars-1988 Doppler (WSR-88Ds) on 8–9 May 2007. The MCS moved across northwest Texas and western and central Oklahoma; two tornadoes rated as category 1 on the enhanced Fujita scale (EF-1) and one tornado of EF-0 intensity were reported during the event, just to the north of the IP-1 network. This was the first tornadic convective system observed by CASA. To quantify the impact of CASA radar data on storm-scale NWP, a set of data assimilation experiments was performed using the Advanced Regional Prediction System (ARPS) ensemble Kalman filter (EnKF) system configured with full model physics and high-resolution terrain. Data from the four CASA IP-1 radars and five WSR-88Ds were assimilated in some of the experiments. The ensemble contained 40 members, and radar data were assimilated every 5 min for 1 h. While the assimilation of WSR-88D data alone was able to produce a reasonably accurate analysis of the convective system, assimilating CASA data in addition to WSR-88D data is found to improve the representation of storm-scale circulations, particularly in the lowest few kilometers of the atmosphere, as evidenced by analyses of gust front position and by comparison of simulated radial velocity (Vr) with observations. Assimilating CASA data also decreased the RMS innovation of reflectivity (Z) in the resulting ensemble-mean analyses, particularly in the early assimilation cycles, suggesting that the addition of CASA data allowed the EnKF system to reach a good analysis more quickly. Use of multiple microphysics schemes in the forecast ensemble was found to alleviate underdispersion by increasing the ensemble spread. This work is the first to assimilate real CASA data into an NWP model using the EnKF.
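The diagnostics mentioned above, the RMS innovation of the ensemble-mean analysis and the ensemble spread, reduce to a few lines of code once the observations and ensemble are mapped to the same observation locations. The sketch below uses illustrative array names, not quantities taken from the ARPS EnKF system itself.

```python
import numpy as np

def rms_innovation(obs, ens_at_obs):
    """RMS of (observation minus ensemble mean) in observation space.

    obs        : (n_obs,) observed values, e.g. reflectivity Z
    ens_at_obs : (n_members, n_obs) ensemble states mapped to the
                 observation locations
    """
    innovation = obs - ens_at_obs.mean(axis=0)
    return np.sqrt(np.mean(innovation ** 2))

def ensemble_spread(ens_at_obs):
    """Root-mean ensemble variance in observation space. A spread that is
    consistently smaller than the RMS innovation indicates underdispersion,
    which the study mitigates by varying the microphysics scheme across
    ensemble members."""
    return np.sqrt(np.mean(ens_at_obs.var(axis=0, ddof=1)))
```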


Author(s):  
Di Xian
Peng Zhang
Ling Gao
Ruijing Sun
Haizhen Zhang
...  

Abstract Following the progress of satellite data assimilation in the 1990s, the combination of meteorological satellites and numerical models has changed the way scientists understand the earth. With the evolution of numerical weather prediction models and earth system models, meteorological satellites will play an even more important role in earth sciences in the future. As part of the space-based infrastructure, the Fengyun (FY) meteorological satellites have contributed to earth science sustainability studies through an open data policy and stable data quality since the launch of the first satellite, FY-1A, in 1988. The capability for earth system monitoring was greatly enhanced after the second-generation polar-orbiting FY-3 satellites and geostationary FY-4 satellites were developed, and the quality of the products generated from the FY-3 and FY-4 satellites is comparable to that of the well-known MODIS products. FY satellite data have been used broadly in weather forecasting, climate and climate change investigations, environmental disaster monitoring, and other applications. This article reviews the instruments mounted on the FY satellites. Sensor-dependent level 1 products (radiance data) and inversion-algorithm-dependent level 2 products (geophysical parameters) are introduced. As examples, typical geophysical parameters such as wildfire, lightning, vegetation index, aerosol, soil moisture, and precipitation products are demonstrated and validated against in situ observations and other well-known satellite products. To help users access the FY products, a set of data sharing systems has been developed and is in operation; the newly developed cloud-based data sharing system is shown to improve the efficiency of data delivery.


2008
Vol 136 (3)
pp. 945-963
Author(s):
Jidong Gao
Ming Xue

Abstract A new efficient dual-resolution (DR) data assimilation algorithm is developed based on the ensemble Kalman filter (EnKF) method and tested using simulated radar radial velocity data for a supercell storm. Radar observations are assimilated on both high-resolution and lower-resolution grids using the EnKF algorithm with flow-dependent background error covariances estimated from the lower-resolution ensemble. It is shown that the flow-dependent and dynamically evolved background error covariances thus estimated are effective in producing quality analyses on the high-resolution grid. The DR method has the advantage of significantly reducing the computational cost of the EnKF analysis. In this system, the lower-resolution ensemble provides the flow-dependent background error covariance, while the single high-resolution forecast and analysis provide the benefit of higher resolution, which is important for resolving the internal structures of thunderstorms. The relative smoothness of the covariance obtained from the lower-resolution (4 km) ensemble does not appear to significantly degrade the quality of the analysis, because the cross covariance among different variables is of first-order importance for “retrieving” unobserved variables from the radar radial velocity data. For the DR analysis, an ensemble size of 40 appears to be a reasonable choice when a 4-km horizontal resolution is used for the ensemble and a 1-km resolution for the high-resolution analysis. Several sensitivity tests show that the DR EnKF system is quite robust to different observation errors. Thinning the data to 4-km resolution is an acceptable compromise under the constraints of real-time applications, whereas a data density of 8 km leads to a significant degradation of the analysis.
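A heavily simplified sketch of the dual-resolution idea: estimate the flow-dependent covariance between the observed quantity and the state from the coarse (e.g., 4-km) ensemble, interpolate the resulting gain to the fine (e.g., 1-km) grid, and apply it to the single high-resolution state. The single-variable, single-observation form below, the bilinear interpolation, and all names are illustrative assumptions, not the algorithm as implemented in the paper.

```python
import numpy as np
from scipy.ndimage import zoom  # simple interpolation of the gain field

def dual_res_update(coarse_ens, fine_state, obs_value, obs_err_var,
                    h_coarse, h_fine, ratio):
    """Update one high-resolution 2D field with one scalar observation,
    using covariances estimated from a lower-resolution ensemble.

    coarse_ens : (n_members, ny_c, nx_c) coarse-grid ensemble of one variable
    fine_state : (ny_f, nx_f) high-resolution state of the same variable,
                 assumed to be exactly `ratio` times finer than the coarse grid
    h_coarse   : maps one coarse member to the observed scalar
    h_fine     : maps the fine state to the observed scalar
    ratio      : coarse-to-fine grid-spacing ratio (e.g., 4 for 4 km -> 1 km)
    """
    n_members = coarse_ens.shape[0]
    hx = np.array([h_coarse(m) for m in coarse_ens])
    hx_pert = hx - hx.mean()
    var_hx = hx_pert @ hx_pert / (n_members - 1)

    # Flow-dependent covariance between every coarse grid point and the obs
    pert = coarse_ens - coarse_ens.mean(axis=0)
    cov = np.tensordot(hx_pert, pert, axes=1) / (n_members - 1)  # (ny_c, nx_c)

    # Kalman gain on the coarse grid, interpolated to the fine grid
    gain_coarse = cov / (var_hx + obs_err_var)
    gain_fine = zoom(gain_coarse, ratio, order=1)

    # Innovation is computed against the high-resolution state itself
    innovation = obs_value - h_fine(fine_state)
    return fine_state + gain_fine * innovation
```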


Fire
2021
Vol 4 (3)
pp. 55
Author(s):
Gary L. Achtemeier
Scott L. Goodrick

Abrupt changes in wind direction and speed caused by thunderstorm-generated gust fronts can, within a few seconds, transform slow-spreading, low-intensity flanking fires into high-intensity head fires. Flame heights and spread rates can more than double, fire mitigation strategies are challenged, and the safety of fire crews is put at risk. We propose a class of numerical weather prediction models that incorporate real-time radar data and can provide fire response units with images of accurate very short-range forecasts of gust front locations and intensities. Real-time weather radar data are coupled with a wind model that simulates density currents over complex terrain. We then simulate two convective systems, from formation and merger through gust front arrival at the location of the 2013 wildfire at Yarnell, Arizona, and present maps showing the progress of the gust fronts toward the fire. Such images can be transmitted to fire crews to assist decision-making. We conclude that very short-range gust front prediction models incorporating real-time radar data show promise as a means of providing critical information on gust front propagation for fire operations, and that such tools warrant further study.
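As a toy illustration of the kind of guidance such gust front maps convey, the sketch below estimates when a radar-detected front would reach a fire, given its current position and motion vector. It is not the density-current wind model described above; the straight-line extrapolation, names, and units are illustrative assumptions.

```python
import numpy as np

def gust_front_eta_minutes(front_xy, front_velocity, fire_xy):
    """Estimate minutes until a gust front reaches a fire location.

    front_xy       : (2,) current front position nearest the fire, in km
    front_velocity : (2,) front motion vector, in km/h
    fire_xy        : (2,) fire location, in km

    Returns None if the front is not closing on the fire.
    """
    to_fire = np.asarray(fire_xy, float) - np.asarray(front_xy, float)
    distance = np.linalg.norm(to_fire)
    if distance == 0.0:
        return 0.0
    # Component of the front motion directed toward the fire (km/h)
    closing_speed = np.asarray(front_velocity, float) @ to_fire / distance
    if closing_speed <= 0.0:
        return None
    return 60.0 * distance / closing_speed
```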


2017
Vol 145 (12)
pp. 4911-4936
Author(s):
Jonathan Labriola
Nathan Snook
Youngsun Jung
Bryan Putnam
Ming Xue

Explicit prediction of hail using numerical weather prediction models remains a significant challenge, to which microphysical uncertainties and errors are a major contributor. This study assesses the ability of storm-scale ensemble forecasts using the single-moment Lin or double-moment Milbrandt and Yau microphysical schemes to predict hail during a severe weather event over south-central Oklahoma on 10 May 2010. Radar and surface observations are assimilated using an ensemble Kalman filter (EnKF) at 5-min intervals. Three sets of ensemble forecasts, launched at 15-min intervals, are then produced from EnKF analyses at times ranging from 30 min before the first observed hail to the time of the first observed hail. Forty ensemble members are run at 500-m horizontal grid spacing in both the EnKF assimilation cycles and the subsequent forecasts. Hail forecasts are verified using radar-derived products based on single- and dual-polarization radar data: the maximum estimated size of hail (MESH), hydrometeor classification algorithm (HCA) output, and hail size discrimination algorithm (HSDA) output. The resulting hail forecasts show at most marginal skill, with the level of skill depending on the forecast initialization time and the microphysical scheme used. Forecasts using the double-moment scheme predict many small hailstones aloft, while the single-moment members predict larger hailstones. Near the surface, double-moment members predict larger hailstone sizes than their single-moment counterparts. Hail in the forecasts melts too quickly near the surface for members using either of the microphysics schemes examined. Analysis of microphysical budgets indicates that both schemes represent hail processes suboptimally, adversely impacting the skill of surface hail forecasts.
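Hail forecasts of this kind are commonly verified against radar-derived fields such as MESH with contingency-table scores. The sketch below computes POD, FAR, and CSI for a single hail-size threshold; the 25-mm threshold, the field names, and the assumption of a common verification grid are illustrative, not the settings of the study.

```python
import numpy as np

def hail_contingency_scores(forecast_hail_size_mm, mesh_mm, threshold_mm=25.0):
    """Score forecast hail size against radar-derived MESH on a common grid.

    Returns probability of detection (POD), false-alarm ratio (FAR), and
    critical success index (CSI) for the chosen size threshold.
    """
    fcst = forecast_hail_size_mm >= threshold_mm
    obs = mesh_mm >= threshold_mm

    hits = np.sum(fcst & obs)
    misses = np.sum(~fcst & obs)
    false_alarms = np.sum(fcst & ~obs)

    pod = hits / (hits + misses) if hits + misses else np.nan
    far = false_alarms / (hits + false_alarms) if hits + false_alarms else np.nan
    csi = hits / (hits + misses + false_alarms) if hits + misses + false_alarms else np.nan
    return pod, far, csi
```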


2018
Vol 35 (8)
pp. 1605-1620
Author(s):
Susan Rennie
Peter Steinle
Alan Seed
Mark Curtis
Yi Xiao

Abstract A new quality control system, based primarily on a naïve Bayesian classifier, has been developed to enable the assimilation of radial velocity observations from Doppler radar. The ultimate assessment of this system is the assimilation of observations into a pseudo-operational numerical weather prediction system during the Sydney 2014 Forecast Demonstration Project. A statistical analysis of the observations assimilated during this period provides an assessment of the data quality, which will influence how observations are assimilated in the future and what quality control and observation errors are applicable. This study compares observation-minus-background statistics for radial velocities from precipitation and insect echoes. The results show that, with the applied level of quality control, these echo types have comparable biases. With the latest quality control, clear-air wind observations are apparently of similar quality to those from precipitation and are therefore suitable for use in high-resolution NWP assimilation systems.
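A naïve Bayesian classifier of the general kind described, trained on per-gate radar features to separate precipitation from clear-air (e.g., insect) echoes, can be sketched as follows. The Gaussian likelihood assumption, the feature choices, and the class labels are illustrative and are not the operational configuration used in the study.

```python
import numpy as np

class GaussianNaiveBayes:
    """Minimal Gaussian naive Bayes classifier for radar echo types.

    X is an (n_gates, n_features) array of per-gate features (for example
    reflectivity, spectrum width, and a texture measure); y holds class
    labels such as 'precip' or 'clear_air'.
    """

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.priors_ = np.array([(y == c).mean() for c in self.classes_])
        self.means_ = np.array([X[y == c].mean(axis=0) for c in self.classes_])
        self.vars_ = np.array([X[y == c].var(axis=0) + 1e-9 for c in self.classes_])
        return self

    def predict(self, X):
        # log P(class | features) is proportional to
        # log prior + sum over features of the log Gaussian likelihood
        log_post = []
        for prior, mu, var in zip(self.priors_, self.means_, self.vars_):
            ll = -0.5 * (np.log(2.0 * np.pi * var) + (X - mu) ** 2 / var).sum(axis=1)
            log_post.append(np.log(prior) + ll)
        return self.classes_[np.argmax(log_post, axis=0)]
```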


2018
Vol 10 (10)
pp. 1520
Author(s):
Adrianos Retalis
Dimitris Katsanos
Filippos Tymvios
Silas Michaelides

The Global Precipitation Measurement (GPM) high-resolution precipitation product is validated against rain gauges over the island of Cyprus for a three-year period starting in April 2014. The precipitation estimates are available at high temporal (half-hourly) and spatial (10 km) resolution and combine data from all passive microwave instruments in the GPM constellation. The comparison is twofold: first, the GPM data are compared with the gauge measurements on a monthly basis; then the comparison focuses on extreme events recorded during the first three years of GPM’s operation. The validation is based on ground data from a dense and reliable network of rain gauges, also available at high temporal (hourly) resolution. The first results show very good correlation for monthly values; however, the correspondence of GPM with extreme precipitation varies from “no correlation” to “high correlation”, depending on the case. This study aims to verify the GPM rain estimates, since such a high-resolution dataset has numerous applications, including assimilation into numerical weather prediction models and the study of flash floods with hydrological models.
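A monthly comparison of this sort reduces to aggregating both datasets to the same timescale and location and comparing the totals. The minimal sketch below computes the correlation and bias of monthly totals at a single gauge; the dictionary inputs and the point-to-point matching are simplifying assumptions, whereas the study compares a gauge network with the gridded GPM product.

```python
import numpy as np

def monthly_validation(gpm_halfhourly_mm, gauge_hourly_mm):
    """Compare monthly rainfall totals from GPM and a co-located gauge.

    gpm_halfhourly_mm : dict mapping month -> array of half-hourly GPM
                        accumulations (mm) at the gauge location
    gauge_hourly_mm   : dict mapping month -> array of hourly gauge
                        accumulations (mm)

    Returns the Pearson correlation and the mean bias of the monthly totals.
    """
    months = sorted(set(gpm_halfhourly_mm) & set(gauge_hourly_mm))
    sat = np.array([np.nansum(gpm_halfhourly_mm[m]) for m in months])
    gauge = np.array([np.nansum(gauge_hourly_mm[m]) for m in months])
    correlation = np.corrcoef(sat, gauge)[0, 1]
    bias = np.mean(sat - gauge)
    return correlation, bias
```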


2012
Vol 140 (8)
pp. 2689-2705
Author(s):
Marc Berenguer
Madalina Surcel
Isztar Zawadzki
Ming Xue
Fanyou Kong

Abstract This second part of a two-paper series compares deterministic precipitation forecasts from the Storm-Scale Ensemble Forecast System (4-km grid), run during the 2008 NOAA Hazardous Weather Testbed (HWT) Spring Experiment, and from the Canadian Global Environmental Multiscale (GEM) model (15 km) in terms of their ability to reproduce the average diurnal cycle of precipitation during spring 2008. In addition, radar-based nowcasts generated with the McGill Algorithm for Precipitation Nowcasting Using Semi-Lagrangian Extrapolation (MAPLE) are analyzed to quantify the portion of the diurnal cycle explained by the motion of precipitation systems and to evaluate the potential of the NWP models for very short-term forecasting. The observed diurnal cycle of precipitation during spring 2008 is characterized by the dominance of the 24-h harmonic, whose phase shifts with longitude, consistent with precipitation traveling across the continent. Time–longitude diagrams show that the analyzed NWP models partially reproduce this signal but display more variability than observed by radar in the timing of initiation and in the zonal motion of the precipitation systems. Traditional skill scores show that radar data assimilation is the main reason for differences in model performance, while the analyzed models that do not assimilate radar observations have very similar skill. The analysis of MAPLE forecasts confirms that the motion of precipitation systems is responsible for the dominance of the 24-h harmonic in the longitudinal range 103°–85°W, where 8-h MAPLE forecasts initialized at 0100, 0900, and 1700 UTC successfully reproduce the eastward motion of rainfall systems. On average, MAPLE outperforms the radar-data-assimilating models for the first 3–4 h after initialization and the models without radar data assimilation for up to 5 h after initialization.
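The dominance of the 24-h harmonic can be quantified by fitting the first harmonic of the mean diurnal cycle at each longitude band. The sketch below does this for a single 24-value diurnal cycle with an FFT; the input name and the hourly binning are assumptions, not the paper's exact analysis.

```python
import numpy as np

def diurnal_24h_harmonic(hourly_mean_precip):
    """Fit the 24-h harmonic to a mean diurnal cycle of precipitation.

    hourly_mean_precip : (24,) average precipitation rate for each hour of
                         the day at one location or longitude band

    Returns the harmonic's amplitude, the hour of its maximum, and the
    fraction of the diurnal variance it explains.
    """
    x = np.asarray(hourly_mean_precip, dtype=float)
    coeffs = np.fft.rfft(x - x.mean())
    c1 = coeffs[1]                                    # one cycle per 24 h
    amplitude = 2.0 * np.abs(c1) / x.size
    peak_hour = (-np.angle(c1)) * 24.0 / (2.0 * np.pi) % 24.0

    hours = np.arange(x.size)
    harmonic = amplitude * np.cos(2.0 * np.pi * (hours - peak_hour) / 24.0)
    explained = harmonic.var() / x.var() if x.var() > 0 else np.nan
    return amplitude, peak_hour, explained
```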


2019
Vol 101 (1)
pp. E43-E57
Author(s):
Thomas N. Nipen
Ivar A. Seierstad
Cristian Lussana
Jørn Kristiansen
Øystein Hov

Abstract Citizen weather stations are rapidly increasing in prevalence and are an emerging source of weather information. These low-cost, consumer-grade devices provide observations in real time and form dense networks that capture high-resolution meteorological information. Despite these benefits, their adoption into operational weather prediction systems has been slow. However, MET Norway recently introduced observations from Netatmo’s network of weather stations into the postprocessing of near-surface temperature forecasts for Scandinavia, Finland, and the Baltic countries. The observations are used to continually correct errors in the weather model output caused by unresolved features such as cold pools, inversions, urban heat islands, and an intricate coastline. Corrected forecasts are issued every hour. Integrating citizen observations into operational systems comes with a number of challenges. First, operational systems must be robust and therefore rely on strict quality control procedures to filter out unreliable measurements. Second, postprocessing methods must be selected and tuned to make use of high-resolution data that at times contain conflicting information. Central to resolving these challenges is the need to exploit the massive redundancy of citizen observations, with up to dozens of observations per square kilometer, and to treat the data source as a network rather than a collection of individual stations. We present our experiences with introducing citizen observations into the operational production chain of automated public weather forecasts. Their inclusion shows a clear improvement in the accuracy of short-term temperature forecasts, especially in areas where existing professional stations are sparse.
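Exploiting the redundancy of a dense citizen network typically begins with a spatial consistency ("buddy") check: each observation is compared with its neighbors and rejected if it disagrees too strongly. The sketch below is a generic version of such a check; the radius, tolerance, and flat-earth distance approximation are illustrative assumptions, not MET Norway's operational settings.

```python
import numpy as np

def buddy_check(lats, lons, temps, radius_km=10.0, max_dev_c=5.0, min_buddies=3):
    """Flag temperature observations that disagree with nearby stations.

    Each observation is compared with the median of all other stations
    within `radius_km`; if it deviates by more than `max_dev_c` degrees it
    is rejected. Returns a boolean array, True where the observation passes.
    """
    lats, lons, temps = map(np.asarray, (lats, lons, temps))
    passes = np.ones(temps.size, dtype=bool)
    km_per_deg = 111.0
    for i in range(temps.size):
        dy = (lats - lats[i]) * km_per_deg
        dx = (lons - lons[i]) * km_per_deg * np.cos(np.radians(lats[i]))
        near = np.hypot(dx, dy) <= radius_km
        near[i] = False                      # exclude the station itself
        if near.sum() < min_buddies:
            continue                         # too few buddies: leave as-is
        if abs(temps[i] - np.median(temps[near])) > max_dev_c:
            passes[i] = False
    return passes
```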

