Hybridizing sequential and variational data assimilation for robust high-resolution hydrologic forecasting

2016 · Felipe Hernández, Xu Liang

Abstract. There are two main frameworks for estimating initial states in geophysical models for real-time and forecasting applications: sequential data assimilation and variational data assimilation. However, modern high-resolution models pose challenges, in terms of both indeterminacy and computational requirements, that render most traditional methods insufficient. In this article we introduce a hybrid algorithm called OPTIMISTS which combines advantageous features from both of these data assimilation perspectives. These features are integrated with a multi-objective approach for selecting ensemble members to create a probabilistic estimate of the state variables, which promotes the reduction of observational errors as well as the maintenance of the dynamic consistency of states. Additionally, we propose simplified computations as alternatives aimed at reducing memory and processor requirements. OPTIMISTS was tested on two models of real watersheds, one with over 1,000 variables and the second with over 30,000, using two distributed hydrologic modelling engines: VIC and the DHSVM. Our tests, which consisted of assimilating streamflow observations, allowed us to determine which features of the traditional approaches lead to more accurate forecasts while making efficient use of the available computational resources. The results also demonstrated the benefits of the coupled probabilistic/multi-objective approach, which proved instrumental in reducing the harmful effects of overfitting – especially on the model with higher dimensionality.
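The coupled probabilistic/multi-objective selection described above hinges on retaining ensemble members that trade off two costs rather than optimizing a single one. As a minimal sketch (not the authors' implementation), a Pareto filter over hypothetical candidate states scored on observation error and departure from the background could look like:

```python
import numpy as np

def pareto_front(costs):
    """Return indices of non-dominated rows (minimization in every column)."""
    keep = []
    for i, c in enumerate(costs):
        dominated = any(
            np.all(costs[j] <= c) and np.any(costs[j] < c)
            for j in range(len(costs)) if j != i
        )
        if not dominated:
            keep.append(i)
    return keep

# Hypothetical candidates: columns are (observation error, departure from background)
costs = np.array([[1.0, 3.0],
                  [2.0, 1.0],
                  [3.0, 3.0],   # dominated by the first two candidates
                  [0.5, 4.0]])
print(pareto_front(costs))  # -> [0, 1, 3]
```

Candidates that are worse on both objectives are discarded; the survivors jointly balance fit to the observations against dynamic consistency with the background.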

2011 · Vol 46 (1) · pp. 137-141 · G.M. Baxter, S.L. Dance, A.S. Lawless, N.K. Nichols

2016 · Colette Kerry, Brian Powell, Moninya Roughan, Peter Oke

Abstract. As with other western boundary currents globally, the East Australian Current (EAC) is inherently dynamic, making it a challenge to model and predict. For the EAC region, we combine a high-resolution state-of-the-art numerical ocean model with a variety of traditional and newly available observations using an advanced variational data assimilation scheme. The numerical model is configured using the Regional Ocean Modelling System (ROMS 3.4) and takes boundary forcing from the BlueLink ReANalysis (BRAN3). For the data assimilation we use an Incremental Strong-Constraint 4-Dimensional Variational (IS4D-Var) scheme. This paper describes the data-assimilative model configuration that achieves an optimised minimisation of the difference between the modelled solution and the observations to give a dynamically-consistent `best-estimate' of the ocean state over a 2-year period. The reanalysis is shown to represent both assimilated and non-assimilated observations well. It achieves mean spatially-averaged RMS residuals with the observations of 7 cm for SSH and 0.4 °C for SST over the assimilation period. The time-mean RMS residual for subsurface temperature measured by Argo floats is a maximum of 1 °C between water depths of 100–300 m and smaller throughout the rest of the water column. Velocities at several offshore and continental shelf moorings are well represented in the reanalysis, with complex correlations between 0.8 and 1 for all observations in the upper 500 m. Surface radial velocities from a high-frequency radar array are assimilated, and the reanalysis provides surface velocity estimates with complex correlations with observed velocities of 0.8–1 across the radar footprint.
Comparison with independent (non-assimilated) shipboard CTD cast observations shows a marked improvement in the representation of the subsurface ocean in the reanalysis, with the RMS residual in potential density reduced to about half of the residual with the free-running model in the upper eddy-influenced part of the water column. This shows that information is successfully propagated from observed variables to unobserved regions as the assimilation system uses the model dynamics to determine covariance, such that the ocean state better fits and is in balance with the observations. This is the first study to generate a reanalysis of the region at such a high resolution, making use of an unprecedented observational data set and using an assimilation method that uses the time-evolving model physics to adjust the model in a dynamically consistent way. As such, the reanalysis potentially represents a marked improvement in our ability to capture important circulation dynamics in the EAC. The reanalysis is being used to study EAC dynamics and observation impact in state estimation, and as forcing for a variety of downscaling studies.
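The incremental strong-constraint 4D-Var scheme described above minimises a quadratic cost over a state increment propagated by the tangent-linear model. As an illustrative toy (a single state variable with assumed covariances and a linear model, not the ROMS configuration), the optimal increment can be solved for directly:

```python
import numpy as np

# Toy incremental 4D-Var: one state variable, tangent-linear model M,
# identity observation operator, observations at two later times.
B = np.array([[1.0]])          # background error covariance (assumed)
R = np.array([[0.5]])          # observation error covariance (assumed)
M = np.array([[0.9]])          # tangent-linear model over one time step
d = [np.array([0.4]), np.array([0.2])]   # innovations y - H(x_b) at t1, t2

# Propagators from analysis time to each observation time
Ms = [M, M @ M]
# Minimise J(dx) = 1/2 dx' B^-1 dx + 1/2 sum_k (M_k dx - d_k)' R^-1 (M_k dx - d_k)
A = np.linalg.inv(B) + sum(Mk.T @ np.linalg.inv(R) @ Mk for Mk in Ms)
b = sum(Mk.T @ np.linalg.inv(R) @ dk for Mk, dk in zip(Ms, d))
dx = np.linalg.solve(A, b)
print(dx)  # analysis increment, about 0.266
```

In the full system this minimisation is performed iteratively with the tangent-linear and adjoint models rather than with explicit matrix inverses.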


2020 · Vol 148 (7) · pp. 2819-2836 · Frédéric Fabry, Véronique Meunier

Abstract. Although radar is our most useful tool for monitoring severe weather, the benefits of assimilating its data are often short-lived. To understand why, we documented the assimilation requirements, the data characteristics, and the common practices that could hinder optimum data assimilation by traditional approaches. Within storms, radars provide dense measurements of a few highly variable storm outcomes (precipitation and wind) in atmospherically unstable conditions. However, statistical relationships between errors of observed and unobserved quantities often become nonlinear because the errors in these areas tend to become large rapidly. Beyond precipitating areas lie large regions for which radars provide limited new information, yet whose properties will soon shape the outcome of future storms. For those areas, any innovation must consequently be projected from sometimes distant precipitating areas. Thus, radar data assimilation must contend with a double need at odds with many traditional assimilation implementations: correcting in-storm properties with complex errors while projecting information at unusually far distances outside precipitating areas. To further complicate the issue, other data properties and practices, such as assimilating reflectivity in logarithmic units, are not optimal for correcting all state variables. Therefore, many characteristics of radar measurements and common practices of their assimilation are incompatible with necessary conditions for successful data assimilation. Facing these dataset-specific challenges may force us to consider new approaches that use the available information differently.
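The point about logarithmic reflectivity units can be made concrete. Reflectivity is assimilated in dBZ, i.e. Z_dB = 10 log10(Z), so an error that is symmetric in dBZ is strongly asymmetric in the linear quantity that relates to precipitation content. A small sketch:

```python
def dbz_to_linear(dbz):
    """Reflectivity factor Z (mm^6 m^-3) from its logarithmic form in dBZ."""
    return 10.0 ** (dbz / 10.0)

# A symmetric +/-3 dB error around 40 dBZ is asymmetric in linear Z:
z_mid = dbz_to_linear(40.0)   # 10000.0
z_lo  = dbz_to_linear(37.0)   # about 5012
z_hi  = dbz_to_linear(43.0)   # about 19953
print(z_mid - z_lo, z_hi - z_mid)  # the upward error is roughly twice the downward one
```

A Gaussian error model fitted in dBZ therefore implies a skewed, non-Gaussian error in linear reflectivity, which is one way assimilation assumptions break down inside storms.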


2015 · Vol 47 (5) · pp. 051401 · Yoichi Ishikawa, Teiji In, Satoshi Nakada, Kei Nishina, Hiromichi Igarashi, ...

2012 · Vol 12 (5) · pp. 13515-13552 · Z. Li, Z. Zang, Q. B. Li, Y. Chao, D. Chen, ...

Abstract. A three-dimensional variational data assimilation (3-DVAR) algorithm for aerosols in a WRF/Chem model is presented. The WRF/Chem model uses the MOSAIC (Model for Simulating Aerosol Interactions and Chemistry) scheme, which explicitly treats eight major species (elemental/black carbon, organic carbon, nitrate, sulfate, chloride, ammonium, sodium, and the sum of other inorganic, inert mineral and metal species) and represents size distributions using a sectional method with four size bins. The 3-DVAR scheme is formulated to take advantage of the MOSAIC scheme in providing comprehensive analyses of species concentrations and size distributions. To treat the large number of state variables associated with the MOSAIC scheme, this 3-DVAR algorithm first determines the analysis increments of the total mass concentrations of the eight species, defined as the sum of the mass concentrations across all size bins, and then distributes the analysis increments over the four size bins according to the background error variances. The number concentrations for each size bin are adjusted based on the ratios between the mass and number concentrations of the background state. This system has been applied to the analysis and prediction of PM2.5 in the Los Angeles basin during the CalNex 2010 field experiment, with assimilation of surface PM2.5 and speciated concentration observations. The results demonstrate that the data assimilation significantly reduces the errors in comparison with a downscaling simulation and improves forecasts of the concentrations of PM2.5 as well as individual species for up to 24 h. Some implementation difficulties and limitations of the system are also discussed.
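The two-step distribution of analysis increments can be sketched for a single species. The numbers below are hypothetical, not MOSAIC output; they only illustrate the mechanics described in the abstract:

```python
import numpy as np

# Hypothetical background for one species over four size bins
mass_bg = np.array([2.0, 4.0, 3.0, 1.0])    # mass concentration per bin (ug m^-3)
num_bg  = np.array([50.0, 20.0, 5.0, 1.0])  # number concentration per bin (cm^-3)
bg_var  = np.array([0.4, 0.8, 0.6, 0.2])    # background error variance per bin

total_increment = 1.0   # analysis increment of the species' total mass

# Step 1: distribute the total-mass increment across bins by background error variance
bin_increment = total_increment * bg_var / bg_var.sum()
mass_an = mass_bg + bin_increment

# Step 2: adjust number concentrations, keeping the background number-to-mass ratio
num_an = num_bg * mass_an / mass_bg
print(bin_increment)  # -> [0.2 0.4 0.3 0.1]
print(num_an)         # -> [55.  22.   5.5  1.1]
```

Bins with larger background uncertainty absorb more of the increment, and each bin's number concentration is scaled consistently with its analysed mass.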


2017 · Felipe Hernández, Xu Liang

Abstract. The success of real-time estimation and forecasting applications based on geophysical models has been possible thanks to the two main frameworks for the determination of the models’ initial conditions: Bayesian data assimilation and variational data assimilation. However, while there have been efforts to unify these two paradigms, existing attempts struggle to fully leverage the advantages of both in order to face the challenges posed by modern high-resolution models – mainly related to model indeterminacy and steep computational requirements. In this article we introduce a hybrid algorithm called OPTIMISTS (Optimized PareTo Inverse Modeling through Integrated STochastic Search) which is targeted at non-linear high-resolution problems and brings together ideas from particle filters, 4-dimensional variational methods, evolutionary Pareto optimization, and kernel density estimation in a unique way. Streamflow forecasting experiments were conducted to test which specific parameterizations of OPTIMISTS led to higher predictive accuracy. The experiments analysed two watersheds, one at low resolution using the VIC (Variable Infiltration Capacity) model and one at high resolution using the DHSVM (Distributed Hydrology Soil Vegetation Model). By selecting kernel-based non-parametric sampling, non-sequential evaluation of candidate particles, and the multi-objective minimization of departures from the streamflow observations and from the background states, OPTIMISTS was shown to outperform a particle filter and a 4D variational method. Moreover, the experiments demonstrated that OPTIMISTS scales well in high-resolution cases without imposing a significant computational overhead and that it was successful in mitigating the harmful effects of overfitting. With these combined advantages, the algorithm shows the potential to increase the accuracy and efficiency of operational prediction systems for the improved management of natural resources.
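One of the ingredients named above, kernel-based non-parametric sampling, can be sketched minimally: new candidate states are drawn from a Gaussian kernel density centred on the weighted particles. The ensemble, weights, and bandwidth below are illustrative assumptions, not values from OPTIMISTS:

```python
import numpy as np

rng = np.random.default_rng(0)

def kernel_sample(particles, weights, bandwidth, n, rng):
    """Draw n new states from a Gaussian kernel density over weighted particles."""
    weights = np.asarray(weights) / np.sum(weights)
    idx = rng.choice(len(particles), size=n, p=weights)   # pick kernels by weight
    noise = rng.normal(0.0, bandwidth, size=(n, particles.shape[1]))
    return particles[idx] + noise                          # jitter around the kernel centre

# Hypothetical 3-variable state ensemble
particles = np.array([[0.1, 0.2, 0.3],
                      [0.4, 0.1, 0.2],
                      [0.2, 0.3, 0.1]])
new = kernel_sample(particles, [0.5, 0.3, 0.2], bandwidth=0.05, n=10, rng=rng)
print(new.shape)  # -> (10, 3)
```

Unlike plain resampling, the kernel jitter lets candidate states fall between existing particles, which is what makes the sampling non-parametric rather than a discrete re-draw.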


2009 · Vol 78 (2) · pp. 237-248 · Yoichi Ishikawa, Toshiyuki Awaji, Takahiro Toyoda, Teiji In, Kei Nishina, ...
