On Optimal and Simultaneous Stochastic Perturbations with Application to Estimation of High-Dimensional Matrix and Data Assimilation in High-Dimensional Systems

Author(s):  
Hong Son Hoang ◽  
Remy Baraille
2020 ◽  
Vol 21 (9) ◽  
pp. 2023-2039
Author(s):  
Dikra Khedhaouiria ◽  
Stéphane Bélair ◽  
Vincent Fortin ◽  
Guy Roy ◽  
Franck Lespinas

Abstract Consistent and continuous fields provided by precipitation analyses are valuable for hydrometeorological applications and land data assimilation modeling, among others. Providing uncertainty estimates is a logical step in the analysis development, and a consistent approach to reach this objective is the production of an ensemble analysis. In the present study, a 6-h High-Resolution Ensemble Precipitation Analysis (HREPA) was developed for the domain covering Canada and the northern part of the contiguous United States. The data assimilation system is the same as the Canadian Precipitation Analysis (CaPA) and is based on optimal interpolation (OI). Precipitation from the Canadian national 2.5-km atmospheric prediction system constitutes the background field of the analysis, while at-site records and radar quantitative precipitation estimates (QPE) compose the observation datasets. By using stochastic perturbations, multiple random realizations of the observations and background field were generated to subsequently feed the data assimilation system and provide 24 HREPA members plus one control run. Based on one summer and one winter experiment, HREPA capabilities in terms of bias and skill were verified against at-site observations for different climatic regions. The results indicated HREPA’s reliability and skill for almost all types of precipitation events in winter, and for precipitation of medium intensity in summer. For both seasons, HREPA displayed resolution and sharpness. The overall good performance of HREPA and the lack of ensemble precipitation analysis (PA) at such spatiotemporal resolution in the literature motivate further investigations on transitional seasons and more advanced perturbation approaches.
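The perturbed-observation OI workflow described above can be illustrated with a minimal NumPy sketch. This is a generic illustration, not the CaPA/HREPA implementation: the grid size, covariance shapes, and length scales below are invented for the example, whereas operational systems estimate B and R from innovation statistics.

```python
import numpy as np

rng = np.random.default_rng(42)

def oi_update(xb, yo, H, B, R):
    """Optimal interpolation analysis: xa = xb + K (yo - H xb),
    with gain K = B H^T (H B H^T + R)^{-1}."""
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
    return xb + K @ (yo - H @ xb)

# Invented dimensions: 100 grid points, 20 observation sites, 24 members.
n, p, n_members = 100, 20, 24
H = np.zeros((p, n))
H[np.arange(p), rng.choice(n, size=p, replace=False)] = 1.0  # point observations
dist = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
B = np.exp(-dist / 10.0)        # correlated background errors (invented scale)
R = 0.5 * np.eye(p)             # uncorrelated observation errors

xb = rng.standard_normal(n)            # background field (e.g., model precipitation)
yo = H @ xb + rng.standard_normal(p)   # synthetic gauge/radar observations

Lb, Lr = np.linalg.cholesky(B), np.linalg.cholesky(R)
control = oi_update(xb, yo, H, B, R)   # unperturbed control analysis
members = [
    oi_update(xb + Lb @ rng.standard_normal(n),   # stochastically perturbed background
              yo + Lr @ rng.standard_normal(p),   # stochastically perturbed observations
              H, B, R)
    for _ in range(n_members)
]
ensemble = np.stack(members)           # analogous to the 24 HREPA members
```

Drawing the perturbations from the same B and R used in the analysis is what makes the resulting ensemble spread a meaningful uncertainty estimate.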


Author(s):  
Rohita H. Jagdale ◽  
Sanjeevani K. Shah

In video Super Resolution (SR), the cost of attaining enhanced spatial resolution, the computational complexity, and the difficulty of handling motion blur make video SR a complex task. Moreover, maintaining temporal consistency is crucial to achieving an efficient and robust video SR model. This paper develops an intelligent SR model for video frames. Initially, the video frames in RGB format are transformed into HSV. In general, the improvement of video frames is performed in the V channel to achieve High-Resolution (HR) videos. In order to enhance the RGB pixels, the current window size is enlarged to a high-dimensional window size. As a novelty, this paper formulates a high-dimensional matrix with enriched pixel intensity in the V channel to produce enhanced HR video frames. Estimating the enriched pixels in the high-dimensional matrix is complex; here it is addressed through two steps: (i) motion estimation and (ii) cubic spline interpolation with deblurring or sharpening. As the main contribution, the cubic spline interpolation process is enhanced via optimization, selecting the optimal resolution factor and the cubic spline parameters. For this optimal tuning, the paper introduces a modified Rider Optimization Algorithm (ROA) named Mean Fitness-ROA (MF-ROA). Once the HR frame is attained, the HSV channels are recombined and converted back to RGB, yielding the enhanced output video frame. Finally, the performance of the proposed work is compared with other state-of-the-art models with respect to BRISQUE, SDME and ESSIM measures, demonstrating its superiority.
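A rough sketch of the frame-enhancement pipeline (HSV conversion, cubic interpolation of the V channel, sharpening, recombination) is given below using OpenCV. The MF-ROA optimization of the spline parameters is the paper's contribution and is not reproduced here; plain bicubic resampling and unsharp masking stand in for the tuned steps, and the function name and parameters are hypothetical.

```python
import cv2
import numpy as np

def upscale_frame(frame_bgr, scale=2.0, sharpen_amount=1.5):
    """Upscale one video frame by enhancing the V (intensity) channel,
    as a rough analogue of the paper's pipeline. Bicubic interpolation
    stands in for the MF-ROA-optimized cubic spline step."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    h, s, v = cv2.split(hsv)
    size = (int(v.shape[1] * scale), int(v.shape[0] * scale))
    # Cubic interpolation on the V channel only.
    v_hr = cv2.resize(v, size, interpolation=cv2.INTER_CUBIC)
    # Unsharp masking as a simple deblurring/sharpening step.
    blurred = cv2.GaussianBlur(v_hr, (0, 0), sigmaX=1.0)
    v_hr = cv2.addWeighted(v_hr, 1.0 + sharpen_amount, blurred, -sharpen_amount, 0)
    # H and S are upscaled without enhancement, then recombined.
    h_hr = cv2.resize(h, size, interpolation=cv2.INTER_CUBIC)
    s_hr = cv2.resize(s, size, interpolation=cv2.INTER_CUBIC)
    return cv2.cvtColor(cv2.merge([h_hr, s_hr, v_hr]), cv2.COLOR_HSV2BGR)
```

A full model would add motion estimation and compensation across neighboring frames to preserve temporal consistency; this sketch treats each frame independently.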


2020 ◽  
Author(s):  
Milija Zupanski

High-dimensional ensemble data assimilation applications require error covariance localization in order to address the problem of insufficient degrees of freedom, typically accomplished using observation-space covariance localization. However, this creates a challenge for vertically integrated observations, such as satellite radiances and aerosol optical depth, since an exact vertical observation location does not exist. For nonlinear problems, observation-space localization also implies an inconsistency in iterative minimization, which effectively prevents finding the optimal global minimizing solution. Using state-space localization, however, in principle resolves both issues associated with observation-space localization.

In this work we present a new nonlinear ensemble data assimilation method that employs covariance localization in state space and finds an optimal analysis solution. The new method resembles “modified ensembles” in the sense that the ensemble size is increased in the analysis, but it differs in the methodology used to create the ensemble modifications, calculate the analysis error covariance, and define the initial ensemble perturbations for data assimilation cycling. From a practical point of view, the new method is considerably more efficient and potentially applicable to realistic high-dimensional data assimilation problems. A distinct characteristic of the new algorithm is that the localized error covariance and the minimization are global, i.e. explicitly defined over all state points. The presentation will focus on examining feasible options for estimating the analysis error covariance and for defining the initial ensemble perturbations.
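The contrast the abstract draws can be made concrete with a minimal state-space (Schur-product) localization sketch. This is the textbook construction, not the presenter's new algorithm; the Gaussian taper, cutoff, and dimensions are assumptions for illustration.

```python
import numpy as np

def sample_covariance(X):
    """Ensemble sample covariance; X has shape (n_state, n_members)."""
    Xp = X - X.mean(axis=1, keepdims=True)
    return Xp @ Xp.T / (X.shape[1] - 1)

def state_space_localize(P, dist, length_scale):
    """Schur (elementwise) product of the sample covariance with a taper
    defined on state-point distances. Because the taper lives in state
    space, no vertical observation location is ever needed."""
    rho = np.exp(-(dist / length_scale) ** 2)   # simple Gaussian taper
    rho[dist > 2 * length_scale] = 0.0          # hard cutoff mimics compact support
    return P * rho

# Invented dimensions: 200 state points, 20 members.
n, m = 200, 20
rng = np.random.default_rng(0)
X = rng.standard_normal((n, m))
dist = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
P_loc = state_space_localize(sample_covariance(X), dist, length_scale=15.0)
```

Observation-space localization would instead taper the covariances between observations and state points, which is ill-posed when an observation, such as a radiance, integrates over an entire column.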


2016 ◽  
Vol 144 (1) ◽  
pp. 409-427 ◽  
Author(s):  
Julian Tödter ◽  
Paul Kirchgessner ◽  
Lars Nerger ◽  
Bodo Ahrens

Abstract This work assesses the large-scale applicability of the recently proposed nonlinear ensemble transform filter (NETF) in data assimilation experiments with the NEMO ocean general circulation model. The new filter constitutes a second-order exact approximation to fully nonlinear particle filtering. Thus, it relaxes the Gaussian assumption contained in ensemble Kalman filters. The NETF applies an update step similar to the local ensemble transform Kalman filter (LETKF), which allows for efficient and simple implementation. Here, simulated observations are assimilated into a simplified ocean configuration that exhibits globally high-dimensional dynamics with a chaotic mesoscale flow. The model climatology is used to initialize an ensemble of 120 members. Owing to the use of a realistic oceanic observation scenario, the number of observations in each local filter update is of the same order as the ensemble size. Here, an importance sampling particle filter (PF) would require at least 10⁶ members. Despite the relatively small ensemble size, the NETF remains stable and converges to the truth. In this setup, the NETF achieves at least the performance of the LETKF. However, it requires a longer spinup period because the algorithm relies only on the particle weights at the analysis time. These findings show that the NETF can successfully deal with a large-scale assimilation problem in which the local observation dimension is of the same order as the ensemble size. Thus, the second-order exact NETF does not suffer from the PF’s curse of dimensionality, even in a deterministic system.
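A simplified NETF analysis step, following the structure of the Tödter and Ahrens formulation, might look as follows. The optional random rotation of the transform is omitted, and the √m scaling convention is one common choice; dimensions in the usage example are invented.

```python
import numpy as np

def netf_update(X, y, H, R_inv):
    """One NETF analysis step (simplified): particle weights give the
    analysis mean, and a square-root transform of diag(w) - w w^T matches
    the weighted covariance, which is the second-order exactness property."""
    n, m = X.shape
    D = y[:, None] - H @ X                       # per-member innovations, (p, m)
    logw = -0.5 * np.sum(D * (R_inv @ D), axis=0)
    w = np.exp(logw - logw.max())
    w /= w.sum()                                 # normalized particle weights
    xa = X @ w                                   # weighted analysis mean
    A = np.diag(w) - np.outer(w, w)              # symmetric positive semidefinite
    vals, vecs = np.linalg.eigh(A)
    T = vecs @ np.diag(np.sqrt(np.clip(vals, 0.0, None))) @ vecs.T
    Xp = X - X.mean(axis=1, keepdims=True)       # forecast perturbations
    return xa[:, None] + np.sqrt(m) * (Xp @ T)   # sqrt(m) matches Pa ~ Xa' Xa'^T / m

# Toy usage with invented dimensions (state 50, ensemble 120, observations 40):
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 120))
Hop = np.eye(40, 50)                             # observe the first 40 variables
y = rng.standard_normal(40)
Xa = netf_update(X, y, Hop, np.eye(40))
```

As in the LETKF, the update is an ensemble-space transform, which is why the NETF inherits its efficient local implementation; only the weights replace the Kalman gain algebra.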


2016 ◽  
Vol 190 ◽  
pp. 25-34 ◽  
Author(s):  
Dong Wang ◽  
Haipeng Shen ◽  
Young Truong

2021 ◽  
Vol 28 (4) ◽  
pp. 633-649
Author(s):  
Yumeng Chen ◽  
Alberto Carrassi ◽  
Valerio Lucarini

Abstract. Data assimilation (DA) aims at optimally merging observational data and model outputs to create a coherent statistical and dynamical picture of the system under investigation. Indeed, DA aims at minimizing the effect of observational and model error and at distilling the correct ingredients of its dynamics. DA is of critical importance for the analysis of systems featuring sensitive dependence on the initial conditions, as chaos wins over any finitely accurate knowledge of the state of the system, even in the absence of model error. Clearly, the skill of DA is guided by the properties of the dynamical system under investigation, as optimally merging observational data and model outputs is harder when strong instabilities are present. In this paper we reverse the usual angle on the problem and show that it is indeed possible to use the skill of DA to infer some basic properties of the tangent space of the system, which may be hard to compute in very high-dimensional systems. Here, we focus our attention on the first Lyapunov exponent and the Kolmogorov–Sinai entropy and perform numerical experiments on the Vissio–Lucarini 2020 model, a recently proposed generalization of the Lorenz 1996 model that is able to describe in a simple yet meaningful way the interplay between dynamical and thermodynamical variables.
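The tangent-space quantities the paper targets can be estimated directly in low-dimensional testbeds. The sketch below computes the first Lyapunov exponent of the Lorenz 1996 model with a Benettin-style renormalization scheme; the Vissio–Lucarini 2020 equations are not reproduced here, so L96 stands in as the chaotic testbed, and the step counts are arbitrary.

```python
import numpy as np

def l96_tendency(x, F=8.0):
    """Lorenz 1996 tendencies: dx_i/dt = (x_{i+1} - x_{i-2}) x_{i-1} - x_i + F."""
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F

def rk4_step(x, dt, F=8.0):
    k1 = l96_tendency(x, F)
    k2 = l96_tendency(x + 0.5 * dt * k1, F)
    k3 = l96_tendency(x + 0.5 * dt * k2, F)
    k4 = l96_tendency(x + dt * k3, F)
    return x + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

def first_lyapunov(n=40, dt=0.01, n_steps=50_000, eps=1e-8):
    """Benettin-style estimate: track a tiny perturbation, renormalize it
    every step, and average the logarithmic growth rates."""
    rng = np.random.default_rng(1)
    x = rng.standard_normal(n)
    for _ in range(5000):                 # spin up onto the attractor
        x = rk4_step(x, dt)
    xp = x + eps * rng.standard_normal(n)
    log_growth = 0.0
    for _ in range(n_steps):
        x, xp = rk4_step(x, dt), rk4_step(xp, dt)
        d = np.linalg.norm(xp - x)
        log_growth += np.log(d / eps)
        xp = x + (eps / d) * (xp - x)     # renormalize the perturbation
    # For F = 8 this converges near the commonly reported value of about 1.7.
    return log_growth / (n_steps * dt)
```

The paper's point is that in very high-dimensional systems such direct tangent-linear computations become impractical, and DA skill offers an indirect route to the same quantities.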


2017 ◽  
Vol 21 (6) ◽  
pp. 2637-2647 ◽  
Author(s):  
Sujay V. Kumar ◽  
Jiarui Dong ◽  
Christa D. Peters-Lidard ◽  
David Mocko ◽  
Breogán Gómez

Abstract. Accurate specification of the model error covariances in data assimilation systems is a challenging issue. Ensemble land data assimilation methods rely on stochastic perturbations of input forcing and model prognostic fields for developing representations of input model error covariances. This article examines the limitations of using a single forcing dataset for specifying forcing uncertainty inputs for assimilating snow depth retrievals. Using an idealized data assimilation experiment, the article demonstrates that the use of hybrid forcing input strategies (either through the use of an ensemble of forcing products or through the added use of the forcing climatology) provides a better characterization of the background model error, which leads to improved data assimilation results, especially during the snow accumulation and melt periods. The use of hybrid forcing ensembles is then employed for assimilating snow depth retrievals from the AMSR2 instrument over two domains in the continental USA with different snow evolution characteristics. Over a region near the Great Lakes, where the snow evolution tends to be ephemeral, the use of hybrid forcing ensembles provides significant improvements relative to the use of a single forcing dataset. Over the Colorado headwaters, characterized by large snow accumulation, the impact of using the forcing ensemble is less prominent and is largely limited to the snow transition periods. The results of the article demonstrate that improving the background model error through the use of a forcing ensemble enables the assimilation system to better incorporate the observational information.
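The single-product versus hybrid forcing strategies can be caricatured as below. The blending weight, coefficient of variation, and gamma-distributed precipitation fields are invented for the example; the article's actual perturbation settings are not specified in the abstract.

```python
import numpy as np

rng = np.random.default_rng(7)

def single_product_ensemble(precip, n_members, cv=0.3):
    """Ensemble from one forcing dataset: unit-mean lognormal multiplicative
    noise only, so every member inherits the product's systematic errors."""
    sigma = np.sqrt(np.log(1.0 + cv**2))
    noise = rng.lognormal(mean=-0.5 * sigma**2, sigma=sigma,
                          size=(n_members,) + precip.shape)
    return precip * noise

def hybrid_ensemble(products, climatology, n_members, cv=0.3, w_clim=0.2):
    """Hybrid strategy in the article's spirit: members cycle through an
    ensemble of forcing products blended with the forcing climatology before
    the stochastic perturbation, widening the background error spread."""
    members = []
    for k in range(n_members):
        base = products[k % len(products)]
        base = (1.0 - w_clim) * base + w_clim * climatology
        members.append(single_product_ensemble(base, 1, cv)[0])
    return np.stack(members)

# Hypothetical 6-hourly precipitation fields from two products on a small grid.
p1 = rng.gamma(2.0, 1.5, (10, 10))
p2 = rng.gamma(2.0, 1.8, (10, 10))
ens = hybrid_ensemble([p1, p2], climatology=0.5 * (p1 + p2), n_members=12)
```

The point of the hybrid construction is that inter-product differences carry structural forcing errors that no amount of noise around a single product can represent.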


2020 ◽  
Author(s):  
Bertrand Cluzet ◽  
Matthieu Lafaysse ◽  
Marie Dumont ◽  
Emmanuel Cosme ◽  
Clément Albergel

In mountainous areas, detailed snowpack models are essential to capture the high spatio-temporal variability of the snowpack. This task is highly challenging, and models suffer from large simulation errors. In these regions, in-situ observations are scarce, while remote sensing observations are generally patchy owing to complex physiographic features (steep slopes, forests, shadows, ...) and weather conditions (clouds). This stresses the need for a spatially coherent data assimilation system able to propagate the information into unobserved locations.

In this study, we present CRAMPON (CRocus with AssiMilation of snowPack ObservatioNs), an ensemble data assimilation system ingesting snowpack observations in a spatialized context. CRAMPON quantifies snowpack modelling uncertainties with an ensemble and reduces them using a particle filter. Stochastic perturbations of meteorological forcings and the multi-physical version of the Crocus snowpack model (ESCROC) are used to build the ensemble. Two variants of the Sequential Importance Resampling particle filter (PF) were implemented to tackle the common PF degeneracy issue that arises when assimilating a large number of observations. In the first approach (so-called global approach), the observational information is spread across topographic conditions by seeking a global analysis. Degeneracy is mitigated by inflating the observation error covariance matrix, with the side effect of reducing the impact of the assimilation. In the second approach (klocal), we propagate the information and mitigate degeneracy through a localisation of the PF based on background correlation patterns between topographic conditions.

Here, we investigate the ability of CRAMPON to globally benefit from partial observations in a conceptual semi-distributed domain which accounts for the main features of topography-induced snowpack variability. We compare simulations without assimilation with experiments assimilating synthetic observations of snow height and VIS/NIR reflectance. This setup demonstrates the ability of CRAMPON to spread the information from various snow observations into unobserved locations.
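The degeneracy-mitigation idea behind the global variant (inflating the observation error covariance to flatten the particle weights) can be sketched with a generic SIR particle filter. Dimensions, error variances, and the inflation factor below are assumptions for illustration, not CRAMPON's settings.

```python
import numpy as np

def sir_weights(ensemble_obs, y, r_var, inflation=1.0):
    """Gaussian likelihood weights for a Sequential Importance Resampling PF.
    Inflating the observation error variance flattens the weights and
    mitigates degeneracy, at the cost of a weaker analysis."""
    r = r_var * inflation
    d2 = np.sum((ensemble_obs - y[None, :]) ** 2, axis=1)
    logw = -0.5 * d2 / r
    w = np.exp(logw - logw.max())
    return w / w.sum()

def resample(particles, w, rng):
    """Systematic resampling: duplicate high-weight members, drop the rest."""
    m = len(w)
    u = (rng.random() + np.arange(m)) / m
    idx = np.searchsorted(np.cumsum(w), u)
    return particles[idx]

rng = np.random.default_rng(3)
m, p = 160, 50                                        # members, observations (invented)
particles = rng.standard_normal((m, 200))             # snowpack state ensemble
obs_idx = rng.choice(200, p, replace=False)           # observed state indices
y = particles[0, obs_idx] + 0.1 * rng.standard_normal(p)  # synthetic truth + noise
w = sir_weights(particles[:, obs_idx], y, r_var=0.01, inflation=4.0)
n_eff = 1.0 / np.sum(w**2)                            # effective sample size diagnostic
particles = resample(particles, w, rng)
```

The klocal variant described above would instead compute such weights separately per topographic class, propagating observation impact to unobserved classes through background correlation patterns.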

