Coupled Thermo-Hydro-Geochemical Models of Engineered Barrier Systems: The Febex Project

2000 ◽  
Vol 663 ◽  
Author(s):  
J. Samper ◽  
R. Juncosa ◽  
V. Navarro ◽  
J. Delgado ◽  
L. Montenegro ◽  
...  

FEBEX (Full-scale Engineered Barrier EXperiment) is a demonstration and research project dealing with the bentonite engineered barrier designed for sealing and containment of waste in a high-level radioactive waste repository (HLWR). It includes two main experiments: an in situ full-scale test performed at the Grimsel Test Site (GTS) and a mock-up test operating since February 1997 at CIEMAT facilities in Madrid (Spain) [1,2,3]. One of the objectives of FEBEX is the development and testing of conceptual and numerical models for the thermal, hydrodynamic, and geochemical (THG) processes expected to take place in engineered clay barriers. A significant improvement in coupled THG modeling of the clay barrier has been achieved, both in terms of a better understanding of THG processes and of more sophisticated THG computer codes. The ability of these models to reproduce the observed THG patterns over a wide range of THG conditions enhances confidence in their predictive capabilities. Numerical THG models of heating and hydration experiments performed on small-scale laboratory cells provide excellent results for temperatures, water inflow, and final water content in the cells [3]. Calculated concentrations at the end of the experiments reproduce most of the patterns in the measured data. In general, the fit to concentrations of dissolved species is better than that to exchanged cations. These models were later used to simulate the evolution of the large-scale experiments (in situ and mock-up). Some thermo-hydrodynamic hypotheses and bentonite parameters were slightly revised during TH calibration of the mock-up test. The results of the reference model simultaneously reproduce the observed water inflows and the bentonite temperatures and relative humidities. Although the model is highly sensitive to one-at-a-time variations in model parameters, the possibility of parameter combinations leading to similar fits cannot be precluded. The TH model of the in situ test is based on the same bentonite TH parameters and assumptions as the mock-up test. Granite parameters were slightly modified during the calibration process in order to reproduce the observed thermal and hydrodynamic evolution. The reference model properly captures relative humidities and temperatures in the bentonite [3]. It also reproduces the observed spatial distribution of water pressures and temperatures in the granite. Once the TH aspects of the model had been calibrated, predictions of the THG evolution of both tests were performed. Data from the dismantling of the in situ test, planned for the summer of 2001, will provide a unique opportunity to test and validate current THG models of the EBS.
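The one-at-a-time sensitivity screening mentioned above can be illustrated with a minimal, generic sketch. The forward model, parameter names, and values below are placeholders, not the FEBEX THG codes or their calibrated parameters:

```python
# Generic one-at-a-time (OAT) sensitivity screening sketch.
# Placeholder forward model and parameters -- not the FEBEX THG codes.

REFERENCE = {"permeability": 1.0, "porosity": 0.40, "thermal_conductivity": 1.2}

def misfit(params):
    """Stand-in for running a coupled TH simulation and comparing it to
    observed temperatures, inflows, and relative humidities."""
    return sum((params[k] - REFERENCE[k]) ** 2 for k in REFERENCE)

base = dict(REFERENCE)

for name in base:
    for factor in (0.9, 1.1):  # perturb each parameter by +/-10%, one at a time
        trial = dict(base, **{name: base[name] * factor})
        print(f"{name} x{factor:.1f}: misfit = {misfit(trial):.4f}")
```

A real calibration would replace the toy misfit with a full TH simulation run; the OAT loop itself, however, is exactly the kind of perturbation study the abstract refers to, and it cannot detect compensating parameter combinations, which is why the authors note that similar fits from other combinations cannot be precluded.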

2020 ◽  
Author(s):  
Yuan Yuan ◽  
Lei Lin

Satellite image time series (SITS) classification is a major research topic in remote sensing and is relevant for a wide range of applications. Deep learning approaches have been commonly employed for SITS classification and have provided state-of-the-art performance. However, deep learning methods suffer from overfitting when labeled data is scarce. To address this problem, we propose a novel self-supervised pre-training scheme to initialize a Transformer-based network by utilizing large-scale unlabeled data. Specifically, the model is trained to predict randomly contaminated observations given the entire time series of a pixel. The main idea of our proposal is to leverage the inherent temporal structure of satellite time series to learn general-purpose spectral-temporal representations related to land cover semantics. Once pre-training is completed, the pre-trained network can be further adapted to various SITS classification tasks by fine-tuning all the model parameters on small-scale task-related labeled data. In this way, the general knowledge and representations about SITS can be transferred to a label-scarce task, thereby improving the generalization performance of the model as well as reducing the risk of overfitting. Comprehensive experiments have been carried out on three benchmark datasets over large study areas. Experimental results demonstrate the effectiveness of the proposed method, yielding classification accuracy gains ranging from 1.91% to 6.69%. This work has been submitted to the IEEE for possible publication. Copyright may be transferred without notice, after which this version may no longer be accessible.


2020 ◽  
Author(s):  
Yuan Yuan ◽  
Lei Lin

Satellite image time series (SITS) classification is a major research topic in remote sensing and is relevant for a wide range of applications. Deep learning approaches have been commonly employed for SITS classification and have provided state-of-the-art performance. However, deep learning methods suffer from overfitting when labeled data is scarce. To address this problem, we propose a novel self-supervised pre-training scheme to initialize a Transformer-based network by utilizing large-scale unlabeled data. Specifically, the model is trained to predict randomly contaminated observations given the entire time series of a pixel. The main idea of our proposal is to leverage the inherent temporal structure of satellite time series to learn general-purpose spectral-temporal representations related to land cover semantics. Once pre-training is completed, the pre-trained network can be further adapted to various SITS classification tasks by fine-tuning all the model parameters on small-scale task-related labeled data. In this way, the general knowledge and representations about SITS can be transferred to a label-scarce task, thereby improving the generalization performance of the model as well as reducing the risk of overfitting. Comprehensive experiments have been carried out on three benchmark datasets over large study areas. Experimental results demonstrate the effectiveness of the proposed method, yielding classification accuracy gains ranging from 2.38% to 5.27%. The code and the pre-trained model will be available at https://github.com/linlei1214/SITS-BERT upon publication. This work has been submitted to the IEEE for possible publication. Copyright may be transferred without notice, after which this version may no longer be accessible.
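The masked-observation pre-training idea can be sketched as pairing random contamination of a pixel's time series with a reconstruction loss at the corrupted steps. The snippet below is a minimal sketch assuming a generic Transformer encoder in PyTorch; the class name, layer sizes, and noise parameters are illustrative assumptions, not the authors' implementation (which is at the repository above):

```python
# Minimal sketch of masked-observation pre-training for SITS
# (illustrative only; not the SITS-BERT reference code).
import torch
import torch.nn as nn

class SITSEncoder(nn.Module):
    """Toy Transformer encoder over one pixel's time series (hypothetical sizes)."""
    def __init__(self, n_bands=10, d_model=64, n_layers=2, n_heads=4):
        super().__init__()
        self.embed = nn.Linear(n_bands, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, n_bands)  # reconstruct spectral bands

    def forward(self, x):  # x: (batch, time, bands)
        return self.head(self.encoder(self.embed(x)))

def contaminate(x, rate=0.15, noise_std=0.5):
    """Perturb a random fraction of observations; return noisy input and mask."""
    mask = torch.rand(x.shape[:2]) < rate  # (batch, time) positions to corrupt
    noisy = x.clone()
    noisy[mask] += noise_std * torch.randn_like(x)[mask]
    return noisy, mask

model = SITSEncoder()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.rand(32, 24, 10)  # unlabeled batch: 24 acquisition dates, 10 bands
noisy, mask = contaminate(x)
opt.zero_grad()
loss = ((model(noisy) - x)[mask] ** 2).mean()  # regress originals at corrupted steps
loss.backward()
opt.step()
```

After pre-training, fine-tuning would replace the reconstruction head with a classification head and continue training on the small labeled set, in line with the scheme the abstract describes.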


Author(s):  
Matthias Rempel

Sunspots are central to our understanding of solar (and stellar) magnetism in many respects. On the large scale, they link the magnetic field observable in the photosphere to the dynamo processes operating in the solar interior. Properly interpreting the constraints that sunspots impose on the dynamo process requires a detailed understanding of the processes involved in their formation, dynamical evolution and decay. On the small scale, they provide insight into how convective energy transport interacts with the magnetic field over a wide range of field strengths and inclination angles, leading to sunspot fine structure observed in the form of umbral dots and penumbral filaments. Over the past decade, substantial progress has been made on both observational and theoretical sides. Advanced ground- and space-based observations have resolved, for the first time, the details of umbral dots and penumbral filaments and discovered similarities in their substructures. Numerical models have advanced to the degree that simulations of entire sunspots, with sufficient resolution to resolve sunspot fine structure, are feasible. A combination of improved helioseismic inversion techniques with seismic forward modelling provides new views on the subsurface structure of sunspots. In this review, we summarize recent progress, with particular focus on numerical modelling.


2012 ◽  
Vol 8 (S294) ◽  
pp. 225-236
Author(s):  
M. Hanasz ◽  
D. Woltanski ◽  
K. Kowalik

We review recent developments in amplification models of galactic and intergalactic magnetic fields. The most popular scenarios involve a variety of physical mechanisms, including turbulence generation over a wide range of physical scales, supernova effects, buoyancy, and the magnetorotational instability. Other models rely on galaxy interactions, which generate galactic and intergalactic magnetic fields during galaxy mergers. We also present global galactic-scale numerical models of the cosmic-ray (CR) driven dynamo, originally proposed by Parker (1992). We conduct a series of direct CR+MHD numerical simulations of the dynamics of the interstellar medium (ISM), composed of gas, magnetic fields, and CR components. We take into account CRs accelerated in randomly distributed supernova (SN) remnants, and assume that SNe deposit small-scale, randomly oriented, dipolar magnetic fields into the ISM. The amplification timescale of the large-scale magnetic field resulting from the CR-driven dynamo is comparable to the galactic rotation period. The process efficiently converts the small-scale magnetic fields of SN remnants into galactic-scale magnetic fields. The resulting magnetic field structure resembles the X-shaped magnetic fields observed in edge-on galaxies.


2019 ◽  
Vol 862 ◽  
pp. 672-695 ◽  
Author(s):  
Timour Radko

A theoretical model is developed that illustrates the dynamics of layering instability, frequently realized in ocean regions with active fingering convection. Thermohaline layering is driven by the interplay between large-scale stratification and primary double-diffusive instabilities operating at the microscale, i.e., at temporal and spatial scales set by molecular dissipation. This interaction is described by a combination of direct numerical simulations and an asymptotic multiscale model. The multiscale theory is used to formulate explicit and dynamically consistent flux laws, which can be readily implemented in large-scale analytical and numerical models. Most previous theoretical investigations of thermohaline layering were based on the flux-gradient model, which assumes that the vertical transport of density components is uniquely determined by their local background gradients. The key deficiency of this approach is that layering instabilities predicted by the flux-gradient model have unbounded growth rates at high wavenumbers. The resulting ultraviolet catastrophe precludes the analysis of such basic properties of layering instability as its preferred wavelength or maximal growth rate. The multiscale model, on the other hand, incorporates hyperdiffusion terms that stabilize short layering modes. Overall, the presented theory carries the triple advantage of (i) offering an explicit description of the interaction between microstructure and layering modes, (ii) taking into account the influence of non-uniform stratification on microstructure-driven mixing, and (iii) avoiding unphysical behaviour of the flux-gradient laws at small scales. While the multiscale approach to the parametrization of time-dependent small-scale processes is illustrated here using fingering convection as an example, we expect the proposed technique to be readily adaptable to a wide range of applications.
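As a schematic illustration (a hedged sketch, not the paper's multiscale derivation), the ultraviolet catastrophe of the flux-gradient model and its hyperdiffusive regularization can be captured by a generic growth-rate law for a layering mode of vertical wavenumber $k$:

$$\lambda(k) = A k^{2} - \nu k^{4}, \qquad A,\ \nu > 0,$$

where the anti-diffusive term $A k^{2}$ stands for the destabilizing flux-gradient response and $\nu k^{4}$ is a hyperdiffusion term of the kind supplied by the multiscale model. With $\nu = 0$, $\lambda(k)$ grows without bound as $k \to \infty$, which is the ultraviolet catastrophe described above. With $\nu > 0$, the fastest-growing mode occurs at $k_{*} = \sqrt{A/(2\nu)}$ with maximal growth rate $\lambda_{\max} = A^{2}/(4\nu)$, so both a preferred wavelength and a maximal growth rate become well defined.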


2002 ◽  
Vol 757 ◽  
Author(s):  
S. Vomvoris ◽  
B. Lanyon ◽  
P. Marschall ◽  
K. Ando ◽  
T. Adachi ◽  
...  

The Gas Migration Test (GMT) in the engineered barrier system (EBS) investigates the migration of waste-generated gas from low- and intermediate-level waste in a silo-type disposal concept. The EBS has now been emplaced, and saturation was initiated in August 2001. The saturation patterns show heterogeneity within and between the different layers of the EBS. Plans for the remaining test sequence are also presented.


Author(s):  
D.M. Seyedi ◽  
C. Plúa ◽  
M. Vitel ◽  
G. Armand ◽  
J. Rutqvist ◽  
...  

2021 ◽  
Vol 8 (1) ◽  
Author(s):  
Sungmin O. ◽  
Rene Orth

While soil moisture information is essential for a wide range of hydrologic and climate applications, spatially continuous soil moisture data are only available from satellite observations or model simulations. Here we present SoMo.ml, a global, long-term dataset of soil moisture derived through machine learning trained with in-situ measurements. We train a Long Short-Term Memory (LSTM) model to extrapolate daily soil moisture dynamics in space and time, based on in-situ data collected from more than 1,000 stations across the globe. SoMo.ml provides multi-layer soil moisture data (0–10 cm, 10–30 cm, and 30–50 cm) at 0.25° spatial and daily temporal resolution over the period 2000–2019. The performance of the resulting dataset is evaluated through cross-validation and inter-comparison with existing soil moisture datasets. SoMo.ml performs especially well in terms of temporal dynamics, making it particularly useful for applications requiring time-varying soil moisture, such as anomaly detection and memory analyses. Given its distinct derivation, SoMo.ml complements the existing suite of modelled and satellite-based datasets, supporting large-scale hydrological, meteorological, and ecological analyses.
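As a sketch of the kind of model described, the snippet below trains a small LSTM to map daily forcing sequences to multi-layer soil moisture at the three depths listed above. It is a minimal illustration: the input variables, layer sizes, and training setup are assumptions, not the published SoMo.ml configuration:

```python
# Illustrative LSTM soil-moisture regressor in the spirit of SoMo.ml
# (hypothetical inputs and sizes; not the published model configuration).
import torch
import torch.nn as nn

class SoilMoistureLSTM(nn.Module):
    def __init__(self, n_forcings=5, hidden=128, n_depths=3):
        super().__init__()
        self.lstm = nn.LSTM(n_forcings, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_depths)  # 0-10, 10-30, 30-50 cm layers

    def forward(self, forcing):  # forcing: (batch, days, n_forcings)
        h, _ = self.lstm(forcing)
        return self.head(h)      # daily multi-layer soil moisture estimates

model = SoilMoistureLSTM()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
forcing = torch.rand(16, 365, 5)  # one year of daily forcing per sample
obs = torch.rand(16, 365, 3)      # in-situ targets at three depths
opt.zero_grad()
loss = nn.functional.mse_loss(model(forcing), obs)
loss.backward()
opt.step()
```

Once trained on station data, such a model can be run on gridded forcing to produce spatially continuous estimates, which is the extrapolation-in-space-and-time step the abstract describes.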

