Assimilating sparse data in glaciological inverse problems

Author(s):  
Daniel Shapero ◽  
Reuben Nixon-Hill

Most of the existing work on solving inverse problems in glaciology has assumed that the observational data used to constrain the model are spatially dense. This assumption is very convenient because it means that the model-data misfit term in the objective functional can be written as an integral. In many scenarios, however, the computational mesh can locally be much finer than the observational grid, or the observations can have large patches of missing data. Moreover, treating the observations as a globally-defined continuous field obscures valuable information about the number of independent measurements we have, making it impossible to apply a posteriori sanity checks on the expected model-data misfit from regression theory. Here we'll describe some recent work we've done on assimilating sparse point data into ice flow models and how this allows us to be more rigorous about the statistical interpretation of our results. For now we are focusing on the kinds of inverse problems that have been solved in the glaciology literature for a long time -- inferring rheology and basal friction from surface velocities. But these developments open up the possibility of assimilating new sources of data, such as measurements from strain gauges or ice cores.
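With N discrete observations, the misfit becomes a weighted sum over points rather than an integral, and regression theory then tells us what value to expect at the optimum. A minimal numpy sketch of that idea; the function names are ours for illustration, not anything from the authors' code:

```python
import numpy as np

def point_misfit(u_model_at_obs, u_obs, sigma):
    """Model-data misfit as a sum over N independent point measurements,
    each weighted by its standard error, instead of an area integral."""
    r = (u_model_at_obs - u_obs) / sigma
    return 0.5 * np.sum(r**2)

def misfit_is_plausible(J, n_obs, tol=3.0):
    """A posteriori sanity check: if the observational errors are well
    characterized, 2*J at the optimum should behave like a chi-squared
    variable with mean n_obs and standard deviation sqrt(2*n_obs)."""
    return abs(2.0 * J - n_obs) < tol * np.sqrt(2.0 * n_obs)
```

Counting the observations explicitly is exactly what the continuous-field idealization throws away: without N, the chi-squared check above has no degrees of freedom to test against.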

1978 ◽  
Vol 20 (82) ◽  
pp. 3-26 ◽  
Author(s):  
C.U. Hammer ◽  
H. B. Clausen ◽  
W. Dansgaard ◽  
N. Gundestrup ◽  
S. J. Johnsen ◽  
...  

Abstract The available methods for dating ice cores are based on radioactive decay, ice-flow calculations, or stratigraphic observations. The two former categories are broadly outlined, and special emphasis is given to stratigraphic methods. Reference horizons are established back to A.D. 1783 in the form of elevated electrical conductivities due to fallout of soluble volcanic debris. Seasonal variations in the concentrations of insoluble microparticles and/or stable isotopes are measured over the entire 400 m lengths of three ice cores recovered by the Greenland Ice Sheet Program (GISP). The resulting absolute time scales are probably accurate to within a few years per thousand. Techniques are outlined for re-establishing the approximate original shape of heavy-isotope profiles that have been more or less smoothed by diffusion in firn and ice. Annual-layer thickness measurements on 24 increments down to 1130 m depth in the Camp Century ice core determine a flow pattern consistent with that suggested by Dansgaard and Johnsen (1969), and a Camp Century time scale with an estimated uncertainty of less than 3% back to 10,000 years B.P.
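The flow-model side of such a time scale follows from integrating the reciprocal of the annual-layer thickness from the depth of interest up to the surface. A sketch using the layer-thinning profile commonly attributed to Dansgaard and Johnsen (1969); the parameter values below are made up for illustration and are not the Camp Century values:

```python
import numpy as np

def dj_layer_thickness(z, H, h, lam_surf):
    """Dansgaard-Johnsen annual layer thickness at height z above the bed:
    linear in z above the 'kink' height h, quadratic below it."""
    return np.where(
        z >= h,
        lam_surf * (2.0 * z - h) / (2.0 * H - h),
        lam_surf * z**2 / (h * (2.0 * H - h)),
    )

def age_at_height(z, H, h, lam_surf, n=20_000):
    """Age in years: integrate 1/layer_thickness from z to the surface."""
    zp = np.linspace(z, H, n)
    f = 1.0 / dj_layer_thickness(zp, H, h, lam_surf)
    return float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(zp)))  # trapezoid rule

# Illustrative numbers only: 1400 m of ice, kink 400 m above the bed,
# 0.35 m of ice per year accumulating at the surface.
print(age_at_height(z=270.0, H=1400.0, h=400.0, lam_surf=0.35))
```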


2018 ◽  
Author(s):  
Oliver Bothe ◽  
Sebastian Wagner ◽  
Eduardo Zorita

Abstract. Climate reconstructions are means to extract the signal from uncertain paleo-observations, i.e. proxies. It is essential to evaluate these reconstructions to understand and quantify their uncertainties. Similarly, comparing climate simulations and proxies requires approaches to bridge the, e.g., temporal and spatial differences between the two and to address their specific uncertainties. One way to achieve both goals is so-called pseudoproxies. These are surrogate proxy records within, e.g., the virtual reality of a climate simulation. They in turn depend on an understanding of the uncertainties of the real proxies, i.e. the noise characteristics disturbing the original environmental signal. Common pseudoproxy approaches have so far concentrated on data with high temporal resolution from, e.g., tree rings or ice cores over approximately the last 2,000 years. Here we provide a simple but flexible noise model for potentially low-resolution sedimentary climate proxies for temperature on millennial time scales, the code for calculating a set of pseudoproxies from a simulation, and, for one simulation, the pseudoproxies themselves. The noise model considers the influence of other environmental variables, a dependence on the climate state, a bias due to changing seasonality, modifications of the archive (e.g., bioturbation), potential sampling variability, and a measurement error. Model, code, and data should allow the development of new ways of comparing simulation data with proxies on long time scales. Code and data are available at https://doi.org/10.17605/OSF.IO/ZBEHX.
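The listed noise sources map naturally onto a forward operator applied to a simulated temperature series. A minimal sketch of such an operator, with placeholder parameter values rather than the authors' calibrated choices, and with the climate-state dependence omitted for brevity (the real code is at the DOI above):

```python
import numpy as np

rng = np.random.default_rng(42)

def pseudoproxy(temp, other_env, w_env=0.3, season_bias=0.0,
                smooth_window=5, n_samples=30, sigma_meas=0.2):
    """Degrade a simulated temperature series into a pseudoproxy.

    temp          : simulated temperature series (the target signal)
    other_env     : another environmental variable bleeding into the proxy
    w_env         : weight of the non-temperature influence
    season_bias   : scalar or series representing a changing-seasonality bias
    smooth_window : width of a running mean standing in for bioturbation
    n_samples     : number of, e.g., shells measured per sample
    sigma_meas    : standard deviation of the measurement error
    """
    signal = (1.0 - w_env) * temp + w_env * other_env + season_bias
    kernel = np.ones(smooth_window) / smooth_window
    signal = np.convolve(signal, kernel, mode="same")  # archive smoothing
    sampling = rng.normal(0.0, temp.std() / np.sqrt(n_samples), temp.shape)
    measurement = rng.normal(0.0, sigma_meas, temp.shape)
    return signal + sampling + measurement
```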


2000 ◽  
Vol 54 (3) ◽  
pp. 348-358 ◽  
Author(s):  
Valérie Masson ◽  
Françoise Vimeux ◽  
Jean Jouzel ◽  
Vin Morgan ◽  
Marc Delmotte ◽  
...  

A comparison is made of the Holocene records obtained from water isotope measurements along 11 ice cores from coastal and central sites in East Antarctica (Vostok, Dome B, Plateau Remote, Komsomolskaia, Dome C, Taylor Dome, Dominion Range, D47, KM105, and Law Dome) and West Antarctica (Byrd), with temporal resolution from 20 to 50 yr. The long-term trends possibly reflect local ice-sheet elevation fluctuations superimposed on common climatic fluctuations. All the records confirm the widespread Antarctic early Holocene optimum between 11,500 and 9000 yr; in the Ross Sea sector, a secondary optimum is identified between 7000 and 5000 yr, whereas all East Antarctic sites show a late optimum between 6000 and 3000 yr. Superimposed on the long-term trend, all the records exhibit nine aperiodic millennial-scale oscillations. Climatic optima show a reduced pacing between warm events (typically 800 yr), whereas cooler periods are associated with less-frequent warm events (pacing >1200 yr).


1999 ◽  
Vol 45 (149) ◽  
pp. 154-164 ◽  
Author(s):  
C. David Chadwell

Abstract Measurement of glacier surface velocity provides some constraint on glacier flow models used to date ice cores recovered near the flow divide of remote high-altitude ice caps. The surface velocity is inferred from the change in position of a network of stakes estimated from the least-squares adjustment of geodetic observations – terrestrial and/or space-based – collected approximately 1 year apart. The lack of outliers in, and the random distribution of, the post-fit observation residuals are regarded as evidence that the observations contain no blunders. However, if the network lacks sufficient geometric redundancy, the estimated stake positions can shift to fit erroneous observations. To determine the maximum size of these potential undetected shifts, given the covariance of the observations and the approximate network geometry, expressions are developed to analyze a network for redundancy number and marginally detectable blunders (internal reliability), and the position shifts from marginally detectable blunders (external reliability). Two stake networks, one on the col of Huascarán (9°07'S, 77°37'W; 6050 m a.s.l.) in the north-central Andes of Peru and one on the Guliya ice cap (35°17'N, 81°29'E; 6200 m a.s.l.) on the Qinghai–Tibetan Plateau in China, are examined for precision and internal and external reliability.
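In standard (Baarda-style) reliability theory, for linearized observation equations l = Ax + v with weight matrix P, the redundancy numbers are the diagonal of R = I − A(AᵀPA)⁻¹AᵀP, and the marginally detectable blunder in observation i is δ₀σᵢ/√rᵢ. A numpy sketch of that textbook machinery, which may differ in detail from the paper's exact expressions:

```python
import numpy as np

def network_reliability(A, sigma, delta0=4.13):
    """Internal and external reliability of a least-squares network.

    A      : design matrix of the linearized observation equations
    sigma  : standard deviations of the observations
    delta0 : non-centrality parameter of the blunder test (4.13 is a
             common choice for 0.1% significance and 80% power)
    """
    sigma = np.asarray(sigma)
    P = np.diag(1.0 / sigma**2)                     # weight matrix
    N_inv = np.linalg.inv(A.T @ P @ A)
    R = np.eye(A.shape[0]) - A @ N_inv @ A.T @ P    # redundancy matrix
    r = np.diag(R)                                  # redundancy numbers
    mdb = delta0 * sigma / np.sqrt(r)               # marginally detectable blunders
    # External reliability: row i is the shift in the estimated parameters
    # caused by an undetected blunder of size mdb[i] in observation i.
    shifts = ((N_inv @ A.T @ P) * mdb).T
    return r, mdb, shifts
```

Observations with small redundancy numbers are poorly checked by the rest of the network: their marginally detectable blunders are large, and the resulting position shifts can be correspondingly large.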


2020 ◽  
Vol 22 (5) ◽  
pp. 1093-1121
Author(s):  
J. Fernández-Pato ◽  
S. Martínez-Aranda ◽  
M. Morales-Hernández ◽  
P. García-Navarro

Abstract Culverts allow roads to safely traverse small streams or drainage ditches, and their proper design is critical to ensure a safe and reliable transportation network. Correct modelling of these hydraulic structures becomes crucial when assessing flood footprints or estimating discharge peaks in a risk evaluation plan. The question of how to include culverts comes up frequently when assembling a hydraulic model that must represent as many singular elements as possible. In this work, three different culvert integrations with the surface domain are studied and compared in the context of a 2D shallow water (SW) model. All of them are based on the Federal Highway Administration (FHWA) formulation for culvert discharge estimation but differ in complexity and in how they interact with the numerical model for surface flow, some of them acting as internal boundary conditions. Several steady and unsteady validation test cases are presented, and the numerical results are compared with the predictions from the HEC-RAS 1D and HY-8 software. The culvert area and shape, and their sensitivity to the 2D computational mesh, are also analyzed.
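For reference, the FHWA (HDS-5) inlet-control formulation that such integrations build on includes a submerged-inlet equation of the form HW/D = c·(Q/(A·√D))² + Y − 0.5·S, with c and Y tabulated per inlet configuration. A sketch in English units, as the FHWA charts are; the constants below are placeholders to be looked up for the actual inlet:

```python
import math

def inlet_control_headwater(Q, D, A, S, c=0.0398, Y=0.67):
    """Headwater depth (ft) for a submerged culvert inlet, using the
    commonly quoted FHWA HDS-5 form HW/D = c*(Q/(A*sqrt(D)))**2 + Y - 0.5*S.
    Valid roughly when Q/(A*sqrt(D)) >= 4.0; c and Y are placeholder
    values standing in for the tabulated constants of the actual
    inlet edge configuration.

    Q : discharge (ft^3/s)    D : barrel interior height (ft)
    A : barrel area (ft^2)    S : barrel slope (ft/ft)
    """
    return D * (c * (Q / (A * math.sqrt(D)))**2 + Y - 0.5 * S)
```

In a 2D model the interesting part is not this formula but where it lives: as a source/sink pair between cells, as an internal boundary condition, or as a separate structure model exchanging discharge with the surface flow, which is the comparison the abstract describes.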


2014 ◽  
Vol 10 (3) ◽  
pp. 2547-2594
Author(s):  
L. B. Stap ◽  
R. S. W. van de Wal ◽  
B. de Boer ◽  
R. Bintanja ◽  
L. J. Lourens

Abstract. During the Cenozoic, land ice and climate have interacted on many different time scales. On long time scales, the effect of land ice on global climate and sea level is mainly set by the large ice sheets on North America, Eurasia, Greenland and Antarctica. The climatic forcing of these ice sheets is largely determined by the meridional temperature profile resulting from radiation and greenhouse gas (GHG) forcing. In response, the ice sheets increase albedo and surface elevation, which operates as a feedback in the climate system. To quantify the importance of these climate-land ice processes, a zonally averaged energy balance climate model is coupled to five one-dimensional ice-sheet models representing the major ice sheets. In this study, we focus on the transient simulation of the past 800 000 years, where a high-confidence CO2 record from ice-core samples is used as input in combination with Milankovitch radiation changes. We obtain simulations of atmospheric temperature, ice volume and sea level that are in good agreement with recent proxy-data reconstructions. We examine long-term climate-ice sheet interactions by comparing simulations with uncoupled and coupled ice sheets. We show that these interactions amplify global temperature anomalies by up to a factor of 2.6, and that they increase polar amplification by 94%. We demonstrate that, on these long time scales, the ice-albedo feedback has a larger and more global influence on the meridional atmospheric temperature profile than the surface-height temperature feedback. Furthermore, we assess the influence of CO2 and insolation by performing runs with one or both of these variables held constant. We find that atmospheric temperature is controlled by a complex interaction of CO2 and insolation, and both variables serve as thresholds for Northern Hemispheric glaciation.
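The ice-albedo feedback on the meridional temperature profile can be illustrated with a toy Budyko-type zonal energy balance model; this is a generic textbook construction with placeholder coefficients, not the authors' model:

```python
import numpy as np

# Toy Budyko-type zonal EBM: solar input per latitude band balances
# outgoing longwave radiation plus heat exchange with the global mean.
# Ice-covered bands (T < -10 C) get a high albedo, which cools them
# further -- the feedback the abstract quantifies.
lat = np.deg2rad(np.linspace(-85.0, 85.0, 18))
s = 1.0 - 0.48 * 0.5 * (3.0 * np.sin(lat)**2 - 1.0)  # insolation weighting
Q, A_lw, B_lw, C = 342.0, 202.0, 1.9, 3.8            # W/m^2 (per K for B, C)

T = np.zeros_like(lat)
w = np.cos(lat) / np.cos(lat).sum()                   # area weights
for _ in range(200):
    albedo = np.where(T < -10.0, 0.62, 0.30)
    T_new = (Q * s * (1.0 - albedo) - A_lw + C * np.sum(w * T)) / (B_lw + C)
    T = 0.5 * T + 0.5 * T_new                         # underrelax for stability
```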


2010 ◽  
Vol 437 ◽  
pp. 492-496 ◽  
Author(s):  
Lei Chen ◽  
Zhuang De Jiang ◽  
Bing Li ◽  
Jian Jun Ding ◽  
Fei Zhang

In reverse engineering, complex free-form shaped parts are usually digitized quickly and accurately using the newly arisen non-contact measuring methods. However, these methods produce extremely dense point data at a great rate. Not all the point data are necessary for generating a surface CAD model. Moreover, owing to inefficiencies in storing and manipulating so many points, it takes a long time to generate a surface CAD model from the measured data. Therefore, an important task is to reduce the large amount of data. After analyzing the existing methods developed by other researchers, a new data reduction method based on bi-directional point cloud slicing is presented in this paper. Using the proposed method, a point cloud can be reduced while considering geometric features in both parametric directions. Finally, a face model is used to verify the effectiveness of the proposed method, and experimental results are given.
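One plausible reading of bi-directional slicing, sketched below: slice the cloud along one parametric direction, thin each slice with a chordal-deviation test, then repeat along the other direction, so features in both directions survive. This is our illustrative reconstruction, not the authors' algorithm:

```python
import numpy as np

def thin_slice(pts, tol):
    """Keep a point only if it lies farther than tol from the chord
    between the last kept point and the next point."""
    keep = [0]
    for i in range(1, len(pts) - 1):
        a, b = pts[keep[-1]], pts[i + 1]
        t = np.dot(pts[i] - a, b - a) / (np.dot(b - a, b - a) + 1e-12)
        dist = np.linalg.norm(pts[i] - (a + t * (b - a)))
        if dist > tol:
            keep.append(i)
    keep.append(len(pts) - 1)
    return pts[keep]

def bidirectional_reduce(points, n_slices=40, tol=0.05):
    """Reduce an (N, 3) point cloud by slicing along x, then along y."""
    for axis in (0, 1):
        edges = np.linspace(points[:, axis].min(), points[:, axis].max(),
                            n_slices + 1)
        bins = np.clip(np.digitize(points[:, axis], edges) - 1,
                       0, n_slices - 1)
        out = []
        for b in range(n_slices):
            sl = points[bins == b]
            if len(sl) < 3:
                out.append(sl)
                continue
            sl = sl[np.argsort(sl[:, 1 - axis])]  # order along the slice
            out.append(thin_slice(sl, tol))
        points = np.vstack([o for o in out if len(o)])
    return points
```

Points on flat regions fall close to the chords and are discarded, while points on highly curved regions survive both passes, which is how the method preserves geometric features in both parametric directions.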

