Uncertainty Estimation for Magnetic Maps

Author(s):  
Richard Saltus ◽  
Arnaud Chulliat ◽  
Brian Meyer ◽  
Christopher Amante

Magnetic maps depict spatial variations in the Earth's magnetic field. These variations occur at a wide range of scales and are produced by a variety of physical processes related to factors including the structure and evolution of the Earth's core field and the geologic distribution of magnetic minerals in the lithosphere. Magnetic maps have been produced for hundreds of years with increasing fidelity and accuracy, and there is a general understanding (particularly among the geophysicists who produce and use these maps) of their approximate resolution and accuracy. However, few magnetic maps, or the digital grids that typically underpin them, have been produced with accompanying uncertainty quantification. When uncertainty is addressed, it is typically a statistical representation at the grid or survey level (e.g., ±10 nT overall uncertainty based on line crossings for a modern airborne survey) rather than at the cell-by-cell local level.

As magnetic map data are increasingly used in complex inversions and in combination with other data or constraints (including in machine learning applications), it is increasingly important to quantify the uncertainties in these data. An example of an application that needs detailed uncertainty estimation is the use of magnetic map information for alternative navigation. In this application, data from an onboard magnetometer are compared with previously mapped (or modeled) magnetic variations. The uncertainty of this previously mapped information has immediate implications for the potential accuracy of navigation.

We are exploring the factors contributing to magnetic map uncertainty and producing uncertainty estimates that can be tested against new data collected in previously mapped (or modeled) areas. These factors include (but are likely not limited to) the vintage and type of measured data, the spatial distribution of measured data, the expectation of magnetic variability (e.g., geologic or geochemical environment), the statistics of redundant measurements, and the spatial scale/resolution of the magnetic map or model. The purpose of this talk is to discuss the overall issue, present our initial results, and solicit feedback and ideas from the interpretation community.
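The abstract gives no combination formula for these factors; as a minimal sketch, assuming each factor can be expressed as an independent, roughly Gaussian per-cell error grid, the contributions might be combined in quadrature. All array names and values below are hypothetical.

```python
import numpy as np

# Hypothetical per-cell error grids (nT), one per uncertainty factor;
# the names, values, and independence assumption are illustrative only.
data_age_err = np.array([[2.0, 5.0], [8.0, 3.0]])   # vintage/type of measured data
coverage_err = np.array([[1.0, 6.0], [12.0, 2.0]])  # spatial distribution of data
geology_err  = np.array([[4.0, 4.0], [9.0, 1.0]])   # expected magnetic variability
scale_err    = np.full((2, 2), 1.5)                 # map scale/resolution effects

# Root-sum-of-squares combination, valid only if the error sources are
# independent and roughly Gaussian.
total_uncertainty = np.sqrt(data_age_err**2 + coverage_err**2
                            + geology_err**2 + scale_err**2)
print(total_uncertainty)  # per-cell uncertainty estimate (nT)
```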

Fractals ◽  
1993 ◽  
Vol 01 (01) ◽  
pp. 87-115 ◽  
Author(s):  
B. LEA COX ◽  
J. S. Y. WANG

Earth scientists have measured fractal dimensions of surfaces by different techniques, including the divider, box, triangle, slit-island, power spectral, variogram and distribution methods. We review these seven measurement techniques, finding that fractal dimensions may vary systematically with measurement method. We discuss possible reasons for these differences, and point to common problems shared by all of the methods, including the remainder problem, curve-fitting, orientation of the measurement plane, size and direction of the sample. Fractal measurements have been applied to many problems in the earth sciences, at a wide range of spatial scales. These include map data of topography; fault traces and fracture networks; fracture surfaces of natural rocks, both in the field and at laboratory scales; metal surfaces; porous aggregate geometry; flow and transport through heterogeneous systems; and various microscopic surface phenomena associated with adsorption, aggregation, erosion and chemical dissolution. We review these applications and discuss the usefulness and limitations of fractal analysis to these types of problems in the earth sciences.
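As a concrete illustration of one of the seven techniques, the box-counting method estimates the fractal dimension from the slope of log N(ε) versus log(1/ε), where N(ε) is the number of boxes of side ε needed to cover the set. A minimal sketch (not from the paper) for a 2-D binary image:

```python
import numpy as np

def box_count_dimension(mask, sizes=(2, 4, 8, 16, 32)):
    """Estimate the fractal dimension of a 2-D binary mask by box counting:
    the slope of log N(eps) versus log(1/eps). Illustrative sketch only."""
    counts = []
    n = mask.shape[0]
    for s in sizes:
        # Count boxes of side s containing at least one set pixel.
        c = sum(mask[i:i+s, j:j+s].any()
                for i in range(0, n, s) for j in range(0, n, s))
        counts.append(c)
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)), np.log(counts), 1)
    return slope

# Sanity check: a completely filled square has dimension 2.
print(box_count_dimension(np.ones((64, 64), dtype=bool)))  # -> 2.0
```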


2019 ◽  
Vol 942 (12) ◽  
pp. 41-49
Author(s):  
A.M. Portnov

The use of unified principles for forming and maintaining a register/cadastre of spatial data on landscape objects is proposed as the informational and technological basis for updating public topographic maps and modernizing the state cartographic system. The problems of keeping the unified electronic cartographic basis informationally current, and the possibilities of updating it from public cadastre map data, are considered. The need to modernize the system of classification and coding of cartographic information, and to use unified standards for the coordinate description of register objects to ensure their topological consistency, verification, and updating, is emphasized. Implementing such solutions is justified both by economic expediency and by the necessity of providing a variety of up-to-date thematic data to a wide range of consumers in the fields of urban planning and territorial development, and of completing the tasks of the governmental program "Digital Economy of the Russian Federation".


This volume vividly demonstrates the importance and increasing breadth of quantitative methods in the earth sciences. With contributions from an international cast of leading practitioners, chapters cover a wide range of state-of-the-art methods and applications, including computer modeling and mapping techniques. Many chapters also contain reviews and extensive bibliographies which serve to make this an invaluable introduction to the entire field. In addition to its detailed presentations, the book includes chapters on the history of geomathematics and on R.G.V. Eigen, the "father" of mathematical geology. Written to commemorate the 25th anniversary of the International Association for Mathematical Geology, the book will be sought after by both practitioners and researchers in all branches of geology.


Author(s):  
David Fisher

There are eight columns in the Periodic Table. The eighth column comprises the rare gases, so called because they are the rarest elements on earth. They are also called the inert or noble gases because, like nobility, they do no work. They are colorless, odorless, invisible gases which do not react with anything, and were thought to be unimportant until the early 1960s. Starting in that era, David Fisher has spent roughly fifty years doing research on these gases, publishing nearly a hundred papers in the scientific journals, applying them to problems in geophysics and cosmochemistry, and learning how other scientists have utilized them to change our ideas about the universe, the sun, and our own planet. Much Ado about (Practically) Nothing will cover this spectrum of ideas, interspersed with the author's own work, which serves to introduce each gas and the important work others have done with it. The rare gases have participated in a wide range of scientific advances, even revolutions, but no book has ever recorded the entire story. Fisher will range from the intricacies of the atomic nucleus and the tiniest of elementary particles, the neutrino, to the energy source of the stars; from the age of the earth to its future energies; from life on Mars to cancer here on earth: a whole panoply that has never before been told as a single story.


Computers ◽  
2021 ◽  
Vol 10 (6) ◽  
pp. 82
Author(s):  
Ahmad O. Aseeri

Deep learning-based methods have emerged as one of the most effective and practical solutions to a wide range of medical problems, including the diagnosis of cardiac arrhythmias. A critical step toward early diagnosis of many cardiac disorders is the accurate detection and classification of cardiac arrhythmias, which can be achieved via electrocardiograms (ECGs). Motivated by the desire to enhance conventional clinical methods for diagnosing cardiac arrhythmias, we introduce an uncertainty-aware deep learning-based predictive model for accurate large-scale classification of cardiac arrhythmias, successfully trained and evaluated on three benchmark medical datasets. In addition, because the quantification of uncertainty estimates is vital for clinical decision-making, our method incorporates a probabilistic approach that captures the model's uncertainty using a Bayesian approximation method without introducing additional parameters or significant changes to the network's architecture. Although many arrhythmia classification solutions using various ECG feature engineering techniques have been reported in the literature, the probabilistic method introduced in this paper outperforms existing methods, achieving multiclass F1 scores of 98.62% and 96.73% on the MIT-BIH dataset (20 annotation classes), 99.23% and 96.94% on the INCART dataset (eight annotation classes), and 97.25% and 96.73% on the BIDMC dataset (six annotation classes), for the deep ensemble and probabilistic modes, respectively. We also demonstrate the method's strong performance and statistical reliability in numerical experiments on language modeling using the gating mechanism of recurrent neural networks.
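The abstract does not name the Bayesian approximation it uses; one widely used technique that fits its description (no extra parameters, no architectural changes) is Monte Carlo dropout, in which dropout stays active at inference and the spread of repeated stochastic forward passes is read as predictive uncertainty. A minimal PyTorch sketch of that general idea, not the paper's code:

```python
import torch
import torch.nn as nn

# Toy classifier with dropout; the architecture and class count (5) are
# hypothetical, standing in for an ECG arrhythmia classifier.
model = nn.Sequential(
    nn.Linear(32, 64), nn.ReLU(), nn.Dropout(p=0.3),
    nn.Linear(64, 5),
)

def mc_dropout_predict(model, x, n_samples=50):
    """Monte Carlo dropout: keep dropout active at inference and average
    the softmax outputs; their spread serves as an uncertainty estimate."""
    model.train()  # leaves dropout stochastic during the forward passes
    with torch.no_grad():
        probs = torch.stack([torch.softmax(model(x), dim=-1)
                             for _ in range(n_samples)])
    return probs.mean(dim=0), probs.std(dim=0)

x = torch.randn(1, 32)  # stand-in for an ECG feature vector
mean_p, std_p = mc_dropout_predict(model, x)
print(mean_p, std_p)    # class probabilities and per-class uncertainty
```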


2012 ◽  
Vol 717-720 ◽  
pp. 1101-1104 ◽  
Author(s):  
M.G. Jaikumar ◽  
Shreepad Karmalkar

A 4H-silicon carbide (4H-SiC) VDMOSFET is simulated using the Sentaurus TCAD package from Synopsys. The simulator is calibrated against measured data over a wide range of bias conditions and temperatures. Material parameters of 4H-SiC are taken from the literature and used in the simulator's available silicon models. The empirical parameters are adjusted to obtain a good fit between the simulated curves and the measured data. The simulation incorporates the bias and temperature dependence of important physical mechanisms such as interface trap density, Coulombic interface trap scattering, surface roughness scattering, and velocity saturation.
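The abstract does not describe the fitting procedure itself; as an illustration of the general idea of adjusting empirical parameters until simulation matches measurement, a model's output can be tuned by nonlinear least squares. The `simulate_id` function below is a hypothetical stand-in for a TCAD run, and all numbers are invented:

```python
import numpy as np
from scipy.optimize import curve_fit

def simulate_id(vgs, mobility_scale, vth):
    """Hypothetical stand-in for a TCAD drain-current simulation; a simple
    square-law model is used here only to keep the sketch runnable."""
    return mobility_scale * np.maximum(vgs - vth, 0.0) ** 2

# Measured transfer-characteristic points (invented values).
vgs_meas = np.array([4.0, 6.0, 8.0, 10.0, 12.0])
id_meas = np.array([0.1, 1.1, 3.9, 8.5, 15.2])

# Adjust the empirical parameters for the best fit to measurement.
popt, _ = curve_fit(simulate_id, vgs_meas, id_meas, p0=[0.1, 3.0])
print("fitted mobility_scale, vth:", popt)
```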


Radiocarbon ◽  
2001 ◽  
Vol 43 (2B) ◽  
pp. 731-742 ◽  
Author(s):  
D Lal ◽  
A J T Jull

Nuclear interactions of cosmic rays produce a number of stable and radioactive isotopes on the earth (Lal and Peters 1967). Two of these, 14C and 10Be, find applications as tracers in a wide variety of earth science problems by virtue of their special combination of attributes: 1) their source functions, 2) their half-lives, and 3) their chemical properties. The radioisotope 14C (half-life = 5730 yr), produced in the earth's atmosphere, was the first to be discovered (Anderson et al. 1947; Libby 1952). The next longer-lived isotope, also produced in the earth's atmosphere, 10Be (half-life = 1.5 Myr), was discovered independently by two groups within a decade (Arnold 1956; Goel et al. 1957; Lal 1991a). Both isotopes are produced efficiently in the earth's atmosphere, and also in solids at the earth's surface. Independently and jointly, they serve as useful tracers for characterizing the evolutionary history of a wide range of materials and artifacts. Here, we focus specifically on the production of 14C in terrestrial solids, designated in-situ-produced 14C (to differentiate it from atmospheric 14C, initially produced in the atmosphere). We also illustrate its application to several earth science problems. This is a relatively new area of investigation using 14C as a tracer, made possible by the development of accelerator mass spectrometry (AMS). The availability of the in-situ 14C variety has enormously enhanced the overall scope of 14C as a tracer (singly or together with in-situ-produced 10Be), which eminently qualifies it as a unique tracer for studying the earth sciences.
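The abstract quotes the two half-lives but not the accumulation relation that underlies in-situ applications; as a hedged aside, the standard buildup equation for an in-situ-produced radionuclide in a surface exposed for time t (neglecting erosion and burial) is:

```latex
% Production-decay balance for an in-situ cosmogenic nuclide, with local
% production rate P (atoms g^{-1} yr^{-1}) and decay constant
% \lambda = \ln 2 / t_{1/2}; erosion and burial are neglected.
N(t) = \frac{P}{\lambda}\left(1 - e^{-\lambda t}\right)
```

For 14C, λ = ln 2 / 5730 yr ≈ 1.21 × 10⁻⁴ yr⁻¹, so the concentration saturates at N = P/λ within a few half-lives, whereas 10Be, with its 1.5 Myr half-life, continues to accumulate over far longer exposures.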


2009 ◽  
Vol 404 ◽  
pp. 61-67 ◽  
Author(s):  
Michael N. Morgan ◽  
V. Baines-Jones

The delivery of grinding fluid to the contact zone is generally achieved via a nozzle. The nozzle geometry influences the fluid velocity and flow pattern at exit from the nozzle orifice. It is important to the efficiency of the process and to the performance of the operation that the fluid is delivered so that a jet of the desired velocity adequately covers the contact zone. Often, assumptions about adequate coverage are based on visual inspections of jet coherence. This paper provides new insight into the internal nozzle flows and the coherent length of a wide range of nozzle designs. The work presents a new analytical model to predict coherent length, which is shown to correlate well with experimentally measured data. Recommendations are given to guide the user toward optimal nozzle designs that ensure adequate fluid supply to the contact zone.


Author(s):  
Joachim Kurzke

Precise simulations of gas turbine performance cannot be done without component maps. In the early days of a new project, one often has to use scaled maps of similar machines. Alternatively, one can calculate the component part-load characteristics, provided that the many details needed for such an exercise are available. At a later stage, rig tests are often done to obtain detailed information about the behavior of the compressors and turbines. Performance calculation programs usually require the map data in a specific format, and producing this format requires some preprocessing. Measured data cannot be used directly because they scatter and are not evenly distributed over the range of interest. Due to limitations in the test equipment, there is often a lack of data at very low and very high speeds. With the help of a specialized drawing program available on a PC, one can easily eliminate the scatter in the data and also inter- and extrapolate additional lines of constant corrected speed. Many graphs showing both the measured data and the lines passing through them, plotted as functions of physically meaningful parameters, allow the user to check whether the result makes sense. The extrapolation of compressor maps toward very low speed, as required for starting, idle, and windmilling performance calculations, is discussed in some detail. Instead of true measured data, one can use data read from maps published in the open literature. The program is also an excellent tool for checking and extending component maps derived from sparse information about a gas turbine to be simulated.
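The drawing program itself is not described in implementation detail; as a rough illustration of the preprocessing step it automates, scattered points along one line of constant corrected speed can be smoothed with a spline and resampled on an even grid. A minimal sketch with hypothetical data:

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

# Scattered measurements along one line of constant corrected speed
# (hypothetical values): corrected mass flow vs. pressure ratio.
wc = np.array([10.2, 11.0, 11.9, 12.7, 13.4, 14.1, 14.9])
pr = np.array([3.42, 3.38, 3.30, 3.17, 3.01, 2.78, 2.49])

# A smoothing spline removes the scatter; s sets the smoothing strength.
spline = UnivariateSpline(wc, pr, k=3, s=0.002)

# Resample on an even grid, as a map file format typically requires.
wc_grid = np.linspace(wc.min(), wc.max(), 15)
print(np.column_stack([wc_grid, spline(wc_grid)]))
```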


2020 ◽  
Vol 24 (4) ◽  
pp. 2061-2081 ◽  
Author(s):  
Xudong Zhou ◽  
Jan Polcher ◽  
Tao Yang ◽  
Ching-Sheng Huang

Abstract. Ensemble estimates based on multiple datasets are frequently applied once many datasets are available for the same climatic variable. An uncertainty estimate based on the difference between the ensemble datasets is always provided along with the ensemble mean estimate to show to what extent the ensemble members are consistent with each other. However, one fundamental flaw of classic uncertainty estimates is that only the uncertainty in one dimension (either the temporal variability or the spatial heterogeneity) can be considered, whereas the variation along the other dimension is dismissed due to limitations in algorithms for classic uncertainty estimates, resulting in an incomplete assessment of the uncertainties. This study introduces a three-dimensional variance partitioning approach and proposes a new uncertainty estimation (Ue) that includes the data uncertainties in both spatiotemporal scales. The new approach avoids pre-averaging in either of the spatiotemporal dimensions and, as a result, the Ue estimate is around 20 % higher than the classic uncertainty metrics. The deviation of Ue from the classic metrics is apparent for regions with strong spatial heterogeneity and where the variations significantly differ in temporal and spatial scales. This shows that classic metrics underestimate the uncertainty through averaging, which means a loss of information in the variations across spatiotemporal scales. Decomposing the formula for Ue shows that Ue has integrated four different variations across the ensemble dataset members, while only two of the components are represented in the classic uncertainty estimates. This analysis of the decomposition explains the correlation as well as the differences between the newly proposed Ue and the two classic uncertainty metrics. The new approach is implemented and analysed with multiple precipitation products of different types (e.g. gauge-based products, merged products and GCMs) which contain different sources of uncertainties with different magnitudes. Ue of the gauge-based precipitation products is the smallest, while Ue of the other products is generally larger because other uncertainty sources are included and the constraints of the observations are not as strong as in gauge-based products. This new three-dimensional approach is flexible in its structure and particularly suitable for a comprehensive assessment of multiple datasets over large regions within any given period.
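The paper's full variance-partitioning formulas are not reproduced in the abstract; the sketch below only illustrates the core point that pre-averaging along one dimension shrinks the measured ensemble spread. Array shapes and aggregation choices are illustrative, not the paper's definitions (Ue additionally integrates four variance components rather than this simple spread):

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical ensemble of a climatic variable: (member, time, space).
data = rng.normal(size=(5, 120, 200))

# Classic-style estimate: pre-average over time, then measure the spread
# across members; temporal differences between members are averaged away.
classic = data.mean(axis=1).std(axis=0).mean()

# Without pre-averaging: member spread at every (time, space) point,
# summarized afterwards, so spatiotemporal variation is retained.
no_preavg = data.std(axis=0).mean()

print(classic, no_preavg)  # the second is generally the larger estimate
```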

