A new database structure for the IHFC Global Heat Flow Database

Author(s):  
Sven Fuchs ◽  
Graeme Beardsmore ◽  
Paolo Chiozzi ◽  
Orlando Miguel Espinoza-Ojeda ◽  
Gianluca Gola ◽  
...  

Periodic revisions of the Global Heat Flow Database (GHFD) take place under the auspices of the International Heat Flow Commission (IHFC) of the International Association of Seismology and Physics of the Earth's Interior (IASPEI). A growing number of heat-flow values, advances in scientific methods, digitization, and improvements in database technologies all warrant a revision of the structure of the GHFD that was last amended in 1976. We present a new structure for the GHFD, which will provide a basis for a reassessment and revision of the existing global heat-flow data set. The database fields within the new structure are described in detail to ensure a common understanding of the respective database entries. The new structure of the database takes advantage of today's possibilities for data management. It supports FAIR and open data principles, including interoperability with external data services, and links to DOI and IGSN numbers and other data resources (e.g., world geological map, world stratigraphic system, and International Ocean Drilling Program data). Aligned with this publication, a restructured version of the existing database is published, which provides a starting point for the upcoming collaborative process of data screening, quality control and revision. In parallel, the IHFC will work on criteria for a new quality scheme that will allow future users of the database to evaluate the quality of the collated heat-flow data based on specific criteria.

2021 ◽  
Author(s):  
Sven Fuchs ◽  
Graeme Beardsmore ◽  
Paolo Chiozzi ◽  
Orlando Miguel Espinoza-Ojeda ◽  
Gianluca Gola ◽  
...  

The compilation of global heat-flow data is currently under major revision by the International Heat Flow Commission (IHFC) of the International Association of Seismology and Physics of the Earth's Interior (IASPEI). Heat flow represents a fundamental parameter in thermal studies, e.g., of the evolution of hydrocarbons or of mineral and geothermal resources. Comparable, comprehensible and reliable heat-flow data are also of utmost interest for geophysical and geological studies on the global scale. Here, we present the first results of a stepwise revision of the IHFC Global Heat Flow Database based on a researcher-driven, collaborative approach. The first step comprises the review and revision of the most recent database structure, established in 1976. The revised structure of the Global Heat Flow Database considers the demands and opportunities presented by the evolution of scientific work, digitization and the breakthroughs in database technologies over the past decades. Based on the new structure, the existing dataset will be re-assessed and new data incorporated. By supporting FAIR and open data principles, the new database facilitates interoperability with external data services, such as DOI and IGSN numbers, and other data resources (e.g., world geological map, world stratigraphic system, and International Ocean Drilling Program data). We give an overview of the new database and introduce the community workflow of global heat-flow data revision.
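As an illustration of what one entry in such a restructured, FAIR-oriented database might look like, here is a minimal sketch. All field names are hypothetical (the abstracts do not publish the actual schema); the DOI and IGSN fields mirror the external links the new structure is said to support.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class HeatFlowRecord:
    """Hypothetical sketch of one entry in a restructured heat-flow database."""
    site_name: str
    latitude_deg: float                     # WGS84 latitude
    longitude_deg: float                    # WGS84 longitude
    heat_flow_mW_m2: float                  # surface heat-flow value
    quality_code: Optional[str] = None      # placeholder for the future IHFC quality scheme
    publication_doi: Optional[str] = None   # link to the source publication (DOI)
    sample_igsn: Optional[str] = None       # link to physical samples (IGSN)

# Example record with made-up values:
record = HeatFlowRecord("Example site", 52.0, 13.0, 65.0,
                        publication_doi="10.xxxx/example")
```

Structured records of this kind are what make machine-readable interoperability with external services (DOI resolvers, IGSN catalogues) straightforward.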


2019 ◽  
Vol 219 (3) ◽  
pp. 1648-1659 ◽  
Author(s):  
B Mather ◽  
L Moresi ◽  
P Rayner

SUMMARY The variation of temperature in the crust is difficult to quantify due to the sparsity of surface heat flow observations and the lack of measurements of the thermal properties of rocks at depth. We examine the degree to which the thermal structure of the crust can be constrained from the Curie depth and surface heat flow data in Southeastern Australia. We cast the inverse problem of heat conduction within a Bayesian framework and derive its adjoint so that we can efficiently find the optimal model that best reproduces the data and prior information on the thermal properties of the crust. Efficiency gains obtained from the adjoint method facilitate a detailed exploration of thermal structure in SE Australia, where we predict high temperatures of 650 °C within Precambrian rocks due to relatively high rates of heat production (0.9–1.4 μW m⁻³). In contrast, temperatures within dominantly Phanerozoic crust reach only 520 °C at the Moho due to the low rates of heat production in Cambrian mafic volcanics. A combination of the Curie depth and heat flow data is required to constrain the uncertainty of lower crustal temperatures to ±73 °C. We also show that parts of the crust are unconstrained if either data set is omitted from the inversion.
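The forward problem underlying such an inversion can be illustrated with the textbook steady-state 1-D conductive geotherm for a crust with uniform conductivity and heat production, T(z) = T0 + (q0/k)·z − (A/2k)·z². The parameter values below are illustrative round numbers in the same range as the abstract's heat-production figures, not the paper's actual model:

```python
def geotherm(z, T0=15.0, q0=0.065, k=3.0, A=1.0e-6):
    """Steady-state 1-D conductive geotherm with uniform heat production.

    T(z) = T0 + (q0 / k) * z - (A / (2 k)) * z**2
    z  : depth in metres
    T0 : surface temperature (deg C)
    q0 : surface heat flow (W m^-2), here 65 mW m^-2
    k  : thermal conductivity (W m^-1 K^-1)
    A  : volumetric heat production (W m^-3), here 1.0 uW m^-3
    """
    return T0 + (q0 / k) * z - (A / (2.0 * k)) * z ** 2

# Temperature at a hypothetical 35 km Moho for these illustrative parameters:
T_moho = geotherm(35_000.0)
```

Even this simple model shows why heat production matters: the quadratic term removes roughly 200 °C from the Moho temperature relative to a production-free crust, which is the sensitivity the Bayesian inversion exploits.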


2015 ◽  
Vol 19 (1) ◽  
pp. 71-81 ◽  
Author(s):  
M. Cristina Pattuelli ◽  
Matthew Miller

Purpose – The purpose of this paper is to describe a novel approach to the development and semantic enhancement of a social network to support the analysis and interpretation of digital oral history data from jazz archives and special collections. Design/methodology/approach – A multi-method approach was applied including automated named entity recognition and extraction to create a social network, and crowdsourcing techniques to semantically enhance the data through the classification of relations and the integration of contextual information. Linked open data standards provided the knowledge representation technique for the data set underlying the network. Findings – The study described here identifies the challenges and opportunities of a combination of a machine and a human-driven approach to the development of social networks from textual documents. The creation, visualization and enrichment of a social network are presented within a real-world scenario. The data set from which the network is based is accessible via an application programming interface and, thus, shareable with the knowledge management community for reuse and mash-ups. Originality/value – This paper presents original methods to address the issue of detecting and representing semantic relationships from text. Another element of novelty is in that it applies semantic web technologies to the construction and enhancement of the network and underlying data set, making the data readable across platforms and linkable with external data sets. This approach has the potential to make social networks dynamic and open to integration with external data sources.


2018 ◽  
Vol 2 (2) ◽  
Author(s):  
Rosa Maria Prol-Ledesma ◽  
Juan Luis Carrillo-de la Cruz ◽  
Marco Antonio Torres-Vera ◽  
Alejandra Selene Membrillo-Abad ◽  
Orlando Miguel Espinoza-Ojeda

Heat flow maps are a powerful tool for regional exploration of geothermal resources. Mexico is one of the main producers of geothermal energy, and the search for undiscovered resources at a regional level should be based on heat flow values. Here, we present a heat flow map at 1:4,000,000 scale, produced with heat flow data compiled from open databases and previously unpublished data. The compiled heat flow data include bottom-hole temperatures, temperature logs, transient temperature measurements and measured temperature logs. The new data were calculated from temperature gradient information, estimating a mean conductivity value characteristic of the rock types present in the stratigraphic column or assigning the mean conductivity value for the crust. Geothermal gradient and thermal resistivity (inverse thermal conductivity) were plotted and heat flow was calculated using the Bullard method. The map covers the whole continental territory of Mexico and shows that most of the country has values higher than the world average. The highest heat flow values are concentrated in two provinces: the Gulf of California extensional province and the Trans-Mexican Volcanic Belt.
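The Bullard method mentioned here plots temperature against cumulative thermal resistance R(z) = Σ Δzᵢ/kᵢ; for purely conductive heat transport the plot is linear and its slope is the heat flow. A minimal sketch on synthetic layered data (the layer values are made up for illustration):

```python
import numpy as np

# Synthetic three-layer column (illustrative values, not from the paper):
dz = np.array([500.0, 500.0, 1000.0])   # layer thicknesses (m)
k = np.array([2.0, 2.5, 3.0])           # thermal conductivities (W m^-1 K^-1)
q_true = 0.080                          # "true" heat flow, 80 mW m^-2

# Cumulative thermal resistance at the layer interfaces (m^2 K W^-1):
R = np.concatenate([[0.0], np.cumsum(dz / k)])
# Conductive temperatures at those interfaces (deg C):
T = 15.0 + q_true * R

# Bullard plot: T is linear in R, so the fitted slope recovers the heat flow
# and the intercept recovers the surface temperature.
q_est, T0_est = np.polyfit(R, T, 1)
```

With noisy field temperatures the same linear fit applies; the scatter about the line then indicates non-conductive perturbations such as fluid flow.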


2021 ◽  
Author(s):  
Alberto Pastorutti ◽  
Carla Braitenberg

Partitioning of the Earth's surface into "provinces": tectonic domains, outcropping geological units, crustal types, or discrete classes extracted from age or geophysical data (e.g. tomography, gravity), is often employed to perform data imputation of ill-sampled observables (e.g. the similarity-based NGHF surface heat flow map [1]) or to constrain the parameters of ill-posed inverse problems (e.g. the gravimetric global Moho model GEMMA [2]).

We define provinces as noncontiguous areas where quantities, or the relationships between them, are similar. Following the goodness metric employed for proxy observables, an adequate province model should significantly improve prediction of the extrapolated quantity. Interpolation of a quantity with no reliance on external data sets a predictivity benchmark, which a province-based prediction should exceed.
From a solid Earth modelling perspective, gravity, topography, and their relationship seem to be ideal candidates to constrain a province clustering model. Earth gravity and topography, at resolutions of at least 100 km, are known with incomparable sampling uniformity and negligible error with respect to other observables.

Most of the observed topography-gravity relationship can be explained by regional isostatic compensation. The topography, representing the load exerted on the lithosphere, is compensated by the elastic, thin-shell-like response of the latter. In the spectral domain, flexure results in a lowpass transfer function between topography and isostatic roots. The signal of both surfaces, superimposed, is observed in the gravity field.
However, reality shows significant deviations from this ideal case: the separation of nonisostatic effects [3], such as density inhomogeneities, glacial isostatic adjustments and dynamic mantle processes, is nontrivial.
Acknowledging this superposition, we aim at identifying clusters of similar topography-gravity transfer functions.

We evaluate the transfer functions, in the form of admittance and correlation [4], in the spherical harmonics domain. Spatial localization is achieved with the method by Wieczorek and Simons [5], using SHTOOLS [6]. Admittance and correlation spectra are computed on a set of regularly spaced sample points, each point being representative of the topo-gravity relationship in its proximity. The coefficients of the localized topo-gravity admittance and correlation spectra constitute each point's features.

We present a set of experiments performed on synthetic models, in which we can control the variations of elastic parameters and non-isostatic contributions. These tests allowed us to define both the feature extraction segment: the spatial localization method and the range of spherical harmonic degrees most sensitive to lateral variations in flexural rigidity; and the clustering segment: metrics of the ground-truth clusters, and the performance of dimensionality reduction methods and of different clustering models.

[1] Lucazeau (2019). Analysis and Mapping of an Updated Terrestrial Heat Flow Data Set. doi:10.1029/2019GC008389
[2] Reguzzoni and Sampietro (2015). GEMMA: An Earth crustal model based on GOCE satellite data. doi:10.1016/j.jag.2014.04.002
[3] Bagherbandi and Sjöberg (2013). Improving gravimetric–isostatic models of crustal depth by correcting for non-isostatic effects and using CRUST2.0. doi:10.1016/j.earscirev.2012.12.002
[4] Simons et al. (1997). Localization of gravity and topography: Constraints on the tectonics and mantle dynamics of Venus. doi:10.1111/j.1365-246X.1997.tb00593.x
[5] Wieczorek and Simons (2005). Localized spectral analysis on the sphere. doi:10.1111/j.1365-246X.2005.02687.x
[6] Wieczorek and Meschede (2018). SHTools: Tools for Working with Spherical Harmonics. doi:10.1029/2018GC007529
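The per-degree admittance and correlation the abstract computes can be sketched directly from cross- and auto-power sums of spherical-harmonic coefficients: Z(l) = S_gt(l)/S_tt(l) and γ(l) = S_gt(l)/√(S_gg(l)·S_tt(l)). The sketch below does not use SHTOOLS; the coefficient storage (one array of 2l+1 real coefficients per degree l) is a simplified hypothetical layout chosen for clarity:

```python
import numpy as np

def admittance_correlation(g, t):
    """Per-degree admittance Z(l) and correlation gamma(l) between two
    fields g (gravity) and t (topography), each given as a list of
    arrays of 2l+1 real spherical-harmonic coefficients per degree l."""
    Z, gamma = [], []
    for gl, tl in zip(g, t):
        S_gt = np.sum(gl * tl)          # cross power at degree l
        S_tt = np.sum(tl * tl)          # topography auto power
        S_gg = np.sum(gl * gl)          # gravity auto power
        Z.append(S_gt / S_tt)
        gamma.append(S_gt / np.sqrt(S_gg * S_tt))
    return np.array(Z), np.array(gamma)

# Synthetic check with random coefficients: if g is exactly 2 * t,
# the admittance is 2 and the correlation is 1 at every degree.
t = [np.random.randn(2 * l + 1) for l in range(1, 6)]
g = [2.0 * tl for tl in t]
Z, gamma = admittance_correlation(g, t)
```

In the abstract's workflow these spectra are additionally localized with spatially concentrated tapers [5] before serving as per-point feature vectors for clustering.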


2016 ◽  
Vol 16 (2) ◽  
pp. 417-429 ◽  
Author(s):  
R. Figueiredo ◽  
M. Martina

Abstract. One of the necessary components to perform catastrophe risk modelling is information on the buildings at risk, such as their spatial location, geometry, height, occupancy type and other characteristics. This is commonly referred to as the exposure model or data set. When modelling large areas, developing exposure data sets with the relevant information about every individual building is not practicable. Thus, census data at coarse spatial resolutions are often used as the starting point for the creation of such data sets, after which disaggregation to finer resolutions is carried out using different methods, based on proxies such as the population distribution. While these methods can produce acceptable results, they cannot be considered ideal. Nowadays, the availability of open data is increasing and it is possible to obtain information about buildings for some regions. Although this type of information is usually limited and, therefore, insufficient to generate an exposure data set, it can still be very useful in its elaboration. In this paper, we focus on how open building data can be used to develop a gridded exposure model by disaggregating existing census data at coarser resolutions. Furthermore, we analyse how the selection of the level of spatial resolution can impact the accuracy and precision of the model, and compare the results in terms of affected residential building areas, due to a flood event, between different models.
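The core disaggregation step described above, splitting a coarse census total over finer grid cells using open building data as the proxy, can be sketched as follows. The uniform fallback for cells without building information is an assumption for illustration, not necessarily the paper's choice:

```python
import numpy as np

def disaggregate(census_total, building_area_per_cell):
    """Split an administrative-unit census total over grid cells in
    proportion to the open building footprint area in each cell."""
    w = np.asarray(building_area_per_cell, dtype=float)
    if w.sum() == 0.0:
        # No building information for this unit: fall back to a uniform
        # split (an illustrative assumption).
        return np.full_like(w, census_total / w.size)
    return census_total * w / w.sum()

# 1000 residential units spread over four cells with known footprint areas:
cells = disaggregate(1000.0, [50.0, 30.0, 20.0, 0.0])
```

The choice of grid resolution then trades off against proxy quality: finer cells localize exposure better but amplify errors where the building data are incomplete, which is the accuracy/precision trade-off the paper analyses.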


2020 ◽  
Vol 245 ◽  
pp. 08026
Author(s):  
Leonid Serkin

The ATLAS Collaboration is releasing a new set of proton–proton collision data to the public for educational purposes. The data was collected by the ATLAS detector at the Large Hadron Collider at a centre-of-mass energy √s = 13 TeV during the year 2016 and corresponds to an integrated luminosity of 10 fb⁻¹. This dataset is accompanied by simulated events describing several Standard Model processes, as well as hypothetical Beyond Standard Model signal processes. Associated computing tools are provided to make the analysis of the dataset easily accessible. In the following, we summarise the properties of the 13 TeV ATLAS Open Data set and the available analysis tools. Several examples intended as a starting point for further analysis work by users are shown. The general aim of the dataset and tools released is to provide user-friendly and straightforward interactive interfaces to replicate the procedures used by high-energy-physics researchers and enable users to experience the analysis of particle-physics data in educational environments.

