An Ontology-Based Approach to Enable Data-Driven Research in the Field of NDT in Civil Engineering

Author(s):  
Benjamin Moreno-Torres ◽  
Christoph Völker ◽  
Sabine Kruschwitz

<div> <p>Non-destructive testing (NDT) data in civil engineering are regularly used for scientific analysis. However, there is still no uniform representation of these data, so analyzing distributed data sets across different test objects is in most cases prohibitively difficult.</p> <p>To overcome this, we present an approach for integrated management of distributed data sets based on Semantic Web technologies. Its cornerstone is an ontology, a semantic knowledge representation of our domain. This NDT-CE ontology is then populated with the data sources. Through the properties of, and relationships between, the concepts it contains, the ontology makes these data sets interpretable by machines as well. Furthermore, the ontology can serve as a central interface for database access. Non-domain data sources can be integrated by linking them with the NDT ontology, making them directly available for generic use in the context of digitization. Based on an extensive literature review, we outline the resulting possibilities for NDT in civil engineering, such as computer-aided sorting and analysis of measurement data and the recognition and explanation of correlations.</p> <p>A common knowledge representation and data access allows the scientific exploitation of existing data sources with data-driven methods (such as image recognition, measurement uncertainty calculation, factor analysis, or material characterization) and simplifies bidirectional knowledge and data transfer between engineers and NDT specialists.</p> </div>
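The idea of an ontology making distributed measurement data "meaningful also for machines" can be illustrated with a minimal sketch. The class and property names below (`ndt:RadarMeasurement`, `ndt:testObject`, and so on) are hypothetical stand-ins for terms an NDT-CE ontology might define, and the in-memory triple set stands in for a real RDF store:

```python
# A tiny in-memory triple store: (subject, predicate, object).
# Term names are hypothetical illustrations, not the actual ontology.
triples = {
    ("meas:001", "rdf:type", "ndt:RadarMeasurement"),
    ("meas:001", "ndt:testObject", "obj:BridgeDeck7"),
    ("meas:002", "rdf:type", "ndt:UltrasonicMeasurement"),
    ("meas:002", "ndt:testObject", "obj:BridgeDeck7"),
    # Ontology axioms: both measurement classes are kinds of NDT measurement
    ("ndt:RadarMeasurement", "rdfs:subClassOf", "ndt:Measurement"),
    ("ndt:UltrasonicMeasurement", "rdfs:subClassOf", "ndt:Measurement"),
}

def instances_of(cls):
    """All subjects typed as `cls` or any of its direct subclasses."""
    subclasses = {cls} | {s for s, p, o in triples
                          if p == "rdfs:subClassOf" and o == cls}
    return sorted(s for s, p, o in triples
                  if p == "rdf:type" and o in subclasses)

# One query over the shared vocabulary spans both measurement types:
print(instances_of("ndt:Measurement"))  # → ['meas:001', 'meas:002']
```

Because the subclass axioms live in the ontology rather than in the individual data sets, a single generic query retrieves radar and ultrasonic measurements alike, which is exactly the cross-data-set analysis the abstract targets.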

2020 ◽  
Author(s):  
Christoph Völker ◽  
Benjamin Moreno-Torres ◽  
Sabine Kruschwitz

<p>In the field of non-destructive testing (NDT) in civil engineering, large volumes of measurement data are collected. Although they serve as a basis for scientific analyses, there is still no uniform representation of these data, so analyzing various distributed data sets across different test objects is only possible with considerable manual effort.</p><p>We present a system architecture for integrated management of distributed data sets based on Semantic Web technologies. The approach rests on a formal knowledge model, the so-called ontology, which represents the knowledge of our domain, NDT. The ontology we developed is linked to data sources and thus describes the semantic meaning of the data. Furthermore, the ontology acts as the central concept for database access. Non-domain data sources can easily be integrated by linking them to the NDT-CE ontology and are then directly available for generic use in the context of digitization. Based on an extensive literature review, we outline the possibilities this offers for NDT in civil engineering, such as computer-aided sorting and analysis as well as the recognition and explanation of relationships (explainable AI) across several million measurement records.</p><p>The expected benefits of this approach to knowledge representation and data access for the NDT community are an expansion of knowledge through data exchange in research (interoperability), the scientific exploitation of large existing data sources with data-driven methods (such as image recognition, measurement uncertainty calculation, factor analysis, and material characterization), and finally a simplified exchange of NDT data with engineering models and thus with the construction industry.</p><p>Ontologies are already at the core of numerous intelligent systems such as building information modeling and research databases. This contribution gives an overview of the range of tools we are currently creating to communicate with them.</p>
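One concrete reading of "non-domain data sources can easily be integrated by linking them to the ontology" is a mapping from source-specific column names to shared ontology properties. The sketch below is our own illustration; the column and term names are hypothetical:

```python
import csv
import io

# Hedged illustration: a non-domain CSV source becomes generically
# usable once its columns are linked to (hypothetical) NDT ontology
# terms, so tooling built against the ontology can consume it.
mapping = {  # CSV column -> ontology property
    "obj": "ndt:testObject",
    "t2_ms": "ndt:relaxationTime",
}

raw = "obj,t2_ms\nBridgeDeck7,48.2\nTunnelWall3,51.7\n"

def lift(csv_text, mapping):
    """Turn CSV rows into ontology-annotated records (as dicts)."""
    rows = csv.DictReader(io.StringIO(csv_text))
    return [{mapping[k]: v for k, v in row.items() if k in mapping}
            for row in rows]

records = lift(raw, mapping)
print(records[0])
# {'ndt:testObject': 'BridgeDeck7', 'ndt:relaxationTime': '48.2'}
```

Columns without a mapping entry are simply dropped, so a source only needs to be linked for the subset of its data that is relevant to the domain.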


Author(s):  
Bartosz Dobrzelecki ◽  
Amrey Krause ◽  
Alastair C. Hume ◽  
Alistair Grant ◽  
Mario Antonioletti ◽  
...  

OGSA-DAI (Open Grid Services Architecture Data Access and Integration) is a framework for building distributed data access and integration systems. Until recently, it lacked the built-in functionality that would allow the easy creation of federations of distributed data sources. The latest release of the OGSA-DAI framework introduced the OGSA-DAI DQP (Distributed Query Processing) resource. The new resource encapsulates a distributed query processor that is able to orchestrate distributed data sources when answering declarative user queries. The query processor has many extensibility points, making it easy to customize. We have also introduced a new OGSA-DAI Views resource that provides a flexible method for defining views over relational data. The interoperability of the two new resources, together with the flexibility of the OGSA-DAI framework, allows the building of highly customized data integration solutions.
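The combination the abstract describes, a query processor orchestrating several sources plus views over relational data, can be sketched in miniature with standard SQLite. This is emphatically not the OGSA-DAI API, just a toy analogue of the DQP and Views idea using two independent database files as the "distributed" sites:

```python
import os
import sqlite3
import tempfile

# Two independent relational "sites":
tmp = tempfile.mkdtemp()
site_a, site_b = os.path.join(tmp, "a.db"), os.path.join(tmp, "b.db")

with sqlite3.connect(site_a) as db:
    db.execute("CREATE TABLE sensors (id INTEGER, location TEXT)")
    db.executemany("INSERT INTO sensors VALUES (?, ?)",
                   [(1, "pier"), (2, "deck")])

with sqlite3.connect(site_b) as db:
    db.execute("CREATE TABLE readings (sensor_id INTEGER, value REAL)")
    db.executemany("INSERT INTO readings VALUES (?, ?)",
                   [(1, 0.8), (2, 1.4), (2, 1.6)])

# The "query processor": one connection orchestrating both sources
# behind a single declarative query, with a view restricting the
# slice of federated data a client sees.
dqp = sqlite3.connect(site_a)
dqp.execute("ATTACH DATABASE ? AS site_b", (site_b,))
dqp.execute("""CREATE TEMP VIEW deck_readings AS
               SELECT s.location, r.value
               FROM sensors s
               JOIN site_b.readings r ON s.id = r.sensor_id
               WHERE s.location = 'deck'""")
rows = dqp.execute("SELECT * FROM deck_readings ORDER BY value").fetchall()
print(rows)  # [('deck', 1.4), ('deck', 1.6)]
```

The client only ever issues declarative SQL against the view; where the underlying tables live is the federation layer's concern, which is the essential point of DQP-style processing.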


2018 ◽  
Author(s):  
Mohammad Sadnan Al Manir ◽  
Jon Haël Brenas ◽  
Christopher JO Baker ◽  
Arash Shaban-Nejad

BACKGROUND According to the World Health Organization, malaria surveillance is weakest in countries and regions with the highest malaria burden. A core obstacle is that the data required to perform malaria surveillance are fragmented in multiple data silos distributed across geographic regions. Furthermore, consistent integrated malaria data sources are few, and a low degree of interoperability exists between them. As a result, it is difficult to identify disease trends and to plan for effective interventions. OBJECTIVE We propose the Semantics, Interoperability, and Evolution for Malaria Analytics (SIEMA) platform for use in malaria surveillance based on semantic data federation. Using this approach, it is possible to access distributed data, extend and preserve interoperability between multiple dynamic distributed malaria sources, and facilitate detection of system changes that can interrupt mission-critical global surveillance activities. METHODS We used Semantic Automated Discovery and Integration (SADI) Semantic Web Services to enable data access and improve interoperability, and the graphical user interface-enabled semantic query engine HYDRA to implement the target queries typical of malaria programs. We implemented a custom algorithm to detect changes to community-developed terminologies, data sources, and services that are core to SIEMA. This algorithm reports to a dashboard. Valet SADI is used to mitigate the impact of changes by rebuilding affected services. RESULTS We developed a prototype surveillance and change management platform from a combination of third-party tools, community-developed terminologies, and custom algorithms. We illustrated a methodology and core infrastructure to facilitate interoperable access to distributed data sources using SADI Semantic Web services. This degree of access makes it possible to implement complex queries needed by our user community with minimal technical skill. 
We implemented a dashboard that reports on terminology changes that can render the services inactive, jeopardizing system interoperability. Using this information, end users can control and reactively rebuild services to preserve interoperability and minimize service downtime. CONCLUSIONS We introduce a framework suitable for use in malaria surveillance that supports the creation of flexible surveillance queries across distributed data resources. The platform provides interoperable access to target data sources, is domain agnostic, and with updates to core terminological resources is readily transferable to other surveillance activities. A dashboard enables users to review changes to the infrastructure and invoke system updates. The platform significantly extends the range of functionalities offered by malaria information systems beyond the current state of the art.
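The change-detection step, fingerprinting community terminologies and flagging the services built on them for rebuilding, can be sketched as follows. The paper's actual algorithm is not reproduced here; the term names and the registry structure below are our own hypothetical illustration:

```python
import hashlib
import json

def fingerprint(terminology: dict) -> str:
    """Stable hash of a terminology's term -> definition mapping."""
    blob = json.dumps(terminology, sort_keys=True).encode()
    return hashlib.sha256(blob).hexdigest()

# Registry: known terminology fingerprint -> services built on it.
baseline = {"malaria_case": "confirmed infection", "vector": "mosquito"}
registry = {fingerprint(baseline): ["case-count-service"]}

def services_to_rebuild(current: dict, registry: dict) -> list:
    """Report services whose underlying terminology has drifted."""
    if fingerprint(current) in registry:
        return []  # nothing changed, nothing to rebuild
    # Any registered service is potentially affected by the drift.
    return [svc for svcs in registry.values() for svc in svcs]

updated = dict(baseline, vector="anopheles mosquito")
print(services_to_rebuild(baseline, registry))  # []
print(services_to_rebuild(updated, registry))   # ['case-count-service']
```

A dashboard consuming this report is what lets users rebuild services reactively, in the spirit of the Valet SADI step the abstract mentions, rather than discovering broken interoperability at query time.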


2021 ◽  
Vol 13 (12) ◽  
pp. 2426
Author(s):  
Benjamí Moreno Torres ◽  
Christoph Völker ◽  
Sarah Mandy Nagel ◽  
Thomas Hanke ◽  
Sabine Kruschwitz

Although measurement data from the civil engineering sector are an important basis for scientific analyses in the field of non-destructive testing (NDT), there is still no uniform representation of these data. Analyzing data sets across different test objects or test types therefore involves a high manual effort. Ontologies and the Semantic Web are technologies already used in numerous intelligent systems such as material cyberinfrastructures and research databases. This contribution demonstrates the application of these technologies to 1H nuclear magnetic resonance (NMR) relaxometry, which is commonly used to characterize water content and porosity distribution in solids. The methodology implemented for this purpose was developed specifically to be applied to materials science (MS) tests. The aim of this paper is to analyze this methodology from the perspective of data interoperability using ontologies. Three benefits are expected from this study of the implementation of interoperability in the NDT domain: first, to expand knowledge of how the intrinsic characteristics of the NDT domain determine the application of semantic technologies; second, to determine which aspects of such an implementation can be improved and in what ways; and finally, to draw the baselines of future research in the field of data integration for NDT.
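For readers unfamiliar with the test type being modeled: a 1H NMR relaxometry measurement yields a signal decay curve that is commonly summarized by a mono-exponential fit S(t) = A · exp(-t / T2), where T2 relates to pore size and water environment. The sketch below (our own illustration, not the paper's methodology) shows the kind of derived quantity such an ontology would annotate, using a log-linear least-squares fit:

```python
import math

def fit_t2(times, signal):
    """Fit S(t) = A * exp(-t / T2) via least squares on log(S).
    Returns (A, T2). Assumes a noise-free mono-exponential decay."""
    xs, ys = times, [math.log(s) for s in signal]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return math.exp(my - slope * mx), -1.0 / slope

# Synthetic decay with A = 100, T2 = 50 ms, sampled every 10 ms:
t = [10 * i for i in range(1, 9)]
s = [100 * math.exp(-ti / 50) for ti in t]
A, T2 = fit_t2(t, s)
print(round(A), round(T2))  # 100 50
```

Interoperability then means that the raw curve, the fitted parameters, and their units can all be linked to shared ontology terms instead of living in instrument-specific file formats.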


2017 ◽  
Vol 3 ◽  
pp. e115 ◽  
Author(s):  
Johannes M. Schleicher ◽  
Michael Vögler ◽  
Christian Inzinger ◽  
Schahram Dustdar

The ever-growing amount of data produced by and in today’s smart cities offers significant potential for novel applications created by city stakeholders as well as third parties. Current smart city application models mostly assume that data is exclusively managed by and bound to its original application and location. We argue that smart city data must not be constrained to such data silos so that future smart city applications can seamlessly access and integrate data from multiple sources across multiple cities. In this paper, we present a methodology and toolset to model available smart city data sources and enable efficient, distributed data access in smart city environments. We introduce a modeling abstraction to describe the structure and relevant properties, such as security and compliance constraints, of smart city data sources along with independently accessible subsets in a technology-agnostic way. Based on this abstraction, we present a middleware toolset for efficient and seamless data access through autonomous relocation of relevant subsets of available data sources to improve Quality of Service for smart city applications based on a configurable mechanism. We evaluate our approach using a case study in the context of a distributed city infrastructure decision support system and show that selective relocation of data subsets can significantly reduce application response times.
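The "autonomous relocation of relevant subsets" can be made concrete with a small planning sketch. The greedy policy and the data-set names below are our own hypothetical illustration, not the paper's middleware: relocate the subsets with the best latency gain per megabyte until a node's capacity is exhausted:

```python
# (name, size_mb, remote_latency_ms, local_latency_ms) -- hypothetical
subsets = [
    ("traffic_cams_core", 400, 120, 15),
    ("air_quality_grid", 50, 90, 10),
    ("parking_history", 800, 60, 20),
]

def plan_relocation(subsets, capacity_mb):
    """Greedily pick subsets by latency gain per MB under a capacity cap."""
    ranked = sorted(subsets,
                    key=lambda s: (s[2] - s[3]) / s[1],  # gain per MB
                    reverse=True)
    plan, used = [], 0
    for name, size, *_ in ranked:
        if used + size <= capacity_mb:
            plan.append(name)
            used += size
    return plan

print(plan_relocation(subsets, capacity_mb=500))
# ['air_quality_grid', 'traffic_cams_core']
```

The modeling abstraction in the paper is what makes such a planner possible at all: subsets must be independently accessible and their sizes and constraints machine-readable before anything can be selectively moved.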


Author(s):  
Christian Luksch ◽  
Lukas Prost ◽  
Michael Wimmer

We present a real-time rendering technique for photometric polygonal lights. Our method uses a numerical integration technique based on a triangulation to calculate noise-free diffuse shading. We include a dynamic point in the triangulation that provides a continuous near-field illumination resembling the shape of the light emitter and its characteristics. We evaluate the accuracy of our approach with a diverse selection of photometric measurement data sets in a comprehensive benchmark framework. Furthermore, we provide an extension for specular reflection on surfaces with arbitrary roughness that facilitates the use of existing real-time shading techniques. Our technique is easy to integrate into real-time rendering systems and extends the range of possible applications with photometric area lights.
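The core idea, numerically integrating diffuse irradiance over a triangulated polygonal light, can be sketched with a simple per-triangle centroid rule. This is a deliberately simplified illustration: the paper's technique, with its dynamic triangulation point and photometric emitter profile, is considerably more elaborate. Each triangle contributes L · cos θ_surface · cos θ_light · area / r²:

```python
import math

def sub(a, b): return [x - y for x, y in zip(a, b)]
def dot(a, b): return sum(x * y for x, y in zip(a, b))
def cross(a, b): return [a[1]*b[2] - a[2]*b[1],
                         a[2]*b[0] - a[0]*b[2],
                         a[0]*b[1] - a[1]*b[0]]
def norm(a):
    l = math.sqrt(dot(a, a))
    return [x / l for x in a]

def irradiance(point, n_surf, polygon, n_light, radiance):
    """Fan-triangulate `polygon`; accumulate one centroid sample per triangle."""
    total = 0.0
    for i in range(1, len(polygon) - 1):
        a, b, c = polygon[0], polygon[i], polygon[i + 1]
        cr = cross(sub(b, a), sub(c, a))
        area = 0.5 * math.sqrt(dot(cr, cr))
        centroid = [(a[k] + b[k] + c[k]) / 3 for k in range(3)]
        to_light = sub(centroid, point)
        r2 = dot(to_light, to_light)
        w = norm(to_light)
        cos_s = max(0.0, dot(n_surf, w))    # surface foreshortening
        cos_l = max(0.0, -dot(n_light, w))  # emitter foreshortening
        total += radiance * cos_s * cos_l * area / r2
    return total

# Unit square light one unit above the shaded point, facing down:
quad = [[-.5, -.5, 1], [.5, -.5, 1], [.5, .5, 1], [-.5, .5, 1]]
E = irradiance([0, 0, 0], [0, 0, 1], quad, [0, 0, -1], radiance=1.0)
```

Refining the triangulation (for example, splitting triangles near the emitter or around a dynamically placed interior point, as the paper does) reduces the error of this one-sample-per-triangle estimate in the near field.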

