IFC for Infrastructure

2017 ◽  
Vol 6 (3) ◽  
pp. 44-56 ◽  
Author(s):  
Pierre Benning

The Industry Foundation Classes (IFC) data model is a neutral, open data format defined by an international standard (ISO 16739) that allows a construction to be described as a collection of standard objects. These objects are quite well defined for describing a building, but they are still far from being adapted (and thus adopted) for infrastructure. The article presents a new methodology, based on a system approach, for enriching the IFC model for infrastructure, in particular for the scope of bridges. The first step is to identify all the concepts and classes absent from the current IFC definition, such as procedural geometry and coordinate systems; the second step proposes new "bridge oriented" entities to enrich the current IFC model. The subsequent IFC development phases, dedicated to other infrastructure domains, will build on the experience gained with this methodology.
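To make the idea of "bridge oriented" entities concrete, the following is a minimal, purely illustrative sketch of what such an enrichment might look like: a bridge whose parts are positioned by stations along a reference alignment (the procedural-geometry concept the abstract mentions). The class names and attributes are invented for illustration and are not the actual ISO 16739 schema.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class AlignmentSketch:
    """A simplified reference alignment: a list of station values (m)."""
    stations: List[float] = field(default_factory=list)

@dataclass
class BridgePartSketch:
    name: str                  # e.g. "deck", "pylon", "abutment"
    alignment_station: float   # position along the reference alignment

@dataclass
class BridgeSketch:
    """Hypothetical 'bridge oriented' container entity."""
    name: str
    alignment: AlignmentSketch
    parts: List[BridgePartSketch] = field(default_factory=list)

    def part_names(self) -> List[str]:
        return [p.name for p in self.parts]

bridge = BridgeSketch(
    name="Viaduct A",
    alignment=AlignmentSketch(stations=[0.0, 50.0, 100.0]),
    parts=[BridgePartSketch("abutment", 0.0),
           BridgePartSketch("deck", 50.0),
           BridgePartSketch("abutment", 100.0)],
)
print(bridge.part_names())
```

The key departure from building-centric IFC illustrated here is that parts are located by a linear measure along an alignment rather than by placement in a storey hierarchy.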

CivilEng ◽  
2021 ◽  
Vol 2 (1) ◽  
pp. 174-192
Author(s):  
Alcinia Zita Sampaio ◽  
Augusto Martins Gomes

The building information modelling (BIM) methodology supports collaborative work, based on the centralization of all information in a federated BIM model and on an efficient level of interoperability between BIM-based platforms. Concerning structural design, the interoperability of the most widely used software presents limitations that must be identified, and alternative solutions must be proposed. This study analyzes the transfer of structural models between modeling and structural analysis tools. Distinct building cases were modeled in order to identify the limitations arising in the transfer processes, considering two-way data flow between several software packages. The study involves the modeling software ArchiCAD 2020, Revit 2020, and AECOsim 2019 and the structural analysis tools SAP 2020, Robot 2020, and ETABS 2020. The transfer processes are carried out in two ways: using the native data format, and using a universal standard data transfer format, the Industry Foundation Classes (IFC). The level of maturity of BIM in structural design is still relatively low, caused essentially by interoperability problems; nevertheless, despite the limitations detected, this study shows, through the development of several building cases, that the methodology has clear advantages in the development of the structural project.
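A common first check in transfer studies like this one is to compare what entity types survive an IFC export and import. As a minimal sketch, the snippet below tallies entity types in a hand-written fragment of the IFC STEP (SPF) serialization; the instance IDs and attributes are invented, not a real export from any of the tools named above.

```python
import re
from collections import Counter

# A tiny, hand-written fragment in IFC STEP (SPF) form; attributes are
# collapsed to '$' placeholders for brevity.
ifc_fragment = """
#1=IFCPROJECT('2O2Fr$t4X7Zf8NOew3FLOH',$,'Demo',$,$,$,$,$,$);
#2=IFCBEAM('0jf0rYHfX3RAB3bSIRjmmy',$,'B-01',$,$,$,$,$,$);
#3=IFCBEAM('1kT4aBcDeFgHiJkLmNoPqR',$,'B-02',$,$,$,$,$,$);
#4=IFCCOLUMN('2xAaBbCcDdEeFfGgHhIiJ',$,'C-01',$,$,$,$,$,$);
"""

def entity_counts(spf_text: str) -> Counter:
    """Count occurrences of each IFC entity type in SPF text."""
    return Counter(re.findall(r"=\s*(IFC[A-Z0-9]+)\s*\(", spf_text))

print(entity_counts(ifc_fragment))
```

Comparing such counts before export and after re-import gives a quick, if coarse, signal of which structural elements a given tool pair drops or remaps.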


2021 ◽  
Author(s):  
Olesya Yakovchuk ◽  
Jan Maik Wissing

The Atmospheric Ionization during Substorm Model (AISstorm) is the successor of the Atmospheric Ionization Module Osnabrück (AIMOS) and may therefore also be considered AIMOS 2.0 - AISstorm.

The overall structure was kept mostly unaltered and splits into an empirical model that determines the 2D precipitating particle flux and a numerical model that determines the ionization profile of single particles. The combination of these two yields a high-resolution 3D particle ionization pattern.

The internal structure of the model has been completely revised, the main aspects being: a) an internal magnetic coordinate system, b) inclusion of substorm characteristics, c) higher time resolution, d) higher spatial resolution, e) energy-specific separate handling of the drift loss cone, auroral precipitation and polar cap precipitation, partly even in separate coordinate systems, f) better MLT resolution, and g) coverage of a longer time period. All these tasks were accomplished while keeping the output data format identical, allowing an easy transition to the new version.
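The combination step described above, a 2D flux map crossed with a per-particle 1D ionization profile to give a 3D pattern, can be sketched as a simple outer product. The grid sizes and values below are invented for illustration; the real model's grids and units are not given in the abstract.

```python
# 2D precipitating particle flux per (lat, lon) cell (arbitrary units).
flux_2d = [[1.0, 2.0],
           [0.5, 0.0]]

# 1D ionization profile: ion pairs produced per altitude level, per particle.
profile_1d = [0.1, 0.3, 0.6]

# 3D ionization: scale the profile by the flux in each horizontal cell.
ionization_3d = [[[f * p for p in profile_1d] for f in row] for row in flux_2d]

print(ionization_3d[0][1])  # cell with flux 2.0 -> [0.2, 0.6, 1.2]
```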


Proceedings ◽  
2018 ◽  
Vol 2 (19) ◽  
pp. 1244
Author(s):  
Netzahualcoyotl Hernandez ◽  
Ian McChesney ◽  
Joe Rafferty ◽  
Chris Nugent ◽  
Jonathan Synnott ◽  
...  

The Open Data Initiative (ODI) has previously been proposed to facilitate the sharing of annotated datasets within the pervasive health care research community. This paper outlines the requirements for the ODI portal based on the ontological data model of the ODI and its typical usage scenarios. Within an action research framework, the paper outlines the ODI platform, the design of a prototype user interface for the purposes of initial evaluation, and its technical review by third-party researchers (n = 3). The main findings from the technical review were the need for a more flexible user interface to reflect the different experimental configurations in the research community, and provision for describing dataset usage and dissemination conditions. The technical review also identified the value of permitting datasets of variable quality, as noisy datasets are useful in the testing of activity recognition algorithms. Revisions to the ODI ontology and platform are proposed based on the findings from this study.


Author(s):  
Aatif Ahmad Khan ◽  
Sanjay Kumar Malik

Semantic search refers to a set of approaches that use Semantic Web technologies for information retrieval in order to make the process machine-understandable and fetch precise results. Knowledge bases (KBs) act as the backbone of semantic search approaches, providing machine-interpretable information for query processing and retrieval of results. These KBs include Resource Description Framework (RDF) datasets and populated ontologies. In this paper, an assessment is presented of the largest cross-domain KBs that are exploited in large-scale semantic search and are freely available on the Linked Open Data Cloud. Analysis of these datasets is a prerequisite for modeling effective semantic search approaches because of their suitability for particular applications. Only large-scale, cross-domain datasets with more than 10 million RDF triples are considered. A survey of the datasets' sizes in triple counts is presented, along with the triple data format(s) they support, which is quite significant for developing effective semantic search models.
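Since the survey measures dataset sizes in triple counts, here is a minimal sketch of how such a count works on N-Triples, the simplest RDF serialization. The subjects, predicates, and values are invented example IRIs, not drawn from the surveyed datasets.

```python
from collections import Counter

ntriples = """\
<http://example.org/Berlin> <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> <http://example.org/City> .
<http://example.org/Berlin> <http://example.org/population> "3644826" .
<http://example.org/Paris> <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> <http://example.org/City> .
"""

def predicate_counts(nt: str) -> Counter:
    """Tally triples per predicate in an N-Triples string."""
    counts = Counter()
    for line in nt.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        # N-Triples: one triple per line -> subject, predicate, rest.
        parts = line.split(None, 2)
        counts[parts[1]] += 1
    return counts

print(sum(predicate_counts(ntriples).values()))  # total triples: 3
```

Real cross-domain KBs replace this toy tally with reported dump sizes or SPARQL `COUNT` queries, but the unit of measurement, the triple, is the same.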


Em Questão ◽  
2020 ◽  
Vol 27 (1) ◽  
pp. 185-209
Author(s):  
Maria Lígia Triques ◽  
Ana Carolina Simionato Arakaki

With the purpose of abstractly representing relations and entities, data models stand out as best practice in information analysis and representation processes, specifically in the planning and development of interoperable and persistent systems. In this context, this paper presents a study of the Europeana Data Model (EDM), the data model developed by the Europeana platform on the basis of semantic technologies and Linked Open Data principles. Through qualitative, exploratory, bibliographic and documentary research, it discusses how the use of a data model such as the EDM allows the informational representation needs of cultural heritage data collections to be met in the Web environment. The objective of the study was thus to analyze the structuring and representation proposed by Europeana for cultural heritage collections. As a result, data modelling stands out as the process on which the EDM bases its operation, enabling informational needs to be addressed. Finally, it is concluded that the importance of data models such as the EDM lies in the possibility of capturing the semantic context to which a cultural heritage dataset belongs or is related, thereby guaranteeing the persistence of its concepts and relations in informational environments.
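The EDM's core pattern separates the described heritage object from its digital representations: an ore:Aggregation links an edm:ProvidedCHO to web resources and provider metadata. The sketch below shows that pattern as a JSON-LD-style dictionary; the IRIs and values are invented, and only the class and property names follow the EDM vocabulary.

```python
# One EDM aggregation, expressed as a JSON-LD-like Python dict.
aggregation = {
    "@id": "http://example.org/aggregation/123",
    "@type": "ore:Aggregation",
    "edm:aggregatedCHO": {               # the cultural heritage object itself
        "@id": "http://example.org/cho/123",
        "@type": "edm:ProvidedCHO",
        "dc:title": "Illustrative painting",
    },
    "edm:isShownAt": "http://example.org/view/123",  # web view of the object
    "edm:dataProvider": "Example Museum",
}
print(aggregation["@type"])
```

Keeping the object (edm:ProvidedCHO) distinct from its aggregation is what lets Europeana preserve the semantic context of a collection even when the digital surrogates change.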


2018 ◽  
Vol 7 (3.33) ◽  
pp. 225
Author(s):  
Hee-kyung Moon ◽  
Sung-kook Han ◽  
Chang-ho An

This paper describes a Linked Open Data (LOD) development system and its application to the medical information standard Observational Medical Outcomes Partnership (OMOP) Common Data Model (CDM). The OMOP CDM allows for the systematic analysis of disparate observational databases in individual hospitals. The paper describes an SII-based LOD instance development system that can generate an application-specific instance development system automatically. We therefore applied the OMOP CDM medical information standard to the LOD development system. As a result, it was confirmed that there is no problem in applying the LOD development system to the standardization of medical information.
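As a hypothetical sketch of the kind of table-to-LOD mapping such a system performs, the snippet below turns one OMOP CDM "person" record into triple-like statements. The base IRI and the minimal mapping rule are assumptions for illustration, not the paper's SII-based generator.

```python
# One row from the OMOP CDM "person" table, as a dict.
person = {"person_id": 1, "gender_concept_id": 8507, "year_of_birth": 1980}

BASE = "http://example.org/omop/"  # invented namespace

def record_to_triples(table: str, row: dict, key: str):
    """Mint a subject IRI from the primary key; emit one triple per column."""
    subject = f"<{BASE}{table}/{row[key]}>"
    return [f'{subject} <{BASE}{table}#{col}> "{val}" .'
            for col, val in row.items() if col != key]

triples = record_to_triples("person", person, "person_id")
print(len(triples))  # one triple per non-key column: 2
```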


Author(s):  
Peter C. G. Veenstra

The Pipeline Open Data Standard (PODS) Association develops and advances global pipeline data standards and best practices supporting data management and reporting for the oil and gas industry. This presentation provides an overview of the PODS Association and a detailed overview of the transformed PODS Pipeline Data Model resulting from the PODS Next Generation initiative. The Next Generation, or Next Gen, initiative is focused on a complete re-design and modernization of the PODS Pipeline Data Model. The re-design is driven by the objectives of the Association's 2016–2019 Strategic Plan and reflects nearly 20 years of PODS Pipeline Data Model implementation experience and lessons learned. The Next Gen Data Model is designed to be the system of record for pipeline centerlines and pressurized containment assets for the safe transport of product, allowing pipeline operators to:

• achieve greater agility to build and extend the data model;
• respond to new business requirements;
• interoperate through standard data models and a consistent application interface;
• share data within and between organizations using well-defined data exchange specifications;
• optimize performance for management of bulk loading, reroutes, inspection data and history.

The presentation will introduce the Next Gen Data Model's design principles and its conceptual, logical and physical structures, with a focus on transformational changes from prior versions of the Model. Support for multiple platforms, including but not limited to Esri ArcGIS, open-source GIS and relational database management systems, will be described. Alignment with Esri's ArcGIS Platform and ArcGIS for Pipeline Referencing (APR) will be a main topic of discussion, along with how PODS Next Gen can be leveraged to benefit pipeline integrity, risk assessment, reporting and data maintenance.
The end goal of a PODS implementation is the realization of data management efficiency and data transfer and exchange, making the operation of a pipeline safer and more cost-effective.
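Models like PODS treat the centerline as the system of record and position assets by linear referencing: events are stored as measures (stations) along the line and resolved to coordinates on demand. The sketch below shows that resolution step; the centerline vertices and station values are invented, and this is a planar simplification of what PODS/APR implementations do.

```python
import math

# Planar centerline vertices (e.g. metres in a projected CRS).
centerline = [(0.0, 0.0), (100.0, 0.0), (100.0, 50.0)]

def locate(measure: float, line=centerline):
    """Return the (x, y) point at a given measure along the polyline."""
    remaining = measure
    for (x1, y1), (x2, y2) in zip(line, line[1:]):
        seg = math.hypot(x2 - x1, y2 - y1)
        if remaining <= seg:
            t = remaining / seg  # fraction along this segment
            return (x1 + t * (x2 - x1), y1 + t * (y2 - y1))
        remaining -= seg
    return line[-1]  # measure beyond the end: clamp to the last vertex

print(locate(125.0))  # 25 units up the second segment: (100.0, 25.0)
```

Storing events as measures rather than coordinates is what lets a reroute update one centerline record while every inspection and integrity event re-resolves to the new geometry automatically.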
