IFC and Monitoring Database System Based on Graph Data Models

2021 ◽  
Vol 2021 ◽  
pp. 1-9
Author(s):  
Luka Gradišar ◽  
Matevž Dolenc

An efficient database management system that supports the integration and interoperability of different information models is a foundation on which the higher levels of cyber-physical systems are built. In this paper, we address the problem of integrating monitoring data with building information models through the use of a graph data management system and the IFC (Industry Foundation Classes) standard, to support the need for interoperability and collaborative work. The proposed workflow describes the conversion of IFC models into a graph database and their connection with sensor data, and it is validated using the example of a bridge monitoring system. The presented IFC and sensor graph data models are structurally flexible and scalable, meeting the challenges of smart cities and big data.
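The workflow's core step, mapping IFC entities to graph nodes and IFC relationships to labelled edges and then joining sensor readings into the same graph, can be sketched as follows. This is a minimal plain-Python stand-in for the graph database the paper targets; all element and sensor names are invented for illustration.

```python
# Minimal property-graph sketch: IFC entities become nodes, IFC
# relationships become labelled edges, and monitoring sensors are
# attached as nodes of their own. Illustrative only, not the paper's code.

class PropertyGraph:
    def __init__(self):
        self.nodes = {}   # node id -> attribute dict
        self.edges = []   # (source, target, relation label)

    def add_node(self, node_id, **attrs):
        self.nodes[node_id] = attrs

    def add_edge(self, src, dst, relation):
        self.edges.append((src, dst, relation))

    def neighbours(self, node_id, relation=None):
        """Targets of edges leaving node_id, optionally filtered by label."""
        return [dst for s, dst, r in self.edges
                if s == node_id and (relation is None or r == relation)]

g = PropertyGraph()

# Hypothetical bridge elements, as an IFC parser might extract them.
g.add_node("deck", ifc_type="IfcSlab")
g.add_node("girder_1", ifc_type="IfcBeam")
g.add_edge("deck", "girder_1", "IfcRelConnectsElements")

# Sensor data joined into the same graph structure.
g.add_node("strain_01", kind="sensor", value=412.5)
g.add_edge("strain_01", "girder_1", "monitors")
```

Because both the building model and the monitoring data live in one graph, queries such as "which sensors monitor elements connected to the deck" become simple edge traversals.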

Sensors ◽  
2020 ◽  
Vol 20 (20) ◽  
pp. 5771
Author(s):  
Zhansheng Liu ◽  
Anshan Zhang ◽  
Wensi Wang

With the development of the next generation of information technology, an increasing amount of attention is being paid to smart residential spaces, including smart cities, smart buildings, and smart homes. Building indoor safety intelligence is an important research topic. However, current indoor safety management methods cannot comprehensively analyse safety data, owing to a poor combination of safety management and building information. Additionally, the judgement of danger depends significantly on the experience of the safety management staff. In this study, digital twins (DTs) are introduced to building indoor safety management. A framework for an indoor safety management system based on DT is proposed that exploits the Internet of Things (IoT), building information modelling (BIM), the Internet, and support vector machines (SVMs) to improve the level of intelligence of building indoor safety management. A DT model (DTM) is developed using BIM integrated with operation information collected by IoT sensors. The trained SVM model is used to automatically obtain the types and levels of danger by processing the data in the DTM. The Internet serves as the medium for interactions between people and the system. A building in the bobsleigh and sled stadium for the Beijing Winter Olympics is considered as an example; the proposed system realises scene display of the operation status, danger warning and positioning, danger classification and level assessment, and danger-handling suggestions.
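The danger-assessment step can be sketched as follows. The abstract does not give the SVM's features or parameters, so a hand-set linear decision function of the form an SVM learns stands in for the trained model here; in practice the weights would come from training (e.g. with scikit-learn's `SVC`). All feature names, weights, and thresholds are illustrative.

```python
# Sketch: sensor readings from the digital twin model (DTM) become a
# feature vector, which a trained classifier maps to a danger level.
# A hand-set linear function stands in for the trained SVM.

# Hypothetical features: temperature (deg C), smoke density, occupancy.
WEIGHTS = [0.04, 2.5, 0.01]   # illustrative values, not learned ones
BIAS = -2.0

def danger_score(features):
    """Linear decision function f(x) = w.x + b, the form an SVM learns."""
    return sum(w * x for w, x in zip(WEIGHTS, features)) + BIAS

def classify(features):
    """Map the score to a danger level for the warning function."""
    s = danger_score(features)
    if s <= 0:
        return "safe"
    return "warning" if s < 1.0 else "alarm"

reading = {"temperature": 24.0, "smoke": 0.1, "occupancy": 40}
level = classify([reading["temperature"], reading["smoke"], reading["occupancy"]])
```

The level returned here would drive the system's danger warning, positioning, and handling-suggestion functions.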


2019 ◽  
Vol 9 (11) ◽  
pp. 2204 ◽  
Author(s):  
Ya-Qi Xiao ◽  
Sun-Wei Li ◽  
Zhen-Zhong Hu

In mechanical, electrical, and plumbing (MEP) systems, logic chains refer to the upstream and downstream connections between MEP components. Generating the logic chains of MEP systems can improve the efficiency of facility management (FM) activities, such as locating components and retrieving relevant maintenance information for prompt failure detection or for emergency responses. However, due to the amount of equipment and components in commercial MEP systems, manually creating such logic chains is tedious and error-prone work. This paper proposes an approach to semi-automatically generate the logic chains of MEP systems using building information models (BIMs). The approach consists of three steps: (1) parametric and nonparametric spatial topological analysis within MEP models to generate a connection table, (2) transformation of MEP systems and custom information requirements to generate pre-defined and user-defined identification rules, and (3) completion of the MEP model's logic chains based on a graph data structure. The approach was applied to a real-world project, which demonstrated that it was able to generate logic chains for 15 MEP systems with an average accuracy of over 80%.
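Step (3) can be sketched as a traversal over the connection table produced in step (1): starting from a source component, a breadth-first search collects every downstream component in the chain. The component names below are invented for illustration.

```python
# Sketch of logic-chain completion: the connection table from the spatial
# topological analysis is treated as a directed graph and traversed
# downstream from a source component. Component names are illustrative.
from collections import deque

# Connection table: (upstream component, downstream component).
connections = [
    ("pump_1", "valve_1"),
    ("valve_1", "pipe_1"),
    ("pipe_1", "ahu_1"),
    ("pipe_1", "ahu_2"),
]

def downstream_chain(source, connections):
    """Breadth-first traversal yielding every component fed by `source`."""
    adjacency = {}
    for up, down in connections:
        adjacency.setdefault(up, []).append(down)
    seen, queue, order = {source}, deque([source]), []
    while queue:
        node = queue.popleft()
        order.append(node)
        for nxt in adjacency.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return order

chain = downstream_chain("pump_1", connections)
```

The same traversal run in reverse (swapping edge directions) would locate everything upstream of a failed component, which is the FM use case the abstract describes.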


1979 ◽  
Vol 18 (04) ◽  
pp. 199-202 ◽  
Author(s):  
F. Lustman ◽  
P. Lanthier ◽  
D. Charbonneau

A patient-oriented data management system is described. The environment was cardiology, with a heavy emphasis on research, and the MEDIC system was designed to meet the day-to-day program needs. The data are organized in speciality files, with dynamic patient records composed of subrecords of different types. The schema is described by a data definition language. Application packages include data quality control, medical reporting, and general inquiry. After five years of extensive use in various clinical applications, its utility has been assessed, as has its low cost. The disadvantages, the main one being the multifile structure, can now be stated, as can its advantages, such as data independence and increased performance. Although the system is now partially outdated, the experience acquired with its use is very helpful in the selection process for a future database management system.


2018 ◽  
Vol 2 ◽  
pp. e26289
Author(s):  
Natasha Govender

The Durban Natural Science Museum (DNSM) is located in the city of Durban in KwaZulu-Natal province, South Africa. Its entomology collection is one of three main collections at the museum. The collection consists of 141,000 dried specimens and encompasses 25 of the 29 known insect orders. Most of the specimens originate from South Africa; however, a small percentage has international origins. Collection growth is perpetuated by field collection trips and donations. In the recent past, the DNSM was afforded the opportunity, through the South African National Research Foundation (NRF) via the Natural History Collections (NHC) Funding Instrument, to digitise insect type specimens and to move the entomology research database from Microsoft Access to the web-based data management system Specify 7. These developments have improved accessibility to the collection, especially for those who do not have direct contact with and access to the collection. In preparation for the migration to Specify 7, the specimen data were cleaned and standardised by means of an open-source online tool, OpenRefine. The tool enabled the analysis and correction of data using an automated process, which allowed for maximum productivity. Going forward, we will ensure that the errors encountered during the data cleaning process are not repeated. This will be achieved by training data capturers in correct formatting standards and by using pick lists in the new database management system to foster consistency. On-going collections care is a core component of the DNSM; however, a collections management policy is lacking, and such procedures therefore differ somewhat across the three core departments. In the entomology department, temperature and humidity monitoring and mould prevention, detection, and collection recovery occur regularly. Durban is a coastal city, and its characteristic high humidity is of great concern because it facilitates mould development on the specimens.
Regular monitoring procedures mitigate such outbreaks. The DNSM has joined South Africa's newly launched Natural Science Collections Facility (NSCF), a network of institutions that maintain zoological, botanical, and palaeontological collections. The NSCF, in consultation with institution representatives, has initiated the development of a collections management policy document, which the DNSM will adopt as one of its sub-policies once it has been passed. The Durban Natural Science Museum will continue to strive for international best practices in collections management.
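One of OpenRefine's standard cleaning operations is fingerprint clustering, which groups values that differ only in case, punctuation, or token order. The sketch below re-implements the fingerprint key in plain Python to show how near-duplicate values (for example, variant spellings of a collector's name) are detected; the sample names are invented.

```python
# Plain-Python sketch of OpenRefine's "fingerprint" key-collision method:
# trim, lowercase, strip punctuation, then sort and de-duplicate tokens.
# Values that share a fingerprint are candidates for merging.
import re
from collections import defaultdict

def fingerprint(value):
    """Normalisation key in the style of OpenRefine's fingerprint keyer."""
    tokens = re.sub(r"[^\w\s]", "", value.strip().lower()).split()
    return " ".join(sorted(set(tokens)))

def cluster(values):
    """Group values whose fingerprints collide; return only real clusters."""
    groups = defaultdict(list)
    for v in values:
        groups[fingerprint(v)].append(v)
    return [g for g in groups.values() if len(g) > 1]

# Hypothetical collector-name variants, as they might appear before cleaning.
names = ["Govender, N.", "N. Govender", "govender n", "Smith, J."]
dupes = cluster(names)
```

Each returned cluster can then be collapsed to a single standard form, which is the manual review step OpenRefine presents to the curator.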


Author(s):  
Mary Barkworth ◽  
Benjamin Brandt ◽  
Curtis Dyreson ◽  
Neil Cobb ◽  
Will Pearse

Symbiota, the most widely used biodiversity content management system in the United States, has helped mobilize over 35 million specimen records from over 750 natural history collections via 40+ separate installations. Most Symbiota records come from natural history collections, but some Symbiota instances also incorporate records from observations, images, publications, and gardens. Symbiota serves both as a data management system for entering, annotating, and cleaning occurrence data, images, and associated specimen data (e.g., genetic sequences, images, publications) and as a primary aggregator/publisher for data stored in any database system that can export to a comma-separated value (CSV) file. Symbiota integrates and displays data and images from many resources in multiple formats, some of which appeal primarily to researchers, others to land managers, educators, and the general public. After nearly 20 years, Symbiota is going through a major software revision through Symbiota2, a US National Science Foundation-funded project. The broad goals of Symbiota2 are to make it easier for developers to add new functionality, to improve usability, and to help site managers administer a site. Symbiota2 will have a plugin-based architecture that allows developers to encapsulate functionality in a plugin. Symbiota2 will improve usability by supporting off-line use, enabling WordPress (content-management system) integration, and providing a customizable user interface. Symbiota2 will help site managers by simplifying the installation and management of a site. The three-year project is ongoing, but so far we have created a Symbiota2 GitHub repository; a Docker image with all the necessary components for installing, configuring, and running Symbiota2; an object-relational mapping (ORM) of the tables in the database management system (DBMS); and web services to connect to the DBMS via the ORM. We used Doctrine 2 for the ORM and API Platform for the web services.
By the third quarter of 2019, we anticipate deploying the plugin framework to encourage developers to create new functionality for biodiversity content management.
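The CSV hand-off described above, where any source database that can emit a comma-separated file can publish through the aggregator, can be sketched with a few occurrence fields. The field names follow Darwin Core terms; the records, identifiers, and function are invented examples, not Symbiota's actual schema or API.

```python
# Sketch of exporting occurrence records to the kind of CSV file an
# aggregator can ingest. Field names follow Darwin Core; the data are
# invented examples.
import csv
import io

records = [
    {"occurrenceID": "occ-0001", "scientificName": "Apis mellifera",
     "eventDate": "2018-03-14"},
    {"occurrenceID": "occ-0002", "scientificName": "Danaus chrysippus",
     "eventDate": "2018-05-02"},
]

def export_csv(rows):
    """Serialise occurrence rows to CSV text with a Darwin Core header."""
    buf = io.StringIO()
    writer = csv.DictWriter(
        buf, fieldnames=["occurrenceID", "scientificName", "eventDate"])
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

payload = export_csv(records)
```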


Author(s):  
Chet Wood

An Object Database Management System (ODMS) can be a very useful component when developing applications for engineering and manufacturing. Choosing the right product requires a thorough analysis of the data requirements of one's application, and an equally thorough study of the characteristics of the vendor products. Over a period of about two years, data were gathered on the products of object-oriented database vendors and researchers. As an example of how to analyze database requirements, an overview of the requirements of our application is presented, followed by a tutorial on the elements and features provided by an ODMS. A brief description is given of each of about a dozen products. Finally, tables compare specific features of a number of these systems.


2021 ◽  
Vol 234 ◽  
pp. 00033
Author(s):  
Anass Majdoubi ◽  
Abdellatif El Abderrahmani ◽  
Rafik Lasri

The climatic atmosphere in which cattle live is an essential parameter of their environment because of its critical role in their productivity. An adapted cattle building must help mitigate the effects of climatic stress and allow the farmer to properly control the climatic atmosphere during the production cycle. The most important factors influencing the climatic atmosphere inside a cattle building are temperature, humidity, and greenhouse gas emissions. We propose a case study of a wireless sensor network model deployed on a cattle farm, in which each measurement node ("mote") collects environmental data (temperature, humidity, and gas emissions); to control the building's climate, these data are stored and managed in a remote database. We present HBase, a NoSQL, column-oriented database management system based on the concept of distributed storage, which provides real-time read/write access to data on the Hadoop HDFS file system. The storage results presented in this paper were obtained via Java code that connects to the HBase database in order to store the data received every second, via HTTP requests, from each node of the measurement system.
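The storage step can be sketched as follows. The paper's implementation is in Java; the Python sketch below shows one common row-key design for time-series data in HBase, combining the node id with a zero-padded timestamp so that rows scan in time order per node. The table, column-family, and node names are illustrative assumptions, not taken from the paper.

```python
# Sketch: turning one sensor reading into an HBase row. The row key is
# "node_id#timestamp" (zero-padded, so lexicographic order == time order
# per node); all measurements go into one column family. Names are
# illustrative.

COLUMN_FAMILY = b"env"

def to_hbase_row(node_id, ts_millis, temperature, humidity, gas):
    """Build the (row key, column map) pair for one environmental reading."""
    row_key = f"{node_id}#{ts_millis:013d}".encode()
    columns = {
        COLUMN_FAMILY + b":temperature": str(temperature).encode(),
        COLUMN_FAMILY + b":humidity": str(humidity).encode(),
        COLUMN_FAMILY + b":gas": str(gas).encode(),
    }
    return row_key, columns

row_key, columns = to_hbase_row("mote_03", 1609459200000, 18.4, 71.0, 412)

# Against a running cluster, this row could be written through the HBase
# Thrift gateway, e.g. with the happybase client (names are assumptions):
#   table = happybase.Connection("hbase-host").table("cattle_env")
#   table.put(row_key, columns)
```

Because HBase stores rows sorted by key, this key design makes "all readings from mote_03 over the last hour" a cheap contiguous scan rather than a full-table filter.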


2020 ◽  
Vol 3 (2) ◽  
pp. 01
Author(s):  
Arif Saripudin ◽  
Cecep Kurnia Sastradipraja ◽  
Falentino Sembiring

This research aims to analyze and design an administrative data management information system for LKP3I, where administration is currently handled manually using Microsoft Excel, which often leads to problems such as accumulating documents, difficulty finding data, and scattered course-participant data. The application development process uses the waterfall model and includes collecting data and information; at the analysis stage, an elicitation technique is applied, represented by system modeling with flow maps, context diagrams, and data flow diagrams. The web-based application system is built using VB.Net, with Microsoft Access as the database management system, and the system's feasibility is tested and evaluated using black-box testing and group discussions. Based on the results of the research and the system evaluation, the prototype of the LKP3I administrative data management system built through the elicitation approach received a fairly good assessment, supporting the study's hypothesis. Respondents can accept the resulting information system prototype, to be implemented with the specified improvements to the specifications and functions derived from user requirements.

