Asset logging in the energy sector: a scalable blockchain-based data platform

2021 ◽  
Vol 4 (S3) ◽  
Author(s):  
Alexander Djamali ◽  
Patrick Dossow ◽  
Michael Hinterstocker ◽  
Benjamin Schellinger ◽  
Johannes Sedlmeir ◽  
...  

Abstract Due to a steeply growing number of energy assets, the increasingly decentralized and segmented energy sector fuels the potential for new digital use cases. In this paper, we focus our attention on the application field of asset logging, which addresses the collection, documentation, and usage of relevant asset data for direct or later verification. We identified a number of promising use cases that have so far not been implemented, presumably due to the lack of a suitable technical infrastructure. Besides the high degree of complexity associated with the various stakeholders and the diversity of assets involved, the main challenge we found in asset logging use cases is to guarantee the tamper-resistance and integrity of the stored data while meeting scalability and cost requirements and protecting sensitive data. Against this backdrop, we present a blockchain-based platform and argue that it can meet all identified requirements. Our proposed technical solution hierarchically aggregates data in Merkle trees and leverages Merkle proofs for the efficient and privacy-preserving verification of data integrity, thereby ensuring scalability even for highly frequent data logging. By connecting all stakeholders and assets involved on the platform through bilateral and authenticated communication channels and adding a blockchain as a shared foundation of trust, we implement a wide range of asset logging use cases and provide the basis for leveraging platform effects in future use cases that build on verifiable data. Along with the technical aspects of our solution, we discuss the challenges of its practical implementation in the energy sector and the next steps for testing in a regulatory sandbox approach.
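
To make the aggregation idea concrete, here is a minimal sketch (our own illustration, not the authors' implementation) of how high-frequency log entries can be condensed into a single Merkle root, with a Merkle proof verifying one entry against that root; only the root would need to be anchored on-chain, and a proof reveals nothing about the other entries:

```python
# Minimal Merkle tree sketch: aggregate logged data points and verify a single
# entry via a Merkle proof, so only one root hash is anchored on the blockchain.
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves: list[bytes]) -> bytes:
    """Hash leaves pairwise, level by level, until one root remains."""
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:                      # duplicate last node on odd levels
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def merkle_proof(leaves: list[bytes], index: int) -> list[tuple[bytes, bool]]:
    """Sibling hashes (with left/right position) needed to recompute the root."""
    level, proof = [h(leaf) for leaf in leaves], []
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        sibling = index ^ 1                     # neighbour in the current pair
        proof.append((level[sibling], sibling < index))
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        index //= 2
    return proof

def verify(leaf: bytes, proof: list[tuple[bytes, bool]], root: bytes) -> bool:
    node = h(leaf)
    for sibling, is_left in proof:
        node = h(sibling + node) if is_left else h(node + sibling)
    return node == root

# Hypothetical meter readings as log entries:
readings = [b"meter-42:10:00:1.37kWh", b"meter-42:10:15:1.41kWh", b"meter-42:10:30:1.39kWh"]
root = merkle_root(readings)                    # only this digest goes on-chain
assert verify(readings[1], merkle_proof(readings, 1), root)
```

A proof contains only logarithmically many sibling hashes relative to the number of entries, which is what makes verification both scalable for highly frequent logging and privacy-preserving for the entries not being disclosed.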

2019 ◽  
pp. 40-46 ◽  
Author(s):  
V.V. Savchenko ◽  
A.V. Savchenko

We consider the task of automated quality control of sound recordings containing voice samples of individuals. It is shown that the most acute problem in this task is the small sample size. To overcome this problem, we propose a novel method of acoustic measurements based on the relative stability of the pitch frequency within a voice sample of short duration. An example of its practical implementation using inter-periodic accumulation of a speech signal is considered. An experimental study with specially developed software provides statistical estimates of the effectiveness of the proposed method in noisy environments. It is shown that this method rejects an audio recording as unsuitable for voice biometric identification with a probability of 0.95 or more at a signal-to-noise ratio below 15 dB. The obtained results are intended for use in developing new and modifying existing systems for the collection and automated quality control of biometric personal data. The article is intended for a wide range of specialists in the field of acoustic measurements and digital processing of speech signals, as well as for practitioners who organize the work of authorized organizations in preparing samples of biometric personal data for registration.
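
As an illustration of the underlying idea (the exact algorithm, including the inter-periodic accumulation step, is not detailed here, so the following is our own hedged sketch with hypothetical thresholds), a recording can be screened by estimating the pitch period frame by frame and rejecting the sample when the period is insufficiently stable:

```python
# Illustrative pitch-stability screening, NOT the authors' algorithm: estimate
# the pitch period per frame via autocorrelation and reject the recording when
# the period varies too much across frames.
import numpy as np

def pitch_period(frame: np.ndarray, fs: int, fmin=70.0, fmax=400.0) -> float:
    """Pitch period in seconds from the autocorrelation peak in the voiced range."""
    ac = np.correlate(frame, frame, mode="full")[len(frame) - 1:]   # lags 0..n-1
    lo, hi = int(fs / fmax), int(fs / fmin)
    return (lo + int(np.argmax(ac[lo:hi]))) / fs

def is_suitable(signal: np.ndarray, fs: int, frame_ms=40, max_rel_dev=0.05) -> bool:
    """Accept the sample only if the pitch period is stable across frames.
    max_rel_dev is a hypothetical tolerance, not a value from the paper."""
    n = int(fs * frame_ms / 1000)
    frames = [signal[i:i + n] for i in range(0, len(signal) - n + 1, n)]
    periods = np.array([pitch_period(f, fs) for f in frames])
    return periods.std() / periods.mean() <= max_rel_dev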


Sensors ◽  
2021 ◽  
Vol 21 (11) ◽  
pp. 3871
Author(s):  
Jiri Pokorny ◽  
Khanh Ma ◽  
Salwa Saafi ◽  
Jakub Frolka ◽  
Jose Villa ◽  
...  

Automated systems have been seamlessly integrated into several industries as part of their industrial automation processes. Employing automated systems, such as autonomous vehicles, allows industries to increase productivity, benefit from a wide range of technologies and capabilities, and improve workplace safety. So far, most existing systems utilize a single type of autonomous vehicle. In this work, we propose a collaboration of different types of unmanned vehicles in maritime offshore scenarios. By providing high capacity, extended coverage, and better quality of service, autonomous collaborative systems can enable emerging maritime use cases, such as remote monitoring and navigation assistance. Motivated by these potential benefits, we propose the deployment of an Unmanned Surface Vehicle (USV) and an Unmanned Aerial Vehicle (UAV) in an autonomous collaborative communication system. Specifically, we design high-speed, directional communication links between a terrestrial control station and the two unmanned vehicles. Using measurement and simulation results, we evaluate the performance of the designed links in different communication scenarios and show the benefits of employing multiple autonomous vehicles in the proposed communication system.
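
For intuition about how such directional links are dimensioned, a back-of-the-envelope link budget relates transmit power, antenna gains, and free-space path loss to received power; the sketch below uses purely hypothetical parameters, not the paper's measured values:

```python
# Toy link budget for a directional ground-to-UAV/USV link (hypothetical numbers,
# not the paper's setup): received power = EIRP + RX gain - free-space path loss.
import math

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB: 20*log10(4*pi*d*f/c)."""
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / 3e8)

def rx_power_dbm(tx_dbm, tx_gain_dbi, rx_gain_dbi, distance_m, freq_hz):
    return tx_dbm + tx_gain_dbi + rx_gain_dbi - fspl_db(distance_m, freq_hz)

# Hypothetical 60 GHz directional link over a 500 m offshore range:
p_rx = rx_power_dbm(tx_dbm=20, tx_gain_dbi=25, rx_gain_dbi=25,
                    distance_m=500, freq_hz=60e9)
print(f"received power ≈ {p_rx:.1f} dBm")   # compare against receiver sensitivity
```

Comparing the computed received power against the receiver sensitivity gives the fade margin available in each communication scenario.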


2004 ◽  
Vol 19 (2) ◽  
pp. 140-148 ◽  
Author(s):  
Kai Reimers

This case describes the experience of a wholly foreign-owned manufacturing company in Tianjin, China, regarding the use of its ERP system in its main functional departments: purchasing, production planning, sales/distribution, and finance. The company is part of a group which is a global leader in the manufacturing and distribution of mechanical devices, called gearboxes, that are needed to drive a wide range of facilities such as escalators and baggage conveyor belts in airports. It entered China in 1995, and the Tianjin manufacturing facility soon became a hub for the Asian market. The main challenge confronting the management team is to support the breakneck growth rate of this young company. The company's ERP system plays a crucial role in this task. However, it seems that middle managers frequently hit an invisible wall when trying to expand the use of the ERP system in order to cope with ever-increasing workloads and coordination tasks. This case serves to illustrate cultural issues implicated in the use of an enterprise-wide information system in a medium-sized company operating in an emerging market economy. In addition, issues of operations management, global management, and organizational behaviour are addressed.


2021 ◽  
Vol 1 (2) ◽  
pp. 27-33
Author(s):  
M.V. Lyashenko ◽  
◽  
V.V. Shekhovtsov ◽  
P.V. Potapov ◽  
A.I. Iskaliyev ◽  
...  

The pneumatic seat suspension is one of the most important, and in some situations one of the key, components of the vibration protection system for the human operator of a vehicle. At the present stage of scientific and technical activity, most developers place great emphasis on controlled seat suspension systems as the most promising systems. This article analyzes methods of controlling the elastic damping characteristics of the air suspension of a vehicle seat. Ten different and fairly well-known methods of changing the shape and parameters of elastic damping characteristics by means of electro-pneumatic valves, throttles, motors, additional cavities, auxiliary mechanisms, and other actuators were considered, and the advantages, limits of application, and disadvantages of each method were analyzed. Based on the results of the performed analytical procedure, as well as the recommendations known in the scientific and technical literature on improving the vibration-protective properties of suspension systems, the authors proposed and developed a new method for controlling the elastic damping characteristic, which is implemented in the proposed technical solution for the air suspension of a vehicle seat. The method is distinguished by implementing a cyclic, controlled exchange of the working fluid between the cavity of the pneumatic elastic element and the additional volume of the receiver on the compression and rebound strokes, forming an almost symmetric elastic damping characteristic, and by partial recuperation of vibrational energy through a pneumatic drive in the form of a rotary-type pneumatic motor. In addition, the method does not require an unregulated hydraulic shock absorber, while still providing improved vibration-proof properties of the air suspension of a vehicle seat over a wide range of operating influences.
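
As a rough illustration of why connecting an additional receiver volume reshapes the elastic characteristic, consider the deliberately simplified polytropic air-spring model below; all parameters are assumed values for illustration, not the authors' design:

```python
# Toy polytropic air-spring model (our own simplification, not the proposed
# solution): switching in the receiver volume enlarges the effective gas volume,
# flattening the force-displacement curve of the pneumatic elastic element.
import numpy as np

P0, AREA, N_POLY = 5e5, 0.01, 1.3      # assumed charge pressure [Pa], piston area [m^2], polytropic index
V_CAVITY, V_RECEIVER = 2e-3, 3e-3      # assumed cavity and receiver volumes [m^3]

def spring_force(x: np.ndarray, receiver_connected: bool) -> np.ndarray:
    """Net force of the air spring at displacement x (positive = compression)."""
    v0 = V_CAVITY + (V_RECEIVER if receiver_connected else 0.0)
    p = P0 * (v0 / (v0 - AREA * x)) ** N_POLY      # polytropic compression law
    return (p - 1e5) * AREA                        # force against atmospheric pressure

x = np.linspace(-0.05, 0.05, 5)
print(spring_force(x, receiver_connected=False))   # stiffer characteristic
print(spring_force(x, receiver_connected=True))    # softer, closer to symmetric
```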


Vestnik IGEU ◽  
2019 ◽  
pp. 58-66
Author(s):  
I.Yu. Dolgikh ◽  
M.G. Markov

A wide range of technological advantages of induction crucible melting furnaces makes their use relevant in various sectors of metallurgical production. However, the harsh operating conditions of the refractory lining of such furnaces make it necessary to constantly monitor its condition in order to extend the crucible life and prevent emergencies. Moreover, traditional methods based on the use of a bottom electrode and the indication of current leakage to earth do not provide a continuous display of the degree of lining destruction and make it possible to register only a critical level that requires an emergency shutdown and emptying of the furnace. This circumstance makes it necessary to develop and implement specialized electrical systems with a monitoring and control function that determines and visualizes the lining wear level and, if necessary, performs an emergency disconnection of the equipment from the power source. The developed complex is based on a microprocessor system that continuously measures the temperature at control points on the boundary between the bottom and crucible base layers and compares the obtained values with settings determined in advance on a two-dimensional axisymmetric model of the designed furnace by solving the stationary heat conduction equation at various levels of lining failure. We have developed the structure, scheme, and program for a microprocessor-based monitoring and emergency shutdown system for an induction furnace, as well as a mathematical model of the control object, which allows the temperature settings to be determined. The reliability of the results is confirmed by the applicability of the models to real objects and is verified by debugging the microprocessor part in the MPLab-Sim and Proteus programs. The obtained results can be used in the practical implementation of a monitoring and emergency shutdown system for induction melting furnaces, which increases the safety of their operation and extends the lining life due to timely repair.
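
The supervision logic itself reduces to comparing measured control-point temperatures against the precomputed settings (the model solves the stationary heat conduction equation, ∇·(λ∇T) = 0, for various wear levels). Below is a minimal sketch of this comparison with hypothetical thresholds, not values from the actual system:

```python
# Sketch of the threshold-comparison logic (an assumption about the described
# system, not the published firmware): temperatures at control points on the
# bottom/crucible boundary are compared against settings precomputed from the
# axisymmetric model, and the furnace is tripped when wear exceeds the limit.
SETTINGS_C = {"point_1": 240.0, "point_2": 255.0, "point_3": 270.0}  # hypothetical limits [degC]

def check_lining(readings_c: dict[str, float]) -> str:
    """Return 'ok', 'warn' (schedule repair) or 'trip' (emergency shutdown)."""
    worst = max(readings_c[p] - SETTINGS_C[p] for p in SETTINGS_C)
    if worst >= 0:
        return "trip"        # critical wear: disconnect furnace from the supply
    if worst >= -15.0:       # hypothetical pre-alarm margin
        return "warn"
    return "ok"

print(check_lining({"point_1": 231.0, "point_2": 258.0, "point_3": 250.0}))  # -> trip
```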


2019 ◽  
Vol 23 (1) ◽  
pp. 52-63 ◽  
Author(s):  
Elina Strade ◽  
Daina Kalnina

Abstract Pharmaceutical wastewater biological treatment plants are stressed by multi-component wastewater and unexpected variations in wastewater flow, composition, and toxicity. To avoid operational problems and reduced wastewater treatment efficiency, accurate monitoring of influent toxicity to activated sludge microorganisms is essential. This paper outlines how to predict highly toxic streams, which should be avoided, using measurements of biochemical oxygen demand (BOD) made over a wide range of initial concentrations. The results indicated that wastewater containing multivalent Al3+ cations showed a strong toxic effect on the activated sludge biocenosis irrespective of dilution, while the toxicity of phenol- and formaldehyde-containing wastewater decreased considerably with increasing dilution. Activated sludge microorganisms were not sensitive to wastewater containing sodium halide salts (NaCl, NaF) and showed a high treatment capacity for saline wastewater. Our findings confirm that combined indicators of contamination, such as chemical oxygen demand (COD), alone do not allow evaluation of the potential toxic influence of wastewater. The obtained results allow identification of key inhibitory substances in pharmaceutical wastewater and evaluation of the potential impact of new wastewater streams or increased loading on the biological treatment system. The proposed method is sensitive and cost-effective and has potential for practical implementation in multiproduct pharmaceutical wastewater biological treatment plants.
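
One way to formalise the screening rule described above (our own illustrative reading, with hypothetical numbers): compute respiration inhibition relative to an uninhibited control at each dilution, and flag a stream as highly toxic when inhibition persists even at high dilution:

```python
# Illustrative toxicity screening from BOD measurements at several dilutions
# (our own formalisation with invented numbers, not the paper's data).
def inhibition(bod_sample: float, bod_control: float) -> float:
    """Fraction of respiration lost relative to an uninhibited control."""
    return 1.0 - bod_sample / bod_control

def highly_toxic(bod_by_dilution: dict[float, float], bod_control: float,
                 threshold: float = 0.5) -> bool:
    """True if inhibition stays above the threshold even at the highest dilution,
    i.e. toxicity does not relax with dilution (threshold is hypothetical)."""
    most_dilute = max(bod_by_dilution)                      # e.g. 1:100
    return inhibition(bod_by_dilution[most_dilute], bod_control) > threshold

# Al3+-bearing stream: inhibition persists across dilutions -> avoid this stream.
print(highly_toxic({1: 20.0, 10: 35.0, 100: 60.0}, bod_control=180.0))  # True
```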


Author(s):  
Matt Woodburn ◽  
Gabriele Droege ◽  
Sharon Grant ◽  
Quentin Groom ◽  
Janeen Jones ◽  
...  

The utopian vision is of a future where a digital representation of each object in our collections is accessible through the internet and sustainably linked to other digital resources. This is a long-term goal, however, and in the meantime there is an urgent need to share data about our collections at a higher level with a range of stakeholders (Woodburn et al. 2020). To sustainably achieve this, and to aggregate this information across all natural science collections, the data need to be standardised (Johnston and Robinson 2002). To this end, the Biodiversity Information Standards (TDWG) Collection Descriptions (CD) Interest Group has developed a data standard for describing collections, which is approaching formal review for ratification as a new TDWG standard. It proposes 20 classes (Suppl. material 1) and over 100 properties that can be used to describe, categorise, quantify, link and track digital representations of natural science collections, from high-level approximations to detailed breakdowns depending on the purpose of a particular implementation. The wide range of use cases identified for representing collection description data means that a flexible approach to the standard and the underlying modelling concepts is essential. These are centered around the ‘ObjectGroup’ (Fig. 1), a class that may represent any group (of any size) of physical collection objects that share one or more common characteristics. This generic definition of the ‘collection’ in ‘collection descriptions’ is an important factor in making the standard flexible enough to support the breadth of use cases. For any use case or implementation, only a subset of classes and properties within the standard are likely to be relevant. In some cases, this subset may have little overlap with those selected for other use cases. This additional need for flexibility means that very few classes and properties, representing the core concepts, are proposed to be mandatory. Metrics, facts and narratives are represented in a normalised structure using an extended MeasurementOrFact class, so that these can be user-defined rather than constrained to a set identified by the standard. Finally, rather than a rigid underlying data model as part of the normative standard, documentation will be developed to provide guidance on how the classes in the standard may be related and quantified according to relational, dimensional and graph-like models. So, in summary, the standard has, by design, been made flexible enough to be used in a number of different ways. The corresponding risk is that it could be used in ways that may not deliver what is needed in terms of outputs, manageability and interoperability with other resources of collection-level or object-level data. To mitigate this, it is key for any new implementer of the standard to establish how it should be used in that particular instance, and define any necessary constraints within the wider scope of the standard and model. This is the concept of the ‘collection description scheme,’ a profile that defines elements such as: which classes and properties should be included, which should be mandatory, and which should be repeatable; which controlled vocabularies and hierarchies should be used to make the data interoperable; how the collections should be broken down into individual ObjectGroups and interlinked, and how the various classes should be related to each other.
Various factors might influence these decisions, including the types of information that are relevant to the use case, whether quantitative metrics need to be captured and aggregated across collection descriptions, and how many resources can be dedicated to amassing and maintaining the data. This process has particular relevance to the Distributed System of Scientific Collections (DiSSCo) consortium, the design of which incorporates use cases for storing, interlinking and reporting on the collections of its member institutions. These include helping users of the European Loans and Visits System (ELViS) (Islam 2020) to discover specimens for physical and digital loans by providing descriptions and breakdowns of the collections of holding institutions, and monitoring digitisation progress across European collections through a dynamic Collections Digitisation Dashboard. In addition, DiSSCo will be part of a global collections data ecosystem requiring interoperation with other infrastructures such as the GBIF (Global Biodiversity Information Facility) Registry of Scientific Collections, the CETAF (Consortium of European Taxonomic Facilities) Registry of Collections and Index Herbariorum. In this presentation, we will introduce the draft standard and discuss the process of defining new collection description schemes using the standard and data model, and focus on DiSSCo requirements as examples of real-world collection description use cases.
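
To illustrate the flavour of the model, the snippet below sketches a hypothetical ObjectGroup record with normalised MeasurementOrFact entries; the class names follow the draft standard as described above, but the individual property names are our own illustrative guesses, not the normative terms:

```python
# A hypothetical collection-description record. ObjectGroup and
# MeasurementOrFact are classes named by the draft standard; the property
# names below are illustrative assumptions, not the ratified vocabulary.
object_group = {
    "type": "ObjectGroup",
    "name": "Pinned Lepidoptera, Western Europe",
    "description": "Dry pinned butterfly and moth specimens",
    "measurementsOrFacts": [
        {   # quantitative metric in the normalised MeasurementOrFact structure
            "type": "MeasurementOrFact",
            "measurementType": "objectCount",
            "measurementValue": 120000,
            "measurementUnit": "specimens",
        },
        {   # user-defined fact rather than one fixed by the standard
            "type": "MeasurementOrFact",
            "measurementType": "digitisationProgress",
            "measurementValue": 0.35,
        },
    ],
}
```

A collection description scheme would then constrain which of these properties are mandatory or repeatable and which controlled vocabularies their values must come from.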


2019 ◽  
Author(s):  
Helmut Spengler ◽  
Claudia Lang ◽  
Tanmaya Mahapatra ◽  
Ingrid Gatz ◽  
Klaus A Kuhn ◽  
...  

BACKGROUND: Modern data-driven medical research provides new insights into the development and course of diseases and enables novel methods of clinical decision support. Clinical and translational data warehouses, such as Informatics for Integrating Biology and the Bedside (i2b2) and tranSMART, are important infrastructure components that provide users with unified access to the large heterogeneous data sets needed to realize this and support use cases such as cohort selection, hypothesis generation, and ad hoc data analysis.

OBJECTIVE: Often, different warehousing platforms are needed to support different use cases and different types of data. Moreover, to achieve an optimal data representation within the target systems, specific domain knowledge is needed when designing data-loading processes. Consequently, informaticians need to work closely with clinicians and researchers in short iterations. This is a challenging task as installing and maintaining warehousing platforms can be complex and time consuming. Furthermore, data loading typically requires significant effort in terms of data preprocessing, cleansing, and restructuring. The platform described in this study aims to address these challenges.

METHODS: We formulated system requirements to achieve agility in terms of platform management and data loading. The derived system architecture includes a cloud infrastructure with unified management interfaces for multiple warehouse platforms and a data-loading pipeline with a declarative configuration paradigm and meta-loading approach. The latter compiles data and configuration files into forms required by existing loading tools, thereby automating a wide range of data restructuring and cleansing tasks. We demonstrated the fulfillment of the requirements and the originality of our approach by an experimental evaluation and a comparison with previous work.

RESULTS: The platform supports both i2b2 and tranSMART with built-in security. Our experiments showed that the loading pipeline accepts input data that cannot be loaded with existing tools without preprocessing. Moreover, it lowered efforts significantly, reducing the size of configuration files required by factors of up to 22 for tranSMART and 1135 for i2b2. The time required to perform the compilation process was roughly equivalent to the time required for actual data loading. Comparison with other tools showed that our solution was the only tool fulfilling all requirements.

CONCLUSIONS: Our platform significantly reduces the efforts required for managing clinical and translational warehouses and for loading data in various formats and structures, such as complex entity-attribute-value structures often found in laboratory data. Moreover, it facilitates the iterative refinement of data representations in the target platforms, as the required configuration files are very compact. The quantitative measurements presented are consistent with our experiences of significantly reduced efforts for building warehousing platforms in close cooperation with medical researchers. Both the cloud-based hosting infrastructure and the data-loading pipeline are available to the community as open source software with comprehensive documentation.
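
To convey the meta-loading idea, the toy sketch below (our own reading of the approach, not the published pipeline) shows how a compact declarative configuration can be "compiled" by pivoting entity-attribute-value (EAV) input rows into the wide layout a conventional loading tool expects:

```python
# Toy meta-loading sketch: a compact declarative config drives the expansion of
# EAV rows into one wide CSV row per patient. The config keys and data are
# hypothetical, invented for illustration only.
import csv, io

CONFIG = {"patient_id": "id", "variables": ["creatinine", "glucose"]}  # hypothetical config

def compile_eav(eav_rows: list[dict]) -> str:
    """Pivot EAV rows into one CSV row per patient with one column per variable."""
    wide: dict[str, dict] = {}
    for r in eav_rows:
        wide.setdefault(r[CONFIG["patient_id"]], {})[r["attribute"]] = r["value"]
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=[CONFIG["patient_id"]] + CONFIG["variables"])
    writer.writeheader()
    for pid, values in wide.items():
        writer.writerow({CONFIG["patient_id"]: pid, **values})
    return out.getvalue()

rows = [{"id": "p1", "attribute": "creatinine", "value": "1.1"},
        {"id": "p1", "attribute": "glucose", "value": "92"}]
print(compile_eav(rows))   # id,creatinine,glucose / p1,1.1,92
```

The point of the pattern is that the configuration stays a few lines long while the generated loading artifacts can be arbitrarily verbose, which is consistent with the size reductions reported above.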


2018 ◽  
Vol 10 (2) ◽  
pp. 10-17
Author(s):  
Donna L. Hoffman ◽  
Thomas P. Novak

Abstract Up to now, IoT device adoption has happened mainly in the niche segments of technologically sophisticated upscale consumers and technology-focused DIYers. To reach a broader range of users, marketers must do a better job of understanding and offering the inherent value of smart products. Current marketing approaches are fragmented and tend to focus on individual products and single use cases; they may actually be underselling the consumer IoT. The mass-market consumer is not buying a platform or devices controlled by an algorithm; they are buying an experience. We need to ask in what ways consumers and devices will interact with each other to create the experience consumers actually seek. The main challenge, therefore, is to implement a bottom-up approach that encourages users to experiment with their devices and their interactions and to integrate their individual experiences into everyday routines.


One Ecosystem ◽  
2020 ◽  
Vol 5 ◽  
Author(s):  
Dirk Vrebos ◽  
Jan Staes ◽  
Steven Broekx ◽  
Leo de Nocker ◽  
Karen Gabriels ◽  
...  

Since the early 2000s, there have been substantial efforts to transform the concept of ecosystem services into practice. Spatial assessment tools are being developed to evaluate the impact of spatial planning on a wide range of ecosystem services. However, actual implementation in decision-making remains limited. Implementation can be improved by tools that are tailored to local conditions and provide accurate, meaningful information. Instead of a generic and widely applicable tool, we developed a regional, spatially explicit tool (ECOPLAN-SE) to analyse the impact of changes in land use on the delivery of 18 ecosystem services in Flanders (Belgium). The tool incorporates ecosystem services relevant to policy-makers and managers and makes use of detailed local data and knowledge. By providing an easy-to-use tool, including the required spatial geodatasets, the time investment and learning curve remain limited for the user. With this tool, the constraints on implementing ecosystem service assessments in local decision-making are drastically reduced. We believe that region-specific decision support systems, like ECOPLAN-SE, are indispensable intermediates between conceptual ecosystem service frameworks and practical implementation in planning processes.
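
Conceptually, a spatially explicit assessment of this kind scores each land-use cell for a given ecosystem service and aggregates over the study area; the toy sketch below (our own simplification with invented scores, not the ECOPLAN-SE model) compares totals before and after a land-use change:

```python
# Toy spatially-explicit ecosystem-service assessment: score each raster cell by
# land-use class for one service and compare totals across a land-use change.
# Classes and scores are hypothetical, invented for illustration.
import numpy as np

SERVICE_SCORE = {0: 0.0, 1: 0.2, 2: 0.9}   # hypothetical: 0=sealed, 1=cropland, 2=forest

def service_total(land_use: np.ndarray) -> float:
    """Sum per-cell scores for one ecosystem service over the study area."""
    return float(sum(SERVICE_SCORE[c] * np.count_nonzero(land_use == c)
                     for c in SERVICE_SCORE))

before = np.array([[1, 1], [2, 2]])
after  = np.array([[0, 1], [2, 2]])         # one cropland cell becomes sealed
print(service_total(before), "->", service_total(after))   # 2.2 -> 2.0
```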

