Automatic generation of digital twin industrial system from a high level specification

2019, Vol 38, pp. 1095-1102
Author(s): Julio Garrido Campos, Juan Sáez López, José Ignacio Armesto Quiroga, Angel Manuel Espada Seoane
2010, Vol 29 (4), pp. 171
Author(s): Alessio Malizia, Paolo Bottoni, S. Levialdi

The design and development of a digital library involves different stakeholders, such as information architects, librarians, and domain experts, who need to agree on a common language to describe, discuss, and negotiate the services the library has to offer. To this end, high-level, language-neutral models have to be devised. Metamodeling techniques favor the definition of domain-specific visual languages through which stakeholders can share their views and directly manipulate representations of the domain entities. This paper describes CRADLE (Cooperative-Relational Approach to Digital Library Environments), a metamodel-based framework and visual language for the definition of notions and services related to the development of digital libraries. A collection of tools allows the automatic generation of several services, defined with the CRADLE visual language, and of the graphical user interfaces providing final-user access to them. The effectiveness of the approach is illustrated by presenting digital libraries generated with CRADLE, while the CRADLE environment has been evaluated using the cognitive dimensions framework.
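As a rough illustration of the generative idea behind such metamodel-based frameworks (not the actual CRADLE toolchain), the sketch below derives a search service and a plain-text form description from a small declarative model of a digital library; the model classes, field names, and helper functions are invented for the example.

```python
# Hypothetical sketch of model-driven generation of a digital library service
# and its user-facing form. Not the CRADLE implementation.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Collection:
    name: str
    metadata_fields: List[str]

@dataclass
class LibraryModel:
    name: str
    collections: List[Collection] = field(default_factory=list)

def generate_search_services(model: LibraryModel) -> dict:
    """Generate one search function per collection declared in the model."""
    services = {}
    for coll in model.collections:
        def search(records, query, _fields=tuple(coll.metadata_fields)):
            # Match the query against every metadata field declared in the model.
            return [r for r in records
                    if any(query.lower() in str(r.get(f, "")).lower() for f in _fields)]
        services[f"search_{coll.name}"] = search
    return services

def render_form(coll: Collection) -> str:
    """Generate a plain-text sketch of the user-facing search form."""
    lines = [f"[{coll.name} search form]"]
    lines += [f"  {f}: [____________]" for f in coll.metadata_fields]
    return "\n".join(lines)

model = LibraryModel("DemoDL", [Collection("theses", ["title", "author", "year"])])
services = generate_search_services(model)
records = [{"title": "Visual languages", "author": "Levialdi", "year": 2010}]
print(services["search_theses"](records, "visual"))
print(render_form(model.collections[0]))
```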


Author(s): Maja Radović, Nenad Petrović, Milorad Tošić

The requirements of state-of-the-art curricula and teaching processes in medical education have brought both new assessment methods and improvements to existing ones. Recently, several promising methods have emerged, among them the Comprehensive Integrative Puzzle (CIP), which shows great potential. However, constructing such questions requires substantial effort from a team of experts and is time-consuming. Furthermore, although English is accepted as an international language, for educational purposes there is also a need to represent data and knowledge in the native language. In this paper, we present an approach for the automatic generation of CIP assessment questions that uses ontologies for knowledge representation. In this way, it is possible to provide multilingual support in the teaching and learning process, because the same ontological concept can be mapped to corresponding language expressions in different languages. The proposed approach shows promising results, indicated by a dramatic speed-up in the construction of CIP questions compared to manual methods. The results strongly indicate that adopting ontologies for knowledge representation may enable scalability in multilingual domain-specific education regardless of the language used. The high level of automation in the assessment process, demonstrated on the CIP method in medical education as one of the most challenging domains, promises high potential for new innovative teaching methodologies in other educational domains as well.
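To make the ontology-driven generation idea concrete, here is a minimal sketch, not the authors' system: a tiny in-memory "ontology" of diagnosis/finding relations with language-tagged labels is turned into a CIP-style matching grid. All concepts, labels, and category names are invented for the example.

```python
# Hypothetical sketch: assembling a Comprehensive Integrative Puzzle (CIP) grid
# from ontology-like relations, with multilingual labels per concept.
import random

# (diagnosis concept, category) -> finding concept
FINDINGS = {
    ("Pneumonia", "history"): "ProductiveCough",
    ("Pneumonia", "imaging"): "LobarConsolidation",
    ("Myocarditis", "history"): "ChestPainAfterInfection",
    ("Myocarditis", "imaging"): "ReducedEjectionFraction",
}

# concept -> {language: label}; one concept maps to expressions in several languages
LABELS = {
    "Pneumonia": {"en": "Pneumonia", "sr": "Pneumonija"},
    "Myocarditis": {"en": "Myocarditis", "sr": "Miokarditis"},
    "ProductiveCough": {"en": "Productive cough", "sr": "Produktivan kašalj"},
    "LobarConsolidation": {"en": "Lobar consolidation", "sr": "Lobarna konsolidacija"},
    "ChestPainAfterInfection": {"en": "Chest pain after infection",
                                "sr": "Bol u grudima nakon infekcije"},
    "ReducedEjectionFraction": {"en": "Reduced ejection fraction",
                                "sr": "Smanjena ejekciona frakcija"},
}

def generate_cip(diagnoses, categories, lang="en", seed=0):
    """Build a CIP matching grid: rows are diagnoses, columns are categories.
    Each column's cells are shuffled so the student must re-match them."""
    rng = random.Random(seed)
    grid = {"rows": [LABELS[d][lang] for d in diagnoses], "columns": {}}
    for cat in categories:
        cells = [LABELS[FINDINGS[(d, cat)]][lang] for d in diagnoses]
        rng.shuffle(cells)
        grid["columns"][cat] = cells
    return grid

# The same call with lang="en" or lang="sr" yields the puzzle in either language.
print(generate_cip(["Pneumonia", "Myocarditis"], ["history", "imaging"], lang="sr"))
```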


2000, Vol 33 (20), pp. 339-344
Author(s): J. Garrido, R. Marín, J.I. Armesto, J. Sáez

2020, Vol 10 (19), pp. 6959
Author(s): Seppo Sierla, Lotta Sorsamäki, Mohammad Azangoo, Antti Villberg, Eemeli Hytönen, ...

Researchers have proposed various models for assessing design alternatives for process plant retrofits. Due to the considerable engineering effort involved, no such models exist for the great majority of brownfield process plants, which have been in operation for years or decades. This article proposes a semi-automatic methodology for generating a digital twin of a brownfield plant. The methodology consists of: (1) extracting information from piping and instrumentation diagrams, (2) converting the information to a graph format, (3) applying graph algorithms to preprocess the graph, (4) generating a simulation model from the graph, (5) performing manual expert editing of the generated model, (6) configuring the calculations done by simulation model elements, and (7) parameterizing the simulation model according to recent process measurements in order to obtain a digital twin. Since previous work exists for steps (1–2), this article focuses on defining the methodology for steps (3–5) and demonstrating it on a laboratory process. A discussion is provided for steps (6–7). The result of the case study was that only a few manual edits needed to be made to the automatically generated simulation model. The paper concludes with an assessment of open issues and topics for further research for this 7-step methodology.
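A minimal sketch of what steps (3) and (4) could look like, assuming the output of steps (1–2) is available as a directed graph; this is not the authors' toolchain, and the node names and attribute keys below are assumptions made for the example.

```python
# Hypothetical sketch of graph preprocessing (step 3) and simulation model
# generation (step 4) from a P&ID-derived graph.
import networkx as nx

# Assumed output of steps (1-2): a directed graph of P&ID items with a "type" attribute.
g = nx.DiGraph()
g.add_node("TANK-01", type="vessel")
g.add_node("PI-101", type="instrument")   # pressure indicator, no process-flow role
g.add_node("PUMP-01", type="pump")
g.add_node("HX-01", type="heat_exchanger")
g.add_edges_from([("TANK-01", "PUMP-01"), ("PUMP-01", "HX-01"),
                  ("PI-101", "PUMP-01")])

# Step (3): preprocessing - keep only equipment relevant to the process flow model,
# dropping instrumentation nodes that have no role in the simulation.
process_graph = g.subgraph(
    n for n, d in g.nodes(data=True) if d["type"] != "instrument"
).copy()

# Step (4): emit simulation model elements in flow order, so each unit is
# instantiated after the units feeding into it.
for name in nx.topological_sort(process_graph):
    unit_type = process_graph.nodes[name]["type"]
    inlets = list(process_graph.predecessors(name))
    print(f"create {unit_type} '{name}' with inlets {inlets}")
```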


VLSI Design, 2012, Vol 2012, pp. 1-14
Author(s): Khaled Jerbi, Mickaël Raulet, Olivier Déforges, Mohamed Abid

In this paper, we introduce the Reconfigurable Video Coding (RVC) standard, which is based on the idea that video processing algorithms can be defined as a library of components that can be updated and standardized separately. The MPEG RVC framework aims at providing a unified high-level specification of current MPEG coding technologies using a dataflow language called CAL Actor Language (CAL). CAL is associated with a set of tools to design dataflow applications and to generate hardware and software implementations. Before this work, the existing CAL hardware compilers did not support high-level features of CAL. After presenting the main notions of the RVC standard, this paper introduces an automatic transformation process that analyses the non-compliant features and makes the required changes in the intermediate representation of the compiler while keeping the same behavior. Finally, the implementation results of the transformation on video and still-image decoders are summarized. We show that the obtained results largely satisfy the real-time constraints for an embedded design on FPGA, as we obtain a throughput of 73 FPS for the MPEG-4 decoder and 34 FPS for the coding and decoding process of the LAR coder using a video of CIF image size. This work resolves the main limitation of hardware generation from CAL designs.
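The transformation idea, a behavior-preserving rewrite of unsupported constructs inside the compiler's intermediate representation, can be illustrated with a generic sketch; the node classes and the specific lowering below are invented and do not reflect the actual CAL compiler.

```python
# Hypothetical sketch of an IR rewrite pass that lowers a high-level construct
# into primitives a hardware backend supports, preserving behavior.
from dataclasses import dataclass
from typing import List

class Node:
    pass

@dataclass
class Assign(Node):
    target: str
    expr: str

@dataclass
class Foreach(Node):       # high-level construct the backend cannot handle
    var: str
    source: str
    body: List[Node]

@dataclass
class While(Node):         # low-level construct the backend supports
    cond: str
    body: List[Node]

def lower(node: Node) -> List[Node]:
    """Rewrite pass: replace each Foreach with an equivalent indexed While loop."""
    if isinstance(node, Foreach):
        i = f"_idx_{node.var}"
        inner = [Assign(node.var, f"{node.source}[{i}]")]
        for child in node.body:
            inner.extend(lower(child))
        inner.append(Assign(i, f"{i} + 1"))
        return [Assign(i, "0"), While(f"{i} < length({node.source})", inner)]
    return [node]

ir = [Foreach("pixel", "block", [Assign("acc", "acc + pixel")])]
lowered = [n for node in ir for n in lower(node)]
print(lowered)
```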


2018
Author(s): Mauricio Toro

Constraint Satisfaction Problems (CSPs) in computer music are used to solve harmonic, rhythmic, or melodic problems. In addition, they can be used for the automatic generation of musical structures satisfying a set of rules, for instance the CSP proposed by composer Michael Jarrell, which we explain in this document and whose implementation we detail. Usually, a CSP is represented by a script defining the variables, their domains, and the constraints. Instead of writing a script, in Gelisp for OpenMusic (OM) we represent a program with a special patch. A patch is a visual algorithm in which boxes represent functional calls and connections are functional compositions. Inside this CSP patch, we can place special boxes: to connect each constraint in the CSP, to define variable and value heuristics, to define a time limit for the search, to connect the list of variables that we want to observe, and a box to connect the variable to be the optimization criterion during the search. Furthermore, we provide a variety of boxes to represent simple constraints (e.g., a = b and a < 2) and high-level constraints (e.g., "the motive A occurs n times in the sequence S"). The output of a CSP patch can be connected to three different kinds of boxes: to find one solution, to find all the solutions, and to perform propagation (narrowing the domains of the variables) without search.
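As a minimal sketch of such a musical CSP outside Gelisp/OpenMusic, the following backtracking search generates a short note sequence that satisfies a simple constraint on adjacent intervals and a high-level constraint of the form "the motive A occurs n times in the sequence S"; the solver, domain, and constraints are invented for the example.

```python
# Hypothetical sketch of a melodic CSP solved by plain backtracking search.
LENGTH, DOMAIN = 6, range(5)          # 6 notes, pitch values 0..4
MOTIVE, OCCURRENCES = (0, 2), 2       # the motive and how often it must occur

def step_ok(seq):
    """Simple constraint: adjacent notes differ by at most 2 semitones."""
    return all(abs(a - b) <= 2 for a, b in zip(seq, seq[1:]))

def motive_count(seq, motive):
    """High-level constraint helper: count occurrences of a contiguous motive."""
    m = len(motive)
    return sum(tuple(seq[i:i + m]) == motive for i in range(len(seq) - m + 1))

def solve(partial=()):
    """Backtracking search over note sequences, pruning on the simple constraint."""
    if len(partial) == LENGTH:
        if motive_count(partial, MOTIVE) == OCCURRENCES:
            yield partial
        return
    for note in DOMAIN:
        candidate = partial + (note,)
        if step_ok(candidate):
            yield from solve(candidate)

print("one solution:", next(solve(), None))       # analogous to a "find one" box
print("number of solutions:", sum(1 for _ in solve()))   # analogous to "find all"
```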


Author(s): Andrey Morozov, Mihai A. Diaconeasa, Mikael Steurer

Abstract: Advanced classical Probabilistic Risk Assessment (PRA) effectively combines various methods for quantitative risk evaluation, such as event trees, fault trees, and Bayesian networks. PRA methods and tools provide the means for qualitative reliability evaluation (e.g., cut sets) and for the computation of quantitative reliability metrics (e.g., end-state probabilities). Modern safety-critical systems from various industrial domains tend toward a high level of autonomy and demand not only reliability but also resilience, the ability to recover from degraded or failed states. The numerical resilience analysis of such dynamic systems requires more flexible methods that enable the analysis of systems with sophisticated software parts and dynamic feedback loops. A suitable candidate is the Dual-graph Error Propagation Model (DEPM), which can capture nontrivial failure scenarios and dynamic fault-tolerance mechanisms. The DEPM exploits the automatic generation of Markov chain models and the application of probabilistic model checking techniques. Moreover, the DEPM enables the analysis of highly customizable system resilience metrics, e.g., "the probability of system recovery to a particular state after a specified system failure during a defined time interval." In this paper, we show how DEPM-based resilience analysis can be integrated with the general PRA methodology for resilience evaluations. The proposed methodology is demonstrated on a safety-critical autonomous UAV system.
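The quoted kind of resilience metric can be sketched on a toy discrete-time Markov chain; this is not the DEPM tool, and the states and transition probabilities below are invented for the example.

```python
# Hypothetical sketch: "probability of recovery to the operational state within
# T steps after a failure", evaluated on a small discrete-time Markov chain.
import numpy as np

states = ["operational", "degraded", "failed"]
# Row-stochastic transition matrix: P[i][j] = P(next = j | current = i)
P = np.array([
    [0.98, 0.015, 0.005],   # operational
    [0.60, 0.30,  0.10],    # degraded: recovery mechanism active
    [0.20, 0.30,  0.50],    # failed: repair may bring the system back
])

def prob_reach_within(P, start, target, horizon):
    """P(reach `target` at least once within `horizon` steps, starting in `start`).
    Computed by making the target state absorbing and propagating the distribution."""
    Q = P.copy()
    Q[target, :] = 0.0
    Q[target, target] = 1.0          # absorb probability mass in the target state
    dist = np.zeros(len(P))
    dist[start] = 1.0
    for _ in range(horizon):
        dist = dist @ Q
    return dist[target]

p = prob_reach_within(P, states.index("failed"), states.index("operational"), horizon=10)
print(f"P(recover within 10 steps after failure) = {p:.4f}")
```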

