Beyond building predictive models: TwinOps in biomanufacturing

2021 ◽
Author(s):  
Elisa Canzani ◽  
Sander W. Timmer

As more and more manufacturers embrace the mission to build digital twins, the biopharmaceutical industry likewise envisions a significant paradigm shift of digitalisation towards an intelligent factory where bioprocesses continuously learn from data to optimise and control productivity. While extensive efforts have been made to build and combine the best mechanistic and data-driven models, there has not yet been a complete digital twin application in pharma. One of the main reasons is that production deployment is complicated by the possible impact such digital technologies could have on vaccine products and ultimately on patients. To address current technical challenges and fill regulatory gaps, this paper explores best practices for TwinOps in biomanufacturing – from experiment to GxP validation – and discusses approaches to oversight and compliance that could work with these best practices towards building bioprocess digital twins at scale.
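The paper's own best practices are not reproduced in this listing, but its "from experiment to GxP validation" theme can be illustrated with a hypothetical TwinOps release gate: a twin model is promoted only if it meets predefined acceptance criteria, and every decision is appended to an audit trail. The thresholds, metric names, and audit format below are assumptions for illustration, not the authors' procedure.

```python
# Hypothetical TwinOps-style release gate (illustrative only): promote a twin
# model to production only if it meets predefined acceptance criteria, and
# record every decision in an append-only audit trail for GxP-style review.
import json
import time

ACCEPTANCE = {"max_rmse": 0.05, "min_coverage": 0.95}  # assumed criteria

def validate_and_release(model_id: str, metrics: dict, audit_path: str) -> bool:
    passed = (metrics["rmse"] <= ACCEPTANCE["max_rmse"]
              and metrics["coverage"] >= ACCEPTANCE["min_coverage"])
    record = {"model": model_id, "metrics": metrics,
              "criteria": ACCEPTANCE, "released": passed,
              "timestamp": time.time()}
    with open(audit_path, "a") as f:   # append-only audit log
        f.write(json.dumps(record) + "\n")
    return passed

# Hypothetical model id and metrics:
ok = validate_and_release("bioprocess-twin-v3",
                          {"rmse": 0.04, "coverage": 0.97}, "audit.log")
print("released" if ok else "held back")
```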


2020 ◽  
Vol 1 (2) ◽  
pp. 12-24
Author(s):  
Paolo Bongarzoni

As automation increasingly influences businesses, digitalization technologies and tools such as artificial intelligence and machine learning become essential to support the definition and implementation of strategy activities aimed at improving businesses' competitiveness in the digital, cloud-based, and data-driven world. Since this business growth corresponds to an enormous increase in data volumes, it is fundamental for businesses to adopt several digital solutions in their strategy process, together with a tailored digital strategy embedded in their strategic plan. The purpose of this article is to critically analyse the latest trends and needs of the classic strategy activities and how they could be properly addressed by the available digital technologies. Finally, for every activity, some best-practice tools and software supported by management consultants are mentioned, since they deliver a high return on investment in terms of time savings, fewer dedicated resources, and improved business performance.


Processes ◽  
2020 ◽  
Vol 8 (4) ◽  
pp. 431
Author(s):  
Alexios Papacharalampopoulos

System identification has been a major advancement in the evolution of engineering. As it is by default the first step towards a significant set of adaptive control techniques, it is imperative for engineers to apply it in order to practice control. Given that system identification can be useful in creating a digital twin, this work focuses on the initial stage of the procedure by discussing simple system-order identification. Through specific numerical examples, this study investigates the most “natural” method for estimating the order from responses in a convenient and seamless way in the time domain. The method itself, originally proposed by Ho and Kalman and utilizing linear algebra, is an intuitive tool that retrieves information from the data themselves. Finally, in light of the method's limitations, the potential future outlook is discussed, under the prism of forming a digital twin.
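The Ho-Kalman idea can be sketched briefly: stack the impulse-response (Markov) parameters into a Hankel matrix and read the system order off its numerical rank. A minimal Python sketch under that assumption follows; the matrix dimensions and tolerance are illustrative choices, not values from the paper.

```python
import numpy as np

def estimate_order(markov_params, rows=10, cols=10, tol=1e-6):
    """Estimate system order as the numerical rank of the Hankel matrix
    built from impulse-response (Markov) parameters, per Ho-Kalman."""
    H = np.array([[markov_params[i + j] for j in range(cols)]
                  for i in range(rows)])
    s = np.linalg.svd(H, compute_uv=False)
    # Singular values above the noise floor count towards the order.
    return int(np.sum(s > tol * s[0]))

# Example: impulse response of a known 2nd-order system, y_k = C A^k B.
A = np.array([[0.9, 0.2], [0.0, 0.7]])
B = np.array([[1.0], [1.0]])
C = np.array([[1.0, 0.0]])
h = [(C @ np.linalg.matrix_power(A, k) @ B).item() for k in range(25)]
print(estimate_order(h))  # -> 2
```

With noisy responses the singular values no longer drop cleanly to zero, so the threshold choice becomes delicate, which is one of the practical limitations such methods face.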


Author(s):  
Linyu Lin ◽  
Paridhi Athe ◽  
Pascal Rouxelin ◽  
Nam Dinh ◽  
Jeffrey Lane

Abstract: In this work, a Nearly Autonomous Management and Control (NAMAC) system is designed to diagnose the reactor state and provide recommendations to the operator for maintaining the safety and performance of the reactor. A three-layer hierarchical workflow is suggested to guide the design and development of the NAMAC system; the three layers correspond to the knowledge base, the digital twin developmental layer (for different NAMAC functions), and the NAMAC operational layer. The digital twin in NAMAC is described as a knowledge acquisition system that supports different autonomous control functions. Therefore, based on the knowledge base, a set of digital twin models is trained to determine the plant state, predict the behavior of physical components or systems, and rank available control options. The trained digital twin models are assembled according to the NAMAC operational workflow to support the decision-making process in selecting the optimal control actions during an accident scenario. To demonstrate the capability of the NAMAC system, a case study is designed in which a baseline NAMAC is implemented for operating a simulator of the Experimental Breeder Reactor II (EBR-II) during a single loss-of-flow accident. The training database for developing the digital twin models is obtained by sampling the control parameters in the GOTHIC data generation engine. After training and testing, the digital twins are assembled into a NAMAC system according to the operational workflow. This NAMAC system is coupled with the GOTHIC plant simulator, and a confusion matrix is generated to illustrate the accuracy and robustness of the implemented NAMAC system. It is found that within the training databases, NAMAC can make reasonable recommendations with a zero confusion rate. However, when the scenario goes beyond the training cases, the confusion rate increases, especially for more severe scenarios. Therefore, a discrepancy checker is added to detect unexpected reactor states and alert operators to take safety-minded actions.
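The abstract does not give the discrepancy checker's implementation; one simple way to realize the idea is an envelope test that flags states outside the range of the training database. The sketch below assumes that form; the state variables and margin are hypothetical.

```python
import numpy as np

class DiscrepancyChecker:
    """Flags plant states outside the envelope of the training database so
    the operator can be alerted to fall back on safety-minded actions."""
    def __init__(self, training_states, margin=0.05):
        self.lo = training_states.min(axis=0)
        self.hi = training_states.max(axis=0)
        self.pad = margin * (self.hi - self.lo)   # tolerance band

    def out_of_envelope(self, state):
        return bool(np.any(state < self.lo - self.pad) or
                    np.any(state > self.hi + self.pad))

# Hypothetical training states: columns are, e.g., flow rate and temperature.
train = np.array([[0.8, 600.0], [1.0, 620.0], [0.9, 610.0]])
checker = DiscrepancyChecker(train)
print(checker.out_of_envelope(np.array([0.5, 700.0])))  # True -> alert operator
```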


Symmetry ◽  
2021 ◽  
Vol 13 (9) ◽  
pp. 1717
Author(s):  
Lei Wu ◽  
Jiewu Leng ◽  
Bingfeng Ju

Ultra-Precision Machining (UPM) is a highly accurate processing technology developed to satisfy the manufacturing requirements of high-end, cutting-edge products, including nuclear energy producers, very large-scale integrated circuits, lasers, and aircraft. The information asymmetry phenomenon widely exists in the design and control of ultra-precision machining. It may lead to inconsistency between the designed performance and the operational performance of UPM equipment in stiffness, thermal stability, and motion accuracy, which results from the equipment's design, manufacturing, and control and determines the form accuracy and surface roughness of machined parts. The performance of UPM equipment should be improved continuously, yet realizing real-time and self-adaptive control remains challenging; building a high-fidelity and computationally efficient digital twin is a valuable solution. Nevertheless, the incorporation of digital twin technology into UPM design and control remains vague and sometimes contradictory. Based on a literature search in the Google Scholar database, the critical issues in UPM design and control, and how digital twin technologies can address them, are reviewed. Firstly, digital-twin-based UPM design is reviewed, including bearing module design, spindle-drive module design, stage system module design, servo module design, and clamping module design. Secondly, digital-twin-based UPM control studies are reviewed, including voxel modeling, process planning, process monitoring, vibration control, and quality prediction. The key enabling technologies and research directions of digital-twin-based design and control are discussed to deal with the information asymmetry phenomenon in UPM.


2020 ◽  
Vol 10 (13) ◽  
pp. 4482 ◽  
Author(s):  
Adrien Bécue ◽  
Eva Maia ◽  
Linda Feeken ◽  
Philipp Borchers ◽  
Isabel Praça

In the context of Industry 4.0, growing use is being made of simulation-based decision-support tools commonly named Digital Twins. Digital Twins are replicas of physical manufacturing assets, providing means for the monitoring and control of individual assets. Although extensive research on Digital Twins and their applications has been carried out, the majority of existing approaches are asset specific: little consideration is given to human factors, and interdependencies between different production assets are commonly ignored. In this paper, we address those limitations and propose innovations in cognitive modeling and co-simulation that may unleash novel uses of Digital Twins in Factories of the Future. We introduce a holistic Digital Twin approach, in which the factory is represented not by a set of separate Digital Twins but by a comprehensive modeling and simulation capacity embracing the full manufacturing process, including external network dependencies. Furthermore, we introduce novel approaches for integrating models of human behavior and capacities for security testing with Digital Twins, and we show how the holistic Digital Twin can enable new services for the optimization and resilience of Factories of the Future. To illustrate this approach, we introduce a specific use case implemented in the field of Aerospace System Manufacturing.
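As a toy illustration of the co-simulation idea (not the paper's implementation), the sketch below advances an asset twin and a human-behavior model in lock-step so that their interdependency, here an operator's pace feeding a machine, is captured by a master loop. All dynamics and names are assumptions.

```python
class MachineTwin:
    """Asset twin: accumulates parts at whatever pace the operator sets."""
    def __init__(self):
        self.total_parts = 0
    def step(self, operator_pace):
        self.total_parts += operator_pace
        return self.total_parts

class OperatorModel:
    """Crude cognitive model: pace declines as fatigue accumulates."""
    def __init__(self):
        self.fatigue = 0.0
    def step(self):
        self.fatigue = min(1.0, self.fatigue + 0.05)
        return max(1, round(3 * (1.0 - self.fatigue)))

machine, operator = MachineTwin(), OperatorModel()
for t in range(10):                  # master loop of the co-simulation
    pace = operator.step()           # human-behavior model output...
    parts = machine.step(pace)       # ...drives the coupled asset twin
    print(f"t={t}: pace={pace}, total parts={parts}")
```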


2021 ◽  
Vol 11 (8) ◽  
pp. 3639
Author(s):  
Matevz Resman ◽  
Jernej Protner ◽  
Marko Simic ◽  
Niko Herakovic

A digital twin of a manufacturing system is a digital copy of the physical manufacturing system that consists of various digital models at multiple scales and levels. Digital twins that communicate with their physical counterparts throughout their lifecycle are the basis for data-driven factories. The problem with developing the digital models that form the digital twin is that they operate with large amounts of heterogeneous data. Since the models represent simplifications of the physical world, managing the heterogeneous data and linking the data with the digital twin represent a challenge. The paper proposes a five-step approach to planning data-driven digital twins of manufacturing systems and their processes. The approach guides the user through breaking down the system and the underlying building blocks of the processes into four groups. The development of a digital model includes the predefined parameters necessary to connect the digital model with the real manufacturing system. This connection enables control of the real manufacturing system and allows the creation of the digital twin. The last step covers the presentation and visualization of the system's functioning, based on the digital twin, for different participants. The suitability of the approach for the industrial environment is illustrated using a case study of planning the digital twin for the material logistics of a manufacturing system.
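To make the "predefined parameters" idea concrete, a minimal sketch follows: each digital model carries the connection and control parameters it needs to mirror, and act on, its physical counterpart. Field names, the endpoint format, and the control rule are illustrative assumptions, not the paper's definitions.

```python
from dataclasses import dataclass, field

@dataclass
class DigitalModel:
    name: str
    endpoint: str                        # assumed address of the plant data source
    parameters: dict = field(default_factory=dict)
    state: dict = field(default_factory=dict)

    def sync(self, reading: dict):
        """Mirror a reading from the real manufacturing system."""
        self.state.update(reading)

    def control_signal(self) -> dict:
        """Derive a control action from the mirrored state (toy rule)."""
        level = self.state.get("buffer_level", 0)
        return {"dispatch_transport": level > self.parameters["max_buffer"]}

# Hypothetical material-logistics model:
logistics = DigitalModel("material-logistics", "opc.tcp://plc1:4840",
                         parameters={"max_buffer": 8})
logistics.sync({"buffer_level": 12})
print(logistics.control_signal())  # {'dispatch_transport': True}
```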


Processes ◽  
2021 ◽  
Vol 10 (1) ◽  
pp. 21
Author(s):  
Robert Kazała ◽  
Sławomir Luściński ◽  
Paweł Strączyński ◽  
Albena Taneva

This article presents the most valuable and applicable open-source tools and communication technologies that may be employed to create models of production processes by applying the concept of Digital Twins. In recent years, many open-source technologies, including tools and protocols, have been developed to create virtual models of production systems. The authors present the evolution and role of the Digital Twin concept as one of the key technologies for implementing the Industry 4.0 paradigm in automation and control. Based on a structured review of valuable open-source software dedicated to the various phases and tasks involved in creating a whole Digital Twin system, it is demonstrated that the available solutions cover all aspects. However, their dispersion, specialisation, and lack of integration mean that this software is usually not the first choice for implementing Digital Twins. Therefore, to successfully create full-fledged Digital Twin models with the proposed open-source solutions, additional integration effort is necessary.
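As one concrete example of the communication technologies such a Digital Twin stack relies on, the sketch below mirrors sensor readings into a twin's state over MQTT using the open-source Eclipse Paho client. It assumes the paho-mqtt 1.x API, a local broker, and an illustrative topic name, none of which come from the article.

```python
import json
import paho.mqtt.client as mqtt  # assumes paho-mqtt 1.x

BROKER = "localhost"              # assumed local Mosquitto broker
TOPIC = "factory/line1/sensors"   # hypothetical topic

def on_message(client, userdata, msg):
    # The twin's state simply mirrors the latest published reading.
    userdata["state"].update(json.loads(msg.payload))
    print("twin state:", userdata["state"])

twin_state = {}
client = mqtt.Client(userdata={"state": twin_state})
client.on_message = on_message
client.connect(BROKER)
client.subscribe(TOPIC)
# Elsewhere, the physical asset (or its simulator) would publish, e.g.:
client.publish(TOPIC, json.dumps({"temperature_C": 21.3}))
client.loop_forever()
```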


AI and Ethics ◽  
2021 ◽  
Author(s):  
Jacqui Ayling ◽  
Adriane Chapman

Abstract: Bias, unfairness and lack of transparency and accountability in Artificial Intelligence (AI) systems, and the potential for the misuse of predictive models for decision-making, have raised concerns about the ethical impact and unintended consequences of new technologies for society across every sector where data-driven innovation is taking place. This paper reviews the landscape of suggested ethical frameworks, with a focus on those that go beyond high-level statements of principles and offer practical tools for applying these principles in the production and deployment of systems. This work assesses these practical frameworks through the lens of known best practices for impact assessment and audit of technology. We review other historical uses of risk assessments and audits and create a typology that allows us to compare current AI ethics tools to best practices found in previous methodologies from technology, environment, privacy, finance and engineering. We analyse current AI ethics tools and their support for diverse stakeholders and components of the AI development and deployment lifecycle, as well as the types of tools used to facilitate their use. From this, we identify gaps in current AI ethics tools for auditing and risk assessment that should be considered going forward.

