The Application of the Positive Semi-Definite Kernel Space for SVM in Quality Prediction

2020
Vol 13 (2)
pp. 228-233
Author(s):
Wang Meng
Dui Hongyan
Zhou Shiyuan
Dong Zhankui
Wu Zige

Background: A transformation toward the fourth industrial revolution (Industry 4.0) is being led by Germany, based on cyber-physical-system-enabled manufacturing and service innovation. Smart manufacturing, an important feature of Industry 4.0, uses networked manufacturing systems for smart production. Current manufacturing systems (5M1E systems) require deeper mining of the data generated during the manufacturing process. Objective: Mapping a low-dimensional embedding into the input space meets the requirement of the “kernel trick” for solving a problem in feature space; moreover, distances can then be calculated more precisely. Methods: In this research, we propose a positive semi-definite kernel space constructed by an additive-constant method based on a kernel view of ISOMAP. The algorithm consists of six steps. Results: On the enterprise data set, the classification precision of KMLSVM was better than that of SVM, where SVM used the RBF kernel with optimized parameters. Conclusion: We adopted the additive-constant method in the kernel-space construction and built a positive semi-definite kernel. A typical mixed data set from an enterprise was used in the simulation. We compared SVM and KMLSVM on this data set and optimized the SVM kernel-function parameters. The simulation results demonstrate that KMLSVM outperforms SVM on mixed-type data sets.
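The additive-constant idea behind such a construction can be sketched in a few lines: if the smallest eigenvalue of a symmetric kernel matrix is negative, shifting the diagonal by its magnitude yields a positive semi-definite kernel. The matrix below is random toy data, not the enterprise data set, and the double-centering step of kernel ISOMAP is omitted; this illustrates the trick, not the authors' exact six-step algorithm.

```python
import numpy as np

def make_psd(K, tol=1e-10):
    """Shift a symmetric kernel matrix until it is positive semi-definite.

    Adds |lambda_min| to the diagonal when the smallest eigenvalue is
    negative (the classical additive-constant trick)."""
    K = (K + K.T) / 2.0                      # enforce symmetry
    lam_min = np.linalg.eigvalsh(K).min()
    if lam_min < -tol:
        K = K + (-lam_min) * np.eye(K.shape[0])
    return K

# Indefinite "kernel" built from toy distances (illustrative only)
rng = np.random.default_rng(0)
D = rng.random((5, 5)); D = (D + D.T) / 2; np.fill_diagonal(D, 0)
K = -0.5 * D**2                              # centering step omitted for brevity
K_psd = make_psd(K)
print(np.linalg.eigvalsh(K_psd).min() >= -1e-8)  # True
```

Because the shift moves every eigenvalue by the same amount, the relative geometry encoded in the kernel is preserved while the Mercer condition required by SVMs is restored.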

Complexity
2018
Vol 2018
pp. 1-17
Author(s):
Usharani Hareesh Govindarajan
Amy J. C. Trappey
Charles V. Trappey

Immersive technology for human-centric cyberphysical systems includes broad concepts that enable users in the physical world to connect with the cyberworld with a sense of immersion. Complex systems such as virtual reality, augmented reality, brain-computer interfaces, and brain-machine interfaces are emerging as immersive technologies with the potential to improve manufacturing systems. Industry 4.0 includes all technologies, standards, and frameworks of the fourth industrial revolution that facilitate intelligent manufacturing. Industrial immersive technologies will be used for smart manufacturing innovation in the context of Industry 4.0’s human-machine interfaces. This research provides a thorough review of the literature, constructs a domain ontology, presents patent meta-trend statistical analysis, and performs data mining analysis using a technology-function matrix, highlighting technical and functional development trends with latent Dirichlet allocation (LDA) models. A total of 179 references from the IEEE and IET databases and 2,672 patents are systematically analyzed to identify current trends. The paper establishes an essential foundation for the development of advanced human-centric cyberphysical systems in complex manufacturing processes.


2020
Vol 7 (2)
pp. 129-144
Author(s):
Erwin Rauch
Andrew R Vickery

Abstract With the increasing momentum of the fourth industrial revolution, also known as Industry 4.0 or smart manufacturing, many companies now face the challenge of implementing Industry 4.0 methods and technologies. This is a challenge especially for small and medium-sized enterprises (SMEs), which have neither sufficient human nor financial resources to deal with the topic adequately. However, since SMEs form the backbone of the economy, it is particularly important to support these companies in the introduction of Industry 4.0 and to develop appropriate tools. This work is intended to fill this gap and to enhance research on Industry 4.0 for SMEs by presenting an exploratory study used to systematically analyze and evaluate their needs and to translate them into a final list of (functional) requirements and constraints, using axiomatic design as the scientific approach.


Electronics
2021
Vol 10 (7)
pp. 869
Author(s):
Pablo F. S. Melo
Eduardo P. Godoy
Paolo Ferrari
Emiliano Sisinni

The technical innovation of the fourth industrial revolution (Industry 4.0, or I4.0) rests on three conditions: horizontal and vertical integration of manufacturing systems, decentralization of computing resources, and continuous digital engineering throughout the product life cycle. The reference architecture model for Industry 4.0 (RAMI 4.0) is a common model for systematizing, structuring, and mapping the complex relationships and functionalities required in I4.0 applications. Despite its adoption in I4.0 projects, RAMI 4.0 is an abstract model, not an implementation guide, which hinders its adoption and full deployment. As a result, many recent papers have studied the interactions required among the elements distributed along the three axes of RAMI 4.0 to develop solutions compatible with the model. This paper investigates RAMI 4.0 and describes our proposal for the development of an open-source control device for I4.0 applications. The control device is one of the elements on the hierarchy-level axis of RAMI 4.0. Its main contribution is the integration of open-source solutions for hardware, software, communication, and programming, covering the relationships among three layers of RAMI 4.0 (asset, integration, and communication). The implementation of a proof of concept of the control device is discussed. Experiments in an I4.0 scenario validated the operation of the control device and demonstrated its effectiveness and robustness: it ran without interruptions, failures, or communication problems throughout the experiments.


2021
Vol 11 (7)
pp. 3186
Author(s):
Radhya Sahal
Saeed H. Alsamhi
John G. Breslin
Kenneth N. Brown
Muhammad Intizar Ali

The digital twin (DT) plays a pivotal role in the vision of Industry 4.0. The idea is that the real product and its virtual counterpart are twins that travel a parallel journey from design and development to production and service life. The intelligence that comes from DTs’ operational data supports the interactions between DTs, paving the way for the cyber-physical integration of smart manufacturing. This paper presents a conceptual framework for digital-twin collaboration that provides auto-detection of erratic operational data by utilizing operational data intelligence in manufacturing systems. The proposed framework provides an interaction mechanism by which a DT can understand its own status, interact with other DTs, learn from them, and share common semantic knowledge. In addition, it can detect anomalies and understand the overall picture and conditions of the operational environment. Furthermore, the proposed framework is described as a workflow model that breaks down into four phases: information extraction, change detection, synchronization, and notification. A use case of Energy 4.0 fault diagnosis for wind turbines illustrates the use of the proposed framework and DT collaboration to identify and diagnose potential failures, e.g., malfunctioning nodes, within the energy industry.
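A minimal sketch of the change-detection phase, assuming a rolling z-score criterion (the abstract does not specify the detector); the turbine power readings and the injected fault below are synthetic:

```python
import numpy as np

def detect_erratic(series, window=20, threshold=3.0):
    """Flag points deviating more than `threshold` sigmas from a rolling mean."""
    flags = np.zeros(len(series), dtype=bool)
    for i in range(window, len(series)):
        hist = series[i - window:i]
        mu, sigma = hist.mean(), hist.std()
        if sigma > 0 and abs(series[i] - mu) > threshold * sigma:
            flags[i] = True
    return flags

rng = np.random.default_rng(1)
power = rng.normal(1500.0, 20.0, 200)   # hypothetical turbine output (kW)
power[150] = 2500.0                     # injected malfunction
anomalies = np.flatnonzero(detect_erratic(power))
print(anomalies)                        # index 150 is among the flags
```

In the framework's terms, a DT that raises such a flag would then synchronize with its peers and notify them, rather than act on the raw reading alone.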


2021
Author(s):
Rogini Runghen
Daniel B Stouffer
Giulio Valentino Dalla Riva

Collecting network interaction data is difficult. Non-exhaustive sampling and complex hidden processes often result in an incomplete data set. Thus, identifying potentially present but unobserved interactions is crucial both for understanding the structure of large-scale data and for predicting how previously unseen elements will interact. Recent studies in network analysis have shown that accounting for metadata (such as node attributes) can improve both our understanding of how nodes interact with one another and the accuracy of link prediction. However, the dimension of the object we need to learn to predict interactions in a network grows quickly with the number of nodes, which becomes computationally and conceptually challenging for large networks. Here, we present a new predictive procedure combining a graph embedding method with machine learning techniques to predict interactions on the basis of node metadata. Graph embedding methods project the nodes of a network onto a low-dimensional latent feature space, and the positions of the nodes in that space can then be used to predict interactions between them. Learning a mapping from the nodes' metadata to their positions in the latent feature space is then a classic, low-dimensional machine learning problem. In our study we used the Random Dot Product Graph model to estimate the embedding of an observed network, and we tested different neural network architectures to predict the positions of nodes in the latent feature space. Flexible machine learning techniques for mapping nodes onto their latent positions allow us to account for multivariate and possibly complex node metadata. To illustrate the utility of the proposed procedure, we apply it to a large dataset of tourist visits to destinations across New Zealand.
We found that our procedure accurately predicts interactions for both existing nodes and nodes newly added to the network, while remaining computationally feasible even for very large networks. Overall, our study highlights that by exploiting the properties of a well-understood statistical model for complex networks and combining it with standard machine learning techniques, we can simplify the link prediction problem when incorporating multivariate node metadata. Our procedure can be immediately applied to different types of networks and to a wide variety of data from different systems. As such, from both a network science and a data science perspective, our work offers a flexible and generalisable procedure for link prediction.
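The pipeline described above can be sketched with an adjacency spectral embedding of a Random Dot Product Graph plus a metadata-to-position regressor. The network and metadata here are simulated, and plain ridge regression stands in for the neural networks the study tested:

```python
import numpy as np

rng = np.random.default_rng(42)
n, d = 60, 2

# Hypothetical node metadata (e.g., visitor or destination attributes)
meta = rng.normal(size=(n, 3))

# RDPG-style graph: P[i, j] = <x_i, x_j>, edges drawn with those probabilities
W = rng.normal(size=(3, d))
X_true = 1 / (1 + np.exp(-meta @ W))           # keep latent positions in (0, 1)
P = np.clip(X_true @ X_true.T, 0, 1)
A = (rng.random((n, n)) < P).astype(float)
A = np.triu(A, 1); A = A + A.T                 # symmetric, no self-loops

# Adjacency spectral embedding: top-d eigenpairs of A, scaled
vals, vecs = np.linalg.eigh(A)
idx = np.argsort(np.abs(vals))[::-1][:d]
X_hat = vecs[:, idx] * np.sqrt(np.abs(vals[idx]))

# Learn metadata -> latent position with ridge regression
# (a stand-in for the neural network architectures in the paper)
lam = 1e-2
B = np.linalg.solve(meta.T @ meta + lam * np.eye(3), meta.T @ X_hat)

# Score interactions for a brand-new node from its metadata alone
x_new = rng.normal(size=3) @ B
scores = X_hat @ x_new                         # one score per existing node
print(scores.shape)
```

The key point mirrors the abstract: the regression problem has dimension 3 → 2 regardless of how many nodes the network has, which is what keeps the procedure feasible at scale.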


2021
Vol 50 (1)
pp. 138-152
Author(s):
Mujeeb Ur Rehman
Dost Muhammad Khan

Recently, anomaly detection has drawn serious attention from data mining researchers, as its reputation has grown steadily in practical domains such as product marketing, fraud detection, medical diagnosis, fault detection, and many other fields. High-dimensional data subjected to outlier detection poses exceptional challenges for data mining experts because of the inherent problems of the curse of dimensionality and the resemblance of distant and adjoining points. Traditional algorithms and techniques performed outlier detection on the full feature space. Conventional methodologies concentrate largely on low-dimensional data and hence prove ineffective at discovering anomalies in data sets with a high number of dimensions. Digging out the anomalies present in a high-dimensional data set becomes very difficult and tedious when all subsets of projections need to be explored. All data points in high-dimensional data behave like similar observations because of an intrinsic property of such data: the distance between observations approaches zero as the number of dimensions tends to infinity. This research work proposes a novel technique that explores the deviation among all data points and embeds its findings inside well-established density-based techniques. It is a state-of-the-art technique that opens a new breadth of research towards resolving the inherent problems of high-dimensional data, where outliers reside within clusters of different densities. A high-dimensional dataset from the UCI Machine Learning Repository is chosen to test the proposed technique, and its results are compared with those of density-based techniques to evaluate its efficiency.
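The distance-concentration effect the abstract appeals to is easy to demonstrate numerically: the relative contrast between the farthest and nearest points shrinks as dimensionality grows. This is a generic illustration of the problem, not the proposed technique itself:

```python
import numpy as np

rng = np.random.default_rng(0)

def relative_contrast(dim, n=500):
    """(max - min) / min distance to the origin for n uniform points.

    As dim grows, all distances cluster around the same value, so this
    ratio collapses toward zero and nearest-neighbour distinctions fade."""
    pts = rng.random((n, dim))
    dist = np.linalg.norm(pts, axis=1)
    return (dist.max() - dist.min()) / dist.min()

for dim in (2, 100, 10_000):
    print(dim, round(relative_contrast(dim), 3))
```

It is precisely this collapse that breaks distance- and density-based detectors on the full feature space and motivates deviation-based refinements like the one proposed.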


2021
Author(s):
Muzaffar Rao
Thomas Newe

The current manufacturing transformation is described using different terms, such as Industry 4.0, smart manufacturing, the Industrial Internet of Things (IIoT), and the Model-Based Enterprise. This transformation involves integrated and collaborative manufacturing systems, which must meet demands that change in real time in the smart factory environment. Here, this manufacturing transformation is denoted by the term ‘smart manufacturing’. Smart manufacturing can optimize the manufacturing process using different technologies such as IoT, analytics, manufacturing intelligence, cloud, supplier platforms, and the Manufacturing Execution System (MES). In the cell-based manufacturing environment of the smart industry, the best way to transfer goods between cells is through automation (mobile robots); this is why automation is at the core of the smart industry, i.e., Industry 4.0. In a smart industrial environment, mobile robots can operate safely and with repeatability, and can take decisions based on detailed production sequences defined by the MES. This work focuses on the development of a middleware application, written in LabVIEW, for mobile robots in a cell-based manufacturing environment; the application connects the mobile robots with the MES.
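The middleware pattern described above can be sketched language-agnostically. The real application is written in LabVIEW and talks to an actual MES, so the queues, field names, and cell identifier below are purely illustrative:

```python
import queue
import threading

# Two decoupled queues: the middleware sits between them, translating
# MES production steps into transport jobs for the mobile robot.
mes_orders = queue.Queue()     # production sequences from the MES (hypothetical schema)
robot_jobs = queue.Queue()     # translated transport jobs for the robot

def middleware():
    """Translate MES orders into robot commands until a shutdown sentinel arrives."""
    while True:
        order = mes_orders.get()
        if order is None:              # shutdown sentinel
            robot_jobs.put(None)
            break
        robot_jobs.put({"move_to": order["cell"], "payload": order["part"]})

t = threading.Thread(target=middleware)
t.start()
mes_orders.put({"cell": "CELL-3", "part": "housing"})
mes_orders.put(None)
t.join()
job = robot_jobs.get()
print(job)   # {'move_to': 'CELL-3', 'payload': 'housing'}
```

The design point is decoupling: neither side blocks on the other, and the translation logic lives in one replaceable component, which is what makes middleware a good fit between an MES and heterogeneous robots.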


Author(s):
Chetna Chauhan
Amol Singh

The pace of Industry 4.0 adoption in manufacturing industries has been slow, as adoption is accompanied by several barriers, specifically in emerging economies. The current study intends to identify and understand the landscape of these challenges. Further, this paper prioritizes the challenges on the basis of their relative importance. To achieve this objective, the authors combine the fuzzy Delphi approach with the fuzzy analytic hierarchy process. Additionally, a sensitivity analysis is performed to enhance the robustness of the findings. The global rankings of the challenges reveal that the most significant factors hampering the full realization of smart manufacturing include cybersecurity, privacy risks, and the enormously high number of technology choices available in the market. The analysis offers insights into the reasons for the slow diffusion of smart manufacturing systems, and the results should assist managers, policymakers, and technology providers in the advent of manufacturing digitalization.
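A compact sketch of the fuzzy AHP step, assuming Buckley's geometric-mean method with triangular fuzzy numbers; the pairwise judgments below are invented for three of the challenges named in the abstract, not the study's actual survey data:

```python
import numpy as np

# Triangular fuzzy pairwise comparisons (l, m, u) for three hypothetical
# challenges: cybersecurity (C1), privacy risk (C2), technology choice (C3).
M = np.array([
    [[1, 1, 1],       [2, 3, 4],     [4, 5, 6]],
    [[1/4, 1/3, 1/2], [1, 1, 1],     [1, 2, 3]],
    [[1/6, 1/5, 1/4], [1/3, 1/2, 1], [1, 1, 1]],
])

# Buckley's method: fuzzy geometric mean per row, fuzzy division by the
# column total (l/U, m/M, u/L), then centroid defuzzification.
geo = M.prod(axis=1) ** (1 / 3)     # per-row fuzzy geometric mean (l, m, u)
total = geo.sum(axis=0)             # fuzzy sum over all criteria
w_fuzzy = geo / total[::-1]         # divide (l, m, u) by reversed (u, m, l)
w = w_fuzzy.mean(axis=1)            # centroid defuzzification
w = w / w.sum()                     # normalized crisp priority weights
print(np.round(w, 3))               # C1 (cybersecurity) ranks highest
```

With these invented judgments the crisp weights reproduce the qualitative ordering the study reports, with cybersecurity dominating; sensitivity analysis would then perturb the comparison matrix and re-derive the weights.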


Computers
2020
Vol 9 (2)
pp. 28
Author(s):
Salvatore Cavalieri
Marco Giuseppe Salafia

In the context of Industry 4.0, a lot of effort is being put into achieving interoperability among industrial applications. As the definition and adoption of communication standards are of paramount importance for realizing interoperability, over the last few years different organizations have developed reference architectures to align standards in the context of the fourth industrial revolution. One of the main examples is the reference architecture model for Industry 4.0 (RAMI 4.0), which defines the asset administration shell as the cornerstone of interoperability between applications managing manufacturing systems. Within Industry 4.0 there is also great interest in the Open Platform Communications Unified Architecture (OPC UA) standard, which is listed as the recommendation for realizing the communication layer of the reference architecture model. The contribution of this paper is to give some insights into the modelling techniques that should be adopted when defining an OPC UA information model that exposes the very recent metamodel defined for the asset administration shell. All the general rationales and solutions provided here are compared with the existing OPC UA-based representations of the asset administration shell in the literature. Specifically, differences are pointed out, giving the reader the advantages and disadvantages of each solution.


2020
Vol 4 (3)
pp. 88
Author(s):
Vadim Kapp
Marvin Carl May
Gisela Lanza
Thorsten Wuest

This paper presents a framework that uses multivariate time-series data to automatically identify recurring events, e.g., resembling failure patterns, in real-world manufacturing data by combining selected data mining techniques. The use case revolves around the auxiliary polymer manufacturing process of drying and feeding plastic granulate to extrusion or injection molding machines. The overall framework includes a comparison of two different approaches to identifying unique patterns in the real-world industrial data set. The first approach applies heuristic segmentation followed by clustering; the second features a method with a built-in time-dependency structure at its core (TICC). Both alternatives are supported by standard principal component analysis (PCA) for feature fusion and a hyperparameter optimization (TPE) approach. The performance of the corresponding approaches was evaluated through established and commonly accepted metrics in the field of (unsupervised) machine learning. The results suggest the existence of several common failure sources (patterns) for the machine. Automatically detected events such as these can be harnessed to develop advanced monitoring methods that predict upcoming failures, ultimately reducing unplanned machine downtime.
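The first branch (segmentation, feature fusion via PCA, then clustering) can be sketched as follows; the sensor stream, window length, and use of k-means are illustrative assumptions, not the paper's exact pipeline or data:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Hypothetical multivariate sensor stream: 300 timesteps x 6 channels,
# with two alternating operating regimes (a stand-in for failure patterns).
regime = (np.arange(300) // 50) % 2
data = rng.normal(size=(300, 6)) + regime[:, None] * np.array([3, 0, 3, 0, 3, 0])

# Segment into fixed windows, flatten, fuse features with PCA, then cluster.
win = 10
segments = data[: 300 - 300 % win].reshape(-1, win * 6)    # (30, 60)
fused = PCA(n_components=5, random_state=0).fit_transform(segments)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(fused)
print(labels)    # windows group by operating regime
```

In the paper's setting the cluster labels would be inspected against known machine states; a cluster that recurs shortly before downtime events is a candidate failure pattern for the proposed monitoring method.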

