Fractal Analysis Application Outlook for Improving Process Monitoring and Machine Maintenance in Manufacturing 4.0

2018 ◽  
Vol 2 (3) ◽  
pp. 62 ◽  
Author(s):  
Xavier Rimpault ◽  
Marek Balazinski ◽  
Jean-François Chatelain

Industry 4.0 has been promoted for a decade as the next disruptive evolution in production. It relies on the growth of automation and, in particular, on data exchange among numerous sensors to enable faster production under tight monitoring. The huge volume of data generated by sensor networks during production is often used to feed machine learning systems that detect faults, monitor processes, and identify possible improvements. However, machine learning requires finding and selecting key features, such as the average and the root mean square. While current machine learning has already proven useful in diverse applications, its efficiency could be further improved by generating better characteristics, such as fractal parameters. In this paper, the concept of fractal analysis is presented and its current and future applications in machining are discussed. This sensitive and robust technique already extracts high-performance key features that could feed monitoring and prediction systems. Beyond improving feature selection, and thus the overall performance of monitoring and predictive systems in machining, this could accelerate the adoption of artificial intelligence in manufacturing.
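As an illustration of the kind of fractal feature the abstract proposes, the sketch below estimates the fractal dimension of a one-dimensional sensor signal with Higuchi's method. This is a minimal sketch assuming a generic vibration or force signal, not the authors' implementation:

```python
import numpy as np

def higuchi_fd(x, kmax=8):
    """Estimate the fractal dimension of a 1-D signal using Higuchi's method.

    Builds k-decimated subsequences, measures their normalized curve lengths,
    and fits the slope of log-length versus log(1/k).
    """
    n = len(x)
    log_lengths, log_inv_k = [], []
    for k in range(1, kmax + 1):
        lengths = []
        for m in range(k):
            idx = np.arange(m, n, k)
            if len(idx) < 2:
                continue
            # Curve length of the subsequence, normalized to the full signal span
            lm = np.sum(np.abs(np.diff(x[idx]))) * (n - 1) / ((len(idx) - 1) * k)
            lengths.append(lm / k)
        log_lengths.append(np.log(np.mean(lengths)))
        log_inv_k.append(np.log(1.0 / k))
    # The fractal dimension is the slope of the log-log relationship
    slope, _ = np.polyfit(log_inv_k, log_lengths, 1)
    return slope
```

A smooth signal such as a straight line yields a dimension near 1, while a rough signal such as white noise yields a dimension near 2; this single scalar summarizes signal roughness in a way that simple statistics like the mean or root mean square do not.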

Author(s):  
Shivangi Ruhela ◽  
Pragati Chaudhary ◽  
Rishija Shrivas ◽  
Deepti Chopra

Artificial Intelligence (AI) and the Internet of Things (IoT) are popular domains in computer science. AIoT converges AI and IoT, applying AI within IoT systems. When ‘things’ are programmed and connected to the Internet, we have IoT. But when these IoT systems can analyze data and make decisions without human intervention, AIoT is achieved. AI powers IoT through decision-making and machine learning; IoT powers AI through data exchange and connectivity. With AI as the brain and IoT as the body, such systems gain greatly increased efficiency and performance and can learn from user interactions. Some studies suggest that, by 2022, AIoT devices, from drones that protect rainforests to fully automated cars, would dominate the computer industry. The paper discusses AIoT in greater depth, focuses on a few case studies of AIoT for better understanding at a practical level, and lastly proposes a model that suggests food through emotion analysis.


2020 ◽  
Vol 35 (1) ◽  
pp. 299-308 ◽  
Author(s):  
Xinhua Liu ◽  
Kanghui Zhou ◽  
Yu Lan ◽  
Xu Mao ◽  
Robert J. Trapp

Abstract: It is argued here that even with the development of objective algorithms, convection-allowing numerical models, and artificial intelligence/machine learning, conceptual models will still be useful for forecasters until these methods can fully satisfy future forecast requirements. Conceptual models can help forecasters form forecast ideas quickly. They also can make up for the deficiencies of numerical models and other objective methods. Furthermore, they can help forecasters understand the weather and then lock onto the key features affecting the forecast as soon as possible. Ultimately, conceptual models can help the forecaster serve the end users faster, and better understand the forecast results during the service process. Based on the above considerations, construction of new conceptual models should have the following characteristics: 1) be guided by purpose, 2) focus on improving the ability of forecasters, 3) have multiangle consideration, 4) have multiscale fusion, and 5) be tested and corrected continuously. The traditional conceptual models used for forecasts of severe convective weather should be replaced gradually by new models that incorporate these principles.


Author(s):  
J. Charles Victor ◽  
P. Alison Paprica ◽  
Michael Brudno ◽  
Carl Virtanen ◽  
Walter Wodchis ◽  
...  

Introduction: Canadian provincial health systems have a data advantage: longitudinal, population-wide data for publicly funded health services, in many cases going back 20 years or more. With the addition of high-performance computing (HPC), these data can serve as the foundation for leading-edge research using machine learning and artificial intelligence.

Objectives and Approach: The Institute for Clinical Evaluative Sciences (ICES) and HPC4Health are creating the Ontario Data Safe Haven (ODSH), a secure HPC cloud located within the HPC4Health physical environment at the Hospital for Sick Children in Toronto. The ODSH will allow research teams to post, access, and analyze individual datasets over which they have authority, and enable linkage to Ontario administrative and other data. To start, the ODSH is focused on creating a private cloud that meets ICES' legislated privacy and security requirements to support HPC-intensive analyses of ICES data. The first ODSH projects are partnerships between ICES scientists and machine learning researchers.

Results: As of March 2018, the technological build of the ODSH was tested and completed, and the privacy and security policy framework and documentation were completed. We will present the structure of the ODSH, including the architectural choices made when designing the environment, and the functionality planned for the future. We will describe the experience to date with the very first analysis done using the ODSH: the automatic mining of clinical terminology in primary care electronic medical records using deep neural networks. We will also present plans for a high-cost-user Risk Dashboard program of research, co-designed by ICES scientists and health faculty from the Vector Institute for artificial intelligence, that will make use of the ODSH beginning May 2018.

Conclusion/Implications: Through a partnership of ICES, HPC4Health, and the Vector Institute, a secure private cloud, the ODSH, has been created and is starting to be used in leading-edge machine learning research studies that make use of Ontario's population-wide data assets.


Author(s):  
Yaser AbdulAali Jasim

Nowadays, technology and computer science are rapidly developing many tools and algorithms, especially in the field of artificial intelligence. Machine learning is involved in the development of new methodologies and models, which have become a novel area of application for artificial intelligence. In addition to conventional neural network architectures, deep learning refers to the use of artificial neural network architectures that include multiple processing layers. In this paper, convolutional neural network models were designed to detect (diagnose) plant disorders from samples of healthy and unhealthy plant images analyzed by means of deep learning methods. The models were trained on an open dataset containing 18,000 images of ten different plants, including healthy plants. Several model architectures were trained, achieving a best performance of 97 percent when detecting the respective [plant, disease] pair. This is a very useful early-warning technique, and with its substantially high performance rate the method can be further improved to support an automated plant disease detection system working in actual farm conditions.
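The core operation behind the convolutional models described above can be sketched in plain NumPy. This is a minimal illustration of a single convolution, ReLU, and max-pooling stage on a hypothetical grayscale patch, not the authors' trained architecture:

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2-D cross-correlation: slide the kernel over the image."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    """Rectified linear activation: keep positive responses only."""
    return np.maximum(x, 0.0)

def maxpool(x, size=2):
    """Non-overlapping max pooling; trims edges that do not fit."""
    h = x.shape[0] - x.shape[0] % size
    w = x.shape[1] - x.shape[1] % size
    return x[:h, :w].reshape(h // size, size, w // size, size).max(axis=(1, 3))

# A simple gradient filter responds where the image changes abruptly,
# the kind of local pattern a trained CNN kernel learns to detect.
image = np.zeros((6, 6))
image[:, 3:] = 1.0                  # bright region on the right half
kernel = np.array([[-1.0, 1.0]])    # horizontal gradient filter
feature_map = maxpool(relu(conv2d(image, kernel)))
```

Stacking many such learned kernels across multiple layers, followed by a classifier head, is what lets a deep network map a leaf image to a [plant, disease] label.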


2018 ◽  
Vol 34 (S1) ◽  
pp. 154-155
Author(s):  
Randy Goebel ◽  
Mi-Young Kim ◽  
Egon Jonsson ◽  
Ulli Wolfaardt

Introduction: Rising costs and the rapidly increasing volume of findings from health care research are driving the demand for comprehensive information to inform the allocation of resources. Health technology assessment (HTA) applies rigorous processes to provide high-quality synthesized information to policymakers and healthcare payers. HTA involves combining large amounts of research publications to systematically evaluate the properties, effects, and impacts of a topic of interest.

Methods: The time and resources required to complete a full HTA are often demanding. There is an opportunity to apply high-performance computing (inclusive of the artificial intelligence and machine learning disciplines) to HTA. This project applied high-performance computing technology to create a research synthesis tool to support HTA, and then developed a service that integrates as much relevant data as possible to strengthen HTA. This was a joint project that combined expertise from the areas of health technology, machine learning, information technology, and innovation.

Results: The information gathered for this phased project from HTA subject matter experts and other stakeholders was collated to inform a research synthesis tool and a broader concept of the project.

Conclusions: The results of this study will inform the design of a research synthesis tool that covers the entire HTA process (literature search, screening of titles and abstracts, data extraction, quality assessment, and analysis). The collaborators included Alberta Innovates, the Alberta Machine Intelligence Institute, the University of Alberta, Cybera, and PolicyWise. Alberta Innovates, an accelerator and innovator of research in the province of Alberta, Canada, was the primary source of funding for this project.


Author(s):  
Lisanne V. van Dijk ◽  
Clifton D. Fuller

The advent of large-scale high-performance computing has allowed the development of machine-learning techniques in oncologic applications. Among these, there has been substantial growth in radiomics (machine-learning texture analysis of images) and artificial intelligence (which uses deep-learning techniques for “learning algorithms”); however, clinical implementation has yet to be realized at scale. To improve implementation, the opportunities, mechanics, and challenges of imaging-enabled artificial intelligence approaches need to be understood by the clinicians who make treatment decisions. This article aims to convey the basic conceptual premises of radiomics and artificial intelligence, using head and neck cancer as a use case. This educational overview focuses on approaches for head and neck oncology imaging, detailing current research efforts and challenges to implementation.
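To make the idea of radiomic texture analysis concrete, the sketch below computes one classic texture feature, the contrast of a gray-level co-occurrence matrix (GLCM), from a small image patch. This is a minimal illustration on a hypothetical quantized patch, not the validated feature pipeline of clinical radiomics software:

```python
import numpy as np

def glcm(patch, levels):
    """Co-occurrence frequencies of horizontally adjacent gray levels.

    patch: 2-D integer array whose values are already quantized to [0, levels).
    Returns a normalized levels-by-levels matrix.
    """
    m = np.zeros((levels, levels))
    for row in patch:
        for a, b in zip(row[:-1], row[1:]):
            m[a, b] += 1
    return m / m.sum()

def glcm_contrast(m):
    """Contrast feature: weights co-occurrences by squared gray-level distance."""
    i, j = np.indices(m.shape)
    return float(np.sum(m * (i - j) ** 2))

# A flat patch has zero contrast; a checkerboard of alternating levels
# maximizes it, mimicking how texture heterogeneity is summarized.
flat = np.zeros((4, 4), dtype=int)
checker = np.indices((4, 4)).sum(axis=0) % 2
```

Features like this, computed over a tumor region of interest, become inputs to the machine-learning models the article describes.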
