Design of Resources Allocation in 6G Cybertwin Technology Using the Fuzzy Neuro Model in Healthcare Systems

2022 ◽  
Vol 2022 ◽  
pp. 1-9
Author(s):  
Salman Ali Syed ◽  
K. Sheela Sobana Rani ◽  
Gouse Baig Mohammad ◽  
G. Anil kumar ◽  
Krishna Keerthi Chennam ◽  
...  

In 6G edge communication networks, machine learning models play a major role in enabling intelligent decision-making for optimal resource allocation in healthcare systems. However, they create bottlenecks, in the form of heavy memory computations between hidden layers and communication costs between the edge devices/edge nodes and the cloud centres, when transmitting data from the healthcare management system to the cloud centre via edge nodes. To reduce these hurdles, workloads must be shared so that the problems of complicated memory computation and transmission cost are mitigated. The effort aims mainly to reduce the storage and cloud-computing costs associated with neural networks, since computational complexity grows with the number of hidden layers. This study modifies federated learning to operate in distributed resource-assignment settings as a distributed deep learning model. It improves the capacity to learn from the data and assigns an ideal workload according to the limited available resources, slow network connections, and the number of edge devices. Edge devices and edge nodes can autonomously report the current network status to the cloud centre through the cybertwin, so local data are frequently updated to compute the global model. Simulations show that the proposed resource management and allocation outperforms standard approaches: the proposed method achieves higher resource utilization and success rates than existing methods. Index terms: fuzzy, healthcare, bioinformatics, 6G wireless communication, cybertwin, machine learning, neural network, edge.
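The abstract gives no implementation details; purely as a hedged illustration of the federated scheme it describes, the sketch below (all names, shapes, and the toy linear model are hypothetical) shows edge nodes refining a global model on local data while the cloud centre aggregates the updates, FedAvg-style, weighted by local sample counts:

    import numpy as np

    def local_update(global_weights, local_data, lr=0.01, epochs=1):
        """One edge node refines the global model on its local healthcare data.
        A plain linear regressor stands in for the neural network here."""
        w = global_weights.copy()
        X, y = local_data
        for _ in range(epochs):
            grad = X.T @ (X @ w - y) / len(y)   # gradient of the squared-error loss
            w -= lr * grad
        return w, len(y)

    def federated_round(global_weights, edge_datasets):
        """Cloud centre aggregates edge updates, weighted by local sample count."""
        updates = [local_update(global_weights, d) for d in edge_datasets]
        total = sum(n for _, n in updates)
        return sum(w * (n / total) for w, n in updates)

    # Toy run: three edge nodes, each holding its own (X, y) data shard.
    rng = np.random.default_rng(0)
    shards = [(rng.normal(size=(50, 4)), rng.normal(size=50)) for _ in range(3)]
    weights = np.zeros(4)
    for _ in range(20):                          # 20 communication rounds
        weights = federated_round(weights, shards)
    print(weights)

Only the model parameters cross the network in each round, which is what lets the scheme trade the memory and transmission costs the abstract describes against local computation at the edge.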

Author(s):  
Alessandro Simeone ◽  
Yunfeng Zeng ◽  
Alessandra Caggiano

Cloud manufacturing represents a valuable tool for the wide sharing of manufacturing services and solutions by connecting suppliers and customers in large-scale manufacturing networks through a cloud platform. In this context, as manufacturing networks grow to global scale, the large number of manufacturing solutions offered via the cloud platform to connected customers can increase the complexity of decision-making, resulting in a poor user experience from the customer's perspective. To tackle this issue, this paper designs and develops an intelligent decision-making support tool based on a manufacturing service recommendation system (RS) that provides tailored manufacturing solution recommendations to customers in a cloud manufacturing system. A machine learning procedure based on neural networks for data regression processes historical data on user manufacturing solution preferences and automatically extracts key features from incoming user instances and from the compatible manufacturing solutions generated by the cloud platform. In this way, the procedure performs customer segmentation and builds a recommendation list: a ranking of manufacturing solutions tailored to the specific customer profile. To validate the proposed intelligent decision-making support system, a case study is simulated within the framework of a cloud manufacturing platform delivering dynamic sharing of sheet metal cutting manufacturing solutions. The system capability is discussed in terms of machine learning performance as well as industrial applicability and user selection likelihood.
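As a hedged sketch of the kind of regression-based ranking the paper describes (the feature names, network shape, and training data are assumptions, not the authors' design), one could score each candidate solution against a customer's learned preferences and sort:

    import numpy as np
    from sklearn.neural_network import MLPRegressor

    # Hypothetical solution features: [price, lead_time, quality_rating, past_affinity]
    # Target: the preference score the customer historically gave similar solutions.
    rng = np.random.default_rng(1)
    X_hist = rng.uniform(size=(200, 4))
    y_hist = X_hist @ np.array([0.2, -0.3, 0.4, 0.5]) + rng.normal(scale=0.05, size=200)

    model = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=1)
    model.fit(X_hist, y_hist)

    # Rank the candidate solutions the platform generated for one incoming user.
    candidates = rng.uniform(size=(10, 4))
    scores = model.predict(candidates)
    recommendation_list = np.argsort(scores)[::-1]   # best-scoring solution first
    print(recommendation_list)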


Author(s):  
Wajid Hassan ◽  
Te-Shun Chou ◽  
Omar Tamer ◽  
John Pickard ◽  
Patrick Appiah-Kubi ◽  
...  

Cloud computing has had a sweeping impact on human productivity. Today it is used for computing, storage, prediction, and intelligent decision-making, among other tasks. Intelligent decision-making with machine learning has pushed cloud services to become even faster, more robust, and more accurate. Security remains one of the major concerns affecting cloud computing growth, and further research challenges hinder cloud adoption, such as poorly managed service level agreements (SLAs), frequent disconnections, resource scarcity, interoperability, privacy, and reliability. A tremendous amount of work is still needed to explore the security challenges arising from the widespread use of container-based cloud deployments. We also discuss the impact of cloud computing and cloud standards. Hence, this paper presents a detailed survey of cloud computing concepts, architectural principles, key services, and the implementation, design, and deployment challenges of cloud computing, and identifies important future research directions in the era of machine learning and data science.


Author(s):  
Iqbal H. Sarker

In a computing context, cybersecurity is undergoing massive shifts in its technology and operations, and data science is driving the change. Extracting security incident patterns or insights from cybersecurity data and building corresponding data-driven models is the key to making a security system automated and intelligent. The scientific methods, machine learning techniques, processes, and systems used to understand and analyze actual phenomena with data are collectively known as data science. In this paper, we focus on and briefly discuss cybersecurity data science, in which data are gathered from relevant cybersecurity sources and the analytics complement the latest data-driven patterns to provide more effective security solutions. The concept of cybersecurity data science makes the computing process more actionable and intelligent than traditional approaches in the cybersecurity domain. We then discuss and summarize a number of associated research issues and future directions. Furthermore, we provide a machine learning-based multi-layered framework for cybersecurity modeling. Overall, our goal is not only to discuss cybersecurity data science and relevant methods but also to focus on their applicability to data-driven intelligent decision-making for protecting systems from cyber-attacks.
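The abstract does not specify the framework's layers; as a minimal, hedged illustration of the data-driven modeling step it refers to (the incident features, labels, and classifier choice are hypothetical), a model can be trained on labeled security incidents and evaluated on held-out data:

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import classification_report

    # Hypothetical incident features: [packet_rate, failed_logins, bytes_out, port_entropy]
    rng = np.random.default_rng(2)
    X = rng.uniform(size=(500, 4))
    y = (X[:, 0] + 2 * X[:, 1] > 1.4).astype(int)   # 1 = malicious, 0 = benign (toy rule)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=2)
    clf = RandomForestClassifier(n_estimators=100, random_state=2).fit(X_tr, y_tr)
    print(classification_report(y_te, clf.predict(X_te)))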


Electronics ◽  
2021 ◽  
Vol 10 (8) ◽  
pp. 895
Author(s):  
Kah Phooi Seng ◽  
Paik Jen Lee ◽  
Li Minn Ang

Embedded intelligence (EI) is an emerging research field that aims to incorporate machine learning algorithms and intelligent decision-making capabilities into mobile and embedded devices or systems. Several challenges must be addressed to realize efficient EI implementations in hardware, such as the need for: (1) high computational processing; (2) low power consumption (or high energy efficiency); and (3) scalability to accommodate different network sizes and topologies. In recent years, an emerging hardware technology that has demonstrated strong potential and capability for EI implementations is the FPGA (field programmable gate array). This paper presents an overview and review of embedded intelligence on FPGAs with a focus on applications, platforms, and challenges. Four main classification and thematic descriptors for EI are reviewed and discussed: (1) EI techniques, including machine learning and neural networks, deep learning, expert systems, fuzzy intelligence, swarm intelligence, self-organizing maps (SOMs), and extreme learning; (2) applications for EI, including object detection and recognition, indoor localization and surveillance monitoring, and other EI applications; (3) hardware and platforms for EI; and (4) challenges for EI. The paper aims to introduce interested researchers to this area and to motivate the development of practical FPGA solutions for EI deployment.


2021 ◽  
Author(s):  
Jingyuan Liu

Cognitive computing is the field of intelligent computing that imitates brain processes to achieve computational intelligence. Decision-making is the part of the cognitive process in which options are selected, based on certain criteria, for a course of action. The choice is generally made with an intelligent assistance system that translates human decision-making into artificial intelligence, systems engineering, and machine learning approaches. The desire to replicate human intelligence in machines, together with progress in artificial intelligence technologies, has solved many complicated real-world problems. Autonomous systems with machine cognition develop continuously by exploiting enormous data volumes and processing power. A cognitive computing system applies skill and awareness derived from knowledge and intelligent decision-making. In this paper, the cognitive computing-based human speech recognition framework (CC-HSRF) takes advantage of next-generation technologies to support smart decision-making effectively. The proposed work surveys cognitive computing and its historical perspectives, followed by several strategies for implementing intelligent decision-making algorithms using machine learning. Methods for effective knowledge processing are explored based on cognitive computing models such as Object-Attribute-Relation (OAR). The paper offers visual and cognitive analytics information, highlighting the conceptual vision of the framework and its difficulties. The framework aims to increase the quality of artificial intelligent decision-making based on human perceptions, comprehensions, and actions, to reduce business mistakes in the real world and ensure correct, accurate, informed, and timely human decisions.
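The OAR model represents knowledge as objects linked to attributes and, via relations, to other objects; a minimal data-structure sketch (the class layout is an assumption, not the paper's specification) might look like:

    from dataclasses import dataclass, field

    @dataclass
    class OARObject:
        """One object in an Object-Attribute-Relation knowledge model."""
        name: str
        attributes: dict = field(default_factory=dict)   # attribute name -> value
        relations: list = field(default_factory=list)    # (relation name, OARObject)

        def relate(self, relation, other):
            self.relations.append((relation, other))

    # Toy fragment of a speech-recognition knowledge base.
    phoneme = OARObject("phoneme_/a/", {"duration_ms": 90, "voiced": True})
    word = OARObject("word_'art'")
    word.relate("begins_with", phoneme)
    print(word)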


2020 ◽  
Author(s):  
Karthik Muthineni

The new industrial revolution, Industry 4.0, connects manufacturing processes with digital technologies that can communicate, analyze, and use information for intelligent decision-making; it includes the Industrial Internet of Things (IIoT) to help manufacturers and consumers control and monitor processes efficiently. This work presents the design and implementation of an IIoT ecosystem for smart factories. The design is based on the Siemens Simatic IoT2040, an intelligent industrial gateway connected to Modbus sensors and publishing their data to the Network Platform for Internet of Everything (NETPIE). The design demonstrates the capabilities of the Simatic IoT2040 by running Python, Node-RED, and Mosca simultaneously on the device.
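Since Mosca is an MQTT broker and NETPIE speaks MQTT, a hedged sketch of the gateway-side publishing loop might look as follows; the broker address, topic, and sensor reading are placeholders (NETPIE credentials and the actual Modbus register read are omitted), not the work's configuration:

    import json, time, random
    import paho.mqtt.client as mqtt

    BROKER = "broker.netpie.io"          # hypothetical endpoint
    TOPIC = "factory/line1/temperature"  # hypothetical topic

    def read_modbus_sensor():
        """Stub for the Modbus register read performed on the IoT2040 gateway."""
        return round(20.0 + random.random() * 5.0, 2)   # simulated temperature, deg C

    client = mqtt.Client()               # paho-mqtt 1.x style constructor
    client.connect(BROKER, 1883)
    client.loop_start()

    try:
        while True:
            payload = json.dumps({"temp_c": read_modbus_sensor(), "ts": time.time()})
            client.publish(TOPIC, payload, qos=1)
            time.sleep(5)                # publish every 5 seconds
    except KeyboardInterrupt:
        client.loop_stop()
        client.disconnect()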


Polymers ◽  
2021 ◽  
Vol 13 (3) ◽  
pp. 353
Author(s):  
Kun-Cheng Ke ◽  
Ming-Shyan Huang

Conventional methods for assessing the quality of components mass-produced by injection molding are expensive and time-consuming, or rely on imprecise statistical process control parameters. A suitable alternative is to employ machine learning to classify part quality using quality indices and quality grading. In this study, we used a multilayer perceptron (MLP) neural network along with a few quality indices to accurately predict whether the geometric shape of a finished product is "qualified" or "unqualified". These quality indices, which exhibited a strong correlation with part quality, were extracted from pressure curves and input into the MLP model for learning and prediction. By filtering outliers from the input data and converting the measured quality into quality grades used as output data, we increased the prediction accuracy of the MLP model and classified the quality of finished parts into various quality levels. The MLP model may misjudge datapoints in the "to-be-confirmed" area, which lies between the "qualified" and "unqualified" areas. We therefore isolated this area, and only the quality of products falling in it was evaluated further, which reduced the cost of quality control considerably. An integrated circuit tray was manufactured to experimentally demonstrate the feasibility of the proposed method.
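A hedged sketch of this grading step (the quality indices, training data, and the 0.8 confidence threshold are assumptions, not the authors' measured values): indices extracted from pressure curves feed an MLP classifier, and low-confidence predictions are routed to the "to-be-confirmed" bin for further evaluation:

    import numpy as np
    from sklearn.neural_network import MLPClassifier

    # Hypothetical indices from pressure curves: [peak_pressure, pressure_integral, holding_slope]
    rng = np.random.default_rng(3)
    X = rng.uniform(size=(300, 3))
    y = (X.sum(axis=1) > 1.5).astype(int)   # toy grades: 1 = qualified, 0 = unqualified

    clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=3000, random_state=3).fit(X, y)

    proba = clf.predict_proba(rng.uniform(size=(5, 3)))
    for p in proba:
        if max(p) < 0.8:                    # low confidence -> inspect the part manually
            print("to-be-confirmed")
        else:
            print("qualified" if p[1] >= 0.8 else "unqualified")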


2021 ◽  
Vol 3 (1) ◽  
Author(s):  
Zhikuan Zhao ◽  
Jack K. Fitzsimons ◽  
Patrick Rebentrost ◽  
Vedran Dunjko ◽  
Joseph F. Fitzsimons

Machine learning has recently emerged as a fruitful area for finding potential quantum computational advantage. Many quantum-enhanced machine learning algorithms critically hinge on the ability to efficiently produce states proportional to high-dimensional data points stored in a quantum accessible memory. Even given query access to exponentially many entries stored in a database, the construction of which is considered a one-off overhead, it has been argued that the cost of preparing such amplitude-encoded states may offset any exponential quantum advantage. Here we prove, using smoothed analysis, that if the data analysis algorithm is robust against small entry-wise input perturbations, state preparation can always be achieved with a constant number of queries. This criterion is typically satisfied in realistic machine learning applications, where input data are subject to moderate noise. Our results apply equally to the recent seminal progress in quantum-inspired algorithms, where specially constructed databases suffice for polylogarithmic classical algorithms in the low-rank case. The consequence of our finding is that, for practical machine learning, polylogarithmic processing time is possible under a general and flexible input model, with quantum algorithms or, in the low-rank case, quantum-inspired classical algorithms.
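For reference, the amplitude-encoded state whose preparation cost the paper analyzes is conventionally defined as follows (standard textbook notation, not taken from this paper):

    % Amplitude encoding of a data vector x in R^N (standard definition)
    \[
      \lvert x \rangle = \frac{1}{\lVert x \rVert_2} \sum_{i=1}^{N} x_i \,\lvert i \rangle ,
      \qquad
      \lVert x \rVert_2 = \Bigl( \sum_{i=1}^{N} x_i^2 \Bigr)^{1/2}
    \]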

