Machine Learning-based Orchestration of Containers: A Taxonomy and Future Directions

2022 ◽  
Author(s):  
Zhiheng Zhong ◽  
Minxian Xu ◽  
Maria Alejandra Rodriguez ◽  
Chengzhong Xu ◽  
Rajkumar Buyya

Containerization is a lightweight application virtualization technology, providing high environmental consistency, operating system distribution portability, and resource isolation. Existing mainstream cloud service providers have prevalently adopted container technologies in their distributed system infrastructures for automated application management. To handle the automated deployment, maintenance, autoscaling, and networking of containerized applications, container orchestration is proposed as an essential research problem. However, the highly dynamic and diverse features of cloud workloads and environments considerably raise the complexity of orchestration mechanisms. Machine learning algorithms are accordingly employed by container orchestration systems for behavior modelling and prediction of multi-dimensional performance metrics. Such insights could further improve the quality of resource provisioning decisions in response to changing workloads under complex environments. In this paper, we present a comprehensive literature review of existing machine learning-based container orchestration approaches. Detailed taxonomies are proposed to classify the current research by common features. Moreover, the evolution of machine learning-based container orchestration technologies from 2016 to 2021 is traced based on objectives and metrics. A comparative analysis of the reviewed techniques is conducted according to the proposed taxonomies, with emphasis on their key characteristics. Finally, various open research challenges and potential future directions are highlighted.
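
The prediction-driven provisioning loop described above can be sketched minimally as follows. This is an illustrative stand-in, not any surveyed system's model: the moving-average forecaster, window size, and per-replica capacity are hypothetical choices.

```python
# Minimal sketch of ML-informed autoscaling: forecast the next request
# rate from recent history, then size container replicas to cover it.
# Window size and per-replica capacity are hypothetical parameters.

def predict_next(load_history, window=3):
    """Forecast the next load value as the mean of the last `window` samples."""
    recent = load_history[-window:]
    return sum(recent) / len(recent)

def replicas_needed(predicted_load, capacity_per_replica=100):
    """Provision enough replicas to cover the forecast, rounding up."""
    return max(1, -(-int(predicted_load) // capacity_per_replica))

loads = [120, 180, 240, 300, 360]   # requests/s observed per interval
forecast = predict_next(loads)      # (240 + 300 + 360) / 3 = 300.0
print(forecast, replicas_needed(forecast))
```

Real orchestration research replaces the moving average with richer models (e.g. recurrent networks over multi-dimensional metrics), but the control loop shape is the same: predict, then provision.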

Author(s):  
Narander Kumar ◽  
Surendra Kumar

Background: Cloud computing provides processing and other resources on a metered basis. Delivering high Quality of Service (QoS) to cloud clients under this model remains a significant research problem. Objective: To guarantee QoS with minimal resource usage, time, and cost, cloud service providers ought to adopt self-adaptive resource provisioning at each level. Currently, various guideline- and model-based methodologies have been designed for managing resources in cloud computing services. Method: In this research article, resource allocation is managed by an optimization approach based on the Salp Swarm Algorithm (SSA), which consolidates VMs onto fewer data-center hosts while meeting SLA and Quality-of-Service (QoS) requirements with maximum data-center utilization. Result: The approach is compared with First Fit (FF), greedy crow search (GCS), and hybrid crow search in terms of response time and resource utilization. Conclusion: The proposed mechanism is simulated on the CloudSim simulator; the simulation results show lower migration time, which improves QoS and minimizes energy consumption in cloud computing and IoT environments.
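
For context, the First Fit (FF) baseline the study compares against can be sketched in a few lines: each VM goes onto the first host with enough spare capacity, and hosts left empty can be powered down. Capacities and demands below are hypothetical CPU units, not values from the paper.

```python
# Sketch of the First Fit (FF) VM-consolidation baseline: place each VM
# on the first data-center host with enough spare capacity; hosts that
# stay empty can be powered down to save energy.

def first_fit(vm_demands, host_capacity, num_hosts):
    """Return (host index per VM or None if unplaceable, count of idle hosts)."""
    free = [host_capacity] * num_hosts
    placement = []
    for demand in vm_demands:
        for h, spare in enumerate(free):
            if spare >= demand:
                free[h] -= demand
                placement.append(h)
                break
        else:
            placement.append(None)  # no host can take this VM
    return placement, sum(1 for spare in free if spare == host_capacity)

placement, idle_hosts = first_fit([4, 3, 2, 5], host_capacity=8, num_hosts=3)
print(placement, idle_hosts)  # four VMs packed onto two hosts, one host idle
```

SSA-style optimizers explore many candidate placements instead of this single greedy pass, trading computation for tighter consolidation.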


Author(s):  
Amandeep Singh Arora ◽  
Linesh Raja ◽  
Barkha Bahl

Cloud security is a strong hindrance that discourages organizations from moving to the cloud despite its huge benefits. Denial of Service attacks [1] operated via distributed systems compromise the availability of cloud services. Techniques that identify distributed denial of service attacks with minimal false positives are highly required to ensure availability of cloud services to genuine users. Classifying incoming requests and outgoing responses using machine learning algorithms is a quite effective way of detection and prevention. In this paper, ten machine learning algorithms have been evaluated for performance and detection accuracy. An estimation accuracy method known as F-Hold cross validation [2] is used for time-efficient analysis.
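
As a hedged illustration of the evaluation idea (the paper's F-Hold variant is not detailed in the abstract, so standard k-fold scoring is shown instead), the sketch below scores a fixed toy threshold rule on each held-out fold; a real run would refit the classifier on the remaining folds at each iteration. The traffic data and cutoff are hypothetical.

```python
# Standard k-fold scoring sketch: split the data into k contiguous
# folds and average the accuracy of a classifier on each held-out fold.
# The toy rule flags a source as DDoS when its request rate exceeds 50.

def k_fold_accuracy(samples, labels, k, classify):
    """Mean accuracy over k contiguous held-out folds."""
    n = len(samples)
    fold = n // k
    scores = []
    for i in range(k):
        lo, hi = i * fold, (i + 1) * fold
        test_x, test_y = samples[lo:hi], labels[lo:hi]
        correct = sum(classify(x) == y for x, y in zip(test_x, test_y))
        scores.append(correct / len(test_x))
    return sum(scores) / k

rates = [10, 12, 11, 95, 90, 99, 13, 97]   # requests/s per source
truth = [0, 0, 0, 1, 1, 1, 0, 1]           # 1 = DDoS source
acc = k_fold_accuracy(rates, truth, k=4, classify=lambda r: int(r > 50))
print(acc)  # 1.0 on this cleanly separable toy data
```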


Author(s):  
Ranjan Kumar Behera ◽  
Kshira Sagar Sahoo ◽  
Debadatt Naik ◽  
Santanu Kumar Rath ◽  
Bibhudatta Sahoo

Link prediction is an emerging research problem in social network analysis, where future possible links are predicted based on the structural or content information associated with the network. In this paper, various machine learning (ML) techniques have been utilized to predict future possible links based on features extracted from the topological structure. Feature sets have been prepared by measuring different similarity metrics between all pairs of nodes between which no link exists; the feature set for each instance in the dataset is built from the similarity indices of these non-existent links. For predicting future possible links, various supervised ML algorithms such as k-NN, MLP, bagging, SVM, and decision tree have been implemented. The model has been trained to identify new links which are likely to appear in the future but do not currently exist in the network. Further, the proposed model is validated through various performance metrics.
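
One common similarity metric of the kind the abstract describes is the Jaccard coefficient between the neighbour sets of unlinked node pairs; a high score marks a likely future link. The toy graph below is hypothetical, and the abstract does not say which specific metrics the authors used.

```python
# Score every currently unlinked node pair by the Jaccard coefficient
# of their neighbour sets; such scores become features for supervised
# link-prediction classifiers.
from itertools import combinations

def jaccard(neigh, u, v):
    """|N(u) ∩ N(v)| / |N(u) ∪ N(v)| for nodes u, v."""
    inter = neigh[u] & neigh[v]
    union = neigh[u] | neigh[v]
    return len(inter) / len(union) if union else 0.0

# adjacency sets of a small undirected toy graph
neigh = {
    "a": {"b", "c"},
    "b": {"a", "c", "d"},
    "c": {"a", "b", "d"},
    "d": {"b", "c"},
}
# score only pairs with no existing edge
scores = {
    (u, v): jaccard(neigh, u, v)
    for u, v in combinations(sorted(neigh), 2)
    if v not in neigh[u]
}
print(scores)  # ("a", "d") share all neighbours -> strong link candidate
```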


2021 ◽  
Vol 11 (4) ◽  
pp. 1627
Author(s):  
Yanbin Li ◽  
Gang Lei ◽  
Gerd Bramerdorfer ◽  
Sheng Peng ◽  
Xiaodong Sun ◽  
...  

This paper reviews the recent developments of design optimization methods for electromagnetic devices, with a focus on machine learning methods. First, the recent advances in multi-objective, multidisciplinary, multilevel, topology, fuzzy, and robust design optimization of electromagnetic devices are overviewed. Second, a review is presented of the performance prediction and design optimization of electromagnetic devices based on machine learning algorithms, including artificial neural networks, support vector machines, extreme learning machines, random forests, and deep learning. Last, to meet modern requirements of high manufacturing/production quality and lifetime reliability, several promising topics, including the application of cloud services and digital twins, are discussed as future directions for design optimization of electromagnetic devices.


2021 ◽  
Vol 13 (3) ◽  
pp. 63
Author(s):  
Maghsoud Morshedi ◽  
Josef Noll

Video conferencing services based on the web real-time communication (WebRTC) protocol are growing in popularity among Internet users as multi-platform solutions enabling interactive communication from anywhere, especially during this pandemic era. Meanwhile, Internet service providers (ISPs) have deployed fiber links and customer premises equipment that operate according to recent 802.11ac/ax standards and promise users the ability to establish uninterrupted video conferencing calls with ultra-high-definition video and audio quality. However, the best-effort nature of 802.11 networks and the high variability of wireless medium conditions hinder users from experiencing uninterrupted high-quality video conferencing. This paper presents a novel approach to estimate the perceived quality of service (PQoS) of video conferencing using only 802.11-specific network performance parameters collected from Wi-Fi access points (APs) on customer premises. This study produced datasets comprising 802.11-specific network performance parameters collected from off-the-shelf Wi-Fi APs operating at 802.11g/n/ac/ax standards on both 2.4 and 5 GHz frequency bands to train machine learning algorithms. In this way, we achieved classification accuracies of 92–98% in estimating the level of PQoS of video conferencing services on various Wi-Fi networks. To efficiently troubleshoot wireless issues, we further analyzed the machine learning model to correlate features in the model with the root cause of quality degradation. Thus, ISPs can utilize the approach presented in this study to provide predictable and measurable wireless quality by implementing a non-intrusive quality monitoring approach in the form of edge computing that preserves customers' privacy while reducing the operational costs of monitoring and data analytics.


2021 ◽  
Vol 10 (4) ◽  
pp. 58-75
Author(s):  
Vivek Sen Saxena ◽  
Prashant Johri ◽  
Avneesh Kumar

Melanoma is the deadliest type of skin cancer. Artificial intelligence provides the power to classify skin lesions as melanoma and non-melanoma. The proposed system for melanoma detection and classification involves four steps: pre-processing (resizing all images and removing noise and hair from dermoscopic images); image segmentation (identifying the lesion area); feature extraction (extracting features from the segmented lesion); and classification (categorizing the lesion as malignant (melanoma) or benign (non-melanoma)). A modified GrabCut algorithm is employed to segment the skin lesion. Segmented lesions are classified using machine learning algorithms such as SVM, k-NN, ANN, and logistic regression and evaluated on performance metrics like accuracy, sensitivity, and specificity. Results are compared with existing systems, achieving a higher similarity index and accuracy.
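
A structural sketch of the final stage of such a pipeline is shown below. Image handling is out of scope here; the executable part is a toy 1-NN classifier over two hypothetical lesion features (border irregularity and diameter), which stands in for the k-NN variant the abstract lists.

```python
# Toy 1-NN classification stage: label a lesion by its closest training
# example in a hypothetical two-feature space
# (border irregularity in [0, 1], diameter in mm).

def classify_lesion(features, training_set):
    """Return the label of the nearest training lesion."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(training_set, key=lambda item: sq_dist(item[0], features))[1]

training = [
    ((0.9, 6.5), "malignant"),   # irregular border, large diameter
    ((0.8, 7.0), "malignant"),
    ((0.2, 2.0), "benign"),      # regular border, small diameter
    ((0.3, 2.5), "benign"),
]
print(classify_lesion((0.85, 6.0), training))  # -> malignant
```

In the actual system these features would come from the GrabCut-segmented lesion region rather than being hand-supplied.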


2021 ◽  
Vol 35 (1) ◽  
pp. 11-21
Author(s):  
Himani Tyagi ◽  
Rajendra Kumar

IoT is characterized by communication between things (devices) that constantly share data, analyze it, and make decisions while connected to the internet. This interconnected architecture attracts cyber criminals who seek to expose the IoT system to failure. It therefore becomes imperative to develop a system that can accurately and automatically detect anomalies and attacks occurring in IoT networks. Hence, in this paper, an Intrusion Detection System (IDS) based on a novel feature set synthesized from the BoT-IoT dataset is developed that can swiftly, accurately, and automatically differentiate benign from malicious traffic. Instead of using available feature reduction techniques like PCA, which can change the core meaning of variables, a unique feature set consisting of only seven lightweight features is developed that is also IoT-specific and independent of attack traffic. The results shown in the study demonstrate the effectiveness of the seven fabricated features in detecting four varieties of attacks, namely DDoS, DoS, Reconnaissance, and Information Theft. Furthermore, this study also proves the applicability and efficiency of supervised machine learning algorithms (KNN, LR, SVM, MLP, DT, RF) in IoT security. The performance of the proposed system is validated using performance metrics like accuracy, precision, recall, F-score, and ROC. Although the Decision Tree (99.9%) and Random Forest (99.9%) classifiers achieve the same accuracy, other metrics such as training and testing time show Random Forest to be comparatively better.
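
The abstract does not list the seven features, so the sketch below uses three hypothetical lightweight flow features (packet rate, mean packet size, destination fan-out) purely to illustrate the shape of the approach, with a single hand-set decision rule of the kind a trained tree might learn.

```python
# Hedged sketch: hypothetical lightweight flow features feeding a
# single flood-detection rule. None of these are the paper's actual
# seven features; they only illustrate "few, cheap, per-flow" features.

def flow_features(packets, duration, dest_count):
    """packets: list of packet sizes seen in one flow window (bytes)."""
    rate = len(packets) / duration          # packets per second
    mean_size = sum(packets) / len(packets) # average packet size
    return rate, mean_size, dest_count

def is_malicious(rate, mean_size, dest_count):
    # flood-like traffic: very many small packets toward many targets
    return rate > 100 and mean_size < 100 and dest_count > 10

feats = flow_features([60] * 500, duration=2.0, dest_count=40)
print(feats, is_malicious(*feats))  # high-rate small-packet flow -> flagged
```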


Sales forecasting is important for companies engaged in retailing, logistics, manufacturing, marketing, and wholesaling. It allows companies to allocate resources efficiently, estimate sales revenue, and plan better strategies for the company's future. In this paper, product sales at a particular store are predicted in a way that produces better performance than other machine learning algorithms. The dataset used for this project is the Big Mart sales data of 2013. Nowadays, shopping malls and supermarkets track the sales data of each individual item in order to predict future customer demand; this data contains a large number of customer records and item attributes. Further, frequent patterns are detected by mining data from the data warehouse, and the data can then be used to predict future sales with the help of several machine learning techniques (algorithms) for companies like Big Mart. In this project, we propose a model using the XGBoost algorithm for predicting the sales of companies like Big Mart and find that it produces better performance than other existing models; an analysis comparing this model with other models in terms of their performance metrics is also made. Big Mart is an online marketplace where people can buy, sell, or advertise their merchandise at low cost. The goal of the paper is to make Big Mart a shopping paradise for buyers and a marketing solution for sellers, with the ultimate aim of complete customer satisfaction. The project "SUPERMARKET SALES PREDICTION" builds a predictive model that finds the sales of each product at a particular store; Big Mart can use this model to understand the properties of products that play a major role in increasing sales. This can also be done on the basis of hypotheses formed before looking at the data.
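
The boosting idea behind XGBoost can be sketched without the library: each round fits a small tree (here a one-split stump) to the residuals of the running prediction and adds it with a learning rate. This omits XGBoost's regularization and second-order terms, and the feature/sales values are hypothetical.

```python
# Minimal gradient-boosting sketch with regression stumps, illustrating
# the additive-residual-fitting idea behind XGBoost (no regularization).

def fit_stump(xs, residuals):
    """Best single-split stump minimizing squared error on residuals."""
    best = None
    for split in xs:
        left  = [r for x, r in zip(xs, residuals) if x <= split]
        right = [r for x, r in zip(xs, residuals) if x > split]
        if not left or not right:
            continue
        lmean, rmean = sum(left) / len(left), sum(right) / len(right)
        err = (sum((r - lmean) ** 2 for r in left)
               + sum((r - rmean) ** 2 for r in right))
        if best is None or err < best[0]:
            best = (err, split, lmean, rmean)
    _, split, lmean, rmean = best
    return lambda x: lmean if x <= split else rmean

def boost(xs, ys, rounds=20, lr=0.5):
    """Each round fits a stump to the current residuals."""
    pred = [0.0] * len(ys)
    stumps = []
    for _ in range(rounds):
        resid = [y - p for y, p in zip(ys, pred)]
        stump = fit_stump(xs, resid)
        stumps.append(stump)
        pred = [p + lr * stump(x) for p, x in zip(pred, xs)]
    return lambda x: sum(lr * s(x) for s in stumps)

# hypothetical item-visibility feature -> weekly sales
xs = [0.1, 0.2, 0.3, 0.6, 0.7, 0.8]
ys = [200, 200, 200, 500, 500, 500]
model = boost(xs, ys)
print(round(model(0.25)), round(model(0.75)))
```

Production XGBoost adds deeper trees, shrinkage schedules, and L1/L2 penalties, but the residual-fitting loop above is the core mechanism.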


Author(s):  
Shanthi Thangam Manukumar ◽  
Vijayalakshmi Muthuswamy

With the development of edge and mobile devices, fast authenticated access to networks is necessary and important. Fog computing is an efficient way to make edge and mobile devices smart and fast and to deliver better quality of service (QoS). Fog computing provides resource provisioning, service provision, fast response times, and the best solution for mobile network traffic. In this chapter, a method is proposed for handling fog resource management using an efficient offloading mechanism. Offloading is performed based on machine learning prediction technology, using the KNN algorithm to identify the nearest fog nodes to offload to. The proposed method minimizes energy consumption and latency and improves QoS for edge devices, IoT devices, and mobile devices.
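
The nearest-node selection step can be sketched as below: find the k nearest fog nodes to a device and offload to the least loaded among them. The tie-breaking by load and all coordinates/load values are hypothetical additions for illustration.

```python
# KNN-style fog-node selection: shortlist the k geographically nearest
# fog nodes, then offload to the least-loaded candidate.
import math

def nearest_fog_nodes(device, nodes, k=3):
    """nodes: {name: (x, y, load)}; return names of the k closest nodes."""
    def dist(name):
        x, y, _ = nodes[name]
        return math.hypot(x - device[0], y - device[1])
    return sorted(nodes, key=dist)[:k]

def choose_target(device, nodes, k=3):
    """Among the k nearest nodes, pick the one with the lowest load."""
    candidates = nearest_fog_nodes(device, nodes, k)
    return min(candidates, key=lambda n: nodes[n][2])

fog = {
    "fog-a": (1, 1, 0.9),   # (x, y, current load fraction)
    "fog-b": (2, 1, 0.2),
    "fog-c": (8, 8, 0.1),
    "fog-d": (1, 2, 0.5),
}
print(choose_target((0, 0), fog, k=3))  # nearby and lightly loaded -> fog-b
```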


2019 ◽  
Vol 9 (18) ◽  
pp. 3665 ◽  
Author(s):  
Ahmet Çağdaş Seçkin ◽  
Aysun Coşkun

Wi-Fi-based indoor positioning offers significant opportunities for numerous applications. Examining existing Wi-Fi positioning systems, it was observed that hundreds of variables were used even when variable reduction was applied. This reveals a structure that is difficult to reproduce and far from producing a common solution for real-life applications. This study aims to create a common and standardized dataset for indoor positioning and localization and to present a system that can perform estimations using this dataset. To that end, machine learning (ML) methods are compared, and the results of successful methods with hierarchical inclusion are then investigated. Further, new features are generated according to the measurement points obtained from the dataset. Subsequently, learning models are selected according to performance metrics for the estimation of location and position. These learning models are then fused hierarchically using deductive reasoning. Using the proposed method, estimating location and position has proved more successful while using fewer variables than current studies. This paper thus identifies a lack of applicability in the research community and addresses it using the proposed method, which yields a significant improvement in the estimation of floor and longitude.
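
The hierarchical fusion idea can be sketched as a two-stage lookup: a first model picks the floor from the Wi-Fi fingerprint, then a second model estimates longitude using only reference points on that floor. Both stages here are nearest-fingerprint matches, and the RSSI reference data is hypothetical; the paper's actual learners are selected by performance metrics.

```python
# Two-stage hierarchical estimation: classify the floor first, then
# estimate longitude among references restricted to that floor.

def match(fingerprint, references):
    """Nearest reference point by squared RSSI distance."""
    def dist(ref):
        return sum((a - b) ** 2 for a, b in zip(ref[0], fingerprint))
    return min(references, key=dist)

# reference points: (RSSI vector over 3 APs in dBm, floor, longitude)
refs = [
    ((-40, -70, -90), 1, 10.5),
    ((-45, -65, -85), 1, 12.0),
    ((-90, -45, -40), 3, 30.0),
    ((-85, -50, -45), 3, 28.5),
]
fp = (-42, -68, -88)                        # observed fingerprint
floor = match(fp, refs)[1]                  # stage 1: floor decision
same_floor = [r for r in refs if r[1] == floor]
longitude = match(fp, same_floor)[2]        # stage 2: position on that floor
print(floor, longitude)
```

Restricting stage 2 to the chosen floor is what lets each stage use fewer variables than a single flat model over all floors.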

