Ultra-Reliable and Low-Latency Computing in the Edge with Kubernetes

2021 ◽  
Vol 19 (3) ◽  
Author(s):  
László Toka

Abstract: Novel applications will require extending traditional cloud computing infrastructure with compute resources deployed close to the end user. Edge and fog computing tightly integrated with carrier networks can fulfill this demand. The emphasis is on integration: the rigorous delay constraints, the need to ensure reliability on distributed, remote compute nodes, and the sheer scale of the system altogether call for a powerful resource provisioning platform that offers applications the best of the underlying infrastructure. We therefore propose Kubernetes-edge-scheduler, which provides high reliability for applications at the edge while provisioning less than 10% of resources for this purpose, and at the same time guarantees compliance with the latency requirements that end users expect. We present a novel topology clustering method that considers application latency requirements and enables scheduling applications even on a worldwide scale of edge clusters. We demonstrate that in a potential use case, a distributed stream analytics application, our orchestration system can reduce the job completion time to 40% of the baseline provided by the default Kubernetes scheduler.
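The scheduling idea in the abstract can be illustrated with a minimal sketch: first filter edge nodes to those whose measured latency to the user's zone satisfies the application's bound, then score the feasible nodes. All node names, latency figures, and the 20 ms bound below are hypothetical illustrations, not the paper's actual algorithm or data.

```python
# Hypothetical sketch of latency-aware edge scheduling: filter nodes by a
# per-application latency bound, then pick the feasible node with most spare CPU.
# Names and numbers are illustrative; the paper's method clusters topology
# more elaborately than this two-step filter/score.

def filter_nodes(nodes, latency_ms, max_latency_ms):
    """Keep only nodes whose RTT to the user's zone meets the app's bound."""
    return [n for n in nodes if latency_ms.get(n, float("inf")) <= max_latency_ms]

def pick_node(candidates, free_cpu):
    """Among feasible nodes, prefer the one with the most spare CPU cores."""
    return max(candidates, key=lambda n: free_cpu.get(n, 0), default=None)

nodes = ["edge-a", "edge-b", "core-dc"]
latency_ms = {"edge-a": 8, "edge-b": 15, "core-dc": 60}
free_cpu = {"edge-a": 2, "edge-b": 6, "core-dc": 32}

feasible = filter_nodes(nodes, latency_ms, max_latency_ms=20)
print(feasible)                        # only the edge nodes meet the 20 ms bound
print(pick_node(feasible, free_cpu))   # the feasible node with more spare CPU
```

In Kubernetes terms, the filter step corresponds to a Filter plugin (or scheduler extender) rejecting nodes that violate the latency constraint, and the scoring step to a Score plugin ranking the remainder.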

Author(s):  
Minal Moharir ◽  
Bharat Rahuldhev Patil

The drawbacks of cloud computing lie in data velocity, bandwidth, and privacy. This chapter focuses on why fog computing presents an effective solution to these problems. It first explains the primary motivation behind the use of fog computing. Fog computing, in essence, extends the services of the cloud towards the edge of the network (i.e., towards the devices nearer to the customer or the end user). Doing so offers several advantages. Some of the discussed advantages are scalability, low latency, reduced network traffic, and increased efficiency. The chapter then explains the architecture for implementing a fog network, followed by its applications. Some commercial fog products are also discussed, and a use case for an airport security system is presented.


2018 ◽  
Vol 7 (2.7) ◽  
pp. 345
Author(s):  
Chandra Sekhar Maganty ◽  
Kothamasu Kiran Kumar

Cloud computing is a transformative model in which large applications are hosted centrally and data is exchanged among different platforms to serve clients belonging to different organizations. It ensures efficient use of resources by making data, software, and infrastructure available at minimal cost, along with security and reliability. Even though cloud computing offers many advantages, it has certain limitations, such as network congestion, limited fault tolerance, and constrained bandwidth. To overcome these issues, a new computing model called Fog Computing has been introduced. This model can transfer sensitive data to other devices in the network without delay. The key difference between the two is that fog is located closer to the end user or device and responds to the client almost instantly. Moreover, it benefits real-time streaming applications and the Internet of Things, which need reliable, high-speed internet connectivity. This paper is a review of Fog Computing, covering the differences between edge and fog computing, use cases of fog, and its architecture.


2022 ◽  
Author(s):  
Anupama Mampage ◽  
Shanika Karunasekera ◽  
Rajkumar Buyya

Serverless computing has emerged as an attractive deployment option for cloud applications in recent times. The unique features of this computing model include rapid auto-scaling, strong isolation, fine-grained billing options, and access to a massive service ecosystem that autonomously handles resource management decisions. Owing to these characteristics, this model is increasingly being explored for deployments in geographically distributed edge and fog computing networks as well. Effective management of computing resources has long attracted attention among researchers. Automating the entire process of resource provisioning, allocation, scheduling, monitoring, and scaling calls for a specialized focus on resource management under the serverless model. In this article, we identify the major aspects covering the broader concept of resource management in serverless environments and propose a taxonomy of elements that influence these aspects, encompassing characteristics of system design, workload attributes, and stakeholder expectations. We take a holistic view of serverless environments deployed across edge, fog, and cloud computing networks. We also analyse existing works discussing aspects of serverless resource management using this taxonomy. The article further identifies gaps in the literature and highlights future research directions for improving the capabilities of this computing model.


Author(s):  
Zhuo Zou ◽  
Yi Jin ◽  
Paavo Nevalainen ◽  
Yuxiang Huan ◽  
Jukka Heikkonen ◽  
...  

2019 ◽  
Vol 154 ◽  
pp. 22-36 ◽  
Author(s):  
Shreshth Tuli ◽  
Redowan Mahmud ◽  
Shikhar Tuli ◽  
Rajkumar Buyya
