Computational Offloading through 5G-Enabled Edge Computing in IIoT

2020 ◽ Vol 8 (6) ◽ pp. 1417-1420

Computational offloading is an active research area. To improve computational offloading and the security of data, we use a game theory approach together with 5G-enabled edge computing. Edge computing provides solutions across various sectors: such arrangements not only reduce the load on the cloud by processing data at the edge, but also play a significant role in data security by ensuring that data communication remains local to the network that directly connects the user equipment, with the local server then forwarding data to the organization's network core. As organizations and other institutions look to integrate this edge-focused approach into their communication infrastructure, it is used to collect and manage data securely. By combining 5G with a game theory approach, we can manage data transmission more easily. The game theory approach helps IoT devices make decisions autonomously and reduces the computational offloading burden. 5G is developing more rapidly than ever before. Owing to its very high speeds, high bandwidth, reduced latency, and increased capacity over 4G, 5G can provide greater security and better computational offloading than 4G. 5G network speeds are expected to reach 20-30 times those of 4G. That improvement makes it possible for far-away sensors to connect and to reduce latency through local servers. Because of its shorter range, 5G uses a distributed network of base stations in a small-cell infrastructure. Due to its higher frequencies, 5G uses a new radio spectrum (NR) structure. 5G further improves data security through network slicing. Our experiment in this paper therefore applies a dynamic computational offloading algorithm to user equipment that transfers data via 5G.
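The game-theoretic offloading idea in this abstract can be sketched as a best-response dynamic in which each device compares its local execution time against the cost of offloading over a shared 5G uplink. This is an illustrative model, not the paper's algorithm; the `Device` fields, cost formulas, and parameter values are assumptions.

```python
from dataclasses import dataclass

@dataclass
class Device:
    cycles: float       # CPU cycles the task requires
    data_bits: float    # input data to upload if offloaded
    local_speed: float  # device CPU speed, cycles/s

def local_cost(d):
    return d.cycles / d.local_speed

def offload_cost(d, bandwidth_bps, n_offloaders, edge_speed):
    # The 5G uplink is shared: more offloaders means a smaller share
    # each -- the congestion effect that makes this a game.
    share = bandwidth_bps / max(n_offloaders, 1)
    return d.data_bits / share + d.cycles / edge_speed

def best_response_offloading(devices, bandwidth_bps, edge_speed, rounds=20):
    """Each device repeatedly picks the cheaper option given the others'
    current choices; a fixed point is a Nash equilibrium."""
    offload = [False] * len(devices)
    for _ in range(rounds):
        changed = False
        for i, d in enumerate(devices):
            n = sum(offload) - offload[i] + 1   # offloader count if i offloads
            want = offload_cost(d, bandwidth_bps, n, edge_speed) < local_cost(d)
            if want != offload[i]:
                offload[i], changed = want, True
        if not changed:
            break
    return offload
```

Under this model, a compute-heavy task with a small input tends to offload, while a light task with a large input stays on the device.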

Author(s):  
Xianyu Meng ◽  
Wei Lu

Mobile edge computing (MEC) provides users with low-latency, high-bandwidth, and high-reliability services by migrating the computing power of the cloud computing center to the edge of the network. It is thus considered an effective solution for the contradiction between the limited computing capabilities of Internet of Things (IoT) devices and the rapid development of delay-sensitive real-time applications. In this study, we propose and design a container union file system based on a differencing hard disk and a dynamic loading strategy to address the excessively long migration time caused by bundling the file system and container images during container-based service migration. The proposed method uses a remote dynamic loading strategy to avoid downloading all container images, thereby reducing the long preparation time before stateless migration can begin. Furthermore, in view of the excessive edge-service latency during the stateful migration process, a strategy that avoids transmitting the underlying file system and container images is designed to reduce the service interruption time and the service-quality degradation time. Experiments show that the proposed method and strategy effectively reduce the migration time of container-based services.
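The differencing-disk and dynamic-loading idea can be sketched as a union file system that pulls read-only image layers only on first access and ships just the writable layer during stateful migration. The class and method names below are hypothetical, not the authors' implementation.

```python
class LazyLayerFS:
    """Sketch of a container union file system: a writable differencing
    layer stacked over read-only image layers that are fetched from a
    registry only on first access (dynamic loading)."""

    def __init__(self, layer_names, fetch):
        self.layer_names = layer_names   # top-most layer first
        self.fetch = fetch               # callable: name -> {path: data}
        self.loaded = {}                 # layers pulled so far
        self.upper = {}                  # writable differencing layer

    def read(self, path):
        if path in self.upper:                        # writes shadow images
            return self.upper[path]
        for name in self.layer_names:
            if name not in self.loaded:
                self.loaded[name] = self.fetch(name)  # pull on demand
            if path in self.loaded[name]:
                return self.loaded[name][path]
        raise FileNotFoundError(path)

    def write(self, path, data):
        self.upper[path] = data          # copy-on-write into the diff layer

    def migrate_state(self):
        # Stateful migration ships only the differencing layer; image
        # layers are re-fetched lazily at the destination.
        return dict(self.upper)
```

The key property is that migration cost scales with the size of the writable layer, not the full image stack.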


2020 ◽ Vol 2 (1) ◽ pp. 50-61
Author(s):  
Dr. Smys S. ◽  
Dr. Ranganathan G.

The edge paradigm, regarded as a prominent computing model due to its low computation latencies, faces multiple issues and challenges arising from restrictions on its computing capability and resource availability, especially in heavily populated scenarios. To examine the problems faced during task scheduling when edge computing is called upon by multiple users at a time, this paper puts forward a game theory approach. Using this strategy, the paper proposes a novel multi-task scheduling method for edge computing from the user's perspective, developing an algorithm that takes into account the consistency of stable tasks. The proposed task-allocation algorithm is analyzed in terms of the average task execution time and the waiting time. The results show that the proposed method provides higher throughput and lower waiting time than the conventional methods used to optimize scheduling parameters.
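A minimal sketch of game-theoretic multi-user task scheduling, modeled as a load-balancing game in which each task repeatedly moves to the least-loaded edge server until no task benefits from moving (a pure Nash equilibrium). The model and function are illustrative assumptions, not the paper's algorithm.

```python
def schedule_tasks(task_times, n_servers, rounds=100):
    """Best-response scheduling: a task's completion time is the load of
    its server, so each task moves to the server that minimizes it."""
    assign = [0] * len(task_times)        # everyone starts on server 0
    load = [0.0] * n_servers
    load[0] = sum(task_times)
    for _ in range(rounds):
        moved = False
        for i, t in enumerate(task_times):
            load[assign[i]] -= t          # take the task out of its queue
            best = min(range(n_servers), key=load.__getitem__)
            if best != assign[i]:
                assign[i] = best
                moved = True
            load[assign[i]] += t
        if not moved:                     # no task wants to deviate
            break
    return assign, load
```

Because each move strictly lowers the moving task's completion time, the dynamic terminates at an equilibrium rather than cycling.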


2021 ◽ Vol 2021 ◽ pp. 1-11
Author(s):  
Ying Yu

With the popularization of mobile terminals and the rapid development of mobile communication technology, many PC-based services place high demands on data processing and storage. Cloud laptops, which transfer data processing tasks to the cloud, cannot meet users' needs for low latency and high-quality services. In view of this, researchers have proposed the concept of mobile edge computing. Mobile edge computing (MEC) is based on the 5G evolution architecture: by deploying multiple service servers on the base-station side, near the edge of the user's mobile core network, it provides nearby computing and processing services for user traffic. This article studies how caching and MEC processing functions can be used to design an effective caching and distribution mechanism at the network edge and applies it to civil aviation express marketing. The paper focuses on mobile edge computing technology, combining it with data warehouse technology, clustering algorithms, and other methods to build an experimental model of an MEC-based caching mechanism applied to civil aviation express marketing. The experimental results show that, when the cache space and the number of service contents are fixed, the LECC mechanism outperforms LENC, LRU, and RR in cache hit rate, average content transmission delay, and transmission overhead. For example, with the same cache space, the ATC under the LECC mechanism is about 4%~9%, 8%~13%, and 18%~22% lower than under LENC, LRU, and RR, respectively.
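The LECC mechanism itself is not specified in the abstract, but two of its baselines, LRU and random replacement (RR), can be sketched to show how the cache hit rate in such comparisons is typically measured over a request trace. The `simulate` helper below is an illustrative assumption, not the paper's evaluation code.

```python
import random
from collections import OrderedDict

def simulate(requests, capacity, policy):
    """Replay a request trace against an LRU or random-replacement (RR)
    cache and return the hit rate."""
    cache = OrderedDict()
    hits = 0
    for item in requests:
        if item in cache:
            hits += 1
            if policy == "LRU":
                cache.move_to_end(item)          # mark as most recent
        else:
            if len(cache) >= capacity:
                if policy == "LRU":
                    cache.popitem(last=False)    # evict least recently used
                else:                            # RR: evict a random victim
                    del cache[random.choice(list(cache))]
            cache[item] = True
    return hits / len(requests)
```

A content-aware policy such as LECC would plug a smarter eviction choice into the same loop, which is why hit rate, delay, and overhead can be compared under identical traces and cache sizes.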


Author(s):  
Jaber Almutairi ◽  
Mohammad Aldossary

Recently, the number of Internet of Things (IoT) devices connected to the Internet has increased dramatically, as has the data produced by these devices. This requires offloading IoT tasks to resource-rich nodes such as Edge Computing and Cloud Computing to relieve heavy computation and storage. Although Edge Computing is a promising enabler for latency-sensitive applications, its deployment introduces new challenges. Moreover, different service architectures and offloading strategies have different impacts on the service-time performance of IoT applications. Therefore, this paper presents a novel approach for task offloading in an Edge-Cloud system that minimizes the overall service time for latency-sensitive applications. The approach adopts fuzzy logic algorithms, considering application characteristics (e.g., CPU demand, network demand, and delay sensitivity) as well as resource utilization and resource heterogeneity. A number of simulation experiments are conducted to compare the proposed approach with related approaches; it was found to improve the overall service time for latency-sensitive applications and to utilize the edge-cloud resources effectively. The results also show that different offloading decisions within the Edge-Cloud system can lead to varying service times because of the computational resources and communication types involved.
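The fuzzy-logic offloading decision can be sketched with triangular membership functions and a few Mamdani-style rules over the characteristics the abstract names (CPU demand, delay sensitivity, edge utilization). The breakpoints and rule base below are illustrative assumptions, not the authors' controller.

```python
def tri(x, a, b, c):
    """Triangular fuzzy membership function over [a, c], peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def high(x):
    """Degree to which a quantity normalized to 0..1 is 'high'."""
    return tri(x, 0.3, 1.0, 1.7)

def offload_decision(cpu_demand, delay_sensitivity, edge_util):
    """All inputs normalized to 0..1; returns 'edge', 'cloud', or 'local'.
    min acts as fuzzy AND, (1 - x) as fuzzy NOT."""
    # R1: heavy task AND edge not busy            -> offload to edge
    edge_score = min(high(cpu_demand), 1 - high(edge_util))
    # R2: heavy task AND edge busy AND delay-tolerant -> offload to cloud
    cloud_score = min(high(cpu_demand), high(edge_util),
                      1 - high(delay_sensitivity))
    # R3: light task                              -> run locally
    local_score = 1 - high(cpu_demand)
    scores = {"edge": edge_score, "cloud": cloud_score, "local": local_score}
    return max(scores, key=scores.get)
```

The appeal of the fuzzy formulation is that borderline inputs activate several rules partially, so the decision degrades gracefully instead of flipping at hard thresholds.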


2021 ◽ Vol 107 ◽ pp. 105495
Author(s):  
Nima Pournabi ◽  
Somaye Janatrostami ◽  
Afshin Ashrafzadeh ◽  
Kourosh Mohammadi
