An Algorithm to Minimize Energy Consumption and Elapsed Time for IoT Workloads in a Hybrid Architecture

Sensors ◽  
2021 ◽  
Vol 21 (9) ◽  
pp. 2914
Author(s):  
Julio C. S. dos Anjos ◽  
João L. G. Gross ◽  
Kassiano J. Matteussi ◽  
Gabriel V. González ◽  
Valderi R. Q. Leithardt ◽  
...  

Advances in communication technologies have enabled the interaction of small devices, such as smartphones, wearables, and sensors, scattered across the Internet, bringing a whole new set of complex applications with ever greater task-processing needs. These Internet of Things (IoT) devices run on batteries with strict energy restrictions and tend to offload task processing to remote servers, usually to cloud computing (CC) datacenters located geographically far from the IoT device. In this context, this work proposes a dynamic cost model to minimize energy consumption and task processing time for IoT scenarios in mobile edge computing (MEC) environments. Our approach provides a detailed cost model, with an algorithm called TEMS that considers the energy and time consumed during processing, the cost of data transmission, and the energy of idle devices. The task scheduler chooses among the cloud, a MEC server, or the local IoT device to achieve better execution time at lower cost. Evaluation in a simulated environment shows energy savings of up to 51.6% and improvements in task completion time of up to 86.6%.
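For illustration only, a minimal sketch of the kind of venue comparison such a scheduler might perform, assuming a hypothetical weighted energy/time cost and made-up device, MEC, and cloud parameters; none of these figures or field names come from the paper.

```python
# A hypothetical sketch of the kind of venue comparison TEMS-style
# scheduling performs; the cost model, weights, and every number below
# are illustrative assumptions, not values from the paper.

from dataclasses import dataclass

@dataclass
class Venue:
    name: str
    cpu_cycles_per_sec: float  # processing speed of this venue
    power_device_w: float      # device-side power while the task runs here
                               # (active if local, idle/waiting if offloaded)
    uplink_mbps: float         # 0 for local execution (nothing to transfer)
    power_tx_w: float          # device radio power during transmission

def cost(task_cycles, task_mbits, v, w_energy=0.5, w_time=0.5):
    """Weighted energy/time cost of running one task on venue v."""
    t_proc = task_cycles / v.cpu_cycles_per_sec
    t_tx = task_mbits / v.uplink_mbps if v.uplink_mbps else 0.0
    energy = v.power_tx_w * t_tx + v.power_device_w * t_proc
    return w_energy * energy + w_time * (t_proc + t_tx)

venues = [
    Venue("local IoT device", 1e9,  0.9, 0.0,  0.0),
    Venue("MEC server",       8e9,  0.1, 50.0, 1.2),
    Venue("cloud datacenter", 3e10, 0.1, 10.0, 1.2),
]

task_cycles, task_mbits = 4e9, 8.0
best = min(venues, key=lambda v: cost(task_cycles, task_mbits, v))
print("schedule task on:", best.name)
```

The weights `w_energy` and `w_time` simply trade the two objectives off against each other; the paper's actual model defines its own cost terms.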


Sensors ◽  
2021 ◽  
Vol 21 (14) ◽  
pp. 4798
Author(s):  
Fangni Chen ◽  
Anding Wang ◽  
Yu Zhang ◽  
Zhengwei Ni ◽  
Jingyu Hua

With the increasing deployment of IoT devices and applications, a large number of devices that can sense and monitor the environment are needed in the IoT network. This trend also brings great challenges, such as data explosion and energy insufficiency. This paper proposes a system that integrates mobile edge computing (MEC) and simultaneous wireless information and power transfer (SWIPT) technology to improve the service supply capability of WSN-assisted IoT applications. A novel optimization problem is formulated to minimize total system energy consumption under data transmission rate and transmit power constraints by jointly considering power allocation, CPU frequency, the offloading weight factor, and the energy harvest weight factor. Since the problem is non-convex, we propose a novel alternate group iteration optimization (AGIO) algorithm, which decomposes the original problem into three subproblems and alternately optimizes each subproblem using a group interior-point iterative algorithm. Numerical simulations validate that the energy consumption of the proposed design is much lower than that of two benchmark algorithms. The relationship between the system variables and the system's energy consumption is also discussed.
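A rough sketch of the alternating pattern described above, with a toy surrogate objective and a plain grid search standing in for the group interior-point step; the objective, variables, and bounds are assumptions, not the paper's formulation.

```python
# Illustrative alternating optimization in the spirit of AGIO: the real
# algorithm partitions the variables into three groups (power allocation,
# CPU frequency, offloading/energy-harvest weights) and optimizes each
# group in turn; here a made-up objective and a grid search stand in.

def toy_energy(p, f, w):
    """Hypothetical surrogate for total system energy (not the paper's model)."""
    tx   = p * (1.0 / (0.1 + w))                # transmit energy grows with power
    comp = 1e-3 * f ** 2 * (1.0 - w)            # local computing energy ~ f^2
    lag  = 5.0 / (p * w + f * (1 - w) + 1e-6)   # penalty for slow service
    return tx + comp + lag

def argmin_1d(fun, lo, hi, steps=200):
    grid = [lo + (hi - lo) * i / steps for i in range(steps + 1)]
    return min(grid, key=fun)

p, f, w = 1.0, 10.0, 0.5                        # initial power, CPU freq, offload weight
for _ in range(20):                             # alternate over the three groups
    p = argmin_1d(lambda x: toy_energy(x, f, w), 0.1, 5.0)
    f = argmin_1d(lambda x: toy_energy(p, x, w), 1.0, 50.0)
    w = argmin_1d(lambda x: toy_energy(p, f, x), 0.0, 1.0)

print(f"p={p:.2f}, f={f:.2f}, w={w:.2f}, energy={toy_energy(p, f, w):.3f}")
```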


2020 ◽  
Author(s):  
João Luiz Grave Gross ◽  
Cláudio Fernando Resin Geyer

In a scenario with increasingly mobile devices connected to the Internet, data-intensive applications, and energy consumption limited by battery capacity, we propose a cost minimization model for IoT devices in a Mobile Edge Computing (MEC) architecture with the main objective of reducing total energy consumption and the total elapsed time from task creation to completion. The cost model is implemented using the TEMS (Time and Energy Minimization Scheduler) scheduling algorithm and validated through simulation. The results show that it is possible to reduce the energy consumed in the system by up to 51.61% and the total elapsed time by up to 86.65% in the simulated cases, with the parameters and characteristics defined in each experiment.
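For context, percentages like these are relative reductions of the scheduler's totals against a baseline run; a trivial sketch with made-up numbers:

```python
# How relative savings such as those reported (51.61% energy, 86.65% time)
# are computed: reduction of the scheduler's totals against a baseline.
# The figures below are invented for illustration only.

def reduction(baseline: float, value: float) -> float:
    return 100.0 * (baseline - value) / baseline

baseline = {"energy_j": 620.0, "elapsed_s": 300.0}   # hypothetical baseline run
tems     = {"energy_j": 300.0, "elapsed_s": 40.0}    # hypothetical TEMS run

print(f"energy saved : {reduction(baseline['energy_j'], tems['energy_j']):.2f}%")
print(f"time reduced : {reduction(baseline['elapsed_s'], tems['elapsed_s']):.2f}%")
```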


Author(s):  
Michael P. J. Mahenge ◽  
Chunlin Li ◽  
Camilius A. Sanga

The overwhelming growth of resource-intensive and latency-sensitive applications triggers challenges in legacy mobile cloud computing (MCC) architectures. Such challenges include congestion in the backhaul link, high latency, inefficient bandwidth usage, insufficient performance, and poor quality of service (QoS) metrics. The objective of this study was to find a cost-efficient design that maximizes resource utilization at the edge of the mobile network and, in return, minimizes task processing costs. Thus, this study proposes cooperative mobile edge computing (coopMEC) to address the aforementioned challenges in the MCC architecture. In the proposed approach, resource-intensive jobs can be offloaded from user equipment to the MEC layer, which has the potential to enhance performance on resource-constrained mobile devices. The simulation results demonstrate the potential gains of the proposed approach in terms of reduced response delay and resource consumption. This, in turn, improves performance and QoS, and guarantees cost-effectiveness in meeting users' demands.
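A minimal sketch of one way such edge-level cooperation could look, assuming a hypothetical overload threshold and least-loaded neighbour selection; the paper's actual coopMEC mechanism may differ.

```python
# A toy cooperative-dispatch rule (assumed, not the paper's algorithm):
# an overloaded edge server hands a job to the least-loaded neighbouring
# edge server instead of pushing it straight to the remote cloud.

from collections import deque

class EdgeServer:
    def __init__(self, name, capacity):
        self.name, self.capacity, self.queue = name, capacity, deque()

    def load(self):
        return len(self.queue) / self.capacity

def dispatch(job, home, neighbours, overload=0.8):
    """Place the job at the home server, a cooperating neighbour, or the cloud."""
    if home.load() < overload:
        home.queue.append(job)
        return home.name
    candidates = [s for s in neighbours if s.load() < overload]
    if candidates:
        target = min(candidates, key=EdgeServer.load)
        target.queue.append(job)
        return target.name
    return "cloud"    # fall back to MCC when the whole edge tier is busy

edge_a, edge_b, edge_c = EdgeServer("edge-A", 4), EdgeServer("edge-B", 6), EdgeServer("edge-C", 2)
for job in range(10):
    print(job, "->", dispatch(job, edge_a, [edge_b, edge_c]))
```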


2022 ◽  
Vol 18 (2) ◽  
pp. 1-25
Author(s):  
Jing Li ◽  
Weifa Liang ◽  
Zichuan Xu ◽  
Xiaohua Jia ◽  
Wanlei Zhou

We are embracing an era of the Internet of Things (IoT). The latency introduced by unstable wireless networks, caused by the limited resources of IoT devices, seriously impacts users' quality of service, particularly the service delay they experience. Mobile Edge Computing (MEC) technology provides promising solutions for delay-sensitive IoT applications, where cloudlets (edge servers) are co-located with wireless access points in the proximity of IoT devices. The service response latency of IoT applications can be significantly shortened because their data processing can be performed in a local MEC network. Meanwhile, most IoT applications impose Service Function Chain (SFC) enforcement on their data transmission, where each data packet from the source gateway of an IoT device to the destination (a cloudlet) of the IoT application must pass through each Virtual Network Function (VNF) in the SFC in an MEC network. However, little attention has been paid to such service provisioning of multi-source IoT applications in an MEC network with SFC enforcement. In this article, we study service provisioning in an MEC network for multi-source IoT applications with SFC requirements, aiming to minimize the cost of such provisioning, where each IoT application has multiple data streams from different sources to be uploaded to a location (cloudlet) in the MEC network for aggregation, processing, and storage. To this end, we first formulate two novel optimization problems: the cost minimization problem of service provisioning for a single multi-source IoT application, and the service provisioning problem for a set of multi-source IoT applications; we show that both problems are NP-hard. Second, we propose a service provisioning framework in the MEC network for multi-source IoT applications that consists of uploading stream data from the multiple sources of an IoT application to the MEC network, data stream aggregation and routing through VNF instance placement and sharing, and workload balancing among cloudlets. Third, we devise an efficient algorithm for the cost minimization problem built upon the proposed service provisioning framework, and further extend the solution to the service provisioning problem for a set of multi-source IoT applications. We finally evaluate the performance of the proposed algorithms through experimental simulations. Simulation results demonstrate that the proposed algorithms are promising.
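A toy illustration of the SFC constraint and of VNF instance sharing, not the authors' algorithm; the placement and processing costs, cloudlet capacities, and chain are invented.

```python
# Toy SFC-constrained provisioning: every stream must traverse each VNF
# of its chain; an instance is shared if one already exists, otherwise a
# new one is placed on the cloudlet with the most free slots. All names
# and costs are hypothetical.

PLACEMENT_COST, PROCESSING_COST = 5.0, 1.0

cloudlets = {"c1": 3, "c2": 2, "c3": 3}          # remaining VNF slots per cloudlet
instances = {}                                    # VNF type -> hosting cloudlet

def route_stream(chain, load):
    cost = 0.0
    for vnf in chain:
        if vnf not in instances:                  # place a new, shareable instance
            host = max((c for c, s in cloudlets.items() if s > 0),
                       key=lambda c: cloudlets[c])
            cloudlets[host] -= 1
            instances[vnf] = host
            cost += PLACEMENT_COST
        cost += PROCESSING_COST * load            # stream is processed at the instance
    return cost

sfc = ["firewall", "nat", "ids"]                  # the chain every packet must follow
total = sum(route_stream(sfc, load) for load in [2.0, 1.5, 3.0])  # three sources
print("instances:", instances, "total cost:", total)
```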


Author(s):  
Zhuofan Liao ◽  
Jingsheng Peng ◽  
Bing Xiong ◽  
Jiawei Huang

With the combination of Mobile Edge Computing (MEC) and next-generation cellular networks, computation requests from end devices can be offloaded promptly and accurately to edge servers equipped on Base Stations (BSs). However, due to the densified heterogeneous deployment of BSs, an end device may be covered by more than one BS, which brings new challenges for the offloading decision: whether and where to offload computing tasks for low latency and energy cost. This paper formulates a multi-user-to-multi-servers (MUMS) edge computing problem in ultra-dense cellular networks. The MUMS problem is divided and conquered in two phases: server selection and offloading decision. In the server selection phase, mobile users are grouped to one BS considering both physical distance and workload. After the grouping, the original problem is divided into parallel multi-user-to-one-server offloading decision subproblems. To obtain fast and near-optimal solutions for these subproblems, a distributed offloading strategy based on a binary-coded genetic algorithm is designed to produce an adaptive offloading decision. A convergence analysis of the genetic algorithm is given, and extensive simulations show that the proposed strategy significantly reduces the average latency and energy consumption of mobile devices. Compared with state-of-the-art offloading approaches, our strategy reduces the average delay by 56% and total energy consumption by 14% in ultra-dense cellular networks.
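A minimal binary-coded genetic algorithm in the spirit of the offloading-decision phase; the per-user local/offload costs and the congestion penalty are assumptions, not the paper's fitness model.

```python
# Binary-coded GA sketch: bit i = 1 means user i offloads to the serving
# BS's edge server, bit i = 0 means it computes locally. The cost model
# below is invented for illustration.

import random
random.seed(0)

N_USERS, POP, GENS = 12, 30, 60
local_cost   = [random.uniform(2.0, 6.0) for _ in range(N_USERS)]  # cost if computed locally
offload_cost = [random.uniform(1.0, 3.0) for _ in range(N_USERS)]  # base cost if offloaded
server_penalty = 0.4                                               # congestion per offloaded user

def cost(bits):
    k = sum(bits)                                  # users sharing the edge server
    return sum(offload_cost[i] + server_penalty * k if b else local_cost[i]
               for i, b in enumerate(bits))

def crossover(a, b):
    cut = random.randrange(1, N_USERS)
    return a[:cut] + b[cut:]

def mutate(bits, rate=0.05):
    return [1 - b if random.random() < rate else b for b in bits]

pop = [[random.randint(0, 1) for _ in range(N_USERS)] for _ in range(POP)]
for _ in range(GENS):
    pop.sort(key=cost)
    elite = pop[:POP // 2]                         # truncation selection
    pop = elite + [mutate(crossover(*random.sample(elite, 2)))
                   for _ in range(POP - len(elite))]

best = min(pop, key=cost)
print("offloading decision:", best, "cost:", round(cost(best), 2))
```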


Electronics ◽  
2021 ◽  
Vol 10 (2) ◽  
pp. 190
Author(s):  
Wu Ouyang ◽  
Zhigang Chen ◽  
Jia Wu ◽  
Genghua Yu ◽  
Heng Zhang

As transportation becomes more convenient and efficient, users move faster and faster. When a user leaves the service range of the original edge server, that server needs to migrate the tasks offloaded by the user to other edge servers. An effective task migration strategy needs to fully consider the location of users, the load status of edge servers, and energy consumption, which makes designing such a strategy a challenge. In this paper, we propose a mobile edge computing (MEC) system architecture consisting of multiple smart mobile devices (SMDs), multiple unmanned aerial vehicles (UAVs), and a base station (BS). Moreover, we establish a Markov decision process with unknown rewards (MDPUR) model, based on the traditional Markov decision process (MDP), which comprehensively considers three aspects: the migration distance, the residual energy status of the UAVs, and the load status of the UAVs. Based on the MDPUR model, we propose an advantage-based value iteration (ABVI) algorithm to obtain an effective task migration strategy, which can help the UAV group achieve load balancing and reduce its total energy consumption while ensuring user service quality. Finally, simulation results show that the ABVI algorithm is effective. In particular, it performs better than the traditional value iteration algorithm and remains robust in a dynamic environment.
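As a point of reference, the sketch below is plain value iteration over a toy migration MDP with an illustrative reward that penalizes migration, low residual energy, and high load; the advantage-based refinement and the paper's MDPUR reward model are not reproduced here, and every number is made up.

```python
# Plain value iteration over a toy migration MDP, as a baseline for the
# flavour of ABVI; states, actions, and rewards are illustrative only.

STATES  = ["uav1", "uav2", "uav3"]        # candidate UAV servers hosting the task
ACTIONS = ["stay", "migrate_next"]        # keep the task or migrate it onward
GAMMA, EPS = 0.9, 1e-6

def reward(s, a):
    # Penalise migration, low residual energy, and high load (illustrative).
    energy_left = {"uav1": 0.3, "uav2": 0.8, "uav3": 0.6}[s]
    load        = {"uav1": 0.9, "uav2": 0.4, "uav3": 0.5}[s]
    migration_penalty = 1.0 if a == "migrate_next" else 0.0
    return 2.0 * energy_left - 1.5 * load - migration_penalty

def next_state(s, a):
    return s if a == "stay" else STATES[(STATES.index(s) + 1) % len(STATES)]

V = {s: 0.0 for s in STATES}
while True:
    V_new = {s: max(reward(s, a) + GAMMA * V[next_state(s, a)] for a in ACTIONS)
             for s in STATES}
    if max(abs(V_new[s] - V[s]) for s in STATES) < EPS:
        break
    V = V_new

policy = {s: max(ACTIONS, key=lambda a: reward(s, a) + GAMMA * V[next_state(s, a)])
          for s in STATES}
print("values:", {s: round(v, 2) for s, v in V.items()})
print("policy:", policy)
```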


Symmetry ◽  
2018 ◽  
Vol 10 (11) ◽  
pp. 594
Author(s):  
Tri Nguyen ◽  
Tien-Dung Nguyen ◽  
Van Nguyen ◽  
Xuan-Qui Pham ◽  
Eui-Nam Huh

By bringing computation and storage resources into close proximity to the mobile network edge, mobile edge computing (MEC) is a key enabling technology for satisfying the requirements of Internet of Vehicles (IoV) infotainment applications, e.g., video streaming service (VSA). However, the explosive growth of mobile video traffic brings challenges for video streaming providers (VSPs). One known issue is that a huge traffic burden on the vehicular network increases the VSP's cost of providing VSA to mobile users (i.e., autonomous vehicles). To address this issue, efficient sharing of underutilized vehicular resources is a promising way to reduce the cost of serving VSA in the vehicular network. Therefore, we propose a new VSA model based on the lower cost of obtaining data from vehicles, and then minimize the VSP's cost. By using existing data resources from nearby vehicles, our proposal can reduce the cost of providing video service to mobile users. Specifically, we formulate our problem as a mixed integer nonlinear program (MINP) in order to calculate the total payment of the VSP. In addition, we introduce an incentive mechanism to encourage users to rent out their resources. Our solution represents a strategy for optimizing the VSP's serving cost under quality of service (QoS) requirements. Simulation results demonstrate that the proposed mechanism achieves up to 21% and 11% cost savings with respect to request arrival rate and vehicle speed, respectively, in comparison with other existing schemes.
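A toy sketch of the cost comparison such a model rests on, with hypothetical chunk caches and incentive prices: serve each requested video chunk from the cheapest nearby vehicle that holds it, otherwise from the backhaul.

```python
# Toy per-chunk cost comparison (assumed prices and caches, not the
# paper's MINP formulation): nearby vehicles offer cached chunks at an
# incentive price; anything missing is fetched over the VSP backhaul.

BACKHAUL_COST = 1.0                      # VSP cost per chunk via the cellular backhaul

nearby_vehicles = [                      # (cached chunks, incentive price per chunk)
    ({"c1", "c2", "c5"}, 0.35),
    ({"c2", "c3"},       0.50),
    ({"c4"},             0.20),
]

def chunk_cost(chunk):
    prices = [price for cache, price in nearby_vehicles if chunk in cache]
    return min(prices) if prices else BACKHAUL_COST

request = ["c1", "c2", "c3", "c4", "c6"]
total = sum(chunk_cost(c) for c in request)
print(f"VSP payment: {total:.2f} (backhaul-only would cost {len(request) * BACKHAUL_COST:.2f})")
```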

