SAP: An IoT Application Module Placement Strategy Based on Simulated Annealing Algorithm in Edge-Cloud Computing

2021 ◽  
Vol 2021 ◽  
pp. 1-12
Author(s):  
Juan Fang ◽  
Kai Li ◽  
Juntao Hu ◽  
Xiaobin Xu ◽  
Ziyi Teng ◽  
...  

The Internet of Things (IoT) is growing rapidly and provides the foundation for the development of smart cities, smart homes, and health care. As more and more devices connect to the Internet, huge amounts of data are produced, creating a great challenge for data processing. Traditional cloud computing suffers from long processing delays. Edge computing is an extension of cloud computing: processing data at the edge of the network can reduce the long processing delay of cloud computing. Because the computing resources of edge servers are limited, resource management of edge servers has become a critical research problem. However, most existing research on task scheduling does not consider the structural characteristics of the subtask chain between each pair of sensors and actuators. To reduce the processing latency and energy consumption of the edge-cloud system, we propose a multilayer edge computing system in which the deployed application is modeled as a directed graph. To make full use of the edge servers, we propose a module placement strategy based on a Simulated Annealing module Placement (SAP) algorithm. The modules in an application are bound to each sensor. The SAP algorithm finds a module placement scheme for each sensor and generates a module chain, including the mapping of modules to servers, so that the edge servers can transmit tuples through the network along the module chain. To evaluate the efficacy of our algorithm, we simulate the strategy in iFogSim. Results show the scheme achieves significant reductions in latency and energy consumption.
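The core of such a strategy, perturbing a sensor's module-to-server mapping and accepting worse placements with a temperature-dependent probability, can be sketched as follows. The servers, delays, and cost model below are illustrative assumptions, not values from the paper:

```python
import math
import random

# Hypothetical costs for illustration only: per-module processing delay on
# each server and an uplink delay for reaching each server.
PROC = {  # PROC[server][module] = processing delay (ms)
    "edge1": [4, 6, 5],
    "edge2": [5, 4, 6],
    "cloud": [2, 2, 2],
}
LINK = {"edge1": 1, "edge2": 1, "cloud": 20}  # uplink delay per server (ms)

def chain_latency(placement):
    """Total latency of one sensor's module chain under a placement."""
    total = LINK[placement[0]]  # sensor uplink to the first module's server
    for i, server in enumerate(placement):
        total += PROC[server][i]
        # pay a transfer cost when consecutive modules sit on different servers
        if i > 0 and placement[i - 1] != server:
            total += LINK[server]
    return total

def sa_place(modules=3, t0=10.0, cooling=0.95, steps=500, seed=0):
    """Simulated-annealing search for a low-latency module placement."""
    rng = random.Random(seed)
    servers = list(PROC)
    current = [rng.choice(servers) for _ in range(modules)]
    best, t = list(current), t0
    for _ in range(steps):
        cand = list(current)
        cand[rng.randrange(modules)] = rng.choice(servers)  # perturb one module
        delta = chain_latency(cand) - chain_latency(current)
        # accept improvements always, worse moves with Boltzmann probability
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            current = cand
        if chain_latency(current) < chain_latency(best):
            best = list(current)
        t *= cooling
    return best, chain_latency(best)
```

With these assumed costs, the search favors keeping the chain on nearby edge servers rather than paying the cloud uplink on every transfer.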

Author(s):  
C. Anuradha, M. Ponnavaikko

Cloud computing provides a platform for services and resources over the Internet. Its large pool of data resources and services has enabled the emergence of several novel applications such as smart grids, smart environments, and virtual reality. However, state-of-the-art cloud computing faces a delay constraint that has become a major barrier to reliable cloud services. This constraint is most pronounced in smart cities (SC) and the Internet of Things (IoT): the current cloud computing paradigm cannot meet their low-delay, navigation, and mobility-support requirements. Machine-to-machine (M2M) connectivity has drawn considerable interest from both academia and industry, with a growing number of machine-type communication devices (MTCDs). Unlike conventional networks, the data links in M2M communications are usually small but high-bandwidth, demanding management of both energy consumption and computing performance. The main challenges in mobile edge computing are task offloading, congestion control, resource allocation, security and privacy, mobility, and standardization. Our work focuses mainly on offloading-based resource allocation and security issues, analyzing network parameters such as latency reduction and bandwidth improvement in the cloud environment. The CloudSim simulation tool is used to implement an offload-balancing mechanism that decreases energy consumption, optimizes computing resource allocation, and improves computing capability.
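The latency side of such an offloading decision can be illustrated with a simple model: offload a task only when transmission plus remote execution beats local execution. The task sizes, bandwidth, and processor speeds below are hypothetical, not parameters from the paper:

```python
# Minimal sketch (not the paper's implementation) of a latency-driven
# offloading decision, with assumed device, link, and cloud parameters.

def local_latency(cycles, local_speed):
    """Time to execute on the device (CPU cycles / cycles-per-second)."""
    return cycles / local_speed

def offload_latency(data_bits, bandwidth, cycles, cloud_speed):
    """Transmission time plus remote execution time."""
    return data_bits / bandwidth + cycles / cloud_speed

def should_offload(task, bandwidth=5e6, local_speed=1e9, cloud_speed=10e9):
    """Offload only when it reduces latency for this (cycles, data_bits) task."""
    cycles, data_bits = task
    remote = offload_latency(data_bits, bandwidth, cycles, cloud_speed)
    return remote < local_latency(cycles, local_speed)
```

Compute-heavy tasks with little data favor offloading; data-heavy tasks with light computation stay local, which is the trade-off an offload balancer exploits.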


2021 ◽  
Vol 9 (1) ◽  
pp. 912-931
Author(s):  
Pavan Madduru

To meet the growing demand for mobile data traffic and the stringent requirements of Internet of Things (IoT) applications in emerging scenarios such as smart cities, healthcare, and augmented/virtual reality (AR/VR), fifth-generation (5G) technologies have been proposed and deployed. As a key emerging 5G technology and a major enabler of the Internet of Things, Multi-access Edge Computing (MEC), which integrates telecommunications and IT services, provides cloud computing capabilities at the edge of the radio access network (RAN). By placing compute and storage resources close to end users, MEC can reduce end-user latency. In this article, we therefore take a closer look at MEC in 5G and the IoT. We analyze the main functions of MEC in 5G and IoT environments and present several core technologies that enable its use, such as cloud computing, SDN/NFV, information-centric networking, virtual machines (VMs) and containers, smart devices, shared networks, and computation offloading. The article also provides an overview of MEC's role in 5G and IoT, a detailed introduction to MEC-enabled 5G and IoT applications, and future perspectives on integrating MEC with 5G and IoT. Additionally, it examines the open research challenges and unresolved issues of MEC in 5G and the Internet of Things. Finally, we propose a use case in which MEC is used to obtain advanced intelligence in IoT scenarios.


2021 ◽  
pp. 08-25
Author(s):  
Mustafa El .. ◽  
Aaras Y Y.Kraidi

The crowd-creation space is a manifestation of the development of innovation theory to a certain stage. With the creation of crowd-creation spaces, optimizing their resource allocation has become a research hotspot, and the emergence of cloud computing provides a new way to approach the problem. Common cloud computing resource allocation algorithms include genetic algorithms, simulated annealing algorithms, and ant colony algorithms, but each has clear shortcomings that make it ill-suited to optimal resource allocation for crowd-creation space computing. Based on this, this paper proposes an algorithm for optimizing resource allocation for crowd-creation space computing in the cloud environment that combines a genetic algorithm with an ant colony algorithm and refines it by borrowing mechanisms from the simulated annealing algorithm. The result is an improved genetic ant colony algorithm (HGAACO). The feasibility of the algorithm is verified through experiments. The experimental results show that, with 20 tasks, the ant colony algorithm's task allocation time is 93 ms, the genetic ant colony algorithm's is 90 ms, and the improved algorithm's is 74 ms, a clear improvement. The proposed algorithm offers a useful reference for solving optimal resource allocation in crowd-creation space computing.
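The simulated-annealing mechanism borrowed by such a hybrid can be illustrated with a simplified, mutation-only evolutionary loop over task-to-node assignments (the actual crossover and pheromone machinery of HGAACO is omitted, and the execution-time matrix is invented for illustration):

```python
import math
import random

# Illustrative sketch of the hybrid idea, not the paper's HGAACO code:
# mutated children replace their parents under an SA acceptance test.
EXEC = [[3, 5, 4], [2, 6, 3], [4, 2, 5], [5, 3, 2]]  # EXEC[task][node], assumed

def makespan(assign):
    """Completion time of the most loaded node under an assignment."""
    loads = [0.0] * len(EXEC[0])
    for task, node in enumerate(assign):
        loads[node] += EXEC[task][node]
    return max(loads)

def hybrid_search(pop_size=8, generations=60, t0=5.0, cooling=0.9, seed=1):
    rng = random.Random(seed)
    n_tasks, n_nodes = len(EXEC), len(EXEC[0])
    pop = [[rng.randrange(n_nodes) for _ in range(n_tasks)]
           for _ in range(pop_size)]
    t = t0
    for _ in range(generations):
        for i, parent in enumerate(pop):
            child = list(parent)
            child[rng.randrange(n_tasks)] = rng.randrange(n_nodes)  # mutation
            delta = makespan(child) - makespan(parent)
            # SA-style acceptance: keep worse children with decaying probability
            if delta <= 0 or rng.random() < math.exp(-delta / t):
                pop[i] = child
        t *= cooling
    return min(pop, key=makespan)
```

The decaying acceptance probability is what lets the population escape local optima early on, the property the paper cites SA for.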


Sensors ◽  
2020 ◽  
Vol 20 (14) ◽  
pp. 3897 ◽  
Author(s):  
Nina Cvar ◽  
Jure Trilar ◽  
Andrej Kos ◽  
Mojca Volk ◽  
Emilija Stojmenova Duh

Initially, the concept of Smart Cities (urban settlements) originated from Internet of Things (IoT) technology; however, IoT technology can also be extended to the concept of Smart Villages (rural settlements), improving the lives of villagers and their communities as a whole. Yet rural settlements have slightly different requirements than urban ones. If the application of IoT in Smart Cities can be characterized by the densification of IoT in day-to-day life, following cities' structural characteristic of being densely settled places, IoT-empowered Smart Villages are usually a system of dispersion and deficiency. This research paper therefore addresses and discusses different application areas of IoT technology, identifying differences but also similarities between the two ecosystems, while illuminating the standardization efforts applicable in both contexts. We propose the following IoT application domains, which also serve as a base for research on smart villages: 1. Natural Resources and Energy, 2. Transport and Mobility, 3. Smart Building, 4. Daily Life, 5. Government, and 6. Economy and Society. By providing an overview of the technical solutions that support smart solutions in Smart Cities and Smart Villages, this paper evaluates how IoT-empowered Smart Villages and Smart Cities can achieve an overall improvement in their inhabitants' quality of life.


2020 ◽  
Vol 2020 ◽  
pp. 1-17 ◽  
Author(s):  
Ibrahim Attiya ◽  
Mohamed Abd Elaziz ◽  
Shengwu Xiong

In recent years, cloud computing technology has attracted extensive attention from both academia and industry. The popularity of cloud computing originated from its ability to deliver global IT services, such as core infrastructure, platforms, and applications, to cloud customers over the web, and it promises on-demand services with new forms of pricing packages. However, cloud job scheduling is NP-complete and has become more complicated due to factors such as resource dynamicity and on-demand consumer application requirements. To fill this gap, this paper presents a modified Harris hawks optimization (HHO) algorithm based on simulated annealing (SA) for scheduling jobs in the cloud environment. In the proposed HHOSA approach, SA is employed as a local search algorithm to improve the convergence rate and solution quality of the standard HHO algorithm. The performance of HHOSA is compared with that of state-of-the-art job scheduling algorithms, all implemented on the CloudSim toolkit. Both standard and synthetic workloads are employed to analyze performance. The results demonstrate that HHOSA achieves significant reductions in makespan compared to standard HHO and other existing scheduling algorithms. Moreover, it converges faster as the search space grows, making it appropriate for large-scale scheduling problems.
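The SA local-search phase of such a hybrid can be sketched as refining a given job-to-VM schedule to reduce makespan. The HHO global phase is omitted here, and the job lengths and VM speeds are assumptions, not the paper's workloads:

```python
import math
import random

# Sketch of the local-search idea only: simulated annealing perturbs a
# candidate schedule (e.g., one produced by a global optimizer) one job
# at a time. Job lengths (MI) and VM speeds (MIPS) are illustrative.
JOBS = [8, 3, 7, 5, 4, 6]
SPEEDS = [2.0, 1.0, 1.5]

def makespan(schedule):
    """Finish time of the busiest VM under a job-to-VM schedule."""
    finish = [0.0] * len(SPEEDS)
    for job, vm in zip(JOBS, schedule):
        finish[vm] += job / SPEEDS[vm]
    return max(finish)

def sa_refine(schedule, t0=4.0, cooling=0.92, steps=300, seed=7):
    rng = random.Random(seed)
    current, best = list(schedule), list(schedule)
    t = t0
    for _ in range(steps):
        cand = list(current)
        cand[rng.randrange(len(JOBS))] = rng.randrange(len(SPEEDS))  # move one job
        delta = makespan(cand) - makespan(current)
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            current = cand
            if makespan(current) < makespan(best):
                best = list(current)
        t *= cooling
    return best
```

Starting from a deliberately poor schedule (all jobs on one VM), the refinement step spreads jobs across VMs and lowers the makespan.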


2021 ◽  
Vol 5 (2) ◽  
pp. 105
Author(s):  
Wasswa Shafik ◽  
S. Mojtaba Matinkhah ◽  
Mamman Nur Sanda ◽  
Fawad Shokoor

In recent years, the Internet of Things (IoT), which allows devices to connect to the Internet, has become a promising research area, mainly due to constantly emerging technologies and their associated challenges. Fog computing helps address these challenges because it manages IoT connectivity close to the devices. Fog-Enabled Smart Cities (IoT-ESC) show equitable energy consumption, with a 7% reduction from an 18.2% renewable energy contribution, and extend resource computation as a great advantage. Introducing fog nodes into Fog-Enabled Smart Cities (FESC) reduces the workload on Terminal Node (TN) services, the sensors and actuators of the IoT setup. This paper proposes an integrated energy-efficiency computation model for minimizing response time and service delays in FESC. The FESC offers a promising computing model for location-, time-, and delay-sensitive applications, supporting vertically isolated, delay-sensitive services by providing abundant, scalable, and distributed computing, storage, and network connectivity. We first review the persisting challenges in state-of-the-art models and, based on them, introduce a new model that addresses energy efficiency in terms of response time and service delays in IoT-ESC. iFogSim simulation results demonstrate that the proposed model minimizes service delay and reduces energy consumption during computation. We employ IoT-ESC to decide autonomously or semi-autonomously whether computation is performed on fog nodes or transferred to the cloud.
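The fog-versus-cloud choice can be illustrated as a weighted comparison of delay and energy costs per task. The per-tier numbers and weights below are assumptions for illustration, not measurements from the paper:

```python
# Hedged sketch of a fog-or-cloud decision: pick the tier with the lower
# weighted cost of service delay and energy. All values are assumed.
FOG = {"delay_ms": 15.0, "energy_mj": 8.0}     # near, fast, less energy-efficient
CLOUD = {"delay_ms": 90.0, "energy_mj": 3.0}   # far, slow, more energy-efficient

def cost(tier, w_delay=0.7, w_energy=0.3):
    """Weighted sum; by default delay dominates, as for delay-sensitive IoT tasks."""
    return w_delay * tier["delay_ms"] + w_energy * tier["energy_mj"]

def choose_tier(w_delay=0.7, w_energy=0.3):
    """Return the cheaper tier for the given weighting."""
    if cost(FOG, w_delay, w_energy) <= cost(CLOUD, w_delay, w_energy):
        return "fog"
    return "cloud"
```

Shifting the weights toward energy flips the decision toward the cloud, which mirrors the semi-autonomous trade-off the abstract describes.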


2022 ◽  
Vol 2022 ◽  
pp. 1-14
Author(s):  
Zhenzhong Zhang ◽  
Wei Sun ◽  
Yanliang Yu

With the vigorous development of the Internet of Things, the Internet, cloud computing, and mobile terminals, edge computing has emerged as a new IoT technology and an important component of the Industrial Internet of Things. Facing large-scale data processing and computation, traditional cloud computing is under tremendous pressure, and the demand for new low-latency computing technologies is pressing. As a supplementary extension of cloud computing, mobile edge computing sinks computing power from the cloud to network edge nodes. Through cooperation among computing nodes, more nodes can participate in computation, the types of computation are more comprehensive, and the computing range is greater, which broadly makes up for the shortcomings of cloud computing. Although edge computing has many advantages and established research and application results, allocating large numbers of computing tasks and resources to computing nodes, and scheduling tasks on edge nodes, remain challenges. Addressing these resource allocation and task scheduling problems, this paper designs a delay-aware dynamic task scheduling strategy for edge computing that achieves the reasonable utilization of computing resources required by edge computing systems. It also proposes a resource allocation scheme based on the simulated annealing algorithm that minimizes the overall performance loss of the system while keeping system delay low. Finally, experiments verify that the proposed task scheduling and resource allocation methods can significantly reduce application response delay.
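A minimal delay-aware dispatching sketch (not the paper's strategy) illustrates the idea: order tasks by deadline and dispatch each to the least-loaded edge node, counting deadline misses. Task deadlines and execution times are invented:

```python
# Delay-aware dispatch sketch with assumed tasks: each task is a
# (deadline_ms, exec_ms) pair; nodes track their next-free time.

def schedule(tasks, n_nodes=2):
    """Earliest-deadline-first dispatch onto the least-loaded node."""
    ready = sorted(tasks)            # earliest deadline first
    nodes = [0.0] * n_nodes          # each node's next-free time
    missed = 0
    order = []                       # (deadline, assigned node) in dispatch order
    for deadline, exec_ms in ready:
        i = min(range(n_nodes), key=lambda k: nodes[k])
        nodes[i] += exec_ms
        order.append((deadline, i))
        if nodes[i] > deadline:      # finished after its deadline
            missed += 1
    return order, missed
```

Sorting by deadline before dispatching is what makes the policy delay-aware: urgent tasks claim node capacity first.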


Algorithms ◽  
2019 ◽  
Vol 12 (2) ◽  
pp. 48 ◽  
Author(s):  
Ming Zhao ◽  
Ke Zhou

Mobile Edge Computing (MEC) is an innovative technique that provides cloud computing capabilities near mobile devices at the edge of the network. Based on the MEC architecture, this paper proposes an ARIMA-BP-based Selective Offloading (ABSO) strategy that minimizes the energy consumption of mobile devices while meeting delay requirements. In ABSO, we exploit an ARIMA-BP model to estimate the computation capacity of the edge cloud and then design a Selective Offloading Algorithm to obtain the offloading strategy. Simulation results reveal that ABSO appreciably decreases the energy consumption of mobile devices in comparison with other offloading methods.
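A much-simplified stand-in for the ARIMA-BP estimator illustrates forecast-driven selective offloading: fit a first-order autoregressive model to the edge cloud's capacity history and offload only when the forecast covers the task's demand. All numbers are illustrative, and the BP neural-network component is omitted:

```python
# Simplified capacity forecaster: least-squares AR(1) fit, x[t+1] ~ a*x[t] + b.
# This stands in for the paper's ARIMA-BP model for illustration only.

def ar1_forecast(series):
    """One-step-ahead forecast from an AR(1) fit of the history."""
    xs, ys = series[:-1], series[1:]
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    a = cov / var if var else 0.0
    b = mean_y - a * mean_x
    return a * series[-1] + b

def should_offload(capacity_history, task_demand):
    """Offload selectively: only when forecast capacity covers the demand."""
    return ar1_forecast(capacity_history) >= task_demand
```

On a steadily rising capacity series such as [10, 12, 14, 16, 18], the AR(1) fit extrapolates the trend and forecasts 20 for the next step.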


2012 ◽  
Vol 462 ◽  
pp. 392-397 ◽  
Author(s):  
Bin Fei Li ◽  
Yan Min Song ◽  
Jun Pei ◽  
Jing Min Yang

Based on an energy consumption simulation model of a chiller system, and by analyzing the composition of chiller energy consumption, this paper proposes computing optimal control parameters with the simulated annealing algorithm in order to reduce energy consumption. The process of optimizing the best settings of each controllable variable using simulated annealing in a running chiller system is introduced, and the effects of running simulated annealing are simulated.
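A toy continuous analogue of this procedure can be sketched as follows; the real chiller energy model is not given here, so the two setpoints (chilled-water temperature, cooling-water flow) and the quadratic energy surrogate are assumptions for illustration:

```python
import math
import random

# Hypothetical energy surrogate: consumption rises quadratically away from
# an assumed optimum operating point. Not the paper's chiller model.
def energy(chw_temp, cw_flow):
    return 100 + 2.0 * (chw_temp - 7.0) ** 2 + 1.5 * (cw_flow - 30.0) ** 2

def sa_tune(start=(12.0, 45.0), t0=50.0, cooling=0.95, steps=400, seed=3):
    """Simulated annealing over two continuous controllable variables."""
    rng = random.Random(seed)
    current = start
    best = current
    t = t0
    for _ in range(steps):
        # perturb each setpoint by a small random amount
        cand = (current[0] + rng.uniform(-0.5, 0.5),
                current[1] + rng.uniform(-2.0, 2.0))
        delta = energy(*cand) - energy(*current)
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            current = cand
            if energy(*current) < energy(*best):
                best = current
        t *= cooling
    return best
```

Continuous perturbations replace the discrete moves of the scheduling examples above; otherwise the accept/cool loop is the same.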

