Cognitive Load Balancing Approach for 6G MEC Serving IoT Mashups

Mathematics ◽  
2021 ◽  
Vol 10 (1) ◽  
pp. 101
Author(s):  
Barbara Attanasio ◽  
Andriy Mazayev ◽  
Shani du Plessis ◽  
Noélia Correia

The sixth generation (6G) of communication networks represents more of a revolution than an evolution of the previous generations, providing new directions and innovative approaches to face the network challenges of the future. A crucial aspect is to make the best use of available resources to support an entirely new generation of services. From this viewpoint, the Web of Things (WoT), which enables Things to become Web Things that can be chained, used and re-used in IoT mashups, allows interoperability among IoT platforms. At the same time, Multi-access Edge Computing (MEC) brings computing and data storage to the edge of the network, creating so-called distributed and collective edge intelligence. Such intelligence is created in order to deal with the huge amount of data to be collected, analyzed and processed from real-world contexts, such as smart cities, which are evolving into dynamic and networked systems of people and things. To better exploit this architecture, it is crucial to break monolithic applications into modular microservices, which can be executed independently. Here, we propose an approach based on complex network theory and two weighted and interdependent multiplex networks to address the Microservices-compliant Load Balancing (McLB) problem in MEC infrastructures. Our findings show that the multiplex network representation provides an extra dimension of analysis, allowing us to capture the complexity of WoT mashup organization and its impact on the organizational aspect of MEC servers. The impact of this extracted knowledge on the cognitive organization of MEC is quantified through heuristics engineered to guarantee load balancing and, consequently, QoS.
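As an illustration of the kind of model the abstract describes, the following is a minimal two-layer multiplex sketch: one weighted layer for Web Thing interactions inside a mashup, one coupling map placing Things on MEC servers. All Thing names, server names, and weights are invented for illustration and are not the paper's actual formulation.

```python
# Layer 1: Web Thing interactions inside a WoT mashup (weight = call intensity).
mashup_edges = {
    ("thing_a", "thing_b"): 0.8,
    ("thing_b", "thing_c"): 0.3,
}

# Inter-layer coupling: which MEC server hosts which Web Thing (microservice).
placement = {"thing_a": "server_1", "thing_b": "server_1", "thing_c": "server_2"}

def weighted_degree(node):
    """Sum of edge weights incident to a Thing in the mashup layer."""
    return sum(w for (u, v), w in mashup_edges.items() if node in (u, v))

def server_load(server):
    """Aggregate mashup activity placed on one MEC server."""
    return sum(weighted_degree(t) for t, s in placement.items() if s == server)

print(server_load("server_1"), server_load("server_2"))
```

A load-balancing heuristic in this spirit would read the per-server aggregates and move the highest-degree Things off the most loaded server.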

Sensors ◽  
2021 ◽  
Vol 21 (4) ◽  
pp. 1387
Author(s):  
Oswaldo Sebastian Peñaherrera-Pulla ◽  
Carlos Baena ◽  
Sergio Fortes ◽  
Eduardo Baena ◽  
Raquel Barco

Cloud Gaming is a cutting-edge paradigm in video game provision in which the graphics rendering and logic are computed in the cloud. This allows a user's thin-client system, with much more limited capabilities, to offer an experience comparable to traditional local and online gaming, but with reduced hardware requirements. In exchange, this approach stresses the communication network between the client and the cloud. In this context, it is necessary to know how to configure the network in order to provide the service with the best quality. To that end, the present work defines a novel framework for Cloud Gaming performance evaluation. This system is implemented on a real testbed and evaluates the Cloud Gaming approach for different transport networks (Ethernet, WiFi, and LTE (Long Term Evolution)) and scenarios, automating the acquisition of the gaming metrics. From this, the impact on the overall gaming experience is analyzed, identifying the main parameters involved in its performance. Hence, future lines for QoE-based (Quality of Experience) Cloud Gaming optimization are established, a configuration concern highly relevant in new-generation networks such as 4G and 5G (the Fourth and Fifth Generations of Mobile Networks).
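To make the metric-acquisition step concrete, here is a hedged sketch of the kind of per-session aggregation such a framework automates. The metric set (mean and 95th-percentile frame latency, fraction of frames meeting a 60 FPS deadline) is an assumption for illustration, not the paper's exact definitions.

```python
import statistics

def summarize_session(frame_latencies_ms, deadline_ms=1000 / 60):
    """Aggregate per-frame latencies into simple QoE-oriented statistics."""
    lat = sorted(frame_latencies_ms)
    p95 = lat[int(0.95 * (len(lat) - 1))]          # nearest-rank percentile
    on_time = sum(1 for x in lat if x <= deadline_ms) / len(lat)
    return {"mean_ms": statistics.mean(lat),
            "p95_ms": p95,
            "on_time_ratio": on_time}

# Invented sample: one late frame (35 ms) misses the ~16.7 ms 60 FPS deadline.
print(summarize_session([12.0, 14.5, 16.0, 35.0, 13.2]))
```

Comparing such summaries across Ethernet, WiFi, and LTE sessions is one simple way to expose which transport parameters dominate the gaming experience.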


Author(s):  
Faiz-ul Hassan ◽  
Wim Vanderbauwhede ◽  
Fernando Rodríguez-Salazar

On-chip communication is becoming an important bottleneck in the design and operation of high-performance systems, where it faces additional challenges due to device variability. Communication structures such as tapered buffer drivers, interconnects, repeaters, and data storage elements are vulnerable to variability, which can limit the performance of on-chip communication networks. In this regard, it becomes important to have a complete understanding of the impact that variability will have on the performance of these circuit elements in order to design high-yield and reliable systems. In this paper, the authors characterize the performance of the communication structures under the impact of random dopant fluctuation (RDF) for the future technology generations of 25, 18, and 13 nm. For accurate characterization of their performance, a Monte Carlo simulation method has been used along with predictive device models for the given technologies. Analytical models have been developed for the link failure probability of a repeater-inserted interconnect, using the characterization data of all communication structures to give an accurate prediction of the link failure probability. The model has also been extended to calculate the link failure probability of a wider communication link.
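The Monte Carlo idea, and the extension from a single wire to a wider link, can be sketched as follows. Here RDF-induced variability is modelled as a Gaussian spread on the wire delay, which is an assumption for illustration only and not the authors' device-level model; the nominal delay, spread, and timing deadline are invented numbers.

```python
import random

def link_failure_probability(n_trials=100_000, nominal_delay=1.0,
                             sigma=0.08, deadline=1.2, seed=42):
    """Estimate P(delay > deadline) by Monte Carlo sampling of wire delay."""
    rng = random.Random(seed)
    failures = sum(1 for _ in range(n_trials)
                   if rng.gauss(nominal_delay, sigma) > deadline)
    return failures / n_trials

def wide_link_failure(p_single, width):
    """A w-bit link fails if any of its w (assumed independent) wires fails."""
    return 1 - (1 - p_single) ** width

p = link_failure_probability()
print(p, wide_link_failure(p, 32))
```

The last function mirrors the abstract's extension to a wider communication link: under an independence assumption, per-wire failure probabilities compound quickly with bus width.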




2021 ◽  
Vol 13 (16) ◽  
pp. 8910
Author(s):  
Himanshi Babbar ◽  
Shalli Rani ◽  
Aman Singh ◽  
Mohammed Abd-Elnaby ◽  
Bong Jun Choi

The network session constraints for Industrial Internet of Things (IIoT) applications are different and challenging. These constraints necessitate a high level of reconfigurability, so that the system can assess the impact of an event and adjust the network effectively. Software Defined Networking (SDN), in contrast to existing networks, segregates the control and data planes to support programmable network configuration in line with smart-city requirements, which has the highest impact on the system but faces the problem of reliability. To address this issue, an SDN-IIoT-based load balancing algorithm is proposed in this article, and it is not application specific. A Quality of Service (QoS)-aware architecture, i.e., the SDN-IIoT load balancing scheme, is proposed to deal with the load on the servers. A huge load on the servers makes them vulnerable to halting the system, which leads to faults and creates a reliability problem for real-time applications. In this article, load is migrated from one server to another if the load on a server exceeds a threshold value. This load distribution makes the proposed scheme more reliable than existing schemes. Further, the topology used for the implementation has been designed using the POX controller, and the results have been evaluated using the Mininet emulator with its Python programming support. Lastly, the performance is evaluated on various QoS metrics (data transmission, response time and CPU utilization), showing that the proposed algorithm achieves a 10% improvement over the existing LBBSRT, Random, Round-robin and Heuristic algorithms.
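The threshold-triggered migration rule described above can be sketched in a few lines. The greedy policy (shed the excess above the threshold onto the currently least-loaded server), the server names, and the numbers are all invented for illustration; the paper's exact migration policy may differ.

```python
def rebalance(loads, threshold):
    """Move load above `threshold` from overloaded servers to the least-loaded one.

    `loads` maps server name -> current load; returns a new mapping.
    Sketch only: assumes at least two servers and divisible load.
    """
    loads = dict(loads)
    for server, load in sorted(loads.items()):
        if load > threshold:
            target = min(loads, key=loads.get)   # least-loaded server now
            excess = load - threshold
            loads[server] -= excess
            loads[target] += excess
    return loads

print(rebalance({"s1": 90, "s2": 20, "s3": 40}, threshold=70))
```

In an SDN setting this decision would run at the controller (here, POX), which has the global view of per-server load needed to pick the migration target.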


2019 ◽  
pp. 124-136
Author(s):  
Victor D. Gazman

The article considers prerequisites for the formation of a new paradigm in the energy sector. The factors that may affect the imminent change of leadership among energy generation technologies are analyzed, and the variability of projects for the creation and operation of power stations is examined. The focus is on problematic aspects of the new generation, especially the storage and supply of energy, and on achieving a system of parities that ensures balance in pricing across generations. The author substantiates the principles of forming the system of parities that arises when comparing traditional and new generations. The article presents the results of an empirical analysis of 215 projects for the construction of renewable energy facilities. The significance and direction of the impact of these factors on the growth in investment volumes of transactions are determined. The author considers leasing an effective financial instrument for overcoming the stereotypes of renewable energy and a promising direction for the accelerated implementation of investment projects.


Author(s):  
Subhranshu Sekhar Tripathy ◽  
Diptendu Sinha Roy ◽  
Rabindra K. Barik

Nowadays, cities are intended to change into smart cities. According to recent studies, the use of data from contributors and physical objects in many cities plays a key role in the transformation towards a smart city. The 'smart city' standard is characterized by omnipresent computing resources for the observation and critical control of the city's framework, healthcare management, environment, transportation, and utilities. Mist computing is a computing prototype that performs IoT applications at the edge of the network. To maintain the Quality of Service (QoS), it is imperative to employ context-aware computing as well as fog computing simultaneously. In this article, the authors implement an optimization strategy applying a dynamic resource allocation method based on a genetic algorithm and reinforcement learning in combination with a load balancing procedure. The proposed model comprises four layers, i.e., an IoT layer, a Mist layer, a Fog layer, and a Cloud layer. The authors propose a load balancing technique called M2FBalancer, which regulates the traffic in the network incessantly, accumulates information about each server's load, transfers incoming queries, and disseminates them among the accessible servers equally using the dynamic resource allocation method. To validate the efficacy of the proposed algorithm, makespan, resource utilization, and the degree of imbalance (DOI) are considered as the scheduling parameters. The proposed method is compared with Least Count, Round Robin, and Weighted Round Robin. In the end, the results demonstrate that the solution enhances QoS in the mist-assisted cloud environment by maximizing resource utilization and minimizing the makespan. Therefore, M2FBalancer is an effective method to utilize resources efficiently while ensuring uninterrupted service. Consequently, it improves performance even at peak times.
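The two scheduling metrics named above can be made concrete with a short sketch. The DOI formula used here, (max - min) / average of per-server completion times, is a common definition in the load-balancing literature and an assumption about the paper's exact choice; the sample times are invented.

```python
def makespan(completion_times):
    """Makespan: the finishing time of the last server to complete its work."""
    return max(completion_times)

def degree_of_imbalance(completion_times):
    """DOI: spread of per-server completion times relative to their average."""
    avg = sum(completion_times) / len(completion_times)
    return (max(completion_times) - min(completion_times)) / avg

times = [12.0, 9.0, 15.0]           # per-server completion times (invented)
print(makespan(times), degree_of_imbalance(times))
```

A balancer like the one described succeeds when both numbers drop: migrating queries off the slowest server shortens the makespan and pulls the completion times together, driving the DOI towards zero.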


2021 ◽  
Author(s):  
Antonio Ornatelli ◽  
Andrea Tortorelli ◽  
Alessandro Giuseppi ◽  
Francesco Delli Priscoli

Smart Cities ◽  
2021 ◽  
Vol 4 (2) ◽  
pp. 919-937
Author(s):  
Nikos Papadakis ◽  
Nikos Koukoulas ◽  
Ioannis Christakis ◽  
Ilias Stavrakas ◽  
Dionisis Kandris

The risk of theft of goods is certainly an important source of negative influence on human psychology. This article focuses on the development of a scheme that, despite its low cost, acts as a smart antitheft system enabling the detection of small property. Specifically, an Internet of Things (IoT)-based participatory platform was developed in order to allow asset-tracking tasks to be crowd-sourced to a community. Stolen objects are traced by using a prototype Bluetooth Low Energy (BLE)-based system, which sends signals, thus becoming a beacon. Once such an item (e.g., a bicycle) is stolen, the owner informs the authorities, which, in turn, broadcast an alert signal to activate the BLE sensor. To trace the asset with the antitheft tag, participants use their GPS-enabled smartphones to scan BLE tags through a specific smartphone client application and report the location of the asset to an operation center so that owners can locate their assets. A stolen-item tracking simulator was created to support and optimize the aforementioned tracking process and to produce the best possible outcome, evaluating the impact of different parameters and strategies regarding the selection of how many and which users to activate when searching for a stolen item within a given area.
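The core trade-off such a simulator explores, how many participants to activate versus the chance of passing within BLE range of the beacon, can be sketched with a toy Monte Carlo model. The square search area, the BLE range, and the uniform placement of users are all invented simplifications, not the paper's simulator.

```python
import random

def detection_probability(n_users, area_side=1000.0, ble_range=30.0,
                          n_trials=20_000, seed=1):
    """Estimate the chance that at least one activated user ends up within
    BLE range of a beacon, with beacon and users placed uniformly at random
    in an area_side x area_side square (all numbers are illustrative)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_trials):
        bx, by = rng.uniform(0, area_side), rng.uniform(0, area_side)
        for _ in range(n_users):
            ux, uy = rng.uniform(0, area_side), rng.uniform(0, area_side)
            if (ux - bx) ** 2 + (uy - by) ** 2 <= ble_range ** 2:
                hits += 1
                break
    return hits / n_trials

print(detection_probability(5), detection_probability(50))
```

Even this toy version shows the diminishing returns the abstract alludes to: each additional activated user raises the detection chance, but at a growing cost in notifications sent for each extra point of coverage.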


Author(s):  
E. Thilliez ◽  
S. T. Maddison

Numerical simulations are a crucial tool for understanding the relationship between debris discs and planetary companions. As debris disc observations now reach unprecedented levels of precision over a wide range of wavelengths, an appropriate level of accuracy and consistency is required in numerical simulations to confidently interpret this new generation of observations. However, simulations throughout the literature have been conducted with various initial conditions, often with little or no justification. In this paper, we aim to study the dependence on initial conditions of N-body simulations modelling the interaction of a massive, eccentric planet with an exterior debris disc. To achieve this, we first classify three broad approaches used in the literature and provide some physical context for when each category should be used. We then run a series of N-body simulations, including radiation forces acting on small grains, with varying initial conditions across the three categories. We test the influence of the initial parent body belt width, eccentricity, and alignment with the planet on the resulting debris disc structure, and compare the final peak emission location, disc width and offset of synthetic disc images produced with a radiative transfer code. We also track the evolution of the forced eccentricity of the dust grains induced by the planet, as well as resonant dust trapping. We find that an initially broad parent body belt always results in a broader debris disc than an initially narrow parent body belt. While simulations with a parent body belt of low initial eccentricity (e ~ 0) and of high initial eccentricity (0 < e < 0.3) resulted in similarly broad discs, we find that purely secular forced initial conditions, where the initial disc eccentricity is set to the forced value and the disc is aligned with the planet, always result in a narrower disc. We conclude that broad debris discs can be modelled using either a dynamically cold or a dynamically warm parent belt, while eccentric narrow debris rings are instead reproduced using a secularly forced parent body belt.
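For context, the forced eccentricity tracked above is, in classical lowest-order Laplace-Lagrange secular theory (an assumption about the expansion used; the paper's N-body treatment is more general), set by the planet's eccentricity and the semi-major axis ratio:

```latex
% Forced eccentricity of an exterior test particle at semi-major axis a,
% perturbed by an interior planet with semi-major axis a_p and eccentricity
% e_p, where \alpha = a_p / a < 1 and b_{3/2}^{(j)} are Laplace coefficients.
e_{\mathrm{forced}}(a) =
  \frac{b_{3/2}^{(2)}(\alpha)}{b_{3/2}^{(1)}(\alpha)}\, e_p
  \approx \frac{5}{4}\,\alpha\, e_p
```

This is the value the "secularly forced" initial conditions adopt directly, which is why those runs start, and stay, narrow: the grains begin on the orbits the planet would eventually force them onto.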


2020 ◽  
Vol 12 (14) ◽  
pp. 5595 ◽  
Author(s):  
Ana Lavalle ◽  
Miguel A. Teruel ◽  
Alejandro Maté ◽  
Juan Trujillo

Fostering sustainability is paramount for Smart City development. Lately, Smart Cities are benefiting from the rise of Big Data coming from IoT devices, leading to improvements in monitoring and prevention. However, monitoring and prevention processes require visualization techniques as a key component. Indeed, in order to prevent possible hazards (such as fires, leaks, etc.) and optimize their resources, Smart Cities require adequate visualizations that provide insights to decision makers. Nevertheless, visualization of Big Data has always been a challenging issue, especially when such data are originated in real time. This problem becomes even bigger in Smart City environments, since we have to deal with many different groups of users and multiple heterogeneous data sources. Without a proper visualization methodology, complex dashboards that combine data of a different nature are difficult to understand. In order to tackle this issue, we propose a methodology based on visualization techniques for Big Data, aimed at improving the evidence-gathering process by assisting users in decision making in the context of Smart Cities. Moreover, in order to assess the impact of our proposal, a case study based on service calls to a fire department is presented. In this sense, our findings are applied to data coming from citizen calls. Thus, the results of this work contribute to the optimization of resources, namely fire-extinguishing battalions, helping to improve their effectiveness and, as a result, the sustainability of a Smart City, which can then operate better with fewer resources. Finally, in order to evaluate the impact of our proposal, we have performed an experiment with users who are not experts in data visualization.

