Design and Development of Framework for Platform Level Issues in Fog Computing

Author(s):
Sejal Atit Bhavsar, Kirit J Modi

Fog computing is a paradigm that extends cloud computing services to the edge of the network, providing data, storage, compute, and application services to end users. The distinguishing characteristic of fog computing is its proximity to end users: application services are hosted on network edge devices such as routers and switches. The goal of fog computing is to improve efficiency and to reduce the amount of data that must be transported to the cloud for analysis, processing, and storage. Due to the heterogeneous characteristics of fog computing, several issues arise, such as security, fault tolerance, and resource scheduling and allocation. To better understand fault tolerance, we highlight its basic concepts by examining the different fault tolerance techniques, i.e., reactive, proactive, and hybrid. In addition to fault tolerance, we also discuss how to balance resource utilization and security in fog computing. Furthermore, to overcome platform-level issues in fog computing, we present a hybrid fault tolerance model that combines resource management and security.
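The reactive/proactive distinction lends itself to a compact sketch. The Python below is illustrative only: the FogNode abstraction, the risk threshold, and the least-loaded migration policy are assumptions, not the authors' model. The proactive step migrates tasks off nodes predicted to fail, the reactive step restores a failed node's tasks from their checkpoints, and a hybrid scheme runs both.

```python
from dataclasses import dataclass, field

@dataclass
class FogNode:
    name: str
    failure_risk: float                      # 0.0-1.0, e.g. derived from heartbeats
    tasks: list = field(default_factory=list)
    checkpoints: dict = field(default_factory=dict)

RISK_THRESHOLD = 0.7                         # assumed tunable parameter

def proactive_step(nodes):
    """Proactive: migrate tasks away from nodes predicted to fail."""
    healthy = [n for n in nodes if n.failure_risk < RISK_THRESHOLD]
    for node in nodes:
        if node.failure_risk >= RISK_THRESHOLD and healthy:
            target = min(healthy, key=lambda n: len(n.tasks))  # least loaded
            target.tasks.extend(node.tasks)
            node.tasks.clear()

def reactive_step(nodes, failed):
    """Reactive: restore a failed node's tasks from their last checkpoints.

    In practice the checkpoints would live on stable or replicated storage;
    they sit on the node object here purely to keep the sketch short.
    """
    survivors = [n for n in nodes if n is not failed]
    if not survivors:
        return
    for task in failed.tasks:
        state = failed.checkpoints.get(task, "initial")
        target = min(survivors, key=lambda n: len(n.tasks))
        target.tasks.append(task)
        target.checkpoints[task] = state     # task resumes from saved state
    failed.tasks.clear()

# Hybrid usage: run proactive_step periodically; call reactive_step on failure.
nodes = [FogNode("fog-1", 0.1, ["t1"]), FogNode("fog-2", 0.8, ["t2"])]
proactive_step(nodes)                        # "t2" migrates off at-risk fog-2
```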


2021, pp. 308-318
Author(s):
Hadeel T. Rajab, Manal F. Younis

The Internet of Things (IoT) contributes to improving the quality of life, as it supports many applications, especially healthcare systems. Data generated by IoT devices is sent to Cloud Computing (CC) for processing and storage, despite the latency caused by the distance. Because of the revolution in IoT devices, the volume of data sent to the CC has been increasing; as a result, growing congestion on the cloud network has compounded the latency problem. Fog Computing (FC) is used to solve these problems because of its proximity to IoT devices, filtering the data before it is sent to the CC. FC is a middle layer located between IoT devices and the CC layer. Because of the massive data generated by IoT devices arriving at the FC, the Dynamic Weighted Round Robin (DWRR) algorithm, a load balancing (LB) algorithm, is applied to schedule and distribute data among fog servers by reading the CPU and memory values of these servers, in order to improve system performance. The results show that the DWRR algorithm provides high throughput, reaching 3290 req/sec with 919 users. Much research is concerned with workload distribution using LB techniques without paying much attention to Fault Tolerance (FT), i.e., the ability of a system to continue operating even when a fault occurs. Therefore, we propose a replication FT technique for the FC: primary-backup replication based on a dynamic checkpoint interval. Checkpointing replicates new data from the primary server to a backup server dynamically by monitoring the CPU value of the primary fog server, so that a checkpoint occurs only when the CPU value is larger than 0.2, which reduces overhead. The results show that the execution time of the data filtering process on the FC with a dynamic checkpoint is less than with a static checkpoint that is independent of the CPU status.
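The two mechanisms in this abstract, DWRR scheduling and CPU-triggered checkpointing, can be sketched briefly. The Python below is a hedged illustration: the server load interface, the weight formula, and the state dictionaries are assumptions; only the 0.2 CPU threshold comes from the abstract.

```python
from itertools import cycle

def dwrr_schedule(servers, requests):
    """Distribute requests among fog servers in proportion to dynamic weights.

    `servers` maps name -> (cpu_util, mem_util), both in [0, 1]; lighter-loaded
    servers get larger weights and therefore receive more requests.
    """
    weights = {name: max(1, round(10 * (1 - (cpu + mem) / 2)))
               for name, (cpu, mem) in servers.items()}
    assignment = {name: [] for name in servers}
    # Expand each server into `weight` slots and deal requests over the cycle.
    slots = cycle([n for n, w in weights.items() for _ in range(w)])
    for req in requests:
        assignment[next(slots)].append(req)
    return assignment

CPU_CHECKPOINT_THRESHOLD = 0.2               # threshold reported in the abstract

def maybe_checkpoint(primary_cpu, primary_state, backup_state):
    """Primary-backup replication with a dynamic checkpoint interval:
    replicate to the backup only when the primary's CPU value exceeds
    the threshold, so idle periods incur no checkpoint overhead."""
    if primary_cpu > CPU_CHECKPOINT_THRESHOLD:
        backup_state.update(primary_state)   # copy new data to the backup
        return True
    return False

# Example: fog-1 is lightly loaded (weight 7) and receives 7 of 10 requests.
plan = dwrr_schedule({"fog-1": (0.2, 0.4), "fog-2": (0.8, 0.6)},
                     [f"req{i}" for i in range(10)])
```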


Author(s):  
Babangida Zubairu

The emergence of new innovations in technology changes the rate at which data is generated in health-related institutions and the way that data should be handled. The amount of data generated is ever increasing, which demands advanced, automated management systems and storage platforms for handling large biomedical data sets. Cloud computing has emerged as a promising technology that can handle large amounts of data and enhance remote processing and management of that data. One of the chief concerns about the technology is the security of the data. Data in the cloud is subject to security threats, and this has highlighted the need to explore security measures against those threats. The chapter provides a detailed analysis of cloud computing deployment strategies, the risks associated with the technology, and tips for biomedical data storage and processing through cloud computing services.


Author(s):
Zainab Javed, Waqas Mahmood

The rise of technological advancements has the potential to improve and transform our lives every day, and rapid technological innovation can have a great impact on business operations. Currently, cloud computing services are popular and offer a wide range of opportunities to their customers. This paper presents a survey of a more recent computing architecture paradigm known as Fog Computing. Fog networking is a beneficial solution that offers greater data storage and enhanced computing and networking resources. This new concept of fog complements the cloud solution by providing its customers with better security, real-time analysis, and improved efficiency. To get a clear picture and understanding of how fog computing functions, we have performed an extensive literature review. We also present a comparative study of fog computing with cloud and grid computing architectures. In this study, we conducted a survey that led us to the conclusion that fog computing is still not adopted and implemented in most IoT industries due to a lack of awareness and the high cost of its architecture. Results of the study also indicate that optimized data storage and security are among the factors that can motivate organizations to implement the fog computing architecture. Furthermore, the challenges related to fog computing are reviewed to guide progressive developments in the future.


2012, Vol 3 (2), pp. 51-59
Author(s):
Nawsher Khan, A. Noraziah, Elrasheed I. Ismail, Mustafa Mat Deris, Tutut Herawan

Cloud computing is fundamentally altering expectations for how and when computing, storage, and networking resources should be allocated, managed, and consumed, and it allows users to utilize services globally. Thanks to its powerful computing and storage, high availability and security, easy accessibility and adaptability, reliable scalability and interoperability, and cost and time effectiveness, cloud computing is in high demand in today's fast-growing business world. A client, organization, or business adopting the emerging cloud environment can choose a suitable infrastructure, platform, software, and network resource for its purposes, where each has its own exclusive features and advantages. The authors first develop a comprehensive classification for describing cloud computing architecture. This classification helps in surveying several existing cloud computing services developed globally by projects such as Amazon, Google, Microsoft, Sun, and Force.com; using the survey's results, the authors identify similarities and differences among cloud computing architecture approaches.


Author(s):
R. Priyadarshini, N. Malarvizhi, E. A. Neeba

Fog computing is a new paradigm, regarded as an extension of cloud computing and its services to the edge of the network. Like the cloud, fog provides computing, data, storage, and various application services to connected end users. Fog computing uses one or more collaborative end-user or near-user edge devices to perform configuration, communication, storage, control, and management functions over the supporting infrastructure. This new paradigm addresses the latency and bandwidth limitations encountered with cloud computing. First, the architecture of fog computing is discussed and analyzed in this work, and the related potential security and trust issues are indicated. Then, how such issues are tackled in the existing literature is systematically reported. Finally, the open challenges, research trends, and future topics of security and trust in fog computing are discussed.


Author(s):
Stojan Kitanov, Toni Janevski

Pushing computing, control, data storage, and processing into the cloud has been a key trend in the past decade. However, the cloud alone cannot meet the upcoming computing and intelligent networking demands, which call for low latency, high mobility, high scalability, and real-time execution. A new paradigm called fog computing has emerged to overcome these limits. Fog extends cloud computing and services to the edge of the network. It provides data, computing, storage, and application services to end users, hosted at the network edge. It reduces service latency and improves QoS/QoE, which results in a superior user experience. This chapter provides an introduction to and overview of fog computing; a comparison of fog computing with cloud computing and with mobile edge computing; a possible fog computing architecture; applications of fog computing; and possible research directions.


Author(s):
Michael Davis, Alice Sedsman

Cloud computing has been heralded as a new era in the evolution of information and communications technologies. ICT giants have invested heavily in developing technologies and mega server facilities, which allow end users to access web-based software applications and store their data off-site. Businesses using cloud computing services will benefit from reduced operating costs as they cut back on ICT infrastructure and personnel. Individuals will no longer need to buy and install software and will have universal access to their data through any internet-ready device. Yet, hidden amongst the host of benefits are inherent legal risks. The global nature of cloud computing raises questions about privacy, security, confidentiality and access to data. Current terms of use do not adequately address the multitude of legal issues unique to cloud computing. In the face of this legal uncertainty, end users should be educated about the risks involved in entering the cloud.


2017, Vol 27 (03n04), pp. 1750010
Author(s):
Amedeo Sapio, Mario Baldi, Fulvio Risso, Narendra Anand, Antonio Nucci

Traffic capture and analysis is key to many domains, including network management, security, and network forensics. Traditionally, it is performed by a dedicated device accessing traffic at a specific point within the network, through a link tap or a port of a node mirroring packets. This approach is problematic because the dedicated device must be equipped with a large amount of computation and storage resources to store and analyze packets. Alternatively, to achieve scalability, analysis can be performed by a cluster of hosts; however, such a cluster is normally located remotely from the observation point, which requires moving a large volume of captured traffic across the network. To address this problem, this paper presents an algorithm to distribute the tasks of capturing, processing, and storing packets traversing a network across multiple packet forwarding nodes (e.g., IP routers). Essentially, our solution allows individual nodes on the path of a flow to operate on subsets of the packets of that flow in a completely distributed and decentralized manner. The algorithm ensures that each packet is processed by n nodes, where n can be set to 1 to minimize overhead or to a higher value to achieve redundancy. Nodes create a distributed index that enables efficient retrieval of the packets they store (e.g., for forensics applications). Finally, the basic principles of the presented solution can also be applied, with minimal changes, to the distributed execution of generic tasks on data flowing through a network of nodes with processing and storage capabilities. This has applications in various fields ranging from fog computing to microservice architectures and the Internet of Things.
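One plausible realization of the per-packet assignment is sketched below; it is an illustration under stated assumptions, not necessarily the paper's exact algorithm. Each node on a flow's path hashes the packet independently and checks whether its own position falls among the n responsible positions, so the decision is fully decentralized yet each packet is handled by exactly n nodes.

```python
import hashlib

def responsible(node_index, path_length, packet_id, n=1):
    """Decide locally whether the node at `node_index` (0-based position on
    the flow's path) should capture/process the packet `packet_id` (bytes).

    Nodes only need to agree on the path length and know their own position;
    no per-packet coordination is required.
    """
    digest = hashlib.sha256(packet_id).digest()
    start = int.from_bytes(digest[:4], "big") % path_length
    # The n consecutive nodes starting at `start` (wrapping around the path)
    # are responsible for this packet.
    return (node_index - start) % path_length < n

# Example: a 5-hop path with redundancy n=2; exactly two nodes match.
pkt = b"10.0.0.1:443->10.0.0.2:51512#seq=7"
owners = [i for i in range(5) if responsible(i, 5, pkt, n=2)]
assert len(owners) == 2
```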

