A Survey Based Study on Fog Computing Awareness

Author(s):  
Zainab Javed ◽  
Waqas Mahmood

In this day and age, the rise in technological advancement has the potential to improve and transform our lives every day, and rapid technological innovation can have a great impact on business operations. Currently, cloud computing services are popular and offer a wide range of opportunities to their customers. This paper presents a survey on a more recent computing architecture paradigm known as fog computing. Fog networking is a beneficial solution that offers greater data storage, enhanced computing, and networking resources. This new concept of fog complements the cloud solution by providing its customers with better security, real-time analysis, and improved efficiency. To get a clear picture and understanding of how fog computing functions, we have performed an extensive literature review. We also present a comparative study of fog computing with cloud and grid computing architectures. In this study, we conducted a survey that led us to the conclusion that fog computing is still not adopted or implemented in most IoT industries, due to a lack of awareness and the high cost of the architecture. The results of the study also indicate that optimized data storage and security are among the factors that can motivate organizations to implement the fog computing architecture. Furthermore, the challenges related to the fog computing solution are reviewed for progressive developments in the future.

Author(s):  
Sejal Atit Bhavsar ◽  
Kirit J Modi

Fog computing is a paradigm that extends cloud computing services to the edge of the network. Fog computing provides data, storage, compute and application services to end users. The distinguishing characteristic of fog computing is its proximity to end users. The application services are hosted on network edges, such as routers, switches, etc. The goal of fog computing is to improve efficiency and reduce the amount of data that needs to be transported to the cloud for analysis, processing and storage. Due to the heterogeneous characteristics of fog computing, there are some issues, i.e. security, fault tolerance, and resource scheduling and allocation. To better understand fault tolerance, we highlight its basic concepts by examining the different fault tolerance techniques, i.e. reactive, proactive and hybrid. In addition to fault tolerance, how to balance resource utilization and security in fog computing is also discussed here. Furthermore, to overcome platform-level issues of fog computing, a hybrid fault tolerance model using resource management and security is presented.
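
Purely as an illustration of how the reactive and proactive techniques named above can combine, the short Python sketch below pairs a proactive health probe with a reactive retry on another node. The class and function names (FogNode, run_with_hybrid_tolerance, etc.) are hypothetical; this is a sketch of the general idea, not the authors' model.

import random
import time

# Illustrative hybrid fault tolerance: proactively skip nodes that fail a
# health check, then reactively retry the task on another node if it fails anyway.

class FogNode:
    def __init__(self, name, failure_rate=0.2):
        self.name = name
        self.failure_rate = failure_rate

    def is_healthy(self):
        # Proactive part: a lightweight heartbeat / health probe.
        return random.random() > self.failure_rate

    def execute(self, task):
        # Reactive part is triggered when execution itself fails.
        if random.random() < self.failure_rate:
            raise RuntimeError(f"{self.name} failed while running {task}")
        return f"{task} completed on {self.name}"

def run_with_hybrid_tolerance(task, nodes, max_retries=3):
    # Prefer nodes that pass the proactive check; fall back to all nodes if none do.
    candidates = [n for n in nodes if n.is_healthy()] or list(nodes)
    for attempt in range(max_retries):
        node = candidates[attempt % len(candidates)]
        try:
            return node.execute(task)
        except RuntimeError:
            time.sleep(0.01)  # back off briefly before the reactive retry
    raise RuntimeError(f"{task} failed on all retries")

if __name__ == "__main__":
    nodes = [FogNode("edge-router-1"), FogNode("edge-switch-2"), FogNode("gateway-3")]
    print(run_with_hybrid_tolerance("sensor-aggregation", nodes))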


2011 ◽  
Vol 1 (2) ◽  
pp. 29-40 ◽  
Author(s):  
Louay Karadsheh ◽  
Samer Alhawari

Over a decade ago, cloud computing became an important topic for small and large businesses alike. The new concept promises scalability, security, cost reduction, portability, and availability. While addressing this issue over the past several years, there have been intensive discussions about the importance of cloud computing technologies. Therefore, this paper reviews the transition from traditional computing to cloud computing and its benefits for businesses, cloud computing architecture, the classification of cloud computing services, and deployment models. Furthermore, this paper discusses the security policies and types of internal risks that a small business might encounter when implementing cloud computing technologies. It addresses initiatives toward employing certain types of security policies in small businesses that implement cloud computing technologies, encouraging small businesses to migrate to the cloud by portraying what is needed to secure their infrastructure with traditional security policies, without the complexity used in large corporations.


2021 ◽  
Author(s):  
Rory James Munro ◽  
Nadine Holmes ◽  
Christopher Moore ◽  
Matthew Carlile ◽  
Alex Payne ◽  
...  

Motivation: The ongoing SARS-CoV-2 pandemic has demonstrated the utility of real-time analysis of sequencing data, with a wide range of databases and resources for analysis now available. Here we show how the real-time nature of Oxford Nanopore Technologies sequencers can accelerate consensus generation and lineage and variant status assignment. We exploit the fact that multiplexed viral sequencing libraries quickly generate sufficient data for the majority of samples, with diminishing returns on the remaining samples as the sequencing run progresses. We demonstrate methods to determine when a sequencing run has passed this point in order to reduce the time and cost of sequencing. Results: We extended minoTour, our real-time analysis and monitoring platform for nanopore sequencers, to provide SARS-CoV-2 analysis using ARTIC network pipelines. We additionally developed an algorithm to predict which samples will achieve sufficient coverage, automatically running the ARTIC medaka informatics pipeline once specific coverage thresholds have been reached on these samples. After testing on run data, we find that significant run-time savings are possible, enabling flow cells to be used more efficiently and data analysis throughput to be increased. The resultant consensus genomes are assigned both PANGO lineage and variant status as defined by Public Health England. Samples from within individual runs are used to generate phylogenetic trees, incorporating optional background samples, as well as summaries of individual SNPs. As minoTour uses ARTIC pipelines, new primer schemes and pathogens can be added, allowing minoTour to aid in the real-time analysis of pathogens in the future.
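
As an illustration of the kind of per-sample decision described above, the following Python sketch checks whether each barcoded sample has reached a coverage threshold and returns the barcodes ready for downstream consensus generation. The 20x depth and 90% breadth thresholds and all function names are assumptions for illustration, not minoTour's actual implementation.

# Simplified, hypothetical version of the decision the abstract describes:
# trigger the downstream pipeline for a sample once enough of its positions
# have reached a minimum sequencing depth.

def sample_ready(per_base_coverage, min_depth=20, min_fraction=0.9):
    """True if at least min_fraction of positions have depth >= min_depth."""
    covered = sum(1 for depth in per_base_coverage if depth >= min_depth)
    return covered / len(per_base_coverage) >= min_fraction

def samples_to_trigger(coverage_by_sample, min_depth=20, min_fraction=0.9):
    """Return the barcodes whose coverage now justifies running the consensus pipeline."""
    return [barcode for barcode, cov in coverage_by_sample.items()
            if sample_ready(cov, min_depth, min_fraction)]

if __name__ == "__main__":
    # Toy per-position depths for three barcoded samples.
    coverage_by_sample = {
        "barcode01": [25, 30, 22, 40, 21],   # all positions >= 20x -> ready
        "barcode02": [5, 8, 12, 7, 30],      # mostly under 20x -> not ready
        "barcode03": [20, 20, 19, 50, 60],   # 4/5 positions covered -> below 90%
    }
    print(samples_to_trigger(coverage_by_sample))  # ['barcode01']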


Electronics ◽  
2021 ◽  
Vol 10 (17) ◽  
pp. 2110
Author(s):  
Desire Ngabo ◽  
Dong Wang ◽  
Celestine Iwendi ◽  
Joseph Henry Anajemba ◽  
Lukman Adewale Ajao ◽  
...  

The recent developments in fog computing architecture and cloud of things (CoT) technology include data mining management and artificial intelligence operations. However, one of the major challenges of this model is its vulnerability to security threats and cyber-attacks against the fog computing layers. In such a scenario, each of the layers is susceptible to different threats, including the sensed data (edge layer), the computing and processing of data (fog layer), and storage and management for public users (cloud layer). The conventional data storage and security mechanisms currently in use appear unsuitable for such a huge amount of data generated in the fog computing architecture. Thus, the major focus of this research is to provide security countermeasures against medical data mining threats, which arise from the sensing layer (a human wearable device) and the storage of data in the cloud database of the internet of things (IoT). Therefore, we propose a public-permissioned blockchain security mechanism using an elliptic curve cryptography (ECC) digital signature that supports a distributed ledger database (server) to provide an immutable security solution and transaction transparency, and to prevent tampering with patient records at the IoT fog layer. The blockchain technology approach also helps to mitigate the issues of latency, centralization, and scalability in the fog model.
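
The sketch below is a minimal, hypothetical illustration of the two ingredients combined above: an ECC digital signature over each patient record and a hash-linked, append-only ledger. It uses the third-party Python cryptography package and toy record fields; it is not the authors' protocol.

# Toy illustration only: sign a patient record with an ECDSA key and append it
# to a hash-linked list that mimics an immutable ledger.
# Requires the third-party `cryptography` package.
import hashlib
import json

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec

private_key = ec.generate_private_key(ec.SECP256R1())   # device / fog-node key
public_key = private_key.public_key()

ledger = []  # each entry links to the hash of the previous one

def append_record(record):
    payload = json.dumps(record, sort_keys=True).encode()
    signature = private_key.sign(payload, ec.ECDSA(hashes.SHA256()))
    prev_hash = ledger[-1]["hash"] if ledger else "0" * 64
    entry = {"payload": payload, "signature": signature, "prev_hash": prev_hash}
    entry["hash"] = hashlib.sha256(payload + signature + prev_hash.encode()).hexdigest()
    ledger.append(entry)

def verify_ledger():
    # Raises InvalidSignature if a payload or signature was altered;
    # returns False if the hash chain has been broken or reordered.
    prev_hash = "0" * 64
    for entry in ledger:
        public_key.verify(entry["signature"], entry["payload"], ec.ECDSA(hashes.SHA256()))
        if entry["prev_hash"] != prev_hash:
            return False
        prev_hash = entry["hash"]
    return True

append_record({"patient": "anon-42", "heart_rate": 71, "ts": "2021-08-30T10:00:00Z"})
append_record({"patient": "anon-42", "heart_rate": 75, "ts": "2021-08-30T10:05:00Z"})
print(verify_ledger())  # True unless an entry has been tampered with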


2020 ◽  
Vol 63 (4) ◽  
pp. 567-592
Author(s):  
Jiafu Jiang ◽  
Linyu Tang ◽  
Ke Gu ◽  
WeiJia Jia

Fog computing has become an emerging environment that provides data storage, computing and other services at the edge of the network. It not only acquires data from terminal devices, but also provides computing services to users by opening up computing resources. Compared with cloud computing, fog devices can collaborate to provide users with powerful computing services through resource allocation. However, as many fog devices are not monitored, there are security problems. For example, since the fog server processes and maintains user information, device information, task parameters and so on, it can easily perform illegal resource allocation for extra benefit. In this paper, we propose a secure computing resource allocation framework for open fog computing. In our scheme, the fog server is responsible for processing computing requests and resource allocations, and the cloud audit center is responsible for auditing the behaviors of the fog servers and fog nodes. Based on the proposed security framework, our scheme can resist the attack of a single malicious node and the collusion attack of the fog server and computing devices. Furthermore, the experiments show that our proposed scheme is efficient. For example, when the number of initially idle service devices is 40, the rejection rate of allocated tasks is 10% and the total number of sub-tasks is varied from 150 to 200, the total allocation time of our scheme only changes from 15 ms to 25 ms; additionally, when a 5000-order matrix multiplication task is tested on 10 service devices, the total computing time of our scheme is approximately 250 s, which is better than that of a single computer (which needs more than 1500 s). Therefore, our proposed scheme has obvious advantages when facing tasks that require more computational cost, such as complex scientific computing, distributed massive data queries, distributed image processing and so on.
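
As a loose illustration of the auditing idea, the Python sketch below shows a hypothetical cross-check in which an audit centre compares the fog server's allocation log with what each service device reports having executed; all names and the data layout are assumptions, not the proposed protocol. The reported timings also imply roughly a six-fold speedup on ten devices (more than 1500 s on a single computer versus about 250 s distributed).

# Hypothetical simplification of an audit-centre check: flag devices whose
# self-reported sub-tasks disagree with the fog server's allocation log.

def audit_allocations(server_log, device_reports):
    """Return the device ids whose reports disagree with the fog server's log."""
    suspicious = []
    for device_id, assigned_tasks in server_log.items():
        reported = set(device_reports.get(device_id, []))
        if reported != set(assigned_tasks):
            suspicious.append(device_id)
    return suspicious

if __name__ == "__main__":
    server_log = {"dev-1": ["t1", "t2"], "dev-2": ["t3"], "dev-3": ["t4", "t5"]}
    device_reports = {"dev-1": ["t1", "t2"], "dev-2": ["t3", "t9"], "dev-3": ["t4", "t5"]}
    print(audit_allocations(server_log, device_reports))  # ['dev-2'] flags the mismatch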


Author(s):  
Niall Sclater

The procurement of cloud computing services involves a wide range of issues and risks for educational institutions. The technologies and services available are rapidly evolving, differ greatly between providers, and are subject to complex contractual arrangements with potentially serious legal and business implications. There is no specific cloud computing legislation, but the area is subject to a wide and growing range of laws relating to Internet-based services, some written decades ago (Baker, 2009). Resolution of the new issues relating to security, privacy, and regulation in the cloud will take many years (Kaufman, 2009). This chapter outlines the key issues institutions need to investigate when considering the deployment of services in the cloud to students, faculty, and staff.

