Blockchain Processing Technique Based on Multiple Hash Chains for Minimizing Integrity Errors of IoT Data in Cloud Environments

Sensors ◽  
2021 ◽  
Vol 21 (14) ◽  
pp. 4679
Author(s):  
Yoon-Su Jeong

As IoT (Internet of Things) devices are used in an ever-wider range of fields (manufacturing, health, medical, energy, home, automotive, transportation, etc.), it is becoming important to analyze and process the data they send and receive over the Internet. Data collected from IoT devices typically depends on secure storage in databases located in cloud environments. However, storing IoT data directly in a cloud database not only makes it difficult to control the data directly, but also fails to guarantee its integrity, owing to a number of hazards (errors and error handling, security attacks, etc.) that can arise from natural disasters and management neglect. In this paper, we propose an optimized hash processing technique that enables hierarchical distributed processing with an n-bit blockchain to minimize the loss of data generated by IoT devices deployed in distributed cloud environments. The proposed technique minimizes IoT data integrity errors and strengthens the role of intermediate media acting as gateways by interactively authenticating n-bit blockchains against the n + 1 and n − 1 layers, so that the IoT data sent and received can be validated normally. In particular, the proposed technique ensures the reliability of IoT information by validating hash values of IoT data while storing the index information of IoT data distributed across different locations in a blockchain, in order to maintain data integrity. Furthermore, the proposed technique preserves the linkage of IoT data by tolerating only minimal errors in the collected data while grouping its linkage information, thus optimizing the load balance after hash processing. In the performance evaluation, the proposed technique reduced IoT data processing time by a factor of 2.54 on average, and blockchain generation time improved by 17.3% on average when linking IoT data.
The asymmetric storage efficiency of IoT data as a function of hash code length improved by 6.9% on average over existing techniques, the asymmetric storage speed for IoT data blocks was 10.3% faster on average than existing techniques, and the integrity accuracy of IoT data improved by 18.3% on average over existing techniques.
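The layered hash-chain linkage described in the abstract can be sketched as follows. This is a minimal illustration only: the paper's exact block format, hash function, and layer-authentication protocol are not given in the abstract, so the record layout and the modeling of cross-layer authentication as re-verification of both layers' chains are assumptions.

```python
import hashlib

def block_hash(index_info: str, prev_hash: str) -> str:
    """Hash a block's IoT index information together with the previous block's hash."""
    return hashlib.sha256((prev_hash + index_info).encode()).hexdigest()

def build_layer_chain(index_records):
    """Build one layer's hash chain over IoT index records (assumed structure)."""
    chain, prev = [], "0" * 64  # genesis value
    for rec in index_records:
        h = block_hash(rec, prev)
        chain.append({"index": rec, "hash": h, "prev": prev})
        prev = h
    return chain

def cross_validate(layer_n, layer_np1):
    """Mutual check between adjacent layers, modeled here as re-verifying that
    every block's stored hash is recomputable from its own fields."""
    return all(block_hash(b["index"], b["prev"]) == b["hash"]
               for b in layer_n + layer_np1)

ln = build_layer_chain(["sensor-a:0", "sensor-a:1"])
lnp1 = build_layer_chain(["sensor-b:0"])
print(cross_validate(ln, lnp1))  # True
```

Any tampering with a stored index record changes its recomputed hash and breaks the validation, which is the property the technique relies on to catch integrity errors.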

Sensors ◽  
2021 ◽  
Vol 21 (6) ◽  
pp. 2049
Author(s):  
Yoon-Su Jeong ◽  
Sung-Ho Sim

As cloud technology advances, IoT (Internet of Things) devices are being utilized in areas ranging from transportation, manufacturing, energy, automation, space, and defense to healthcare. As the number of IoT devices grows, the safety of IoT information, which is vulnerable to cyber attacks, is emerging as an important concern in distributed cloud environments. However, existing integrity techniques cannot easily identify the integrity threats and attacks on IoT information operating in the distributed clouds associated with IoT systems and CPS (Cyber-Physical Systems). In this paper, we propose a blockchain-based integrity verification technique in which the integrity of large amounts of IoT information processed in distributed cloud environments can be guaranteed against security threats related to IoT systems and CPS. The proposed technique ensures the integrity of IoT information by linking information from IoT devices belonging to subgroups in distributed cloud environments to information from specific non-adjacent IoT devices and to the blockchain; existing techniques, by contrast, rely on third-party organizations trusted by the data owner to verify data integrity. The proposed technique identifies IoT information by connecting the paths of preceding and subsequent IoT blocks into block chains, so that synchronization can be achieved between subgroups in distributed cloud environments. Furthermore, the proposed technique uses probabilistic similarity information between IoT information blocks to react flexibly to the subgroups that constitute distributed clouds, so that IoT information blocks cannot be exploited maliciously by third parties. In the performance evaluation, the proposed technique achieved an average 12.3% improvement in integrity processing time over existing techniques, depending on blockchain size.
Furthermore, because the proposed technique hashes the IoT information constituting a subgroup together with probability-linked information when validating the integrity of large-capacity IoT information, it incurs an average of 8.8% lower overhead than existing techniques. In addition, the proposed technique improves blockchain-based integrity verification accuracy by an average of 14.3% over existing techniques, depending on hash chain length.
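The verification side of such a chain, i.e. detecting a tampered block by recomputing digests along the linkage of preceding and subsequent blocks, can be sketched in a few lines. The record format and genesis value here are illustrative assumptions; the abstract does not specify the paper's actual block encoding or how the probability-linked information is incorporated.

```python
import hashlib

def digest(data: str, prev: str) -> str:
    return hashlib.sha256((prev + data).encode()).hexdigest()

def make_chain(records):
    """Chain each record to its predecessor's digest (assumed layout)."""
    prev, chain = "0" * 64, []
    for r in records:
        d = digest(r, prev)
        chain.append((r, prev, d))
        prev = d
    return chain

def verify(chain) -> bool:
    """Recompute every digest; any tampered record or broken link fails."""
    prev = "0" * 64
    for record, stored_prev, stored_digest in chain:
        if stored_prev != prev or digest(record, prev) != stored_digest:
            return False
        prev = stored_digest
    return True

chain = make_chain(["iot:temp=21", "iot:temp=22"])
print(verify(chain))  # True
```

Because each digest depends on all preceding records, a single altered block invalidates every later link, which is what lets subgroups detect manipulation without a trusted third party.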


Author(s):  
Jaber Almutairi ◽  
Mohammad Aldossary

Abstract
Recently, the number of Internet of Things (IoT) devices connected to the Internet has increased dramatically, as has the data produced by these devices. This calls for offloading IoT tasks to resource-rich nodes, such as Edge Computing and Cloud Computing, to relieve devices of heavy computation and storage. Although Edge Computing is a promising enabler for latency-sensitive applications, its deployment produces new challenges, and different service architectures and offloading strategies have different impacts on the service-time performance of IoT applications. Therefore, this paper presents a novel approach for task offloading in an Edge-Cloud system that minimizes the overall service time for latency-sensitive applications. The approach adopts fuzzy logic algorithms, considering application characteristics (e.g., CPU demand, network demand, and delay sensitivity) as well as resource utilization and resource heterogeneity. A number of simulation experiments were conducted to compare the proposed approach with related approaches; it was found to improve the overall service time for latency-sensitive applications and to utilize Edge-Cloud resources effectively. The results also show that different offloading decisions within the Edge-Cloud system can lead to varying service times, owing to differences in computational resources and communication types.
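A fuzzy-logic-style offloading decision of the kind the abstract describes can be sketched as below. The membership functions, rule set, and thresholds are purely illustrative assumptions, not the paper's actual fuzzy system; the point is only to show how fuzzified application characteristics combine into an edge-vs-cloud decision.

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def offload_decision(cpu_demand, net_demand, delay_sensitivity, edge_util):
    """Illustrative rules: CPU-heavy, delay-tolerant tasks (or a busy edge)
    favor the cloud; delay-sensitive, network-light tasks favor the edge."""
    heavy = tri(cpu_demand, 0.4, 1.0, 1.6)        # degree task is CPU-heavy
    chatty = tri(net_demand, 0.4, 1.0, 1.6)       # degree task is network-heavy
    urgent = tri(delay_sensitivity, 0.4, 1.0, 1.6)
    busy_edge = tri(edge_util, 0.5, 1.0, 1.5)
    cloud_score = max(min(heavy, 1 - urgent), busy_edge)  # fuzzy OR of two rules
    edge_score = min(urgent, 1 - chatty)                  # fuzzy AND
    return "cloud" if cloud_score > edge_score else "edge"

print(offload_decision(0.9, 0.2, 0.9, 0.1))  # "edge": urgent, light, idle edge
```

A real controller would defuzzify a continuous score rather than take a hard max, but the min/max rule composition shown here is the standard Mamdani-style building block.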


2019 ◽  
Vol 6 (1) ◽  
Author(s):  
Mahdi Torabzadehkashi ◽  
Siavash Rezaei ◽  
Ali HeydariGorji ◽  
Hosein Bobarshad ◽  
Vladimir Alves ◽  
...  

Abstract
In the era of big data applications, the demand for more sophisticated data centers and high-performance data processing mechanisms is increasing drastically. Data originally reside in storage systems; to process them, application servers must fetch them from storage devices, which imposes the cost of moving data through the system. This cost is directly related to the distance between the processing engines and the data, and it is the key motivation for the emergence of distributed processing platforms such as Hadoop, which move processing closer to the data. Computational storage devices (CSDs) push the "move process to data" paradigm to its ultimate boundary by deploying embedded processing engines inside storage devices. In this paper, we introduce Catalina, an efficient and flexible computational storage platform that provides a seamless environment for in-place data processing. Catalina is the first CSD equipped with a dedicated application processor running a full-fledged operating system that provides filesystem-level data access for applications, so a vast spectrum of applications can be ported to run on Catalina CSDs. Owing to these unique features, to the best of our knowledge, Catalina is the only in-storage processing platform that can be seamlessly deployed in clusters to run distributed applications such as Hadoop MapReduce and HPC applications in-place, without any modification of the underlying distributed processing framework. As a proof of concept, we built a fully functional Catalina prototype and a CSD-equipped platform using 16 Catalina CSDs, and ran Intel HiBench Hadoop and HPC benchmarks to investigate the benefits of deploying Catalina CSDs in distributed processing environments. The experimental results show up to a 2.2× improvement in performance and a 4.3× reduction in energy consumption for the Hadoop MapReduce benchmarks.
Additionally, thanks to the Neon SIMD engines, the performance and energy efficiency of DFT algorithms improve by up to 5.4× and 8.9×, respectively.


2018 ◽  
Vol 10 (3) ◽  
pp. 61-83 ◽  
Author(s):  
Deepali Chaudhary ◽  
Kriti Bhushan ◽  
B.B. Gupta

This article describes how cloud computing has emerged as a strong competitor to traditional IT platforms by offering low-cost, "pay-as-you-go" computing and on-demand provisioning of services. Governments as well as organizations have migrated all or most of their IT infrastructure to the cloud. With the emergence of IoT devices and big data, the amount of data forwarded to the cloud has increased enormously, and the cloud computing paradigm alone is no longer sufficient. Furthermore, with growing demand for IoT solutions in organizations, it has become essential to process data quickly, at scale, and on-site. Hence, fog computing was introduced to overcome these drawbacks of cloud computing by bringing intelligence to the edge of the network using smart devices. One major security issue related to the cloud is the DDoS attack. This article discusses in detail the DDoS attack, cloud computing, fog computing, how DDoS attacks affect cloud environments, and how fog computing can be used in a cloud environment to solve a variety of problems.


2021 ◽  
Author(s):  
Hamed Hasibi ◽  
Saeed Sedighian Kashi

Fog computing brings cloud capabilities closer to Internet of Things (IoT) devices, which generate a tremendous amount of stream data toward the cloud via hierarchical fog nodes. To process data streams, many Stream Processing Engines (SPEs) have been developed. Without a fog layer, stream query processing executes in the cloud, which forwards a large volume of traffic there. When a hierarchical fog layer is available, a complex query can be divided into simple queries that run on fog nodes using distributed stream processing. In this paper, we propose an approach for assigning stream queries to fog nodes using container technology, which we name Stream Queries Placement in Fog (SQPF). Our goal is to minimize end-to-end delay and thereby achieve a better quality of service. First, in the emulation step, we build Docker container instances from SPEs and evaluate their processing delay and throughput under different resource configurations and queries with varying input rates. Then, in the placement step, we assign queries to fog nodes using a genetic algorithm. In real scenarios, the practical approach used in SQPF achieves a near-optimal assignment with respect to the tightest application deadline, as the evaluation results confirm.
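The placement step, a genetic algorithm searching over query-to-node assignments, can be sketched as below. The delay matrix, population size, and operators are illustrative assumptions; SQPF's real cost model would come from the emulation-step measurements, and its fitness would reflect deadlines rather than this simple delay sum.

```python
import random

random.seed(0)

# Illustrative per-(query, node) processing delays in ms (assumed values).
DELAY = [[5, 9, 3], [7, 2, 8], [4, 6, 5], [9, 3, 2]]  # 4 queries x 3 fog nodes
N_QUERIES, N_NODES = len(DELAY), len(DELAY[0])

def fitness(assign):
    """Total end-to-end delay of an assignment; lower is better."""
    return sum(DELAY[q][n] for q, n in enumerate(assign))

def evolve(pop_size=20, generations=60):
    """Elitist GA: keep the best half, refill with crossover + mutation."""
    pop = [[random.randrange(N_NODES) for _ in range(N_QUERIES)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        survivors = pop[: pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, N_QUERIES)        # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < 0.2:                   # point mutation
                child[random.randrange(N_QUERIES)] = random.randrange(N_NODES)
            children.append(child)
        pop = survivors + children
    return min(pop, key=fitness)

best = evolve()
print(best, fitness(best))
```

On a toy instance like this the GA converges to the per-query minimum quickly; its value is that the same loop scales to assignment spaces far too large to enumerate.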


Author(s):  
Mohd Javaid ◽  
Abid Haleem ◽  
Ravi Pratap Singh ◽  
Rajiv Suman

Artificial intelligence (AI) contributes to the recent developments in Industry 4.0. Industries are focusing on improving product consistency and productivity while reducing operating costs, and they want to achieve this through a collaborative partnership between robots and people. In smart industries, hyperconnected manufacturing processes depend on different machines that interact through AI automation systems, capturing and interpreting all types of data. Smart automation platforms can play a decisive role in transforming modern production. AI provides the information needed to support decision-making and to alert people to possible malfunctions. Industries will use AI to process data transmitted from Internet of Things (IoT) devices and connected machines as they integrate these into their equipment, giving companies the ability to fully track their end-to-end activities and processes. This literature-review-based paper aims to outline the vital role of AI in successfully implementing Industry 4.0, with research objectives crafted to serve researchers, practitioners, students, and industry professionals. First, it discusses the significant technological features and traits of AI that are critical for Industry 4.0. Second, it identifies the significant advancements in, and the various challenges to, implementing AI for Industry 4.0. Finally, it identifies and discusses significant applications of AI for Industry 4.0. This extensive review-based exploration shows that the advantages of AI are widespread, as is the need for stakeholders to understand the kind of automation platform they require in the new manufacturing order. Furthermore, this technology seeks correlations to avoid errors and, eventually, to anticipate them. Thus, AI technology is gradually accomplishing the various goals of Industry 4.0.


Sensors ◽  
2019 ◽  
Vol 19 (5) ◽  
pp. 1006 ◽  
Author(s):  
Charikleia Papatsimpa ◽  
Jean-Paul Linnartz

Smart buildings with connected lighting and sensors are likely to become one of the first large-scale applications of the Internet of Things (IoT). However, as the number of interconnected IoT devices is expected to rise exponentially, the amount of collected data will be enormous but highly redundant, and devices will be required to pre-process data locally or at least in their vicinity. Thus, local data fusion under communication constraints will become necessary, and distributed architectures will become increasingly unavoidable. Anticipating this trend, this paper addresses the problem of presence detection in a building as distributed sensing of a hidden Markov model (DS-HMM) with limited communication. The key idea in our work is the use of a posteriori probabilities or likelihood ratios (LRs) as an appropriate "interface" between heterogeneous sensors with different error profiles. We propose an efficient transmission policy, together with a fusion algorithm, to merge data from various HMMs running separately on all sensor nodes, with all models observing the same Markovian process. To test the feasibility of our DS-HMM concept, a simple proof-of-concept prototype was deployed in a typical office environment. The experimental results show full functionality and validate the benefits: our proposed scheme achieved high accuracy while reducing communication requirements. The concept of DS-HMM with a posteriori probabilities as an interface is suitable for many other distributed information fusion applications in wireless sensor networks.
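The "likelihood ratio as interface" idea can be illustrated with a single Bayesian update: each sensor, whatever its error profile, reports only a log-likelihood ratio, and the fusion node sums them in log-odds space. This sketch omits the HMM transition step (the paper's fusion tracks a Markov process over time) and the LLR values are made up for illustration.

```python
import math

def fuse_presence(prior: float, llrs) -> float:
    """Fuse per-sensor log-likelihood ratios into a posterior P(presence).
    Each sensor reports log( P(obs | present) / P(obs | absent) ), so
    heterogeneous sensors combine by simple addition in log-odds space."""
    log_odds = math.log(prior / (1 - prior)) + sum(llrs)
    return 1 / (1 + math.exp(-log_odds))  # logistic: back to probability

# Prior 0.5; two sensors mildly indicate presence, one mildly against
# (illustrative values, not measured error profiles).
p = fuse_presence(0.5, [1.2, 0.8, -0.3])
print(round(p, 3))
```

Because only a scalar LLR crosses the network per sensor per update, this interface is what lets the transmission policy throttle communication without losing the ability to fuse optimally.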


2021 ◽  
Vol 2021 ◽  
pp. 1-14
Author(s):  
Sabeeh Ahmad Saeed ◽  
Farrukh Zeeshan Khan ◽  
Zeshan Iqbal ◽  
Roobaea Alroobaea ◽  
Muneer Ahmad ◽  
...  

The Internet of Things (IoT) is considered one of the world's leading technologies, with billions of IoT devices connected together to form smart cities. As the concept grows, it is very challenging to design an infrastructure capable of handling a large number of devices and processing data effectively in a smart-city paradigm. This paper proposes a structure for smart cities, implemented with a lightweight, easy-to-implement network design and a simple data format for information exchange that is suitable for developing countries such as Pakistan. Using MQTT as the network protocol, different sensor nodes were deployed to collect data from the environment. Environmental factors such as temperature, moisture, humidity, and the percentages of CO2 and methane gas were recorded and transferred to a sink node for sharing over the IoT cloud through an MQTT broker that can be accessed at any time using a Mosquitto client. The experimental results provide a performance analysis of the proposed network at different QoS levels of the MQTT protocol for IoT-based smart cities. A JSON structure is used to define the communication data format of the proposed system.
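A JSON message of the kind such a sensor node might publish can be sketched with the standard library alone. The field names, topic naming, and QoS choice below are assumptions for illustration; the paper's actual schema is not given in the abstract, and an actual deployment would hand the serialized payload to an MQTT client (e.g. a paho/Mosquitto client) for publishing.

```python
import json
import time

def make_payload(node_id: str, readings: dict) -> str:
    """Serialize one sensor node's environmental readings as a compact
    JSON message (hypothetical schema)."""
    msg = {
        "node": node_id,
        "ts": int(time.time()),         # Unix timestamp of the sample
        "readings": readings,           # e.g. temperature, humidity, CO2 %, CH4 %
    }
    return json.dumps(msg, separators=(",", ":"))

payload = make_payload("node-07", {"temp_c": 24.5, "humidity": 61, "co2_pct": 0.041})
print(payload)
# A client would then publish this, e.g. on topic "city/env/node-07" at QoS 1.
```

Keeping the payload a flat, compact JSON object is what makes the format cheap enough for constrained nodes while staying trivially parseable on the cloud side.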

