Data Integrity Protection in Cloud

2021 ◽  
Vol 10 (02) ◽  
pp. 211-218
Author(s):  
Indira G ◽  
Sujitha S ◽  
S.Ganapathy Subramanian

In cloud computing, data integrity and access control are challenging issues, and protecting outsourced data in cloud storage becomes critical. Regenerating codes provide fault tolerance for such data. Our problem of study is therefore remotely checking the integrity of data against corruption and other failures in a real-time cloud storage setting. We practically design and implement a Data Integrity Protection (DIP) environment.
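The core idea of remote integrity checking described above can be sketched with keyed hashes: the owner tags each block before outsourcing, then later re-derives the tag for any challenged block. This is a minimal illustration, not the paper's DIP scheme; the block layout and key handling are assumptions.

```python
import hashlib
import secrets

def tag_blocks(blocks):
    """Owner: compute a keyed tag per block before outsourcing."""
    key = secrets.token_bytes(16)
    tags = [hashlib.sha256(key + b).hexdigest() for b in blocks]
    return key, tags

def verify_block(key, block, expected_tag):
    """Verifier: re-derive the tag for a challenged block and compare."""
    return hashlib.sha256(key + block).hexdigest() == expected_tag

blocks = [b"block-0", b"block-1", b"block-2"]
key, tags = tag_blocks(blocks)
assert verify_block(key, blocks[1], tags[1])       # intact block passes
assert not verify_block(key, b"corrupt", tags[1])  # corruption is detected
```

Because the key never leaves the owner, the storage server cannot forge tags for modified blocks.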

Author(s):  
PALLAVI R ◽  
DR. R APARNA

It has been widely observed that cloud computing has become one of the major paradigms in the IT industry. Data owners release the burden of storing and maintaining data locally by storing it in the cloud. Cloud storage moves the owner's data to large, remotely located data centers over which the owner has no control. However, this unique feature of the cloud poses many new security challenges. One of the important concerns that needs to be addressed is access control and integrity of outsourced data in the cloud. A number of schemes have been proposed to achieve access control of outsourced data, such as hierarchical attribute-set-based encryption (HASBE), which extends ciphertext-policy attribute-set-based encryption (CP-ABE). Even though the HASBE scheme achieves scalability, flexibility, and fine-grained access control, it fails to prove data integrity in the cloud. Hence, an integrity-checking concept has been proposed for the HASBE scheme. Although that scheme achieves integrity, it fails to keep the data available to the user when a fault has occurred in the cloud. Moreover, the fact that owners no longer have physical possession of their data means they face a potentially formidable risk of missing or corrupted data, because the cloud service provider sometimes deletes data that a client has not used for a long time, or that occupies large space in the cloud, without the knowledge or permission of the data owner. To avoid this security risk, in this paper we propose a hybrid cloud concept. A hybrid cloud is a cloud computing environment in which an organization provides and manages some resources internally and others externally; it is a composition of at least one private cloud and at least one public cloud. This concept provides availability and a data integrity proof for the HASBE scheme.
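One way to picture the hybrid-cloud division of labor above is to keep integrity metadata on the trusted private side while the data itself lives on the public side. The sketch below is only illustrative (the dictionaries standing in for the two clouds and the names `outsource`/`check_integrity` are assumptions, not the paper's protocol):

```python
import hashlib

private_cloud = {}  # trusted: holds only small integrity tags
public_cloud = {}   # untrusted: holds the outsourced data

def outsource(name, data):
    """Owner stores the data publicly and its digest privately."""
    public_cloud[name] = data
    private_cloud[name] = hashlib.sha256(data).hexdigest()

def check_integrity(name):
    """Recompute the public copy's digest and compare with the private tag."""
    return hashlib.sha256(public_cloud[name]).hexdigest() == private_cloud[name]

outsource("scan.dat", b"owner data")
assert check_integrity("scan.dat")
public_cloud["scan.dat"] = b"tampered"  # e.g. silent deletion or modification
assert not check_integrity("scan.dat")
```

Tampering or unauthorized deletion on the public cloud is caught because the reference digest lives outside the provider's control.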


Author(s):  
Neha Thakur ◽  
Aman Kumar Sharma

Cloud computing has been envisioned as a definitive solution to the rising storage costs of IT enterprises. There are many cloud computing initiatives from IT giants such as Google, Amazon, Microsoft, and IBM. Integrity monitoring is essential in cloud storage for the same reasons that data integrity is critical for any data center. Data integrity is defined as the accuracy and consistency of stored data, in the absence of any alteration to the data between two updates of a file or record. To ensure the integrity and availability of data in the cloud and enforce the quality of cloud storage service, efficient methods that enable on-demand data correctness verification on behalf of cloud users have to be designed. To overcome the data integrity problem, many techniques have been proposed under different system and security models. This paper focuses on some of these integrity-proving techniques in detail, along with their advantages and disadvantages.
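A family of the integrity-proving techniques surveyed here works by spot-checking: the verifier challenges a few randomly chosen blocks rather than downloading the whole file, detecting large-scale corruption with high probability. A minimal sketch, assuming MAC tags kept by the verifier (the function names are illustrative):

```python
import hashlib
import hmac
import random

def make_tags(key, blocks):
    """Owner: one MAC tag per block, retained by the verifier."""
    return [hmac.new(key, b, hashlib.sha256).digest() for b in blocks]

def spot_check(key, tags, server_blocks, c, seed=0):
    """Challenge c random block indices; the server returns those blocks."""
    rng = random.Random(seed)
    indices = rng.sample(range(len(tags)), c)
    return all(
        hmac.compare_digest(
            tags[i], hmac.new(key, server_blocks[i], hashlib.sha256).digest()
        )
        for i in indices
    )

key = b"owner-key"
blocks = [b"blk%d" % i for i in range(50)]
tags = make_tags(key, blocks)
assert spot_check(key, tags, blocks, c=10)            # honest server passes
assert not spot_check(key, tags, [b"x"] * 50, c=10)   # wholesale corruption fails
```

The trade-off is probabilistic detection: corrupting a single block may evade a small challenge, so the challenge size `c` is chosen against the fraction of corruption one wants to catch.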


2018 ◽  
Vol 16 (1) ◽  
pp. 1-16 ◽  
Author(s):  
Mbarek Marwan ◽  
Ali Kartit ◽  
Hassan Ouahmane

Nowadays, modern healthcare providers generate massive numbers of medical images every day because of the recent progress in imaging tools. This is generally due to the increasing number of patients demanding medical services, and it has resulted in a continuous demand for large storage space. Unfortunately, healthcare domains still use local data centers for storing medical data and managing business processes. This has significant negative impacts on operating costs associated with licensing fees and maintenance. To overcome these challenges, healthcare organizations are interested in adopting cloud storage rather than on-premise hosted solutions. This is mainly justified by the scalability, cost savings, and availability of cloud services. The primary objective of this model is to outsource data and delegate IT computations to an external party, which delivers the needed storage systems via the Internet to fulfill clients' demands. Even though this model provides significant cost advantages, using cloud storage raises security challenges. To this aim, this article describes several solutions which were proposed to ensure data protection. The existing implementations suffer from many limitations. The authors propose a framework to secure the storage of medical images over cloud computing. In this regard, they use multi-region segmentation and watermarking techniques to maintain both confidentiality and integrity. In addition, they rely on an ABAC model to ensure access control to cloud storage. This solution mainly includes four functions, i.e., (1) splitting data for privacy protection, (2) authentication for medical dataset access, (3) integrity checking, and (4) access control to enforce security measures. Hence, the proposal is an appropriate solution to meet privacy requirements.
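The ABAC component mentioned above grants access by matching subject attributes against policy rules rather than fixed roles. A minimal sketch of such rule evaluation; the attribute names (`role`, `department`) and policies are illustrative assumptions, not the authors' actual policy set:

```python
# Hypothetical policies: each grants an action when all required
# attributes are present on the requesting subject.
POLICIES = [
    {"action": "read",  "require": {"role": "radiologist"}},
    {"action": "write", "require": {"role": "radiologist",
                                    "department": "imaging"}},
]

def abac_allow(subject_attrs, action, policies=POLICIES):
    """Grant if some policy for this action has every required
    attribute satisfied by the subject's attributes."""
    return any(
        p["action"] == action
        and all(subject_attrs.get(k) == v for k, v in p["require"].items())
        for p in policies
    )

assert abac_allow({"role": "radiologist"}, "read")
assert not abac_allow({"role": "nurse"}, "read")
```

ABAC's appeal for medical imaging is that new staff categories or departments only require new attribute rules, not a redesign of the role hierarchy.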


2016 ◽  
Vol 11 (2) ◽  
pp. 126-134
Author(s):  
Ma Haifeng ◽  
Gao Zhenguo ◽  
Yao Nianmin

Cloud storage services enable users to migrate their data and applications to the cloud, which saves local data maintenance and brings great convenience to users. But in cloud storage, the storage servers may not be fully trustworthy. How to verify the integrity of cloud data with lower overhead for users has become a problem of increasing concern. Many remote data integrity protection methods have been proposed, but these methods authenticate cloud files one by one when verifying multiple files, so the computation and communication overheads remain high. Aiming at this problem, a hierarchical remote data possession checking (H-RDPC) method is proposed, which can provide efficient and secure remote data integrity protection and can support dynamic data operations. This paper gives the algorithm descriptions, security analysis, and false-negative-rate analysis of H-RDPC. The security analysis and experimental performance evaluation show that the proposed H-RDPC is efficient and reliable in verifying massive numbers of cloud files, and it achieves a 32–81% improvement in performance compared with RDPC.
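The gain from hierarchical checking comes from aggregating many per-file checks into one. A common way to illustrate that idea (a sketch only, not the authors' H-RDPC construction) is a Merkle tree: one root hash covers every file, and any corruption changes the root.

```python
import hashlib

def _h(x):
    return hashlib.sha256(x).digest()

def merkle_root(leaves):
    """Hash each block, then hash pairs level by level up to a single root."""
    level = [_h(b) for b in leaves]
    while len(level) > 1:
        if len(level) % 2:                  # duplicate last node on odd levels
            level.append(level[-1])
        level = [_h(level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
    return level[0]

files = [b"file-%d" % i for i in range(100)]
root = merkle_root(files)                   # one value covers all 100 files
assert merkle_root(files) == root
files[42] = b"corrupted"
assert merkle_root(files) != root           # any corruption changes the root
```

Verifying one root instead of 100 separate tags is what cuts the communication overhead; a challenged file is then checked with a logarithmic-size path rather than by re-reading everything.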


Information ◽  
2020 ◽  
Vol 11 (9) ◽  
pp. 409
Author(s):  
Yuan Ping ◽  
Yu Zhan ◽  
Ke Lu ◽  
Baocang Wang

Although cloud storage provides convenient data outsourcing services, an untrusted cloud server frequently threatens the integrity and security of the outsourced data. Therefore, it is extremely urgent to design security schemes that allow users to check the integrity of data with acceptable computational and communication overheads. In this paper, we first propose a public data integrity verification scheme based on algebraic signatures and elliptic curve cryptography. This scheme not only allows a third-party authority to verify the outsourced data integrity on behalf of users, but also resists malicious attacks such as replay, replacement, and forgery attacks. Data privacy is guaranteed by symmetric encryption. Furthermore, we construct a novel data structure named the divide-and-conquer hash list, which can efficiently perform data updating operations such as deletion, insertion, and modification. Compared with the relevant schemes in the literature, security analysis and performance evaluations show that the proposed scheme gains some advantages in integrity verification and dynamic updating.
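The dynamic-update requirement above can be pictured with a plain hash list over ordered block digests; insertion, deletion, and modification only touch one leaf before the root is recomputed. This is a simplified stand-in, not the paper's divide-and-conquer hash list, and the class name is an assumption:

```python
import hashlib

class HashList:
    """Toy dynamic hash list: a root digest over ordered block hashes.
    A simplification used to illustrate dynamic updates."""

    def __init__(self, blocks):
        self.leaves = [hashlib.sha256(b).digest() for b in blocks]

    def root(self):
        h = hashlib.sha256()
        for leaf in self.leaves:
            h.update(leaf)                  # order-sensitive aggregation
        return h.hexdigest()

    def insert(self, i, block):
        self.leaves.insert(i, hashlib.sha256(block).digest())

    def delete(self, i):
        del self.leaves[i]

    def modify(self, i, block):
        self.leaves[i] = hashlib.sha256(block).digest()

hl = HashList([b"b0", b"b1", b"b2"])
before = hl.root()
hl.modify(1, b"b1-updated")
assert hl.root() != before                  # the root reflects the update
```

A flat list makes root recomputation linear; the divide-and-conquer structure in the paper exists precisely to avoid that linear cost on large files.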


2014 ◽  
Vol 496-500 ◽  
pp. 1905-1908
Author(s):  
Hong Xia Mao

Cloud storage offers so many advantages that it has become an ideal way to store data, so the technical questions of security and reliability should be researched. Access control and encryption are used to secure data storage. Redundancy technology, which can improve data reliability in cloud storage, includes replication and erasure codes. Under the specific experimental environment, erasure codes have an absolute advantage in fault tolerance and storage capacity.
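The storage advantage of erasure codes over replication can be seen with back-of-envelope arithmetic: an RS(k, m)-style code stores k data plus m parity fragments, costing (k+m)/k bytes per original byte, versus n bytes for n-way replication. A sketch under those standard definitions (the specific parameters are illustrative):

```python
# Storage overhead: bytes stored per original byte.
def replication_overhead(copies):
    return float(copies)          # n-way replication stores n full copies

def erasure_overhead(k, m):
    return (k + m) / k            # k data + m parity fragments encode k blocks

# 3-way replication survives 2 lost copies at 3.0x storage;
# RS(10, 4) survives ANY 4 lost fragments at only 1.4x storage.
assert replication_overhead(3) == 3.0
assert abs(erasure_overhead(10, 4) - 1.4) < 1e-9
```

The trade-off is that reconstructing a lost fragment under erasure coding requires reading several surviving fragments, whereas replication repairs by a single copy; this repair cost is what regenerating codes, mentioned elsewhere in this collection, are designed to reduce.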


Cloud computing is an essential platform for a wide range of users. It delivers high-quality, reliable services to users via data storage servers. The key challenge in a cloud architecture is to run these facilities without any hassle to the users. In today's world of information technology, most applications are real-time, and the major constraint of systems used in real-time applications is that they are prone to failure. The failure may be due to the following reasons: a) failure to complete the task within the prescribed time threshold; b) failure to achieve the prescribed reliability value. Virtualization and Internet-based cloud computing cause diverse types of failures to occur, so the need for reliability and availability has become of vital concern. To ensure the reliability and availability of cloud technologies, methods for fault tolerance need to be developed and deployed. The proposed work focuses on adaptive behavior in the selection of replication and fine-grained checkpointing methods for achieving a reliable cloud platform that can handle diverse client requests. In addition, the proposed work determines the most suitable fault tolerance scheme for each chosen virtual machine.
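The checkpointing half of the fault-tolerance strategy described above can be sketched in a few lines: a task saves its state every few steps, and on failure it rolls back to the last checkpoint instead of restarting from scratch. This is a generic illustration under assumed names, not the proposed adaptive scheme:

```python
def run(total, checkpoint_every, inject_fail_at=None):
    """Run `total` steps, checkpointing state every `checkpoint_every` steps.
    A simulated failure triggers rollback to the last saved checkpoint."""
    state = {"step": 0, "acc": 0}
    saved = dict(state)                       # initial checkpoint
    while state["step"] < total:
        try:
            if inject_fail_at is not None and state["step"] == inject_fail_at:
                inject_fail_at = None         # fail only once
                raise RuntimeError("simulated VM failure")
            state["acc"] += state["step"]     # the "work" of this step
            state["step"] += 1
            if state["step"] % checkpoint_every == 0:
                saved = dict(state)           # persist a checkpoint
        except RuntimeError:
            state = dict(saved)               # roll back, resume from checkpoint
    return state["acc"]

# With or without a mid-run failure, the task produces the same result.
assert run(10, 2) == sum(range(10))
assert run(10, 2, inject_fail_at=5) == sum(range(10))
```

The checkpoint interval is the knob an adaptive scheme tunes: frequent checkpoints shrink the redone work after a failure but add overhead during normal operation, which is why pairing checkpointing with replication per virtual machine can make sense.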

