An extensive research survey on data integrity and deduplication towards privacy in cloud storage

Author(s):  
Anil Kumar G. ◽  
Shantala C. P.

Owing to the highly distributed nature of the cloud storage system, incorporating a high degree of security for vulnerable data is a challenging task. Among the various security concerns, data privacy remains one of the unsolved problems in this regard. The prime reason is that existing approaches to data privacy do not offer data integrity and secure data deduplication at the same time, both of which are essential to ensure a high degree of resistance against all forms of dynamic threats over cloud and internet systems. Data integrity and data deduplication are thus associated phenomena that influence data privacy. Therefore, this manuscript discusses the explicit research contributions toward data integrity, data privacy, and data deduplication. The manuscript also highlights the potential open research issues, followed by a discussion of possible future directions of work toward addressing the existing problems.

2014 ◽  
Vol 668-669 ◽  
pp. 1257-1262
Author(s):  
Jun Zuo ◽  
Zhen Zhu ◽  
Jing Yan Wang

In order to meet the growing demand for storage resources and improve the security and integrity of enterprise information resources, this paper first analyzes the status of storage resources and the existing problems of Foshan XY Elec-Mech Limited Corp. Based on a detailed analysis of the cloud storage hierarchy and its advantages, a cloud storage framework is designed to meet the enterprise's demands; security control strategies and backup strategies are also discussed. After implementation, the integration of information resources across multiple systems and platforms can be realized, providing users with massive information storage and access services. Furthermore, when a disaster occurs, multiple backups can be used for recovery, ensuring the continuity of the enterprise's business processing.


2018 ◽  
Vol 10 (2) ◽  
pp. 70-89 ◽  
Author(s):  
Jun Li ◽  
Mengshu Hou

This article describes how deduplication technology is introduced into cloud storage in order to reduce the amount of data. By adopting this technology, duplicated data can be eliminated and users can reduce their storage requirements. However, deduplication also decreases data availability. To solve this problem, the authors propose a method to improve data availability in the deduplication storage system. It is based on the data chunk reference count and access frequency, and increases redundant information for the data chunks to ensure data availability while minimizing storage overhead. Extensive experiments are conducted to evaluate the effectiveness of the improved method, using WFD, CDC, and sliding-block deduplication technology for comparison. The experimental results show that the proposed method achieves higher data availability than the conventional method while adding little storage overhead.
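The core idea of the abstract above — giving more redundancy to chunks that many files reference or that are accessed often — can be sketched as follows. This is a minimal illustration, not the authors' implementation; the function name, score formula, and thresholds are all assumptions chosen for clarity.

```python
# Illustrative sketch (not the paper's actual method): pick a replication
# factor for a deduplicated chunk from its reference count and access
# frequency, so widely shared, frequently read chunks get extra copies.

def replication_factor(ref_count: int, access_freq: float,
                       base: int = 1, max_copies: int = 4) -> int:
    """More referenced / more frequently accessed chunks get more replicas."""
    score = ref_count * (1.0 + access_freq)  # hypothetical combined score
    if score >= 100:
        extra = 3
    elif score >= 10:
        extra = 2
    elif score >= 2:
        extra = 1
    else:
        extra = 0
    return min(base + extra, max_copies)
```

A cold chunk referenced by one file keeps a single copy, while a hot chunk shared by many files is replicated up to the cap, trading a little storage overhead for availability.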


2019 ◽  
Vol 30 (04) ◽  
pp. 551-570 ◽  
Author(s):  
Wenjuan Meng ◽  
Jianhua Ge ◽  
Tao Jiang

A cloud storage system that incorporates both deletion and deduplication functionalities will have security and efficiency advantages over existing solutions that provide only one of them. However, the security models of secure data deletion and data deduplication are not compatible with each other, which causes security and efficiency vulnerabilities under coercive adversaries. To solve these challenges, we define and construct a scheme whose security relies on the proper erasure of keys in the wrapped key tree and the periodic update of the deduplication encryption keys. Moreover, we enhance the efficiency of the proposed scheme by introducing incremental data update, where only the changed part is encrypted/decrypted and uploaded/downloaded during data updating. Further security analysis shows that the proposed scheme is secure against coercive attack. Finally, the practical implementation shows that our scheme is efficient in computation, storage, and communication for both the cloud storage server and users.
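The incremental-update idea mentioned above (only the changed part is re-processed and re-uploaded) can be sketched with per-chunk hashing. This is a simplified stand-in, assuming fixed-size chunks; the paper's scheme also encrypts the chunks, which is omitted here.

```python
import hashlib

CHUNK = 4096  # fixed-size chunks for illustration; real systems may differ

def chunk_hashes(data: bytes) -> list[str]:
    """Hash each fixed-size chunk of the data."""
    return [hashlib.sha256(data[i:i + CHUNK]).hexdigest()
            for i in range(0, len(data), CHUNK)]

def changed_chunks(old: bytes, new: bytes) -> list[int]:
    """Indices of chunks in `new` that differ from `old` and so are the
    only parts that need to be re-encrypted and re-uploaded."""
    old_h, new_h = chunk_hashes(old), chunk_hashes(new)
    return [i for i, h in enumerate(new_h)
            if i >= len(old_h) or h != old_h[i]]
```

For an update that touches one chunk of a large file, only that chunk crosses the network, which is where the communication savings in the abstract come from.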


2021 ◽  
Vol 2021 ◽  
pp. 1-5
Author(s):  
K. Mahalakshmi ◽  
K. Kousalya ◽  
Himanshu Shekhar ◽  
Aby K. Thomas ◽  
L. Bhagyalakshmi ◽  
...  

Cloud storage provides a potential solution for replacing physical disk drives with prominent outsourcing services. A threat from an untrusted server affects the security and integrity of the data. Moreover, data integrity and the cost of communication and computation are directly proportional to each other, which is a major problem. It is hence necessary to develop a model that provides a trade-off between data integrity and cost metrics in a cloud environment. In this paper, we develop an integrity verification mechanism that combines a cryptographic solution with algebraic signatures. The model utilises the elliptic curve digital signature algorithm (ECDSA) to verify the outsourced data. The scheme further resists malicious attacks, including forgery attacks, replacing attacks, and replay attacks, while symmetric encryption guarantees the privacy of the data. A simulation is conducted to test the efficacy of the algorithm in maintaining data integrity at reduced cost. The performance of the entire model is tested against existing methods in terms of communication cost, computation cost, and overhead cost. The simulation results show that the proposed method reduces computation cost by 0.25% and communication cost by 0.21% compared with other public auditing schemes.
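The trade-off the abstract describes — verifying integrity without downloading everything — follows the usual challenge-response pattern. The sketch below illustrates that pattern with per-block HMAC tags and random spot-checks; it is not the paper's ECDSA/algebraic-signature construction, and the names and block size are assumptions.

```python
import hashlib, hmac, secrets

BLOCK = 1024  # illustrative block size

def tag_blocks(key: bytes, data: bytes) -> list[bytes]:
    """Owner precomputes a keyed tag per block before outsourcing the data."""
    return [hmac.new(key, data[i:i + BLOCK], hashlib.sha256).digest()
            for i in range(0, len(data), BLOCK)]

def audit(key: bytes, stored: bytes, tags: list[bytes], samples: int = 3) -> bool:
    """Verifier challenges a few random blocks instead of downloading the
    whole file, bounding communication and computation cost per audit."""
    n = len(tags)
    for _ in range(samples):
        i = secrets.randbelow(n)
        block = stored[i * BLOCK:(i + 1) * BLOCK]
        if not hmac.compare_digest(
                hmac.new(key, block, hashlib.sha256).digest(), tags[i]):
            return False
    return True
```

Raising `samples` raises the detection probability for partial corruption at the cost of more verification work, which is exactly the integrity-versus-cost proportionality the abstract points at.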


IJARCCE ◽  
2017 ◽  
Vol 6 (4) ◽  
pp. 316-323
Author(s):  
Bhos Komal ◽  
Ingale Karuna ◽  
Hattikatti Susmita ◽  
Jadhav Sachin ◽  
Mirajkar SS ◽  
...  

Cloud computing is an efficient technology that provides storage for huge amounts of data files with security. However, the content owner cannot control data access by unauthorized clients, nor control data storage and usage. Some previous approaches combine data access control with data deduplication for cloud storage systems, but encrypted data is not effectively handled by current industrial deduplication solutions: the deduplication is unguarded against brute-force attacks and fails to support data access control. Data deduplication is a widely used and efficient data-confining technique that eliminates multiple redundant copies of data; it reduces the space needed to store the data and thus saves bandwidth. To overcome the above problems, an efficient content discovery and preserving deduplication (ECDPD) algorithm was proposed that detects the client file range and block range of deduplication when storing data files in the cloud storage system. Data access control is actively supported by ECDPD. Experimental evaluations show that the proposed ECDPD method reduces data uploading time (DUT) by 3.802 milliseconds and data downloading time (DDT) by 3.318 milliseconds compared with existing approaches.
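The block-range deduplication the abstract refers to — checking which blocks the server already holds before uploading — can be sketched as a simple hash index. This is a toy stand-in, not the ECDPD algorithm itself (whose access-control component is omitted); class and method names are assumptions.

```python
import hashlib

class DedupIndex:
    """Toy block-level deduplication index: before uploading, check each
    block's hash against what the server already stores and send only
    previously unseen blocks."""

    def __init__(self, block_size: int = 4096):
        self.block_size = block_size
        self.blocks: dict[str, bytes] = {}  # block hash -> stored block

    def upload(self, data: bytes) -> int:
        """Store only new blocks; return the number of bytes actually sent."""
        sent = 0
        for i in range(0, len(data), self.block_size):
            block = data[i:i + self.block_size]
            h = hashlib.sha256(block).hexdigest()
            if h not in self.blocks:
                self.blocks[h] = block
                sent += len(block)
        return sent
```

Reuploading a file whose blocks are already indexed transfers nothing, which is the source of the upload-time savings the evaluation measures.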


Information ◽  
2020 ◽  
Vol 11 (9) ◽  
pp. 409
Author(s):  
Yuan Ping ◽  
Yu Zhan ◽  
Ke Lu ◽  
Baocang Wang

Although cloud storage provides convenient data outsourcing services, an untrusted cloud server frequently threatens the integrity and security of the outsourced data. Therefore, it is extremely urgent to design security schemes that allow users to check the integrity of data with acceptable computational and communication overheads. In this paper, we first propose a public data integrity verification scheme based on the algebraic signature and elliptic curve cryptography. This scheme not only allows a third-party authority to verify the outsourced data integrity on behalf of users, but also resists malicious attacks such as replay attacks, replacing attacks, and forgery attacks. Data privacy is guaranteed by symmetric encryption. Furthermore, we construct a novel data structure named the divide-and-conquer hash list, which can efficiently perform data updating operations such as deletion, insertion, and modification. Compared with the relevant schemes in the literature, security analysis and performance evaluations show that the proposed scheme gains some advantages in integrity verification and dynamic updating.
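The dynamic-update operations the abstract attributes to its divide-and-conquer hash list can be illustrated with a plain list of block hashes. This is a deliberately simplified stand-in — the paper's structure is more refined than a flat list — and all names here are assumptions.

```python
import hashlib

class HashList:
    """Simplified dynamic list of block hashes (a flat stand-in for the
    paper's divide-and-conquer hash list): supports modification, insertion,
    and deletion of outsourced blocks, re-hashing only the affected entry."""

    def __init__(self, blocks: list[bytes]):
        self.hashes = [hashlib.sha256(b).hexdigest() for b in blocks]

    def modify(self, i: int, block: bytes) -> None:
        self.hashes[i] = hashlib.sha256(block).hexdigest()

    def insert(self, i: int, block: bytes) -> None:
        self.hashes.insert(i, hashlib.sha256(block).hexdigest())

    def delete(self, i: int) -> None:
        del self.hashes[i]

    def root(self) -> str:
        """Digest over the whole list, usable to check the stored copy."""
        return hashlib.sha256("".join(self.hashes).encode()).hexdigest()
```

The point of the specialized structure in the paper is to make these operations cheap even for very long lists; the flat version above shows the interface, not the efficiency.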


Webology ◽  
2021 ◽  
Vol 18 (Special Issue 01) ◽  
pp. 288-301
Author(s):  
G. Sujatha ◽  
Dr. Jeberson Retna Raj

Data storage is one of the significant cloud services available to cloud users. Since the volume of outsourced information is growing extremely fast, there is a need to implement a data deduplication technique in the cloud storage space for efficient utilization. The cloud storage space supports all kinds of digital data, such as text, audio, video, and images. In a hash-based deduplication system, a cryptographic hash value is calculated for all data irrespective of type and stored in memory for future reference; duplicate copies are identified using these hash values alone. The problem in this existing scenario is the size of the hash table: to find a duplicate copy, all the hash values must be checked in the worst case, irrespective of the data type. At the same time, not all kinds of digital data suit the same hash-table structure. In this study, we propose an approach that maintains multiple hash tables for the different kinds of digital data. Keeping a dedicated hash table for each data type improves the search time for duplicate data.
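The multi-table idea above can be sketched directly: one hash table per data type, so a duplicate lookup only scans hashes of the same type. The type-detection-by-extension mapping and all names below are assumptions for illustration.

```python
import hashlib, os

class TypedDedupStore:
    """Sketch of per-type deduplication tables: a duplicate check for an
    image only consults the image table, never the text or video tables."""

    # Assumed extension-to-type mapping for the example
    TYPE_MAP = {".txt": "text", ".mp3": "audio", ".mp4": "video",
                ".jpg": "image", ".png": "image"}

    def __init__(self):
        self.tables: dict[str, dict[str, str]] = {}  # type -> {hash: name}

    def add(self, name: str, content: bytes) -> bool:
        """Return True if stored, False if a duplicate already exists
        in the table for this file's data type."""
        ext = os.path.splitext(name)[1].lower()
        dtype = self.TYPE_MAP.get(ext, "other")
        table = self.tables.setdefault(dtype, {})
        h = hashlib.sha256(content).hexdigest()
        if h in table:
            return False  # duplicate found within the type's own table
        table[h] = name
        return True
```

Partitioning by type shrinks each table that a lookup must touch, which is the search-time improvement the study targets.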


2018 ◽  
Vol 7 (S1) ◽  
pp. 16-19
Author(s):  
B. Rasina Begum ◽  
P. Chithra

Cloud computing provides a scalable platform for large amounts of data and processes that serve various applications and services on demand. The storage services offered by clouds have become a new source of profit growth by providing a comparably cheap, scalable, location-independent platform for managing users' data. Clients use cloud storage to enjoy high-end applications and services from a shared pool of configurable computing resources, which reduces the burden of local data storage and maintenance. But it raises severe security issues for users' outsourced data. Data redundancy promotes data reliability in cloud storage; at the same time, it increases storage space, bandwidth, and security threats due to server vulnerabilities. Data deduplication helps improve storage utilization, and backups are smaller, which means less hardware and backup media. But it has many security issues. Data reliability is a critical issue in a deduplication storage system because there is only a single copy of each file stored in the server, shared by all the data owners. If such a shared file/chunk were lost, a large amount of data would become unreachable. The main aim of this work is to implement a deduplication system without sacrificing security in cloud storage, combining deduplication and convergent-key cryptography with reduced overhead.
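The convergent-key idea the abstract combines with deduplication derives the encryption key from the content itself, so identical plaintexts yield identical ciphertexts that the server can deduplicate. The sketch below shows that property with a toy deterministic cipher; a real deployment would use a standard cipher such as AES, and the function names are assumptions.

```python
import hashlib

def convergent_key(plaintext: bytes) -> bytes:
    """Convergent encryption: the key is the hash of the content, so two
    owners of the same file independently derive the same key."""
    return hashlib.sha256(plaintext).digest()

def toy_encrypt(key: bytes, data: bytes) -> bytes:
    # Toy deterministic XOR keystream (SHA-256 in counter mode), for
    # illustration only. XOR makes the same function also decrypt.
    out = bytearray()
    for i in range(0, len(data), 32):
        stream = hashlib.sha256(key + i.to_bytes(8, "big")).digest()
        out.extend(b ^ s for b, s in zip(data[i:i + 32], stream))
    return bytes(out)
```

Because encryption is deterministic in the content, duplicate encrypted files collide on the server and can be stored once, which is how the scheme reconciles deduplication with confidentiality.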

