Secure Deduplication with Encrypted Data for Cloud Storage

Author(s):  
Pasquale Puzio ◽  
Refik Molva ◽  
Melek Önen ◽  
Sergio Loureiro

With the continuous increase in the number of users and the size of their data, data deduplication has become a necessity for cloud storage providers. By storing a unique copy of duplicate data, cloud providers greatly reduce their storage and data transfer costs. The advantages of deduplication unfortunately come at a high cost in terms of new security and privacy challenges. In this chapter we propose ClouDedup, a secure and efficient storage service which ensures block-level deduplication and data confidentiality at the same time. Although ClouDedup is based on convergent encryption, it remains secure thanks to the definition of a component that implements an additional encryption operation. Furthermore, as the requirement for block-level deduplication raises an issue with respect to key management, we propose a new component that implements key management for each block together with the actual deduplication operation. Finally, we show how we have implemented the proposed architecture, the challenges we have met and our solutions to these challenges.
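The layered design described above can be sketched as follows. This is a minimal illustration, not ClouDedup's actual implementation: the XOR keystream construction and the `server_layer` name are stand-ins for the real cipher and the additional encryption component.

```python
import hashlib

def convergent_key(data: bytes) -> bytes:
    # Convergent encryption: the key is derived from the content itself,
    # so identical plaintexts always yield identical ciphertexts.
    return hashlib.sha256(data).digest()

def xor_stream(data: bytes, key: bytes) -> bytes:
    # Toy keystream cipher standing in for a real symmetric cipher;
    # XOR makes encryption and decryption the same operation.
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

def server_layer(ciphertext: bytes, server_secret: bytes) -> bytes:
    # The extra component: a second, keyed encryption applied by a trusted
    # server, defeating offline dictionary attacks on predictable data
    # while staying deterministic so deduplication still works.
    mask = hashlib.sha256(server_secret).digest()
    return xor_stream(ciphertext, mask)

block = b"predictable payroll record"
c1 = server_layer(xor_stream(block, convergent_key(block)), b"server-secret")
c2 = server_layer(xor_stream(block, convergent_key(block)), b"server-secret")
assert c1 == c2  # same plaintext, same ciphertext: deduplication is preserved
```

Because both layers are deterministic, duplicate blocks still collide in storage, yet an attacker without the server secret cannot mount the dictionary attack that plain convergent encryption permits.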

Cloud computing is widely used today because of the enormous amount of data it stores and the quick access to information it provides over the network. It gives individual users virtually unlimited storage space and makes data available and accessible anytime, anywhere. Cloud service providers can increase effective storage capacity by incorporating data deduplication into cloud storage, since deduplication removes redundant and replicated data that occurs in the cloud environment. This paper presents a literature survey of the deduplication techniques that have been applied to cloud data storage. To better ensure secure deduplication in the cloud, the paper examines both file-level and block-level data deduplication.
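The difference between the two granularities surveyed here can be illustrated with a toy fingerprinting sketch; the block size and sample data are arbitrary assumptions for the example.

```python
import hashlib

def file_level_fingerprint(data: bytes) -> str:
    # File-level deduplication: one hash per file, so a single changed
    # byte defeats deduplication for the entire file.
    return hashlib.sha256(data).hexdigest()

def block_level_fingerprints(data: bytes, block_size: int = 8) -> list[str]:
    # Block-level deduplication: hash fixed-size chunks, so unchanged
    # blocks still deduplicate even after a small edit elsewhere.
    return [hashlib.sha256(data[i:i + block_size]).hexdigest()
            for i in range(0, len(data), block_size)]

old = b"AAAAAAAABBBBBBBBCCCCCCCC"
new = b"AAAAAAAAXXXXXXXXCCCCCCCC"   # only the middle block changed
assert file_level_fingerprint(old) != file_level_fingerprint(new)
shared = set(block_level_fingerprints(old)) & set(block_level_fingerprints(new))
assert len(shared) == 2  # the first and last blocks still deduplicate
```

File-level schemes are cheaper to index; block-level schemes save more space when files share partial content, which is the trade-off the surveyed papers explore.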


2020 ◽  
Vol 17 (8) ◽  
pp. 3631-3635
Author(s):  
L. Mary Gladence ◽  
Priyanka Reddy ◽  
Apoorva Shetty ◽  
E. Brumancia ◽  
Senduru Srinivasulu

Data deduplication is one of the main techniques for eliminating duplicate copies of data and is widely used in distributed storage to minimize storage space and save data transfer capacity. Convergent encryption has been proposed to encrypt the data before outsourcing, preserving the confidentiality of sensitive data while still facilitating deduplication. Unlike conventional deduplication systems, users are assigned differential privileges, which are checked during the duplicate test in addition to the data itself. Security analysis shows that the approach is safe with respect to the definitions set out in the proposed security model. The scheme uses the M3 encryption algorithm together with DES: M3 encryption is compared against current techniques with respect to effectiveness, security, and speed, while DES is used to decrypt the stored file back into a form readable by humans. A prototype of the proposed authorized duplicate-check scheme is implemented as a proof of concept, and experiments with this prototype show that the duplicate-check scheme incurs only minimal overhead compared to normal operations.
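The idea of a duplicate check that depends on the user's privilege as well as on the data can be sketched as follows. The HMAC construction and the privilege-key names are illustrative assumptions, not the paper's actual M3/DES scheme.

```python
import hashlib
import hmac

def duplicate_check_token(data: bytes, privilege_key: bytes) -> str:
    # Authorized duplicate check: the token binds the file fingerprint to a
    # privilege-specific key, so only users holding the same privilege can
    # match each other's uploads during the duplicate test.
    fingerprint = hashlib.sha256(data).digest()
    return hmac.new(privilege_key, fingerprint, hashlib.sha256).hexdigest()

doc = b"quarterly report"
t_admin = duplicate_check_token(doc, b"priv:admin")
t_staff = duplicate_check_token(doc, b"priv:staff")
assert t_admin != t_staff  # same file, different privileges: no match
assert t_admin == duplicate_check_token(doc, b"priv:admin")  # same privilege matches
```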


Author(s):  
Shynu P. G. ◽  
Nadesh R. K. ◽  
Varun G. Menon ◽  
Venu P. ◽  
Mahdi Abbasi ◽  
...  

Data redundancy is a significant issue that wastes plenty of storage space in cloud-fog integrated storage environments. Most current techniques, which mainly target static scenes such as backup and archive systems, are not appropriate because of the dynamic nature of data in the cloud or in integrated cloud environments. This problem can be effectively reduced and successfully managed by data deduplication techniques, which eliminate duplicate data in cloud storage systems. Implementing data deduplication (DD) over encrypted data is a significant challenge in an integrated cloud-fog storage and computing environment when optimizing storage efficiently in a highly secure manner. This paper develops a new method using Convergent and Modified Elliptic Curve Cryptography (MECC) algorithms over the cloud and fog environment to construct secure deduplication systems. The proposed method focuses on the two most important goals of such systems: on the one hand, the redundancy of data needs to be reduced to its minimum; on the other, a robust encryption approach must be developed to ensure the security of the data. The proposed technique is well suited for operations such as the uploading of new files by a user to fog or cloud storage. A file is first encrypted using the Convergent Encryption (CE) technique and then re-encrypted using the Modified Elliptic Curve Cryptography (MECC) algorithm. The proposed method can recognize data redundancy at the block level, reducing the redundancy of data more effectively. Testing results show that the proposed approach outperforms several state-of-the-art methods in terms of computational efficiency and security.
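The two-stage, block-level pipeline described above can be sketched as follows. A toy XOR cipher stands in for both the CE block cipher and the MECC re-encryption, and the fixed second-stage key is an illustrative assumption; the point is that both stages stay deterministic, so duplicate blocks from different users still collide in storage.

```python
import hashlib

store: dict[str, bytes] = {}  # tag -> ciphertext, kept once per unique block

def toy_cipher(data: bytes, key: bytes) -> bytes:
    # Deterministic XOR cipher used for both stages in this sketch.
    stream = (key * (len(data) // len(key) + 1))[:len(data)]
    return bytes(a ^ b for a, b in zip(data, stream))

def upload_block(block: bytes, user_keys: list) -> None:
    ce_key = hashlib.sha256(block).digest()   # stage 1: convergent key = H(block)
    ct = toy_cipher(block, ce_key)            # stage 1: CE ciphertext
    ct2 = toy_cipher(ct, b"\x42" * 32)        # stage 2: stands in for MECC layer
    tag = hashlib.sha256(ct2).hexdigest()
    store.setdefault(tag, ct2)                # duplicate blocks are stored once
    user_keys.append((tag, ce_key))           # each user keeps only tag + key

alice, bob = [], []
upload_block(b"shared chunk of a file.........", alice)
upload_block(b"shared chunk of a file.........", bob)
assert len(store) == 1  # block-level deduplication across users
```

To read a block back, a user fetches the ciphertext by tag, strips the second layer, and decrypts with the convergent key kept in their own key list.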


2019 ◽  
Vol 8 (4) ◽  
pp. 9803-9807

Nowadays, data deduplication has become essential for cloud storage providers because of the continuous increase in the number of users and the size of their data files. Users are allowed to access the server anytime and anywhere to upload or download their data files. Whenever data is retrieved, several problems arise concerning confidentiality and privacy. To protect data security, we propose an efficient technique called ClouDedup which ensures file deduplication. To secure the confidentiality of critical data while supporting the ClouDedup checker, we propose a triple data encryption standard (TDES) technique to encrypt the data prior to uploading the data file to cloud storage. The privilege level of the user is verified against the data to determine whether he is an authorized user. The security analysis demonstrates that our proposed method is safe and secure. We show that our proposed ClouDedup method has minimal overhead compared to normal operations. The approach uses an authorized ClouDedup checker with a triple data encryption standard (TDES) technique to minimize duplicate copies of data in hybrid cloud storage, and we conducted test experiments using our prototype.
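The check-before-upload flow implied above can be sketched as follows. A hash-based toy keystream stands in for TDES so the example stays self-contained (the Python standard library has no DES), and all names are illustrative assumptions.

```python
import hashlib

class CloudStore:
    """Minimal stand-in for the cloud side of the upload flow."""
    def __init__(self):
        self.blobs: dict[str, bytes] = {}
        self.uploads = 0  # counts actual data transfers

    def has(self, tag: str) -> bool:
        return tag in self.blobs

    def put(self, tag: str, ct: bytes) -> None:
        self.uploads += 1
        self.blobs[tag] = ct

def encrypt(data: bytes, key: bytes) -> bytes:
    # Toy keystream cipher in place of TDES; deterministic on purpose.
    stream = hashlib.sha256(key).digest() * (len(data) // 32 + 1)
    return bytes(a ^ b for a, b in zip(data, stream))

def upload(store: CloudStore, data: bytes, key: bytes) -> None:
    tag = hashlib.sha256(data).hexdigest()    # deduplication check first...
    if not store.has(tag):
        store.put(tag, encrypt(data, key))    # ...transfer ciphertext only if new

s = CloudStore()
upload(s, b"same file", b"k")
upload(s, b"same file", b"k")
assert s.uploads == 1  # the duplicate copy is never re-transmitted
```

The saving is twofold: the duplicate's ciphertext is neither transmitted nor stored, which is where the bandwidth and storage gains claimed above come from.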


2014 ◽  
Vol 556-562 ◽  
pp. 6223-6227 ◽  
Author(s):  
Chao Ling Li ◽  
Yue Chen

To deduplicate sensitive data in a cloud storage center, a scheme called MHT-Dedup, based on the Merkle Hash Tree (MHT), is proposed. It achieves cross-user file-level client-side deduplication and local block-level client-side deduplication concurrently. It first encrypts the file at block granularity, then authenticates the file ciphertext to find duplicate files via a Proof of oWnership (PoW) protocol, and checks the hashes of block plaintexts to find duplicate blocks. In the PoW protocol of MHT-Dedup, an authenticating binary tree is generated from the tags of the encrypted blocks to reliably identify duplicate files. MHT-Dedup resolves the conflict between data deduplication and encryption, achieves file-level and block-level deduplication concurrently, prevents misuse of the storage system by users, resists inside and outside attacks on data confidentiality, and prevents target collision attacks on files and brute-force attacks on blocks.
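The authenticating binary tree at the heart of such a PoW protocol can be sketched as a minimal Merkle root computation over block tags. Padding an odd level by duplicating its last node is one common convention, assumed here; it is not necessarily the paper's exact construction.

```python
import hashlib

def h(x: bytes) -> bytes:
    return hashlib.sha256(x).digest()

def merkle_root(leaves: list[bytes]) -> bytes:
    # Build the authenticating binary tree bottom-up from block tags;
    # the root commits to every block at once.
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2 == 1:
            level.append(level[-1])  # duplicate the last node on odd levels
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

blocks = [b"ct-block-%d" % i for i in range(4)]
root = merkle_root(blocks)
# A prover who holds every block recomputes the same root; a prover missing
# or altering even one block cannot, which is the basis of the ownership check.
assert merkle_root(blocks) == root
assert merkle_root([b"ct-block-0", b"wrong", b"ct-block-2", b"ct-block-3"]) != root
```

In a full PoW protocol the verifier would challenge random leaves and check their authentication paths against this root rather than requiring the whole file.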

