An Authorized ClouDedup in Hybrid Cloud using Triple Data Encryption Standard

2019 ◽  
Vol 8 (4) ◽  
pp. 9803-9807

Nowadays, data deduplication has become essential for cloud storage providers because of the continuous increase in the number of users and the size of their data files. Users are allowed to access the server anytime and anywhere to upload or download their data files. Whenever data is retrieved, several problems arise concerning confidentiality and privacy. To protect data security, we propose an efficient technique called ClouDedup, which ensures file deduplication. To preserve the confidentiality of critical data while supporting the ClouDedup checker, we propose a Triple Data Encryption Standard (TDES) technique that encrypts the data before the data file is uploaded to cloud storage. The privilege level of the user is verified against the data to determine whether he or she is an authorized user. The security analysis demonstrates that our proposed method is safe and secure, and we show that ClouDedup incurs minimal overhead compared with normal operations. The overall process uses an authorized ClouDedup checker with TDES to minimize duplicate copies of data in hybrid cloud storage, and we conducted test experiments using our prototype.
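To make the upload flow concrete, the following is a minimal Python sketch, assuming the PyCryptodome library: a SHA-256 fingerprint bound to the user's privilege label serves as the duplicate-check tag, and the file body is encrypted with TDES before upload. The tag scheme, the dedup_tag and upload helpers, and the in-memory storage dictionary are illustrative assumptions, not the paper's exact protocol.

# Illustrative sketch (not the paper's exact protocol): client-side TDES
# encryption before upload, with a hash-based duplicate check.
# Requires the PyCryptodome package (pip install pycryptodome).
import hashlib
from Crypto.Cipher import DES3
from Crypto.Random import get_random_bytes
from Crypto.Util.Padding import pad


def dedup_tag(plaintext: bytes, privilege: str) -> str:
    """Fingerprint used for the duplicate check, bound to the user's privilege level (assumed scheme)."""
    return hashlib.sha256(privilege.encode() + plaintext).hexdigest()


def tdes_encrypt(plaintext: bytes, key: bytes) -> bytes:
    """Encrypt with Triple DES in CBC mode; the IV is prepended to the ciphertext."""
    iv = get_random_bytes(8)
    cipher = DES3.new(key, DES3.MODE_CBC, iv)
    return iv + cipher.encrypt(pad(plaintext, DES3.block_size))


def upload(storage: dict, plaintext: bytes, key: bytes, privilege: str) -> str:
    """Upload only if no copy with the same tag exists (deduplication check)."""
    tag = dedup_tag(plaintext, privilege)
    if tag not in storage:                      # ClouDedup-style duplicate check
        storage[tag] = tdes_encrypt(plaintext, key)
    return tag


if __name__ == "__main__":
    cloud = {}
    key = DES3.adjust_key_parity(get_random_bytes(24))   # 24-byte 3DES key
    t1 = upload(cloud, b"quarterly report", key, "manager")
    t2 = upload(cloud, b"quarterly report", key, "manager")  # duplicate, not re-stored
    print(t1 == t2, len(cloud))  # True 1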

2018 ◽  
Vol 3 (2) ◽  
Author(s):  
Siti Amalia Nazihah Surosa ◽  
Iskandar Fitri ◽  
Novi Dian Nathasia

Cloud computing is an efficient technology that provides secure storage for huge amounts of data files. However, the content owner cannot control data access by unauthorized clients, nor control how the data is stored and used. Some previous approaches combine data access control with data deduplication for cloud storage systems, but current industrial deduplication solutions do not handle encrypted data effectively: the deduplication is vulnerable to brute-force attacks and fails to support data access control. Data deduplication is a widely used data-reduction technique that eliminates redundant copies of data, reducing the space needed to store the data and thus saving bandwidth. To overcome the above problems, an Efficient Content Discovery and Preserving Deduplication (ECDPD) algorithm was proposed that detects the client file range and block range for deduplication when storing data files in the cloud storage system. ECDPD actively supports data access control. Experimental evaluations show that the proposed ECDPD method reduces Data Uploading Time (DUT) by 3.802 milliseconds and Data Downloading Time (DDT) by 3.318 milliseconds compared with existing approaches.
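As an illustration of block-range deduplication in general (the abstract does not specify ECDPD's exact range-detection rules), the sketch below splits a file into fixed-size blocks, fingerprints each block, and uploads only blocks the store has not seen. The helper names, the in-memory store, and the 4 KB block size are assumptions for illustration only.

# Illustrative sketch of block-range deduplication (generic content-addressed
# storage); not the exact ECDPD algorithm, whose range detection is unspecified here.
import hashlib

BLOCK_SIZE = 4096  # assumed fixed-size blocks


def split_blocks(data: bytes, size: int = BLOCK_SIZE):
    """Yield fixed-size blocks of the file."""
    for off in range(0, len(data), size):
        yield data[off:off + size]


def upload_file(store: dict, data: bytes) -> list:
    """Store only blocks the cloud has not seen; return the file's block recipe."""
    recipe = []
    for block in split_blocks(data):
        digest = hashlib.sha256(block).hexdigest()
        if digest not in store:        # block-range duplicate check
            store[digest] = block
        recipe.append(digest)
    return recipe


def download_file(store: dict, recipe: list) -> bytes:
    """Reassemble the file from its block recipe."""
    return b"".join(store[d] for d in recipe)


if __name__ == "__main__":
    cloud = {}
    first = upload_file(cloud, b"A" * 10000)
    second = upload_file(cloud, b"A" * 10000 + b"B" * 100)  # shares blocks with the first file
    assert download_file(cloud, second).endswith(b"B" * 100)
    print(len(cloud))  # fewer unique blocks stored than blocks uploaded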


2021 ◽  
Vol 13 (7) ◽  
pp. 181
Author(s):  
Agil Yolchuyev ◽  
Janos Levendovszky

“Hybrid Cloud Storage” (HCS) is a widely adopted framework that combines the functionality of public and private cloud storage models to provide storage services. This kind of storage is especially suitable for organizations that seek to reduce the cost of their storage infrastructure by using “Public Cloud Storage” as a backend to on-premises primary storage. Despite its higher performance, the hybrid cloud has latency issues related to the distance and bandwidth of the public storage, which may cause a significant drop in the performance of the storage system during data transfer. This issue can become a major problem when one or more private storage nodes fail. In this paper, we propose a new framework for optimizing the data uploading process currently used with hybrid cloud storage systems. The optimization spreads the data over the multiple storage nodes in the HCS system according to predefined objective functions. Furthermore, we also use Network Coding techniques to minimize data transfer latency between the receiver (private storage) and transmitter nodes.
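The sketch below illustrates the general idea under stated assumptions: chunks are spread over HCS nodes in proportion to a per-node weight (standing in for the paper's objective functions, whose exact form is not given in the abstract), and a single XOR parity chunk stands in for the network-coding layer. Node names, weights, chunk size, and helper functions are hypothetical.

# Illustrative sketch only: weight-proportional chunk spreading plus a toy
# XOR parity chunk; not the paper's actual objective functions or coding scheme.
from functools import reduce

CHUNK = 1024  # assumed chunk size


def split(data: bytes, size: int = CHUNK):
    """Split the payload into fixed-size chunks."""
    return [data[i:i + size] for i in range(0, len(data), size)]


def spread(chunks, weights):
    """Assign each chunk to the node with the lowest load relative to its weight."""
    load = {node: 0.0 for node in weights}
    plan = {node: [] for node in weights}
    for idx, chunk in enumerate(chunks):
        node = min(load, key=lambda n: load[n] / weights[n])
        plan[node].append(idx)
        load[node] += len(chunk)
    return plan


def xor_parity(chunks):
    """One XOR parity chunk over equally padded chunks (toy stand-in for network coding)."""
    size = max(len(c) for c in chunks)
    padded = [c.ljust(size, b"\0") for c in chunks]
    return reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), padded)


if __name__ == "__main__":
    data = b"x" * 10_000
    chunks = split(data)
    # weights stand in for measured bandwidth of each private/public node (hypothetical values)
    plan = spread(chunks, {"private-1": 3.0, "private-2": 1.0, "public": 2.0})
    parity = xor_parity(chunks)
    print({node: len(idxs) for node, idxs in plan.items()}, len(parity))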

