Secure Deduplication For Cloud Storage Using Memory Mapping Technique For Improving Performance And Security

2021 ◽  
Vol 23 (09) ◽  
pp. 1-13
Author(s):  
Jibin Joy ◽  
Dr. S. Devaraju

Data deduplication is a crucial technique for compacting data and eliminating duplicate copies when transferring data. It is widely used in the cloud to limit storage consumption and to save transmission bandwidth. Before outsourcing data, an encryption mechanism is applied so that sensitive data remains protected during the deduplication process. The SHA algorithm is used to hash data stored in text form; padding is appended to the text to generate the security bits. During deduplication, the hash is computed over hexadecimal, string, and integer data. Hash-based deduplication applies whole-file hashing to the entire file, and the hash values of text data are treated as its feature properties. In contrast to traditional deduplication solutions, clients that transfer data to the cloud verify duplication against the data already stored there. In virtualization, both limited primary memory size and memory congestion are significant bottlenecks. Memory deduplication identifies pages with identical content and merges them into a single copy, reducing memory usage and fragmentation and improving execution. In cloud storage, the memory mapping technique (MPT) is applied to deduplication so that a single copy of the same data serves different data owners. If any data user attempts to store a replica of existing data, it is mapped and linked to the archived data, meaning the duplicate is not stored again. To ensure cloud data security, encryption techniques are used to encrypt data throughout the deduplication procedure and prior to outsourcing it to the cloud.

2021 ◽  
Author(s):  
Ruba S ◽  
A.M. Kalpana

Abstract Deduplication is a data redundancy removal method designed to save system storage resources by reducing redundant data in cloud storage. Nowadays, with the growth of cloud computing, deduplication techniques are increasingly applied in cloud data centers, and many deduplication methods have been proposed to eliminate redundant data in cloud storage. For secure deduplication, previous works typically introduced third-party auditors for data integrity verification, but these may suffer data leakage through the auditors; conventional methods also struggle in big data deduplication to reconcile the two conflicting aims of a high duplicate elimination ratio and high deduplication throughput. In this paper, an improved blockchain-based secure data deduplication scheme with efficient cryptographic methods is presented to save cloud storage securely. In the proposed method, an attribute-based role key generation (ARKG) method is constructed in a hierarchical tree manner to generate a role key when data owners upload their data to the cloud service provider (CSP) and to allow authorized users to download it. The smart contract (an agreement between the data owner and the CSP) uses SHA-256 (Secure Hash Algorithm-256) to generate a tamper-proof ledger for data integrity, protecting data from illegal modification, while duplicate detection is performed through hash-tags also formed with SHA-256. Message-locked encryption (MLE) is employed to encrypt the data that data owners upload to the CSP. The experimental results show that the proposed secure deduplication scheme achieves higher throughput and an improved duplicate elimination ratio.
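The message-locked encryption used for uploads can be sketched as follows: the key is derived from the plaintext itself, so identical files produce identical ciphertexts and identical SHA-256 hash-tags that the CSP can compare for duplicate detection without seeing the plaintext. This is a toy construction for illustration only (the XOR keystream is not a real cipher, and all function names are assumptions):

```python
import hashlib

def _keystream(key: bytes, n: int) -> bytes:
    """Toy counter-mode keystream from SHA-256 (illustrative, not production crypto)."""
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def mle_encrypt(plaintext: bytes):
    """Message-locked encryption: the key is a hash of the message itself,
    so identical plaintexts always yield identical ciphertexts and hash-tags."""
    key = hashlib.sha256(plaintext).digest()
    ct = bytes(p ^ k for p, k in zip(plaintext, _keystream(key, len(plaintext))))
    tag = hashlib.sha256(ct).hexdigest()  # hash-tag used for duplicate detection
    return key, ct, tag

def mle_decrypt(key: bytes, ct: bytes) -> bytes:
    """XOR with the same keystream inverts the toy encryption."""
    return bytes(c ^ k for c, k in zip(ct, _keystream(key, len(ct))))
```

Because encryption is deterministic in the message, the CSP can detect duplicates by comparing tags alone, which is exactly the property conventional randomized encryption destroys.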


2014 ◽  
Vol 556-562 ◽  
pp. 5395-5399
Author(s):  
Jian Hong Zhang ◽  
Wen Jing Tang

Data integrity is one of the biggest concerns with cloud data storage for cloud users. Moreover, a cloud user's constrained computing capability makes the task of data integrity auditing expensive and even formidable. Recently, a proof-of-retrievability scheme proposed by Yuan et al. addressed this issue, and a security proof of the scheme was provided. Unfortunately, in this work we show that the scheme is insecure. Namely, a cloud server that maliciously modifies the data file can still pass verification, and the client who executes the cloud storage auditing can recover the whole data file through the interactive process. Furthermore, we show that the protocol is vulnerable to an efficient active attack: an active attacker is able to arbitrarily modify the cloud data without being detected by the auditor during the auditing process. After presenting the corresponding attacks on Yuan et al.'s scheme, we suggest a solution to fix the problems.


Cloud computing is an efficient technology that supports the storage of huge amounts of data files with security. However, the content owner cannot control data access by unauthorized clients, nor control data storage and usage. Some previous approaches combine data access control with data deduplication for cloud storage systems, but encrypted data in cloud storage is not handled effectively by current industrial deduplication solutions: deduplication remains unguarded against brute-force attacks and fails to support data access control. Data deduplication, a widely used data-confining technique, eliminates multiple copies of redundant data, reducing the space needed to store the data and thereby saving bandwidth. To overcome the above problems, an efficient content discovery and preserving deduplication (ECDPD) algorithm was proposed that detects the client file range and block range of deduplication when storing data files in the cloud storage system; ECDPD also actively supports data access control. Based on experimental evaluations, the proposed ECDPD method reduces Data Uploading Time (DUT) by 3.802 milliseconds and Data Downloading Time (DDT) by 3.318 milliseconds compared with existing approaches.
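The file-range and block-range deduplication that ECDPD performs can be approximated by a simple fixed-size block store: each unique block is kept once, and a file is recorded as an ordered recipe of block hashes. This is a generic sketch of block-level deduplication under assumed names, not the ECDPD algorithm itself:

```python
import hashlib

BLOCK_SIZE = 4  # tiny blocks for illustration; real systems use 4 KB or larger

class BlockDedupStore:
    """Store each unique fixed-size block once; files become block-hash recipes."""
    def __init__(self):
        self.blocks = {}  # block hash -> block bytes
        self.files = {}   # file name  -> ordered list of block hashes

    def upload(self, name: str, data: bytes) -> int:
        """Store a file; return how many previously unseen blocks were written."""
        new = 0
        recipe = []
        for i in range(0, len(data), BLOCK_SIZE):
            block = data[i:i + BLOCK_SIZE]
            h = hashlib.sha256(block).hexdigest()
            if h not in self.blocks:
                self.blocks[h] = block
                new += 1
            recipe.append(h)
        self.files[name] = recipe
        return new

    def download(self, name: str) -> bytes:
        """Reassemble a file from its block recipe."""
        return b"".join(self.blocks[h] for h in self.files[name])
```

Two files sharing a prefix of identical blocks cost only the storage of their differing blocks, which is where the upload/download time savings come from.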


2019 ◽  
Vol 8 (4) ◽  
pp. 9803-9807

Nowadays, data deduplication has become essential for cloud storage providers because of the continuous increase in the number of users and the size of their data files. Users are allowed to access the server anytime and anywhere to upload or download their data files. Whenever data is retrieved, several problems arise concerning confidentiality and privacy. To protect data security, we propose an efficient technique called ClouDedup, which assures file deduplication. To secure the confidentiality of critical data while supporting the ClouDedup checker, we propose a Triple Data Encryption Standard (TDES) technique to encrypt the data prior to uploading the data file to cloud storage. The privilege level of the user is verified against the data to confirm whether the user is authorized. The security analysis demonstrates that our proposed method is safe and secure, and we show that ClouDedup incurs minimal overhead compared with normal operations. The process aims to use an authorized ClouDedup checker with TDES to minimize duplicate copies of data in hybrid cloud storage; test experiments were conducted using our prototype.


In recent years, cloud computing has provided strong and flexible access to outsourced data. In cloud storage, data privacy is a major concern for owners outsourcing their data, so only authenticated users should be allowed access in order to protect important and sensitive data. For data protection, we encrypt sensitive data before outsourcing it, because the storage server cannot be trusted; on the other hand, retrieving data in encrypted form from the cloud makes data utilization a challenging task, since the data was encrypted from plaintext to ciphertext before upload. Existing searchable encryption schemes use Boolean search, but they cannot support data utilization over huge data sets and fail to handle multi-user access for retrieving ciphertext from the cloud, as well as user authentication. In this paper, we use ranked keyword search over encrypted data, retrieving the top-k documents from storage, and design a hierarchical clustering method to capture richer search semantics, with the additional feature of meeting the demand for fast ciphertext k-search in large-scale environments such as massive cloud big data, using relevance scores. A threshold splits the resulting clusters into sub-clusters until the constraint on the maximum cluster size is reached. To make search secure and privacy-preserving, an index is built for searching cloud data and retrieving the most relevant files. To defend against privacy breaches by unauthorized users, users must pass an authentication process, and data retrieval time is evaluated as well.
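The ranked top-k keyword search can be sketched over a plaintext inverted index that scores documents by summed term frequency. A real deployment of the scheme above would operate over an encrypted index with protected relevance scores; this toy omits the cryptography, and the class and field names are assumptions:

```python
from collections import defaultdict

class RankedIndex:
    """Toy searchable index: score documents by summed keyword term frequency
    and return the top-k matches (a plaintext stand-in for an encrypted index)."""
    def __init__(self):
        self.postings = defaultdict(dict)  # term -> {doc_id: term frequency}

    def add(self, doc_id: str, text: str):
        """Index a document by counting each whitespace-separated term."""
        for word in text.lower().split():
            self.postings[word][doc_id] = self.postings[word].get(doc_id, 0) + 1

    def search(self, keywords, k=3):
        """Return the ids of the k highest-scoring documents for the query."""
        scores = defaultdict(int)
        for kw in keywords:
            for doc_id, tf in self.postings.get(kw.lower(), {}).items():
                scores[doc_id] += tf
        ranked = sorted(scores, key=lambda d: (-scores[d], d))
        return ranked[:k]
```

Hierarchical clustering, as described in the abstract, would further partition `postings` so a query only scores documents in the most relevant sub-cluster instead of the whole collection.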


Author(s):  
Sunil S ◽  
A Ananda Shankar

A cloud storage system provides convenient file storage and sharing services for distributed clients. To preserve the privacy of data holders, we propose a scheme to manage encrypted data storage with deduplication. The scheme can flexibly support data sharing with deduplication even when the data holder is offline, without intruding on the privacy of data holders. It offers an effective approach to verify data ownership and check duplicate storage with secure challenges and big data support. We integrate cloud data deduplication with data access control in a simple way, thus reconciling data deduplication and encryption. We prove the security and assess the performance of the scheme through analysis and simulation; the results show its efficiency, effectiveness, and applicability. In the proposed system, uploaded data is stored in the cloud organized by date, so that it is available to the data holders who need it when they need it. A web log record indicates whether a keyword is repeated or not: records containing only repeated search data are retained in primary cloud storage, while all other records are stored on a temporary storage server. This step reduces the size of the web log, thereby avoiding memory burden and speeding up analysis.


Cloud computing is a service-oriented platform that provides security for the various data uploaded by users. Security is a service that can be provided by the service providers, and a large amount of data can be stored in the cloud with the help of various security algorithms; data stored in the cloud this way is called outsourced data. Every user wants to store sensitive data in cloud storage securely. In this paper, Enhanced Privacy and Secure Storage data (EPSS) can be searched with multiple keywords; for multi-keyword searching, the Enhanced Keyword Search (EKS) retrieves data very quickly and across multiple records. Experimental results show the performance of the searching and its security.


2014 ◽  
Vol 556-562 ◽  
pp. 6223-6227 ◽  
Author(s):  
Chao Ling Li ◽  
Yue Chen

To deduplicate sensitive data in a cloud storage center, a scheme called MHT-Dedup, based on the Merkle Hash Tree (MHT), is proposed. It achieves cross-user file-level client-side deduplication and local block-level client-side deduplication concurrently. It first encrypts the file at block granularity, then authenticates the file ciphertext to find duplicated files (via Proofs of oWnership, PoW) and checks the hashes of block plaintexts to find duplicated blocks. In the PoW protocol of MHT-Dedup, an authenticating binary tree is generated from the tags of the encrypted blocks to reliably identify duplicated files. MHT-Dedup resolves the conflict between data deduplication and encryption, achieves file-level and block-level deduplication concurrently, avoids misuse of the storage system by users, resists inside and outside attacks on data confidentiality, and prevents target-collision attacks on files and brute-force attacks on blocks.
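The authenticating binary tree at the heart of MHT-Dedup's PoW can be sketched as a Merkle root computed over per-block tags: any change to a single block changes the root, which is what lets a verifier challenge a client's ownership claim. This is a generic Merkle-tree sketch with assumed helper names, not the paper's exact protocol:

```python
import hashlib

def _h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(block_tags: list) -> bytes:
    """Build a Merkle hash tree over per-block tags and return the root.
    In an MHT-based proof of ownership, the verifier challenges the prover to
    produce authentication paths consistent with this root."""
    level = [_h(t) for t in block_tags]       # leaf layer: hash of each tag
    while len(level) > 1:
        if len(level) % 2 == 1:               # duplicate the last node on odd levels
            level.append(level[-1])
        level = [_h(level[i] + level[i + 1])  # parent = hash of concatenated children
                 for i in range(0, len(level), 2)]
    return level[0]
```

A prover who lacks even one real block cannot reconstruct the authentication path for a random leaf challenge, since the corresponding leaf hash would differ and propagate up to a mismatched root.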

