A Novel Approach for Encrypted Data-Deduplication in Clouds

2020 ◽  
Vol 17 (8) ◽  
pp. 3631-3635
Author(s):  
L. Mary Gladence ◽  
Priyanka Reddy ◽  
Apoorva Shetty ◽  
E. Brumancia ◽  
Senduru Srinivasulu

Data deduplication is one of the main techniques for eliminating duplicate copies of data and is widely used in cloud storage to reduce storage space and save network bandwidth. To protect the confidentiality of sensitive data while supporting deduplication, convergent encryption has been proposed to encrypt the data before outsourcing. Unlike conventional deduplication systems, users are additionally assigned differential privileges, which are considered in duplicate checks alongside the data itself. Security analysis shows that the approach is secure with respect to the definitions specified in the proposed security model. The deduplication scheme uses the M3 encryption algorithm together with DES: M3 encryption is compared with current techniques with respect to effectiveness, security, and speed, while DES is used to decrypt the stored file back into a form readable by humans. As a proof of concept, a prototype of the proposed authorized duplicate-check scheme is implemented and experiments are conducted on it. The results show that, compared with conventional operations, the proposed duplicate-check scheme incurs minimal overhead.
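As a rough illustration of the convergent-encryption idea that makes encrypted data deduplicable, here is a minimal Python sketch (SHA-256 in counter mode serves as a stand-in stream cipher; the paper's own M3/DES pipeline is not reproduced): identical plaintexts yield identical keys and ciphertexts, so the cloud can deduplicate them without ever seeing the data.

```python
import hashlib

def keystream(key: bytes, n: int) -> bytes:
    # SHA-256 in counter mode as a toy stream cipher (illustration only).
    out, ctr = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + ctr.to_bytes(8, "big")).digest()
        ctr += 1
    return out[:n]

def convergent_encrypt(plaintext: bytes):
    # The key is derived from the plaintext itself, so identical files
    # encrypt to identical ciphertexts and remain deduplicable.
    key = hashlib.sha256(plaintext).digest()
    ct = bytes(p ^ k for p, k in zip(plaintext, keystream(key, len(plaintext))))
    return key, ct

def convergent_decrypt(key: bytes, ct: bytes) -> bytes:
    return bytes(c ^ k for c, k in zip(ct, keystream(key, len(ct))))
```

Two independent owners of the same file derive the same key and upload the same ciphertext, which is exactly what allows the provider to store a single copy.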

2018 ◽  
Vol 7 (2.4) ◽  
pp. 46 ◽  
Author(s):  
Shubhanshi Singhal ◽  
Akanksha Kaushik ◽  
Pooja Sharma

Due to the drastic growth of digital data, data deduplication has become a standard component of modern backup systems. It reduces data redundancy, saves storage space, and simplifies the management of data chunks. The process is performed in three steps: chunking, fingerprinting, and indexing of fingerprints. In chunking, data files are divided into chunks, and the chunk boundary is decided by the value of the divisor. For each chunk, a unique identifying value, known as a fingerprint, is computed using a hash function (e.g., MD5, SHA-1, SHA-256). Finally, these fingerprints are stored in the index to detect redundant chunks, i.e., chunks having the same fingerprint values. In chunking, the chunk size is an important factor that should be optimal for good performance of the deduplication system. The genetic algorithm (GA) is gaining much popularity and can be applied to find the best value of the divisor. Secondly, indexing also enhances the performance of the system by reducing the search time. Binary search tree (BST) based indexing has a search time complexity of O(log n), which is minimal among searching algorithms. A new model is proposed that employs a GA to find the value of the divisor; it is the first attempt to apply a GA in the field of data deduplication. The second improvement in the proposed system is that a BST index is used to index the fingerprints. The performance of the proposed system is evaluated on VMDK, Linux, and Quanto datasets, and a good improvement in deduplication ratio is achieved.
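The chunk-fingerprint-index pipeline described above can be sketched in Python as follows (a toy illustration: the rolling-sum boundary test, the divisor value, and the dict standing in for the BST index are all simplifications, not the paper's implementation):

```python
import hashlib

def chunk(data: bytes, divisor: int = 64, window: int = 8, min_size: int = 16):
    # Content-defined chunking: a boundary is declared where a rolling
    # sum over the last `window` bytes is divisible by `divisor`, so
    # identical content always produces identical chunks.
    chunks, start = [], 0
    for i in range(min_size, len(data)):
        if sum(data[max(start, i - window):i]) % divisor == 0 and i - start >= min_size:
            chunks.append(data[start:i])
            start = i
    chunks.append(data[start:])
    return chunks

def deduplicate(files):
    # Fingerprint each chunk and store only unseen fingerprints
    # (a dict stands in here for the paper's BST index).
    index, unique, total = {}, 0, 0
    for data in files:
        for c in chunk(data):
            fp = hashlib.sha256(c).hexdigest()   # fingerprint
            total += 1
            if fp not in index:
                index[fp] = c
                unique += 1
    return unique, total
```

Two identical files produce identical chunk fingerprints, so the second file contributes no new entries to the index.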


Author(s):  
Pasquale Puzio ◽  
Refik Molva ◽  
Melek Önen ◽  
Sergio Loureiro

With the continuous increase in the number of users and the size of their data, data deduplication becomes a necessity for cloud storage providers. By storing a unique copy of duplicate data, cloud providers greatly reduce their storage and data transfer costs. The advantages of deduplication unfortunately come with a high cost in terms of new security and privacy challenges. In this chapter we propose ClouDedup, a secure and efficient storage service which assures block-level deduplication and data confidentiality at the same time. Although ClouDedup is based on convergent encryption, it remains secure thanks to the definition of a component that implements an additional encryption operation. Furthermore, as the requirement for deduplication at the block level raises an issue with respect to key management, we suggest including a new component that implements the key management for each block together with the actual deduplication operation. In this chapter we show how we have implemented the proposed architecture, the challenges we have met, and our solutions to these challenges.
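A minimal Python sketch of the two-layer idea, with hypothetical component names and a toy SHA-256-based keystream (not ClouDedup's actual construction): the client applies convergent encryption at block level, and a trusted gateway adds a deterministic second layer under its own secret, so equal blocks still collide and remain deduplicable. Note that the sketch stores the per-ciphertext tweak in clear for invertibility; a real design would protect it.

```python
import hashlib

def stream(key: bytes, n: int) -> bytes:
    # SHA-256 in counter mode as a toy keystream (illustration only).
    out, ctr = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + ctr.to_bytes(8, "big")).digest()
        ctr += 1
    return out[:n]

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def client_encrypt(block: bytes):
    key = hashlib.sha256(block).digest()            # convergent key
    return key, xor(block, stream(key, len(block)))

def client_decrypt(key: bytes, ct: bytes) -> bytes:
    return xor(ct, stream(key, len(ct)))

def gateway_encrypt(server_key: bytes, ct: bytes):
    # Deterministic second layer keyed by a gateway-held secret:
    # identical inner ciphertexts still collide (deduplicable).
    tweak = hashlib.sha256(ct).digest()
    return tweak, xor(ct, stream(server_key + tweak, len(ct)))

def gateway_decrypt(server_key: bytes, tweak: bytes, outer: bytes) -> bytes:
    return xor(outer, stream(server_key + tweak, len(outer)))
```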


2014 ◽  
Vol 2014 ◽  
pp. 1-8 ◽  
Author(s):  
Xianhan Zhang ◽  
Yang Cao

In this paper, we present a novel approach to creating a new chaotic map and propose an improved image encryption scheme based on it. Compared with traditional classic one-dimensional chaotic maps such as the logistic map and the tent map, the newly created chaotic map demonstrates many better chaotic properties for encryption, as indicated by a much larger maximal Lyapunov exponent. Furthermore, an image encryption method based on the new chaotic map and Arnold's cat map is designed and shown to be robust. The simulation results and security analysis indicate that the method not only meets the requirements of image encryption but also achieves better effectiveness and security, making it usable for general applications.
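The paper's new map and the Arnold's-cat-map permutation are not reproduced here, but the general pattern of chaos-based encryption can be sketched in Python with the classic logistic map: the map's parameters act as the key, its orbit is quantized into a keystream, and XOR provides both encryption and decryption.

```python
def logistic_keystream(x0: float, r: float, n: int) -> bytes:
    # Iterate the logistic map x -> r*x*(1-x); (x0, r) act as the key.
    x = x0
    for _ in range(200):                 # discard transient iterates
        x = r * x * (1 - x)
    out = bytearray()
    for _ in range(n):
        x = r * x * (1 - x)
        out.append(int(x * 256) % 256)   # quantize state to one byte
    return bytes(out)

def xor_encrypt(data: bytes, x0: float = 0.3456, r: float = 3.99) -> bytes:
    # XOR with the chaotic keystream; applying it twice decrypts.
    ks = logistic_keystream(x0, r, len(data))
    return bytes(d ^ k for d, k in zip(data, ks))
```

Choosing r near 4 keeps the logistic map in its chaotic regime, which is what makes the keystream sensitive to tiny key changes.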


PLoS ONE ◽  
2021 ◽  
Vol 16 (1) ◽  
pp. e0244731
Author(s):  
Reem Almarwani ◽  
Ning Zhang ◽  
James Garside

Data Integrity Auditing (DIA) is a security service for verifying the integrity of outsourced data in Public Cloud Storage (PCS), either by users themselves or by Third-Party Auditors (TPAs) on their behalf. This paper proposes a novel DIA framework, called DIA-MTTP. The major novelty of the framework is that, while providing the DIA service in a PCS environment, it supports the use of third parties but does not require full trust in them. In achieving this property, a number of ideas have been embedded in the design: the use of multiple third parties and a hierarchical approach to their communication structure, making the service better suited to resource-constrained user devices; the provision of two integrity assurance levels to balance the trade-off between security protection and the costs incurred; and the application of a data deduplication measure to both new data and updates of existing data to minimise the number of tags (re-)generated. To support dynamic data and the deduplication measure, a distributed data structure, called Multiple Mapping Tables (M2T), is proposed. Security analysis indicates that the framework is secure with the use of untrusted third parties. Performance evaluation indicates that it imposes lower computational, communication and storage overheads than related works.
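The tag-minimisation idea, generating integrity tags only for unique blocks so that duplicates and unchanged updates need no new tags, can be sketched in Python (HMAC-SHA256 is used here as a stand-in tag function; DIA-MTTP's actual tag scheme and M2T structure are not reproduced):

```python
import hashlib
import hmac

def make_tags(blocks, key, tag_store=None):
    # One integrity tag per *unique* block: duplicate blocks reuse the
    # stored tag, minimising the number of tags (re-)generated.
    tag_store = {} if tag_store is None else tag_store
    tags, generated = [], 0
    for b in blocks:
        fp = hashlib.sha256(b).hexdigest()       # dedup fingerprint
        if fp not in tag_store:
            tag_store[fp] = hmac.new(key, b, hashlib.sha256).hexdigest()
            generated += 1
        tags.append(tag_store[fp])
    return tags, generated

def audit(block, tag, key):
    # The auditor recomputes the tag and compares to verify integrity.
    expected = hmac.new(key, block, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)
```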


2017 ◽  
Vol 72 (5) ◽  
pp. 254-259 ◽  
Author(s):  
I. Burlacov ◽  
S. Hamann ◽  
H.-J. Spies ◽  
A. Dalke ◽  
J. Röpcke ◽  
...  

2021 ◽  
Vol 9 (7) ◽  
pp. 1463
Author(s):  
Tamirat Tefera Temesgen ◽  
Kristoffer Relling Tysnes ◽  
Lucy Jane Robertson

Cryptosporidium oocysts are known for being very robust, and their prolonged survival in the environment has resulted in outbreaks of cryptosporidiosis associated with the consumption of contaminated water or food. Although inactivation methods used for drinking water treatment, such as UV irradiation, can inactivate Cryptosporidium oocysts, they are not necessarily suitable for use with other environmental matrices, such as food. In order to identify alternative ways to inactivate Cryptosporidium oocysts, improved methods for viability assessment are needed. Here we describe a proof of concept for a novel approach for determining how effective inactivation treatments are at killing pathogens, such as the parasite Cryptosporidium. RNA sequencing was used to identify potential up-regulated target genes induced by oxidative stress, and a reverse transcription quantitative PCR (RT-qPCR) protocol was developed to assess their up-regulation following exposure to different induction treatments. Accordingly, RT-qPCR protocols targeting thioredoxin and Cryptosporidium oocyst wall protein 7 (COWP7) genes were evaluated on mixtures of viable and inactivated oocysts, and on oocysts subjected to various potential inactivation treatments such as freezing and chlorination. The results from the present proof-of-concept experiments indicate that this could be a useful tool in efforts towards assessing potential technologies for inactivating Cryptosporidium in different environmental matrices. Furthermore, this approach could also be used for similar investigations with other pathogens.


Information ◽  
2021 ◽  
Vol 12 (4) ◽  
pp. 142
Author(s):  
Weijing You ◽  
Lei Lei ◽  
Bo Chen ◽  
Limin Liu

By storing only a unique copy of duplicate data possessed by different data owners, deduplication can significantly reduce storage cost and hence is used broadly in public clouds. When combined with confidentiality, however, deduplication becomes problematic: encryption performed by different data owners may differentiate identical data, which may then no longer be deduplicable. Message-Locked Encryption (MLE) is thus utilized to derive the same encryption key from identical data, so that the data remain deduplicable after being encrypted by different data owners. As keys may be leaked over time, re-encrypting outsourced data is of paramount importance to ensure continuous confidentiality, which, however, has not been well addressed in the literature. In this paper, we design SEDER, a SEcure client-side Deduplication system enabling Efficient Re-encryption for cloud storage, by (1) leveraging the all-or-nothing transform (AONT), (2) designing a new delegated re-encryption (DRE) scheme, and (3) proposing a new proof of ownership scheme for encrypted cloud data (PoWC). Security analysis and experimental evaluation validate the security and efficiency of SEDER, respectively.
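A hedged Python sketch of the all-or-nothing transform that SEDER leverages, in the style of Rivest's package transform (this is a generic AONT over fixed 32-byte blocks, not SEDER's full DRE/PoWC design):

```python
import hashlib
import secrets

BLOCK = 32  # fixed block size in bytes for this sketch

def _prf(key: bytes, i: int) -> bytes:
    return hashlib.sha256(key + i.to_bytes(4, "big")).digest()

def _xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def aont_package(blocks):
    # Rivest-style package transform: a random inner key masks every
    # block, and the final output block is that key XORed with a hash
    # of all masked blocks. Lose any one block and nothing decrypts.
    k = secrets.token_bytes(BLOCK)
    masked = [_xor(b, _prf(k, i)) for i, b in enumerate(blocks)]
    digest = hashlib.sha256(b"".join(masked)).digest()
    return masked + [_xor(k, digest)]

def aont_unpackage(out):
    *masked, last = out
    k = _xor(last, hashlib.sha256(b"".join(masked)).digest())
    return [_xor(b, _prf(k, i)) for i, b in enumerate(masked)]
```

Because every output block is needed to recover the inner key, conventionally re-encrypting just one block under a fresh key is enough to lock the entire file, which is the property that makes AONT attractive for efficient re-encryption.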


2021 ◽  
Vol 17 ◽  
Author(s):  
Swayamprakash Patel ◽  
Ashish Patel ◽  
Mruduka Patel ◽  
Umang Shah ◽  
Mehul Patel ◽  
...  

Background: Probe sonicators and high-speed homogenizers are comparatively costly pieces of equipment for fabricating nanoparticles. Many academic and research institutions cannot afford the procurement and maintenance of such sophisticated equipment. In the present work, a new approach is conceptualized that under-resourced research institutions can adopt to fabricate solid lipid nanoparticles (SLNs) in the absence of sophisticated equipment. The current work describes the pilot-level trials of this novel approach. This study represents the preliminary proof-of-concept trials for which the Indian patent application (3508/MUM/2015) was filed. Method: A frugal piece of equipment was made using a 50 ml centrifuge tube with a conical bottom and a piezoelectric mist maker (humidifier). SLNs were prepared by combining the quasi-emulsion solvent evaporation approach with an ultrasonic vibration approach. A quasi-emulsion was formed by dropwise mixing of an organic solvent containing the drug and lipid with an aqueous solution containing surfactant, under continuous ultrasonic vibration in the piezoelectric chamber. The size of the droplets was significantly reduced by the piezoelectric ultrasonic vibration. Under a mild vacuum and the heat generated by vibration, the organic solvent evaporated, leaving behind a suspension of SLNs. In the present work, albendazole was selected as the model drug. Various trials with Compritol 888 ATO® and Precirol ATO 5® as the lipid carrier and Tween 80 and Poloxamer 188 as the surfactant were performed. The zeta potential of the SLNs was improved by the addition of polyelectrolytes such as K2SO4 and Na4P2O7. Result and Conclusion: The drug-to-lipid ratio was optimized to 1:4 for the most favorable results. SLNs with a minimum Z-average diameter of 98.59 nm, a zeta potential of -21 mV, and an entrapment efficiency of 34.064% (SD 10.78, n = 9) were developed using Precirol ATO 5® as the lipid carrier.
The proof of concept for this novel approach is established through the development of albendazole SLNs. The approach should also be evaluated for the development of polymeric nanoparticles and vesicular formulations. Further refinement of the frugal equipment may allow more control over the quality of the SLNs. This approach will enable under-resourced researchers to prepare nanopharmaceuticals: by removing the constraint of sophisticated equipment, researchers and students at such institutions can focus on the applications of SLNs.


2018 ◽  
Vol 2018 ◽  
pp. 1-10
Author(s):  
Hua Dai ◽  
Hui Ren ◽  
Zhiye Chen ◽  
Geng Yang ◽  
Xun Yi

Outsourcing data to clouds is adopted by more and more companies and individuals due to the benefits of data sharing and parallel, elastic, and on-demand computing. However, it forces data owners to lose control of their own data, which causes privacy-preserving problems for sensitive data. Sorting is a common operation in many areas, such as machine learning, service recommendation, and data query, and it is a challenge to implement privacy-preserving sorting over encrypted data without leaking the privacy of sensitive data. In this paper, we propose privacy-preserving sorting algorithms based on the logistic map. Secure comparable codes are constructed by logistic map functions and can be used to compare the corresponding encrypted data items even without knowing their plaintext values. Data owners first encrypt their data and generate the corresponding comparable codes, and then outsource both to clouds. Cloud servers can sort the outsourced encrypted data according to their comparable codes using the proposed privacy-preserving sorting algorithms. Security analysis and experimental results show that the proposed algorithms protect data privacy while providing efficient sorting on encrypted data.
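The interface of comparable codes, comparing and sorting ciphertext-side without plaintexts, can be illustrated with a toy Python stand-in (a keyed cumulative-sum code over a small integer domain; the paper's actual codes are built from logistic-map functions and are not reproduced here):

```python
import hashlib

def comparable_code(key: bytes, x: int, domain: int = 256) -> int:
    # Toy comparable code: the code of x is a cumulative sum of secret
    # positive increments, so code(a) < code(b) exactly when a < b,
    # while the increments themselves depend on the secret key.
    assert 0 <= x < domain
    total = 0
    for i in range(x + 1):
        h = hashlib.sha256(key + i.to_bytes(4, "big")).digest()
        total += 1 + h[0]          # strictly positive keyed increment
    return total

def cloud_sort(codes_and_ids):
    # The cloud sorts records by their codes alone, never seeing plaintexts.
    return [rid for _, rid in sorted(codes_and_ids)]
```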


2018 ◽  
Vol 2018 ◽  
pp. 1-11 ◽  
Author(s):  
Guodong Ye ◽  
Kaixin Jiao ◽  
Chen Pan ◽  
Xiaoling Huang

In this paper, an effective framework for chaotic encryption based on a three-dimensional logistic map is presented, together with the secure hash algorithm SHA-3 and an electrocardiograph (ECG) signal. Following an analysis of the drawbacks of some current algorithms, namely a fixed key and low sensitivity, this work addresses these two problems with two contributions: (1) removal of the phenomenon of summation invariance in a plain-image, for which SHA-3 is used to calculate the hash value of the plain-image, with the result employed to perturb the initial keys of the chaotic map; (2) resolution of the fixed-key problem by using an ECG signal, which can differ between subjects and for the same subject at different times. The Wolf algorithm is employed to produce all the control parameters and initial keys in the proposed encryption method. Combined with the classical permutation-diffusion architecture, the summation invariance of the plain-image and the shortcoming of a fixed key are both avoided in our algorithm. Furthermore, the experimental results and security analysis show that the proposed encryption algorithm achieves confidentiality.
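The first contribution, letting a SHA-3 hash of the plain-image drive the chaotic map's initial key, can be sketched in Python (the base value, the plain one-dimensional logistic map, and the quantization are illustrative assumptions; the paper's 3D logistic map, ECG-derived keys, and Wolf-algorithm parameters are not reproduced):

```python
import hashlib

def initial_key(plain_image: bytes, base_x0: float = 0.37) -> float:
    # The SHA-3 hash of the plain-image perturbs the chaotic map's
    # initial value, so images differing in even a single byte yield
    # different keystreams (no fixed key, no summation invariance).
    h = hashlib.sha3_256(plain_image).digest()
    delta = int.from_bytes(h[:8], "big") / 2 ** 64   # in [0, 1)
    x0 = (base_x0 + delta) % 1.0
    return x0 if 0.0 < x0 < 1.0 else 0.5             # keep in (0, 1)

def logistic_keystream(x0: float, n: int, r: float = 3.99) -> bytes:
    # Classic logistic map as a stand-in chaotic source.
    x, out = x0, bytearray()
    for i in range(n + 100):
        x = r * x * (1 - x)
        if i >= 100:                                 # skip transients
            out.append(int(x * 256) % 256)
    return bytes(out)
```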

