Secure Deduplication Based on Rabin Fingerprinting over Wireless Sensing Data in Cloud Computing

2018, Vol 2018, pp. 1-12
Author(s): Yinghui Zhang, Haonan Su, Menglei Yang, Dong Zheng, Fang Ren, ...

The rapid advancements in the Internet of Things (IoT) and cloud computing technologies have significantly promoted the collection and sharing of various data. In order to reduce the communication cost and the storage overhead, it is necessary to exploit data deduplication mechanisms. However, existing data deduplication technologies still suffer from security and efficiency drawbacks. In this paper, we propose two secure data deduplication schemes based on Rabin fingerprinting over wireless sensing data in cloud computing. The first scheme is based on deterministic tags and the other one adopts random tags. The proposed schemes realize data deduplication before the data is outsourced to the cloud storage server, and hence both the communication cost and the computation cost are reduced. In particular, variable-size block-level deduplication is enabled based on the technique of Rabin fingerprinting, which generates data blocks based on the content of the data. Before outsourcing data to the cloud, users encrypt the data based on convergent encryption technologies, which protects the data from being accessed by unauthorized users. Our security analysis shows that the proposed schemes are secure against offline brute-force dictionary attacks. In addition, the random tag makes the second scheme more reliable. Extensive experimental results indicate that the proposed data deduplication schemes are efficient in terms of the deduplication rate, the system operation time, and the tag generation time.
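To make the mechanism concrete, the sketch below combines content-defined chunking with convergent encryption: a rolling hash (a simplified stand-in for a true Rabin fingerprint over GF(2)) decides chunk boundaries from the content itself, and each chunk is encrypted under a key derived from its own hash, so identical chunks produce identical ciphertexts and deduplication tags. The window size, boundary mask, and SHA-256-based keystream are illustrative assumptions, not the parameters or cipher of the paper's schemes.

```python
# Sketch: content-defined chunking with a rolling hash (a simplified stand-in for
# Rabin fingerprinting) plus convergent encryption, so identical chunks yield
# identical tags and ciphertexts and can be deduplicated before upload.
import hashlib
import os

WINDOW = 48           # bytes in the rolling window
MASK = (1 << 13) - 1  # expected average chunk size around 8 KiB
PRIME = 257
MOD = 1 << 61

def chunk(data: bytes):
    """Yield variable-size chunks; a boundary is declared when the rolling
    hash of the last WINDOW bytes matches the mask."""
    start, h = 0, 0
    power = pow(PRIME, WINDOW - 1, MOD)
    for i, b in enumerate(data):
        if i - start >= WINDOW:
            h = (h - data[i - WINDOW] * power) % MOD   # drop the oldest byte
        h = (h * PRIME + b) % MOD                      # add the newest byte
        if (h & MASK) == MASK and i - start >= WINDOW:
            yield data[start:i + 1]
            start, h = i + 1, 0
    if start < len(data):
        yield data[start:]

def convergent_encrypt(chunk_data: bytes):
    """Derive the key from the chunk itself (convergent encryption) and XOR the
    chunk with a SHA-256 counter-mode keystream. Illustrative only."""
    key = hashlib.sha256(chunk_data).digest()          # convergent key
    tag = hashlib.sha256(key).hexdigest()              # deterministic dedup tag
    stream = b"".join(hashlib.sha256(key + i.to_bytes(8, "big")).digest()
                      for i in range((len(chunk_data) + 31) // 32))
    ct = bytes(c ^ s for c, s in zip(chunk_data, stream))
    return tag, ct

if __name__ == "__main__":
    data = os.urandom(32 * 1024) * 2                   # duplicated payload so dedup has an effect
    chunks = list(chunk(data))
    store = {}
    for c in chunks:
        tag, ct = convergent_encrypt(c)
        store.setdefault(tag, ct)                      # upload only tags the server has not seen
    print(f"{len(chunks)} chunks, {len(store)} unique after deduplication")
```

Because boundaries depend only on local content, an insertion early in the data shifts at most a few chunk boundaries before they resynchronize, which is what makes variable-size block-level deduplication effective.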

Author(s): K. V. Uma Maheswari, Dr. Dhanaraj Cheelu

Cloud computing is recognized as an alternative to traditional information technology due to its intrinsic resource sharing and low maintenance characteristics. Cloud computing provides an economical and efficient solution for sharing group resources among cloud users. Unfortunately, sharing data in a group while preserving data and identity privacy is still a challenging issue due to frequent changes in membership. To overcome this problem, a secure data sharing scheme for dynamic groups is proposed so that any user within a group can share data securely by leveraging both group signature and dynamic broadcast encryption techniques. The scheme enables any cloud user to anonymously share data with others within the group and supports efficient member revocation. The storage overhead and encryption computation cost are dependent on the number of revoked users.


Cloud computing is an on-demand paradigm that provides different kinds of services to cloud users. Cloud storage is the most popular service, as data owners are freed from data management and storage overhead. However, data owners remain concerned about the security of their data. In order to address this issue, this paper presents an efficient auditing scheme that guarantees the security of the data and preserves data integrity. In this paper, the cloud storage auditing model uses an efficient privacy-preserving algorithm, namely the Merkle-Hellman Knapsack Cryptosystem (MHKCS). This algorithm effectively improves data integrity, confidentiality, and security. Moreover, it reduces the key generation time, encryption time, and decryption time. The performance of the MHKCS algorithm is evaluated using metrics such as encryption time, decryption time, key generation time, and communication cost. The MHKCS algorithm achieved approximately 10% better performance in terms of encryption time than the existing methods RSA, MRSA, and MRSAC.
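For readers unfamiliar with the underlying primitive, the following is a minimal Merkle-Hellman knapsack sketch: a superincreasing private sequence is disguised by modular multiplication to form the public knapsack, encryption is a subset sum over the message bits, and decryption undoes the disguise and solves the superincreasing subset sum greedily. The 8-bit block size and the particular sequence are toy values for illustration; the paper's key sizes and auditing protocol are not reproduced here, and the basic Merkle-Hellman construction is itself known to be breakable.

```python
# A minimal Merkle-Hellman knapsack sketch to illustrate the trapdoor the
# abstract's MHKCS-based scheme builds on. Toy parameters, illustration only.
from math import gcd

def keygen():
    w = [2, 7, 11, 21, 42, 89, 180, 354]       # superincreasing private sequence
    q = 881                                    # modulus greater than sum(w)
    r = 588                                    # multiplier with gcd(r, q) == 1
    assert q > sum(w) and gcd(r, q) == 1
    beta = [(r * wi) % q for wi in w]          # public knapsack
    return beta, (w, q, r)

def encrypt(byte: int, beta):
    bits = [(byte >> (7 - i)) & 1 for i in range(8)]    # MSB-first bits of one byte
    return sum(b * bi for b, bi in zip(bits, beta))     # subset-sum ciphertext

def decrypt(c: int, priv):
    w, q, r = priv
    cp = (c * pow(r, -1, q)) % q               # undo the modular disguise
    bits = []
    for wi in reversed(w):                     # greedy solve: sequence is superincreasing
        bits.append(1 if cp >= wi else 0)
        cp -= wi * bits[-1]
    bits.reverse()
    return sum(b << (7 - i) for i, b in enumerate(bits))

if __name__ == "__main__":
    pub, priv = keygen()
    for m in (0, 65, 255):
        assert decrypt(encrypt(m, pub), priv) == m
    print("round-trip ok")
```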


Author(s): Neha Thakur, Aman Kumar Sharma

Cloud computing has been envisioned as the de facto solution to the rising storage costs of IT enterprises. There are many cloud computing initiatives from IT giants such as Google, Amazon, Microsoft, and IBM. Integrity monitoring is essential in cloud storage for the same reasons that data integrity is critical for any data centre. Data integrity is defined as the accuracy and consistency of stored data, in the absence of any alteration to the data between two updates of a file or record. In order to ensure the integrity and availability of data in the cloud and enforce the quality of cloud storage service, efficient methods that enable on-demand data correctness verification on behalf of cloud users have to be designed. To overcome the data integrity problem, many techniques have been proposed under different systems and security models. This paper focuses on some of these integrity-proving techniques in detail, along with their advantages and disadvantages.
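As a simple point of reference for the techniques surveyed, the sketch below shows a basic challenge-response integrity check: the owner precomputes keyed HMAC tags per block, later challenges the server for a few randomly chosen blocks, and verifies the returned blocks against the stored tags. Block size, sample count, and the HMAC construction are illustrative assumptions rather than any single published protocol; real schemes such as PDP and PoR avoid retrieving whole blocks.

```python
# A minimal challenge-response integrity check in the spirit of the techniques
# this survey covers: keep per-block HMACs, challenge a random sample of blocks,
# and verify what the storage server returns. Illustrative assumptions only.
import hashlib
import hmac
import os
import random

BLOCK = 4096

def precompute_tags(data: bytes, key: bytes):
    blocks = [data[i:i + BLOCK] for i in range(0, len(data), BLOCK)]
    return [hmac.new(key, b, hashlib.sha256).digest() for b in blocks]

def challenge(num_blocks: int, sample: int):
    return random.sample(range(num_blocks), sample)    # indices the server must return

def verify(returned: dict, tags, key: bytes) -> bool:
    return all(hmac.compare_digest(tags[i], hmac.new(key, blk, hashlib.sha256).digest())
               for i, blk in returned.items())

if __name__ == "__main__":
    key = os.urandom(32)
    data = os.urandom(40 * BLOCK)                      # file kept in the cloud
    tags = precompute_tags(data, key)                  # the owner stores only these
    idx = challenge(len(tags), 5)
    server_reply = {i: data[i * BLOCK:(i + 1) * BLOCK] for i in idx}
    print("intact:", verify(server_reply, tags, key))
```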


2013, Vol 846-847, pp. 1608-1611
Author(s): Hui Jie Ding

As more and more cars come into service, traffic jams become a serious problem in our society. At the same time, more and more sensors make cars increasingly intelligent, which promotes the development of the Internet of Things. Real-time monitoring of these cars produces massive amounts of sensing data, and cloud computing offers a good way to handle this problem. In this paper, we propose a traffic flow data collection and traffic signal control system based on the Internet of Things and cloud computing. The proposed system contains two main parts: a sensing data collection subsystem and a traffic status control subsystem.


2018, Vol 2018, pp. 1-12
Author(s): Wenqi Chen, Hui Tian, Chin-Chen Chang, Fulin Nan, Jing Lu

Cloud storage, one of the core services of cloud computing, provides an effective way to solve the problems of storage and management caused by high-speed data growth. Thus, a growing number of organizations and individuals tend to store their data in the cloud. However, due to the separation of data ownership and management, it is difficult for users to check the integrity of data in the traditional way. Therefore, many researchers have focused on developing protocols that can remotely check the integrity of data in the cloud. In this paper, we propose a novel public auditing protocol based on the adjacency-hash table, in which dynamic auditing and data updating are more efficient than in state-of-the-art schemes. Moreover, with such an authentication structure, computation and communication costs can be reduced effectively. The security analysis and performance evaluation based on comprehensive experiments demonstrate that our protocol achieves all the desired properties and outperforms state-of-the-art schemes in computing overheads for updating and verification.
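The paper's exact adjacency-hash table is not reproduced here; the hedged sketch below only illustrates the general idea of a hash-table-based authentication index, in which each block keeps a small record (version, timestamp) so that a dynamic update touches a single entry instead of rebalancing a tree-based structure. Field names and operations are assumptions for illustration.

```python
# A hedged sketch of a hash-table-based auditing index: per-block (version,
# timestamp) records keyed by file, so updates are O(1). Illustrative only;
# not the paper's adjacency-hash table layout.
import hashlib
import time
from dataclasses import dataclass

@dataclass
class BlockRecord:
    version: int
    timestamp: float

class AuditIndex:
    def __init__(self):
        self.table = {}                                   # file_id -> list of BlockRecord

    def add_file(self, file_id: str, num_blocks: int):
        now = time.time()
        self.table[file_id] = [BlockRecord(1, now) for _ in range(num_blocks)]

    def update_block(self, file_id: str, index: int):
        rec = self.table[file_id][index]
        rec.version += 1                                  # single-entry update, no tree rebuild
        rec.timestamp = time.time()

    def block_tag(self, file_id: str, index: int, block: bytes) -> bytes:
        """Bind block content to its version and timestamp, as a verifier would."""
        rec = self.table[file_id][index]
        msg = block + f"{file_id}|{index}|{rec.version}|{rec.timestamp}".encode()
        return hashlib.sha256(msg).digest()

if __name__ == "__main__":
    idx = AuditIndex()
    idx.add_file("report.bin", 4)
    idx.update_block("report.bin", 2)
    print(idx.block_tag("report.bin", 2, b"new block content").hex()[:16])
```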


2018, Vol 2018, pp. 1-7
Author(s): Run Xie, Chanlian He, Dongqing Xie, Chongzhi Gao, Xiaojun Zhang

With the advent of cloud computing, data privacy has become one of the critical security issues and has attracted much attention as more and more mobile devices rely on cloud services. To protect data privacy, users usually encrypt their sensitive data before uploading them to cloud servers, which makes data utilization difficult. Ciphertext retrieval makes it possible to use encrypted data, and searchable public-key encryption is an effective way to construct encrypted data retrieval. However, previous related works have not paid much attention to the design of ciphertext retrieval schemes that are secure against inside keyword-guessing attacks (KGAs). In this paper, we first construct a new architecture to resist inside KGAs. Moreover, we present an efficient ciphertext retrieval instance with a designated tester (dCRKS) based on this architecture. This instance is secure against inside KGAs. Finally, security analysis and efficiency comparison show that the proposal is effective for the retrieval of encrypted data in cloud computing.
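To see what an inside keyword-guessing attack looks like, the toy code below attacks a strawman scheme in which the keyword tag is a plain hash and is therefore testable without any server-held secret; an insider simply enumerates a dictionary offline. The dCRKS construction itself is not reproduced here; its point is precisely that the test algorithm requires the designated tester's private key, which blocks this kind of offline enumeration.

```python
# Toy demonstration of an inside keyword-guessing attack against a naive,
# secret-free keyword tag. Illustrative strawman, not the paper's scheme.
import hashlib

def naive_keyword_tag(keyword: str) -> bytes:
    # Strawman "searchable" tag with no secret involved -- vulnerable by design.
    return hashlib.sha256(keyword.encode()).digest()

def inside_kga(target_tag: bytes, dictionary):
    """An insider who sees the tag recovers the keyword by offline guessing."""
    for guess in dictionary:
        if naive_keyword_tag(guess) == target_tag:
            return guess
    return None

if __name__ == "__main__":
    tag = naive_keyword_tag("blood-pressure")          # uploaded by a user
    print(inside_kga(tag, ["heart-rate", "blood-pressure", "glucose"]))
```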


2020, Vol 2020, pp. 1-11
Author(s): Bo Mi, Ping Long, Yang Liu, Fengtian Kuang

Data deduplication serves as an effective way to optimize storage occupation and bandwidth consumption over clouds. As for the security of the deduplication mechanism, users’ privacy and accessibility are of utmost concern since data are outsourced. However, the functionality of redundancy removal and the indistinguishability of deduplication labels are naturally incompatible, which brings about many threats to data security. Besides, the access control of shared copies may lead to infringement on users’ attributes and cumbersome query overheads. To balance usability with the confidentiality of deduplication labels and to securely realize an elaborate access structure, a novel data deduplication scheme is proposed in this paper. Briefly speaking, we draw support from learning with errors (LWE) to make sure that the deduplication labels are only differentiable during the duplication check process. Instead of authority matching, the proof of ownership (PoW) is then implemented under the paradigm of inner product. Since the deduplication label is lightweight and the inner product is easy to carry out, our scheme is more efficient in terms of computation and storage. Security analysis also indicates that the deduplication labels are distinguishable only for duplication check, and the probability of falsifying a valid ownership is negligible.
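To make the LWE ingredient concrete, here is a generic, toy learning-with-errors sketch: the secret vector is derived deterministically from the data (an assumption made only for this illustration), the label is b = A·s + e mod q, and two labels of the same content stay within a small error bound while labels of different content do not. This is not the paper's label or proof-of-ownership construction, and the dimensions, modulus, and error bound are far too small for real security.

```python
# Generic LWE sample generation as a toy illustration of the building block:
# b = A*s + e (mod q), where the small error keeps the label pseudorandom to
# anyone without the checking material. Not the paper's construction.
import hashlib
import random

N, M, Q, E_BOUND = 16, 24, 3329, 2      # toy dimensions; far too small for security

def derive_secret(data: bytes):
    """Deterministically derive the secret vector s from the data's hash, so the
    same content always maps to the same s (an assumption for this sketch)."""
    rng = random.Random(hashlib.sha256(data).digest())
    return [rng.randrange(Q) for _ in range(N)]

def lwe_label(data: bytes, A):
    s = derive_secret(data)
    rng = random.Random()                                # fresh randomness for the error
    e = [rng.randint(-E_BOUND, E_BOUND) for _ in range(M)]
    return [(sum(A[i][j] * s[j] for j in range(N)) + e[i]) % Q for i in range(M)]

def close(label1, label2, tol=2 * E_BOUND):
    """Closeness test: labels of the same content differ only by small errors."""
    return all(min(abs(a - b), Q - abs(a - b)) <= tol for a, b in zip(label1, label2))

if __name__ == "__main__":
    rng = random.Random(0)
    A = [[rng.randrange(Q) for _ in range(N)] for _ in range(M)]   # public matrix
    l1, l2 = lwe_label(b"sensor reading", A), lwe_label(b"sensor reading", A)
    l3 = lwe_label(b"different data", A)
    print(close(l1, l2), close(l1, l3))    # expected: True False
```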


2013, Vol 70 (24), pp. 33-37
Author(s): Navdeep Aggarwal, Parshant Tyagi, Bhanu P. Dubey, Emmanuel S. Pilli
