Preservation of Data Integrity in Public Cloud Using Enhanced Vigenere Cipher Based Obfuscation

Author(s):
A. K. Jaithunbi
S. Sabena
L. Sairamesh

Abstract: Today's internet world is moving to cloud computing to keep public data private and secure. In the cloud scenario, many security principles are implemented to maintain the secure transmission of data over the internet. Still, the main concern is maintaining the integrity of one's own data in the public cloud. Most research concentrates on cryptographic techniques for the secure sharing of data, but comparatively little work addresses data integrity. In this paper, a data masking technique called obfuscation is implemented to protect data from unwanted modification by data-breaching attacks. In this work, an enhanced Vigenere encryption is used to perform obfuscation and maintain the privacy of the user's data. The enhanced Vigenere encryption algorithm is combined with intelligent rules so that the masking applied differs across data, performing encryption with different sets of rules. This work mainly concentrates on data privacy with reduced time complexity for encryption and decryption.
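
A minimal sketch of the kind of Vigenere-style masking the abstract describes, assuming a printable-ASCII alphabet; the "intelligent rules" of the enhanced variant are not specified in the abstract, so only the basic obfuscation and de-obfuscation step is shown (function names and the sample key are illustrative):

```python
# Vigenere-style obfuscation over printable ASCII (a sketch, not the
# authors' enhanced rule set).

ALPHABET = [chr(c) for c in range(32, 127)]  # 95 printable ASCII characters
INDEX = {ch: i for i, ch in enumerate(ALPHABET)}

def vigenere_obfuscate(plaintext: str, key: str) -> str:
    """Mask each character by shifting it with the corresponding key character."""
    out = []
    for i, ch in enumerate(plaintext):
        if ch not in INDEX:
            out.append(ch)  # leave unsupported characters unchanged
            continue
        shift = INDEX[key[i % len(key)]]
        out.append(ALPHABET[(INDEX[ch] + shift) % len(ALPHABET)])
    return "".join(out)

def vigenere_deobfuscate(ciphertext: str, key: str) -> str:
    """Reverse the masking with the same key."""
    out = []
    for i, ch in enumerate(ciphertext):
        if ch not in INDEX:
            out.append(ch)
            continue
        shift = INDEX[key[i % len(key)]]
        out.append(ALPHABET[(INDEX[ch] - shift) % len(ALPHABET)])
    return "".join(out)

masked = vigenere_obfuscate("patient-id: 4711", "cloudKey")
assert vigenere_deobfuscate(masked, "cloudKey") == "patient-id: 4711"
```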

Information, 2020, Vol 11 (9), pp. 409
Author(s):
Yuan Ping
Yu Zhan
Ke Lu
Baocang Wang

Although cloud storage provides convenient data outsourcing services, an untrusted cloud server frequently threatens the integrity and security of the outsourced data. Therefore, it is extremely urgent to design security schemes that allow users to check the integrity of their data with acceptable computational and communication overheads. In this paper, we first propose a public data integrity verification scheme based on algebraic signatures and elliptic curve cryptography. This scheme not only allows a third-party authority to verify the integrity of the outsourced data on behalf of users, but also resists malicious attacks such as replay, replacement, and forgery attacks. Data privacy is guaranteed by symmetric encryption. Furthermore, we construct a novel data structure named the divide-and-conquer hash list, which can efficiently perform data updating operations such as deletion, insertion, and modification. Compared with the relevant schemes in the literature, security analysis and performance evaluations show that the proposed scheme gains some advantages in integrity verification and dynamic updating.
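
The algebraic signatures and elliptic curve cryptography of the proposed scheme are not detailed in the abstract, so the sketch below illustrates only the general idea of challenge-based integrity auditing with a per-block hash list; the block size, sampling count, and all names are assumptions:

```python
# Challenge-based integrity auditing with a per-block hash list (a stand-in
# sketch, not the paper's algebraic-signature/ECC construction).

import hashlib
import secrets

BLOCK_SIZE = 4096  # bytes per block (assumed)

def split_blocks(data: bytes):
    return [data[i:i + BLOCK_SIZE] for i in range(0, len(data), BLOCK_SIZE)]

def hash_list(blocks):
    """Owner-side: compute and keep a digest for every outsourced block."""
    return [hashlib.sha256(b).digest() for b in blocks]

def audit(stored_digests, fetch_block, samples=3):
    """Auditor-side: challenge randomly chosen blocks and re-check their digests."""
    for _ in range(samples):
        idx = secrets.randbelow(len(stored_digests))
        if hashlib.sha256(fetch_block(idx)).digest() != stored_digests[idx]:
            return False  # integrity violation detected
    return True

# Usage: the "cloud" is simulated by a local list of blocks.
data = b"outsourced file contents" * 1000
cloud = split_blocks(data)
digests = hash_list(cloud)
assert audit(digests, lambda i: cloud[i])
```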


2019, Vol 12 (2), pp. 149-158
Author(s):
Muhammad Fadlan
Sinawati Sinawati
Aida Indriani
Evi Dianti Bintari

The importance of maintaining data integrity and security is one of the challenges of the current digital era. One method that can be used to face this challenge is cryptography. In cryptography there are several algorithms that can be used, one of which is the Caesar cipher. This algorithm has several disadvantages, including an alphabet limited to 26 characters, which can make the encryption results easily recognizable by other parties. This study aims to design a proposal for maintaining data security through cryptographic techniques while addressing the problems inherent in the Caesar cipher. A combination of the Caesar and Beaufort algorithms is used to overcome these problems. In addition, a character list of 94 characters is defined for the encryption and decryption of text data. As a result, through the integration of these two algorithms, the resulting ciphertext becomes more difficult to break. The encryption process has two stages, each using a different key to secure the data.
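
A minimal sketch of this two-stage idea, assuming the 94-character list is printable ASCII without the space character; the authors' exact character list and key-handling rules are not given in the abstract, so the keys and names below are illustrative:

```python
# Two-stage Caesar + Beaufort encryption over a 94-character alphabet
# (a sketch under assumed parameters).

ALPHABET = [chr(c) for c in range(33, 127)]  # 94 printable characters, no space
INDEX = {ch: i for i, ch in enumerate(ALPHABET)}
N = len(ALPHABET)

def caesar(text: str, shift: int, decrypt: bool = False) -> str:
    s = -shift if decrypt else shift
    return "".join(ALPHABET[(INDEX[ch] + s) % N] if ch in INDEX else ch for ch in text)

def beaufort(text: str, key: str) -> str:
    """Beaufort is reciprocal: the same function encrypts and decrypts."""
    return "".join(
        ALPHABET[(INDEX[key[i % len(key)]] - INDEX[ch]) % N] if ch in INDEX else ch
        for i, ch in enumerate(text)
    )

def encrypt(plaintext: str, caesar_key: int, beaufort_key: str) -> str:
    return beaufort(caesar(plaintext, caesar_key), beaufort_key)

def decrypt(ciphertext: str, caesar_key: int, beaufort_key: str) -> str:
    return caesar(beaufort(ciphertext, beaufort_key), caesar_key, decrypt=True)

cipher = encrypt("SecretData!", 7, "Kunci2")
assert decrypt(cipher, 7, "Kunci2") == "SecretData!"
```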


2020, Vol 7 (3), pp. 398
Author(s):
Nurainun Hasanah Sinaga
Muhammad Syahrizal

SMS (Short Message Service) is a popular communication technology. Although computerized technology has advanced considerably, SMS remains very vulnerable to data theft by irresponsible parties. The security of SMS can be maintained by using cryptographic techniques, which encode text messages by encrypting them into ciphertext that cannot be understood. The MARS algorithm is a symmetric algorithm that uses a 128-bit key and an encryption process consisting of 32 rounds. It can produce a higher level of security for the message because it encodes it into a cipher through a process complex enough to make it difficult for cryptanalysts to recover the plaintext. This research uses the MARS algorithm for the encryption and decryption process, which passes through several stages to produce the final cipher. This study describes the process of securing SMS by encoding it with the MARS algorithm into a form that is difficult for others to read, as an effort to minimize misuse of SMS.
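
MARS implementations are not commonly packaged for Python, so the sketch below uses AES-128 from pycryptodome as a stand-in 128-bit-key symmetric cipher to illustrate only the encrypt-before-send and decrypt-on-receive flow, not the MARS rounds themselves; the package dependency and function names are assumptions:

```python
# Encrypt-before-send SMS flow with a 128-bit-key symmetric cipher.
# AES-128 (pycryptodome) stands in for MARS here, purely to show the workflow.

from Crypto.Cipher import AES          # pip install pycryptodome (assumed)
from Crypto.Random import get_random_bytes

def encrypt_sms(message: str, key: bytes) -> bytes:
    cipher = AES.new(key, AES.MODE_EAX)
    ciphertext, tag = cipher.encrypt_and_digest(message.encode("utf-8"))
    return cipher.nonce + tag + ciphertext   # bundle what the receiver needs

def decrypt_sms(blob: bytes, key: bytes) -> str:
    nonce, tag, ciphertext = blob[:16], blob[16:32], blob[32:]
    cipher = AES.new(key, AES.MODE_EAX, nonce=nonce)
    return cipher.decrypt_and_verify(ciphertext, tag).decode("utf-8")

key = get_random_bytes(16)                    # 128-bit key, as in MARS
blob = encrypt_sms("Meet at 7pm", key)
assert decrypt_sms(blob, key) == "Meet at 7pm"
```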


Author(s):
Anil Kumar G.
Shantala C. P.

Owing to the highly distributed nature of the cloud storage system, incorporating a higher degree of security for vulnerable data is a challenging task. Among various security concerns, data privacy is still one of the unsolved problems in this regard. The prime reason is that existing data privacy approaches do not offer data integrity and a secure data deduplication process at the same time, which is essential for a higher degree of resistance against all forms of dynamic threats over cloud and internet systems. Data integrity and data deduplication are therefore associated phenomena that influence data privacy. This manuscript discusses the explicit research contributions toward data integrity, data privacy, and data deduplication. It also highlights the potential open research issues, followed by a discussion of possible future directions of work towards addressing the existing problems.
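
As one concrete illustration of how encryption and deduplication can coexist, the sketch below shows convergent encryption, where the key is derived from the content itself so identical plaintexts produce identical, deduplicable ciphertexts; this is a well-known technique from the deduplication literature, not a scheme taken from this manuscript, and the keystream used here is for illustration only:

```python
# Convergent encryption sketch: content-derived keys make equal plaintexts
# encrypt to equal ciphertexts, which the cloud can deduplicate.

import hashlib

def convergent_key(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()          # key derived from the content

def keystream(key: bytes, length: int) -> bytes:
    # Simple hash-counter keystream for illustration (not production crypto).
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def convergent_encrypt(data: bytes) -> tuple[bytes, bytes]:
    key = convergent_key(data)
    cipher = bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))
    return key, cipher                             # user keeps key, cloud stores cipher

k1, c1 = convergent_encrypt(b"same backup chunk")
k2, c2 = convergent_encrypt(b"same backup chunk")
assert c1 == c2   # identical content -> identical ciphertext -> deduplicable
```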


2019, Vol 8 (2S11), pp. 3606-3611

Big data privacy has assumed importance as cloud computing has become a phenomenal success in providing a remote platform for sharing computing resources without geographical and time restrictions. However, privacy concerns about the big data being outsourced to public cloud storage still exist. Different anonymity or sanitization techniques have come into existence for protecting big data from privacy attacks. In our prior works, we proposed a misusability-probability-based metric to estimate the probable percentage of misusability. We additionally planned a system, based on misuse probability, that suggests the level of sanitization before privacy protection is actually applied to big data. In this paper, our focus is on further evaluation of our misuse-probability-based sanitization of big data by defining an algorithm which analyses the trade-offs between misuse probability and level of sanitization. The paper elaborates on the proposed framework and misusability measure and evaluates the framework with an empirical study. The empirical study is made in a public cloud environment with Amazon EC2 (compute engine), S3 (storage service), and EMR (MapReduce framework). The experimental results revealed the dynamics of the trade-offs between them. The insights help in making well-informed decisions while sanitizing big data, to ensure that it is protected without losing the required utility.
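
A hypothetical sketch of the trade-off selection the abstract describes: given estimated misuse probabilities and retained utility at each sanitization level, pick the lowest level that meets a misuse threshold. The numbers and names below are illustrative only, not the paper's algorithm or experimental results:

```python
# Choose the weakest sanitization level whose estimated misuse probability
# is acceptable, preserving as much data utility as possible.

# (sanitization_level, estimated_misuse_probability, retained_utility) - illustrative
PROFILE = [
    (0, 0.80, 1.00),
    (1, 0.45, 0.90),
    (2, 0.20, 0.75),
    (3, 0.05, 0.55),
]

def choose_level(profile, max_misuse: float):
    """Return the first (lowest) level whose misuse probability is acceptable."""
    for level, misuse, utility in profile:
        if misuse <= max_misuse:
            return level, misuse, utility
    return profile[-1]  # fall back to the strongest sanitization

level, misuse, utility = choose_level(PROFILE, max_misuse=0.25)
print(f"level={level}, misuse={misuse:.2f}, retained utility={utility:.2f}")
# -> level=2, misuse=0.20, retained utility=0.75
```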


Fog Computing, 2018, pp. 208-219
Author(s):
Siddhartha Duggirala

The essence of cloud computing is moving processing from local systems to remote systems. The cloud is an umbrella of physical and virtual services and resources easily accessible over the internet. With more companies adopting the cloud, either fully through the public cloud or via a hybrid model, the challenge of maintaining a cloud-capable infrastructure is also increasing. About 42% of CTOs say that security is their main concern when moving into the cloud. Another problem, mainly an infrastructure problem, is connectivity. The datacenter can be considered the backbone of the cloud computing architecture. As the processing power and storage capabilities of end devices such as mobile phones, routers, and sensor hubs improve, we can increasingly leverage these resources to improve the quality and reliability of services.

