Authorized Deduplication of Encrypted Data in Cloud

Author(s):  
Milind B. Waghmare ◽  
Suhasini V. Padwekar

Cloud computing technology is developing rapidly, and the number of files stored and processed grows every day. This growth brings severe challenges in terms of storage space, processing power, and bandwidth. More than half of the data generated in the cloud is duplicate data. To handle this, deduplication is used to eliminate duplicate copies of data; removing duplicates increases storage efficiency and reduces cost. In this paper, we propose a secure role re-encryption system that allows authorized deduplication of data while maintaining data privacy. The system is based on a convergent encryption algorithm and a re-encryption algorithm that encrypt the user data and assign role keys to each user. It grants privileges to users in order to maintain ownership, so that authorized users can access the data efficiently. A management center is introduced in which files are encrypted and role keys are generated to handle authorized requests. Role keys are stored in a Merkle hash tree that maps the relationship between roles and keys. An authorized user who holds the corresponding role re-encryption key can access the file. The convergent and role re-encryption algorithms allow access to a specific file without leakage of private data. Dynamic updating of user privileges is also achieved.
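The paper's role re-encryption and Merkle-tree key management are specific to the proposed system, but the convergent-encryption building block it relies on can be illustrated compactly. The sketch below is a minimal, illustrative reconstruction (the function names, the SHA-256 tag derivation, and the AES-CTR choice with a deterministic nonce are assumptions, not details taken from the paper): because the key is derived from the file content, identical files encrypt to identical ciphertexts and can be deduplicated by comparing tags.

```python
import hashlib
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes  # pip install cryptography

def convergent_encrypt(data: bytes):
    """Encrypt data under a key derived from its own content.

    Returns (tag, key, ciphertext). Identical plaintexts always yield the same
    tag and ciphertext, so the server can detect duplicates without seeing the
    plaintext.
    """
    key = hashlib.sha256(data).digest()      # convergent key K = H(F)
    tag = hashlib.sha256(key).digest()       # deduplication tag T = H(K)
    nonce = tag[:16]                         # deterministic nonce keeps encryption convergent
    enc = Cipher(algorithms.AES(key), modes.CTR(nonce)).encryptor()
    return tag, key, enc.update(data) + enc.finalize()

def convergent_decrypt(key: bytes, ciphertext: bytes) -> bytes:
    nonce = hashlib.sha256(key).digest()[:16]
    dec = Cipher(algorithms.AES(key), modes.CTR(nonce)).decryptor()
    return dec.update(ciphertext) + dec.finalize()

# Two users uploading the same file produce the same tag and ciphertext,
# so the storage server keeps only one copy.
t1, k1, c1 = convergent_encrypt(b"quarterly-report contents")
t2, k2, c2 = convergent_encrypt(b"quarterly-report contents")
assert t1 == t2 and c1 == c2
assert convergent_decrypt(k1, c1) == b"quarterly-report contents"
```

The deterministic nonce is what makes ciphertexts identical across users; the trade-off is that convergent encryption necessarily reveals whether two files are equal, which is exactly the property deduplication exploits and which the paper's role keys are meant to confine to authorized users.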

2020 ◽  
Vol 8 (5) ◽  
pp. 2355-2359

Cryptography at its very core is nothing but math: pure, simple, undiluted math. Mathematics provides the algorithms that form the basis of the various encryption schemes. Encryption is a method in which a user's confidential or private data is encoded into ciphertext, which can be decoded back to plaintext only by an authorized user holding the right key. The symmetric-key approach uses the same key to encrypt the plaintext and decrypt the ciphertext. In this paper we propose a new symmetric algorithm based on ASCII values: the plaintext is converted to ciphertext using the key and the ASCII values of its characters. The encryption algorithm sends the ciphertext and a minimum value to the authorized receiver, and the receiver decrypts the ciphertext back to plaintext using the same key and minimum value. In this algorithm a sequence of five pseudo-random numbers is generated, and the sum of these five numbers is added to the obtained decimal value. The seed used to generate the common sequence of pseudo-random numbers is kept secret between sender and receiver. The proposed algorithm supports variable key length and plaintext size. It performs fast on short messages, but execution time grows as the plaintext size increases, so it is best suited to sending small messages in a secured way.
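The abstract describes the scheme only at a high level, so the sketch below is one plausible toy reconstruction rather than the paper's exact algorithm: the way the key and ASCII values are combined, the range of the pseudo-random numbers, and the use of the minimum value as a transmitted offset are all assumptions made for illustration.

```python
import random

def keystream_offset(seed: int) -> int:
    """Sum of five pseudo-random numbers derived from a seed shared secretly
    by sender and receiver."""
    rng = random.Random(seed)
    return sum(rng.randint(1, 255) for _ in range(5))

def encrypt(plaintext: str, key: str, seed: int):
    offset = keystream_offset(seed)
    values = [ord(p) + ord(key[i % len(key)]) + offset
              for i, p in enumerate(plaintext)]        # ASCII values combined with the key
    minimum = min(values)
    # Transmit the differences from the minimum together with the minimum itself.
    return [v - minimum for v in values], minimum

def decrypt(ciphertext, minimum: int, key: str, seed: int) -> str:
    offset = keystream_offset(seed)
    return "".join(chr(c + minimum - ord(key[i % len(key)]) - offset)
                   for i, c in enumerate(ciphertext))

ct, m = encrypt("hello world", "secret", seed=42)
assert decrypt(ct, m, "secret", seed=42) == "hello world"
```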


Author(s):  
Hoda Jannati ◽  
Ebrahim Ardeshir-Larijani ◽  
Behnam Bahrak

Author(s):  
Anja Bechmann ◽  
Peter Bjerregaard Vahlstrup

The aim of this article is to discuss methodological implications and challenges in different kinds of deep and big data studies of Facebook and Instagram through methods involving the use of Application Programming Interface (API) data. The article describes and discusses Digital Footprints (www.digitalfootprints.dk), a data extraction and analytics software that allows researchers to extract user data from Facebook and Instagram data sources: public streams as well as private data with user consent. Based on insights from the software design process and from data-driven studies, the article argues for three main challenges: data quality, data access and analysis, and legal and ethical considerations.
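As a rough illustration of the kind of API-based extraction the article discusses, the sketch below pages through a user's posts on a Graph-API-style endpoint using a consented access token. The endpoint path, API version, and field names are placeholders and do not reproduce the Digital Footprints implementation.

```python
import requests

API_ROOT = "https://graph.facebook.com/v19.0"   # illustrative version string

def fetch_user_posts(user_id: str, access_token: str, limit: int = 100) -> list:
    """Page through a user's posts with their consent, keeping only the fields
    needed for analysis."""
    posts, url = [], f"{API_ROOT}/{user_id}/posts"
    params = {"access_token": access_token,
              "fields": "id,created_time,message",
              "limit": limit}
    while url:
        resp = requests.get(url, params=params, timeout=30)
        resp.raise_for_status()
        payload = resp.json()
        posts.extend(payload.get("data", []))
        # Follow cursor-based pagination until no "next" link remains.
        url = payload.get("paging", {}).get("next")
        params = {}  # the "next" URL already carries its query parameters
    return posts
```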


Author(s):  
G. Golovko ◽  
A. Matiashenko ◽  
N. Solopihin

This article offers an example of using an application whose main task is to encrypt data such as files and private messages. Data encryption is performed with the XOR cipher, an encryption algorithm based on exclusive disjunction. It came into widespread use in computer networks in the 1990s because of its ease of implementation and was used to encrypt Microsoft Word documents in Windows. The XOR algorithm "overlays" a sequence of random numbers, called a gamma sequence, on the text to be encrypted; the same sequence is used both to encrypt and to decrypt the data. If the key is at least as long as the message, the XOR cipher becomes far more cryptographically resistant than when a shorter, repeated key is used. For the cryptographic protection of the information of the travel company Rest & Travel, the EDcrypt software has been created, which performs the following functions: account login; blocking use of the system without logging in to an account; notification of incorrect user credentials; encryption and decryption of messages; selection of the message recipient; encryption and decryption of text files; sending text files to selected recipients; and three interface languages (English, Russian, Ukrainian).
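A minimal sketch of the gamma (XOR) cipher described above, assuming byte-level operation; the helper names are illustrative, and the sketch omits EDcrypt's account handling, recipient selection, and file features.

```python
from itertools import cycle
import secrets

def xor_gamma(data: bytes, gamma: bytes) -> bytes:
    """XOR the data with the gamma sequence (repeated if shorter than the data).

    Because (x ^ g) ^ g == x, the same function both encrypts and decrypts.
    """
    return bytes(b ^ g for b, g in zip(data, cycle(gamma)))

message = "Booking confirmed for 2 guests".encode("utf-8")

# A gamma sequence at least as long as the message (one-time-pad style) is far
# stronger than a short, repeated key.
gamma = secrets.token_bytes(len(message))

ciphertext = xor_gamma(message, gamma)
assert xor_gamma(ciphertext, gamma) == message
```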


Electronics ◽  
2021 ◽  
Vol 10 (24) ◽  
pp. 3131
Author(s):  
Prasanth Varma Kakarlapudi ◽  
Qusay H. Mahmoud

The concept of blockchain was introduced as the Bitcoin cryptocurrency in a 2008 whitepaper by the mysterious Satoshi Nakamoto. Blockchain has applications in many domains, such as healthcare, the Internet of Things (IoT), and data management. Data management is defined as obtaining, processing, safeguarding, and storing information about an organization to aid with making better business decisions for the firm. The collected information is often shared across organizations without the consent of the individuals who provided the information. As a result, the information must be protected from unauthorized access or exploitation. Therefore, organizations must ensure that their systems are transparent to build user confidence. This paper introduces the architectural design and development of a blockchain-based system for private data management, discusses the proof-of-concept prototype using Hyperledger Fabric, and presents evaluation results of the proposed system using Hyperledger Caliper. The proposed solution can be used in any application domain where managing the privacy of user data is important, such as in health care systems.
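The paper's prototype is built on Hyperledger Fabric chaincode, which is not reproduced here; the toy sketch below only illustrates the underlying idea of an append-only, tamper-evident record of data-sharing consent. Class and field names are invented for illustration.

```python
import hashlib, json, time

class ConsentLedger:
    """Toy append-only ledger of data-sharing consent records.

    Each block stores the hash of its predecessor, so any later tampering with
    a recorded consent decision is detectable during verification.
    """

    def __init__(self):
        self.chain = [{"index": 0, "prev": "0" * 64,
                       "record": "genesis", "time": time.time()}]

    @staticmethod
    def _hash(block: dict) -> str:
        return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

    def record_consent(self, user: str, org: str, purpose: str, granted: bool) -> dict:
        block = {"index": len(self.chain),
                 "prev": self._hash(self.chain[-1]),
                 "record": {"user": user, "org": org,
                            "purpose": purpose, "granted": granted},
                 "time": time.time()}
        self.chain.append(block)
        return block

    def verify(self) -> bool:
        """Check that no stored consent record has been altered."""
        return all(self.chain[i]["prev"] == self._hash(self.chain[i - 1])
                   for i in range(1, len(self.chain)))

ledger = ConsentLedger()
ledger.record_consent("alice", "clinic-A", "treatment analytics", granted=True)
assert ledger.verify()
```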


Sensors ◽  
2021 ◽  
Vol 21 (14) ◽  
pp. 4684
Author(s):  
Lihi Dery ◽  
Artyom Jelnov

Accurately tailored support such as advice or assistance can increase user satisfaction from interactions with smart devices; however, in order to achieve high accuracy, the device must obtain and exploit private user data, and thus confidential user information might be jeopardized. We provide an analysis of this privacy–accuracy trade-off. We assume two positive correlations: a user's utility from a device is positively correlated with the user's privacy risk and also with the quality of the advice or assistance offered by the device. The extent of the privacy risk is unknown to the user. Thus, privacy-concerned users might choose not to interact with devices they deem unsafe. We suggest that in the first period of usage the device should not employ its advice or assistance capabilities to their full extent, since doing so may deter users from adopting it. Using three analytical propositions, we further offer an optimal policy for smart-device exploitation of private data for the purpose of interactions with users.
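The authors' formal model and propositions are not given in the abstract; the toy simulation below is purely illustrative, with the adoption rule, the risk/quality correlation, and all numbers invented to show why holding back capability in the first period can pay off.

```python
def user_adopts(quality: float, perceived_risk: float, risk_aversion: float = 1.5) -> bool:
    """Toy adoption rule: the user interacts only if the expected benefit
    outweighs the (unknown, hence cautiously over-weighted) privacy risk."""
    return quality - risk_aversion * perceived_risk > 0

def two_period_payoff(first_period_quality: float) -> float:
    """Device payoff over two periods. Higher advice quality requires more data
    exploitation, which the user perceives as higher privacy risk."""
    perceived_risk = 0.8 * first_period_quality ** 2   # assumed positive correlation
    if not user_adopts(first_period_quality, perceived_risk):
        return 0.0                                     # user walks away; no second period
    full_quality = 1.0
    return first_period_quality + full_quality         # trust built, full capability later

print(two_period_payoff(first_period_quality=1.0))     # 0.0 -> user is deterred
print(two_period_payoff(first_period_quality=0.6))     # 1.6 -> adoption, then full quality
```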


2017 ◽  
Author(s):  
Christopher Soghoian

Today, when consumers evaluate potential telecommunications, Internet service, or application providers, they are likely to consider several differentiating factors: the cost of service, the features offered, and the providers' reputation for network quality and customer service. The firms' divergent approaches to privacy, and in particular their policies regarding law enforcement and intelligence agencies' access to their customers' private data, are not considered by consumers during the purchasing process, perhaps because it is practically impossible for anyone to discover this information. A naïve reader might simply assume that the law gives companies very little wiggle room: when they are required to provide data, they must do so. This is true. However, companies have a huge amount of flexibility in the way they design their networks, in the amount of data they retain by default, in the exigent circumstances in which they share data without a court order, and in the degree to which they fight unreasonable requests. As such, there are substantial differences in the privacy practices of the major players in the telecommunications and Internet applications market: some firms retain identifying data for years, while others retain no data at all; some voluntarily provide government agencies access to user data (one carrier even argued in court that its First Amendment free speech rights guarantee it the right to do so), while other companies refuse to disclose data without a court order; some companies charge government agencies when they request user data, while others disclose it for free. A consumer's decision to use a particular carrier or provider can therefore significantly impact their privacy, and in some cases their freedom. Many companies profess their commitment to protecting their customers' privacy, with some even arguing that they compete on their respective privacy practices. However, none seem willing to disclose, let alone compete on, the extent to which they assist or resist government agencies' surveillance activities. Because information about each firm's practices is not publicly known, consumers cannot vote with their dollars and pick service providers that best protect their privacy. In this article, I focus on this lack of information and on the policy changes necessary to create market pressure for companies to put their customers' privacy first. I outline the numerous ways in which companies currently assist the government, often going out of their way to provide easy access to their customers' private communications and documents. I also highlight several ways in which some companies have opted to protect user privacy, and the specific product design decisions that firms can make that either protect their customers' private data by default or make it trivial for the government to engage in large-scale surveillance. Finally, I make specific policy recommendations that, if implemented, would lead to public disclosure of these privacy differences between companies and, hopefully, create further market incentives for firms to embrace privacy by design.


2021 ◽  
Vol 2021 ◽  
pp. 1-11
Author(s):  
Jing Zhang ◽  
Rongxia Qin ◽  
Ruijie Mu ◽  
Xiaojun Wang ◽  
Yongli Tang

The development of edge computing and Internet of Things (IoT) technology has brought convenience to our lives, but the sensitive and private data collected are also more vulnerable to attack. To address the data privacy problem of the edge-assisted IoT, an outsourced mutual Private Set Intersection (PSI) protocol is proposed. The protocol uses the ElGamal threshold encryption algorithm to rerandomize the encrypted elements, ensuring that all set elements are computed on in ciphertext form. The protocol then maps the set elements to hash bins using two hash functions and computes the intersection in a bin-to-bin manner, reducing the number of element comparisons. In addition, the introduction of edge servers reduces the computational burden on participating users and achieves fairness of the protocol. Finally, the IND-CPA security of the protocol is proved, and its performance is compared with other relevant schemes. The evaluation results show that the protocol outperforms related protocols in terms of computational overhead.
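The core mechanism the protocol relies on, rerandomizing ElGamal ciphertexts so that set elements stay in ciphertext form while becoming unlinkable, can be sketched compactly. The sketch below uses plain (non-threshold) ElGamal with deliberately tiny, insecure parameters so it runs as-is; the key splitting among edge servers, the exact bin layout, and the intersection computation itself are not reproduced.

```python
import hashlib
import secrets

# Toy ElGamal setup (tiny, insecure parameters chosen only so the example runs;
# a real protocol would use a large standardized group and a threshold key).
p, g = 467, 2
x = secrets.randbelow(p - 2) + 1          # private key
h = pow(g, x, p)                          # public key

def encrypt(m: int):
    r = secrets.randbelow(p - 2) + 1
    return pow(g, r, p), (m * pow(h, r, p)) % p

def rerandomize(ct):
    """Multiply by a fresh encryption of 1: the plaintext is unchanged, but the
    ciphertext can no longer be linked to the one originally submitted."""
    c1, c2 = ct
    s = secrets.randbelow(p - 2) + 1
    return (c1 * pow(g, s, p)) % p, (c2 * pow(h, s, p)) % p

def decrypt(ct) -> int:
    c1, c2 = ct
    return (c2 * pow(pow(c1, x, p), -1, p)) % p

ct = encrypt(42)
ct2 = rerandomize(ct)                     # freshly randomized, decrypts identically
assert decrypt(ct) == decrypt(ct2) == 42

# Two hash functions map each element to candidate bins, so the intersection is
# computed bin by bin instead of comparing every pair of elements.
def bins_for(element: int, num_bins: int = 8) -> set:
    return {int(hashlib.sha256(f"{i}|{element}".encode()).hexdigest(), 16) % num_bins
            for i in (1, 2)}
```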


Author(s):  
Tawheed Jan Shah ◽  
M. Tariq Banday

Uncompressed multimedia data such as images require huge storage space, processing power, transmission time, and bandwidth. In order to reduce the storage space, transmission time, and bandwidth, the uncompressed image data is compressed before storage or transmission. This not only permits a large number of images to be stored in a given amount of storage space but also reduces the time required for them to be sent or downloaded from the internet. This chapter presents the classification of images on the basis of the number of bits used to represent each pixel, along with the different types of image redundancy. It also introduces image compression and its classification into lossless and lossy techniques, together with their advantages and disadvantages. Further, the discrete cosine transform (DCT), its properties, and the DCT-based image compression method (i.e., the JPEG compression model), along with its limitations, are discussed in detail.
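As a brief illustration of the DCT-based compression idea discussed in the chapter, the sketch below applies an orthonormal 2-D DCT-II to one 8x8 block and keeps only the largest coefficients. Thresholding stands in for JPEG's quantization tables, the zigzag scan and entropy coding stages are omitted, and the block values are invented test data.

```python
import numpy as np

def dct_matrix(N: int = 8) -> np.ndarray:
    """Orthonormal 1-D DCT-II basis matrix."""
    n = np.arange(N)
    C = np.sqrt(2.0 / N) * np.cos(np.pi * (2 * n[None, :] + 1) * n[:, None] / (2 * N))
    C[0, :] = np.sqrt(1.0 / N)
    return C

def dct2(block: np.ndarray) -> np.ndarray:
    C = dct_matrix(block.shape[0])
    return C @ block @ C.T              # 2-D DCT applied per 8x8 block, as in JPEG

def idct2(coeffs: np.ndarray) -> np.ndarray:
    C = dct_matrix(coeffs.shape[0])
    return C.T @ coeffs @ C

# A smooth 8x8 test block (a gradient), level-shifted by 128 as JPEG does.
block = np.add.outer(np.arange(8.0), np.arange(8.0)) * 8.0 - 128.0

coeffs = dct2(block)
k = 10                                                  # coefficients kept out of 64
threshold = np.sort(np.abs(coeffs), axis=None)[-k]
compressed = np.where(np.abs(coeffs) >= threshold, coeffs, 0.0)

reconstructed = idct2(compressed) + 128.0
print("max pixel error:", np.max(np.abs(reconstructed - (block + 128.0))))
```

Because the test block is smooth, its energy concentrates in a few low-frequency coefficients, so discarding most of the 64 coefficients changes the reconstructed pixels only slightly; this is the redundancy that lossy DCT-based compression exploits.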

