Experimental study of the characteristics of one method of integrity checking of large volume data storage

Author(s): D. A. Bobrovskiy et al.
2018, Vol 2018, pp. 1-9

Author(s): Ruoshui Liu, Jianghui Liu, Jingjie Zhang, Moli Zhang

Cloud computing is a new way of data storage, in which users tend to upload video data to cloud servers without keeping redundant local copies. However, it places the data beyond the direct control and management that users would conventionally exercise. The key issue is therefore how to ensure the integrity and reliability of video data stored in the cloud for the provision of video streaming services to end users. This paper details verification methods for the integrity of video data encrypted with fully homomorphic cryptosystems in the context of cloud computing. Specifically, we apply dynamic operations to video data stored in the cloud using the method of block tags, so that the integrity of the data can be successfully verified. The whole process is based on an analysis of existing Remote Data Integrity Checking (RDIC) methods.
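The abstract does not give the tag construction, which in the paper is built on homomorphic cryptography. As a minimal sketch of the block-tag idea alone, assuming plain SHA-256 tags over fixed-size blocks (the block size and function names below are illustrative, not from the paper):

```python
import hashlib

BLOCK_SIZE = 4096  # bytes per block; an assumed value, not from the paper

def make_block_tags(data: bytes) -> list:
    """Split the data into fixed-size blocks and tag each block with SHA-256."""
    return [hashlib.sha256(data[i:i + BLOCK_SIZE]).hexdigest()
            for i in range(0, len(data), BLOCK_SIZE)]

def verify_blocks(data: bytes, tags: list) -> list:
    """Return indices of blocks whose current tags no longer match."""
    current = make_block_tags(data)
    return [i for i, (a, b) in enumerate(zip(current, tags)) if a != b]

# The owner computes tags before upload; later downloads are checked per block,
# so a single corrupted block is located without re-verifying the whole file.
original = b"video-frame-bytes" * 2000
tags = make_block_tags(original)
tampered = bytearray(original)
tampered[5000] ^= 0xFF  # flip one byte inside block 1
assert verify_blocks(bytes(tampered), tags) == [1]
```

Per-block tags are what make dynamic operations (update, insert, delete of individual blocks) cheap: only the tags of the touched blocks need recomputation.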


Author(s): Ivan Mozghovyi, Anatoliy Sergiyenko, Roman Yershov

Increasing requirements for data transfer and storage are among the crucial questions now. There are several methods of high-speed data transmission, but each meets only the limited requirements of its narrowly focused target application. Data compression offers a solution to both high-speed transfer and low-volume data storage. This paper is devoted to the compression of GIF images using a modified LZW algorithm with a tree-based dictionary. The modification decreases lookup time and increases the speed of data compression, which in turn allows a hardware compression accelerator to be developed in future research.
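To illustrate why a tree-shaped dictionary speeds up LZW lookup, here is a sketch of classic LZW compression (not the authors' modified algorithm) in which the dictionary is keyed by `(prefix_code, next_byte)` pairs. Each pair is one edge of a dictionary tree, so extending a match costs one hash lookup per input byte instead of a scan over stored strings:

```python
def lzw_compress(data: bytes) -> list:
    """Classic LZW with a trie-like dictionary keyed by (prefix_code, byte).

    Each dictionary entry is an edge from a prefix's code to a one-byte
    extension, so the longest-match search descends the tree one byte at
    a time in O(1) per byte.
    """
    dictionary = {(None, b): b for b in range(256)}  # single-byte roots
    next_code = 256
    out, prefix = [], None
    for byte in data:
        key = (prefix, byte)
        if key in dictionary:           # extend the current match
            prefix = dictionary[key]
        else:                           # emit match, grow the tree by one edge
            out.append(prefix)
            dictionary[key] = next_code
            next_code += 1
            prefix = dictionary[(None, byte)]
    if prefix is not None:
        out.append(prefix)
    return out

# Repeated patterns collapse into single codes:
assert lzw_compress(b"ABABABA") == [65, 66, 256, 258]
```

The same edge-per-node layout maps naturally to hardware, since each lookup is a single associative-memory access, which is presumably what makes the accelerator construction in the paper feasible.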


Author(s): Poovizhi. M, Raja. G

Using cloud storage, users can remotely store their data and enjoy on-demand, high-quality applications and services from a shared pool of configurable computing resources, without the burden of local data storage and maintenance. However, the fact that users no longer have physical possession of the outsourced data makes data integrity protection in cloud computing a formidable task, especially for users with constrained computing resources. From the users' perspective, including both individuals and IT systems, storing data remotely in the cloud in a flexible, on-demand manner brings tempting benefits: relief from the burden of storage management, universal data access independent of geographical location, and avoidance of capital expenditure on hardware, software, and personnel maintenance. To securely introduce an effective sanitizer and third-party auditor (TPA), two fundamental requirements have to be met: 1) the TPA should be able to audit the cloud data storage efficiently without demanding a local copy of the data and without placing any additional online burden on the cloud user; 2) the third-party auditing process should introduce no new vulnerabilities to user data privacy. In this project, we uniquely combine public auditing protocols with a double encryption approach to achieve a privacy-preserving public cloud data auditing system that supports integrity checking without any leakage of data. To support efficient handling of multiple auditing tasks, we further explore the technique of online signatures to extend our main result to a multi-user setting, where the TPA can perform multiple auditing tasks simultaneously. We implement a double encryption algorithm to encrypt the data twice before it is stored on the cloud server in Electronic Health Record applications.
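The two TPA requirements (audit without a local copy, no privacy leakage) can be sketched with a simple precomputed challenge-response scheme. This is an illustrative stand-in, not the paper's protocol: the owner binds random nonces to the data with HMAC before upload, and the TPA later replays a nonce and compares digests, never seeing the data itself. All function names here are hypothetical:

```python
import hashlib
import hmac
import secrets

def prepare_challenges(data: bytes, n: int) -> list:
    """Owner precomputes n (nonce, expected-response) pairs before upload.

    Each response binds a fresh nonce to the full data, so the server must
    actually hold the data to answer a later challenge correctly.
    """
    pairs = []
    for _ in range(n):
        nonce = secrets.token_bytes(16)
        expected = hmac.new(nonce, data, hashlib.sha256).hexdigest()
        pairs.append((nonce, expected))
    return pairs

def server_respond(stored: bytes, nonce: bytes) -> str:
    """Cloud server answers an audit challenge over whatever it stores."""
    return hmac.new(nonce, stored, hashlib.sha256).hexdigest()

def audit(pairs: list, respond) -> bool:
    """TPA consumes one unused pair; it handles only nonces and digests,
    never the data, so the audit leaks nothing about the content."""
    nonce, expected = pairs.pop()
    return hmac.compare_digest(respond(nonce), expected)

data = b"electronic health record" * 100
pairs = prepare_challenges(data, 3)
assert audit(pairs, lambda n: server_respond(data, n))           # intact
assert not audit(pairs, lambda n: server_respond(data[:-1], n))  # corrupted
```

The limitation of this sketch, a finite supply of challenges, is exactly what homomorphic or signature-based auditing schemes such as the one in this project are designed to remove.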


2020, Vol 26 (10), pp. 3008-3021
Author(s): Jonathan Sarton, Nicolas Courilleau, Yannick Remion, Laurent Lucas

2019, Vol 26, pp. 1-13
Author(s): E. Sánchez, J.H. Castro-Chacón, J.S. Silva, J.B. Hernández-Águila, M. Reyes-Ruiz, ...

2019, Vol 2019, pp. 1-15
Author(s): Hisham A. Kholidy, Abdelkarim Erradi

Cyber-Physical Power Systems (CPPS) have become vital targets for intruders because of the large volume of high-speed heterogeneous data provided by Wide Area Measurement Systems (WAMS). The Non-Nested Generalized Exemplars (NNGE) algorithm is one of the most accurate classification techniques for such CPPS data. However, the NNGE algorithm tends to produce rules that test a large number of input features. This poses problems for large-volume data and hinders the scalability of any detection system. In this paper, we introduce VHDRA, a Vertical and Horizontal Data Reduction Approach, to improve the classification accuracy and speed of the NNGE algorithm and to reduce its computational resource consumption. VHDRA provides the following functionalities: (1) it vertically reduces the dataset features by selecting the most significant features and by reducing the NNGE's hyperrectangles; (2) it horizontally reduces the size of the data while preserving the original key events and patterns within the datasets, using an approach called STEM, the State Tracking and Extraction Method. The experiments show that the overall performance of VHDRA, using both the vertical and the horizontal reduction, reduces the NNGE hyperrectangles by 29.06%, 37.34%, and 26.76% and improves the accuracy of the NNGE by 8.57%, 4.19%, and 3.78% on the Multi-class, Binary-class, and Triple-class datasets, respectively.
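The horizontal-reduction idea of STEM, keeping only samples where the tracked system state changes so that key events survive while redundant measurements are dropped, can be sketched as follows. This is a simplified illustration under assumed field names, not the paper's implementation:

```python
def horizontal_reduce(samples: list, state_keys: list) -> list:
    """Drop consecutive samples whose tracked state fields are unchanged.

    Every state transition (a 'key event') is kept, so attack patterns
    survive even though steady-state measurements are discarded.
    """
    reduced, last_state = [], None
    for sample in samples:
        state = tuple(sample[k] for k in state_keys)
        if state != last_state:
            reduced.append(sample)
            last_state = state
    return reduced

# Hypothetical WAMS-style stream: field names are illustrative only.
stream = [
    {"t": 0, "relay": "closed", "fault": 0},
    {"t": 1, "relay": "closed", "fault": 0},  # unchanged -> dropped
    {"t": 2, "relay": "open",   "fault": 1},  # transition -> kept
    {"t": 3, "relay": "open",   "fault": 1},  # unchanged -> dropped
]
assert [s["t"] for s in horizontal_reduce(stream, ["relay", "fault"])] == [0, 2]
```

Shrinking the training stream this way directly shrinks the exemplar set NNGE must generalize over, which is why the paper reports fewer hyperrectangles alongside the accuracy gains.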

