data compression
Recently Published Documents

Total documents: 4838 (five years: 661)
H-index: 79 (five years: 9)

Author(s): Raveendra Gudodagi, Rayapur Venkata Siva Reddy, Mohammed Riyaz Ahmed

Owing to the substantial volume of human genome sequence data files (ranging from 30 to 200 GB), genomic data compression has received considerable traction, and storage costs are one of the major problems faced by genomics laboratories. This involves modern data compression technology that affects not only the storage requirements but also the reliability of the operation. There have been few attempts to solve this problem independently of both hardware and software. A systematic analysis of associations between genes provides techniques for recognizing functional connections among genes and their respective products, as well as insights into essential biological events that are most important for understanding health and disease phenotypes. This research proposes a reliable and efficient deep learning system that learns embedded projections combining gene interactions and gene expression, and compares the resulting deep embeddings to strong baselines. In this paper we perform data processing operations, predict gene function, reconstruct gene ontology, and predict gene interactions. The three major steps of genomic data compression are data extraction, data storage, and data retrieval. Hence, we propose a deep learning approach based on computational optimization techniques that is efficient in all three stages of data compression.
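The abstract names extraction, storage, and retrieval as the three stages but does not spell them out. The following minimal Python sketch illustrates that round trip using simple 2-bit nucleotide packing as a stand-in, not the deep-learning codec proposed in the paper; the sequence and helper names are hypothetical.

```python
import numpy as np

# Hypothetical illustration of the extract / store / retrieve stages using
# 2-bit packing of nucleotides (A, C, G, T), not the paper's method.
_CODE = {"A": 0, "C": 1, "G": 2, "T": 3}
_BASE = "ACGT"

def store(sequence: str) -> bytes:
    """Extract the nucleotides and pack each one into 2 bits for storage."""
    codes = np.array([_CODE[b] for b in sequence], dtype=np.uint8)
    bits = ((codes[:, None] >> np.array([1, 0])) & 1).astype(np.uint8)
    return np.packbits(bits.ravel()).tobytes()

def retrieve(packed: bytes, length: int) -> str:
    """Unpack the 2-bit codes back into the original sequence."""
    bits = np.unpackbits(np.frombuffer(packed, dtype=np.uint8))[: 2 * length]
    codes = bits.reshape(-1, 2) @ np.array([2, 1])
    return "".join(_BASE[int(c)] for c in codes)

if __name__ == "__main__":
    seq = "ACGTACGGTTAC"
    assert retrieve(store(seq), len(seq)) == seq  # lossless round trip
```

Even this naive packing cuts storage to a quarter of one-byte-per-base text, which is the baseline any learned codec would have to beat.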


Author(s): Huda Kadhim Tayyeh, Ahmed Sabah Ahmed AL-Jumaili

Steganography is an information-hiding technique related to cryptography, in which secret information is concealed within multimedia files such as images and videos. It offers a way of exchanging secret and encrypted information through an inconspicuous channel in which only the communicating parties can interpret the hidden message. The literature has shown great interest in the least significant bit (LSB) technique, which embeds the secret message bits into the least significant bits of the image pixels. Although LSB has shown stable performance in image steganography, more work is needed on the message part. This paper proposes a combination of LSB and the Deflate compression algorithm for image steganography; Deflate utilizes both LZ77 and Huffman coding. After compressing the message text, LSB is applied to embed the compressed text within the cover image. Using benchmark images, the proposed method outperformed the state of the art, demonstrating the efficacy of applying Deflate compression prior to LSB embedding.
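As a rough illustration of the Deflate-then-LSB pipeline described above (not the authors' exact implementation), the sketch below compresses a message with zlib, which implements Deflate's LZ77 + Huffman coding, and hides the resulting bits, prefixed by a 32-bit length header of our own choosing, in the least significant bits of a grayscale NumPy image.

```python
import zlib
import numpy as np

def embed_lsb(cover: np.ndarray, message: bytes) -> np.ndarray:
    """Deflate-compress the message, then hide a 4-byte length header plus
    the compressed bytes in the LSBs of the flattened cover image."""
    payload = zlib.compress(message, level=9)
    header = len(payload).to_bytes(4, "big")
    bits = np.unpackbits(np.frombuffer(header + payload, dtype=np.uint8))
    stego = cover.flatten()  # flatten() returns a copy of the pixels
    if bits.size > stego.size:
        raise ValueError("cover image too small for this payload")
    stego[:bits.size] = (stego[:bits.size] & 0xFE) | bits  # overwrite LSBs
    return stego.reshape(cover.shape)

def extract_lsb(stego: np.ndarray) -> bytes:
    """Recover the LSB bitstream, read the length header, and inflate."""
    lsbs = (stego.flatten() & 1).astype(np.uint8)
    n = int.from_bytes(np.packbits(lsbs[:32]).tobytes(), "big")
    payload = np.packbits(lsbs[32:32 + 8 * n]).tobytes()
    return zlib.decompress(payload)

if __name__ == "__main__":
    cover = np.random.randint(0, 256, (256, 256), dtype=np.uint8)
    secret = b"meet at dawn" * 20
    assert extract_lsb(embed_lsb(cover, secret)) == secret
```

Compressing first shortens the embedded payload, so fewer pixels are modified for the same message, which is the efficiency argument the abstract makes.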


2022
Author(s): Ahmed Taloba, Mohamed Ahmed Fouly, Taysir Soliman

Abstract. Distributed computing involves storing data with third-party storage and being able to access this information from anywhere at any time. Due to the advancement of distributed computing and databases, highly critical data are placed in databases. However, when the information is stored in outsourced services such as Database as a Service (DaaS), security issues arise on both the server and client sides. In addition, query processing on the database by different clients, through time-consuming methods in a shared-resource environment, may cause inefficient data processing and retrieval. Secure and efficient data retrieval can be achieved with the help of an efficient data processing algorithm shared among different clients. This work proposes an Efficient Secure Query Processing Algorithm (ESQPA) that processes queries efficiently by applying data compression before sending the encrypted results from the server to clients. We address security issues by securing the data at the server side with an encrypted database using CryptDB. Encryption techniques have recently been proposed to provide clients with confidentiality in cloud storage; CryptDB allows queries to be processed over encrypted data without decryption. To analyze the performance of ESQPA, it is compared with the current query processing algorithm in CryptDB. The results show that ESQPA requires less storage space, saving up to 63%.
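The abstract does not detail ESQPA's internals; the sketch below only illustrates the surrounding idea of compressing a serialized result set before it leaves the server, assuming the rows have already been encrypted by CryptDB. The function names and the hex/JSON framing are assumptions made for the example.

```python
import json
import zlib

def pack_results(encrypted_rows: list) -> bytes:
    """Server side (illustrative): serialize already-encrypted result rows
    and Deflate-compress the blob before transmission to the client."""
    blob = json.dumps([row.hex() for row in encrypted_rows]).encode()
    return zlib.compress(blob, level=9)

def unpack_results(payload: bytes) -> list:
    """Client side (illustrative): decompress and deserialize; decryption
    of the individual rows would happen afterwards."""
    rows = json.loads(zlib.decompress(payload).decode())
    return [bytes.fromhex(r) for r in rows]

if __name__ == "__main__":
    fake_rows = [b"\x01\x02\x03" * 10, b"\x01\x02\x03" * 10]  # placeholder ciphertexts
    assert unpack_results(pack_results(fake_rows)) == fake_rows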


2022, Vol 2022, pp. 1-10
Author(s): Parameshwaran Ramalingam, Abolfazl Mehbodniya, Julian L. Webber, Mohammad Shabaz, Lakshminarayanan Gopalakrishnan

Telemetric information is large in size, requiring additional storage space and transmission time, which poses a significant obstacle to storing or sending telemetric data. Lossless data compression (LDC) algorithms have evolved to process telemetric data effectively and efficiently, with a high compression ratio and a short processing time. Telemetric information can be compressed to limit the required storage space and network bandwidth. Although various studies on the compression of telemetric information have been conducted, the nature of telemetric information makes compression extremely difficult. The purpose of this study is to offer a subsampled and balanced recurrent neural lossless data compression (SB-RNLDC) approach for increasing the compression rate while decreasing the compression time. This is accomplished through the development of two models: one for subsampled, averaged telemetry data preprocessing and another for balanced recurrent neural lossless data compression (BRN-LDC). Subsampling and averaging are conducted at the preprocessing stage using an adjustable sampling factor. A balanced compression interval (BCI) is used to encode the data depending on the probability measurement during the LDC stage. This work also compares differential compression techniques directly. The results demonstrate that balancing-based LDC can reduce compression time and improve dependability, and the experiments show that the proposed model enhances data compression performance compared with existing methodologies.
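As a minimal sketch of the preprocessing stage only, assuming the adjustable sampling factor is an integer block size, consecutive telemetry samples can be grouped and replaced by their block means; the BRN-LDC encoder itself is not reproduced here.

```python
import numpy as np

def subsample_average(signal: np.ndarray, factor: int) -> np.ndarray:
    """Group consecutive samples into blocks of `factor` and replace each
    block by its mean (illustrative subsample-and-average preprocessing)."""
    trimmed = signal[: len(signal) // factor * factor]  # drop the ragged tail
    return trimmed.reshape(-1, factor).mean(axis=1)

if __name__ == "__main__":
    x = np.arange(10, dtype=float)
    print(subsample_average(x, 2))  # [0.5 2.5 4.5 6.5 8.5]
```

A larger factor shrinks the sequence the recurrent model must encode, trading temporal resolution for speed, which matches the stated goal of cutting compression time.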


2022
Author(s): Hassan Noura, Joseph Azar, Ola Salman, Raphaël Couturier, Kamel Mazouzi

2022, Vol 355, pp. 03003
Author(s): Jianxin Chen, Pengcheng Wang, Xinzhuo Ren, Haojie Meng, Yinfei Xu, ...

The operating state of the switch cabinet is significant for the reliability of the whole power system, and collecting and monitoring its data through a wireless sensor network is an effective way to avoid accidents. This paper proposes a data compression method based on a periodic transmission model, designed for the limited energy and memory resources available in the complex environment of switch cabinet sensor networks. The proposed method is presented rigorously and intuitively through a theoretical derivation and an algorithm flow chart. Finally, numerical simulations are carried out and compared with the original data. The comparisons of compression ratio and error indicate that the improved algorithm performs better on periodic sensing data with interference and preserves the data's trend by maintaining the timing sequence.
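The paper's exact periodic transmission model is not given in the abstract; the following sketch only illustrates the general idea of keeping one reference period and transmitting the samples that deviate from it beyond a tolerance. The function names and the threshold rule are assumptions.

```python
import numpy as np

def compress_periodic(signal, period, tol):
    """Illustrative periodic-model compression: store one reference period
    and, for later samples, keep only (index, value) pairs that differ from
    the template by more than `tol`."""
    template = np.asarray(signal[:period], dtype=float)
    deviations = [(i, float(x)) for i, x in enumerate(signal[period:], start=period)
                  if abs(x - template[i % period]) > tol]
    return template, deviations

def decompress_periodic(template, deviations, length):
    """Rebuild the signal by tiling the template and patching deviations."""
    period = len(template)
    rec = np.array([template[i % period] for i in range(length)])
    for i, x in deviations:
        rec[i] = x
    return rec
```

With clean periodic data only the template is stored; interference shows up as a short list of exceptions, so the timing sequence and the overall trend of the signal are preserved in the reconstruction.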


2022, Vol 1 (1), pp. 1
Author(s): Rodrigo Da Rosa, Cristiano André Costa, Igor De Nardin, Thiago Lopes

Sensors, 2021, Vol 22 (1), pp. 306
Author(s): Jyrki Kullaa

Structural health monitoring (SHM) with a dense sensor network and repeated vibration measurements produces large amounts of data that must be stored. If the sensor network is redundant, data compression is possible by storing the signals of selected Bayesian virtual sensors only, from which the omitted signals can be reconstructed with higher accuracy than the actual measurement. The selection of the virtual sensors for storage is done individually for each measurement based on the reconstruction accuracy. Data compression and reconstruction for SHM are the main novelty of this paper. The stored and reconstructed signals are used for damage detection and localization in the time domain using spatial or spatiotemporal correlation. A whitening transformation is applied to the training data to take environmental or operational influences into account. The first principal component of the residuals is used to localize damage and to design the extreme value statistics control chart for damage detection. The proposed method was studied with a numerical model of a frame structure with a dense accelerometer or strain sensor network. Only five acceleration or three strain signals out of the total of 59 signals were stored. The stored and reconstructed data outperformed the raw measurement data in damage detection and localization.
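As a simplified stand-in for the Bayesian virtual-sensor reconstruction (not the paper's exact formulation), a linear minimum-mean-square-error estimator fitted on training data can reconstruct the omitted channels from the few stored ones; the function names and indexing convention here are assumptions.

```python
import numpy as np

def fit_reconstructor(training: np.ndarray, keep_idx):
    """Fit a linear MMSE map from the kept channels to all channels using
    the training mean and covariance (training: samples x channels)."""
    mu = training.mean(axis=0)
    cov = np.cov(training, rowvar=False)
    c_all_keep = cov[:, keep_idx]                      # cross-covariance
    c_keep_keep = cov[np.ix_(keep_idx, keep_idx)]      # kept-channel covariance
    weights = c_all_keep @ np.linalg.pinv(c_keep_keep)
    return mu, weights

def reconstruct(stored: np.ndarray, mu, weights, keep_idx):
    """Estimate all channels from the stored subset (stored: samples x kept)."""
    return mu + (stored - mu[keep_idx]) @ weights.T
```

The compression comes from storing only the columns indexed by keep_idx (e.g. five of 59 channels) per measurement; the remaining channels are regenerated on demand from the redundancy captured in the training covariance.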

