Prototype of Lossless Audio Codec Compression Using Entropy Encoding

Author(s):  
Andreas Soegandi

The purpose of this study was to perform lossless compression on uncompressed audio files to minimize file size without reducing quality. The application is developed using the entropy encoding compression method with the Rice coding technique. The resulting compression ratio is good, and the application is easy to extend because the algorithm is quite simple.
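
As a concrete reference for the technique named here: Rice coding splits a non-negative value n into a unary quotient n >> k and a k-bit binary remainder. A minimal Python sketch follows; the choice of the Rice parameter k is an assumption, since the abstract does not specify how it is selected.

```python
def rice_encode(n: int, k: int) -> str:
    """Encode n >= 0 as unary(n >> k) + '0' + k-bit remainder."""
    q, r = n >> k, n & ((1 << k) - 1)
    return "1" * q + "0" + format(r, f"0{k}b")

def rice_decode(bits: str, k: int) -> int:
    """Invert rice_encode for a single codeword."""
    q = bits.index("0")                  # unary part: number of leading 1s
    return (q << k) | int(bits[q + 1:q + 1 + k], 2)

# Small values get short codes, e.g. with k = 2:
for n in (0, 3, 9):
    print(n, "->", rice_encode(n, k=2))  # 000, 011, 11001
assert rice_decode("11001", k=2) == 9
```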

Author(s):  
Riyo Oktavianty Finola

Large audio files lengthen data transmission times and waste storage space. Compression is therefore performed to shrink the contents of an audio file into a smaller form. One family of techniques is lossless compression, in which the compressed audio file can be restored to the file as it was before compression, without losing any of the original information. This study applies the interpolative coding algorithm to MP3 audio files. Interpolative coding is an innovative way to assign dynamic codes to data symbols; unlike other compression methods, the code it assigns to an individual symbol is not static. The system design consists of two main processes, compression and decompression, plus a process for calculating performance in terms of Compression Ratio (CR) and Redundancy. Compression produces a new file with the *.ipc extension containing the compressed bit string, which can then be decompressed. The application is designed with a single form that holds both the compression and decompression processes; compression takes an input file with the *.mp3 extension and produces output with the *.ipc extension, and the compressed audio file is smaller than the original.
Keywords: Compression, Audio File, Interpolative coding algorithm
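
The abstract describes interpolative coding only at a high level, so the following is a hedged sketch of classic binary interpolative coding over a sorted list of distinct integers; the paper's adaptation to MP3 byte streams is not detailed and would differ. The fixed-width binary code used here is the simplest variant; a centered minimal binary code would shave off fractional bits.

```python
import math

def encode_range(value: int, lo: int, hi: int, out: list) -> None:
    """Write value - lo in ceil(log2(hi - lo + 1)) bits (0 bits if lo == hi)."""
    span = hi - lo + 1
    if span > 1:
        width = max(1, math.ceil(math.log2(span)))
        out.append(format(value - lo, f"0{width}b"))

def interpolative_encode(xs: list, lo: int, hi: int, out: list) -> None:
    """Recursively encode the middle element, then each half in its sub-range."""
    if not xs:
        return
    mid = len(xs) // 2
    v = xs[mid]
    # v is confined to a narrow interval by its neighbours, which is where
    # the dynamic, per-symbol code lengths come from.
    encode_range(v, lo + mid, hi - (len(xs) - 1 - mid), out)
    interpolative_encode(xs[:mid], lo, v - 1, out)
    interpolative_encode(xs[mid + 1:], v + 1, hi, out)

bits = []
interpolative_encode([2, 9, 12, 14, 20], 0, 31, bits)
print("".join(bits))  # middle elements in narrow ranges cost fewer bits
```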


Author(s):  
Hendra Mesra ◽  
Handayani Tjandrasa ◽  
Chastine Fatichah

In general, compression methods are developed to reduce the redundancy of data. This study takes a different approach: it embeds some bits of one datum in image data into another datum using a Reversible Low Contrast Mapping (RLCM) transformation. Besides using RLCM for embedding, the method also applies the properties of RLCM to compress each datum before it is embedded. The algorithm employs a queue and recursive indexing and encodes the data in a cyclic manner. In contrast to RLCM itself, the proposed method is a coding method, like Huffman coding. The study uses publicly available image data to evaluate the proposed method. For all test images, the proposed method achieves a higher compression ratio than Huffman coding.
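
The abstract does not define the RLCM transform itself; as a hedged illustration only, the sketch below shows the classic reversible contrast mapping (RCM) pair transform that reversible-embedding schemes in this family build on. The authors' "low contrast" variant differs in details not given here.

```python
def rcm_forward(x: int, y: int) -> tuple:
    """Forward transform of a pixel pair; exactly invertible over integers."""
    return 2 * x - y, 2 * y - x

def rcm_inverse(xp: int, yp: int) -> tuple:
    """Recover (x, y) via x = (2x' + y') / 3 and y = (x' + 2y') / 3."""
    return (2 * xp + yp) // 3, (xp + 2 * yp) // 3

pair = (100, 90)
assert rcm_inverse(*rcm_forward(*pair)) == pair
# Real schemes also restrict pairs so transformed values stay inside the
# valid pixel range, and embed payload bits in positions the transform frees.
```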


2016 ◽  
Vol 1 (1) ◽  
Author(s):  
Hadary Mallafi

One of the limitations in the data uploading process is the maximum request length; beyond that, the size of the transferred data is also an issue because it influences the cost of sending it. One way to cope with the maximum request length is to reduce the file size; another is to enlarge the maximum request length itself. Reducing the file size can be done by chunking the file into smaller pieces or by compressing it. In this paper, the author studies a file compression process performed on the client side using AJAX and a web service, combined with file chunking. The compression method used is dictionary based, namely Lempel-Ziv 77 (LZ77), chosen because it can be implemented in AJAX. The analysis covers the compression ratio, data transfer speed, compression time, decompression time, the ability of the compression method to handle the maximum request length, and the combination of compression and chunking during upload. The results show that compression can handle the maximum request length. Based on the experiments, compression ratio and window length are positively correlated: the greater the window length, the higher the compression ratio. Window length and upload time are negatively, linearly correlated: the greater the window length, the shorter the upload time. Decompression time and file size are positively, linearly correlated: the larger the file, the more time decompression takes.
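
A minimal LZ77 encoder sketch is shown below; the (offset, length, next_byte) triple format and the default window length are illustrative assumptions, not the paper's exact implementation.

```python
def lz77_encode(data: bytes, window_len: int = 4096):
    """Emit (offset, length, next_byte) triples over a sliding window."""
    i, out = 0, []
    while i < len(data):
        best_off = best_len = 0
        # Search the window for the longest match with the lookahead buffer.
        for j in range(max(0, i - window_len), i):
            length = 0
            while (i + length < len(data) - 1
                   and data[j + length] == data[i + length]):
                length += 1
            if length > best_len:
                best_off, best_len = i - j, length
        out.append((best_off, best_len, data[i + best_len]))
        i += best_len + 1
    return out

# A longer window can find longer matches, which is consistent with the
# paper's finding that compression ratio grows with window length.
print(lz77_encode(b"abababax"))  # [(0, 0, 97), (0, 0, 98), (2, 5, 120)]
```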


2020 ◽  
Vol 10 (14) ◽  
pp. 4918
Author(s):  
Shaofei Dai ◽  
Wenbo Liu ◽  
Zhengyi Wang ◽  
Kaiyu Li ◽  
Pengfei Zhu ◽  
...  

This paper reports an efficient lossless compression method for periodic signals based on adaptive dictionary predictive coding. Previous approaches to data compression, such as differential pulse code modulation (DPCM), the discrete cosine transform (DCT), the lifting wavelet transform (LWT) and the Karhunen-Loeve transform (KLT), lack a transformation suited to making such data less redundant and better compressed. A new predictive coding approach based on an adaptive dictionary is proposed to improve the compression ratio for periodic signals. The main criterion for lossless compression is the compression ratio (CR). To verify the effectiveness of adaptive dictionary predictive coding for periodic signal compression, it is compared against different transform coding technologies, including DPCM, 2-D DCT, and 2-D LWT. The results show that adaptive dictionary predictive coding can effectively improve data compression efficiency compared with traditional transform coding.
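
The adaptive dictionary itself is not specified in the abstract; as context, here is a minimal sketch of the standard DPCM baseline the paper compares against, using a first-order (previous-sample) predictor.

```python
import numpy as np

def dpcm_encode(signal: np.ndarray) -> np.ndarray:
    """Residuals against the previous sample; small when the signal is smooth."""
    residual = np.empty_like(signal)
    residual[0] = signal[0]                 # first sample is sent verbatim
    residual[1:] = signal[1:] - signal[:-1]
    return residual

def dpcm_decode(residual: np.ndarray) -> np.ndarray:
    """A running sum of the residuals reconstructs the signal exactly."""
    return np.cumsum(residual)

x = np.array([100, 102, 103, 103, 101], dtype=np.int64)
r = dpcm_encode(x)                          # [100, 2, 1, 0, -2]
assert np.array_equal(dpcm_decode(r), x)    # lossless round trip
```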


2003 ◽  
Vol 13 (01) ◽  
pp. 39-45
Author(s):  
AMER AL-NASSIRI

In this paper we present a theoretical evaluation of a data and text compression algorithm based on the Burrows-Wheeler Transform (BWT) and a General Bidirectional Associative Memory (GBAM). A new lossless data and text compression method, based on the combination of the BWT and GBAM approaches, is presented. The algorithm was tested on many texts in different formats (ASCII and RTF). The compression ratio achieved is fairly good, on average 28-36%, and decompression is fast.
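
For reference, the BWT stage permutes the input so that equal characters cluster, which is what makes the downstream coder effective. A minimal, deliberately naive sketch follows; production implementations build the rotation order with a suffix array rather than materializing all rotations.

```python
def bwt_forward(text: str, sentinel: str = "$") -> str:
    """Return the last column of the sorted rotations of text + sentinel."""
    s = text + sentinel      # unique terminator, assumed absent from text
    rotations = sorted(s[i:] + s[:i] for i in range(len(s)))
    return "".join(rot[-1] for rot in rotations)

print(bwt_forward("banana"))  # "annb$aa": like characters end up adjacent
```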


2018 ◽  
Vol 7 (2.31) ◽  
pp. 69 ◽  
Author(s):  
G Murugesan ◽  
Rosario Gilmary

Text files occupy a substantial amount of memory or disk space, and transmitting them across a network consumes a considerable amount of bandwidth. Compression procedures are especially advantageous in telecommunications and information technology because they let devices transmit or store the same amount of data in fewer bits. Text compression techniques segment an English passage by observing patterns and substituting alternative symbols for longer patterns of text. Compression algorithms are used to reduce the storage of copious information and the cost of data storage; compressing significant and massive collections of information can also improve retrieval time. Novel lossless compression algorithms have been introduced to achieve better compression ratios. This work analyzes existing compression mechanisms designed for compressing text files and Deoxyribonucleic acid (DNA) sequence files. Their performance is compared in terms of compression ratio, time taken to compress and decompress the sequence, and file size. In the proposed approach, the input file is converted to DNA format and then a DNA compression procedure is applied.
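
The abstract does not detail the text-to-DNA conversion, so the sketch below only illustrates the payoff of the DNA alphabet: with four symbols, each base fits in 2 bits, a 4:1 reduction over one byte per character. The base-to-bits mapping is an assumption.

```python
CODE = {"A": 0b00, "C": 0b01, "G": 0b10, "T": 0b11}

def pack_dna(seq: str) -> bytes:
    """Pack 4 bases per byte; length assumed to be a multiple of 4."""
    out = bytearray()
    for i in range(0, len(seq), 4):
        b = 0
        for base in seq[i:i + 4]:
            b = (b << 2) | CODE[base]   # shift in 2 bits per base
        out.append(b)
    return bytes(out)

print(pack_dna("ACGTACGT").hex())  # 8 bases -> 2 bytes: "1b1b"
```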


2020 ◽  
Author(s):  
Vinicius Fulber-Garcia ◽  
Sérgio Luis Sardi Mergen

Prediction-based compression methods, like prediction by partial matching, achieve a remarkable compression ratio, especially for texts written in natural language. However, they are not efficient in terms of speed. Part of the problem concerns the use of dynamic entropy encoding, which is considerably slower than the static alternatives. In this paper, we propose a prediction-based compression method that decouples the context model from the frequency model. The separation allows static entropy encoding to be used without significant overhead in the metadata embedded in the compressed data. The result is a reasonably efficient algorithm that is particularly suited to small textual files, as the experiments show. We also show it is relatively easy to build strategies designed to handle specific cases, like the compression of files whose symbols are only locally frequent.
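
The speed argument is easy to make concrete: a static entropy coder fixes one codeword table from global frequencies and reuses it for every symbol, whereas a dynamic coder updates its model after each symbol. A minimal static-Huffman sketch, illustrative only; the paper's context/frequency separation is more elaborate than this.

```python
import heapq
from collections import Counter

def static_huffman_code(data: bytes) -> dict:
    """One pass over the data yields a fixed symbol -> codeword table."""
    counts = Counter(data)
    # Heap items carry a tiebreak index so tuples never compare the dicts.
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(counts.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (f1 + f2, tie, merged))
        tie += 1
    return heap[0][2]

data = b"abracadabra"
table = static_huffman_code(data)           # shipped once as metadata
bits = "".join(table[b] for b in data)      # fixed codewords, no model updates
print(len(bits), "bits vs", 8 * len(data))  # 23 bits vs 88
```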


2019 ◽  
Vol 2019 ◽  
pp. 1-11 ◽  
Author(s):  
Romi Fadillah Rahmat ◽  
T. S. M. Andreas ◽  
Fahmi Fahmi ◽  
Muhammad Fermi Pasha ◽  
Mohammed Yahya Alzahrani ◽  
...  

Compression, in general, aims to reduce file size, with or without decreasing the data quality of the original file. Digital Imaging and Communications in Medicine (DICOM) is a medical imaging file standard used to store information such as patient data, imaging procedures, and the image itself. With the rising use of medical imaging in clinical diagnosis, there is a need for a fast and secure method of sharing large numbers of medical images between healthcare practitioners, and compression has always been an option. This work analyzes the Huffman coding compression method, one of the lossless compression techniques, as an alternative method to compress a DICOM file in open PACS settings. The idea of Huffman coding is to assign codewords with fewer bits to symbols with a higher byte frequency. Experiments using different types of DICOM images are conducted, and an analysis of performance in terms of compression ratio, compression/decompression time, and security is provided. The experimental results show that the Huffman coding technique can compress a DICOM file at up to a 1:3.7010 ratio, for up to 72.98% space savings.
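
The two reported figures describe the same outcome, since space savings = 1 - compressed/original; a quick consistency check:

```python
# Sizes in arbitrary units: a 1:3.7010 compressed-to-original ratio.
original, compressed = 3.7010, 1.0
print(f"{original / compressed:.4f}:1 ratio")    # 3.7010:1
print(f"{1 - compressed / original:.2%} saved")  # 72.98% saved
```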


Author(s):  
Emy Setyaningsih ◽  
Agus Harjoko

Compression reduces the size of data while maintaining the quality of the information contained therein. This paper presents a survey of research papers discussing improvements to various hybrid compression techniques over the last decade. A hybrid compression technique combines the best properties of each group of methods, as is done in the JPEG compression method: it combines lossy and lossless compression to obtain a high compression ratio while maintaining the quality of the reconstructed image. Lossy compression produces a relatively high compression ratio, whereas lossless compression brings high-quality data reconstruction, as the data can later be decompressed to exactly the same form as before compression. Discussion of the state of, and open issues in, ongoing hybrid compression development indicates opportunities for further research to improve the performance of image compression methods.
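
A hedged sketch of the JPEG-style hybrid pipeline described above: a lossy quantization stage discards precision, then a lossless coder removes the remaining redundancy. Here zlib stands in for the entropy coder, and the quantization step q is an illustrative assumption.

```python
import zlib
import numpy as np

def hybrid_compress(block: np.ndarray, q: int = 16) -> bytes:
    quantized = np.round(block / q).astype(np.int8)     # lossy stage
    return zlib.compress(quantized.tobytes())           # lossless stage

def hybrid_decompress(payload: bytes, shape: tuple, q: int = 16) -> np.ndarray:
    quantized = np.frombuffer(zlib.decompress(payload), dtype=np.int8)
    return quantized.reshape(shape).astype(np.int16) * q  # approximate only

block = np.arange(64, dtype=np.int16).reshape(8, 8)
restored = hybrid_decompress(hybrid_compress(block), block.shape)
print(np.abs(restored - block).max(), "<= q/2 max error per sample")
```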

