Stationary Stochastic Processes and Fractal Data Compression

1997, Vol 07 (03), pp. 551-567
Author(s): Michael F. Barnsley, Anca Deliu, Ruifeng Xie

It is shown that the invariant measure of a stationary nonatomic stochastic process yields an iterated function system with probabilities and an associated dynamical system that provide the basis for optimal lossless data compression algorithms. The theory is illustrated for the case of finite-order Markov processes: for a zero-order process it produces the arithmetic compression method, while for higher-order processes it yields dynamical systems, constructed from piecewise affine mappings of the interval [0, 1] into itself, that may be used to store information efficiently. The theory leads to a new geometrical approach to the development of compression algorithms.
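To make the zero-order case concrete, here is a minimal sketch (not the authors' construction) of the iterated-function-system view of arithmetic coding for a memoryless source: each symbol of probability p is assigned an affine contraction of [0, 1], encoding composes the contractions applied to the unit interval, and decoding iterates the inverse (expanding) dynamical system. The symbol names and probabilities below are illustrative assumptions.

```python
# Minimal sketch: IFS-with-probabilities view of arithmetic coding for a
# zero-order (memoryless) source. Symbol s with probability p gets the affine
# contraction w_s(x) = p * x + c, where c is the cumulative probability of the
# preceding symbols. Encoding composes the maps; decoding applies the inverses.
from typing import Dict, List, Tuple


def build_maps(probs: Dict[str, float]) -> Dict[str, Tuple[float, float]]:
    """Return the affine map (scale, offset) for each symbol."""
    maps, cumulative = {}, 0.0
    for symbol, p in probs.items():
        maps[symbol] = (p, cumulative)      # w_s(x) = p * x + cumulative
        cumulative += p
    return maps


def encode(message: str, maps: Dict[str, Tuple[float, float]]) -> Tuple[float, float]:
    """Compose the contractions; the image of [0, 1] identifies the message."""
    low, width = 0.0, 1.0
    for symbol in message:
        p, c = maps[symbol]
        low, width = low + width * c, width * p
    return low, low + width                 # any point in this interval encodes the message


def decode(x: float, n: int, maps: Dict[str, Tuple[float, float]]) -> str:
    """Iterate the inverse (expanding) map to recover n symbols."""
    out: List[str] = []
    for _ in range(n):
        for symbol, (p, c) in maps.items():
            if c <= x < c + p:
                out.append(symbol)
                x = (x - c) / p             # one step of the associated dynamical system
                break
    return "".join(out)


if __name__ == "__main__":
    maps = build_maps({"a": 0.5, "b": 0.3, "c": 0.2})   # hypothetical zero-order source
    low, high = encode("abac", maps)
    print(decode((low + high) / 2, 4, maps))            # -> "abac"
```

The width of the final interval is the product of the symbol probabilities, so the number of bits needed to single out a point in it approaches the entropy of the source, which is what makes the scheme optimal for a zero-order process.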

2013, Vol 11 (6), pp. 15-19
Author(s): Arup Kumar Bhattacharjee

2016, Vol 78 (6-4)
Author(s): Muhamad Azlan Daud, Muhammad Rezal Kamel Ariffin, S. Kularajasingam, Che Haziqah Che Hussin, Nurliyana Juhan, ...

A new compression algorithm is proposed to make a modified Baptista symmetric cryptosystem, which is based on a chaotic dynamical system, practical. The Baptista symmetric cryptosystem is able to produce different ciphertexts for the same message input. This modified Baptista-type cryptosystem suffers from message expansion, which goes against the conventional methodology of a symmetric cryptosystem. A new lossless data compression algorithm, based on ideas from Huffman coding for data transmission, is proposed. This compression mechanism does not face the problem of mapping elements from a domain that is much larger than its range; the new algorithm circumvents this problem via a pre-defined codeword list. The proposed algorithm has fast encoding and decoding mechanisms and is proven analytically to be a lossless data compression technique.
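As a rough illustration of coding against a pre-defined codeword list (the paper's actual table and its coupling to the Baptista cryptosystem are not reproduced here), the sketch below maps fixed-size input blocks to a hypothetical prefix-free codeword list; prefix-freeness is what makes the greedy decoder lossless.

```python
# Minimal sketch of lossless coding with a pre-defined prefix-free codeword
# list. The 2-bit blocks and the codewords below are hypothetical and chosen
# Huffman-style for a skewed block distribution.
CODEWORDS = {"00": "0", "01": "10", "10": "110", "11": "111"}   # prefix-free
DECODE = {v: k for k, v in CODEWORDS.items()}


def compress(bits: str) -> str:
    """Replace each 2-bit block with its codeword from the pre-defined list."""
    assert len(bits) % 2 == 0
    return "".join(CODEWORDS[bits[i:i + 2]] for i in range(0, len(bits), 2))


def decompress(code: str) -> str:
    """Greedy prefix matching: lossless because no codeword is a prefix of another."""
    out, buf = [], ""
    for bit in code:
        buf += bit
        if buf in DECODE:
            out.append(DECODE[buf])
            buf = ""
    assert buf == "", "trailing bits do not form a codeword"
    return "".join(out)


if __name__ == "__main__":
    msg = "0001101100"
    packed = compress(msg)
    assert decompress(packed) == msg
    print(msg, "->", packed)
```

Compression is achieved only when the short codewords correspond to the most frequent blocks, which is the standard Huffman-coding idea the abstract refers to.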


Author(s): Gody Mostafa, Abdelhalim Zekry, Hatem Zakaria

When transmitting data in digital communication, it is desirable to keep the number of transmitted bits as small as possible, so many techniques are used to compress the data. Lempel-Ziv is one of the most commonly used lossless data compression algorithms. In this paper, a Lempel-Ziv data compression algorithm was implemented in VHDL. The work is devoted to improving the compression rate, space saving, and resource utilization of the Lempel-Ziv algorithm using a systolic-array approach. The developed design is validated with VHDL simulations using Xilinx ISE 14.5 and synthesized on a Virtex-6 FPGA chip. The results show that the design provides high compression rates and space-saving percentages as well as improved utilization. Throughput is increased by 50% and design area is decreased by more than 23%, with a high compression ratio, compared to previous designs.
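The design itself is in VHDL and targets an FPGA; as a language-level reference only, the following is a minimal software sketch of the LZ77-style sliding-window matching that a systolic array parallelizes in hardware, with window and lookahead sizes chosen purely for illustration.

```python
# Minimal software sketch of LZ77 sliding-window matching. In a systolic-array
# hardware design, each window position would be one comparison cell working in
# parallel; here the positions are scanned sequentially.
from typing import List, Optional, Tuple

WINDOW = 32       # search-buffer length (hypothetical)
LOOKAHEAD = 8     # lookahead-buffer length (hypothetical)

Token = Tuple[int, int, Optional[int]]   # (offset, match length, next byte or None)


def lz77_encode(data: bytes) -> List[Token]:
    i, out = 0, []
    while i < len(data):
        best_off, best_len = 0, 0
        for j in range(max(0, i - WINDOW), i):            # candidate match start
            length = 0
            while (length < LOOKAHEAD and i + length < len(data)
                   and data[j + length] == data[i + length]):
                length += 1
            if length > best_len:
                best_off, best_len = i - j, length
        nxt = data[i + best_len] if i + best_len < len(data) else None
        out.append((best_off, best_len, nxt))
        i += best_len + 1
    return out


def lz77_decode(tokens: List[Token]) -> bytes:
    buf = bytearray()
    for off, length, nxt in tokens:
        for _ in range(length):
            buf.append(buf[-off])      # copy from the already-decoded window
        if nxt is not None:
            buf.append(nxt)
    return bytes(buf)


if __name__ == "__main__":
    sample = b"abracadabra abracadabra"
    assert lz77_decode(lz77_encode(sample)) == sample
    print(lz77_encode(sample))
```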


2013, Vol 21 (2), pp. 133-143
Author(s): Hiroyuki Okazaki, Yuichi Futa, Yasunari Shidama

Summary: Huffman coding is one of the most famous entropy encoding methods for lossless data compression [16]. The JPEG and ZIP formats employ variants of Huffman encoding as lossless compression algorithms. Huffman coding is a bijective map from source letters to leaves of the Huffman tree constructed by the algorithm. In this article we formalize an algorithm for constructing a binary code tree, the Huffman tree.
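For readers unfamiliar with the construction being formalized, here is a minimal sketch of the standard Huffman tree algorithm (ordinary Python, not the article's formalization): repeatedly merge the two lightest subtrees, then read each source letter's codeword off the path to its leaf.

```python
# Minimal sketch of Huffman tree construction and codeword extraction.
# The symbol frequencies in the example are illustrative assumptions.
import heapq
from typing import Dict, Optional


class Node:
    def __init__(self, weight: float, symbol: Optional[str] = None,
                 left: "Optional[Node]" = None, right: "Optional[Node]" = None):
        self.weight, self.symbol, self.left, self.right = weight, symbol, left, right

    def __lt__(self, other: "Node") -> bool:   # lets heapq order nodes by weight
        return self.weight < other.weight


def build_huffman_tree(freqs: Dict[str, float]) -> Node:
    """Repeatedly merge the two lightest subtrees until a single tree remains."""
    heap = [Node(w, s) for s, w in freqs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        a, b = heapq.heappop(heap), heapq.heappop(heap)
        heapq.heappush(heap, Node(a.weight + b.weight, left=a, right=b))
    return heap[0]


def codebook(node: Node, prefix: str = "") -> Dict[str, str]:
    """Each source letter maps bijectively to a leaf, i.e. to a prefix-free codeword."""
    if node.symbol is not None:
        return {node.symbol: prefix or "0"}
    table = {}
    table.update(codebook(node.left, prefix + "0"))
    table.update(codebook(node.right, prefix + "1"))
    return table


if __name__ == "__main__":
    print(codebook(build_huffman_tree({"a": 0.45, "b": 0.3, "c": 0.15, "d": 0.1})))
```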

