A lossless image compression technique using generic Peano pattern mask tree

Author(s):  
Mohammad Kabir Hossain ◽  
Shams M Imam ◽  
Khondker Shajadul Hasan ◽  
William Perrizo
Author(s):  
T Kavitha ◽  
K. Jayasankar

Compression techniques are adopted to solve various big data problems such as storage and transmission. The growth of the cloud computing and smartphone industries has led to the generation of huge volumes of digital data. Digital data can take various forms such as audio, video, images and documents, and it is generally compressed before being stored in cloud storage environments. Efficient storage and retrieval of digital data through a good compression technique reduces cost. Compression techniques are divided into lossy and lossless categories. Here we consider lossless image compression, where minimizing the number of bits used for encoding improves coding efficiency and yields high compression. Fixed-length coding cannot guarantee a minimal bit length, so variable-length codes with a prefix-free structure are preferred. However, existing compression models incur high computing overhead. To address this issue, this work presents an efficient modified Huffman technique that improves the compression factor by up to 33.44% for bi-level images and 32.578% for half-tone images. The average computation time for encoding and decoding shows an improvement of 20.73% for bi-level images and 28.71% for half-tone images. The proposed work achieves an overall 2% increase in coding efficiency and reduces memory usage by 0.435% for bi-level images and 0.19% for half-tone images. The overall results show that the proposed model can be adopted to support ubiquitous access to digital data.
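The abstract does not spell out the modified Huffman scheme itself, so the following is only a minimal sketch of ordinary Huffman coding in Python, illustrating how variable-length prefix-free codes shorten the encoded bit stream relative to fixed-length coding; the function names (build_huffman_codes, huffman_encode) are hypothetical and are not taken from the paper.

```python
import heapq
from collections import Counter

def build_huffman_codes(data: bytes) -> dict:
    """Build a prefix-free variable-length code table from symbol frequencies."""
    freq = Counter(data)
    # Each heap entry: (frequency, tie-breaker, partial {symbol: code} table)
    heap = [(f, i, {sym: ""}) for i, (sym, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    tie = len(heap)
    if len(heap) == 1:
        # Degenerate single-symbol input: assign a one-bit code
        return {sym: "0" for sym in heap[0][2]}
    while len(heap) > 1:
        f1, _, t1 = heapq.heappop(heap)
        f2, _, t2 = heapq.heappop(heap)
        # Merge the two least frequent subtrees, prepending '0' and '1'
        merged = {s: "0" + c for s, c in t1.items()}
        merged.update({s: "1" + c for s, c in t2.items()})
        heapq.heappush(heap, (f1 + f2, tie, merged))
        tie += 1
    return heap[0][2]

def huffman_encode(data: bytes) -> str:
    codes = build_huffman_codes(data)
    return "".join(codes[b] for b in data)

if __name__ == "__main__":
    sample = b"aaaabbbcc d"
    bits = huffman_encode(sample)
    print(f"{len(sample) * 8} fixed-length bits -> {len(bits)} Huffman bits")
```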


Author(s):  
Hitesh H Vandra

Image compression is used to reduce the bandwidth or storage requirements of image applications. There are mainly two types of image compression: lossy and lossless. Lossy image compression removes some of the source information content along with the redundancy, whereas in lossless image compression the original source data is reconstructed from the compressed data by restoring the removed redundancy, so the reconstructed data is an exact replica of the original. Many algorithms exist for lossless image compression, such as Huffman coding, Rice coding, run-length encoding and LZW. LZW is referred to as a substitution or dictionary-based encoding algorithm. The algorithm builds a data dictionary of patterns occurring in an uncompressed data stream: patterns of data (substrings) are identified in the stream and matched against entries in the dictionary. If a substring is not present in the dictionary, a code phrase is created from the data content of the substring and stored in the dictionary, and the phrase is then written to the compressed output stream. In this paper we examine the effect of the LZW algorithm on the png, jpg, gif and bmp image formats.
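As a rough illustration of the dictionary-based behaviour described above, here is a minimal textbook-style LZW sketch in Python; the names lzw_compress and lzw_decompress are hypothetical and this is not the implementation evaluated in the paper.

```python
def lzw_compress(data: bytes) -> list:
    """Dictionary-based LZW: emit a code for each longest already-known pattern."""
    dictionary = {bytes([i]): i for i in range(256)}  # all single-byte patterns
    next_code = 256
    current = b""
    output = []
    for byte in data:
        candidate = current + bytes([byte])
        if candidate in dictionary:
            current = candidate  # keep extending the match while it is known
        else:
            output.append(dictionary[current])   # emit code for known prefix
            dictionary[candidate] = next_code    # store the new pattern
            next_code += 1
            current = bytes([byte])
    if current:
        output.append(dictionary[current])
    return output

def lzw_decompress(codes: list) -> bytes:
    """Rebuild the dictionary on the fly to invert lzw_compress exactly."""
    dictionary = {i: bytes([i]) for i in range(256)}
    next_code = 256
    previous = dictionary[codes[0]]
    result = bytearray(previous)
    for code in codes[1:]:
        if code in dictionary:
            entry = dictionary[code]
        else:
            # Special case: the code refers to the pattern being defined right now
            entry = previous + previous[:1]
        result.extend(entry)
        dictionary[next_code] = previous + entry[:1]
        next_code += 1
        previous = entry
    return bytes(result)

if __name__ == "__main__":
    raw = b"TOBEORNOTTOBEORTOBEORNOT"
    packed = lzw_compress(raw)
    assert lzw_decompress(packed) == raw
    print(f"{len(raw)} input bytes -> {len(packed)} output codes")
```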


2012 ◽  
Vol 05 (10) ◽  
pp. 752-763 ◽  
Author(s):  
A. Alarabeyyat ◽  
S. Al-Hashemi ◽  
T. Khdour ◽  
M. Hjouj Btoush ◽  
S. Bani-Ahmad ◽  
...  
