Lossless compression of Fluoroscopy medical images using correlation and the combination of Run-length and Huffman coding

Author(s):  
Arif Sameh Arif ◽  
Sarina Mansor ◽  
Hezrul Abdul Karim ◽  
Rajasvaran Logeswaran


2018 ◽  
Vol 12 (11) ◽  
pp. 387
Author(s):  
Evon Abu-Taieh ◽  
Issam AlHadid

Multimedia is a highly competitive field, and one of the properties on which it competes is the speed of downloading and uploading multimedia elements: text, sound, pictures, and animation. This paper presents the CRUSH algorithm, a lossless compression algorithm that can be used to compress files. The CRUSH method is fast and simple, with time complexity O(n), where n is the number of elements being compressed. Furthermore, the compressed file is independent of the algorithm and carries no unnecessary data structures. The paper compares CRUSH with other compression algorithms, including Shannon-Fano coding, Huffman coding, Run-Length Encoding (RLE), Arithmetic Coding, Lempel-Ziv-Welch (LZW), the Burrows-Wheeler Transform, the Move-to-Front (MTF) Transform, Haar, wavelet tree, Delta Encoding, Rice and Golomb Coding, Tunstall coding, the DEFLATE algorithm, and Run-Length Golomb-Rice (RLGR).
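The internals of CRUSH itself are not described in this abstract, so the sketch below instead shows run-length encoding, one of the baseline schemes it is compared against, to illustrate the kind of single-pass O(n) lossless coding under discussion. The function names and sample input are illustrative assumptions, not taken from the paper.

```python
def rle_encode(data: bytes) -> list[tuple[int, int]]:
    """Encode a byte string as (value, run_length) pairs in a single O(n) pass."""
    runs = []
    i = 0
    while i < len(data):
        j = i
        while j < len(data) and data[j] == data[i]:
            j += 1                      # extend the current run
        runs.append((data[i], j - i))
        i = j
    return runs

def rle_decode(runs: list[tuple[int, int]]) -> bytes:
    """Invert rle_encode, recovering the original bytes exactly (lossless)."""
    return b"".join(bytes([value]) * length for value, length in runs)

if __name__ == "__main__":
    sample = b"aaaabbbcccccd"
    runs = rle_encode(sample)
    assert rle_decode(runs) == sample   # round trip loses nothing
    print(runs)                         # [(97, 4), (98, 3), (99, 5), (100, 1)]
```

RLE only pays off when the input contains long runs of identical symbols, which is why the comparison papers below pair it with, or contrast it against, entropy coders such as Huffman coding.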


2020 ◽  
Vol 14 ◽  

Lossless compression is crucial for the remote transmission of large-scale medical images and for retaining complete medical diagnostic information. A lossless compression method for medical images based on the differential probability of the image is proposed in this study. The medical image in DICOM format was decorrelated by the differential method, and the difference matrix was optimally coded by the Huffman coding method to obtain the optimal compression effect. Experimental results obtained using the new method were compared with those using the Lempel-Ziv-Welch, modified run-length encoding, and block-bit allocation methods to verify its effectiveness. For 2-D medical images, the lossless compression effect of the proposed method is the best when the object region covers more than 20% of the image. For 3-D medical images, the proposed method has the highest compression ratio among the compared methods. The proposed method can be directly used for lossless compression of DICOM images.
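As a hedged illustration of the decorrelate-then-entropy-code pipeline described above, the sketch below takes row-wise pixel differences of a toy integer image and builds Huffman code lengths for the difference values. The function names, the toy image, and the simple row-wise differencing are assumptions for illustration only; they do not reproduce the paper's exact DICOM handling or coding details.

```python
import heapq
from collections import Counter

def row_differences(image):
    """Replace each pixel by its difference from the previous pixel in the row
    (the first pixel is differenced against 0); differences cluster near zero,
    which is what makes the subsequent entropy coding effective."""
    diff_rows = []
    for row in image:
        prev, out = 0, []
        for px in row:
            out.append(px - prev)
            prev = px
        diff_rows.append(out)
    return diff_rows

def huffman_code_lengths(symbols):
    """Return {symbol: code length in bits} via a heap-based Huffman construction;
    code lengths alone are enough to estimate the compressed size."""
    freq = Counter(symbols)
    if len(freq) == 1:                       # degenerate single-symbol input
        return {next(iter(freq)): 1}
    heap = [(count, i, {sym: 0}) for i, (sym, count) in enumerate(freq.items())]
    heapq.heapify(heap)
    tie = len(heap)                          # tie-breaker so dicts are never compared
    while len(heap) > 1:
        c1, _, d1 = heapq.heappop(heap)
        c2, _, d2 = heapq.heappop(heap)
        merged = {s: depth + 1 for s, depth in {**d1, **d2}.items()}
        heapq.heappush(heap, (c1 + c2, tie, merged))
        tie += 1
    return heap[0][2]

if __name__ == "__main__":
    image = [[10, 10, 11, 12],               # tiny toy "image", stand-in for DICOM data
             [10, 11, 11, 11]]
    diffs = [d for row in row_differences(image) for d in row]
    lengths = huffman_code_lengths(diffs)
    bits = sum(lengths[d] for d in diffs)
    print(diffs)                             # small values clustered around 0 and 1
    print(f"estimated compressed size: {bits} bits")
```

The differencing step concentrates the symbol distribution around zero, so the Huffman coder assigns short codes to the frequent small differences; decoding simply reverses both steps, so no information is lost.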


Author(s):  
Ida Bagus Gede Anandita ◽  
I Gede Aris Gunadi ◽  
Gede Indrawan

Technological progress in the medical field means that medical images such as X-rays are stored as digital files. Medical image files are relatively large, so the images need to be compressed. Lossless compression is an image compression technique in which the decompression result is identical to the original, i.e., no information is lost in the compression process. Existing lossless compression algorithms include Run-Length Encoding (RLE), Huffman coding, and Lempel-Ziv-Welch (LZW). This study compared the performance of the three algorithms in compressing medical images. The decompressed images were assessed objectively in terms of compression ratio, compression time, MSE (Mean Squared Error), and PSNR (Peak Signal-to-Noise Ratio). MSE and PSNR, which are quantitative image-quality measures, were complemented by a subjective assessment in which three experts compared the original images with the decompressed images. Based on the objective assessment, the RLE algorithm showed the best compression performance, yielding a ratio, time, MSE, and PSNR of 86.92%, 3.11 ms, 0, and 0 dB, respectively. For Huffman, the corresponding results were 12.26%, 96.94 ms, 0, and 0 dB, while LZW yielded -63.79%, 160 ms, 0.3, and 58.955 dB. In the subjective assessment, the experts agreed that all images could be analyzed well.
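The objective metrics named above can be sketched as follows; the helper names and toy values are illustrative assumptions only. Note that for a truly lossless codec the decompressed image equals the original, so MSE is 0 and PSNR is strictly infinite; the 0 dB figures above are quoted as reported in the original study.

```python
import math

def compression_ratio_percent(original_size: int, compressed_size: int) -> float:
    """Space saving in percent: positive means the file shrank, negative
    (as in the LZW figure quoted above) means it grew."""
    return (1 - compressed_size / original_size) * 100

def mse(original, decompressed) -> float:
    """Mean squared error over all pixels of two equally sized images
    (given as nested lists of pixel values)."""
    n = sum(len(row) for row in original)
    return sum((a - b) ** 2
               for row_o, row_d in zip(original, decompressed)
               for a, b in zip(row_o, row_d)) / n

def psnr(original, decompressed, max_value: int = 255) -> float:
    """Peak signal-to-noise ratio in dB; infinite when the images are identical,
    as they are for any truly lossless codec."""
    error = mse(original, decompressed)
    return math.inf if error == 0 else 10 * math.log10(max_value ** 2 / error)

if __name__ == "__main__":
    img = [[10, 20], [30, 40]]
    print(compression_ratio_percent(1000, 131))   # ~86.9 % space saving
    print(mse(img, img), psnr(img, img))          # 0.0 inf  (lossless case)
```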

