Techniques for improving stability rate of linear predictive image coding schemes
1988 ◽ Vol 135 (6) ◽ pp. 298
Author(s): M. Andrews ◽ D.T. Nguyen
1991 ◽ Vol 27 (13) ◽ pp. 1126
Author(s): B. Zeng

Author(s): Chunyu Lin ◽ Yao Zhao ◽ Ce Zhu

In this paper, we incorporate Trellis Coded Quantization (TCQ) into a two-stage multiple description coding structure to obtain granular gain over the two-stage multiple description Scalar Quantizer (SQ). Analysis and experiments on a Gaussian source show that the proposed scheme achieves a larger gain than the two-stage SQ scheme, owing to the superior performance of TCQ. When applied to image coding, the proposed scheme is also shown to be more effective than other relevant multiple description image coding schemes in terms of central/side distortion-rate performance.
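As background on how TCQ obtains granular gain, the following is a minimal sketch of a trellis coded quantizer: a Viterbi search over a trellis whose branches are labelled with codebook subsets, so the quantizer effectively behaves like a high-dimensional vector quantizer at near scalar-quantizer complexity. The 4-state trellis, the 8-level uniform codebook, and the MSE criterion below are illustrative assumptions, not the paper's two-stage multiple description configuration.

```python
import numpy as np

# Codebook of 8 uniform levels partitioned into 4 subsets D0..D3 (round-robin).
# Rate here is 2 bits/sample: 1 trellis (branch) bit + 1 bit for the level within a subset.
LEVELS = np.linspace(-3.5, 3.5, 8)
SUBSET = np.arange(8) % 4            # level i belongs to subset i mod 4

# From state s, input bit b selects a branch subset and the next state
# (a simple 4-state shift-register trellis; an assumption for illustration).
NEXT_STATE = [[0, 2], [0, 2], [1, 3], [1, 3]]
BRANCH_SUBSET = [[0, 2], [2, 0], [1, 3], [3, 1]]

def tcq_encode(x):
    """Viterbi search for the minimum-MSE level sequence through the trellis."""
    n = len(x)
    cost = np.full(4, np.inf)
    cost[0] = 0.0                                  # start in state 0
    back = np.zeros((n, 4, 2), dtype=int)          # (previous state, chosen level index)
    for t, sample in enumerate(x):
        new_cost = np.full(4, np.inf)
        for s in range(4):
            if not np.isfinite(cost[s]):
                continue
            for b in (0, 1):
                sub = BRANCH_SUBSET[s][b]
                idx = np.where(SUBSET == sub)[0]   # levels available on this branch
                k = idx[np.argmin((LEVELS[idx] - sample) ** 2)]
                c = cost[s] + (LEVELS[k] - sample) ** 2
                ns = NEXT_STATE[s][b]
                if c < new_cost[ns]:
                    new_cost[ns] = c
                    back[t, ns] = (s, k)
        cost = new_cost
    # Trace back from the best final state to recover the level indices.
    s = int(np.argmin(cost))
    path = []
    for t in range(n - 1, -1, -1):
        s_prev, k = back[t, s]
        path.append(int(k))
        s = int(s_prev)
    return path[::-1]

x = np.random.randn(1000)              # toy Gaussian source
codes = tcq_encode(x)
xq = LEVELS[codes]
print("TCQ MSE:", np.mean((x - xq) ** 2))
```

The key design point is that each sample only chooses among the levels of the subset on the surviving branch, while the Viterbi search trades off choices across neighbouring samples; this is the source of the granular gain over a memoryless scalar quantizer at the same rate.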


Author(s):  
K. Sowmithri

Image coding is effective because it reduces the number of bits required to store and/or transmit image data. Transform-based image coders play a significant role, as they decorrelate the spatial low-level information; such transforms are used in international compression standards such as JPEG, JPEG 2000, MPEG and H.264. The choice of transform is an important issue in all these transform coding schemes, and most of the literature adopts either the Discrete Cosine Transform (DCT) or the Discrete Wavelet Transform (DWT). In the proposed work, the energy preservation of the DCT coefficients is analysed, and a lifting scheme is applied iteratively to downsample these coefficients, compensating for the artifacts that appear in the reconstructed picture and yielding a higher compression ratio. This is followed by scalar quantization and entropy coding, as in JPEG. The performance of the proposed iterative lifting scheme applied to the decorrelated DCT coefficients is measured with the standard Peak Signal-to-Noise Ratio (PSNR), and the results are encouraging.
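To make the pipeline concrete, the following is a minimal sketch of the block-DCT, scalar-quantization and PSNR stages of such a coder. The iterative lifting stage and the entropy coder are omitted, and the 8x8 block size, uniform quantization step and random 8-bit test image are illustrative assumptions, not the author's configuration.

```python
import numpy as np
from scipy.fft import dctn, idctn

def block_dct_quantize(img, block=8, step=16.0):
    """8x8 block DCT, uniform scalar quantization, inverse DCT, and PSNR."""
    h, w = img.shape
    rec = np.zeros((h, w), dtype=float)
    for i in range(0, h, block):
        for j in range(0, w, block):
            blk = img[i:i + block, j:j + block].astype(float)
            coeff = dctn(blk, norm='ortho')            # decorrelate the block
            q = np.round(coeff / step)                 # uniform scalar quantizer
            rec[i:i + block, j:j + block] = idctn(q * step, norm='ortho')
    mse = np.mean((img.astype(float) - rec) ** 2)
    psnr = 10 * np.log10(255.0 ** 2 / mse)             # PSNR for 8-bit images
    return rec, psnr

# Toy 8-bit test image; in practice a standard test image (e.g. Lena) would be loaded.
img = np.random.randint(0, 256, (256, 256))
_, psnr = block_dct_quantize(img)
print(f"PSNR: {psnr:.2f} dB")
```

In the described scheme, a lifting step would additionally be applied to the DCT coefficients before quantization to downsample them; the sketch above only shows the surrounding transform/quantization/PSNR scaffolding.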


2014 ◽ Vol 53 (9) ◽ pp. 093104
Author(s): Yu-Chen Hu ◽ I-Cheng Chang ◽ Kuo-Yu Liu ◽ Che-Lun Hung
