DESIGN ANALYSIS OF 2-D DWT BASED IMAGE COMPRESSION USING FPGA FOR MEMORY SPEED OPTIMIZER

The wavelet transform has been successfully applied in a number of fields, from pure mathematics to applied science. Numerous studies of the wavelet transform have demonstrated its advantages in image processing and data compression, and it has become an encoding technique in recent data compression standards, alongside multi-resolution decomposition in signal and image processing applications. Pure software implementations of the Discrete Wavelet Transform (DWT), however, are a performance bottleneck in real-time systems. Hardware acceleration of the DWT has therefore become a topic of contemporary research. For image compression using the 2-Dimensional DWT (2D-DWT), two filters are widely used: a high-pass and a low-pass filter. Because the filter coefficients are irrational numbers, they must be approximated by binary fractions. The accuracy and efficiency with which the filter coefficients are rationalized in the implementation affect the compression quality and critical hardware properties such as throughput and power consumption. A high-precision representation ensures good compression performance, but at the expense of increased hardware resources and processing time. Conversely, lower-precision filter coefficients yield smaller, faster hardware, but at the expense of poorer compression performance.
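The precision trade-off described above can be made concrete with a short sketch. The Python below (an illustration, not the paper's FPGA implementation) rounds an irrational DWT filter coefficient to binary fractions of different fractional bit widths; the Haar low-pass coefficient 1/√2 is chosen here purely as an example:

```python
import math

def quantize(coeff, bits):
    """Round a real filter coefficient to a binary fraction with `bits`
    fractional bits, as required when mapping DWT filters onto
    fixed-point hardware datapaths."""
    scale = 1 << bits
    return round(coeff * scale) / scale

# 1/sqrt(2) is the normalized Haar low-pass coefficient; it is
# irrational, so a fixed-point datapath must approximate it.
c = 1 / math.sqrt(2)
errors = {bits: abs(quantize(c, bits) - c) for bits in (4, 8, 12)}
# More fractional bits give a closer approximation, but each extra bit
# widens the multipliers and adders in the filter datapath.
```

At 4 fractional bits the coefficient becomes 11/16 with an error near 0.02; at 8 bits it becomes 181/256 with an error under 10⁻⁴, illustrating why precision directly trades against hardware cost.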

Sensors, 2021, Vol. 21(2), pp. 516
Author(s): Brinnae Bent, Baiying Lu, Juseong Kim, Jessilyn P. Dunn

A critical challenge to using longitudinal wearable sensor biosignal data for healthcare applications and digital biomarker development is the exacerbation of the healthcare “data deluge,” leading to new data storage and organization challenges and costs. Data aggregation, sampling rate minimization, and effective data compression are all methods for consolidating wearable sensor data to reduce data volumes. There has been limited research on appropriate, effective, and efficient data compression methods for biosignal data. Here, we examine the application of different data compression pipelines built using combinations of algorithmic- and encoding-based methods to biosignal data from wearable sensors and explore how these implementations affect data recoverability and storage footprint. Algorithmic methods tested include singular value decomposition, the discrete cosine transform, and the biorthogonal discrete wavelet transform. Encoding methods tested include run-length encoding and Huffman encoding. We apply these methods to common wearable sensor data, including electrocardiogram (ECG), photoplethysmography (PPG), accelerometry, electrodermal activity (EDA), and skin temperature measurements. Of the methods examined in this study, and in line with the characteristics of the different data types, we recommend direct data compression with Huffman encoding for ECG and PPG; singular value decomposition with Huffman encoding for EDA and accelerometry; and the biorthogonal discrete wavelet transform with Huffman encoding for skin temperature, to maximize data recoverability after compression. We also report the best methods for maximizing the compression ratio. Finally, we develop and document open-source code and data for each compression method tested here, which can be accessed through the Digital Biomarker Discovery Pipeline as the “Biosignal Data Compression Toolbox,” an open-source, accessible software platform for compressing biosignal data.
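The Huffman encoding stage recommended above can be sketched in a few lines. The following is a generic Huffman coder in Python (an illustration of the encoding method, not the toolbox's actual code); the toy sample values stand in for quantized biosignal readings:

```python
import heapq
from collections import Counter

def huffman_code(data):
    """Build a Huffman code (symbol -> bitstring) for a sequence of
    quantized sensor samples."""
    freq = Counter(data)
    if len(freq) == 1:                       # degenerate one-symbol stream
        return {next(iter(freq)): "0"}
    codes = {sym: "" for sym in freq}
    # heap entries: (subtree weight, tiebreaker, symbols in subtree)
    heap = [(n, i, [s]) for i, (s, n) in enumerate(freq.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        n1, _, s1 = heapq.heappop(heap)      # two lightest subtrees
        n2, _, s2 = heapq.heappop(heap)
        for s in s1:
            codes[s] = "0" + codes[s]        # left branch
        for s in s2:
            codes[s] = "1" + codes[s]        # right branch
        tie += 1
        heapq.heappush(heap, (n1 + n2, tie, s1 + s2))
    return codes

def encode(data, codes):
    return "".join(codes[s] for s in data)

samples = [1, 1, 1, 1, 2, 2, 3]              # toy quantized biosignal
codes = huffman_code(samples)
bits = encode(samples, codes)
```

Frequent symbols receive shorter codes, so low-entropy biosignal streams (e.g. slowly varying skin temperature) compress well under this scheme.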


2011
Author(s): Egydio C. S. Caria, Trajano A. de A. Costa, João Marcos A. Rebello, Donald O. Thompson, Dale E. Chimenti

Author(s): Mayank Srivastava, Jamshed M Siddiqui, Mohammad Athar Ali

The rapid development of image editing software has resulted in widespread unauthorized duplication of original images. This has given rise to the need for a robust image hashing technique that can easily identify duplicate copies of an original image while distinguishing it from different images. In this paper, we propose an image hashing technique based on the discrete wavelet transform and the Hough transform, which is robust to a large number of image processing attacks, including shifting and shearing. The input image is initially pre-processed to remove minor distortions. The discrete wavelet transform is then applied to the pre-processed image to produce wavelet coefficients, from which edges are detected using a Canny edge detector. The Hough transform is finally applied to the edge-detected image to generate an image hash, which is used for image identification. Experiments show that the proposed hashing technique has better robustness and discrimination performance than state-of-the-art techniques. The normalized average mean value difference is also calculated to show the performance of the proposed technique under various image processing attacks. The proposed copy detection scheme can perform copy detection over large databases and can be considered a prototype for an online real-time copy detection system.
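The core of the hash described above is the Hough voting step, which is what makes the signature tolerant of shifting and shearing. The sketch below is a minimal pure-Python Hough accumulator for lines (only the voting stage, not the authors' full DWT → Canny → Hough pipeline; the toy edge points are assumed for illustration):

```python
import math

def hough_lines(edge_points, width, height, n_theta=180):
    """Minimal Hough transform: each edge pixel votes for every line
    (theta, rho) passing through it; peak bins form a signature that is
    stable under many geometric attacks."""
    # offset rho by the image diagonal so bin indices are non-negative
    diag = int(math.hypot(width, height)) + 1
    acc = {}
    for (x, y) in edge_points:
        for t in range(n_theta):
            theta = math.pi * t / n_theta
            rho = int(round(x * math.cos(theta) + y * math.sin(theta))) + diag
            acc[(t, rho)] = acc.get((t, rho), 0) + 1
    return acc

# toy edge map: pixels along the horizontal line y = 2 in a 10x10 image
points = [(x, 2) for x in range(10)]
acc = hough_lines(points, 10, 10)
best = max(acc.values())
```

All ten collinear points vote into the bin at theta = 90° and rho = 2, so the accumulator's peak directly encodes the dominant edge; a hash built from such peaks survives pixel-level perturbations that would break a bitwise comparison.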


Author(s): Latha Parameswaran, K Anbumani

This chapter discusses a content-based authentication technique based on the inter-coefficient relationships of the Discrete Wavelet Transform (DWT). A watermark is generated from the first-level DWT, and an image digest (a binary string) is generated from the second-level DWT. The watermark is embedded in the mid-frequency coefficients of the first-level DWT as directed by the image digest. Image authentication is done by computing the Completeness of Signature. The proposed scheme withstands incidental image processing operations such as compression and identifies any malicious tampering of the host image.
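Embedding bits into DWT coefficients can be illustrated with a small sketch. The Python below computes one level of a 1-D Haar DWT and embeds watermark bits into the detail coefficients using a generic quantization-index-modulation rule (an illustrative choice; the chapter's exact digest-directed embedding rule is not specified here, and the sample row and step size are assumptions):

```python
def haar_step(row):
    """One level of the unnormalized 1-D Haar DWT: pairwise averages
    (approximation) and pairwise differences (detail)."""
    avg = [(row[i] + row[i + 1]) / 2 for i in range(0, len(row), 2)]
    det = [(row[i] - row[i + 1]) / 2 for i in range(0, len(row), 2)]
    return avg, det

def embed_bit(coeff, bit, step=4.0):
    """Force the coefficient into an even (bit 0) or odd (bit 1)
    quantizer cell -- a generic QIM embedding, used here only to
    illustrate coefficient-domain watermarking."""
    q = round(coeff / step)
    if q % 2 != bit:
        q += 1
    return q * step

def extract_bit(coeff, step=4.0):
    """Recover the embedded bit from the coefficient's quantizer cell."""
    return round(coeff / step) % 2

# embed four watermark bits into the detail band of a toy pixel row
avg, det = haar_step([10, 6, 9, 5, 7, 7, 8, 4])
marked = [embed_bit(c, b) for c, b in zip(det, [1, 0, 1, 0])]
recovered = [extract_bit(c) for c in marked]
```

Because each bit is carried by which quantizer cell a coefficient falls in, small incidental distortions (e.g. mild compression) leave the bit recoverable, while large malicious edits flip cells and break the signature.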

