Modified Firefly Algorithm for Vector Quantization Codebook Design in Image Compression

In recent years, the importance of image compression techniques has grown rapidly due to the massive volumes of data that must be stored or transmitted. Numerous approaches represent images in compact form by discarding redundant pixel information. Vector quantization (VQ) is an effective image compression method, and constructing the quantization table is a critical step: both the compression performance and the quality of the reconstructed data depend on this table, which is a matrix of 64 integers. Quantization table selection is a complex combinatorial problem that can be addressed with evolutionary algorithms (EA), which have become popular for solving real-world problems in a reasonable amount of time. This chapter introduces a Firefly (FF) algorithm combined with Teaching-Learning-Based Optimization (TLBO), termed the FF-TLBO algorithm, for quantization table selection, and a Firefly algorithm combined with a Tumbling operator, termed the FF-Tumbling algorithm, for search-space selection. Because the FF algorithm struggles when brighter fireflies are insignificant, TLBO is integrated to overcome this weakness, while the Tumbling operator drives the algorithm to explore the solution space in all directions. The algorithm determines the best fitness value for every block as the local best, and the best fitness value for the entire image as the global best. Once these values are found, compression proceeds with efficient coding schemes such as Run-Length Encoding and Huffman coding. The proposed FF-TLBO and FF-Tumbling algorithms are evaluated against the existing FF algorithm on the same set of benchmark images in terms of Mean Square Error (MSE), Peak Signal-to-Noise Ratio (PSNR), and Signal-to-Noise Ratio (SNR). The results confirm the superior performance of the FF-TLBO and FF-Tumbling algorithms over the FF algorithm and make them highly useful for real-time applications.
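The chapter's exact FF-TLBO hybridization is not given in this abstract; the sketch below only shows the canonical firefly move that both variants build on, assuming a real-valued encoding of the quantization table and a caller-supplied fitness function (both hypothetical placeholders):

    import numpy as np

    def firefly_step(pop, fitness, beta0=1.0, gamma=0.01, alpha=0.2, rng=None):
        """One synchronous iteration of the canonical firefly move: each firefly
        drifts toward every brighter one, with attractiveness decaying as
        exp(-gamma * r^2), plus a small random walk."""
        rng = np.random.default_rng() if rng is None else rng
        brightness = np.array([fitness(x) for x in pop])  # higher = brighter
        new_pop = pop.copy()
        for i in range(len(pop)):
            for j in range(len(pop)):
                if brightness[j] > brightness[i]:
                    r2 = np.sum((pop[j] - pop[i]) ** 2)
                    beta = beta0 * np.exp(-gamma * r2)
                    new_pop[i] += beta * (pop[j] - pop[i]) \
                                  + alpha * (rng.random(pop[i].shape) - 0.5)
        return new_pop

In the FF-TLBO variant, the teacher and learner phases of TLBO would take over when no brighter firefly exists to attract a given one; the abstract does not specify the exact form of that hybridization.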

Author(s):  
Sanjith Sathya Joseph ◽  
R. Ganesan

Image compression is the process of reducing the size of a file without degrading the quality of the image to a level unacceptable to the Human Visual System. The reduction in file size allows us to store more data in less memory and speeds up transmission over low-bandwidth links; in the case of satellite images, it also reduces the time required for an image to reach the ground station. Compression therefore plays an important role in speeding up the transmission of remote sensing images. This paper presents a coding scheme for satellite images using Vector Quantization (VQ), a well-known technique for signal compression that generalizes scalar quantization. The given satellite image is compressed with the VCDemo software by creating codebooks for vector quantization, and the quality of the compressed and decompressed image is assessed using Mean Square Error, Signal-to-Noise Ratio, and Peak Signal-to-Noise Ratio values.
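VCDemo's internals are not described in the abstract; the sketch below only illustrates the generic VQ encode/decode cycle it automates, plus the PSNR metric used for evaluation. It assumes a codebook has already been trained (e.g., with k-means) and that the image has been split into flattened blocks; all names are illustrative:

    import numpy as np

    def vq_encode(blocks, codebook):
        # index of the nearest codeword (Euclidean distance) for each block
        d = ((blocks[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
        return d.argmin(axis=1)

    def vq_decode(indices, codebook):
        # reconstruction: replace each block by its codeword
        return codebook[indices]

    def psnr(original, reconstructed, peak=255.0):
        mse = np.mean((original.astype(float) - reconstructed.astype(float)) ** 2)
        return 10 * np.log10(peak ** 2 / mse)

Only the indices (and the shared codebook) need to be transmitted, which is where the compression comes from: a 16-pixel block collapses to a single codebook index.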


Author(s):  
Anusorn Jitkam ◽  
Satra Wongthanavasu

This research presents an image compression algorithm that combines a modified Haar wavelet with vector quantization. For comparison, a standard Haar wavelet with vector quantization and the SPIHT wavelet coder are evaluated against the proposed method using Peak Signal-to-Noise Ratio (PSNR). The proposed method shows better results on average than the compared methods.
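The paper's modification to the Haar wavelet is not described in this abstract; for reference, a minimal sketch of the standard single-level 2D Haar decomposition that such methods start from (array dimensions assumed even):

    import numpy as np

    def haar2d_level(img):
        """One level of the standard 2D Haar transform: returns the
        approximation (LL) and detail (LH, HL, HH) subbands."""
        img = img.astype(float)
        # 1D Haar along rows: average and difference of adjacent pixels
        lo = (img[:, 0::2] + img[:, 1::2]) / 2
        hi = (img[:, 0::2] - img[:, 1::2]) / 2
        # then the same along columns
        ll = (lo[0::2, :] + lo[1::2, :]) / 2
        lh = (lo[0::2, :] - lo[1::2, :]) / 2
        hl = (hi[0::2, :] + hi[1::2, :]) / 2
        hh = (hi[0::2, :] - hi[1::2, :]) / 2
        return ll, lh, hl, hh

In a wavelet-plus-VQ scheme, the detail subbands are typically what gets vector-quantized, since most of their coefficients are near zero and quantize cheaply.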


2021 ◽  
Vol 9 ◽  
Author(s):  
Zahra Sobhani ◽  
Yunlong Luo ◽  
Christopher T. Gibson ◽  
Youhong Tang ◽  
Ravi Naidu ◽  
...  

As an emerging contaminant, microplastic is receiving increasing attention. However, its contamination sources are not fully known, and new sources are still being identified. Here we report that microplastics can be found in our gardens, whether from plastic bubble wrap mistakenly left mixed with mulch or from plastic landscape fabric used in the mulch bed. Initially these items were large (> 5 mm), but after 7 years in the garden, natural degradation, weathering, and abrasion release microplastics. Using filters no more specialized than common kitchenware, we sorted the plastic fragments into size groups of 5 mm–0.75 mm, 0.75 mm–100 μm, and 100 μm–0.8 μm, meaning anyone can collect microplastics from their own garden. We then characterized the plastics using Raman image mapping together with a logic-based algorithm to increase the signal-to-noise ratio and the image certainty: statistically, the signal-to-noise ratio from a single Raman spectrum, or even from an individual peak, is far lower than that obtained from the full spectrum matrix of a Raman map (1 spectrum vs. a 50 × 50 map containing 2,500 spectra). From the 10 g of soil we sampled, we detected microplastics including both large (5 mm–100 μm) and small (< 100 μm) fragments, indicating the degradation fate of plastics in gardens. Overall, these results warn us to be careful when gardening, including in the selection of plastic items for gardens.
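The study's logic-based algorithm is not reproduced here; the sketch below only demonstrates the underlying statistical point, that averaging the N spectra of a mapping matrix raises the signal-to-noise ratio by roughly √N for uncorrelated noise. All parameters are illustrative, not taken from the paper:

    import numpy as np

    rng = np.random.default_rng(0)
    n_points, n_spectra = 500, 2500          # e.g. a 50 x 50 Raman map
    signal = np.exp(-((np.arange(n_points) - 250) / 10) ** 2)  # one Raman-like peak
    noisy = signal + rng.normal(0, 0.5, size=(n_spectra, n_points))

    def snr(spec, peak_idx=250, baseline=slice(0, 100)):
        # peak height over the noise level estimated from a flat baseline region
        return spec[peak_idx] / spec[baseline].std()

    print(f"single spectrum SNR: {snr(noisy[0]):.1f}")
    print(f"averaged map SNR:    {snr(noisy.mean(axis=0)):.1f}")  # ~ sqrt(2500) = 50x gain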


2015 ◽  
Vol 3 (1) ◽  
pp. SB1-SB4 ◽  
Author(s):  
Donald A. Herron

Interpreters use horizon autopicking in many seismic interpretations in the modern workstation environment. When properly used, and with data quality permitting, this technique enables efficient and accurate tracking of horizons, but it is not without its pitfalls. Four common pitfalls are improper selection of the input control or seed grid, failing to account for the "directional" behavior of tracking algorithms, attempting autopicking in areas with poor reflection continuity and/or low signal-to-noise ratio, and failing to recognize elements of geology that are not suitable for autopicking.


1994 ◽  
Vol 38 ◽  
pp. 691-698
Author(s):  
K. Kansai ◽  
K. Toda ◽  
H. Kohno ◽  
T. Arai ◽  
R. Wilson

Advancements in trace element analysis require improvements in both the signal-to-noise ratio and accurate background correction. With a sequential spectrometer, one can obtain detection limits of around 0.1 ppm for medium- to heavy-Z elements. Conditions can be individually optimized for each element, for example, the selection of filters, collimators, crystals, and background subtraction. The disadvantage is that the analysis time may become long if many elements are to be analyzed, and this long exposure time can lead to the deterioration of some samples.
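A detection limit of this order follows from the standard counting-statistics estimate of the lower limit of detection, LLD ≈ (3/S)·√(Rb/t), where S is the sensitivity in counts/s per ppm, Rb the background count rate, and t the counting time. A minimal sketch with illustrative numbers (not taken from the paper):

    import math

    def lld_ppm(sensitivity_cps_per_ppm, background_cps, count_time_s):
        """Lower limit of detection from counting statistics:
        3 sigma of the background counts, converted to concentration units."""
        return 3.0 / sensitivity_cps_per_ppm * math.sqrt(background_cps / count_time_s)

    # e.g. S = 30 cps/ppm, Rb = 10 cps, t = 100 s  ->  ~0.03 ppm
    print(f"{lld_ppm(30.0, 10.0, 100.0):.3f} ppm")

The 1/√t dependence is exactly the trade-off described above: halving the detection limit requires quadrupling the counting time per element.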

