Layered Lossless Compression Method of Massive Fault Recording Data

Author(s):  
Jinhong Di ◽  
Pengkun Yang ◽  
Chunyan Wang ◽  
Lichao Yan

To overcome the large errors and low precision of traditional power fault recording data compression, a new layered lossless compression method for massive fault recording data is proposed in this paper. The method builds on the LZW (Lempel-Ziv-Welch) algorithm: the standard algorithm and its shortcomings are analyzed, and the algorithm is improved. Input string sequences are replaced by dictionary index values, and unknown strings are dynamically added to the dictionary. To support parallel lookup, the dictionary is divided into several small dictionaries with different bit widths so that it can be searched in parallel. Based on the compression and decompression behavior of LZW, the optimal hardware compression configuration of the algorithm is obtained. The improved LZW algorithm constructs the dictionary with a multi-tree structure and queries it globally with a multi-character parallel search. The dictionary size and update strategy of the LZW algorithm are also analyzed, and optimized parameters are designed to construct and update the dictionary. The lossless dictionary compression then completes the layered lossless compression of the large-scale fault recording data. The experimental results show that, compared with traditional compression methods, the proposed method effectively reduces the mean square error percentage and the compression error, ensures the integrity of the fault recording data, and achieves the expected compression effect in a short time.
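
The dictionary mechanism described above, replacing known strings with dictionary index values and dynamically adding unseen strings, follows the standard LZW scheme. The minimal Python sketch below shows only that baseline behavior; it does not implement the paper's multi-tree dictionary, bit-width-partitioned parallel search, or hardware-oriented optimizations, and the function name `lzw_compress` is illustrative.

```python
def lzw_compress(data: bytes) -> list[int]:
    # Initialize the dictionary with all single-byte strings.
    dictionary = {bytes([i]): i for i in range(256)}
    next_code = 256
    current = b""
    codes = []
    for byte in data:
        candidate = current + bytes([byte])
        if candidate in dictionary:
            current = candidate
        else:
            codes.append(dictionary[current])   # replace the known string with its index
            dictionary[candidate] = next_code   # dynamically add the unknown string
            next_code += 1
            current = bytes([byte])
    if current:
        codes.append(dictionary[current])
    return codes


if __name__ == "__main__":
    # 9 input bytes are emitted as 5 dictionary indices.
    print(lzw_compress(b"ABABABABA"))
```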

2020 ◽  
Vol 14 ◽  

Lossless compression is crucial for the remote transmission of large-scale medical images and the retention of complete medical diagnostic information. A lossless compression method for medical images based on the differential probability of the image is proposed in this study. The medical image in DICOM format was decorrelated by the differential method, and the difference matrix was optimally coded with Huffman coding to obtain the best compression effect. Experimental results obtained with the new method were compared with those of the Lempel-Ziv-Welch, modified run-length encoding, and block-bit allocation methods to verify its effectiveness. For 2-D medical images, the lossless compression of the proposed method is best when the object region occupies more than 20% of the image. For 3-D medical images, the proposed method achieves the highest compression ratio among the compared methods. The proposed method can be directly used for lossless compression of DICOM images.
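
As a rough illustration of the two-stage pipeline described above, differential decorrelation followed by Huffman coding of the difference matrix, the sketch below applies row-wise differencing and builds a Huffman table with standard Python tools. It assumes the DICOM pixel data is already loaded as a 2-D integer NumPy array; the helper names (`difference_matrix`, `huffman_table`) and the horizontal-difference choice are assumptions, not the paper's exact coding scheme.

```python
import heapq
from collections import Counter

import numpy as np


def difference_matrix(image: np.ndarray) -> np.ndarray:
    # Keep the first column, replace the rest with horizontal pixel differences.
    diff = image.astype(np.int32).copy()
    diff[:, 1:] = diff[:, 1:] - diff[:, :-1]
    return diff


def huffman_table(values: np.ndarray) -> dict:
    # Build a Huffman code table from the frequencies of the difference values.
    freq = Counter(values.ravel().tolist())
    heap = [[count, i, [sym, ""]] for i, (sym, count) in enumerate(freq.items())]
    heapq.heapify(heap)
    if len(heap) == 1:                       # degenerate case: a flat image
        return {heap[0][2][0]: "0"}
    counter = len(heap)
    while len(heap) > 1:
        lo = heapq.heappop(heap)
        hi = heapq.heappop(heap)
        for pair in lo[2:]:
            pair[1] = "0" + pair[1]          # prefix the lighter subtree with 0
        for pair in hi[2:]:
            pair[1] = "1" + pair[1]          # and the heavier subtree with 1
        heapq.heappush(heap, [lo[0] + hi[0], counter] + lo[2:] + hi[2:])
        counter += 1
    return {sym: code for sym, code in heap[0][2:]}


if __name__ == "__main__":
    img = np.array([[10, 12, 12, 15],
                    [11, 11, 13, 14]], dtype=np.uint16)
    diff = difference_matrix(img)
    table = huffman_table(diff)
    bits = "".join(table[v] for v in diff.ravel().tolist())
    print(f"{diff.size} pixels -> {len(bits)} bits")
```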


Author(s):  
N. Karthika Devi ◽  
G. Mahendran ◽  
S. Murugeswari ◽  
S. Praveen Samuel Washburn ◽  
D. Archana Devi ◽  
...  

2011 ◽  
Vol 121-126 ◽  
pp. 1727-1733
Author(s):  
Yan Pei Liu ◽  
Jian Ping Wang ◽  
Jun Chen

With the development of component technology and the expansion of component libraries, component representation and retrieval, the two core technologies of component library management, have become a research hotspot. Starting from the needs of component reuse and from the widely used description methods and faceted classification of components, this paper proposes three term-based component retrieval methods, based respectively on view search, faceted search, and term retrieval, together with the corresponding models: five match models and two matching algorithms. Theoretical analysis and experimental results show that the three retrieval methods are feasible for component retrieval in large-scale component libraries, meeting the varied search requirements of component reuse with acceptable retrieval efficiency.
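
As a rough illustration of faceted retrieval over a component library, the toy sketch below matches a faceted query against component descriptions. The facet names, the subset-matching rule, and the function name `faceted_search` are assumptions for illustration only; they do not reproduce the paper's five match models or two matching algorithms.

```python
from typing import Dict, Set

# A component is described as facet -> set of classification terms.
Component = Dict[str, Set[str]]

# Toy component library; names, facets, and terms are illustrative.
LIBRARY: Dict[str, Component] = {
    "CsvReader":  {"function": {"parse", "read"},      "environment": {"java"}, "type": {"library"}},
    "JsonWriter": {"function": {"serialize", "write"}, "environment": {"java"}, "type": {"library"}},
}


def faceted_search(query: Component) -> list:
    # A component matches when every queried facet contains all requested terms.
    hits = []
    for name, facets in LIBRARY.items():
        if all(terms <= facets.get(facet, set()) for facet, terms in query.items()):
            hits.append(name)
    return hits


if __name__ == "__main__":
    print(faceted_search({"function": {"read"}, "environment": {"java"}}))   # ['CsvReader']
```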


2020 ◽  
Vol 1 (2) ◽  
pp. 44-51
Author(s):  
Paula Pereira ◽  
Tanara Kuhn

For image transfer, various embedding systems exist that create a mosaic image from a source image and recover it from the target image using some form of algorithm. In the current study, a method using a genetic algorithm is proposed for recovering the image from the source image. The genetic algorithm, a search method, is combined with an additional technique to obtain higher robustness and security. The proposed methodology divides the source image into smaller parts, which are fitted into the target image using lossless compression. The mosaic image is recovered at the receiving side by means of a permutation array, which is recovered and mapped using the pre-selected key.
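
The block-division and key-based recovery idea can be sketched as follows. In the paper the placement of blocks is optimized by a genetic algorithm; this simplified Python example replaces that step with a key-seeded permutation so that the permutation array can be regenerated from the pre-selected key at the receiving side. The function names and the assumption that the image dimensions are multiples of the block size are illustrative.

```python
import random

import numpy as np


def permute_blocks(image: np.ndarray, block: int, key: int):
    # Split the source image into block x block tiles (dimensions assumed to be
    # multiples of `block`) and rearrange them with a key-seeded permutation.
    h, w = image.shape[:2]
    positions = [(r, c) for r in range(0, h, block) for c in range(0, w, block)]
    tiles = [image[r:r + block, c:c + block].copy() for r, c in positions]
    order = list(range(len(tiles)))
    random.Random(key).shuffle(order)            # the permutation array
    mosaic = image.copy()
    for (r, c), src in zip(positions, order):
        mosaic[r:r + block, c:c + block] = tiles[src]
    return mosaic, order


def recover(mosaic: np.ndarray, order: list, block: int) -> np.ndarray:
    # Invert the permutation: the tile at position i belongs at position order[i].
    h, w = mosaic.shape[:2]
    positions = [(r, c) for r in range(0, h, block) for c in range(0, w, block)]
    restored = mosaic.copy()
    for (r, c), src in zip(positions, order):
        rs, cs = positions[src]
        restored[rs:rs + block, cs:cs + block] = mosaic[r:r + block, c:c + block]
    return restored


if __name__ == "__main__":
    img = np.arange(64, dtype=np.uint8).reshape(8, 8)
    mosaic, order = permute_blocks(img, block=4, key=1234)
    assert np.array_equal(recover(mosaic, order, block=4), img)   # lossless recovery
```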


Author(s):  
ShenChuan Tai ◽  
TseMing Kuo ◽  
ChengHan Ho ◽  
TzuWen Liao
