Enhancing Lempel-Ziv codes using an on-line variable length binary encoding

Author(s):  
Tinku Acharya ◽  
Joseph F. Jájá
1996 ◽  
Vol 94 (1-4) ◽  
pp. 1-22

1970 ◽  
Vol 3 (4) ◽  
pp. 257 ◽  
Author(s):  
James L. Dolby

Viable on-line search systems require reasonable capabilities to automatically detect (and hopefully correct) variations between request format and stored format. An important requirement is the solution of the problem of matching proper names, not only because both input specifications and storage specifications are subject to error, but also because various transliteration schemes exist and can produce variant proper name forms in the same data base. This paper reviews several proper name matching schemes and provides an updated version of these schemes which tests out nicely on the proper name equivalence classes of a suburban telephone book. An appendix lists the corpus of names used for algorithm testing.
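The best-known scheme of this kind is Soundex, which collapses spelling variants of a name into a short phonetic key so that variants fall into the same equivalence class. A minimal sketch (simplified for illustration: the classic rule that H and W do not separate like-coded consonants is omitted; this is not necessarily the scheme the paper updates):

```python
def soundex(name: str) -> str:
    """Simplified Soundex: the initial letter plus three digits.
    Letters with the same digit that are adjacent collapse to one digit;
    vowels reset the previous code so repeats across vowels are kept."""
    codes = {}
    for group, digit in [("BFPV", "1"), ("CGJKQSXZ", "2"), ("DT", "3"),
                         ("L", "4"), ("MN", "5"), ("R", "6")]:
        for letter in group:
            codes[letter] = digit
    name = name.upper()
    digits, prev = [], codes.get(name[0], "")
    for ch in name[1:]:
        d = codes.get(ch, "")
        if d and d != prev:
            digits.append(d)
        prev = d
    # Pad with zeros and truncate to the fixed four-character key.
    return (name[0] + "".join(digits) + "000")[:4]
```

Names that hash to the same key ("Robert" and "Rupert" both give R163) are treated as candidate matches, which is exactly the kind of equivalence-class behavior tested against the telephone-book corpus.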


1970 ◽  
Vol 3 (2) ◽  
pp. 128
Author(s):  
Rosario De Varennes

Description of a system, operational since June 1968, that provides control of all serials holdings in nine campus libraries, permits updating of the complete file every two or three days, and produces various outputs for library users and library staff from data in variable fields on disks (listings, statistics, etc.). The program, presently operating on an IBM 360/50 and utilizing an IBM 2314 disk-storage facility and three IBM 226 CRT terminals, is written in IBM System/360 Operating System Assembler Language and in PL/I; it could encompass a file of no more than 10 million records of variable length limited to 127/255 characters and subdivided in 25 or fewer fields.


2018 ◽  
Author(s):  
Jamaluddin Jamaluddin

In computer science, data compression is a way of condensing data so that it occupies less space, making it more efficient to store and faster to exchange. In this paper, the author compares the effectiveness of three algorithms for compressing data in text form: Fixed Length Binary Encoding, Variable Length Binary Encoding, and the Huffman algorithm.
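As a rough illustration of the comparison (not the paper's own code or data), the encoded size of a short text can be computed under a fixed-length code and under Huffman coding, the best-known variable-length binary encoding:

```python
import heapq
from collections import Counter
from math import ceil, log2

def fixed_length_bits(text: str) -> int:
    """Bits needed when every symbol gets the same-width codeword."""
    k = len(set(text))
    return len(text) * max(1, ceil(log2(k)))

def huffman_bits(text: str) -> int:
    """Bits needed under an optimal Huffman code built from symbol counts."""
    freq = Counter(text)
    if len(freq) == 1:
        return len(text)  # degenerate alphabet: one bit per symbol
    # Heap entries: (subtree weight, unique tiebreak, {symbol: depth}).
    heap = [(w, i, {s: 0}) for i, (s, w) in enumerate(freq.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        w1, _, a = heapq.heappop(heap)
        w2, _, b = heapq.heappop(heap)
        # Merging two subtrees pushes every symbol one level deeper.
        merged = {s: d + 1 for s, d in {**a, **b}.items()}
        heapq.heappush(heap, (w1 + w2, tie, merged))
        tie += 1
    depths = heap[0][2]
    return sum(freq[s] * d for s, d in depths.items())
```

For `"abracadabra"` (5 a's, 2 b's, 2 r's, 1 c, 1 d) the fixed-length code needs 3 bits per symbol, 33 bits in total, while Huffman needs only 23 bits, because the frequent symbol `a` receives a one-bit codeword.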


Author(s):  
William Krakow

In the past few years on-line digital television frame store devices coupled to computers have been employed to attempt to measure the microscope parameters of defocus and astigmatism. The ultimate goal of such tasks is to fully adjust the operating parameters of the microscope and obtain an optimum image for viewing in terms of its information content. The initial approach to this problem, for high resolution TEM imaging, was to obtain the power spectrum from the Fourier transform of an image, find the contrast transfer function oscillation maxima, and subsequently correct the image. This technique requires a fast computer, a direct memory access device and even an array processor to accomplish these tasks on limited size arrays in a few seconds per image. It is not clear that the power spectrum could be used for more than defocus correction since the correction of astigmatism is a formidable problem of pattern recognition.
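The first step described above, obtaining the power spectrum from the Fourier transform of an image and reducing it to a radial profile whose peaks mark the contrast transfer function maxima, can be sketched in a few lines. This is an illustrative NumPy version, not the original implementation (which ran on dedicated frame-store and array-processor hardware):

```python
import numpy as np

def power_spectrum(image: np.ndarray) -> np.ndarray:
    """Power spectrum of a 2-D image: |FFT|^2 with zero frequency centered."""
    f = np.fft.fftshift(np.fft.fft2(image))
    return np.abs(f) ** 2

def radial_profile(ps: np.ndarray) -> np.ndarray:
    """Azimuthal average of a centered power spectrum.
    CTF oscillation maxima appear as peaks along this 1-D profile."""
    cy, cx = np.array(ps.shape) // 2
    y, x = np.indices(ps.shape)
    r = np.hypot(y - cy, x - cx).astype(int)
    sums = np.bincount(r.ravel(), weights=ps.ravel())
    counts = np.bincount(r.ravel())
    return sums / np.maximum(counts, 1)
```

Note that the azimuthal average deliberately throws away angular information, which is one way of seeing why defocus (a radially symmetric effect) is recoverable from it while astigmatism (an angle-dependent distortion of the rings) demands genuine two-dimensional pattern recognition.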


Author(s):  
A.M.H. Schepman ◽  
J.A.P. van der Voort ◽  
J.E. Mellema

A Scanning Transmission Electron Microscope (STEM) was coupled to a small computer. The system (see Fig. 1) has been built using a Philips EM400, equipped with a scanning attachment and a DEC PDP11/34 computer with 34K memory. The gun (Fig. 2) consists of a continuously renewed tip, of radius 0.2 to 0.4 μm, of a tungsten wire heated just below its melting point by a focused laser beam (1). On-line operation procedures were developed aimed at reducing the radiation dose to the specimen area of interest while selecting the various imaging parameters and registering the information content. Whereas the theoretical limiting spot size is 0.75 nm (2), routine resolution checks showed minimum distances on the order of 1.2 to 1.5 nm between corresponding intensity maxima in successive scans. This value is sufficient for structural studies of regular biological material to test the performance of STEM relative to high-resolution CTEM.


Author(s):  
Neil Rowlands ◽  
Jeff Price ◽  
Michael Kersker ◽  
Seichi Suzuki ◽  
Steve Young ◽  
...  

Three-dimensional (3D) microstructure visualization on the electron microscope requires that the sample be tilted to different positions to collect a series of projections. This tilting should be performed rapidly for on-line stereo viewing and precisely for off-line tomographic reconstruction. Usually a projection series is collected using mechanical stage tilt alone. The stereo pairs must then be viewed off-line, and the 60 to 120 tomographic projections must be aligned with fiducial markers or digital correlation methods. The delay in viewing stereo pairs and the alignment problems in tomographic reconstruction could be eliminated or reduced by tilting the beam, provided such tilt could be accomplished without image translation.

A microscope capable of beam tilt with simultaneous image shift to eliminate tilt-induced translation has been investigated for 3D imaging of thick (1 μm) biological specimens. By tilting the beam above and through the specimen and bringing it back below the specimen, a brightfield image with a projection angle corresponding to the beam tilt angle can be recorded (Fig. 1a).
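The translation that the compensating image shift must cancel follows from simple projection geometry: a feature at height z above the reference plane moves laterally by roughly z·tan θ when the beam is tilted by θ. This is an illustrative calculation only, not the instrument's actual calibration:

```python
import math

def tilt_translation_nm(height_nm: float, tilt_deg: float) -> float:
    """Lateral image shift, in nm, of a feature at height_nm above the
    reference plane under a beam tilt of tilt_deg degrees (simple
    projection geometry; illustrative, not an instrument calibration)."""
    return height_nm * math.tan(math.radians(tilt_deg))
```

For a feature at the top of a 1 μm (1000 nm) thick specimen, even a modest tilt produces a shift of hundreds of nanometres, which is why uncompensated beam tilt smears the image and why the simultaneous image shift is essential.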

