Optimized Hybrid DCT-SVD Computation over Extremely Large Images

2021 ◽  
Vol 13 (2) ◽  
pp. 56-61
Author(s):  
Iwan Setiawan ◽  
Akbari Indra Basuki ◽  
Didi Rosiyadi

High-performance computing (HPC) is required for image processing, especially for images with a huge number of picture elements (pixels). To avoid dependence on HPC equipment, which is very expensive to provide, a software approach is taken in this work. Both the hardware and software approaches share the same goal: making the computation time as short as possible. Conventionally, the discrete cosine transform (DCT) and singular value decomposition (SVD) are applied to the original image treated as a single matrix, which imposes a heavy computational burden for images with many pixels. To overcome this problem, a second-order block-matrix decomposition is applied to the original image, which yields the hybrid DCT-SVD formula. Hybrid here means that the only parameter appearing in the formula is the intensity of the original pixels, since the DCT and SVD formulas are merged in the derivation. Results show that, using Lena as the original image, computing the singular values with the hybrid formula is almost two seconds faster than with the conventional approach. Instead of pushing hard to provide the equipment, the computational problem caused by image size can be overcome simply by using the proposed formula.
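The abstract does not reproduce the derivation, so the following is only a rough sketch of the two pipelines being compared: a conventional whole-image 2-D DCT followed by SVD, versus a 2x2 block decomposition where DCT and SVD are applied per sub-matrix. The block layout, image size, and NumPy/SciPy calls are assumptions for illustration, not the paper's merged formula.

```python
import time

import numpy as np
from numpy.linalg import svd
from scipy.fft import dct

# Hypothetical 512x512 grayscale image standing in for Lena.
rng = np.random.default_rng(0)
img = rng.random((512, 512))


def dct2(a):
    """Orthonormal 2-D DCT (type II along both axes)."""
    return dct(dct(a, axis=0, norm="ortho"), axis=1, norm="ortho")


def conventional_singular_values(a):
    """Conventional route: DCT of the whole image, then one large SVD."""
    return svd(dct2(a), compute_uv=False)


def blockwise_singular_values(a):
    """Assumed block route: treat the image as a 2x2 grid of sub-matrices
    and take DCT + SVD per block, so each SVD is only a quarter-size."""
    h, w = a.shape
    hh, hw = h // 2, w // 2
    parts = []
    for i in range(2):
        for j in range(2):
            blk = a[i * hh:(i + 1) * hh, j * hw:(j + 1) * hw]
            parts.append(svd(dct2(blk), compute_uv=False))
    return np.concatenate(parts)


t0 = time.perf_counter()
s_full = conventional_singular_values(img)
t1 = time.perf_counter()
s_blk = blockwise_singular_values(img)
t2 = time.perf_counter()
print(f"conventional: {t1 - t0:.3f}s, block-based: {t2 - t1:.3f}s")
```

The speedup comes from SVD's super-linear cost in matrix size: four SVDs of n/2 x n/2 blocks are cheaper than one SVD of the full n x n matrix, at the price of obtaining per-block rather than global singular values.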

Author(s):  
Chwei-Shyong Tsai ◽  
Chin-Chen Chang

Digital watermarking is an effective technique for protecting the intellectual property rights of digital images. In general, a gray-level image provides more perceptual information; moreover, each pixel in a gray-level image carries more bits, so gray-level digital watermarks are commonly more robust. In this chapter, the proposed watermarking scheme adopts a gray-level image as the watermark. In addition, the discrete cosine transform (DCT) and a quantization method are applied to strengthen the robustness of the watermarking system. Both the original image and the digital watermark, processed by the DCT, are used to build a quantization table that reduces the information size of the digital watermark. After the quantized watermark is embedded into the middle-frequency bands of the transformed original image, the quality of the watermarked image remains visually acceptable because of the effectiveness of the quantization technique. The experimental results show that the embedded watermark can resist image cropping, JPEG lossy compression, and destructive processes such as image blurring and sharpening.
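The chapter's quantization-table construction is not detailed in this abstract, so the sketch below illustrates only the generic middle-frequency embedding step it describes: each watermark bit nudges a few assumed mid-band coefficients of an 8x8 DCT block up or down. The band positions, embedding strength, and helper names are assumptions, not the authors' scheme.

```python
import numpy as np
from scipy.fft import dct, idct


def dct2(a):
    return dct(dct(a, axis=0, norm="ortho"), axis=1, norm="ortho")


def idct2(a):
    return idct(idct(a, axis=0, norm="ortho"), axis=1, norm="ortho")


MID_BAND = [(3, 1), (2, 2), (1, 3)]  # assumed middle-frequency positions
ALPHA = 4.0                          # assumed embedding strength


def embed(img, bits):
    """Embed one bit per 8x8 block by shifting mid-band DCT coefficients."""
    out = img.astype(float).copy()
    k = 0
    for r in range(0, img.shape[0], 8):
        for c in range(0, img.shape[1], 8):
            if k >= len(bits):
                return out
            blk = dct2(out[r:r + 8, c:c + 8])
            for (u, v) in MID_BAND:
                blk[u, v] += ALPHA if bits[k] else -ALPHA
            out[r:r + 8, c:c + 8] = idct2(blk)
            k += 1
    return out


def extract(marked, original, n_bits):
    """Non-blind extraction: compare mid-band coefficients block by block."""
    bits = []
    for r in range(0, original.shape[0], 8):
        for c in range(0, original.shape[1], 8):
            if len(bits) >= n_bits:
                return bits
            d = dct2(marked[r:r + 8, c:c + 8]) - dct2(original[r:r + 8, c:c + 8])
            bits.append(1 if sum(d[u, v] for (u, v) in MID_BAND) > 0 else 0)
    return bits
```

Spreading each bit over several mid-band coefficients is what gives the robustness the abstract claims: low frequencies would cause visible distortion, while high frequencies are the first discarded by JPEG compression and blurring.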


Author(s):  
Mark H. Ellisman

The increased availability of High Performance Computing and Communications (HPCC) offers scientists and students the potential for effective remote interactive use of centralized, specialized, and expensive instrumentation and computers. Examples of instruments capable of remote operation that may be usefully controlled from a distance are increasing. Some in current use include telescopes, networks of remote geophysical sensing devices, and, more recently, the intermediate high-voltage electron microscope developed at the San Diego Microscopy and Imaging Resource (SDMIR) in La Jolla. In this presentation the imaging capabilities of a specially designed JEOL 4000EX IVEM will be described. This instrument was developed mainly to facilitate the extraction of 3-dimensional information from thick sections. In addition, progress will be described on a project now underway to develop a more advanced version of the Telemicroscopy software we previously demonstrated as a tool for providing remote access to this IVEM (Mercurio et al., 1992; Fan et al., 1992).


MRS Bulletin ◽  
1997 ◽  
Vol 22 (10) ◽  
pp. 5-6
Author(s):  
Horst D. Simon

Recent events in the high-performance computing industry have concerned scientists and the general public regarding a crisis or a lack of leadership in the field. That concern is understandable considering the industry's history from 1993 to 1996. Cray Research, the historic leader in supercomputing technology, was unable to survive financially as an independent company and was acquired by Silicon Graphics. Two ambitious new companies that introduced new technologies in the late 1980s and early 1990s, Thinking Machines and Kendall Square Research, were commercial failures and went out of business. And Intel, which introduced its Paragon supercomputer in 1994, discontinued production only two years later. During the same time frame, scientists who had finished the laborious task of writing scientific codes to run on vector parallel supercomputers learned that those codes would have to be rewritten if they were to run on the next-generation, highly parallel architecture. Scientists who are not yet involved in high-performance computing are understandably hesitant about committing their time and energy to such an apparently unstable enterprise. However, beneath the commercial chaos of the last several years, a technological revolution has been occurring. The good news is that the revolution is over, leading to five to ten years of predictable stability, steady improvements in system performance, and increased productivity for scientific applications. It is time for scientists who were sitting on the fence to jump in and reap the benefits of the new technology.


2001 ◽  
Author(s):  
Donald J. Fabozzi ◽  
Barney II ◽  
Fugler Blaise ◽  
Koligman Joe ◽  
Jackett Mike ◽  
...  
