Low Complexity Fluctuation Measurement in Image Processing Considering Order

Author(s):  
Tareq Khan

The standard deviation can measure the spread of a set of numbers, and entropy can measure their randomness; however, neither considers the order of the numbers. This can lead to misleading results in settings where order is vital. An image is a set of numbers (i.e., pixel values) that is sensitive to order. In this paper, a low-complexity, efficient method for measuring fluctuation is proposed that takes the order of the numbers into account. The proposed method sums the changes between consecutive numbers and can be used in image processing applications. Simulations show that the proposed method is 8 to 33 times faster than related works.
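As a rough illustration of the idea described above, the following sketch sums the absolute changes between consecutive values. This is a minimal Python/NumPy reading of the abstract, not the paper's exact formulation (its extension to 2D pixel grids, for instance, may differ).

```python
import numpy as np

def fluctuation(values):
    """Order-sensitive fluctuation: the sum of absolute changes between
    consecutive values. Unlike standard deviation or entropy, the result
    depends on the ordering of the numbers, not just their multiset."""
    v = np.asarray(values, dtype=float)
    return np.abs(np.diff(v)).sum()

# The same set of numbers, ordered differently:
print(fluctuation([1, 2, 3, 4]))  # 3.0 -- smooth ramp
print(fluctuation([3, 1, 4, 2]))  # 7.0 -- oscillating, higher fluctuation
```

Both sequences have identical standard deviation and entropy (same multiset, same histogram), yet only the order-aware measure distinguishes the smooth ramp from the oscillating one.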

2011, Vol. 346, pp. 731-737
Author(s):  
Jin Feng Yang, Man Hua Liu, Hui Zhao, Wei Tao

This paper presents an efficient method to detect fasteners based on image processing and optical detection technologies. The direction field of the fastener image is computed as a feature descriptor for template matching. This detection method can be used to determine the status of a fastener on the corresponding track, i.e., whether the fastener is present or missing. Experimental results show that the proposed method is computationally efficient and robust for fastener detection in complex environments.
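The abstract does not give implementation details, so the following is only a hedged sketch of direction-field template matching under assumed choices: the direction field is taken as the per-pixel gradient orientation, similarity is the mean cosine of the doubled orientation difference (doubling makes opposite gradient directions equivalent), and the detection threshold of 0.6 is likewise an assumption.

```python
import numpy as np

def direction_field(img):
    """Per-pixel gradient orientation (radians) of a grayscale image."""
    gy, gx = np.gradient(img.astype(float))
    return np.arctan2(gy, gx)

def match_score(patch_dir, template_dir):
    """Similarity of two direction fields in [-1, 1]: mean cosine of the
    doubled angle difference."""
    return np.cos(2.0 * (patch_dir - template_dir)).mean()

def fastener_present(image, template, threshold=0.6):
    """Slide the template's direction field over the image and report
    whether any window matches well enough to call the fastener present.
    (Naive exhaustive search; a real system would restrict the search
    region or vectorize this loop.)"""
    i_dir = direction_field(image)
    t_dir = direction_field(template)
    th, tw = template.shape
    best = -1.0
    for y in range(i_dir.shape[0] - th + 1):
        for x in range(i_dir.shape[1] - tw + 1):
            best = max(best, match_score(i_dir[y:y + th, x:x + tw], t_dir))
    return best >= threshold, best
```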


Author(s):  
Wesley S. Hunko, Vishnuvardhan Chandrasekaran, Lewis N. Payton

The purpose of this paper is to present the results of a study comparing an established technique for measuring low surface roughness with a new data acquisition and processing technique that is potentially cheaper, quicker, and more automated, offering the promise of in-process quality monitoring of surface finish. Since the late 1800s, researchers have investigated the light-scattering effects of surface asperities and have developed many interferometry techniques to quantify this phenomenon. Through interferometry, the surface roughness of objects can be measured and compared very accurately. Unlike contact methods such as stylus profilometry, interferometry is nonintrusive and can take surface measurements over very wide ranges of scale. The drawbacks of this method are the high cost and complexity of the data acquisition and analysis equipment.

This study attempts to eliminate these drawbacks by using a single built-in MATLAB function to simplify data analysis and an inexpensive digital microscope (less than $200) for data acquisition. The surfaces produced by various polishing compounds, as characterized by MATLAB's IMHIST function, are compared against stylus profilometry measurements and against 3D microscopy with a Keyence microscope.

With surface roughness being a key factor in many manufacturing and tribology applications, the need for accurate, reliable, and economical measuring systems is clear; interferometry, however, is neither a cheap nor a simple process. "Over the last few years, advances in image processing techniques have provided a basis for developing image-based surface roughness measuring techniques" [1]. One popular approach uses MATLAB's Image Processing Toolbox, which includes an array of functions that can be used to quantify and compare surface textures, including the standard deviation, entropy, and histograms of images. "These statistics can characterize the texture of an image because they provide information about the local variability of the intensity values of pixels in an image. For example, in areas with smooth texture, the range of values in the neighborhood around a pixel will be a small value; in areas of rough texture, the range will be larger. Similarly, calculating the standard deviation of pixels in a neighborhood can indicate the degree of variability of pixel values in that region" [2]. By combining the practice of interferometry with the processing techniques of MATLAB, this fairly new method proved to be a viable and inexpensive means of roughness measurement.
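The following is a minimal Python/NumPy analogue of the MATLAB IMHIST-based analysis described above; the mapping from these histogram statistics to calibrated roughness values is the study's own contribution and is not reproduced here.

```python
import numpy as np

def texture_stats(gray):
    """Histogram-based texture statistics of an 8-bit grayscale image,
    mirroring what MATLAB's IMHIST plus std/entropy would report."""
    hist, _ = np.histogram(gray, bins=256, range=(0, 256))
    p = hist / hist.sum()                 # intensity distribution
    nz = p[p > 0]
    entropy = -(nz * np.log2(nz)).sum()   # randomness of intensities
    return {"std": gray.std(), "entropy": entropy, "histogram": hist}

# Rougher surfaces scatter light less uniformly, so their micrographs
# should show a wider intensity histogram (larger std and entropy)
# than well-polished ones. Synthetic stand-ins for illustration:
rough = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
smooth = np.full((64, 64), 128, dtype=np.uint8)
print(texture_stats(rough)["std"], texture_stats(smooth)["std"])
```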


Author(s):  
Richard Chbeir

In the last two decades, image retrieval has seen growing interest across several domains. As a result, much work has been done to integrate it into standard data processing environments (Rui, Huang, & Chang, 1999; Smeulders, Gevers, & Kersten, 1998; Yoshitaka & Ichikawa, 1999). Different methods for retrieving images have been proposed in the literature (Chang & Jungert, 1997; Guttman, 1984; Lin, Jagadish, & Faloutsos, 1994). These methods can be grouped into two major approaches: metadata-based and content-based. The metadata-based approach uses alphanumeric attributes and traditional techniques to describe the context and/or the content of the image, such as the title, author name, date, and so on. The content-based approach uses image processing algorithms to extract low-level features of images such as colors, textures, and shapes. Image retrieval using these features is done by similarity methods and is hence inexact matching.
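A minimal sketch of the content-based approach follows. The specific feature (a joint RGB color histogram) and similarity measure (histogram intersection) are assumed choices for illustration; the point is that retrieval becomes a similarity ranking rather than exact matching.

```python
import numpy as np

def color_histogram(rgb, bins=8):
    """Low-level color feature: a joint RGB histogram, normalized so
    that images of different sizes are comparable."""
    h, _ = np.histogramdd(rgb.reshape(-1, 3), bins=(bins,) * 3,
                          range=((0, 256),) * 3)
    return (h / h.sum()).ravel()

def similarity(f1, f2):
    """Histogram intersection: 1.0 for identical histograms, smaller
    for less similar ones -- a graded, non-exact match."""
    return np.minimum(f1, f2).sum()

def retrieve(query, database, top_k=5):
    """Rank database images by feature similarity to the query image."""
    qf = color_histogram(query)
    scored = [(similarity(qf, color_histogram(img)), i)
              for i, img in enumerate(database)]
    return sorted(scored, reverse=True)[:top_k]
```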


2020, Vol. 10 (1), pp. 68-73
Author(s):  
Farhad M. Khalifa, Mohammed G. Saeed

In the past decade, a transform called the all phase discrete cosine biorthogonal transform (APDCBT) appeared in the field of digital image processing. It is mainly used to overcome the negative effects found in the discrete cosine transform (DCT), especially at low bit rates. In this paper, the APDCBT is employed for watermark insertion into selected regions of an image, where insertion depends on the homogeneity of each part of the image. Homogeneity is judged by two criteria applied to the image intensity: the mean and the standard deviation. Mid-frequency APDCBT bands of image pixel blocks are used to hold the embedded watermark, and the transform is then inverted to obtain the watermarked image. The robustness of the APDCBT against watermark-removal attacks is tested, and the experimental results show the superiority of the APDCBT over the traditional DCT in the watermark embedding system, particularly under the LSB-reset attack. For instance, under the contrast adjustment attack, the average normalized cross-correlation (NCC) between the extracted watermark images and the original watermark was 0.992 for the proposed method, a promising result compared with 0.423 for the DCT method. The proposed method can be used for copyright protection purposes.
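Since the APDCBT is not available in standard libraries, the sketch below substitutes the ordinary DCT (via SciPy) to show the general scheme the abstract describes: a watermark bit embedded in a mid-frequency coefficient of an 8x8 block, plus the NCC measure used to quote robustness. The block size, coefficient position (3, 4), and embedding strength are assumptions, not the paper's parameters.

```python
import numpy as np
from scipy.fft import dctn, idctn

def embed_bit(block, bit, strength=8.0):
    """Embed one watermark bit in a mid-frequency coefficient of an
    8x8 block (standard DCT used here as a stand-in for APDCBT;
    clipping the result to [0, 255] is omitted for brevity)."""
    c = dctn(block.astype(float), norm="ortho")
    c[3, 4] = strength if bit else -strength  # mid-band coefficient
    return idctn(c, norm="ortho")

def extract_bit(block):
    """Recover the embedded bit from the sign of the same coefficient."""
    return dctn(block.astype(float), norm="ortho")[3, 4] > 0

def ncc(w_orig, w_extracted):
    """Normalized cross-correlation between the original and extracted
    watermarks -- the robustness measure quoted in the abstract."""
    a = w_orig.astype(float) - w_orig.mean()
    b = w_extracted.astype(float) - w_extracted.mean()
    return (a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum())
```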

