DIGITAL IMAGE QUALITY ASSESSMENT BASED ON HUMAN VISUAL SYSTEM CHARACTERISTICS AND IMAGE STRUCTURE PRINCIPLES

2016
Vol 78 (5-10)
Author(s):
Bahbibi Rahmatullah
Siti Tasnim Mahamud

Tremendous advances in information technology have given digital images a major role in delivering information quickly and accurately. However, digital images are exposed to distortions and quality degradation during acquisition, compression, transmission, processing and reproduction. Therefore, the development of effective image quality assessment (IQA) is crucial for identifying and measuring distortion in image quality. Perception by human observers, as the ultimate receivers of the visual information contained in an image, is the most reliable way to assess image quality. However, manual subjective assessment is costly and time consuming. This has led to the development of automatic methods that aim to measure image quality as accurately as the manual approach. The goal of objective image quality assessment is to develop a computational model that can accurately and automatically predict perceptual image quality. An ideal objective IQA method should imitate the quality predictions of an average human observer. Full-reference image quality assessment is a method in which an image of perfect quality is provided as a reference for guiding the IQA system. This paper presents a study and comparison of two full-reference approaches frequently used in IQA systems: methods based on the properties of the human visual system (HVS) and methods based on the principle of image structure. Both approaches are shown to measure digital image quality accurately, with their performance depending on the distortion types present in the measured images.
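The structure-based family referred to above is exemplified by the SSIM index. As an illustration only, not the implementation compared in this paper, a minimal SSIM sketch in Python might look as follows; the box-filter window size and the stabilising constants are common defaults (the canonical SSIM uses an 11x11 Gaussian window).

```python
import numpy as np
from scipy.ndimage import uniform_filter

def ssim(ref, dist, win=8, dynamic_range=255.0):
    """Basic SSIM between two greyscale images of equal shape."""
    # Standard SSIM stabilising constants
    c1 = (0.01 * dynamic_range) ** 2
    c2 = (0.03 * dynamic_range) ** 2
    x = ref.astype(np.float64)
    y = dist.astype(np.float64)
    # Local statistics via a box filter (the canonical SSIM uses a Gaussian window)
    mu_x, mu_y = uniform_filter(x, win), uniform_filter(y, win)
    var_x = uniform_filter(x * x, win) - mu_x ** 2
    var_y = uniform_filter(y * y, win) - mu_y ** 2
    cov_xy = uniform_filter(x * y, win) - mu_x * mu_y
    ssim_map = ((2 * mu_x * mu_y + c1) * (2 * cov_xy + c2)) / \
               ((mu_x ** 2 + mu_y ** 2 + c1) * (var_x + var_y + c2))
    return float(ssim_map.mean())
```

HVS-based methods, by contrast, typically weight pixel-wise errors by models of contrast sensitivity and masking rather than comparing local structure directly.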

2018
Vol 32 (34n36)
pp. 1840085
Author(s):
Ruxi Xiang
Feng Wu

In this paper, we present an effective quality assessment method based on the relation intensity ratio and detail similarity for full-reference image quality assessment (IQA). The method first computes the nonlinear gradient magnitude, with Gaussian smoothing, of the reference and distorted images and constructs the relation intensity ratio and detail similarity between them. Next, the final IQA map is formed by linearly combining the relation intensity ratio with the detail similarity. Finally, we adopt a new pooling strategy that integrates the mean and standard deviation of the final IQA map to accurately predict image quality. Experiments on two publicly available databases show that the proposed method provides accurate predictions compared with most state-of-the-art IQA methods.
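The abstract does not give the exact formulas for the relation intensity ratio or the detail similarity, so the sketch below only illustrates the overall pipeline it describes: Gaussian-smoothed gradient magnitudes of the reference and distorted images, a placeholder gradient-similarity map standing in for the paper's two terms, and pooling that combines the mean and standard deviation of the resulting IQA map. The constants and the pooling weight are assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, sobel

def smoothed_gradient_magnitude(img, sigma=1.0):
    # Gaussian smoothing followed by Sobel gradients
    s = gaussian_filter(img.astype(np.float64), sigma)
    return np.hypot(sobel(s, axis=1), sobel(s, axis=0))

def quality_score(ref, dist, c=170.0, alpha=0.5):
    g_ref = smoothed_gradient_magnitude(ref)
    g_dist = smoothed_gradient_magnitude(dist)
    # Placeholder gradient-similarity map standing in for the paper's
    # relation intensity ratio and detail similarity terms
    iqa_map = (2.0 * g_ref * g_dist + c) / (g_ref ** 2 + g_dist ** 2 + c)
    # Pooling: combine the mean and the standard deviation of the IQA map
    # (higher mean and lower dispersion indicate better predicted quality)
    return float(iqa_map.mean() - alpha * iqa_map.std())
```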


2011
Vol 55-57
pp. 31-36
Author(s):
Lian Fen Huang
Xiao Nan Cui
Jian An Lin
Zhi Yuan Shi

Human visual perception is highly adapted for extracting structural information from a scene, but the existing SSIM index is a full-reference method that requires the entire reference image. In this paper, we develop a reduced-reference SSIM method and evaluate its performance through a set of assessment criteria, as well as by comparison with both the EPSNR and SSIM methods on a database of images compressed with JPEG and JPEG2000.
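The abstract does not specify which features its reduced-reference variant transmits, so the following is only one plausible reading offered as a sketch: the sender extracts per-block means and variances from the reference, and the receiver evaluates the luminance and contrast terms of SSIM against the same statistics of the distorted image. The structure term is omitted because it needs pixel-level access to the reference; block size and constants are assumptions.

```python
import numpy as np

def rr_features(img, block=32):
    # Reduced-reference summary: per-block mean and variance only,
    # so the full reference image never needs to be transmitted
    h, w = img.shape
    img = img[: h - h % block, : w - w % block].astype(np.float64)
    blocks = img.reshape(img.shape[0] // block, block, img.shape[1] // block, block)
    blocks = blocks.transpose(0, 2, 1, 3).reshape(-1, block * block)
    return blocks.mean(axis=1), blocks.var(axis=1)

def rr_ssim(ref_feats, dist, block=32, c1=6.5025, c2=58.5225):
    mu_x, var_x = ref_feats
    mu_y, var_y = rr_features(dist, block)
    # Luminance and contrast terms of SSIM computed from block statistics
    lum = (2 * mu_x * mu_y + c1) / (mu_x ** 2 + mu_y ** 2 + c1)
    con = (2 * np.sqrt(var_x * var_y) + c2) / (var_x + var_y + c2)
    return float(np.mean(lum * con))
```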


2020
Vol 64 (1)
pp. 10505-1-10505-16
Author(s):
Yin Zhang
Xuehan Bai
Junhua Yan
Yongqi Xiao
C. R. Chatwin
...

Abstract A new blind image quality assessment method called No-Reference Image Quality Assessment Based on Multi-Order Gradients Statistics is proposed, which is aimed at solving the problem that the existing no-reference image quality assessment methods cannot determine the type of image distortion and that the quality evaluation has poor robustness for different types of distortion. In this article, an 18-dimensional image feature vector is constructed from gradient magnitude features, relative gradient orientation features, and relative gradient magnitude features over two scales and three orders on the basis of the relationship between multi-order gradient statistics and the type and degree of image distortion. The feature matrix and distortion types of known distorted images are used to train an AdaBoost_BP neural network to determine the image distortion type; the feature matrix and subjective scores of known distorted images are used to train an AdaBoost_BP neural network to determine the image distortion degree. A series of comparative experiments were carried out using Laboratory of Image and Video Engineering (LIVE), LIVE Multiply Distorted Image Quality, Tampere Image, and Optics Remote Sensing Image databases. Experimental results show that the proposed method has high distortion type judgment accuracy and that the quality score shows good subjective consistency and robustness for all types of distortion. The performance of the proposed method is not constricted to a particular database, and the proposed method has high operational efficiency.
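For illustration, the sketch below extracts a feature vector with the same 2 scales x 3 orders x 3 maps = 18-dimensional layout described above. The precise definitions of the relative gradient orientation and relative gradient magnitude, the summary statistic taken from each map, and the way higher orders are formed are assumptions rather than the paper's formulation; the extracted vectors would then be paired with distortion labels and subjective scores to train the two AdaBoost_BP networks.

```python
import numpy as np
from scipy.ndimage import sobel, zoom

def gradient_maps(img):
    gx, gy = sobel(img, axis=1), sobel(img, axis=0)
    gm = np.hypot(gx, gy)                     # gradient magnitude
    ori = np.arctan2(gy, gx)                  # gradient orientation
    rel_ori = ori - np.median(ori)            # assumed "relative" orientation
    rel_gm = gm / (np.median(gm) + 1e-12)     # assumed "relative" magnitude
    return gm, rel_ori, rel_gm

def multi_order_gradient_features(img, scales=(1.0, 0.5), orders=3):
    feats = []
    for s in scales:
        x = zoom(img.astype(np.float64), s) if s != 1.0 else img.astype(np.float64)
        for _ in range(orders):
            gm, rel_ori, rel_gm = gradient_maps(x)
            # one summary statistic per map at each scale and order
            feats += [gm.mean(), rel_ori.std(), rel_gm.mean()]
            x = gm  # differentiate the magnitude map again for the next order
    return np.asarray(feats)  # 2 scales x 3 orders x 3 maps = 18 dimensions
```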


2021
Vol 7 (7)
pp. 112
Author(s):
Domonkos Varga

The goal of no-reference image quality assessment (NR-IQA) is to evaluate the perceptual quality of digital images without using the distortion-free, pristine counterparts. NR-IQA is an important part of multimedia signal processing since digital images can undergo a wide variety of distortions during storage, compression, and transmission. In this paper, we propose a novel architecture that extracts deep features from the input image at multiple scales to improve the effectiveness of feature extraction for NR-IQA using convolutional neural networks. Specifically, the proposed method extracts deep activations for local patches at multiple scales and maps them onto perceptual quality scores with the help of trained Gaussian process regressors. Extensive experiments demonstrate that the introduced algorithm performs favorably against the state-of-the-art methods on three large benchmark datasets with authentic distortions (LIVE In the Wild, KonIQ-10k, and SPAQ).
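A rough sketch of that pipeline follows, assuming a pretrained ResNet-18 backbone and whole-image resizing in place of the local patch sampling described in the abstract; the backbone, scales, and kernel are illustrative choices, and the Gaussian process regressor would be fitted on feature vectors paired with subjective scores (e.g. MOS) from a training set.

```python
import numpy as np
import torch
import torchvision.models as models
import torchvision.transforms.functional as TF
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Assumed backbone: pretrained ResNet-18 truncated before its classifier head
backbone = torch.nn.Sequential(
    *list(models.resnet18(weights="IMAGENET1K_V1").children())[:-1]
).eval()

def multi_scale_features(img, scales=(1.0, 0.5, 0.25)):
    """img: float tensor of shape (3, H, W), already normalised for the backbone."""
    feats = []
    with torch.no_grad():
        for s in scales:
            size = [max(64, int(d * s)) for d in img.shape[-2:]]
            x = TF.resize(img, size)
            feats.append(backbone(x.unsqueeze(0)).flatten())
    return torch.cat(feats).numpy()

# Training sketch: X holds one feature vector per training image, y the MOS scores
# X = np.stack([multi_scale_features(im) for im in train_images])
# gpr = GaussianProcessRegressor(kernel=RBF()).fit(X, y)
# predicted_quality = gpr.predict(multi_scale_features(test_image).reshape(1, -1))
```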


PLoS ONE
2018
Vol 13 (6)
pp. e0199430
Author(s):
Chaofeng Li
Yifan Li
Yunhao Yuan
Xiaojun Wu
Qingbing Sang
