REPRESENTATION BOUND FOR HUMAN FACIAL MIMIC WITH THE AID OF PRINCIPAL COMPONENT ANALYSIS

2010 · Vol 10 (03) · pp. 343-363
Author(s): Ulrik Söderström, Haibo Li

In this paper, we examine how much information is needed to represent the facial mimic, based on Paul Ekman's assumption that the facial mimic can be represented with a few basic emotions. Principal component analysis is used to compactly represent the important facial expressions. Theoretical bounds for facial mimic representation are presented both for a given number of principal components and for a given number of bits. When 10 principal components are used to reconstruct color video at a resolution of 240 × 176 pixels, the representation bound is on average 36.8 dB, measured in peak signal-to-noise ratio. Practical experiments confirm the theoretical bounds. Quantization of the projection coefficients affects the representation, but quantization with approximately 7-8 bits is found to match an exact representation, measured in mean square error.
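As an illustrative sketch only (synthetic data, not the facial-mimic video from the paper), the k-component reconstruction, PSNR measurement, and uniform quantization of projection coefficients described above can be written as:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical stand-in for facial-mimic frames: 100 "images" of 50 pixels each
frames = rng.normal(size=(100, 50)) @ rng.normal(size=(50, 50))

mean = frames.mean(axis=0)
centered = frames - mean
# Principal components from the SVD of the centered frame matrix
U, s, Vt = np.linalg.svd(centered, full_matrices=False)

k = 10                                  # number of principal components kept
coeffs = centered @ Vt[:k].T            # projection coefficients
recon = coeffs @ Vt[:k] + mean          # reconstruction from k components

# Quality of the k-component representation, measured in PSNR
mse = np.mean((frames - recon) ** 2)
peak = np.abs(frames).max()             # peak signal value of this data
psnr = 10 * np.log10(peak ** 2 / mse)

# Uniform quantization of the projection coefficients to 8 bits
bits = 8
lo, hi = coeffs.min(), coeffs.max()
levels = 2 ** bits - 1
dequant = np.round((coeffs - lo) / (hi - lo) * levels) / levels * (hi - lo) + lo
recon_q = dequant @ Vt[:k] + mean       # reconstruction from quantized coefficients
mse_q = np.mean((frames - recon_q) ** 2)
```

Because the residual of the k-component reconstruction is orthogonal to the retained subspace, quantizing the coefficients can only add error on top of the representation bound (`mse_q >= mse`).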

1992 · Vol 75 (3) · pp. 929-930
Author(s): Oliver C. S. Tzeng

This note summarizes my remarks on the use of principal component reliability and the eigenvalue-greater-than-1 rule for determining the number of factors in a principal component analysis of a correlation matrix. Due to the unpredictability and uselessness of the reliability approach and the Kaiser-Guttman rule, research workers are encouraged to use other methods, such as the scree test.
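A minimal sketch of the two retention rules under discussion, on invented two-factor data (the data and factor counts are purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical data: 200 observations of 8 variables driven by 2 latent factors
latent = rng.normal(size=(200, 2))
data = latent @ rng.normal(size=(2, 8)) + 0.5 * rng.normal(size=(200, 8))

corr = np.corrcoef(data, rowvar=False)
eigvals = np.sort(np.linalg.eigvalsh(corr))[::-1]   # descending

# Kaiser-Guttman rule: retain every component with eigenvalue > 1
n_kaiser = int(np.sum(eigvals > 1.0))

# Scree inspection instead looks for the "elbow" in the sorted eigenvalues;
# a crude numeric proxy is the location of the largest drop
drops = -np.diff(eigvals)
n_scree = int(np.argmax(drops)) + 1
```

The trace of a correlation matrix equals the number of variables, so the eigenvalues always sum to 8 here; the rules differ only in where they cut the sorted spectrum.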


PLoS ONE · 2021 · Vol 16 (3) · pp. e0248896
Author(s): Nico Migenda, Ralf Möller, Wolfram Schenck

“Principal Component Analysis” (PCA) is an established linear technique for dimensionality reduction. It performs an orthonormal transformation to replace possibly correlated variables with a smaller set of linearly independent variables, the so-called principal components, which capture a large portion of the data variance. The problem of finding the optimal number of principal components has been widely studied for offline PCA. However, when working with streaming data, the optimal number changes continuously, which requires updating both the principal components and the dimensionality at every timestep. While the continuous update of the principal components is widely studied, the available algorithms for dimensionality adjustment in neural network-based and incremental PCA are limited to increments of one, so existing approaches cannot account for abrupt changes in the presented data. The contribution of this work is to enable, in neural network-based PCA, continuous dimensionality adjustment by an arbitrary number of components without the necessity of learning all principal components. A novel algorithm is presented that utilizes several PCA characteristics to adaptively update the optimal number of principal components for neural network-based PCA. A precise estimation of the required dimensionality reduces the computational effort while ensuring that the desired amount of variance is kept. The computational complexity of the proposed algorithm is investigated, and it is benchmarked in an experimental study against other neural network-based and incremental PCA approaches, where it produces highly competitive results.
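The underlying dimensionality criterion — keep the smallest number of components that preserves a desired amount of variance — can be sketched as follows. This is a generic batch formulation for illustration, not the authors' streaming algorithm:

```python
import numpy as np

def components_for_variance(eigvals, target=0.95):
    """Smallest number of components whose cumulative variance fraction
    reaches the target (assumes eigvals sorted in descending order)."""
    frac = np.cumsum(eigvals) / np.sum(eigvals)
    return int(np.searchsorted(frac, target) + 1)

rng = np.random.default_rng(2)
# Hypothetical data with decaying per-variable scales
X = rng.normal(size=(500, 20)) * np.linspace(5, 0.1, 20)
cov = np.cov(X, rowvar=False)
eigvals = np.sort(np.linalg.eigvalsh(cov))[::-1]

k = components_for_variance(eigvals, 0.95)
```

In a streaming setting the eigenvalue estimates change at every timestep, so `k` can jump by an arbitrary amount between updates — the situation the paper's algorithm is designed to handle.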


Sensors · 2021 · Vol 21 (6) · pp. 2092
Author(s): Juan Meléndez, Guillermo Guarnizo

An imaging Fourier-transform spectrometer operating in the mid-infrared (1850–6667 cm−1) has been used to acquire transmittance spectra, at a resolution of 1 cm−1, of three atmospheric pollutants with known column densities (Q): methane (258 ppm·m), nitrous oxide (107.5 ppm·m) and propane (215 ppm·m). Values of Q and temperature T have been retrieved by fitting the measured spectra with theoretical spectra generated from HITRAN database parameters, based on a radiometric model that takes into account gas absorption and emission and the instrument lineshape function. A principal component analysis (PCA) of the experimental data has found that two principal components are enough to reconstruct gas spectra with high fidelity. PCA-processed spectra have a better signal-to-noise ratio without loss of spatial resolution, improving the uniformity of retrieval. PCA has also been used to speed up retrieval: simulated spectra are pre-calculated for a range of expected Q and T values, PCA is applied to them, and the principal components of the experimental spectra are compared with those of the simulated ones to find the gas Q and T values. A reduction in calculation time by a factor larger than one thousand is achieved, with improved accuracy. Retrieval can be further simplified by obtaining T and Q as quadratic functions of the first two principal components.
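The lookup-table retrieval idea can be sketched as below. The toy `simulate` function (a single Gaussian absorption line with an invented T-dependent width) merely stands in for spectra generated from HITRAN parameters; the grid ranges are also invented:

```python
import numpy as np

rng = np.random.default_rng(3)
wavenumbers = np.linspace(1850, 6667, 300)

def simulate(Q, T):
    """Toy transmittance with one absorption line; a stand-in for the
    HITRAN-based radiometric model (the line shape here is invented)."""
    width = 50.0 * T / 300.0
    return np.exp(-(Q / 250.0) * np.exp(-0.5 * ((wavenumbers - 3000.0) / width) ** 2))

# Pre-compute a library of simulated spectra over a (Q, T) grid
grid = [(Q, T) for Q in np.linspace(50, 400, 30) for T in np.linspace(280, 320, 9)]
library = np.array([simulate(Q, T) for Q, T in grid])

mean = library.mean(axis=0)
U, s, Vt = np.linalg.svd(library - mean, full_matrices=False)
pcs = Vt[:2]                                   # two components, as in the paper
lib_coords = (library - mean) @ pcs.T

# Match a noisy "measured" spectrum in the 2-D component space
measured = simulate(258, 300) + 0.002 * rng.normal(size=wavenumbers.size)
coord = (measured - mean) @ pcs.T
best_Q, best_T = grid[int(np.argmin(np.linalg.norm(lib_coords - coord, axis=1)))]
```

Comparing two-dimensional coordinates instead of full 300-point spectra is what makes the table lookup so much faster than repeated fitting.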


2006 · Vol 1 (1)
Author(s): K. Katayama, K. Kimijima, O. Yamanaka, A. Nagaiwa, Y. Ono

This paper proposes a method for stormwater inflow prediction that uses radar rainfall data as the input to a prediction model constructed by system identification. The aim of the proposal is to construct a compact system by reducing the dimension of the input data. Principal Component Analysis (PCA), which is widely used as a statistical method for data analysis and compression, is applied to pre-process the radar rainfall data. We then evaluate the proposed method using radar rainfall data and inflow data acquired in a particular combined sewer system. This study reveals that a few principal components of the radar rainfall data are appropriate as input variables to a stormwater inflow prediction model. Consequently, we have established a procedure for stormwater inflow prediction using a few principal components of radar rainfall data.
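A hedged sketch of the overall idea — PCA compression of rainfall maps followed by least-squares identification of a linear inflow model — on entirely synthetic data (the rainfall fields, inflow relation, and dimensions are invented):

```python
import numpy as np

rng = np.random.default_rng(4)
# Hypothetical radar rainfall maps: 300 timesteps, 10x10 grid flattened to 100 cells
rain = np.clip(rng.normal(size=(300, 100)) + rng.normal(size=(300, 1)), 0, None)

mean = rain.mean(axis=0)
U, s, Vt = np.linalg.svd(rain - mean, full_matrices=False)
k = 3
scores = (rain - mean) @ Vt[:k].T      # compact input: 3 PCs instead of 100 cells

# Toy inflow signal driven by total rainfall plus noise (stand-in for sewer data)
inflow = rain.sum(axis=1) * 0.05 + rng.normal(scale=0.1, size=300)

# Least-squares identification of a linear inflow model on the PC scores
A = np.column_stack([scores, np.ones(300)])
theta, *_ = np.linalg.lstsq(A, inflow, rcond=None)
pred = A @ theta
```

Because the dominant rainfall variation is a common mode across the grid, the first few scores carry almost all the information the inflow model needs, which is the dimension-reduction argument of the paper.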


Author(s): Maryam Abedini, Horriyeh Haddad, Marzieh Faridi Masouleh, Asadollah Shahbahrami

This study proposes an image denoising algorithm based on sparse representation and Principal Component Analysis (PCA). The proposed algorithm includes the following steps. First, the noisy image is divided into overlapping [Formula: see text] blocks. Second, the discrete cosine transform is applied as a dictionary for the sparse representation of the vectors created from the overlapping blocks. The sparse vectors are calculated with the orthogonal matching pursuit algorithm. The dictionary is then updated by means of PCA to achieve the sparsest representation of the vectors. Since the signal energy, unlike the noise energy, is concentrated in a small number of coefficients after transforming into the PCA domain, the signal and noise can be well distinguished. The proposed algorithm was implemented in a MATLAB environment, and its performance was evaluated on standard grayscale images under different standard deviations of white Gaussian noise, in terms of peak signal-to-noise ratio, structural similarity index, and visual quality. The experimental results demonstrate that the proposed denoising algorithm achieves significant improvement over the dual-tree complex discrete wavelet transform and K-singular value decomposition denoising methods. It also obtains results competitive with the block-matching and 3D filtering (BM3D) method, the current state of the art in image denoising.


2014 · Vol 926-930 · pp. 4085-4088
Author(s): Chuan Jun Li

This article uses principal component analysis (PCA) to evaluate the level of corporate governance. PCA is applied to analyze the correlation among 10 original indicators and to extract principal components that retain most of the information in those indicators. An index of corporate governance is then formulated by weighting each principal component by its variance contribution rate, which allows a comprehensive evaluation of corporate governance.
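A sketch of this index construction on invented indicator data — component scores weighted by their variance contribution rates (the data, the 85% cutoff, and the firm count are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(6)
# Hypothetical scores of 50 firms on 10 governance indicators
X = rng.normal(size=(50, 10))
Z = (X - X.mean(axis=0)) / X.std(axis=0)         # standardize the indicators

cov = np.cov(Z, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]                # sort components descending
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Keep enough components to cover ~85% of the variance (illustrative cutoff)
k = int(np.sum(np.cumsum(eigvals) / eigvals.sum() < 0.85)) + 1
scores = Z @ eigvecs[:, :k]                      # principal component scores
weights = eigvals[:k] / eigvals[:k].sum()        # variance-contribution weights
index = scores @ weights                         # composite governance index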


2013 · Vol 834-836 · pp. 935-938
Author(s): Lian Shun Zhang, Chao Guo, Bao Quan Wang

In this paper, liquor brands were identified based on near-infrared spectroscopy and principal component analysis. Sixty samples of 6 different liquor brands were measured with a USB4000 spectrometer. In order to eliminate noise caused by external factors, a smoothing method and the multiplicative scatter correction method were applied, yielding revised spectra for the 60 samples. The differences in spectral shape between brands are not large enough to classify them directly, so principal component analysis was applied for further analysis. The results showed that the cumulative variance contribution rate of the first two principal components reached 99.06%, so these two components effectively represent the information in the preprocessed spectra. From the scatter plot of the two principal components, the 6 liquor brands were identified more accurately and more easily than from the spectral curves.
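The preprocessing-plus-PCA pipeline can be sketched on synthetic spectra. The spectra, the multiplicative scatter model, and the use of the mean spectrum as the MSC reference are all illustrative assumptions, not the paper's data:

```python
import numpy as np

rng = np.random.default_rng(7)
# Hypothetical NIR spectra: 60 samples (6 "brands" x 10), 200 wavelength points
w = np.linspace(0, 1, 200)
brands = [np.exp(-((w - c) / 0.08) ** 2) for c in np.linspace(0.2, 0.8, 6)]
spectra = np.array([b + 0.02 * rng.normal(size=200)
                    for b in brands for _ in range(10)])
# Simulated multiplicative scatter: random gain and offset per sample
spectra = spectra * rng.uniform(0.8, 1.2, (60, 1)) + rng.uniform(-0.05, 0.05, (60, 1))

# Multiplicative scatter correction: regress each spectrum on the mean spectrum,
# then remove the fitted gain and offset
ref = spectra.mean(axis=0)
msc = np.empty_like(spectra)
for i, sp in enumerate(spectra):
    slope, intercept = np.polyfit(ref, sp, 1)
    msc[i] = (sp - intercept) / slope

# PCA: the first two component scores give the brand scatter plot
centered = msc - msc.mean(axis=0)
U, s, Vt = np.linalg.svd(centered, full_matrices=False)
explained = s[:2] ** 2 / np.sum(s ** 2)
scores2 = centered @ Vt[:2].T          # one (PC1, PC2) point per sample
```

Plotting `scores2` with one color per brand reproduces the kind of two-component scatter plot the paper uses for brand identification.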


2015 · Vol 50 (8) · pp. 649-657
Author(s): Regina Maria Villas Bôas de Campos Leite, Maria Cristina Neves de Oliveira

Abstract: The objective of this work was to evaluate the suitability of the multivariate method of principal component analysis (PCA) using the GGE biplot software for grouping sunflower genotypes for their reaction to Alternaria leaf spot disease (Alternariaster helianthi), and for their yield and oil content. Sixty-nine genotypes were evaluated for disease severity in the field, at the R3 growth stage, in seven growing seasons, in Londrina, in the state of Paraná, Brazil, using a diagrammatic scale developed for this disease. Yield and oil content were also evaluated. Data were standardized using the software Statistica, and GGE biplot was used for PCA and graphical display of the data. The first two principal components explained 77.9% of the total variation. According to the polygonal biplot using the first two principal components and three response variables, the genotypes were divided into seven sectors. Genotypes located in sectors 1 and 2 showed high yield and high oil content, respectively, and those located in sector 7 showed tolerance to the disease and high yield, despite the high disease severity. Principal component analysis using GGE biplot is an efficient method for grouping sunflower genotypes based on the studied variables.


2015 · Vol 36 (6) · pp. 3909
Author(s): Michelle Santos da Silva, Luciana Shiotsuki, Raimundo Nonato Braga Lôbo, Olivardo Facó

A multivariate approach was adopted to evaluate the relationships among traits measured in the performance testing of Morada Nova sheep, to verify the efficiency of the ranking method used in these tests, and to identify the most significant traits for use in future analyses. Data from 150 young rams participating in five editions of the performance tests for the Morada Nova breed were used. Twenty traits were measured in each animal: initial weight (IW), final weight (FW), average daily weight gain (ADG), loin eye area (LEA), scrotal circumference (SC), fat thickness (FT), conformation (C), precocity (Pc), muscularity (M), breed features (BF), legs (L), withers height (WH), chest width (CW), rump height (RH), rump width (RW), rump length (RL), body length (BL), body depth (BD), heart girth (HG) and body condition score (BCS). The Pearson correlation coefficients ranged from –0.10 to 0.93, with the highest correlations between body weight variables and morphometric measurements. The first three principal components explained 72.28% of the total variability among all traits. Variables related to animal size defined the first principal component, whereas those related to visual appraisal and suitability for meat production defined the second and third principal components, respectively. The combination of traits from the principal component analysis showed that the ranking method currently used in the performance testing of Morada Nova sheep is efficient for selecting larger rams with better breed features and higher degrees of specialization for meat production.


2022
Author(s): Jaime González Maiz Jiménez, Adán Reyes Santiago

This research measures the systematic risk of 10 sectors in the American stock market, distinguishing the COVID-19 pandemic period. The novelty of this study is the use of Principal Component Analysis (PCA) to measure the systematic risk of each sector, selecting the five stocks with the greatest market capitalization in each sector. The results show that the sectors with the greatest increase in exposure to systematic risk during the pandemic are restaurants, clothing, and insurance, whereas the sectors with the greatest decrease are automakers and tobacco. Given these results, it seems advisable for practitioners to select stocks from the automakers or tobacco sectors for protection during health crises such as COVID-19.
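One standard PCA reading of systematic risk — the share of total return variance carried by the first principal component of a sector's stocks — sketched on simulated returns (the factor model, betas, and sample size are invented, not the study's data):

```python
import numpy as np

rng = np.random.default_rng(8)
# Hypothetical daily returns: 250 days x 5 stocks sharing one market factor
market = rng.normal(scale=0.01, size=(250, 1))
beta = np.array([0.8, 1.0, 1.2, 0.9, 1.1])       # illustrative factor loadings
returns = market * beta + rng.normal(scale=0.005, size=(250, 5))

centered = returns - returns.mean(axis=0)
cov = np.cov(centered, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]                # sort components descending
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Share of total variance captured by the first component: one reading of the
# sector's exposure to common (systematic) risk
systematic_share = eigvals[0] / eigvals.sum()
exposures = eigvecs[:, 0] * np.sqrt(eigvals[0])  # per-stock loadings (sign arbitrary)
```

Comparing `systematic_share` computed over pre-pandemic and pandemic windows is one way to quantify the kind of shift in sector exposure the study reports.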

