An Improvement on Horn's Parallel Analysis Methodology for Selecting the Correct Number of Factors to Retain

1995 ◽  
Vol 55 (3) ◽  
pp. 377-393 ◽  
Author(s):  
Louis W. Glorfeld

2021 ◽  
pp. 001316442098205
Author(s):  
André Beauducel ◽  
Norbert Hilger

Methods for optimal factor rotation of two-facet loading matrices have recently been proposed. However, the problem of the correct number of factors to retain for rotation of two-facet loading matrices has rarely been addressed in the context of exploratory factor analysis. Most previous studies were based on the observation that two-facet loading matrices may be rank deficient when the salient loadings of each factor have the same sign. It was shown here that full-rank two-facet loading matrices are, in principle, possible when some factors have positive and negative salient loadings. Accordingly, the current simulation study on the number of factors to extract for two-facet models was based on rank-deficient and full-rank two-facet population models. The number of factors to extract was estimated from traditional parallel analysis based on the mean of the unreduced eigenvalues, as well as from nine other fairly traditional variants of parallel analysis (based on the 95th percentile of eigenvalues, on reduced eigenvalues, or on eigenvalue differences). Parallel analysis based on the mean eigenvalues of the correlation matrix with the squared multiple correlations of each variable with the remaining variables inserted in the main diagonal had the highest detection rates for most of the two-facet factor models. Recommendations for the identification of the correct number of factors are based on the simulation results, on the results of an empirical example data set, and on the conditions for approximately rank-deficient and full-rank two-facet models.
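The best-performing variant above works on a "reduced" correlation matrix: the squared multiple correlation (SMC) of each variable with the remaining variables replaces the 1s on the main diagonal before the eigenvalues are taken. A minimal sketch of that reduction is below; the function name and interface are assumptions, not code from the study.

```python
import numpy as np

def smc_reduced_eigs(data):
    """Eigenvalues of the correlation matrix with squared multiple
    correlations (SMCs) on the main diagonal, as used by the reduced-
    eigenvalue parallel-analysis variant described above.
    Illustrative sketch only."""
    r = np.corrcoef(data, rowvar=False)
    # SMC of variable j equals 1 - 1 / (j-th diagonal of the inverse of R)
    smc = 1.0 - 1.0 / np.diag(np.linalg.inv(r))
    reduced = r.copy()
    np.fill_diagonal(reduced, smc)
    # return eigenvalues in descending order
    return np.sort(np.linalg.eigvalsh(reduced))[::-1]
```

In the variant the abstract favors, these reduced eigenvalues from the sample are compared against the mean reduced eigenvalues computed the same way from comparison datasets of independent variables.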


2006 ◽  
Author(s):  
Jinyan Fan ◽  
Felix James Lopez ◽  
Jennifer Nieman ◽  
Robert C. Litchfield ◽  
Robert S. Billings

2021 ◽  
pp. 001316442199283
Author(s):  
Yan Xia

Despite the existence of many methods for determining the number of factors, none outperforms the others under every condition. This study compares traditional parallel analysis (TPA), revised parallel analysis (RPA), Kaiser's rule, minimum average partial, sequential χ2, and sequential root mean square error of approximation, comparative fit index, and Tucker–Lewis index under a realistic scenario in behavioral studies, where researchers employ a close-fitting parsimonious model with K factors to approximate a population model, leading to a trivial model–data misfit. Results show that while TPA and RPA both stand out when zero population-level misfit exists, the accuracy of RPA substantially deteriorates when a K-factor model can only closely approximate the population. TPA is the least sensitive to trivial misfits and results in the highest accuracy across most simulation conditions. This study suggests the use of TPA for the investigated models. Results also imply that RPA requires further revision to accommodate a degree of model–data misfit that can be tolerated.


2018 ◽  
Vol 79 (1) ◽  
pp. 85-107 ◽  
Author(s):  
Yan Xia ◽  
Samuel B. Green ◽  
Yuning Xu ◽  
Marilyn S. Thompson

Past research suggests revised parallel analysis (R-PA) tends to yield relatively accurate results in determining the number of factors in exploratory factor analysis. R-PA can be interpreted as a series of hypothesis tests. At each step in the series, a null hypothesis is tested that an additional factor accounts for zero common variance among measures in the population. Integration of an effect size statistic—the proportion of common variance (PCV)—into this testing process should allow for a more nuanced interpretation of R-PA results. In this article, we initially assessed the psychometric qualities of three PCV statistics that can be used in conjunction with principal axis factor analysis: the standard PCV statistic and two modifications of it. Based on analyses of generated data, the modification that considers only positive eigenvalues overall yielded the best results. Next, we examined PCV using minimum rank factor analysis, a method that avoids the extraction of negative eigenvalues. PCV with minimum rank factor analysis generally did not perform as well as the positive-eigenvalue modification, even with a relatively large sample size of 5,000. Finally, we investigated the use of the positive-eigenvalue modification in combination with R-PA and concluded that practitioners can gain additional information from it and make more nuanced decisions about the number of factors when R-PA fails to retain the correct number of factors.
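One plausible reading of the positive-eigenvalue modification of PCV is the share of common variance attributable to a given factor when only positive eigenvalues enter the denominator. The sketch below illustrates that reading; the exact definition used in the study is not given in the abstract, so this formula and the function name are assumptions.

```python
import numpy as np

def pcv_positive(eigs, k):
    """Proportion of common variance for the k-th factor, with only the
    positive eigenvalues contributing to the denominator (one plausible
    reading of the modification the abstract favors; assumption)."""
    eigs = np.sort(np.asarray(eigs, dtype=float))[::-1]  # descending
    pos = eigs[eigs > 0]                                 # drop negative eigenvalues
    return eigs[k - 1] / pos.sum()
```

For example, with reduced-matrix eigenvalues 3.0, 1.0, and −0.5, the first factor's PCV under this reading is 3.0 / (3.0 + 1.0) = 0.75, whereas the standard PCV would let the negative eigenvalue shrink the denominator.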


2017 ◽  
Vol 78 (4) ◽  
pp. 589-604 ◽  
Author(s):  
Samuel Green ◽  
Yuning Xu ◽  
Marilyn S. Thompson

Parallel analysis (PA) assesses the number of factors in exploratory factor analysis. Traditionally, PA compares the eigenvalues for a sample correlation matrix with the eigenvalues for correlation matrices for 100 comparison datasets generated such that the variables are independent, but this approach uses the wrong reference distribution. The proper reference distribution of eigenvalues assesses the kth factor based on comparison datasets with k−1 underlying factors. Two methods that use the proper reference distribution are revised PA (R-PA) and the comparison data method (CDM). We compare the accuracies of these methods using Monte Carlo methods by manipulating the factor structure, factor loadings, factor correlations, and number of observations. In the 17 conditions in which CDM was more accurate than R-PA, both methods evidenced high accuracies (i.e., >94.5%). In these conditions, CDM had slightly higher accuracies (mean difference of 1.6%). In contrast, in the remaining 25 conditions, R-PA evidenced higher accuracies (mean difference of 12.1%, and considerably higher for some conditions). We consider these findings in conjunction with previous research investigating PA methods and conclude that R-PA tends to offer somewhat stronger results. Nevertheless, further research is required. Given that both CDM and R-PA involve hypothesis testing, we argue that future research should explore effect size statistics to augment these methods.
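The traditional procedure this abstract starts from can be sketched in a few lines: compute the sample correlation eigenvalues, simulate comparison datasets of independent normal variables of the same size, and retain the leading factors whose sample eigenvalues exceed the corresponding reference means. This is an illustrative sketch of traditional PA under those assumptions, not code from any of the studies listed here.

```python
import numpy as np

def parallel_analysis(data, n_sets=100, seed=0):
    """Traditional parallel analysis: retain factors whose sample
    eigenvalues exceed the mean eigenvalues of comparison datasets
    with independent variables. Illustrative sketch only."""
    rng = np.random.default_rng(seed)
    n, p = data.shape
    sample_eigs = np.sort(np.linalg.eigvalsh(np.corrcoef(data, rowvar=False)))[::-1]
    ref = np.empty((n_sets, p))
    for i in range(n_sets):
        sim = rng.standard_normal((n, p))  # independent variables, same n and p
        ref[i] = np.sort(np.linalg.eigvalsh(np.corrcoef(sim, rowvar=False)))[::-1]
    mean_ref = ref.mean(axis=0)
    # count leading sample eigenvalues that exceed the reference means
    k = 0
    for s, r in zip(sample_eigs, mean_ref):
        if s <= r:
            break
        k += 1
    return k
```

The "wrong reference distribution" critique in the abstract targets the `rng.standard_normal` step: for the kth factor, R-PA and CDM instead generate comparison data from a model with k−1 underlying factors rather than from fully independent variables.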


2011 ◽  
Vol 72 (3) ◽  
pp. 357-374 ◽  
Author(s):  
Samuel B. Green ◽  
Roy Levy ◽  
Marilyn S. Thompson ◽  
Min Lu ◽  
Wen-Juo Lo

A number of psychometricians have argued for the use of parallel analysis to determine the number of factors. However, parallel analysis must be viewed at best as a heuristic approach rather than a mathematically rigorous one. The authors suggest a revision to parallel analysis that could improve its accuracy. A Monte Carlo study is conducted to compare revised and traditional parallel analysis approaches. Five dimensions are manipulated in the study: number of observations, number of factors, number of measured variables, size of the factor loadings, and degree of correlation between factors. Based on the results, the revised parallel analysis method, using principal axis factoring and the 95th percentile eigenvalue rule, offers promise.


2010 ◽  
Vol 70 (6) ◽  
pp. 885-901 ◽  
Author(s):  
Aaron V. Crawford ◽  
Samuel B. Green ◽  
Roy Levy ◽  
Wen-Juo Lo ◽  
Lietta Scott ◽  
...  

2019 ◽  
Vol 24 (4) ◽  
pp. 452-467 ◽  
Author(s):  
Sangdon Lim ◽  
Seungmin Jahng
