Number of Factors Decision: Parallel Analysis Is Not the Panacea

2006 ◽  
Author(s):  
Jinyan Fan ◽  
Felix James Lopez ◽  
Jennifer Nieman ◽  
Robert C. Litchfield ◽  
Robert S. Billings


2021 ◽  
pp. 001316442199283 ◽  
Author(s):  
Yan Xia

Despite the existence of many methods for determining the number of factors, none outperforms the others under every condition. This study compares traditional parallel analysis (TPA), revised parallel analysis (RPA), Kaiser’s rule, minimum average partial, sequential χ2 tests, and sequential root mean square error of approximation, comparative fit index, and Tucker–Lewis index under a realistic scenario in behavioral studies, where researchers employ a close-fitting parsimonious model with K factors to approximate a population model, leading to trivial model–data misfit. Results show that while TPA and RPA both stand out when zero population-level misfit exists, the accuracy of RPA substantially deteriorates when a K-factor model can only closely approximate the population. TPA is the least sensitive to trivial misfits and yields the highest accuracy across most simulation conditions. This study suggests the use of TPA for the investigated models. Results also imply that RPA requires further revision to accommodate a tolerable degree of model–data misfit.
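
To make the baseline procedure concrete, here is a minimal Python sketch of TPA under common conventions (100 comparison datasets of independent normal variables, retention while the sample eigenvalue exceeds the 95th percentile of the comparison eigenvalues). The function name and these parameter choices are illustrative assumptions, not details from the article.

```python
import numpy as np

def traditional_parallel_analysis(data, n_sets=100, percentile=95, seed=0):
    """Suggest the number of factors for an (n, p) data matrix via TPA."""
    rng = np.random.default_rng(seed)
    n, p = data.shape
    # Eigenvalues of the sample correlation matrix, largest first.
    sample_eigs = np.linalg.eigvalsh(np.corrcoef(data, rowvar=False))[::-1]

    # Eigenvalues from comparison datasets with independent variables.
    random_eigs = np.empty((n_sets, p))
    for i in range(n_sets):
        noise = rng.standard_normal((n, p))
        random_eigs[i] = np.linalg.eigvalsh(
            np.corrcoef(noise, rowvar=False))[::-1]
    thresholds = np.percentile(random_eigs, percentile, axis=0)

    # Retain factors while each sample eigenvalue exceeds its threshold.
    k = 0
    for s, t in zip(sample_eigs, thresholds):
        if s <= t:
            break
        k += 1
    return k
```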


2018 ◽  
Vol 79 (1) ◽  
pp. 85-107 ◽  
Author(s):  
Yan Xia ◽  
Samuel B. Green ◽  
Yuning Xu ◽  
Marilyn S. Thompson

Past research suggests revised parallel analysis (R-PA) tends to yield relatively accurate results in determining the number of factors in exploratory factor analysis. R-PA can be interpreted as a series of hypothesis tests. At each step in the series, a null hypothesis is tested that an additional factor accounts for zero common variance among measures in the population. Integration of an effect size statistic, the proportion of common variance (PCV), into this testing process should allow for a more nuanced interpretation of R-PA results. In this article, we initially assessed the psychometric qualities of three PCV statistics that can be used in conjunction with principal axis factor analysis: the standard PCV statistic and two modifications of it. Based on analyses of generated data, the modification that considered only positive eigenvalues (denoted PCV+ here) overall yielded the best results. Next, we examined PCV using minimum rank factor analysis, a method that avoids the extraction of negative eigenvalues. PCV with minimum rank factor analysis generally did not perform as well as PCV+, even with a relatively large sample size of 5,000. Finally, we investigated the use of PCV+ in combination with R-PA and concluded that practitioners can gain additional information from PCV+ and make a more nuanced decision about the number of factors when R-PA fails to retain the correct number of factors.
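
As a rough illustration of the effect-size idea, the sketch below computes a PCV-style statistic for the kth factor: the kth eigenvalue of a reduced correlation matrix divided by the sum of all eigenvalues, together with the positive-eigenvalue modification that divides by the sum of positive eigenvalues only. The reduced matrix here uses squared multiple correlations as communality estimates, a common principal-axis starting point; the exact definitions in the article may differ, so treat this as a sketch of the concept rather than the authors' statistic.

```python
import numpy as np

def pcv_statistics(data, k):
    """Illustrative PCV for the k-th factor (1-indexed) of an (n, p) matrix."""
    r = np.corrcoef(data, rowvar=False)
    # Reduced correlation matrix: replace the unit diagonal with squared
    # multiple correlations (SMCs), the usual communality estimates.
    smc = 1.0 - 1.0 / np.diag(np.linalg.inv(r))
    reduced = r.copy()
    np.fill_diagonal(reduced, smc)
    eigs = np.linalg.eigvalsh(reduced)[::-1]  # descending; some may be < 0

    pcv_standard = eigs[k - 1] / eigs.sum()
    pcv_positive = eigs[k - 1] / eigs[eigs > 0].sum()  # PCV+ variant
    return pcv_standard, pcv_positive
```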


2017 ◽  
Vol 78 (4) ◽  
pp. 589-604 ◽  
Author(s):  
Samuel Green ◽  
Yuning Xu ◽  
Marilyn S. Thompson

Parallel analysis (PA) assesses the number of factors in exploratory factor analysis. Traditionally, PA compares the eigenvalues for a sample correlation matrix with the eigenvalues for correlation matrices for 100 comparison datasets generated such that the variables are independent, but this approach uses the wrong reference distribution. The proper reference distribution of eigenvalues assesses the kth factor based on comparison datasets with k−1 underlying factors. Two methods that use the proper reference distribution are revised PA (R-PA) and the comparison data method (CDM). We compared the accuracies of these methods using Monte Carlo methods, manipulating the factor structure, factor loadings, factor correlations, and number of observations. In the 17 conditions in which CDM was more accurate than R-PA, both methods evidenced high accuracies (i.e., >94.5%); in these conditions, CDM had slightly higher accuracies (mean difference of 1.6%). In contrast, in the remaining 25 conditions, R-PA evidenced higher accuracies (mean difference of 12.1%, and considerably higher for some conditions). We considered these findings in conjunction with previous research investigating PA methods and concluded that R-PA tends to offer somewhat stronger results. Nevertheless, further research is required. Given that both CDM and R-PA involve hypothesis testing, we argue that future research should explore effect size statistics to augment these methods.
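
The central idea, assessing the kth factor against comparison datasets with k−1 underlying factors rather than zero, can be sketched as follows. For brevity, this sketch takes an orthogonal (k−1)-factor loading matrix as given and uses eigenvalues of the unreduced correlation matrix; the published R-PA procedure instead estimates the reduced-rank model from the sample and works with principal-axis eigenvalues, so the helper names and simplifications here are illustrative.

```python
import numpy as np

def simulate_factor_model(loadings, n, rng):
    """Draw an (n, p) sample from an orthogonal factor model with a (p, m)
    loading matrix, assuming unit total variances (communalities < 1)."""
    p, m = loadings.shape
    uniquenesses = 1.0 - (loadings ** 2).sum(axis=1)
    factors = rng.standard_normal((n, m))
    errors = rng.standard_normal((n, p)) * np.sqrt(uniquenesses)
    return factors @ loadings.T + errors

def kth_eigenvalue_reference(loadings_km1, n, k, n_sets=100,
                             percentile=95, seed=0):
    """Reference value for the k-th eigenvalue, from comparison datasets
    generated under a (k-1)-factor model."""
    rng = np.random.default_rng(seed)
    vals = []
    for _ in range(n_sets):
        x = simulate_factor_model(loadings_km1, n, rng)
        eigs = np.linalg.eigvalsh(np.corrcoef(x, rowvar=False))[::-1]
        vals.append(eigs[k - 1])
    return np.percentile(vals, percentile)
```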


2011 ◽  
Vol 72 (3) ◽  
pp. 357-374 ◽  
Author(s):  
Samuel B. Green ◽  
Roy Levy ◽  
Marilyn S. Thompson ◽  
Min Lu ◽  
Wen-Juo Lo

A number of psychometricians have argued for the use of parallel analysis to determine the number of factors. However, parallel analysis must be viewed at best as a heuristic approach rather than a mathematically rigorous one. The authors suggest a revision to parallel analysis that could improve its accuracy. A Monte Carlo study is conducted to compare revised and traditional parallel analysis approaches. Five dimensions are manipulated in the study: number of observations, number of factors, number of measured variables, size of the factor loadings, and degree of correlation between factors. Based on the results, the revised parallel analysis method, using principal axis factoring and the 95th percentile eigenvalue rule, offers promise.
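
A Monte Carlo design of this kind needs a population data generator whose loadings, factor correlations, and sample size can be manipulated. A minimal sketch under a standard correlated-factor model, with illustrative values rather than the article's exact conditions:

```python
import numpy as np

def generate_population_data(n, loadings, phi, seed=0):
    """Draw an (n, p) sample from a correlated-factor population model with
    implied covariance Sigma = L Phi L' + Psi and unit variances."""
    rng = np.random.default_rng(seed)
    p = loadings.shape[0]
    common = loadings @ phi @ loadings.T
    psi = 1.0 - np.diag(common)  # uniquenesses that yield unit variances
    sigma = common + np.diag(psi)
    return rng.multivariate_normal(np.zeros(p), sigma, size=n)

# One illustrative condition: 6 variables, 2 factors correlated at .3,
# loadings of .6 in a simple structure.
L = np.zeros((6, 2))
L[:3, 0] = 0.6
L[3:, 1] = 0.6
Phi = np.array([[1.0, 0.3],
                [0.3, 1.0]])
data = generate_population_data(n=300, loadings=L, phi=Phi)
```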


2010 ◽  
Vol 70 (6) ◽  
pp. 885-901 ◽  
Author(s):  
Aaron V. Crawford ◽  
Samuel B. Green ◽  
Roy Levy ◽  
Wen-Juo Lo ◽  
Lietta Scott ◽  
...  

2019 ◽  
Vol 24 (4) ◽  
pp. 452-467 ◽  
Author(s):  
Sangdon Lim ◽  
Seungmin Jahng

2020 ◽  
Vol 41 (1) ◽  
pp. 137-168
Author(s):  
Bob Carter

This response to Huw Beynon’s paper, ‘After the Long Boom: Living with Capitalism in the Twenty-First Century’ in HSIR 40 (2019), offers a parallel analysis of the fortunes of labour in the public sector. Among Beynon’s central observations, drawing on Karl Marx and Harry Braverman, was the continued reproduction of ‘unskilled’ and degraded labour. A parallel process, de-professionalizing occupations through the separation of conception and execution, has been a feature of the almost continual restructuring of state and local authority organizations and their work practices since the 1960s. This has accelerated in the era of governments committed to neoliberal values and policies. Although public-sector trade unions have been largely conservative and defensive in their values and practice, a number of factors, both structural and conjunctural, have compelled them to face this new reality and made them the organizations most likely to challenge the expanding reach of neoliberalism. Recognizing these factors provides a possible remedy to the implied pessimism that follows from the largely private-sector focus of Beynon’s contribution.

