The Validity of Specific Cognitive Abilities for the Prediction of Training Success in Germany

2014 ◽  
Vol 13 (3) ◽  
pp. 123-133 ◽  
Author(s):  
Wiebke Goertz ◽  
Ute R. Hülsheger ◽  
Günter W. Maier

General mental ability (GMA) has long been considered one of the best predictors of training success and considerably better than specific cognitive abilities (SCAs). Recently, however, researchers have provided evidence that SCAs may be of similar importance for training success, a finding supporting personnel selection based on job-related requirements. The present meta-analysis therefore seeks to assess validities of SCAs for training success in various occupations in a sample of German primary studies. Our meta-analysis (k = 72) revealed operational validities between ρ = .18 and ρ = .26 for different SCAs. Furthermore, results varied by occupational category, supporting a job-specific benefit of SCAs.
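
To make the reported operational validities concrete, the sketch below shows the usual Hunter-Schmidt-style steps behind estimates such as ρ = .18 to .26: a sample-size-weighted mean of observed validities, followed by a correction for criterion unreliability. The study values and the assumed criterion reliability are hypothetical and are not taken from the meta-analysis.

```python
import numpy as np

# Hypothetical primary-study validities r_i and sample sizes n_i for one SCA
# (illustrative values only, not from the meta-analysis).
r = np.array([0.15, 0.22, 0.19, 0.30, 0.12])
n = np.array([120, 85, 240, 60, 150])

# Sample-size-weighted mean observed validity.
r_bar = np.sum(n * r) / np.sum(n)

# Operational validity: correct the mean r for criterion unreliability only
# (other artifacts such as range restriction are ignored in this sketch).
ryy = 0.80  # assumed reliability of the training-success criterion
rho_operational = r_bar / np.sqrt(ryy)

print(f"weighted mean r = {r_bar:.3f}, operational validity = {rho_operational:.3f}")
```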

2014 ◽  
Vol 13 (3) ◽  
pp. 117-122 ◽  
Author(s):  
Stefan Krumm ◽  
Lothar Schmidt-Atzert ◽  
Anastasiya A. Lipnevich

Recent findings suggest that the role of specific cognitive abilities in predicting work-related criteria may be critical and may add to the widely demonstrated importance of general mental ability. To summarize and organize these findings, the current paper puts forward two perspectives on the role of specific cognitive abilities in predicting work-related outcomes. Similarities and discrepancies of these perspectives are outlined together with suggestions for boundary conditions of the dominance of general versus specific cognitive abilities. Finally, avenues for future research within and across the two perspectives are discussed.


2018 ◽  
Vol 4 (3) ◽  
pp. 1 ◽  
Author(s):  
Jaroslaw Grobelny

There are two main views on the role of cognitive abilities in job performance prediction. The first approach is based on meta-analytic and incremental validity research, and its main assumption is that general mental ability (GMA) is the best predictor of job performance regardless of the occupation. The second approach, referred to as specific validity theory, assumes that job-unique weighting of different specific mental abilities (SMA) predicts job performance better than GMA and that the occupational context cannot be ignored when predicting job performance. A validity study of both GMA and SMA as predictors of job performance across different occupational groups (N = 4033, k = 15) was conducted. The results were analyzed by calculating observed validity coefficients and by conducting incremental validity and relative importance analyses. The results support specific validity theory: SMA proved to be a valid predictor of job performance, and occupational context moderated GMA validity.
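
A minimal sketch of the incremental-validity step described above, using simulated data: regress job performance on GMA alone, then on GMA plus SMAs, and read off the gain in R² (ΔR²). The relative importance analysis mentioned in the abstract is omitted for brevity; all variable names and values below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Simulated predictors: general mental ability and two specific mental abilities.
gma = rng.normal(size=n)
sma_numeric = 0.5 * gma + rng.normal(scale=0.8, size=n)
sma_verbal = 0.5 * gma + rng.normal(scale=0.8, size=n)

# Simulated job performance with some SMA-specific signal beyond GMA.
perf = 0.30 * gma + 0.25 * sma_numeric + 0.10 * sma_verbal + rng.normal(size=n)

def r_squared(X, y):
    """R^2 from an ordinary least-squares fit with an intercept."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

r2_gma = r_squared(gma.reshape(-1, 1), perf)
r2_full = r_squared(np.column_stack([gma, sma_numeric, sma_verbal]), perf)

print(f"R^2 (GMA only)   = {r2_gma:.3f}")
print(f"R^2 (GMA + SMAs) = {r2_full:.3f}")
print(f"Incremental dR^2 = {r2_full - r2_gma:.3f}")
```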


1988 ◽  
Vol 63 (1) ◽  
pp. 131-134 ◽  
Author(s):  
Deborah L. Whetzel ◽  
Michael A. McDaniel

This paper addresses the usefulness of reporting coder reliability in validity generalization studies. The Principles for the Validation and Use of Personnel Selection Instruments of the Society for Industrial and Organizational Psychology state that, given the results of meta-analytic studies, validities generalize far more than previously believed; however, users of validity generalization results are required to report the reliability of the data entering validity generalization analyses. In response to this concern, reliability coefficients were computed for the validities and sample sizes coded in two independent studies (i.e., databases) of the Wonderlic Personnel Test and the Otis Test of General Mental Ability. Validity and sample size were examined because they are the crucial components of a validity generalization analysis. Results indicated that the correlation between the validities of the two studies was .99 and the correlation between the sample sizes was 1.00. To illustrate further the reliability of coding in validity generalization research, separate meta-analyses were conducted on the validity of these tests in each of the two databases. When correcting only for sampling error, the separate meta-analyses yielded identical results, M = .24, SD = .09. These results show that concerns about the reliability of validity generalization databases are unwarranted and that independent investigators coding the same data record the same values and obtain the same results.
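
The "correcting only for sampling error" result (M = .24, SD = .09) corresponds to a Hunter-Schmidt bare-bones meta-analysis. Below is a minimal sketch of that procedure, using made-up validities and sample sizes rather than the actual Wonderlic/Otis databases.

```python
import numpy as np

# Hypothetical coded validities and sample sizes (not the actual Wonderlic/Otis data).
r = np.array([0.21, 0.30, 0.18, 0.27, 0.24, 0.15])
n = np.array([68, 120, 95, 210, 55, 140])

# Bare-bones meta-analysis (Hunter & Schmidt): correct only for sampling error.
r_bar = np.sum(n * r) / np.sum(n)                   # weighted mean validity (M)
var_obs = np.sum(n * (r - r_bar) ** 2) / np.sum(n)  # observed variance of r
var_err = (1 - r_bar ** 2) ** 2 / (n - 1)           # sampling-error variance per study
var_err_mean = np.sum(n * var_err) / np.sum(n)      # weighted average error variance
var_res = max(var_obs - var_err_mean, 0.0)          # residual variance after correction

print(f"M = {r_bar:.2f}, SD = {np.sqrt(var_res):.2f}")
```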


2014 ◽  
Vol 2014 (1) ◽  
pp. 11088 ◽  
Author(s):  
Erik Gonzalez-Mule ◽  
Kameron Carter ◽  
Michael K. Mount

Author(s):  
Luke I. Rowe ◽  
John Hattie ◽  
Robert Hester

Collective intelligence (CI) is said to manifest in a group’s domain-general mental ability. It can be measured across a battery of group IQ tests and statistically reduced to a latent factor called the “c-factor.” Advocates have found that the c-factor predicts group performance better than individual IQ. We test this claim by meta-analyzing correlations between the c-factor and nine group performance criterion tasks generated by eight independent samples (N = 857 groups). Results indicated a moderate correlation of r = .26 (95% CI [.10, .40]). Only four studies, comprising five independent samples (N = 366 groups), controlled for the intelligence of individual members using individual IQ scores or their statistically reduced equivalent (i.e., the g-factor). A meta-analysis of this subset of studies found that the average IQ of the groups’ members had little to no correlation with group performance (r = .06, 95% CI [−.08, .20]). Around 80% of studies did not have enough statistical power to reliably detect correlations between the primary predictor variables and the criterion tasks. Though some of our findings are consistent with claims that a general factor of group performance may exist and relate positively to group performance, these limitations suggest that alternative explanations cannot be dismissed. We caution against prematurely embracing notions of the c-factor unless it can be independently and robustly replicated and demonstrated to be incrementally valid beyond the g-factor in group performance contexts.
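
For readers unfamiliar with how a pooled estimate such as r = .26 (95% CI [.10, .40]) can be computed, the sketch below averages Fisher-z-transformed correlations weighted by n − 3 under a fixed-effect model. The abstract does not state which pooling model the authors used, and the per-sample values below are hypothetical.

```python
import numpy as np

# Hypothetical per-sample correlations between the c-factor and a criterion task,
# with the number of groups per sample (not the actual values from the studies).
r = np.array([0.35, 0.18, 0.40, 0.10, 0.22, 0.31, 0.05, 0.29])
n = np.array([60, 152, 46, 112, 68, 193, 124, 102])

# Fixed-effect pooling on the Fisher-z scale, weights = n - 3.
z = np.arctanh(r)
w = n - 3.0
z_bar = np.sum(w * z) / np.sum(w)
se = 1.0 / np.sqrt(np.sum(w))
ci_lo, ci_hi = np.tanh(z_bar - 1.96 * se), np.tanh(z_bar + 1.96 * se)

print(f"pooled r = {np.tanh(z_bar):.2f}, 95% CI [{ci_lo:.2f}, {ci_hi:.2f}]")
```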

