Gathering Behavioral Samples Through a Computerized and Standardized Assessment Center Exercise

2010, Vol 9 (2), pp. 94-98. Author(s): Filip Lievens, Etienne Van Keer, Ellen Volckaert

Although computerization and standardization might make assessment center (AC) exercises easier to administer and score, a drawback is that most such exercises have a static, multiple-choice format. This study reports on the development and initial validation of a computerized and standardized AC exercise that simulates key managerial tasks. The exercise capitalizes on the benefits of computerization and standardization (efficiency and cost savings) while aiming to avoid their usual drawbacks (lower response fidelity and interactivity). The composite exercise score was significantly related to several criteria of interest and showed incremental validity beyond cognitive ability. The exercise was also significantly related to candidates’ people management competencies.
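Incremental validity of this kind is usually evaluated with hierarchical regression: the criterion is regressed on cognitive ability first, the exercise score is added in a second step, and the gain in explained variance (ΔR²) is tested. The sketch below illustrates that generic analysis only; the variable names and simulated data are placeholders, not the study's measures.

```python
# Generic incremental-validity check via hierarchical regression (illustrative data only).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
cognitive = rng.normal(size=n)                   # step-1 predictor (cognitive ability)
exercise = 0.5 * cognitive + rng.normal(size=n)  # step-2 predictor (AC exercise score)
criterion = 0.4 * cognitive + 0.3 * exercise + rng.normal(size=n)

step1 = sm.OLS(criterion, sm.add_constant(np.column_stack([cognitive]))).fit()
step2 = sm.OLS(criterion, sm.add_constant(np.column_stack([cognitive, exercise]))).fit()

delta_r2 = step2.rsquared - step1.rsquared  # variance explained beyond cognitive ability
f_value, p_value, df_diff = step2.compare_f_test(step1)  # test of the R-squared gain
print(f"R2 step 1 = {step1.rsquared:.3f}, R2 step 2 = {step2.rsquared:.3f}, delta R2 = {delta_r2:.3f}")
print(f"F = {f_value:.2f}, p = {p_value:.4f}")
```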

1998, Vol 51 (2), pp. 357-374. Author(s): Harold W. Goldstein, Kenneth P. Yusko, Eric P. Braverman, D. Brent Smith, Beth Chung

2013, Vol 12 (4), pp. 157-169. Author(s): Philip L. Roth, Allen I. Huffcutt

The topic of what interviews measure has received a great deal of attention over the years. One line of research has investigated the relationship between interviews and the construct of cognitive ability. A previous meta-analysis reported an overall corrected correlation of .40 (Huffcutt, Roth, & McDaniel, 1996). A more recent meta-analysis reported a noticeably lower corrected correlation of .27 (Berry, Sackett, & Landers, 2007). A review of both meta-analyses suggests that the two studies posed different research questions. Further, a number of coding judgments in Berry et al. merit review, and there was no moderator analysis for educational versus employment interviews. We therefore reanalyzed the work by Berry et al. and found a corrected correlation of .42 for employment interviews (.15 higher than Berry et al., a 56% increase). Educational interviews were associated with a corrected correlation of .21, supporting their role as a moderator. We suggest that a better estimate of the correlation between employment interviews and cognitive ability is .42, which takes us “back to the future” in that the better overall estimate of the relationship between employment interviews and cognitive ability is roughly .40. This difference has implications for what interviews measure and for their incremental validity.
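The arithmetic behind the quoted figures can be checked directly: .42 versus Berry et al.'s .27 is an absolute difference of .15, which is roughly a 56% relative increase. The snippet below is a pure arithmetic check, not a reanalysis of any study data.

```python
# Sanity check of the reported gap between the two meta-analytic estimates.
reanalysis = 0.42    # corrected correlation for employment interviews (reanalysis)
berry_et_al = 0.27   # overall corrected correlation reported by Berry, Sackett, & Landers (2007)

difference = reanalysis - berry_et_al
relative_increase = difference / berry_et_al
print(f"absolute difference = {difference:.2f}")        # 0.15
print(f"relative increase  = {relative_increase:.0%}")  # ~56%
```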


1994, Vol 47 (4), pp. 715-738. Author(s): Therese Hoff Macan, Marcia J. Avedon, Matthew Paese, David E. Smith

2020, Vol 37 (3), pp. 435-452. Author(s): Yi-Jui Iva Chen, Mark Wilson, Robin C. Irey, Mary K. Requa

Orthographic processing – the ability to perceive, access, differentiate, and manipulate orthographic knowledge – is essential when learning to recognize words. Despite the critical importance of orthographic processing in literacy acquisition, the field lacks a tool to assess this essential cognitive ability. The goal of this study was to design a computer-based assessment of orthographic processing and investigate its psychometric properties. The rationale for designing specific items was discussed, the methods used to separate orthographic processing from word recognition and spelling ability were presented, and item suitability was examined. Person separation reliability was .91 for this assessment. Validity evidence was gathered and reported.
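Person separation reliability is the Rasch-measurement analogue of score reliability: the proportion of variance in estimated person measures that is not attributable to measurement error. The sketch below uses the standard textbook formula with placeholder person estimates and standard errors; the abstract does not state how the authors computed their .91.

```python
# Person separation reliability: (observed variance - mean error variance) / observed variance.
import numpy as np

person_measures = np.array([-1.2, -0.4, 0.1, 0.8, 1.5, 2.0])      # illustrative Rasch person estimates (logits)
standard_errors = np.array([0.35, 0.30, 0.28, 0.30, 0.33, 0.40])  # illustrative standard errors of those estimates

observed_var = person_measures.var(ddof=1)  # variance of the estimated person measures
error_var = np.mean(standard_errors ** 2)   # average error variance across persons
true_var = observed_var - error_var         # error-adjusted ("true") person variance

reliability = true_var / observed_var
print(f"person separation reliability = {reliability:.2f}")
```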


2000, Vol 30 (7), pp. 1474-1491. Author(s): Paul E. Spector, Jeffrey R. Schneider, Carol A. Vance, Sarah A. Hezlett

1994, Vol 2 (1), pp. 53-58. Author(s): John E. Delery, Patrick M. Wright, Kari McArthur, D. Christopher Anderson

2013, Vol 12 (2), pp. 63-73. Author(s): Matthew J. W. McLarnon, Mitchell G. Rothstein

This study sought to provide initial psychometric evidence supporting a new measure of resiliency. Given the shortcomings of previous measures, a more comprehensive measure was developed based on the theoretical model of King and Rothstein (2010). The resulting measure, the Workplace Resilience Inventory (WRI), encompasses an individual’s personal characteristics, social support network, initial responses to a significant and life-changing event, and self-regulatory processes. Following a rigorous, theoretically based, and empirically supported procedure for selecting items, the facets of the WRI demonstrated acceptable internal consistency as well as adequate independence. The WRI showed significant relations with important well-being criteria, such as satisfaction with life, depression, and perceived stress, and demonstrated incremental validity beyond a previously validated measure of resiliency, the Psychological Capital (PsyCap) questionnaire.
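Internal consistency of facet scores of this kind is typically summarized with Cronbach's alpha. The sketch below shows the standard alpha formula on illustrative item responses; the data and scale length are placeholders, not WRI items or results.

```python
# Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of the total score).
import numpy as np

# Illustrative responses: 6 respondents x 4 items on a 5-point scale (not WRI data).
items = np.array([
    [4, 5, 4, 5],
    [3, 3, 4, 3],
    [2, 2, 3, 2],
    [5, 4, 5, 5],
    [3, 4, 3, 4],
    [1, 2, 2, 1],
])

k = items.shape[1]
item_variances = items.var(axis=0, ddof=1)      # variance of each item across respondents
total_variance = items.sum(axis=1).var(ddof=1)  # variance of the summed scale score

alpha = (k / (k - 1)) * (1 - item_variances.sum() / total_variance)
print(f"Cronbach's alpha = {alpha:.2f}")
```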

