Validity in psychological research: The interpretation of experiments; sources of variance in the experiment; validity in experiments and other research designs; types of validity (statistical conclusion; internal; construct); tackling confounds in psychological research; expectancy; external validity; meta-analysis

2020, Vol 24 (4), pp. 316-344
Author(s): Leandre R. Fabrigar, Duane T. Wegener, Richard E. Petty

In recent years, psychology has wrestled with the broader implications of disappointing rates of replication of previously demonstrated effects. This article proposes that many aspects of this pattern of results can be understood within the classic framework of four proposed forms of validity: statistical conclusion validity, internal validity, construct validity, and external validity. The article explains the conceptual logic for how differences in each type of validity across an original study and a subsequent replication attempt can lead to replication “failure.” Existing themes in the replication literature related to each type of validity are also highlighted. Furthermore, empirical evidence is considered for the role of each type of validity in non-replication. The article concludes with a discussion of broader implications of this classic validity framework for improving replication rates in psychological research.


2014, Vol 45 (3), pp. 239-245
Author(s): Robert J. Calin-Jageman, Tracy L. Caldwell

A recent series of experiments suggests that fostering superstitions can substantially improve performance on a variety of motor and cognitive tasks (Damisch, Stoberock, & Mussweiler, 2010). We conducted two high-powered and precise replications of one of these experiments, examining whether telling participants they had a lucky golf ball could improve their performance on a 10-shot golf task relative to controls. We found that the effect of superstition on performance is elusive: participants told they had a lucky ball performed almost identically to controls. Our failure to replicate the target study was not due to lack of impact, lack of statistical power, differences in task difficulty, or differences in participant belief in luck. A meta-analysis indicates significant heterogeneity in the effect of superstition on performance. This could be due to an unknown moderator, but no effect was observed among the studies with the strongest research designs (e.g., high power, a priori sampling plan).


2017, Vol 41 (4), pp. 451-464
Author(s): Sara I. McClelland

In research using self-report measures, little attention is paid to how participants interpret concepts; instead, researchers often assume definitions are shared, universal, or easily understood. I discuss the self-anchored ladder, adapted from Cantril’s ladder, a procedure that simultaneously collects a participant’s self-reported rating and their interpretation of that rating. Drawing from a study about sexual satisfaction that included a self-anchored ladder, four analyses are presented and discussed in relation to one another: (1) comparisons of sexual satisfaction scores, (2) variations in the structures participants applied to the ladder, (3) frequency of terms used to describe sexual satisfaction, and (4) thematic analysis of “best” and “worst” sexual satisfaction. These analytic strategies offer researchers a model for how to incorporate self-anchored ladder items into research designs as a means to draw out layers of meaning in quantitative, qualitative, and mixed methods data. I argue that the ladder invites the potential for conceptual disruption by prioritizing skepticism in survey research and bringing greater attention to how social locations, histories, economic structures, and other factors shape self-report data. I also address issues related to the multiple epistemological positions that the ladder demands. Finally, I argue for the centrality of epistemological self-reflexivity in critical feminist psychological research. Additional online materials for this article are available on PWQ’s website at http://journals.sagepub.com/doi/suppl/10.1177/0361684317725985


2016, Vol 90 (5), pp. 1131-1132
Author(s): Mohamed E. Elrggal, Ghaly A. Kotb, Alyaa Elghitani, Rowan Zyada

2018, Vol 33 (4), pp. 574-584
Author(s): Anni Rajala

Purpose – Relationship learning is viewed as an important factor in enhancing competitiveness and an important determinant of profitability in relationships. Prior studies have acknowledged the positive effects of interorganizational learning on performance, but the performance measures applied have varied. The purpose of this study is to examine the relationship between interorganizational learning and different types of performance. The paper also goes beyond direct effects by investigating the moderating effects of different research designs.
Design/methodology/approach – This paper applies a meta-analytic approach to systematically analyze 21 independent studies (N = 4,618) to reveal the relationship between interorganizational learning and performance.
Findings – The findings indicate that interorganizational learning is an important predictor of performance, and that the effects of interorganizational learning on performance differ in magnitude under different research conditions.
Research limitations/implications – The paper focuses on interorganizational learning, and during the data collection, some related topics were excluded from the data search to retain the focus on learning.
Practical implications – The study evinces the breadth of the field of interorganizational learning and how different research designs affect research results. Moreover, this meta-analysis indicates the need for greater clarity when defining the concepts used in studies and for definitions of the concepts applied in the field of interorganizational learning to be unified.
Originality/value – This study is the first to meta-analytically synthesize the literature on interorganizational learning. It also illuminates new perspectives for future studies within this field.

