Advancing Meta-Analysis With Knowledge-Management Platforms: Using metaBUS in Psychology

2019 ◽  
Vol 3 (1) ◽  
pp. 124-137 ◽  
Author(s):  
Frank A. Bosco ◽  
James G. Field ◽  
Kai R. Larsen ◽  
Yingyi Chang ◽  
Krista L. Uggerslev

In this article, we provide a review of research-curation and knowledge-management efforts that may be leveraged to advance research and education in psychological science. After reviewing the approaches and content of other efforts, we focus on the metaBUS project’s platform, the most comprehensive effort to date. The metaBUS platform uses standards-based protocols in combination with human judgment to organize and make readily accessible a database of research findings, currently numbering more than 1 million. It allows users to conduct rudimentary, instant meta-analyses, and capacities for visualization and communication of meta-analytic findings have recently been added. We conclude by discussing challenges, opportunities, and recommendations for expanding the project beyond applied psychology.

2020 ◽  
Author(s):  
Magdalena Siegel ◽  
Junia Eder ◽  
Jelte M. Wicherts ◽  
Jakob Pietschnig

Inflated or outright false effects plague Psychological Science, but advances in the identification of dissemination biases in general, and publication bias in particular, have helped in dealing with biased effects in the literature. However, the application of publication bias detection methods does not appear to be equally prevalent across subdisciplines; it has been suggested that appropriate detection methods are underused in I/O Psychology in particular. In this meta-meta-analysis, we present prevalence estimates, predictors, and time trends of publication bias in 128 meta-analyses published in the Journal of Applied Psychology (7,263 effect sizes, 3,000,000+ participants). Moreover, we reanalyzed the data of 87 meta-analyses, applying nine standard and more modern publication bias detection methods. We show that (i) bias detection methods are underused (only 41% of meta-analyses apply at least one method), although their use has increased in recent years; (ii) meta-analyses that do apply such methods now use more methods, but mostly inappropriate ones; and (iii) the prevalence of publication bias is disconcertingly high (15% to 20% show severe bias indication, 33% to 48% some) yet mostly remains undetected. Although our results indicate a trend toward greater bias awareness, they also show that concerns about publication bias in I/O Psychology are justified and that researcher awareness of appropriate, state-of-the-art bias detection needs to be further increased. Embracing open science practices such as data sharing and study preregistration is needed to raise reproducibility and ultimately strengthen Psychological Science in general and I/O Psychology in particular.
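Among the standard detection methods the abstract refers to, one of the most widely used is Egger's regression test for funnel-plot asymmetry. A minimal sketch, using hypothetical effect sizes and standard errors (the function name and example values are illustrative, not taken from the study):

```python
import numpy as np
from scipy import stats

def eggers_test(effects, std_errors):
    """Egger's regression test for funnel-plot asymmetry.

    Regresses each study's standardized effect (effect / SE) on its
    precision (1 / SE); an intercept that differs significantly from zero
    suggests small-study effects, one common signature of publication
    bias. Returns the intercept and its two-sided p-value.
    """
    effects = np.asarray(effects, dtype=float)
    se = np.asarray(std_errors, dtype=float)
    fit = stats.linregress(1.0 / se, effects / se)
    t_stat = fit.intercept / fit.intercept_stderr
    df = len(effects) - 2
    p_value = 2 * stats.t.sf(abs(t_stat), df)
    return fit.intercept, p_value

# Hypothetical example: five effect sizes with their standard errors.
intercept, p = eggers_test([0.50, 0.30, 0.60, 0.20, 0.40],
                           [0.10, 0.20, 0.15, 0.25, 0.12])
```

A significant intercept is only an indication of small-study effects, not proof of publication bias, which is one reason the authors distinguish appropriate from inappropriate method choices.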


2021 ◽  
Vol 99 (Supplement_3) ◽  
pp. 62-62
Author(s):  
Emma N Bermingham

Abstract In a world of the "Three Rs" (replace, reduce, and refine), combined with more research being published in open-access journals, there is increasing interest in the statistical analysis of existing literature. Meta-analysis, the quantitative combination of multiple studies, can be used to gain better oversight of a specific question of interest. Additionally, it can be used to identify gaps in knowledge. For example, while a number of publications investigate energy requirements in adult cats and dogs, few studies assess older animals. Similarly, in the dog, there is a lack of literature on animals at the extremes of body size (i.e., giant and toy breeds). Herein, we describe several published examples that have been used to determine the energy requirements of cats and dogs and, more recently, the impacts of diet on the microbiome of the cat and dog. This includes the use of the Preferred Reporting Items for Systematic reviews and Meta-Analyses (PRISMA) guidelines, research findings, and general findings related to research design and quality.


2020 ◽  
Vol 24 (3) ◽  
pp. 195-209
Author(s):  
Richard E. Hohn ◽  
Kathleen L. Slaney ◽  
Donna Tafreshi

As meta-analytic studies have come to occupy a sizable contingent of published work in the psychological sciences, clarity in the research and reporting practices of such work is crucial to the interpretability and reproducibility of research findings. The present study examines the state of research and reporting practices within a random sample of 384 published psychological meta-analyses across several important dimensions (e.g., search methods, exclusion criteria, statistical techniques). In addition, we surveyed the first authors of the meta-analyses in our sample to ask them directly about the research practices employed and reporting decisions made in their studies, including the assessments and procedures they conducted and the guidelines or materials they relied on. Upon cross-validating the first-author responses with what was reported in their published meta-analyses, we identified numerous potential gaps in reporting and research practices. In addition to providing a survey of recent reporting practices, our findings suggest that (a) there are several research practices conducted by meta-analysts that are ultimately not reported; (b) some aspects of meta-analysis research appear to be conducted at disappointingly low rates; and (c) the adoption of reporting standards, including the Meta-Analysis Reporting Standards (MARS), has been slow to nonexistent within psychological meta-analytic research.


2006 ◽  
Vol 19 (3) ◽  
Author(s):  
Marise Ph. Born ◽  
Stefan T. Mol

Quantitatively integrating empirical studies: The method of meta-analysis. Marise Ph. Born & Stefan T. Mol, Gedrag & Organisatie, Volume 19, September 2006, nr. 3, pp. 251-271. Meta-analysis is a quantitative integration of the results of a series of empirical studies on a specific research question. The method has obtained a dominant position in the social sciences and beyond, as it helps in obtaining an overview of the explosively increased number of research publications. This contribution discusses the basics of and consecutive steps in performing a meta-analysis; a meta-analysis that we conducted on expatriates serves as an illustration. Next to the many points in favor of meta-analysis, such as providing a better overview of a research domain and shifting the traditional focus on the significance of effects to their sizes, several important controversies remain. One of these is whether a specific cause of variance in research findings should be dismissed as a methodological artifact or interpreted as a meaningful source of variance. We maintain that every social or industrial and organizational psychologist who wants to stay scientifically up to date should be able to interpret meta-analyses.
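The consecutive steps the authors describe culminate in pooling the study effects. A minimal sketch of the core computation, inverse-variance (fixed-effect) pooling, with hypothetical effect sizes and sampling variances (function name and values are illustrative):

```python
import numpy as np

def fixed_effect_pool(effects, variances):
    """Inverse-variance (fixed-effect) pooling.

    Each study's effect size is weighted by the inverse of its sampling
    variance, so more precise studies contribute more to the pooled
    estimate. Returns the pooled effect and its standard error.
    """
    effects = np.asarray(effects, dtype=float)
    weights = 1.0 / np.asarray(variances, dtype=float)
    pooled = np.sum(weights * effects) / np.sum(weights)
    pooled_se = np.sqrt(1.0 / np.sum(weights))
    return pooled, pooled_se

# Hypothetical example: two equally precise studies average exactly.
pooled, se = fixed_effect_pool([0.2, 0.4], [0.01, 0.01])
# pooled is 0.3; its standard error is sqrt(1/200)
```

The controversy the authors raise maps onto the choice between this fixed-effect model, which treats between-study variance as artifact, and a random-effects model, which treats it as meaningful.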


2008 ◽  
Vol 35 (2) ◽  
pp. 393-419 ◽  
Author(s):  
Inge Geyskens ◽  
Rekha Krishnan ◽  
Jan-Benedict E. M. Steenkamp ◽  
Paulo V. Cunha

Meta-analysis has become increasingly popular in management research to quantitatively integrate research findings across a large number of studies. In an effort to help shape future applications of meta-analysis in management, this study chronicles and evaluates the decisions that management researchers made in 69 meta-analytic studies published between 1980 and 2007 in 14 management journals. It performs four meta-analyses of relationships that have been studied with varying frequency in management research, to provide empirical evidence that meta-analytical decisions influence results. The implications of the findings are discussed with a focus on the changes that seem appropriate.


2017 ◽  
Vol 22 (5) ◽  
pp. 565-582 ◽  
Author(s):  
Colin I.S.G. Lee ◽  
Frank A. Bosco ◽  
Piers Steel ◽  
Krista L. Uggerslev

Purpose – In this study, the authors revisit the meta-analytic correlates of career satisfaction and demonstrate the use of metaBUS, a database repository of meta-analytic effect sizes and related information from the field of applied psychology. The purpose of this paper is to extend prior meta-analytic research on the topic of career satisfaction and to compare the results from the metaBUS-enabled meta-analysis with the results from meta-analyses that do not build on the repository.
Design/methodology/approach – A multilevel meta-analysis was conducted on all correlates available in the metaBUS database, and the approach is described in a step-by-step fashion.
Findings – The demonstration reiterated some of the findings of prior meta-analyses but also revealed considerable incongruity between the sample taken from the metaBUS database and the meta-analytic sample from studies that relied on non-metaBUS-based literature searches. Nevertheless, the results are similar in terms of the directions and relative sizes of the effects.
Research limitations/implications – The paper demonstrates the use of the metaBUS database. In addition, the results suggest that meta-analyses on career satisfaction may have suffered from sample selection issues, but further research is required to establish the source of the sample selection incongruence.
Originality/value – This is the first step-by-step demonstration of the use of metaBUS specifically for meta-analyses.


2017 ◽  
Vol 141 (3) ◽  
pp. 423-430 ◽  
Author(s):  
Xulei Liu ◽  
Michael Kinzler ◽  
Jiangfan Yuan ◽  
Guozhong He ◽  
Lanjing Zhang

Context.— Little is known regarding the reporting quality of meta-analyses in diagnostic pathology. Objective.— To compare the reporting quality of meta-analyses in diagnostic pathology and medicine and to examine factors associated with the reporting quality of diagnostic pathology meta-analyses. Design.— Meta-analyses were identified using PubMed in 12 major diagnostic pathology journals without specifying years and in 4 major medicine journals in 2006 and 2011. Reporting quality was evaluated using the 27-item checklist of the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement published in 2009; a higher PRISMA score indicates higher reporting quality. Results.— Forty-one diagnostic pathology meta-analyses and 118 medicine meta-analyses were included. Overall, the reporting quality of meta-analyses in diagnostic pathology was lower than that in medicine (median [interquartile range] = 22 [15, 25] versus 27 [23, 28], P < .001). Compared with medicine meta-analyses, diagnostic pathology meta-analyses were less likely to report 23 of the 27 items (85.2%) on the PRISMA checklist but more likely to report the data items. Higher reporting quality of diagnostic pathology meta-analyses was associated with more recent publication years (later than 2009 versus 2009 or earlier, P = .002) and non–North American first authors (versus North American, P = .001), but not with the journal publisher's location (P = .11). Interestingly, reporting quality was not associated with the adjusted citation ratio in either diagnostic pathology or medicine (P = .40 and P = .09, respectively). Conclusions.— Meta-analyses in diagnostic pathology had lower reporting quality than those in medicine. The reporting quality of diagnostic pathology meta-analyses is linked to publication year and first author's location, but not to the journal publisher's location or the article's adjusted citation ratio. More research and education on meta-analysis methodology may improve the reporting quality of diagnostic pathology meta-analyses.


2019 ◽  
pp. 0739456X1985642 ◽  
Author(s):  
Petter Næss

This commentary presents a critique of a particular, strictly quantitative way of reviewing research findings within the field of land use and transportation studies: so-called meta-analyses. Beyond criticism raised earlier, the article draws attention to the serious bias that results when meta-analyses include studies encumbered with model specification errors due to a poor understanding of causal mechanisms. The article also discusses underestimated limitations due to the neglect of differences between geographical contexts and the inconsistent measurement of variables across studies. An example of an alternative approach is offered at the end of the article.


2019 ◽  
Author(s):  
Malte Elson

Research synthesis is based on the assumption that when the same association between constructs is observed repeatedly in a field, the relationship is probably real, even if its exact magnitude can be debated. Yet this probability is a function not only of recurring results but also of the quality and consistency of the empirical procedures that produced those results, which any meta-analysis necessarily inherits. Standardized protocols in data collection, analysis, and interpretation are important empirical properties and a healthy sign of a discipline's maturity. This manuscript proposes that meta-analyses as typically applied in psychology would benefit from complementing their aggregates of observed effect sizes by systematically examining the standardization of the methodology that deterministically produced them. Potential units of analysis are described, and two examples are offered to illustrate the benefits of such efforts. Ideally, this synergistic approach emphasizes the role of methods in advancing theory by improving the quality of meta-analytic inferences.


2021 ◽  
pp. 174569162096419
Author(s):  
Audrey Helen Linden ◽  
Johannes Hönekopp

Heterogeneity emerges when multiple close or conceptual replications on the same subject produce results that vary more than expected from sampling error alone. Here we argue that unexplained heterogeneity reflects a lack of coherence between the concepts applied and the data observed and therefore a lack of understanding of the subject matter. Typical levels of heterogeneity thus offer a useful but neglected perspective on the levels of understanding achieved in psychological science. Focusing on continuous outcome variables, we surveyed heterogeneity in 150 meta-analyses from cognitive, organizational, and social psychology and 57 multiple close replications. Heterogeneity proved to be very high in meta-analyses, with powerful moderators being conspicuously absent. Population effects in the average meta-analysis vary from small to very large for reasons that are typically not understood. In contrast, heterogeneity was moderate in close replications. A newly identified relationship between heterogeneity and effect size allowed us to make predictions about expected heterogeneity levels. We discuss important implications for the formulation and evaluation of theories in psychology. On the basis of insights from the history and philosophy of science, we argue that the reduction of heterogeneity is important for progress in psychology and its practical applications, and we suggest changes to our collective research practice toward this end.
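The heterogeneity surveyed here is conventionally quantified with Cochran's Q and the I² statistic. A minimal sketch of those two quantities, using hypothetical effect sizes and sampling variances (function name and values are illustrative, not drawn from the study):

```python
import numpy as np

def heterogeneity(effects, variances):
    """Cochran's Q and the I^2 statistic.

    Q measures how much the study effects disperse around the
    inverse-variance pooled estimate relative to sampling error alone;
    I^2 expresses the excess dispersion as a percentage of the total
    observed variation (0% = no dispersion beyond chance).
    """
    effects = np.asarray(effects, dtype=float)
    weights = 1.0 / np.asarray(variances, dtype=float)
    pooled = np.sum(weights * effects) / np.sum(weights)
    q = np.sum(weights * (effects - pooled) ** 2)
    df = len(effects) - 1
    i_squared = 100.0 * max(0.0, (q - df) / q) if q > 0 else 0.0
    return q, i_squared

# Hypothetical example: two precise but conflicting studies.
q, i2 = heterogeneity([0.0, 1.0], [0.01, 0.01])
# q is 50.0 and i2 is 98.0: nearly all variation exceeds sampling error
```

High I² values of the kind the authors report mean that the pooled average hides populations of effects ranging from small to very large, which is exactly the interpretive problem the article raises.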

