Meta-Analyses of Built Environment Effects on Travel: No New Platinum Standard

2019
pp. 0739456X1985642
Author(s):  
Petter Næss

This commentary presents a critique of a particular, strictly quantitative way of reviewing research findings within the field of land use and transportation studies: so-called meta-analyses. Beyond criticism raised earlier, the article draws attention to the serious bias that results when meta-analyses include studies encumbered with model specification errors stemming from a poor understanding of causal mechanisms. The article also discusses underestimated limitations arising from the neglect of differences between geographical contexts and from inconsistent measurement of variables across studies. An example of an alternative approach is offered at the end of the article.

2021
Vol 99 (Supplement_3)
pp. 62-62
Author(s):  
Emma N Bermingham

Abstract In a world of the “Three Rs” (replace, reduce, and refine), combined with more research being published in open-access journals, there is increasing interest in the statistical analysis of existing literature. Meta-analysis, the combination of multiple studies, can be used to gain a better overview of a specific question of interest. Additionally, it can be used to identify gaps in knowledge. For example, while there are a number of publications investigating energy requirements in adult cats and dogs, few studies assess older animals. Similarly, in the dog, there is a lack of literature on dogs at the extremes of body size (i.e., giant and toy breeds). Herein, we describe several published examples that have been used to determine the energy requirements of cats and dogs and, more recently, the impacts of diet on the feline and canine microbiome. This includes the use of the Preferred Reporting Items for Systematic reviews and Meta-Analyses (PRISMA) guidelines, research findings, and general observations related to research design and quality.


2020
Vol 24 (3)
pp. 195-209
Author(s):  
Richard E. Hohn
Kathleen L. Slaney
Donna Tafreshi

As meta-analytic studies have come to occupy a sizable contingent of published work in the psychological sciences, clarity in the research and reporting practices of such work is crucial to the interpretability and reproducibility of research findings. The present study examines the state of research and reporting practices within a random sample of 384 published psychological meta-analyses across several important dimensions (e.g., search methods, exclusion criteria, statistical techniques). In addition, we surveyed the first authors of the meta-analyses in our sample to ask them directly about the research practices employed and reporting decisions made in their studies, including the assessments and procedures they conducted and the guidelines or materials they relied on. Upon cross-validating the first author responses with what was reported in their published meta-analyses, we identified numerous potential gaps in reporting and research practices. In addition to providing a survey of recent reporting practices, our findings suggest that (a) there are several research practices conducted by meta-analysts that are ultimately not reported; (b) some aspects of meta-analysis research appear to be conducted at disappointingly low rates; and (c) the adoption of the reporting standards, including the Meta-Analytic Reporting Standards (MARS), has been slow to nonexistent within psychological meta-analytic research.


2006
Vol 19 (3)
Author(s):  
Marise Ph. Born
Stefan T. Mol

Quantitatively integrating empirical studies: The method of meta-analysis. Marise Ph. Born & Stefan T. Mol, Gedrag & Organisatie, Volume 19, September 2006, nr. 3, pp. 251-271. Meta-analysis is a quantitative integration of the results of a series of empirical studies addressing a specific research question. The method has obtained a dominant position in the social sciences and beyond, as it helps in obtaining an overview of the explosively growing number of research publications. This contribution discusses the basics of and the consecutive steps in performing a meta-analysis; a meta-analysis that we conducted on expatriates serves as an illustration. Next to the many points in favor of meta-analyses, such as providing a better overview of a research domain and shifting the traditional focus from the statistical significance of effects to their sizes, several important controversies remain. One of these is whether to dismiss a specific source of variance in research findings as a methodological artifact or to interpret it as a meaningful source of variance. We maintain that every social or industrial-and-organizational psychologist who wants to stay scientifically up to date should be able to interpret meta-analyses.
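The consecutive steps described above culminate in pooling effect sizes across studies. As a minimal illustration (not the authors' own procedure), the sketch below pools hypothetical standardized mean differences with inverse-variance weights and a DerSimonian-Laird random-effects adjustment; all numbers are invented.

```python
import math

def pool_effects(effects, variances):
    """Inverse-variance pooling with a DerSimonian-Laird
    random-effects adjustment (a standard meta-analytic estimator)."""
    w = [1 / v for v in variances]
    fixed = sum(wi * d for wi, d in zip(w, effects)) / sum(w)
    # Cochran's Q: weighted squared deviations from the fixed-effect mean
    q = sum(wi * (d - fixed) ** 2 for wi, d in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)  # estimated between-study variance
    w_re = [1 / (v + tau2) for v in variances]
    pooled = sum(wi * d for wi, d in zip(w_re, effects)) / sum(w_re)
    se = math.sqrt(1 / sum(w_re))
    return pooled, se, tau2

# hypothetical standardized mean differences and their sampling variances
d, se, tau2 = pool_effects([0.30, 0.55, 0.12, 0.40],
                           [0.02, 0.05, 0.01, 0.03])
```

The random-effects weights shrink toward equality as the between-study variance grows, which is one reason meta-analytic decisions (fixed vs. random effects) can change conclusions.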


2008
Vol 35 (2)
pp. 393-419
Author(s):  
Inge Geyskens
Rekha Krishnan
Jan-Benedict E. M. Steenkamp
Paulo V. Cunha

Meta-analysis has become increasingly popular in management research to quantitatively integrate research findings across a large number of studies. In an effort to help shape future applications of meta-analysis in management, this study chronicles and evaluates the decisions that management researchers made in 69 meta-analytic studies published between 1980 and 2007 in 14 management journals. It performs four meta-analyses of relationships that have been studied with varying frequency in management research, to provide empirical evidence that meta-analytical decisions influence results. The implications of the findings are discussed with a focus on the changes that seem appropriate.


2017
Author(s):  
Freya Acar
Ruth Seurinck
Simon B. Eickhoff
Beatrijs Moerkerke

Abstract The importance of integrating research findings is incontrovertible, and coordinate-based meta-analyses have become a popular approach for combining the results of fMRI studies when only peaks of activation are reported. Like classical meta-analyses, coordinate-based meta-analyses may be subject to different forms of publication bias, which affects results and may invalidate findings. We develop a tool that assesses robustness to potential publication bias at the cluster level. We investigate the possible influence of the file-drawer effect, where studies that do not report certain results fail to get published, by determining the number of noise studies that can be added to an existing fMRI meta-analysis before the results are no longer statistically significant. In this paper we illustrate this tool through an example and test the effect of several parameters through extensive simulations. We provide an algorithm, for which code is freely available, that generates noise studies and enables users to determine the robustness of meta-analytical results.
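The cluster-level algorithm for fMRI data described above is more involved, but the underlying file-drawer logic can be illustrated with Rosenthal's classic fail-safe N, which asks how many unpublished null studies would render a Stouffer-combined z-score non-significant. The z-scores below are hypothetical.

```python
import math
from statistics import NormalDist

def failsafe_n(z_scores, alpha=0.05):
    """Rosenthal's fail-safe N: the number of unpublished null (z = 0)
    studies needed to drive the Stouffer combined z below significance."""
    k = len(z_scores)
    z_crit = NormalDist().inv_cdf(1 - alpha)  # one-sided critical value
    s = sum(z_scores)
    # Combined z with n extra null studies is s / sqrt(k + n);
    # solve s / sqrt(k + n) = z_crit for n.
    n = (s / z_crit) ** 2 - k
    return max(0, math.floor(n))

# four hypothetical studies, all individually significant
n_null = failsafe_n([2.0, 2.5, 1.8, 3.0])
```

A small fail-safe N signals fragile results; the coordinate-based variant described in the abstract applies the same "add noise studies until significance disappears" idea to activation clusters rather than combined z-scores.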


2019
Vol 3 (1)
pp. 124-137
Author(s):  
Frank A. Bosco
James G. Field
Kai R. Larsen
Yingyi Chang
Krista L. Uggerslev

In this article, we provide a review of research-curation and knowledge-management efforts that may be leveraged to advance research and education in psychological science. After reviewing the approaches and content of other efforts, we focus on the metaBUS project’s platform, the most comprehensive effort to date. The metaBUS platform uses standards-based protocols in combination with human judgment to organize and make readily accessible a database of research findings, currently numbering more than 1 million. It allows users to conduct rudimentary, instant meta-analyses, and capacities for visualization and communication of meta-analytic findings have recently been added. We conclude by discussing challenges, opportunities, and recommendations for expanding the project beyond applied psychology.


1999
Vol 85 (3_suppl)
pp. 1179-1194
Author(s):  
Emil J. Posavac
Kristienne R. Kattapong
Dennis E. Dew

The effects of 47 peer-based health education programs described in 36 published studies were examined. The overall effect size was small: the mean d was .190 when controls received no program and .020 when controls received an alternative program. Programs were divided into those focused on preventing or reducing smoking and those addressing other health issues; the latter were further divided into primary and secondary prevention programs. Differences among the studies suggested several biases that are likely to have influenced the effect sizes. Preventive interventions that produce only small effects can still be valuable, because many participants would not have developed the problem even without the program. This review suggests that, when health education programs are studied, (a) detailed statistical information should be provided to facilitate the use of the research findings in meta-analyses and (b) the costs of innovative programs should be presented so that readers can judge whether the results are worth the cost.


Author(s):  
Alberto Soto
Timothy B. Smith
Derek Griner
Melanie Domenech Rodríguez
Guillermo Bernal

Mental health treatments can be more effective when they align with the culture of the client and when therapists demonstrate multicultural competence. This chapter defines cultural adaptations and therapist multicultural competencies and provides clinical examples of each. It summarizes relevant research findings in two meta-analyses. In the meta-analysis on the effectiveness of culturally adapted interventions, the average effect size across 99 studies was d = .50. In the second meta-analysis on 15 studies of therapist cultural competence, the results differed by rating source: Client-rated measures of therapist cultural competence correlated strongly with treatment outcomes (r = .38) but therapists’ self-rated competency did not (r = .06). The chapter lists limitations of the research and patient contributions, concluding with research-supported therapeutic practices that help clients benefit from cultural adaptations and from therapists they perceive as culturally competent.


2020
Author(s):  
Ziga Malek
Peter Verburg

Environmental changes have been studied in numerous local-scale studies around the world. These studies provide invaluable evidence on the causes and consequences of the way we use and change the environment. However, it remains unknown how we can use this evidence beyond the study-area boundaries, which limits the transferability of potentially more sustainable solutions. We present a novel, interdisciplinary workflow for combining systematic reviews and meta-analyses with spatial analysis, using land use change as an example. First, we performed a systematic review of local-scale land use change. The collected studies were used to generate a classification of the different actors behind land use change by means of clustering. Second, using the documented case-study evidence, we statistically analysed how location influences the spatial distribution of these studies, drawing on socio-economic, soil, terrain, and climate variables. With the derived statistical relationships, we were able to map the spatial likelihood of the studies and how representative the study collection is for other parts of the world. The results enabled us to identify areas that are similar to the meta-analysis collection. Conversely, areas that are very different can be flagged as understudied regions where more research is necessary. The proposed workflow can be used across different domains of the environmental and earth system sciences.


2013
Vol 12 (4)
pp. 157-169
Author(s):  
Philip L. Roth
Allen I. Huffcutt

The topic of what interviews measure has received a great deal of attention over the years. One line of research has investigated the relationship between interviews and the construct of cognitive ability. A previous meta-analysis reported an overall corrected correlation of .40 (Huffcutt, Roth, & McDaniel, 1996). A more recent meta-analysis reported a noticeably lower corrected correlation of .27 (Berry, Sackett, & Landers, 2007). After reviewing both meta-analyses, it appears that the two studies posed different research questions. Further, a number of coding judgments in Berry et al. merit review, and no moderator analysis distinguished educational from employment interviews. We therefore reanalyzed the work by Berry et al. and found a corrected correlation of .42 for employment interviews (.15 higher than Berry et al., a 56% increase). Educational interviews were associated with a corrected correlation of .21, supporting interview type as a moderator. We suggest that a better estimate of the correlation between employment interviews and cognitive ability is .42, which takes us “back to the future” in that the better overall estimate of the employment interview–cognitive ability relationship is roughly .40. This difference has implications for what interviews measure and for their incremental validity.
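Corrected correlations of the kind compared above are typically obtained by adjusting an observed correlation for statistical artifacts such as measurement unreliability. As a hedged illustration (not the actual artifact corrections used by either meta-analysis), the classical correction for attenuation divides the observed r by the square root of the product of the two measures' reliabilities; the values below are hypothetical.

```python
import math

def correct_for_attenuation(r_obs, rel_x, rel_y):
    """Classical correction for measurement unreliability:
    rho = r_observed / sqrt(rxx * ryy)."""
    return r_obs / math.sqrt(rel_x * rel_y)

# hypothetical inputs: observed interview-ability correlation .30,
# interview reliability .70, cognitive test reliability .85
rho = correct_for_attenuation(0.30, 0.70, 0.85)
```

Because corrected values depend directly on the reliability estimates chosen, differing coding judgments about artifacts are one way two meta-analyses of overlapping literatures can arrive at .27 versus .40.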

