Releasing CLARIFY: A New Guideline for Improving Yoga Research Transparency and Usefulness

Author(s):  
Steffany Moonaz ◽  
Daryl Nault ◽  
Holger Cramer ◽  
Lesley Ward

2018 ◽  
Vol 2 (2) ◽  
pp. 198-206 ◽  
Author(s):  
Olivia M. Maynard ◽  
Marcus R. Munafò

Abstract: There are inherent differences in the priorities of academics and policy-makers. These pose unique challenges for teams such as the Behavioural Insights Team (BIT), which has positioned itself as an organisation conducting academically rigorous behavioural science research in policy settings. Here we outline the threats to research transparency and reproducibility that stem from working with policy-makers and other non-academic stakeholders. These threats affect how we perform, communicate, verify and evaluate research. Solutions that increase research transparency include pre-registering study protocols, making data open and publishing summaries of results. We suggest an incentive structure (a simple ‘nudge’) that rewards BIT's non-academic partners for engaging in these practices.


2018 ◽  
Vol 28 (4) ◽  
pp. 3-4
Author(s):  
Valerie A. Canady

2020 ◽  
Author(s):  
Per Engzell ◽  
Julia Marie Rohrer

The transdisciplinary movement towards greater research transparency opens the door for a meta-scientific exchange between the social sciences. In the spirit of such an exchange, we offer some lessons inspired by ongoing debates in psychology, highlighting the broad benefits of open science, but also potential pitfalls and practical implementation challenges that have not yet been fully resolved. Our discussion is aimed at political scientists but is relevant to the population sciences more broadly.


2018 ◽  
Vol 56 (3) ◽  
pp. 920-980 ◽  
Author(s):  
Garret Christensen ◽  
Edward Miguel

There is growing interest in enhancing research transparency and reproducibility in economics and other scientific fields. We survey existing work on these topics within economics and discuss the evidence suggesting that publication bias, inability to replicate, and specification searching remain widespread in the discipline. We next discuss recent progress in this area, including through improved research design, study registration and pre-analysis plans, disclosure standards, and open sharing of data and materials, drawing on experiences in both economics and other social sciences. We discuss areas where consensus is emerging on new practices, as well as approaches that remain controversial, and speculate about the most effective ways to make economics research more credible in the future. (JEL A11, C18, I23)


2021 ◽  
pp. 1-6
Author(s):  
R. Michael Alvarez ◽  
Simon Heuberger

Abstract: In recent years, scholars, journals, and professional organizations in political science have been working to improve research transparency. Although better transparency is a laudable goal, the implementation of standards for reproducibility still leaves much to be desired. This article identifies two practices that political science should adopt to improve research transparency: (1) journals must provide detailed replication guidance and run the provided material; and (2) authors must begin their work with replication in mind. We focus on problems that occur when scholars provide research materials to journals for replication, and we outline best practices regarding documentation and code structure for researchers to use.


2019 ◽  
pp. 004912411988245
Author(s):  
Elena Damian ◽  
Bart Meuleman ◽  
Wim van Oorschot

In this article, we examine whether cross-national studies disclose enough information for independent researchers to evaluate the validity and reliability of the findings (evaluation transparency) or to perform a direct replication (replicability transparency). The first contribution is theoretical: we develop a heuristic model of the actors, factors, and processes that influence the transparency of cross-national studies and provide an overview of the measures currently taken to improve research transparency. The second contribution is empirical: we analyze the level of transparency in published cross-national studies. Specifically, using a random sample of 305 comparative studies published in one of 29 peer-reviewed social science journals (from 1986 to 2016), we show that, even though all the articles include some methodological information, the great majority lack sufficient information for evaluation and replication. Lastly, we develop and propose a set of transparency guidelines tailored to reporting cross-national survey research.


2020 ◽  
Vol 44 (4) ◽  
pp. 189-191
Author(s):  
Stefania Fatone ◽  
Michael P Dillon ◽  
Brian J Hafner ◽  
Nerrolyn Ramstrand

2019 ◽  
Vol 6 (1) ◽  
Author(s):  
Konstantinos Nasiotis ◽  
Martin Cousineau ◽  
François Tadel ◽  
Adrien Peyrache ◽  
Richard M. Leahy ◽  
...  

Abstract: The methods for electrophysiology in neuroscience have evolved tremendously in recent years, with a growing emphasis on dense-array signal recordings. This increased complexity, and the corresponding growth in the volume of recorded data, has not been matched by efforts to streamline and facilitate access to processing methods, which themselves tend to grow in sophistication. Moreover, unsuccessful attempts to reproduce peer-reviewed publications point to a problem of transparency in science. This growing problem could be tackled by unrestricted access to methods that promote research transparency and data sharing, ensuring the reproducibility of published results. Here, we provide free, extensive, open-source software that offers data-analysis, data-management and multi-modality integration solutions for invasive neurophysiology. Users can perform their entire analysis through a user-friendly environment, without the need for programming skills, in a tractable (logged) way. This work contributes to open science, analysis standardization, transparency and reproducibility in invasive neurophysiology.

