The Transparency of Quantitative Empirical Legal Research (2018–2020)

2022
Author(s): Jason Chin, Kathryn Zeiler, Natali Dilevski, Alex O. Holcombe, Rosemary Grace Gatfield-Jeffries, ...

Scientists are increasingly concerned with making their work easy to verify and build upon. Associated practices include sharing data, materials, and analytic scripts, and preregistering protocols. This has been referred to as a “credibility revolution”. The credibility of empirical legal research has been questioned in the past due to its distinctive peer review system and because many of its researchers, coming from legal backgrounds, are not trained in study design or statistics. Still, there has been no systematic study of the transparency and credibility-related characteristics of published empirical legal research. To fill this gap and provide an estimate of current practices that can be tracked as the field evolves, we assessed 300 empirical articles from highly ranked law journals, including both faculty-edited and student-edited journals. We found high levels of article accessibility (86% could be accessed without a subscription, 95% CI = [82%, 90%]), especially among student-edited journals (100% accessibility). Few articles stated that a study’s data are available (19%, 95% CI = [15%, 23%]), and only about half of those datasets are reportedly available without contacting the author. Preregistration (3%, 95% CI = [1%, 5%]) and availability of analytic scripts (6%, 95% CI = [4%, 9%]) were very uncommon. We suggest that empirical legal researchers and the journals that publish their work cultivate norms and practices to encourage research credibility.
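The intervals reported above are consistent with a simple normal-approximation (Wald) confidence interval for a proportion with n = 300, though the abstract does not state which method the authors used. A minimal sketch, assuming that method and sample size:

```python
import math

def wald_ci(p_hat, n, z=1.96):
    """Normal-approximation (Wald) 95% confidence interval for a proportion.

    p_hat: observed proportion; n: sample size; z: critical value (1.96 for 95%).
    Illustrative only -- the paper may have used Wilson or another interval.
    """
    se = math.sqrt(p_hat * (1 - p_hat) / n)
    return (p_hat - z * se, p_hat + z * se)

# Accessibility estimate from the abstract: 86% of 300 articles.
lo, hi = wald_ci(0.86, 300)
print(f"[{lo:.0%}, {hi:.0%}]")  # prints "[82%, 90%]"
```

Running the same function on the other reported proportions (0.19, 0.03, 0.06) reproduces the remaining intervals to within rounding, which suggests the abstract's CIs are straightforward proportion intervals on the n = 300 sample.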

2015
Vol 48 (02)
pp. 346–352
Author(s):  
Paul A. Djupe

ABSTRACT: Charges are frequently leveled that the peer-review system is broken and reviewers are overburdened with requests. But this specific charge has been made in the absence of data about the actual reviewing loads of political scientists. I report the results of a recent survey asking a random sample of about 600 APSA members with PhDs what their reviewing loads are like and what their beliefs are about the value of peer reviewing to them and others. Article reviewing loads correspond to rank, institution, and scholarly productivity in predictable ways. At PhD-granting institutions, assistant professors averaged 5.5 reviews in the past year, associate professors 7.0, and full professors 8.3; everyone else averaged just under 3 reviews a year. To recognize the value we place on peer reviewing, we need a system that collects data on who reviews and presents them in a format usable by scholars and their relevant evaluation bodies.


BMJ
2011
Vol 342 (may16 2)
p. d3046
Author(s):  
A. O'Dowd
