Review Essay: Mark Petticrew and Helen Roberts, Systematic Reviews in the Social Sciences: A Practical Guide. Oxford, UK: Blackwell Publishing, 2005. 352 pp. ISBN 978-1-4051-2110-1 (hbk); ISBN 978-1-4051-2111-8 (pbk)

2008, Vol. 7(3), pp. 381-386
Author(s): Matthew Norton

Author(s): N. S. Babich

The author analyzes the implicit epistemological assumptions of modern systematic reviews of scientific literature, which usually go unexamined and unproblematized. These assumptions underpin an image of scientific communication as a representative, clearly delineated, and easily analyzed reflection of an efficient search for and dissemination of truth, the approach to which is marked by growing consensus among researchers. Generalizing over this communication produces an evidential effect that advances arguments in scientific discussions. However, for this evidential effect to be preserved in a summarizing analysis, a series of conditions must be met for the adequate conversion and "migration" of published conclusions into the conclusions of a systematic review. The essential components of systematic reviewing methodology comprise: setting the task of obtaining quantified results; selection criteria that establish an unambiguous correspondence between the model of the process under investigation and the totality of publications; representative observation of relevant publications; and drawing conclusions based on the comparative evidential effect of the research and the level of consensus achieved. Systematic reviews that comply with these requirements become a powerful instrument of evidence in the social sciences, biology, and medicine.


2012, Vol. 36(112), pp. 6-15
Author(s): Sue F. Phelps, Nicole Campbell

This article is about the use of systematic reviews as a research methodology in library and information studies (LIS). A systematic review is an attempt to gather all of the research on a given topic in order to answer a specific question. Such reviews have been used extensively in the health care field and have more recently found their way into the social sciences, including librarianship. Examples of the use of systematic reviews in LIS illustrate the benefits and challenges of using this methodology. Included are a brief description of how to conduct a review and a reading list for further information.


2020, Vol. 44(1-2), pp. 1-2
Author(s): Harrison Dekker, Amy Riegelman

As guest editors, we are excited to publish this special double issue of IASSIST Quarterly. The topics of reproducibility, replicability, and transparency have been addressed in past issues of IASSIST Quarterly and at the IASSIST conference, but this double issue is entirely focused on them. In recent years, efforts "to improve the credibility of science by advancing transparency, reproducibility, rigor, and ethics in research" have gained momentum in the social sciences (Center for Effective Global Action, 2020). While few question the spirit of the reproducibility and research transparency movement, it faces significant challenges because it goes against the grain of established practice. We believe the data services community is in a unique position to help advance this movement, given our data and technical expertise, training and consulting work, international scope, and established role in data management and preservation. As evidence of the movement, several initiatives exist to support research reproducibility infrastructure and data preservation efforts:

Center for Open Science (COS) / Open Science Framework (OSF)[i]
Berkeley Initiative for Transparency in the Social Sciences (BITSS)[ii]
CUrating for REproducibility (CURE)[iii]
Project TIER[iv]
Data Curation Network[v]
UK Reproducibility Network[vi]

While many new initiatives have launched in recent years, the data services community was supporting reproducibility in a variety of ways (e.g., data management, data preservation, metadata standards) through well-established consortiums such as the Inter-university Consortium for Political and Social Research (ICPSR) long before the phrase "reproducibility crisis" came into common use and Ioannidis published the essay "Why Most Published Research Findings Are False" (Ioannidis, 2005).
The articles in this issue address several important aspects of reproducible research:

Identification of barriers to reproducibility and solutions to those barriers
Evidence synthesis as related to transparent reporting and reproducibility
Reflection on how information professionals, researchers, and librarians perceive the reproducibility crisis and how they can partner to help solve it

The issue begins with "Reproducibility literature analysis," which surveys existing resources and literature to identify barriers to reproducibility and potential solutions. The authors have compiled a comprehensive annotated list of resources that includes definitions of key concepts pertinent to the reproducibility crisis. The next article addresses data reuse from the perspective of a large research university. The authors examine instances of both successful and failed data reuse and identify best practices for librarians interested in conducting research involving the common forms of data collected in an academic library. Systematic reviews are a research approach involving the quantitative and/or qualitative synthesis of data collected through a comprehensive literature review. "Methods reporting that supports reader confidence for systematic reviews in psychology" examines the reproducibility of electronic literature searches reported in psychology systematic reviews. A fundamental challenge in reproducing or replicating computational results is the need for researchers to make available the code used to produce those results. But sharing code, and getting it to run correctly for another user, can present significant technical challenges. In "Reproducibility, preservation, and access to research with ReproZip and ReproServer," the authors describe open source software that they are developing to address these challenges.
Taking a published article and attempting to reproduce its results is an exercise sometimes used in academic courses to highlight the inherent difficulty of the process. The final article in this issue, "ReprohackNL 2019: How libraries can promote research reproducibility through community engagement," describes an innovative library-based variation on this exercise.

Harrison Dekker, Data Librarian, University of Rhode Island
Amy Riegelman, Social Sciences Librarian, University of Minnesota

References
Center for Effective Global Action (2020) About the Berkeley Initiative for Transparency in the Social Sciences. Available at: https://www.bitss.org/about (accessed 23 June 2020).
Ioannidis, J.P. (2005) 'Why most published research findings are false', PLoS Medicine, 2(8), p. e124. doi: https://doi.org/10.1371/journal.pmed.0020124

[i] https://osf.io
[ii] https://www.bitss.org/
[iii] http://cure.web.unc.edu
[iv] https://www.projecttier.org/
[v] https://datacurationnetwork.org/
[vi] https://ukrn.org

