Shining a light on the cloud

Impact ◽  
2020 ◽  
Vol 2020 (8) ◽  
pp. 46-47
Author(s):  
Lucy Annette

The Social Sciences & Humanities Open Cloud (SSHOC) is a 40-month-long project under the umbrella of the European Open Science Cloud (EOSC) and funded by Horizon 2020. This project unites 20 partner organisations as well as their 27 associates. SSHOC seeks to create interconnected data infrastructures focused on an integrated, cloud-based network structure.



2020 ◽  
Author(s):  
Alexander Wuttke

The trustworthiness of scientific findings is at the center of current scholarly and public debates. The contestation of scientific knowledge claims is reason to take a break from our everyday tasks as scientists and to reflect on what we do as professional truth-seekers. This essay reviews two recent books on foundational and practical questions about the scholarly generation of knowledge. 'Why Trust Science' (Oreskes) is an intellectual expedition into the epistemological foundations of science. 'Transparent and Reproducible Social Science Research' (Christensen, Freese, Miguel) is the first book-length primer on contemporary Open Science debates in the social sciences. Together, these books demonstrate the range of what we can learn from the new wave of ‘research on research’, both as curious citizens and as academic scholars.


Author(s):  
Christian Olalla-Soler

This article offers an overview of open science and open-science practices and their applications to translation and interpreting studies (TIS). Publications on open science in different disciplines were reviewed in order to define open science, identify academic publishing practices emerging from the core features of open science, and discuss the limitations of such practices in the humanities and the social sciences. The compiled information was then contextualised within TIS academic publishing practices based on bibliographic and bibliometric data. The results helped to identify what open-science practices have been adopted in TIS, what problems emerge from applying some of these practices, and in what ways such practices could be fostered in our discipline. This article aims to foster a debate on the future of TIS publishing and the role that open science will play in the discipline in the upcoming years.


2020 ◽  
Vol 44 (1-2) ◽  
pp. 1-2
Author(s):  
Harrison Dekker ◽  
Amy Riegelman

As guest editors, we are excited to publish this special double issue of IASSIST Quarterly. The topics of reproducibility, replicability, and transparency have been addressed in past issues of IASSIST Quarterly and at the IASSIST conference, but this double issue is entirely focused on them. In recent years, efforts “to improve the credibility of science by advancing transparency, reproducibility, rigor, and ethics in research” have gained momentum in the social sciences (Center for Effective Global Action, 2020). While few question the spirit of the reproducibility and research transparency movement, it faces significant challenges because it goes against the grain of established practice. We believe the data services community is in a unique position to help advance this movement given our data and technical expertise, training and consulting work, international scope, and established role in data management and preservation, among other strengths. As evidence of the movement, several initiatives exist to support research reproducibility infrastructure and data preservation efforts:

Center for Open Science (COS) / Open Science Framework (OSF)[i]
Berkeley Initiative for Transparency in the Social Sciences (BITSS)[ii]
CUrating for REproducibility (CURE)[iii]
Project TIER[iv]
Data Curation Network[v]
UK Reproducibility Network[vi]

While many of these initiatives launched in recent years, the data services community was supporting reproducibility in a variety of ways (e.g., data management, data preservation, metadata standards) through well-established consortiums such as the Inter-university Consortium for Political and Social Research (ICPSR) long before the phrase “reproducibility crisis” entered common use and Ioannidis published the essay “Why Most Published Research Findings Are False” (Ioannidis, 2005).
The articles in this issue address several important aspects of reproducible research:

Identification of barriers to reproducibility and solutions to those barriers
Evidence synthesis as related to transparent reporting and reproducibility
Reflection on how information professionals, researchers, and librarians perceive the reproducibility crisis and how they can partner to help solve it

The issue begins with “Reproducibility literature analysis,” which surveys existing resources and literature to identify barriers to reproducibility and potential solutions. The authors have compiled a comprehensive, annotated list of resources that includes definitions of key concepts pertinent to the reproducibility crisis. The next article addresses data reuse from the perspective of a large research university. The authors examine instances of both successful and failed data reuse and identify best practices for librarians interested in conducting research involving the common forms of data collected in an academic library. Systematic reviews are a research approach involving the quantitative and/or qualitative synthesis of data collected through a comprehensive literature review. “Methods reporting that supports reader confidence for systematic reviews in psychology” examines the reproducibility of electronic literature searches reported in psychology systematic reviews. A fundamental challenge in reproducing or replicating computational results is the need for researchers to make available the code used to produce those results. But sharing code, and getting it to run correctly for another user, can present significant technical challenges. In “Reproducibility, preservation, and access to research with ReproZip and ReproServer,” the authors describe open source software they are developing to address these challenges.
Taking a published article and attempting to reproduce its results is an exercise sometimes used in academic courses to highlight the inherent difficulty of the process. The final article in this issue, “ReprohackNL 2019: How libraries can promote research reproducibility through community engagement,” describes an innovative library-based variation on this exercise.

Harrison Dekker, Data Librarian, University of Rhode Island
Amy Riegelman, Social Sciences Librarian, University of Minnesota

References

Center for Effective Global Action (2020) About the Berkeley Initiative for Transparency in the Social Sciences. Available at: https://www.bitss.org/about (accessed 23 June 2020).

Ioannidis, J.P. (2005) ‘Why most published research findings are false’, PLoS Medicine, 2(8), p. e124. doi: https://doi.org/10.1371/journal.pmed.0020124

[i] https://osf.io
[ii] https://www.bitss.org/
[iii] http://cure.web.unc.edu
[iv] https://www.projecttier.org/
[v] https://datacurationnetwork.org/
[vi] https://ukrn.org


2021 ◽  
Author(s):  
Veli-Matti Karhulahti ◽  
Hans-Joachim Backe

The opening up of research results, datasets, and scientific practices in general is currently being implemented across fields. Strongly data-driven areas such as medicine are also discussing publishing transparency, in a context where open review formats now dominate. The social sciences and humanities (SSH), in turn, still rely on closed systems. In this study, we draw on 12 semi-structured interviews with chief editors of leading journals in SSH fields to better understand the transparency of their review processes. We find that, within SSH, ‘double blind’ peer review represents a gold standard that credible journals follow by default. However, the actual review processes of these journals are multi-stage and largely open, with authors’ names routinely visible to decision-making peers; ‘double blind’ principles form only part of the process. We recommend that journals communicate the transparency of their review processes in more detail, especially if they are ‘double blind’.


2021 ◽  
Vol 50 (1) ◽  
pp. 15
Author(s):  
Matthias Reiter-Pázmándy

Open science and open access to research data are important aspects of research policy in Austria. In recent years, the social sciences have seen the building of research infrastructures that generate data and of archives that store data. Data standards have been established, several working groups exist, and a number of activities aim to further develop various aspects of open science, open data and access to data. However, some barriers and challenges remain in the practice of sharing research data. One aspect that should be emphasised and incentivised is the re-use of research data.


Author(s):  
Lyubomir Penev ◽  
Dimitrios Koureas ◽  
Quentin Groom ◽  
Jerry Lanfear ◽  
Donat Agosti ◽  
...  

The Horizon 2020 project Biodiversity Community Integrated Knowledge Library (BiCIKL) (started 1 May 2021; duration: 3 years) will build a new European community of key research infrastructures, researchers, citizen scientists and other stakeholders in biodiversity and life sciences. Together, the 14 BiCIKL partners will solidify open science practices by providing access to data, tools and services at each stage of, and along the entire, biodiversity research and data life cycle (specimens, sequences, taxon names, analytics, publications, biodiversity knowledge graph) (Fig. 1; see also the BiCIKL kick-off presentation in Suppl. material 1), in compliance with the FAIR (Findable, Accessible, Interoperable and Reusable) data principles. The existing services provided by the participating infrastructures will expand through the development and adoption of shared, common or interoperable domain standards, resulting in liberated and enhanced flows of data and knowledge across these domains. BiCIKL puts a special focus on the biodiversity literature. Over the span of the project, BiCIKL will develop new methods and workflows for semantic publishing and for integrated access to harvesting, liberating, linking, and re-using sub-article-level data extracted from literature (i.e., specimens, material citations, sequences, taxonomic names, taxonomic treatments, figures, tables). Data linkages may be realised with different technologies (e.g., data warehousing, linking between FAIR Data Objects, Linked Open Data) and can be bi-lateral (between two data infrastructures) or multi-lateral (among multiple data infrastructures). The main challenge of BiCIKL is to design, develop and implement a FAIR Data Place (FDP), a central tool for the search, discovery and management of interlinked FAIR data across different domains.
The key final output of BiCIKL will be the Biodiversity Knowledge Hub (BKH), a one-stop portal providing access to the BiCIKL services, tools and workflows beyond the lifetime of the project.


2017 ◽  
Vol 25 (2) ◽  
pp. 150-173 ◽  
Author(s):  
Finbarr Brereton ◽  
Eoin O'Neill ◽  
Louise Dunne

Academic research is increasingly required to demonstrate economic and policy relevance, and this has become a key metric by which the success of research projects is judged. Furthermore, the active, as opposed to passive, participation of citizens in science is now encouraged through dissemination and outreach, using, for example, co-production techniques. These non-traditional academic impacts have become a key component of a number of funding agency calls, most notably the European Union’s research funding programme Horizon 2020. However, exactly how measurable these ‘impacts’ are, particularly social and policy impacts, is unclear, as there is no obvious metric. Additionally, there is no standardised approach to assessing research impact recognised in the social sciences. Using a case study describing the experience of using public engagement seminars as a means to disseminate academic research to stakeholder communities, this article aims to develop an impact assessment strategy for measuring societal impact that is applicable in the social sciences. Based on recommendations in the UK Research Excellence Framework, among other literature, we put forward three steps to better capture research ‘impact’ in a more meaningful way in future research projects: (i) establish the quality of the academic research, (ii) choose appropriate discipline-specific criteria for measuring societal impact, and (iii) choose appropriate measurable indicators. Other useful insights include the difficulty of motivating public interest in topics that are no longer high profile or emotive, and hence the necessity of providing access to research findings as early as possible in the research cycle. The article concludes with a discussion of the difficulties of measuring ‘impact’ in a meaningful sense.
