FAIR Principles for Digital Repositories: Essence and Applications for Heritage Objects

Author(s):  
Kalina Sotirova-Valkova

The emergence of the FAIR initiative in 2016 is based on the need for good management of disparate data and for improving the functionality of digital repositories and e-infrastructures. The aim is to promote the re-use of (scientific) data, a need recognized by academia, industry, funding agencies and memory institutions. This paper discusses the nature of the FAIR principles and their supporting technologies, the concepts of the FAIR digital object, the FAIR ecosystem and persistent identifiers, and a possible solution for publishing images in scientific publications and in museum digital repositories through the International Image Interoperability Framework (IIIF), all viewed through the lens of a possible digital vision for Bulgarian memory institutions.

Keywords: FAIR principles, heritage, Persistent Identifiers, LOD
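
The IIIF mentioned above is, at its core, a set of web APIs; a minimal sketch of how an image request URL is assembled under the IIIF Image API is shown below. The server base URL and image identifier are hypothetical placeholders, not taken from any particular repository.

```python
# Minimal sketch of building an IIIF Image API request URL
# ({base}/{identifier}/{region}/{size}/{rotation}/{quality}.{format}).
# The server base URL and identifier below are hypothetical.
from urllib.parse import quote


def iiif_image_url(base: str, identifier: str, region: str = "full",
                   size: str = "max", rotation: str = "0",
                   quality: str = "default", fmt: str = "jpg") -> str:
    # The identifier must be URI-encoded because it may itself contain slashes.
    return f"{base}/{quote(identifier, safe='')}/{region}/{size}/{rotation}/{quality}.{fmt}"


if __name__ == "__main__":
    # e.g. https://iiif.example.org/image/ms-0042_f001r/full/max/0/default.jpg
    print(iiif_image_url("https://iiif.example.org/image", "ms-0042_f001r"))
```

A repository that exposes such URLs (together with the accompanying info.json and Presentation API manifests) lets any IIIF-compliant viewer zoom, crop and cite its images without bespoke integration.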

RECIIS ◽  
2021 ◽  
Vol 15 (3) ◽  
Author(s):  
Patricia Henning ◽  
Luis Olavo Bonino Da Silva ◽  
Luís Ferreira Pires ◽  
Marten Van Sinderen ◽  
João Luís Rebelo Moreira

The FAIR principles have become a data management instrument for the academic and scientific community, since they provide a set of guiding principles to bring findability, accessibility, interoperability and reusability to data and metadata stewardship. Since their official publication in 2016 by Scientific Data – Nature, these principles have received worldwide recognition and have been quickly endorsed and adopted as a cornerstone of data stewardship and research policy. However, when put into practice, they occasionally result in organisational, legal and technological challenges that can lead to doubts and uncertainty as to whether the effort of implementing them is worthwhile. Soon after their publication, the European Commission and other funding agencies started to require that project proposals include a Data Management Plan (DMP) based on the FAIR principles. This paper reports on the adherence of DMPs to the FAIR principles, critically evaluating ten European DMP templates. We observed that the current FAIRness of most of these DMPs is only partly satisfactory, in that they address data best practices, findability, accessibility and sometimes preservation, but pay much less attention to metadata and interoperability.


Author(s):  
Joyce Mirella dos Anjos Viana ◽  
Paula Regina Dal'Evedove

Scientific data repositories are now a worldwide reality, contributing to the storage, preservation of and access to data from scientific research. Given the important role these systems play, this study investigates the indexing of scientific data within the Scientific Data Repositories Network of the State of São Paulo, a platform that provides access to scientific data and increases the visibility of research conducted at the participating institutions. To this end, the information policies established by each member institution are analyzed in order to contribute to studies of information representation in digital repositories. This is an exploratory, documentary study, with data collected by consulting the websites of the scientific data repositories and member institutions. Analysis of the identified information policies reveals that the data repositories linked to the Network partially meet the FAIR principles, use software and standards that allow interoperability, and have data identification systems and a protocol for collecting metadata. Indexing of scientific data in the member repositories is performed either by the author-researcher or by the team responsible for the system. More in-depth studies of the Network of Scientific Data Repositories of the State of São Paulo are needed, with emphasis on the quality of subject metadata and the specificities of indexing policies in scientific data repositories.
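
The "protocol for collecting metadata" mentioned above is, in repositories of this kind, commonly OAI-PMH. The sketch below shows a minimal harvest of Dublin Core titles from such an endpoint; the base URL is a hypothetical placeholder, not one of the Network's repositories.

```python
# Minimal sketch of harvesting Dublin Core records over OAI-PMH.
# The endpoint URL below is hypothetical.
import urllib.parse
import urllib.request
import xml.etree.ElementTree as ET

DC = "{http://purl.org/dc/elements/1.1/}"  # Dublin Core namespace


def list_titles(base_url: str, metadata_prefix: str = "oai_dc"):
    params = urllib.parse.urlencode({"verb": "ListRecords",
                                     "metadataPrefix": metadata_prefix})
    with urllib.request.urlopen(f"{base_url}?{params}") as resp:
        tree = ET.parse(resp)
    # Each harvested record carries a Dublin Core block; pull out dc:title.
    return [title.text for title in tree.iter(f"{DC}title")]


if __name__ == "__main__":
    for title in list_titles("https://repositorio.example.edu.br/oai/request"):
        print(title)
```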


2021 ◽  
Author(s):  
Zsolt Tibor Kosztyán ◽  
Beáta Fehérvölgyi ◽  
Tibor Csizmadia ◽  
Kinga Kerekes

Given the significant role of universities in economic growth and social progress, as well as the increasing demand for greater transparency regarding the use of public money, a valid assessment of university performance has become crucial for various stakeholders, including government, industry, funding agencies, and society at large. Contemporary assessments still focus solely on universities’ properties, thereby failing to capture their network relations. To overcome this limitation, this paper proposes a multilayer network-based method to measure the embeddedness of universities in collaboration and mobility networks. This method has several advantages: first, it is relevant for the core missions of higher education institutions (HEIs), introducing a new dimension complementary to the existing rankings; second, it is size invariant; and last but not least, it is fully transparent. The proposed multilayer network approach enables the integration of further networks, which creates opportunities for a more comprehensive assessment of universities’ performance in achieving their core missions.
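
As a toy illustration of the idea (not the authors' actual method), the sketch below places universities in two layers, collaboration and mobility, and scores each university's embeddedness as its mean normalized degree centrality across layers; the normalization is what keeps the score size invariant. The edge lists are invented.

```python
# Illustrative sketch only: a two-layer university network and a simple,
# size-invariant embeddedness score (mean normalized degree centrality).
# Edge lists are made up and do not reflect any real institutions.
import networkx as nx

layers = {
    "collaboration": [("UniA", "UniB"), ("UniA", "UniC"), ("UniB", "UniC")],
    "mobility": [("UniA", "UniB"), ("UniC", "UniD")],
}

graphs = {name: nx.Graph(edges) for name, edges in layers.items()}
nodes = set().union(*(g.nodes for g in graphs.values()))

embeddedness = {}
for u in sorted(nodes):
    # degree_centrality normalizes by (n - 1), so scores stay comparable
    # across layers and across universities of different sizes.
    scores = [nx.degree_centrality(g).get(u, 0.0) for g in graphs.values()]
    embeddedness[u] = sum(scores) / len(scores)

for u, score in sorted(embeddedness.items(), key=lambda kv: -kv[1]):
    print(f"{u}: {score:.2f}")
```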


2015 ◽  
Author(s):  
Ronald D Vale

Scientific publications enable results and ideas to be transmitted throughout the scientific community. The number and type of journal publications have also become the primary criteria used in evaluating career advancement. Our analysis suggests that publication practices have changed considerably in the life sciences over the past thirty years. More experimental data are now required for publication, and the average time required for graduate students to publish their first paper has increased and is approaching the desirable duration of Ph.D. training. Since publication is generally a requirement for career progression, schemes to reduce the time of graduate student and postdoctoral training may be difficult to implement without also considering new mechanisms for accelerating communication of their work. The increasing time to publication also delays potential catalytic effects that ensue when many scientists have access to new information. The time has come for life scientists, funding agencies, and publishers to discuss how to communicate new findings in a way that best serves the interests of the public and the scientific community.


2020 ◽  
Author(s):  
Rahul Ramachandran ◽  
Kaylin Bugbee ◽  
Kevin Murphy

Open science is a concept that represents a fundamental change in scientific culture. This change is characterized by openness, where research objects and results are shared as soon as possible, and by connectivity to a wider audience. What Open Science actually means, however, differs among stakeholders.

Thoughts on Open Science fall into four distinct viewpoints. The first viewpoint strives to make science accessible to a larger community by allowing non-scientists to participate in the research process through citizen science projects and by communicating research results more effectively to the broader public. The second viewpoint considers providing equitable knowledge access to everyone, covering not only journal publications but also other objects in the research process such as data and code. The third viewpoint focuses on making both the research process and the communication of results more efficient; this has a social and a technical component. The social component is driven by the need to tackle complex problems that require collaboration and a team approach to science, while the technical component focuses on creating tools, services and especially scientific platforms to make the scientific process more efficient. Lastly, the fourth viewpoint strives to develop new metrics that measure scientific contributions beyond those derived solely from scientific publications, taking into account other research objects such as data, code or knowledge shared through blogs and other social media communication mechanisms.

Technological change is a factor in all four of these viewpoints on Open Science. New capabilities in compute, storage, methodologies, publication and sharing enable technologists to serve as primary drivers for Open Science by providing more efficient technological solutions. Sharing knowledge, information and other research objects such as data and code has become easier with the new modalities of sharing available to researchers. In addition, technology is enabling the democratization of science at two levels. First, researchers are no longer constrained by a lack of the infrastructure resources needed to tackle difficult problems. Second, citizen science projects now involve the public at different steps of the scientific process, from collecting the data to analysis.

This presentation examines the four viewpoints on Open Science from the perspective of a large organization involved in scientific data stewardship and management, and lists possible technological strategies that organizations may adopt to align with all aspects of the Open Science movement.


2020 ◽  
Author(s):  
Chad Trabant ◽  
Rick Benson ◽  
Rob Casey ◽  
Gillian Sharer ◽  
Jerry Carter

The data center of the National Science Foundation’s Seismological Facility for the Advancement of Geoscience (SAGE), operated by IRIS Data Services, has evolved over the past 30 years to address the data accessibility needs of the scientific research community. In recent years a broad call for adherence to the FAIR data principles has prompted repositories to increase their activity in support of them. Because these principles are well aligned with the needs of data users, many of them are already supported and actively promoted by IRIS. Standardized metadata and data identifiers support findability. Open, standardized web services enable a high degree of accessibility. Interoperability is ensured by offering data in rich, domain-specific formats alongside simple, text-based formats. The use of open, rich (domain-specific) format standards enables a high degree of reuse. Further advancement towards these principles includes the introduction and dissemination of DOIs for data, and the introduction of Linked Data support, via JSON-LD, allowing scientific data brokers, catalogers and generic search systems to discover data. Naturally, some challenges remain, such as the granularity and mechanisms needed for persistent identifiers for data, the fact that metadata is updated with corrections (which has implications for the FAIR data principles), and the complexity of data licensing in a repository with data contributed by individual PIs, national observatories, and international collaborations. In summary, IRIS Data Services is well along the path of adherence to the FAIR data principles, with more work to do. We will present the current status of these efforts and describe the key challenges that remain.
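
The JSON-LD support mentioned above typically takes the form of schema.org Dataset markup embedded in a dataset landing page; the sketch below shows what such a description can look like. The identifier, names and URLs are placeholders, not IRIS's actual records.

```python
# Minimal sketch of schema.org Dataset markup in JSON-LD, the mechanism that
# lets generic web-scale search systems discover repository holdings.
# Identifier, names and URLs are placeholders, not IRIS's actual records.
import json

dataset = {
    "@context": "https://schema.org/",
    "@type": "Dataset",
    "name": "Example broadband seismic waveform dataset",
    "identifier": "https://doi.org/10.0000/example-network",  # placeholder DOI
    "description": "Continuous waveform data from a hypothetical seismic network.",
    "license": "https://creativecommons.org/licenses/by/4.0/",
    "distribution": {
        "@type": "DataDownload",
        "encodingFormat": "application/vnd.fdsn.mseed",
        "contentUrl": "https://service.example.org/fdsnws/dataselect/1/query",
    },
}

# Embedded in a <script type="application/ld+json"> element on the landing
# page, this block makes the holding findable by generic crawlers.
print(json.dumps(dataset, indent=2))
```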


Author(s):  
Ingrid Dillo ◽  
Lisa De Leeuw

Open data and data management policies that call for the long-term storage and accessibility of data are becoming more and more commonplace in the research community, and with them the need for trustworthy data repositories to store and disseminate data is growing. CoreTrustSeal, a community-based, non-profit organisation, offers data repositories a core-level certification based on the DSA-WDS Core Trustworthy Data Repositories Requirements catalogue and procedures. This universal catalogue of requirements reflects the core characteristics of trustworthy data repositories. Core certification involves an uncomplicated process whereby data repositories supply evidence that they are sustainable and trustworthy. A repository first conducts an internal self-assessment, which is then reviewed by community peers. Once the self-assessment is found adequate, the CoreTrustSeal board certifies the repository with a CoreTrustSeal, valid for a period of three years. Being a certified repository has several external and internal benefits: it improves the quality and transparency of internal processes, increases awareness of and compliance with established standards, builds stakeholder confidence, enhances the reputation of the repository, and demonstrates that the repository follows good practices. It also offers a benchmark for comparison and helps to determine the strengths and weaknesses of a repository. In the future we foresee a larger uptake across domains, not least because, within the European Open Science Cloud, the FAIR principles and therefore also the certification of trustworthy digital repositories holding data are becoming increasingly important. In addition, the CoreTrustSeal requirements will most probably become a European technical standard that can be used in procurement (currently under review by the European Commission).


2020 ◽  
Vol 2 (1-2) ◽  
pp. 122-130 ◽  
Author(s):  
Larry Lannom ◽  
Dimitris Koureas ◽  
Alex R. Hardisty

We examine the intersection of the FAIR principles (Findable, Accessible, Interoperable and Reusable), the challenges and opportunities presented by the aggregation of widely distributed and heterogeneous data about biological and geological specimens, and the use of the Digital Object Architecture (DOA) data model and components as an approach to solving those challenges that offers adherence to the FAIR principles as an integral characteristic. This approach will be prototyped in the Distributed System of Scientific Collections (DiSSCo) project, the pan-European Research Infrastructure which aims to unify over 110 natural science collections across 21 countries. We take each of the FAIR principles, discuss them as requirements in the creation of a seamless virtual collection of bio/geo specimen data, and map those requirements to Digital Object components and facilities such as persistent identification, extended data typing, and the use of an additional level of abstraction to normalize existing heterogeneous data structures. The FAIR principles inform and motivate the work and the DO Architecture provides the technical vision to create the seamless virtual collection vitally needed to address scientific questions of societal importance.
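
As a rough illustration of the Digital Object idea (not DiSSCo's actual implementation), the sketch below wraps heterogeneous specimen records in a common object carrying a persistent identifier, an explicit type and normalized metadata, which is the extra level of abstraction the paper refers to. The field names and the PID are assumptions.

```python
# Illustrative sketch only: wrapping heterogeneous specimen records in a
# common Digital Object (PID + type + normalized metadata + original source).
# Field names and the PID below are hypothetical.
from dataclasses import dataclass, field
from typing import Any, Dict


@dataclass
class DigitalObject:
    pid: str                  # persistent identifier, e.g. a Handle
    do_type: str              # registered data type, e.g. "DigitalSpecimen"
    metadata: Dict[str, Any]  # normalized attributes
    source: Dict[str, Any] = field(default_factory=dict)  # original record


def wrap_specimen(record: Dict[str, Any], pid: str) -> DigitalObject:
    """Normalize a source record, whatever its schema, into one abstraction."""
    normalized = {
        "scientificName": record.get("scientificName") or record.get("taxon"),
        "collectionCode": record.get("collectionCode") or record.get("coll"),
        "country": record.get("country") or record.get("countryCode"),
    }
    return DigitalObject(pid=pid, do_type="DigitalSpecimen",
                         metadata=normalized, source=record)


if __name__ == "__main__":
    legacy = {"taxon": "Parus major", "coll": "AVES", "countryCode": "NL"}
    print(wrap_specimen(legacy, pid="prefix/hypothetical-suffix"))
```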


2017 ◽  
Author(s):  
Chun-Nan Hsu ◽  
Anita Bandrowski ◽  
Jeffrey S. Grethe ◽  
Maryann E. Martone

Digital repositories have a direct impact and influence on the research community and society, but measuring their value using formal metrics remains challenging; no single metric covers all quality aspects. Here we distinguish between impact and influence and discuss measures and mentions as the basis of quality metrics for a digital repository. We argue that these challenges may potentially be overcome through the introduction of standard resource identification and data citation practices. We briefly summarize our research and experience in the Neuroscience Information Framework, the BD2K BioCaddie project on data citation, and the Resource Identification Initiative. Full implementation of these standards will depend on cooperation from all stakeholders (digital repositories, authors, publishers, and funding agencies), but both resource and data citation have been gaining support from researchers and publishers.
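
Resource identifiers of the kind promoted by the Resource Identification Initiative appear in manuscripts as strings of the form RRID:<registry prefix>_<accession>; the sketch below uses a deliberately simplified pattern to spot them in text. Both the pattern and the example identifiers are illustrative placeholders, not the production curation pipeline or real registry entries.

```python
# Illustrative sketch: spotting Research Resource Identifiers (RRIDs) in text.
# The regex is a simplification and the example identifiers are placeholders.
import re

# RRIDs look like "RRID:<registry prefix>_<accession>", e.g. "RRID:SCR_0000000".
RRID_PATTERN = re.compile(r"RRID:\s*([A-Za-z]+_[A-Za-z0-9-]+)")

text = ("Images were analyzed with ExampleTool (RRID:SCR_0000000) using "
        "a commercial antibody (RRID:AB_0000000).")

for match in RRID_PATTERN.finditer(text):
    print("found resource identifier:", match.group(1))
```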


Author(s):  
I. Ivánová ◽  
N. Brown ◽  
R. Fraser ◽  
N. Tengku ◽  
E. Rubinov

FAIR, which stands for Findable, Accessible, Interoperable and Reusable, denotes the main principles adopted for sharing scientific data across communities. Implementing the FAIR principles in publishing increases the value of digital resources and their reuse by humans as well as machines. Introducing FAIR practices to the geospatial domain is especially relevant for foundation geospatial data, such as precise positioning data. Within the next five years, Global Navigation Satellite Systems (GNSS), with corrections delivered over the internet or via satellite communications, will permit national coverage of positioning services with real-time accuracy of several centimetres or better. However, implementing the FAIR principles is not yet common practice in the geospatial domain. Dozens of standards are available for defining and sharing geospatial data, including the ISO 19100 series of standards, OGC specifications and several community profiles and best practices; in most cases, however, these standards fall short of ensuring the FAIR distribution of geospatial resources. As our preliminary findings show, current geodetic metadata and data are not yet fully FAIR, and data discovery and access are still very challenging. In this paper we discuss the concept of FAIR and its meaning for geodetic data, explore the needs of precise positioning users and their requirements for metadata, and present preliminary results on the FAIRness of current geodetic standards.
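
To make the idea of assessing FAIRness concrete, the sketch below runs a handful of basic checks (identifier, access URL, standard format, license) against a metadata record for a GNSS dataset. It is an illustration only, not the assessment framework used in the paper, and the field names and example record are assumptions.

```python
# Illustrative sketch only: a few basic FAIRness checks against a metadata
# record for a GNSS dataset. Field names and the example record are assumed.
from typing import Any, Dict

CHECKS = {
    "F: has a persistent identifier": lambda r: bool(r.get("identifier")),
    "A: has an access URL":           lambda r: bool(r.get("access_url")),
    "I: uses a standard data format": lambda r: r.get("format") in {"RINEX", "GeodesyML"},
    "R: has an explicit license":     lambda r: bool(r.get("license")),
}


def fairness_report(record: Dict[str, Any]) -> None:
    for label, check in CHECKS.items():
        print(f"[{'PASS' if check(record) else 'FAIL'}] {label}")


if __name__ == "__main__":
    station_record = {
        "identifier": "https://doi.org/10.0000/example-gnss",  # placeholder DOI
        "access_url": "https://data.example.org/gnss/STN1",
        "format": "RINEX",
        # no "license" key, so the reusability check fails
    }
    fairness_report(station_record)
```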

