International Journal of Digital Curation
Latest Publications


Total documents: 499 (five years: 80)
H-index: 23 (five years: 1)
Published by: UKOLN, University of Bath
ISSN: 1746-8256

2022 ◽  
Vol 16 (1) ◽  
pp. 10
Author(s):  
Bradley Wade Bishop ◽  
Carolyn F Hank ◽  
Joel T Webster

This paper assesses data consumers’ perspectives on the interoperable and re-usable aspects of the FAIR Data Principles. Taking a domain-specific informatics approach, ten oceanographers were asked to think of a recent search for data and describe their process of discovery, evaluation, and use. The interview schedule, derived from the FAIR Data Principles, included questions about the interoperability and re-usability of data. Through this critical incident technique, findings on data interoperability and re-usability give data curators valuable insights into how real-world users access, evaluate, and use data. Results from this study show that oceanographers use tools that make re-use simple, with interoperability seamless within the systems used. The processes employed by oceanographers present a good baseline for other domains adopting the FAIR Data Principles.


2021 ◽  
Vol 16 (1) ◽  
pp. 6
Author(s):  
Caroline Jay ◽  
Robert Haines ◽  
Daniel S. Katz

Software now lies at the heart of scholarly research. Here we argue that as well as being important from a methodological perspective, software should, in many instances, be recognised as an output of research, equivalent to an academic paper. The article discusses the different roles that software may play in research and highlights the relationship between software and research sustainability and reproducibility. It describes the challenges associated with the processes of citing and reviewing software, which differ from those used for papers. We conclude that whilst software outputs do not necessarily fit comfortably within the current publication model, there is a great deal of positive work underway that is likely to make an impact in addressing this.
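
To make the citation challenge concrete, the sketch below writes a minimal CITATION.cff file, the Citation File Format that many code-hosting and indexing services now recognise for software. The project name, version, and DOI are invented placeholders, not an example from the article.

```python
# Minimal sketch: writing a CITATION.cff file so a software release can be
# cited like a paper. Field names follow the Citation File Format (CFF)
# specification; the project details below are hypothetical placeholders.
from pathlib import Path

CITATION_CFF = """\
cff-version: 1.2.0
message: "If you use this software, please cite it as below."
title: "example-analysis-toolkit"
version: "1.4.0"
doi: "10.5281/zenodo.0000000"
date-released: "2021-06-01"
authors:
  - family-names: "Researcher"
    given-names: "Example"
"""

Path("CITATION.cff").write_text(CITATION_CFF)
print("Wrote CITATION.cff; archiving a tagged release (e.g. via Zenodo)"
      " yields the DOI referenced above.")
```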


2021 ◽  
Vol 16 (1) ◽  
pp. 17
Author(s):  
Evy Neyens ◽  
Sadia Vancauwenbergh

In Flanders, Research Performing Organizations (RPOs) are required to provide information on publicly financed research to the Flemish Research Information Space (FRIS), a current research information system and research discovery platform hosted by the Flemish Department of Economy, Science and Innovation. FRIS currently discloses information on researchers, research institutions, publications, and projects. Flemish decrees on Special and Industrial research funding, together with the Flemish Open Science policy, require RPOs to also provide metadata on research datasets to FRIS. To ensure accurate and uniform delivery of research dataset information to FRIS across all providing institutions, it is necessary to develop a common application profile for research datasets. This article outlines the development of the Flemish application profile for research datasets by the Flemish Open Science Board (FOSB) Working Group Metadata and Standardization. The main challenge was to achieve interoperability among stakeholders, some of which had existing metadata schemes and research information infrastructures in place, while others were still in the early stages of development.
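
As an illustration of what such an application profile enables, here is a minimal sketch of a metadata crosswalk from a local institutional schema to a shared set of required fields. Both schemas are hypothetical (loosely DataCite-flavoured) and are not the actual FRIS profile.

```python
# Illustrative sketch of a metadata crosswalk from a local institutional
# schema to a shared application profile. Neither schema is the actual
# FRIS application profile; both are invented for illustration.

# Required fields of the (hypothetical) common application profile.
PROFILE_FIELDS = ["identifier", "title", "creator", "publisher",
                  "publication_year", "resource_type"]

# Mapping from one institution's local field names to the profile.
LOCAL_TO_PROFILE = {
    "doi": "identifier",
    "dataset_title": "title",
    "owner": "creator",
    "institution": "publisher",
    "year": "publication_year",
    "kind": "resource_type",
}

def to_profile(local_record: dict) -> dict:
    """Translate a local record and report any missing required fields."""
    record = {LOCAL_TO_PROFILE[k]: v
              for k, v in local_record.items() if k in LOCAL_TO_PROFILE}
    missing = [f for f in PROFILE_FIELDS if f not in record]
    if missing:
        raise ValueError(f"record incomplete, missing: {missing}")
    return record

print(to_profile({
    "doi": "10.1234/abcd", "dataset_title": "Sea surface temperatures",
    "owner": "Jane Doe", "institution": "Example University",
    "year": 2021, "kind": "Dataset",
}))
```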


2021 ◽  
Vol 16 (1) ◽  
pp. 16
Author(s):  
Amy Currie ◽  
William Kilbride

Digital preservation is a fast-moving and growing community of practice of ubiquitous relevance, but one in which capability is unevenly distributed. Within the open science and research data communities, digital preservation aligns closely with the FAIR principles and is delivered through a complex specialist infrastructure comprising technology, staff, and policy. However, capacity erodes quickly, establishing a need for ongoing examination and review to ensure that skills, technology, and policy remain fit for changing purpose. To address this challenge, the Digital Preservation Coalition (DPC) conducted the FAIR Forever study, commissioned by the European Open Science Cloud (EOSC) Sustainability Working Group and funded by the EOSC Secretariat Project in 2020, to assess the current strengths, weaknesses, opportunities, and threats to the preservation of research data across EOSC, and the feasibility of establishing shared approaches, workflows, and services that would benefit EOSC stakeholders. This paper draws on the FAIR Forever study to document and explore its key findings on the identified strengths, weaknesses, opportunities, and threats to the preservation of FAIR data in EOSC, and to the preservation of research data more broadly. It begins with the background of the study and an overview of its methodology: a desk-based assessment of the emerging EOSC vision, interviews with representatives of EOSC stakeholders, and focus groups with digital preservation specialists and data managers in research organizations. It summarizes key findings on the need for clarity on digital preservation in the EOSC vision and for elucidation of roles, responsibilities, and accountabilities to mitigate risks to data, reputation, and sustainability. It then outlines the recommendations of the final report presented to the EOSC Sustainability Working Group. To better ensure that research data can be FAIRer for longer, the recommendations are discussed in terms of how they can be extended and applied to research data stakeholders in and outside of EOSC, and ways are suggested to bring together the research data curation, management, and preservation communities to better ensure FAIRness now and in the long term.


2021 ◽  
Vol 16 (1) ◽  
pp. 8
Author(s):  
Mary Elizabeth Downing-Turner

Analog audio materials present unique preservation and access challenges for even the largest libraries. These challenges are magnified for smaller institutions, where budgets, staffing, and equipment limit what can be achieved. Because in-house digitization of analog audio is often out of reach for smaller institutions, the choice is between finding room in the budget to outsource a project or watching important materials decay. Cost is the most significant barrier to audio migration: audio preservation labs can charge hundreds or even thousands of dollars to migrate analog recordings to digital, and top-tier audio preservation equipment is equally expensive. When faced with the decomposition of an oral history collection recorded on cassette tape, one library decided that where there was a will, there was a way. The College of Education One-Room Schoolhouse Oral History Collection consisted of 247 audio cassettes containing interviews with one-room schoolhouse teachers from 68 counties in Kansas. The tapes were between 20 and 40 years old and generally inaccessible for research for fear they could be damaged during playback. This case study looks at how a single Digital Curation Librarian with no audio digitization experience migrated nearly 200 hours of audio to digital using a $40 audio converter from Amazon and a campus subscription to Adobe Audition. It covers the decision to digitize the collection; the digitization process, including audio clean-up; metadata collection and creation; presentation of the collection in CONTENTdm; and final preservation of the audio files. The project took 20 months to complete and yielded significant lessons that have informed decisions on future audio conversion projects.
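
For readers planning a similar project, the sketch below illustrates one post-capture step such a workflow typically needs: generating a fixity manifest for the digitized preservation masters. The directory layout and file naming are hypothetical; the capture itself (cassette deck, USB converter, Adobe Audition) happens upstream of this script.

```python
# Minimal sketch of fixity and inventory for digitized audio masters.
# File layout and naming are hypothetical examples.
import csv
import hashlib
from pathlib import Path

def sha256(path: Path) -> str:
    """Stream a file through SHA-256 so large WAVs don't fill memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def inventory(master_dir: str, manifest: str = "manifest.csv") -> None:
    """Record filename, size, and checksum for every preservation master."""
    rows = [(p.name, p.stat().st_size, sha256(p))
            for p in sorted(Path(master_dir).glob("*.wav"))]
    with open(manifest, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["filename", "bytes", "sha256"])
        writer.writerows(rows)

inventory("masters")  # e.g. masters/tape_001_side_a.wav, ...
```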


2021 ◽  
Vol 16 (1) ◽  
pp. 15
Author(s):  
Evanthia Samaras

Digital visual effects (VFX), including computer animation, have become a commonplace feature of contemporary episodic and film production projects. Using various commercial applications and bespoke tools, VFX artists craft digital objects (known as “assets”) to create visual elements such as characters and environments, which are composited together and output as shots. While the shots that make up a finished film or television (TV) episode are maintained and preserved within purpose-built digital asset management systems and repositories by the studios commissioning the projects, the wider VFX network currently has no consistent guidelines or requirements for the digital curation of VFX assets and records, including guidance on how to effectively future-proof digital VFX and preserve it for the long term. In this paper I provide a case study, a single shot from a 3D animation short film, to illustrate the complexities of digital VFX assets and records and the pipeline environments in which they are generated. I also draw on data from interviews with over 20 professional VFX practitioners from award-winning VFX companies, and undertake a socio-technical analysis of VFX using actor-network theory. I explain how high volumes of digital information, rapid technology progression, and dependencies on software pose significant preservation challenges. In addition, I outline how, by conducting holistic appraisal, selection, and disposal activities across its entire digital collections, and by continuing to develop and adopt open formats, the VFX industry can improve its capability to preserve first-hand evidence of its work in the years to come.
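
As a hypothetical illustration of the holistic appraisal the paper recommends, the sketch below inventories a shot directory by file format and flags proprietary formats for migration review. The directory layout and the open-format list are assumptions, not a description of any studio's actual pipeline.

```python
# Hypothetical sketch: inventorying a shot's assets by format to support
# appraisal decisions. The layout and the open/proprietary split below
# are illustrative only.
import json
from collections import Counter
from pathlib import Path

# Formats this sketch treats as "open" for preservation purposes.
OPEN_FORMATS = {".usd", ".usda", ".abc", ".exr", ".obj", ".png"}

def appraise_shot(shot_dir: str) -> dict:
    """Count files per extension and flag non-open formats for review."""
    files = [p for p in Path(shot_dir).rglob("*") if p.is_file() and p.suffix]
    by_ext = Counter(p.suffix.lower() for p in files)
    flagged = sorted({ext for ext in by_ext if ext not in OPEN_FORMATS})
    return {"shot": shot_dir,
            "files": sum(by_ext.values()),
            "formats": dict(by_ext),
            "review_for_migration": flagged}

print(json.dumps(appraise_shot("shots/sq010_sh0040"), indent=2))
```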


2021 ◽  
Vol 16 (1) ◽  
pp. 36
Author(s):  
Jukka Rantasaari

Sound research data management (RDM) competencies are elementary tools used by researchers to ensure integrated, reliable, and re-usable data, and to produce high-quality research results. In this study, 35 doctoral students and faculty members were asked to rate or self-rate doctoral students’ current RDM competencies and to rate the importance of those competencies. Structured interviews were conducted, using closed-ended and open-ended questions covering research data lifecycle phases such as collection, storing, organization, documentation, processing, analysis, preservation, and data sharing. The quantitative analysis of the respondents’ answers indicated a wide gap between doctoral students’ rated or self-rated current competencies and the rated importance of these competencies. In the qualitative analysis of the interviews, two major educational needs were identified: to improve and standardize data management planning, including awareness of the intellectual property and agreement issues affecting data processing and sharing; and to improve and standardize data documentation and description, not only for the researchers themselves but especially for data preservation, sharing, and re-use. The study thus informs the development of RDM education for doctoral students.
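
The competence-importance gap at the heart of the quantitative analysis can be illustrated with a small sketch; the ratings below are invented, since the interview data is not public.

```python
# Sketch of the competence-importance gap the study quantifies, using
# invented ratings on a 1-5 scale.
ratings = {
    # competency: (mean self-rated competence, mean rated importance)
    "data management planning": (2.1, 4.6),
    "documentation and description": (2.4, 4.5),
    "storage and backup": (3.5, 4.8),
    "data sharing and licensing": (2.0, 4.2),
}

# Rank competencies by the gap between importance and current competence.
gaps = sorted(((imp - comp, name) for name, (comp, imp) in ratings.items()),
              reverse=True)
for gap, name in gaps:
    print(f"{name:32s} gap = {gap:.1f}")
```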


2021 ◽  
Vol 16 (1) ◽  
pp. 23
Author(s):  
Vivian B. Hutchison ◽  
Tamar Norkin ◽  
Maddison L. Langseth ◽  
Drew A. Ignizio ◽  
Lisa S. Zolly ◽  
...  

As Federal Government agencies in the United States pivot to increase access to scientific data (Sheehan, 2016), the U.S. Geological Survey (USGS) has made substantial progress (Kriesberg et al., 2017). USGS authors are required to make federally funded data publicly available in an approved data repository (USGS, 2016b). This type of public data product, known as a USGS data release, serves as a method for publishing reviewed and approved data. In this paper, we present major milestones in the approach the USGS took to transition an existing technology platform to a Trusted Digital Repository. We describe both the technical and the non-technical actions that contributed to a successful outcome. We highlight how initial workflows revealed patterns that were later automated, and the ways in which assessments and user feedback influenced design and implementation. The paper concludes with lessons learned, such as the importance of a community of practice, application programming interface (API)-driven technologies, iterative development, and user-centered design. This paper is intended to offer a potential roadmap for organizations pursuing similar goals.
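
The API-driven pattern highlighted in the lessons learned can be sketched as a simple catalog query. The endpoint and parameters below reflect the public ScienceBase catalog search as commonly used, but should be treated as illustrative rather than a documented contract.

```python
# Sketch of an API-driven repository interaction: searching a data
# catalog over HTTP and listing matching item titles. Endpoint and
# parameters are illustrative, not a guaranteed interface.
import requests

def search_data_releases(query: str, limit: int = 5) -> list:
    resp = requests.get(
        "https://www.sciencebase.gov/catalog/items",
        params={"q": query, "format": "json", "max": limit},
        timeout=30,
    )
    resp.raise_for_status()
    return [item.get("title") for item in resp.json().get("items", [])]

for title in search_data_releases("water quality data release"):
    print(title)
```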


2021 ◽  
Vol 16 (1) ◽  
pp. 16
Author(s):  
Live Kvale ◽  
Nils Pharo

A three-phase Delphi study was used to investigate an emerging research data management community in Norway and its understanding and application of data management plans (DMPs). The findings reveal differing visions of what the DMP should be, as well as different approaches to practice, yet the stakeholders share common goals. This paper discusses the different perspectives on the DMP by applying Star and Griesemer’s theory of boundary objects (Star & Griesemer, 1989). The debate on what the DMP is and the findings presented are relevant to all research communities currently implementing DMP procedures and requirements. Current discussions about DMPs tend to be distant from active researchers and limited to the needs of funders and institutions rather than the usefulness for researchers. By analysing the DMP as a boundary object, plastic and adaptable yet with a robust identity (Star & Griesemer, 1989), and by translating between the worlds in which collaboration on data sharing takes place, we broaden the perspective to include all stakeholders. Understanding the DMP as a boundary object can shift the focus from shaping a DMP that fulfils funders’ requirements to enabling collaboration on data management and sharing across domains using standardised forms.
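
One standardised form that could let the DMP work as a boundary object is the machine-actionable DMP (maDMP). The sketch below assembles a minimal maDMP; the field names loosely follow the RDA DMP Common Standard, and the project details are invented.

```python
# Sketch of a machine-actionable DMP (maDMP): a DMP expressed as data
# that funders, institutions, and researchers can all consume. Field
# names loosely follow the RDA DMP Common Standard; details are invented.
import json

dmp = {
    "dmp": {
        "title": "DMP for a marine sensor study",
        "created": "2021-03-01",
        "language": "eng",
        "contact": {"name": "Example Researcher",
                    "mbox": "researcher@example.org"},
        "dataset": [{
            "title": "Sensor time series",
            "personal_data": "no",
            "sensitive_data": "no",
            "distribution": [{"access_url": "https://repo.example.org/ds/1",
                              "data_access": "open"}],
        }],
    }
}

print(json.dumps(dmp, indent=2))
```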


2021 ◽  
Vol 16 (1) ◽  
pp. 31
Author(s):  
Alexis Martin ◽  
Charlotte Chazeau ◽  
Nicolas Gasco ◽  
Guy Duhamel ◽  
Patrice Pruvost

The scientific monitoring of the French fishing industry in the Southern Ocean is based on the Pecheker database. Pecheker is dedicated to the digital curation of data collected in the field by scientific observers; analysis of these data allows scientists at the Muséum national d’Histoire naturelle to provide guidelines and advice for regulating fishing activity, protecting fish stocks, and protecting marine ecosystems. The Pecheker template has been developed to adapt the database to the ecosystem-based management concept. Given the global context of biodiversity erosion, this modern approach to management aims to take account of the environmental background of fisheries to ensure their sustainable development. Completeness and high quality of the raw data are key for an ecosystem-based management database such as Pecheker. Here, we present the development of this database as a case study of fisheries data curation to share with readers. Full code to deploy a database based on the Pecheker template is provided in the supplementary materials. Based on the success factors we identified, we discuss how the community could build a global fisheries information system as a network of small databases incorporating interoperability standards.
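
Since the full Pecheker template lives in the paper's supplementary materials, the sketch below shows only the flavour of an ecosystem-oriented fisheries schema: a fishing operation, its catch, and the environmental context observed alongside it. Tables and columns are illustrative, not the actual Pecheker schema.

```python
# A much-reduced, illustrative sketch of an ecosystem-oriented fisheries
# schema in the spirit of Pecheker, using an in-memory SQLite database.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE fishing_operation (
    operation_id INTEGER PRIMARY KEY,
    vessel       TEXT NOT NULL,
    date_utc     TEXT NOT NULL,
    latitude     REAL NOT NULL,   -- decimal degrees
    longitude    REAL NOT NULL,
    depth_m      REAL
);
CREATE TABLE catch_record (
    catch_id     INTEGER PRIMARY KEY,
    operation_id INTEGER NOT NULL REFERENCES fishing_operation,
    taxon        TEXT NOT NULL,   -- a standard taxon identifier aids interoperability
    weight_kg    REAL,
    retained     INTEGER NOT NULL CHECK (retained IN (0, 1))
);
-- Environmental context recorded by observers, supporting
-- ecosystem-based management rather than catch statistics alone.
CREATE TABLE environment_observation (
    operation_id INTEGER NOT NULL REFERENCES fishing_operation,
    seabird_count       INTEGER,
    marine_mammal_count INTEGER,
    sea_surface_temp_c  REAL
);
""")
print("schema created:",
      [r[0] for r in conn.execute(
          "SELECT name FROM sqlite_master WHERE type='table'")])
```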

