Data Preservation
Recently Published Documents

TOTAL DOCUMENTS: 188 (five years: 57)
H-INDEX: 12 (five years: 2)

2022, pp. 352-368
Author(s): Cahyo Trianggoro, Abdurrakhman Prasetyadi

In recent decades, libraries, archives, and museums have created digital collections comprising millions of objects in order to provide long-term access to them. One of the core preservation activities is the evaluation of appropriate formats for encoding digital content. The development of science has entered the fourth paradigm, in which research has become far more data-intensive than in previous periods. This situation raises new challenges in managing research data, especially regarding the preservation of data in digital formats so that research data can be used over the long term. The fourth paradigm also allows researchers to collaborate and to reuse research datasets produced by other research groups. To take advantage of each other's data, researchers must share a common understanding of the FAIR principles, an acronym for Findable, Accessible, Interoperable, and Reusable.


2021
Author(s): Seogchan Kang, Ki-Tae Kim, Jaeyoung Choi, Hyun Kim, Kyeongchae Cheong, ...

Genomics' impact on crop production continues to expand. The number of sequenced plant and microbial species, and of strains representing diverse populations of individual species, is increasing rapidly thanks to the advent of next-generation sequencing technologies. Their genomic blueprints have revealed candidate genes involved in various functions and processes crucial for crop health and have helped explain how the sequenced organisms evolved at the genome level. Functional genomics quickly translates these blueprints into a detailed mechanistic understanding of how such functions and processes work and are regulated; this understanding guides and empowers efforts to protect crops from diverse biotic and abiotic threats. Metagenome analyses help identify candidate microbes crucial for crop health and uncover how the microbial communities associated with crop production respond to environmental conditions and cultural practices, presenting opportunities to enhance crop health by judiciously configuring microbial communities. Efficient conversion of disparate types of massive genomics data into actionable knowledge requires a robust informatics infrastructure supporting data preservation, analysis, and sharing. This review starts with an overview of how genomics came about and has quickly transformed the life sciences. Drawing on selected studies, we illustrate how genomics and informatics can be applied to investigate various crop health-related problems. We end the review by noting why community empowerment via crowdsourcing is crucial to harnessing genomics to protect global food and nutrition security without continuously expanding the environmental footprint of crop production.


2021, Vol 16 (1), pp. 36
Author(s): Jukka Rantasaari

Sound research data management (RDM) competencies are elementary tools that researchers use to ensure integrated, reliable, and reusable data, and to produce high-quality research results. In this study, 35 doctoral students and faculty members were asked to rate or self-rate doctoral students' current RDM competencies and to rate the importance of these competencies. Structured interviews were conducted using close-ended and open-ended questions covering research data lifecycle phases such as collection, storage, organization, documentation, processing, analysis, preservation, and sharing. The quantitative analysis of the respondents' answers indicated a wide gap between doctoral students' rated or self-rated current competencies and the rated importance of these competencies. In conclusion, two major educational needs were identified in the qualitative analysis of the interviews: to improve and standardize data management planning, including awareness of the intellectual property and agreement issues affecting data processing and sharing; and to improve and standardize data documentation and description, not only for the researchers themselves but especially for data preservation, sharing, and reuse. The study thus informs the development of RDM education for doctoral students.


2021, pp. 1-25
Author(s): Peter J. Cobb, Koraljka Golub

The digital humanities (DH) is an emerging field of teaching and research that invites modern technologies to address traditional humanities questions while simultaneously making space for humanistic critiques of those technologies. A natural relationship exists between DH and the field of information studies (the iField), particularly surrounding their common focus on the interface between humans and computers, as well as subfields such as the organization of information, libraries and archives, data preservation, and information in society. Thus, we propose that iField programs in universities should take an active role in DH education. We are particularly interested in programs that are officially Information Schools (iSchools), members of the international iSchools Organization. Our research began as part of a DH curriculum committee convened by the iSchools Organization. To support iSchool engagement in DH education, we have inventoried and analyzed the degrees and supplemental credentials offered by DH education programs throughout the world. Our study deployed multiple data collection methods, which included conducting both ad hoc and comprehensive website surveys, querying an online DH catalog, and inviting members of the iSchools Organization to participate in an online questionnaire. This work has revealed several common patterns for the current structure of DH programs, including the various types of degrees or supplemental credentials offered. We observe that iSchools have a significant opportunity to become more engaged in DH education and we suggest several possible approaches based on our research.


2021
Author(s): Despoina Tsiafaki, Markos Katsianis

This article provides an overview of the current situation in Greece regarding digital archaeological data stewardship. A brief chronicle of Greek archaeology sets the scene for a better understanding of the present situation. Greek archaeology is supervised by the Ministry of Culture and Sports, with the Archaeological Service as the central organisation in charge of antiquities; however, the archaeological data resulting from fieldwork are produced by several other entities as well. This article presents the policies governing both physical and digital documentation archives. It introduces the current practices for archaeological data preservation and the relevant digital infrastructures, attempting to showcase the existing environment. We categorise the prevailing problems at three levels, all rooted in the fact that digital methods and open access arrived recently in a well-established environment formed gradually over almost two centuries. Even so, fragmentation and variation are the proper terms to describe the status of the stewardship of digital archaeological data in Greece. Our review shows that there is substantial effort directed towards digital archaeological data stewardship and accessibility by all stakeholders within the archaeological sector. Finally, we add a few thoughts and suggestions and indicate the need for a network that could take steps towards more inclusive strategies in digital data stewardship. The key to leveraging change is raising awareness about data sustainability and reuse, and the COVID-19 outbreak indicates a clear change of mentality in this direction, since open access resources have become key to education and research conducted in Greece.


Author(s): Raveendra Gudodagi, R. Venkata Siva Reddy

Compression of genomic data has gained enormous momentum in recent years because of advances in technology, exponentially growing health concerns, and government funding for research. Such advances have driven us towards personalized public health and medical care, and they pose a considerable challenge for ubiquitous computing in data storage. One of the main issues faced by genomic laboratories is the cost of storage, due to the large size of human genome data files (ranging from 30 GB to 200 GB). Data preservation is a set of actions meant to protect data from unauthorized access or changes; several methods are used to protect data, and encryption is one of them. Protecting genomic data is a critical concern in genomics because it includes personal data. In this article, we propose a secure encryption and decryption technique for diverse genomic data (FASTA/FASTQ format). Because sequenced data is massive in bulk, the raw sequence file is broken into sections and compressed. The Advanced Encryption Standard (AES) algorithm is used for encryption, and its Galois/Counter Mode (GCM) is used to decrypt the encrypted data. This approach reduces the amount of storage space used while preserving the data, and it calls for a modern data compression strategy, based on a k-th order Markov chain, that not only reduces storage but also improves processing efficiency. So far, no effort has addressed this problem jointly from both the hardware and software realms; in this analysis, we argue for a tailor-made hardware and software ecosystem that takes full advantage of current stand-alone solutions. The paper discusses sequenced DNA, which may take the form of raw data obtained from sequencing. Inappropriate use of genomic data presents unique risks because it can be used to identify any individual; thus, the study focuses on the security provisioning and compression of diverse genomic data using the AES algorithm.
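
As a rough, non-authoritative sketch of the chunked AES-GCM scheme described in the abstract, the short Python example below encrypts a FASTA/FASTQ file section by section and decrypts it back. The 64 MiB section size, the key handling, the length-prefixed framing, and the file names are illustrative assumptions, and the widely used cryptography package stands in for whatever implementation the authors employed.

    # Sketch: encrypt a genomic file (FASTA/FASTQ) in sections with AES-GCM.
    # Section size, framing, and file paths are illustrative assumptions.
    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    CHUNK_SIZE = 64 * 1024 * 1024  # 64 MiB sections (the paper does not specify a size)

    def encrypt_file(in_path, out_path, key):
        aesgcm = AESGCM(key)
        with open(in_path, "rb") as fin, open(out_path, "wb") as fout:
            while True:
                chunk = fin.read(CHUNK_SIZE)
                if not chunk:
                    break
                nonce = os.urandom(12)                   # unique 96-bit nonce per section
                ct = aesgcm.encrypt(nonce, chunk, None)  # ciphertext plus 16-byte GCM tag
                fout.write(len(ct).to_bytes(8, "big"))   # length prefix for framing
                fout.write(nonce + ct)

    def decrypt_file(in_path, out_path, key):
        aesgcm = AESGCM(key)
        with open(in_path, "rb") as fin, open(out_path, "wb") as fout:
            while True:
                header = fin.read(8)
                if not header:
                    break
                length = int.from_bytes(header, "big")
                nonce = fin.read(12)
                ct = fin.read(length)
                fout.write(aesgcm.decrypt(nonce, ct, None))  # raises InvalidTag if tampered

    key = AESGCM.generate_key(bit_length=256)
    encrypt_file("reads.fastq", "reads.fastq.enc", key)
    decrypt_file("reads.fastq.enc", "reads_restored.fastq", key)

A fresh nonce per section keeps GCM secure across sections, and the authentication tag lets the decryptor detect any modification of the preserved data.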


Information, 2021, Vol 12 (5), pp. 181
Author(s): Christos Karagiannis, Kostas Vergidis

Fighting crime in cyberspace requires law enforcement authorities to immerse themselves in a digital ocean of vast amounts of information and to acquire and objectify evidence of criminal activity. Handling digital evidence is a complex and multifaceted process, as such evidence can provide critical evidentiary information in an unquestionable and irrefutable way. When digital evidence resides in a cloud storage environment, the criminal investigation faces unprecedented contemporary legal challenges. In this paper, the authors identify three main legal challenges that arise from the current cloud-based technological landscape: territoriality (the loss of location), possession (the ownership of cloud content), and the confiscation procedure (user authentication and data preservation issues). In light of the identified challenges, the existing American, European, and international legal frameworks are thoroughly evaluated. Finally, the authors discuss and endorse the Power of Disposal, a newly formed legal notion and a multidisciplinary solution with global effect resulting from collaboration between technical, organizational, and legal perspectives, as an effective first step towards mitigating the identified legal challenges.


Complexity, 2021, Vol 2021, pp. 1-12
Author(s): Gongzheng Liu, Jingsha He, Xinggang Xuan

As digital forensics becomes more widespread, more attention is being paid to the originality and validity of data, and data preservation technology has emerged to meet this need. However, current data preservation models and technologies merely combine cryptographic techniques and remain at risk of being attacked and cracked. Moreover, the preservation process still requires human participation, which may lead to data tampering. To address these problems, this paper presents a data preservation model based on blockchain and multidimensional hashing. With the decentralization and smart-contract characteristics of blockchain, data can be preserved automatically, without human participation, to form a branch chain of custody per case, and blockchain offers good resistance to attacks, including the so-called 51% attack. Meanwhile, to keep data from becoming confused and hard to query as the number of cases grows, hashes, cryptography, and timestamps are used to form a serialized main chain of custody. Because hash collisions are possible and judicial trials must absolutely guarantee the authenticity and validity of data, a multidimensional hash is used in place of a regular hash. In this way, data preservation becomes an automatic process free of human intervention. Experiments have been carried out to show the security and effectiveness of the proposed model.
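
As a loose illustration of the serialized main chain of custody described in the abstract, the Python sketch below links timestamped case records through two independent digests, a stand-in for the paper's multidimensional hash. The record fields, the pairing of SHA-256 with SHA3-256, and the class name are assumptions made for illustration rather than the authors' specification.

    # Sketch: a serialized chain of custody linking records by timestamp and by
    # two independent hashes (an illustrative stand-in for a multidimensional hash).
    import hashlib, json, time

    def digests(data: bytes) -> dict:
        # Two independent hash functions; forging a record would require a
        # simultaneous collision in both.
        return {"sha256": hashlib.sha256(data).hexdigest(),
                "sha3_256": hashlib.sha3_256(data).hexdigest()}

    class CustodyChain:
        def __init__(self):
            self.records = []

        def append(self, case_id: str, evidence: bytes) -> dict:
            prev = self.records[-1] if self.records else None
            record = {
                "case_id": case_id,
                "timestamp": time.time(),
                "evidence": digests(evidence),
                "prev": digests(json.dumps(prev, sort_keys=True).encode()) if prev else None,
            }
            self.records.append(record)
            return record

        def verify(self) -> bool:
            # Recompute every link; tampering with any record breaks at least one digest.
            for i in range(1, len(self.records)):
                expected = digests(json.dumps(self.records[i - 1], sort_keys=True).encode())
                if self.records[i]["prev"] != expected:
                    return False
            return True

    chain = CustodyChain()
    chain.append("case-001", b"disk image of seized laptop")
    chain.append("case-001", b"memory dump")
    assert chain.verify()

In the model described in the abstract, this linking would be carried out by blockchain smart contracts rather than a local object, which is what removes human participation from the preservation step.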

