A Survey on Smart Toll Collection Management System with Security

Author(s):  
Veena S
Author(s):  
Brecht Declercq ◽  
Loes Nijsmans

Both traditional and more recent audiovisual carriers degrade. Even CD-ROMs typically have an expected life span of only ten years. In addition, playback equipment for both analogue and digital carriers will ultimately grow scarcer and more expensive to repair or replace. Archives and museums are therefore inevitably faced with the decision of whether to preserve audiovisual carriers after their content has been digitized. This paper offers a draft decision-making framework developed by the Flemish Institute for Archiving (VIAA). Assuming that an institution already has a digital collection management system in place, the proposed framework addresses the concepts of favourability, possibility, value, preservation conditions and the risk to other carriers through a series of questions. The paper also addresses the disposal of carriers, should an organization decide that disposal is in the best interests of its collections.
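The abstract only names the framework's concepts, so the sketch below is a loose illustration of how such a question-based retention flow might be encoded; the question order, field names, and outcomes are assumptions for illustration, not VIAA's actual framework.

```python
from dataclasses import dataclass

@dataclass
class Carrier:
    """A digitized audiovisual carrier under consideration for retention."""
    digitization_verified: bool   # has the digital copy been checked for completeness?
    retention_feasible: bool      # space, climate control and budget available?
    has_artefactual_value: bool   # value as an object beyond its content
    endangers_others: bool        # e.g. vinegar syndrome, mould, off-gassing

def retention_decision(c: Carrier) -> str:
    """Walk a carrier through an illustrative question sequence; the paper's
    actual questions cover favourability, possibility, value, preservation
    conditions and risk."""
    if not c.digitization_verified:
        return "retain: content not yet safely digitized"
    if c.endangers_others:
        return "dispose or isolate: carrier threatens the rest of the collection"
    if c.has_artefactual_value and c.retention_feasible:
        return "retain: carrier has value as an object"
    if not c.retention_feasible:
        return "consider disposal: retention conditions cannot be met"
    return "retain by default"
```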


2018 ◽  
Vol 2 ◽  
pp. e26479
Author(s):  
Sharon Grant ◽  
Janeen Jones ◽  
Kate Webbink ◽  
Rob Zschernitz

On the 9th of April 2010, the Field Museum received a momentous email from the ORNIS (ORnithology Network Information System) team informing them that they could now access the products of a nationwide georeferencing project; its bird collection could be, quite literally, put on the map. On the 7th of August 2017, those data (along with the sister datasets from FISHNet (FISH NETwork) and MaNIS (Mammal Network Information System)) finally made their way into the Museum's collection management system. It's easy to get data out, so why is it so hard to get it back? To make it easier, what do we need to do in terms of coordination, staffing, and/or technological resources? How can tools like data quality flags better accommodate the needs of data providers as well as data users elsewhere along the collections data pipeline? We present a real-life case study of repatriating an enhanced dataset to its institution of origin, including details on timelines, estimates of effort, and lessons learned. The best-laid repatriation protocols might not prepare us for everything, but following them more closely might save us some sanity.
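Mechanically, "getting data back" is mostly record matching and field-level merging. In the sketch below, the matching key, column names, and provenance note are assumptions rather than the Field Museum's actual schema; it shows one way repatriated georeferences might be folded into local records while flagging non-matches for manual review.

```python
import csv

# Hypothetical column names; real ORNIS exports and CMS schemas will differ.
MATCH_KEY = "catalog_number"
ENHANCED_FIELDS = ["decimal_latitude", "decimal_longitude", "georeference_remarks"]

def load_rows(path: str) -> dict[str, dict]:
    """Index a CSV export by the matching key."""
    with open(path, newline="", encoding="utf-8") as f:
        return {row[MATCH_KEY]: row for row in csv.DictReader(f)}

def merge_repatriated(local_path: str, enhanced_path: str) -> list[dict]:
    """Copy enhanced georeference fields onto matching local records,
    keeping a provenance note so the change can be audited later."""
    local = load_rows(local_path)
    enhanced = load_rows(enhanced_path)
    merged, unmatched = [], []
    for key, row in enhanced.items():
        if key not in local:
            unmatched.append(key)   # queue for manual review rather than guessing
            continue
        record = local[key]
        for field in ENHANCED_FIELDS:
            if row.get(field):
                record[field] = row[field]
        record["georeference_source"] = "ORNIS repatriation"
        merged.append(record)
    print(f"merged {len(merged)} records, {len(unmatched)} unmatched")
    return merged
```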


2018 ◽  
Vol 2 ◽  
pp. e26083
Author(s):  
Teresa Mayfield

At an institution without a permanent collections manager or curators, who has time to publish data or to research issues with those data? Collections with little or no institutional support often benefit from passionate volunteers who continually seek ways to keep them relevant. The University of Texas at El Paso Biodiversity Collections (UTEP-BC) has been cared for in this manner by a small group of dedicated faculty and emeritus curators who have managed, with no budget, to care for the specimens, perform and publish research about them, and publish a good portion of the collections data. An Institute of Museum and Library Services (IMLS) grant allowed these dedicated volunteers to hire a Collection Manager to migrate the already published data from the collections and add unpublished specimen records from the in-house FileMaker Pro database to a new collection management system (Arctos) that would allow for better records management and easier publication. Arctos is a publicly searchable web-based system, but most collections also see the benefit of participating in biodiversity data aggregators such as the Global Biodiversity Information Facility (GBIF), iDigBio, and a multitude of discipline-specific aggregators. Publishing biodiversity data to aggregators is loaded with hidden pathways, acronyms, and tech-speak with which a curator, registrar, or collection manager may not be familiar. After navigating the publication process, the reward is feedback! Now the data can be improved, and everyone wins, right? In the case of the UTEP-BC data, the feedback sits idle because the requirements of the grant under which the Collection Manager was hired take precedence; it will likely remain buried until long after the grant has run its course. Fortunately, the selection of Arctos as a collection management system allowed the UTEP-BC Collection Manager to confer with others publishing biodiversity data to the data aggregators. Members of the Arctos Community have carried on multiple conversations about publishing to aggregators and how to handle the resulting data quality flags. These conversations provide a synthesis of the challenges experienced by collections at over 20 institutions when publishing biodiversity data to aggregators and responding (or not) to their data quality flags. This presentation will cover the experiences and concerns of one Collection Manager, as well as those of the Arctos Community, related to publishing data to aggregators, deciphering their data quality flags, and developing appropriate responses to those flags.
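Aggregator feedback usually arrives as per-record issue flags; GBIF, for example, attaches a semicolon-separated `issue` column to its tab-delimited occurrence downloads. The sketch below shows how a lone collection manager might triage that feedback; the actionable/ignorable split is a local curatorial judgment call assumed here for illustration, not anything GBIF prescribes.

```python
import csv
from collections import Counter

# Flags the curator has decided are actionable vs. safely ignorable.
# This split is illustrative; each collection will make its own call.
ACTIONABLE = {"COUNTRY_COORDINATE_MISMATCH", "TAXON_MATCH_NONE", "RECORDED_DATE_INVALID"}
IGNORABLE = {"GEODETIC_DATUM_ASSUMED_WGS84", "COORDINATE_ROUNDED"}

def triage(download_path: str) -> None:
    """Tally issue flags across an occurrence download and list the
    records whose flags actually need curatorial attention."""
    tally = Counter()
    needs_review = []
    with open(download_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f, delimiter="\t"):
            issues = [i for i in row.get("issue", "").split(";") if i]
            tally.update(issues)
            if any(i in ACTIONABLE for i in issues):
                needs_review.append(row.get("occurrenceID", "?"))
    for issue, count in tally.most_common():
        marker = "!" if issue in ACTIONABLE else " "
        print(f"{marker} {issue}: {count}")
    print(f"{len(needs_review)} records queued for review")
```

Triaging by flag type first, rather than record by record, lets scarce curatorial time go to the handful of issue classes that actually indicate errors in the source data.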


Author(s):  
Anniina Kuusijärvi ◽  
Ville-Matti Riihikoski ◽  
Samuli Lehtonen ◽  
Gunilla Ståhls ◽  
Marko Hyvärinen ◽  
...  

The Nagoya Protocol (NP) of the Convention on Biological Diversity requires that genetic resource holders and users obtain, preserve and keep relevant documentation. Users and third parties need to be informed of the terms of access, of what utilisation is allowed, and of which benefits need to be shared when the respective genetic resources or associated traditional knowledge are utilised within the meaning of the NP. Following the recommendations in the Code of Conduct & Best Practices of the Consortium of European Taxonomic Facilities (CETAF Legislations and Regulations Liaison Group 2019), institutions should implement appropriate data management systems to support compliance with the protocol and keep records on the acquisition of biological material, the utilisation of genetic resources, transfers to third parties, benefits derived and shared, and the deaccessioning of specimens or disposal of consumed samples.

Here we describe how we have implemented the first set of tools to meet the NP requirements in the Kotka Collection Management System (CMS), which is used by eleven natural history museums in Finland. The Kotka CMS is used for storing and managing specimen data and for handling material transactions (loans, exchanges, donations and consumptive loans). Users can enter and store all necessary documentation for both incoming and outgoing material as material transactions, which hold information on, for example, the transaction type, a description of the material, important dates, and the corresponding organization and contact person. Specimens are linked to transactions by their unique identifiers, and each transaction also has a unique stable identifier.

The first version of the tools for meeting the requirements of the Nagoya Protocol on both in situ and ex situ accession of genetic resources has been integrated into the transaction section of the system. So that genetic resource users can enter, save and provide all the required information about an incoming genetic resource, we have implemented a set of fields to be completed in the transactions in Kotka CMS (Fig. 1). Users can record, for example, an IRCC (Internationally Recognized Certificate of Compliance) number where one exists, the acquisition date and provider country, a description of the material, and information on Prior Informed Consent, Mutually Agreed Terms, the Material Transfer Agreement and other possible permits. Finnish genetic resource legislation requires that any imported genetic resources be notified to the Competent National Authority (CNA; the Finnish Environment Institute and the Natural Resources Institute Finland) within one month of acquisition. The data required for the notification are compiled in Kotka CMS and then sent to the CNA.

All the documentation and conditions regulating the utilisation of each specimen and its derived samples must travel with the specimen data at all times. To accomplish this, all the necessary information and documents are linked from the material transactions to the relevant specimens by unique specimen or sample identifiers. On the specimen view page, links to the full transaction details and history are given, as a single specimen or a derived sample can be part of several different types of transactions. Users also see a summary of the transaction information directly in the specimen view, most importantly whether the specimen is available for genetic research or has any restrictions on use.
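The key design point is that the Nagoya documentation hangs off the transaction, and specimens are linked to transactions by stable identifiers. A minimal sketch of that data model follows; the field and function names are illustrative assumptions, not Kotka's actual schema, and the availability rule at the end is deliberately oversimplified.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class MaterialTransaction:
    """An incoming or outgoing transfer carrying Nagoya Protocol documentation."""
    transaction_id: str                 # unique stable identifier
    transaction_type: str               # e.g. "loan", "exchange", "donation"
    provider_country: str
    acquisition_date: date
    ircc_number: str | None = None      # IRCC, if one was issued
    pic_obtained: bool = False          # Prior Informed Consent
    mat_reference: str | None = None    # Mutually Agreed Terms document
    specimen_ids: list[str] = field(default_factory=list)

def transactions_for_specimen(specimen_id: str,
                              transactions: list[MaterialTransaction]):
    """A specimen can take part in several transactions; resolve them all
    so its terms of use travel with the specimen data."""
    return [t for t in transactions if specimen_id in t.specimen_ids]

def available_for_genetic_research(specimen_id, transactions) -> bool:
    """Illustrative rule: usable only if every linked transaction documents
    consent. Real restrictions are considerably more nuanced."""
    linked = transactions_for_specimen(specimen_id, transactions)
    return bool(linked) and all(t.pic_obtained for t in linked)
```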
The Kotka CMS transaction section makes use of the Application Programming Interface (API) provided by the Access and Benefit-Sharing Clearing-House (ABS-CH). Using the API, Kotka CMS validates the IRCC number, if one is given, and provides links into the ABS-CH, for example to the relevant country profile page, the contact details of the CNA, and the specific requirements for access to genetic resources where applicable. In this way, we provide Kotka CMS users with up-to-date information from the original source to support their genetic resource management. We will further improve and develop the tools during 2019-2020. Now that the first version is in use, we will make adjustments according to user feedback. A few changes are already planned; for example, the tools for transferring the necessary information on permits and other details with outgoing specimens to a user at another institution abroad will be updated. All users in Finnish natural history institutions have access to all the information directly in Kotka CMS, as it is a national system. Additionally, the searchability of both specimen and transaction information will be refined.
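The abstract does not document the actual ABS-CH endpoints Kotka calls, so the URL path and response fields below are placeholders; only the general pattern, looking up the IRCC and then building links into the Clearing-House, follows the text.

```python
import requests

ABSCH_BASE = "https://absch.cbd.int"  # ABS Clearing-House; endpoint below is hypothetical

def validate_ircc(ircc_number: str) -> dict:
    """Ask the ABS-CH whether an IRCC number resolves to a real certificate,
    and collect links useful to the curator (country profile, CNA contact)."""
    # Hypothetical endpoint and response schema, for illustration only.
    resp = requests.get(f"{ABSCH_BASE}/api/v2021/documents/{ircc_number}", timeout=10)
    if resp.status_code == 404:
        return {"valid": False}
    resp.raise_for_status()
    doc = resp.json()
    country = doc.get("providerCountry", "")
    return {
        "valid": True,
        "country_profile": f"{ABSCH_BASE}/countries/{country}",
        "cna_contact": doc.get("competentNationalAuthority"),
    }
```

Validating against the Clearing-House at data-entry time, rather than storing a free-text IRCC number, is what keeps the certificate details in the CMS consistent with the original source.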

