COUNTER

Until recently, COUNTER reports were used predominantly by content providers and consumers of electronic journals and databases. One of the most significant developments in COUNTER Release 4 is the integration of book reports into the latest COUNTER Code of Practice. Release 4 makes it possible for academic libraries to assess e-book usage in a consistent, credible, and comparable manner. However, variant practices among e-book vendors in implementing the COUNTER standards for book usage reporting make it challenging for librarians to interpret vendor COUNTER reports correctly. It is therefore crucial for librarians to consult the Code of Practice and the COUNTER implementation guidelines in order to better understand individual vendors' COUNTER reports. Chapter 2 discusses each COUNTER standard report for e-book usage data, pointing out potential issues in how e-book vendors have implemented them.
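As a minimal sketch of the kind of comparison Release 4 enables (the column layout and titles below are a simplified, hypothetical stand-in for a vendor's Book Report export, not the exact COUNTER layout), monthly title-request counts from a CSV export might be totalled like this:

```python
import csv
import io

# Hypothetical, simplified layout loosely modeled on a COUNTER Release 4
# Book Report: one row per title, one column per month of usage.
sample_report = """Title,Jan-2015,Feb-2015,Mar-2015
Digital Libraries,12,7,9
Metadata Basics,3,0,5
"""

def total_requests_by_title(report_text: str) -> dict:
    """Sum the monthly usage columns for each title row."""
    totals = {}
    reader = csv.DictReader(io.StringIO(report_text))
    for row in reader:
        title = row.pop("Title")
        totals[title] = sum(int(value) for value in row.values())
    return totals

totals = total_requests_by_title(sample_report)
```

Because every vendor exporting a conformant report uses the same reporting period and counting rules, totals computed this way are comparable across platforms; the interpretation caveats above apply wherever a vendor deviates from the Code of Practice.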

2007 ◽  
Vol 25 (1) ◽  
pp. 65-79 ◽  
Author(s):  
Adina González Bonorino ◽  
Valeria E. Molteni

2021 ◽  
Author(s):  
Will Roy ◽  
Brian D. Cameron ◽  
Tim Ribaric

Introduction: “Usage metrics are an effective way for libraries to demonstrate the value of their institutional repositories, however, existing tools are not always reliable and can either undercount or overcount file downloads. As well, although statistics can sometimes be accessed through the various repository interfaces, without an agreed standard it is impossible to reliably assess and compare usage data across different IRs in any meaningful way.”[1] The Task Group for Standards for IR Usage Data has undertaken an information-gathering exercise to better understand both the existing practices of Canadian repositories and the emerging tools and processes available for repositories to track and monitor usage more effectively. This exercise directly links to the broader goals of the Open Repositories Working Group, which are to “strengthen and add value to the network of Canadian open access repositories by collaborating more closely and adopting a broader range of services.”[2] Our recommended course of action is for all Canadian IRs to collectively adopt OpenAIRE Statistics. This path aligns with the following recommendations, which our group also advances.

Recommendations: We suggest the following Mandatory (M) and Optional (O) recommendations:

R1 (M): All Canadian IRs should adopt the COUNTER Code of Practice.
R2 (M): All Canadian IRs should select a service that allows for interoperability with other web services via a fully open, or accessible, permissions-based API.
R3 (M): All Canadian IRs should use a statistics service that practices transparent communication and maintains a governance strategy.

In addition, we strongly urge that Canadian IRs consider the following advice for the future:

R4 (O): Make further investments into understanding and utilizing the common log format (CLF).
R5 (O): Conduct research into the privacy implications of collecting usage statistics via third-party services with commercial interests, and consider available alternatives.
R6 (O): Practice a healthy skepticism towards tools and solutions that promise “increased” usage statistics, and instead advocate for responsible collection assessment based on multiple aspects of use.
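R4 points to the common log format (CLF) as a shared, repository-independent basis for counting downloads. As a minimal sketch (the example line and field handling are illustrative, not drawn from any specific repository), a CLF entry can be split into its named fields with a regular expression:

```python
import re

# Common Log Format (CLF): host ident authuser [timestamp] "request" status bytes
CLF_PATTERN = re.compile(
    r'(?P<host>\S+) (?P<ident>\S+) (?P<authuser>\S+) '
    r'\[(?P<timestamp>[^\]]+)\] "(?P<request>[^"]*)" '
    r'(?P<status>\d{3}) (?P<bytes>\d+|-)'
)

def parse_clf_line(line: str):
    """Parse one CLF line into a dict of named fields, or None if malformed."""
    match = CLF_PATTERN.match(line)
    return match.groupdict() if match else None

# Illustrative log line for a repository file download.
line = ('127.0.0.1 - frank [10/Oct/2000:13:55:36 -0700] '
        '"GET /bitstream/1234/paper.pdf HTTP/1.0" 200 2326')
record = parse_clf_line(line)
```

Counting downloads directly from logs like this is what allows different IR platforms to be compared against a common baseline, rather than trusting each platform's built-in (and potentially over- or under-counting) statistics module.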


2015 ◽  
Vol 76 (4) ◽  
pp. 427-449 ◽  
Author(s):  
Alan Rubel ◽  
Mei Zhang

This is a study of the treatment of library patron privacy in licenses for electronic journals in academic libraries. We begin by distinguishing four facets of privacy and intellectual freedom based on the LIS and philosophical literature. Next, we perform a content analysis of 42 license agreements for electronic journals, focusing on terms for enforcing authorized use and collection and sharing of user data. We compare our findings to model licenses, to recommendations proposed in a recent treatise on licenses, and to our account of the four facets of intellectual freedom. We find important conflicts with each.


1999 ◽  
Vol 60 (5) ◽  
pp. 464-476 ◽  
Author(s):  
Karen G. Lawson ◽  
Nancy L. Pelzer

Little is known about how technology-based projects (computer software, articles in electronic journals, Internet-based materials, videotapes, and audiotapes) are reviewed for promotion and/or tenure purposes in academic libraries. Reviewers might evaluate projects with traditional criteria or attempt to revise criteria to accommodate computer-related work. To address this issue in more detail, the authors conducted a study to assess how technology-based projects are evaluated in the promotion and/or tenure process for academic librarians at Association of Research Libraries (ARL) institutions. Survey results show that, while projects, particularly World Wide Web–based materials, are being evaluated in some ARL academic libraries, little has been developed as a core set of measures or assessments for promotion and/or tenure decisions.

