Handbook for Scientometrics: Science and Technology Development Indicators

Author(s):
Mark Akoev,
Valentina Markusova,
Olga Moskaleva,
Vladimir Pislyakov,
...

The second edition of the Russian Scientometric Handbook provides an overview of the field of scientometrics. The Handbook describes the history of the breakthrough concept of citation indexing created by Dr. Eugene Garfield and the development of the first multidisciplinary scholarly citation index, the Science Citation Index. The application of scientometric tools and methods in research management and resource allocation is discussed. The authors survey various scientometric indicators relevant to individual researchers, journals, research institutions, and whole countries. They explore new types of indicators, such as altmetrics; the relationship between scientometric indicators and the nature of scientific communication; and various methods of visualizing scientometric information. The possibilities and limitations of various scientometric techniques are examined, and the authors highlight the need for an informed and reasonable approach to the use of quantitative indicators in research assessment. The Handbook includes the first Russian translations of three articles by Dr. Eugene Garfield. It is intended for researchers, science analysts, administrators of universities and research institutions, library and information center staff, graduate students, and general readers interested in scientometrics and research evaluation.

Author(s):
Orlando Gregorio-Chaviano,
Rafael Repiso,
Antonio Calderón-Rehecho,
Joaquín León-Marín,
Evaristo Jiménez-Contreras

Within the current panorama of science evaluation, the limitations of citation indexes for studying the social sciences and humanities have been the subject of wide debate. To address this situation, different products have been created for use in national contexts, since they cover certain aspects not contained in more international indices. An example is the In-RECS family, which defines an indicator similar to Eugene Garfield's impact factor but whose contribution lies in the ability to evaluate research in Spain by obtaining citation indicators. This paper thus highlights the need to create new products for research evaluation in general, and in the social sciences and humanities in particular. The context in which different alternatives for evaluating existing journals arise and are developed is presented, along with Dialnet Metrics, a citation index developed by the Dialnet Foundation in collaboration with the EC3 Group and dozens of Spanish universities. Based on an analysis of the cited references of source journals from different subject areas, Dialnet Metrics provides indicators to evaluate research impact at different levels. This bibliometric product enables contextualized analysis at the micro (researchers), meso (journals), and macro (areas and universities) levels. Finally, the content, data volumes, and structure of this citation index are described quantitatively.
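The abstract above refers to Eugene Garfield's impact factor, the indicator the In-RECS family adapts. Its standard two-year definition is simple arithmetic; the sketch below illustrates it with made-up numbers (the function name and the example figures are illustrative, not drawn from any of the indexes discussed):

```python
def impact_factor(citations_received: int, citable_items: int) -> float:
    """Garfield's two-year journal impact factor for year Y:
    citations received in Y to items published in Y-1 and Y-2,
    divided by the number of citable items published in Y-1 and Y-2."""
    return citations_received / citable_items

# Hypothetical journal: 150 citations in 2023 to its 2021-2022 articles,
# of which 100 were citable items.
print(impact_factor(150, 100))  # 1.5
```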


2017
Vol 16 (6)
pp. 820-842
Author(s):  
Marcelo Marques ◽  
Justin JW Powell ◽  
Mike Zapp ◽  
Gert Biesta

Research evaluation systems in many countries aim to improve the quality of higher education. Among the first such systems, the UK's Research Assessment Exercise (RAE), dating from 1986, is now the Research Excellence Framework (REF). Highly institutionalised, it has made research more accountable. While numerous studies describe the system's effects at different levels, this longitudinal analysis examines the gradual institutionalisation and (un)intended consequences of the system from 1986 to 2014. First, we analyse historically the RAE/REF's rationale, formalisation, standardisation, and transparency, framing it as a strong research evaluation system. Second, we locate the multidisciplinary field of education, analysing the submission behaviour (staff, outputs, funding) of departments of education over time. We find decreases in the number of academic staff whose research was submitted for peer review assessment; the research article as the preferred publication format; the rise of quantitative analysis; and a high and stable concentration of funding among a small number of departments. Policy instruments invoke varied responses, with such reactivity demonstrated by (1) increasing selectivity in the number of staff whose publications were submitted for peer review, a form of reverse engineering, and (2) the rise of the research article as the preferred output, a self-fulfilling prophecy. The funding concentration demonstrates a largely intended consequence that exacerbates disparities between departments of education. These findings emphasise how research assessment impacts the structural organisation and cognitive development of educational research in the UK.


Author(s):  
Valery V. Arutyunov

The article notes the increasing importance of ensuring information security in data processing, storage, search, and transmission in information systems and telecommunications networks in the 21st century. Current research areas in the field of information security are considered, such as biometric methods and information protection tools, blockchain technology, cryptography (including quantum cryptography), intrusion detection systems (IDS), steganographic information protection methods, data leakage prevention (DLP) systems, cyberbullying, obfuscation methods, and information security management. The author analyzes the dynamics of scientometric indicators of scientific activity (publications, citations, and the Hirsch index) in 2010-2019 for the research areas concerned. Several trends in those indicators were revealed during the analyzed period, including explosive growth of publications in the following areas: blockchain technology, information security management, biometric methods and information protection tools, quantum cryptography, and obfuscation methods. After four years of stable demand for research results in the field of information security management, that indicator has been falling since 2015, which was possibly caused by the 2014 crisis. Growth of the citation index until the end of the period was noted only for two areas: blockchain technology and cyberbullying. Among the results achieved by Russian scientists in 2010-2019, the maximum demand was identified in the area of cyberbullying and the minimum in the field of DLP systems. The analysis of annual scientometric indicators was carried out using the RSCI (Russian Science Citation Index) database.
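The Hirsch index mentioned above is a per-researcher indicator with a precise definition: the largest h such that the researcher has at least h papers cited at least h times each. A minimal sketch of the computation (the function name and example citation counts are illustrative, not from the study):

```python
def h_index(citations: list[int]) -> int:
    """Hirsch index: the largest h such that at least h papers
    have at least h citations each."""
    ranked = sorted(citations, reverse=True)  # most-cited first
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank  # the top `rank` papers all have >= rank citations
        else:
            break
    return h

# A researcher whose papers were cited 10, 8, 5, 4, and 3 times:
# the top 4 papers each have >= 4 citations, so h = 4.
print(h_index([10, 8, 5, 4, 3]))  # 4
```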


To better understand all the aspects and components of scientific research activities, this book explores and discusses the research indicators applied for research evaluation and their categorization. The present chapter provides a broad overview of what this book comprises and its main assumptions.


2016
Author(s):  
Aleksey V Belikov

The ever-growing need to evaluate researchers without actually reading their work has fertilized the soil for the appearance of hundreds of scientometric indicators. The reason why they continue to emerge is simple: no perfect one has been found yet. The major problem is that any indicator which starts to dominate the evaluation practices causes the evaluated to adjust their behavior accordingly, leading to depreciation of the indicator and diverse scientific malpractice, such as excessive self-citation, self-plagiarism, salami-slicing of publications, guest authorships, and so on. Hence, an indicator is needed that cannot be affected by the evaluated researchers at will, but still captures scientific excellence. Such an indicator is proposed here.


Author(s):  
Ken Peach

This chapter focuses on the review process, the process of writing a proposal and the evaluation of science. The usual way that science is funded these days is through a proposal to a funding agency; if it satisfies peer review and there are sufficient resources available, it is then funded. Peer review is at the heart of academic life, and is used to assess research proposals, progress, publications and institutions. Peer review processes are discussed and, in light of this discussion, the art of proposal writing. The particular features of making fellowship proposals and preparing for an institutional review are described. In addition, several of the methods used for evaluating and ranking research and research institutions are reviewed, including the Research Assessment Exercise and the Research Excellence Framework.


An unbiased and reasonable research evaluation should reflect the diversity and impact of research productivity. The evaluation of scientific research is essential to determine the achievement, reputation, growth, and progress of an individual or an institution. In this context, the production and quality of scholarly content offer a strong foundation for rational evaluation. Citations, along with the number of scientific publications, are predominantly used to evaluate research content. Scientometric indicators are of great use in measuring and evaluating scientific research output, but at the same time they require great care in use.

