Retractions from altmetric and bibliometric perspectives

2019 ◽  
Vol 70 (2-3) ◽  
pp. 98-110 ◽  
Author(s):  
Hadas Shema ◽  
Oliver Hahn ◽  
Athanasios Mazarakis ◽  
Isabella Peters

Abstract In the battle for better science, the research community must at times remove works from the publication record, a process known as retraction. Publications also accumulate an altmetric attention score, which complements citation-based metrics. We used citations, the Journal Impact Factor, the time between publication and retraction, and the reasons behind retraction to find determinants of retracted papers' altmetric attention scores. To find these determinants we compared two samples: one of retractions with top altmetric attention scores and one of retractions whose altmetric attention scores were chosen at random. We used a binary choice model to estimate the probability of a paper being retracted due to misconduct rather than error. The model shows positive effects of the altmetric score and of the time between publication and retraction on the probability of retraction due to misconduct in the top sample. We conclude that, within the top sample, retraction due to misconduct is associated with higher altmetric attention scores.
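
As an illustration of the kind of binary choice model described above, the following minimal sketch fits a logistic regression of retraction-due-to-misconduct on the altmetric attention score, time to retraction, Journal Impact Factor, and citations. The file name and column names are hypothetical placeholders, not the authors' actual data or code.

import pandas as pd
import statsmodels.api as sm

# Hypothetical paper-level input; columns are illustrative stand-ins for the
# variables named in the abstract.
df = pd.read_csv("retractions.csv")

# 1 = retracted due to misconduct, 0 = retracted due to error (assumed coding).
y = df["misconduct"]
X = sm.add_constant(df[["altmetric_score", "months_to_retraction",
                        "impact_factor", "citations"]])

model = sm.Logit(y, X).fit()
print(model.summary())
# Positive coefficients on altmetric_score and months_to_retraction would mirror
# the effects reported for the top-attention sample.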

eLife ◽  
2013 ◽  
Vol 2 ◽  
Author(s):  
Randy Schekman ◽  
Mark Patterson

It is time for the research community to rethink how the outputs of scientific research are evaluated and, as the San Francisco Declaration on Research Assessment makes clear, this should involve replacing the journal impact factor with a broad range of more meaningful approaches.


2020 ◽  
Vol 15 (4) ◽  
pp. 315-322
Author(s):  
Ekaterina Batalova ◽  
Kirill Furmanov ◽  
Ekaterina Shelkova

We consider a panel model with a binary response variable that is the product of two unobservable factors, each determined by a separate binary choice equation. One of these factors is assumed to be time-invariant and may be interpreted as a latent class indicator. A simulation study shows that maximum likelihood estimates from even the shortest panel are much more reliable than those obtained from a cross-section. As an illustrative example, the model is applied to Russian Longitudinal Monitoring Survey data to estimate the proportion of the non-employed population that is engaged in job search.
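
One way to write down such a two-factor structure, as a sketch only and assuming probit link functions and independent errors (the paper's exact specification may differ), is

\[
y_{it} = d_i \, s_{it}, \qquad
d_i = \mathbf{1}\{z_i'\gamma + u_i > 0\}, \qquad
s_{it} = \mathbf{1}\{x_{it}'\beta + \varepsilon_{it} > 0\},
\]

where $d_i$ is the time-invariant latent class indicator. With standard normal errors, the likelihood contribution of individual $i$ observed over $t = 1, \dots, T$ is

\[
L_i = \Phi(z_i'\gamma)\prod_{t=1}^{T}\Phi(x_{it}'\beta)^{y_{it}}\bigl(1-\Phi(x_{it}'\beta)\bigr)^{1-y_{it}}
+ \bigl(1-\Phi(z_i'\gamma)\bigr)\,\mathbf{1}\{y_{it}=0 \text{ for all } t\},
\]

since $d_i = 0$ forces $y_{it} = 0$ in every period, which is what makes even a short panel far more informative than a single cross-section.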


2021 ◽  
pp. 1-22
Author(s):  
Metin Orbay ◽  
Orhan Karamustafaoğlu ◽  
Ruben Miranda

This study analyzes the journal impact factor and related bibliometric indicators in the Education and Educational Research (E&ER) category, highlighting the main differences among journal quartiles, using Web of Science (Social Sciences Citation Index, SSCI) as the data source. High-impact journals (Q1) publish only slightly more papers than expected, which differs from other areas. Papers published in Q1 journals receive more citations on average and have lower uncitedness rates than those in other quartiles, although the differences among quartiles are smaller than in other areas. The impact factor is only weakly negatively correlated with journal self-citation (r = -0.184) but strongly correlated with the citedness of the median journal paper (r = 0.864). Despite this strong correlation, the impact factor is still far from being a perfect indicator of a paper's expected citations because of the high skewness of citation distributions. This skewness was moderately correlated with the citations received by the most cited paper of the journal (r = 0.649) and with the number of papers published by the journal (r = 0.484), but no important differences among journal quartiles were observed. In the period 2013–2018, the average journal impact factor in E&ER increased considerably, from 0.908 to 1.638, which is explained by the field's growth but also by increases in international collaboration and in the share of papers published in open access. Despite their inherent limitations, impact factors and related indicators are a starting point for introducing bibliometric tools into the objective and consistent assessment of researchers.
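
For readers who want to reproduce this kind of journal-level analysis, the following sketch computes the reported correlations and a citation-skewness value from tabular data; the file names and column names are hypothetical, and the study's actual SSCI data handling may differ.

import pandas as pd
from scipy.stats import pearsonr, skew

# Hypothetical journal-level table for the E&ER category.
journals = pd.read_csv("eer_journals.csv")

r_self, _ = pearsonr(journals["impact_factor"], journals["self_citation_rate"])
r_median, _ = pearsonr(journals["impact_factor"], journals["median_paper_citations"])
print(f"JIF vs. journal self-citation: r = {r_self:.3f}")     # reported: about -0.184
print(f"JIF vs. median-paper citedness: r = {r_median:.3f}")  # reported: about 0.864

# Skewness of one journal's citation distribution; per-journal skewness values
# would then be correlated with the most-cited paper and the number of papers.
citations = pd.read_csv("journal_paper_citations.csv")["citations"]
print(f"Citation skewness: {skew(citations):.2f}")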


2020 ◽  
Vol 13 (3) ◽  
pp. 328-333
Author(s):  
Sven Kepes ◽  
George C. Banks ◽  
Sheila K. Keener
