All the Research That’s Fit to Print: Open Access and the News Media

2021 ◽  
pp. 1-35
Author(s):  
Teresa Schultz

Abstract The goal of the open access (OA) movement is to help everyone access scholarly research, not just those who can afford to. However, most studies looking at whether OA has met this goal have focused on whether other scholars are making use of OA research. Few have considered how the broader public, including the news media, uses OA research. This study sought to answer whether the news media mentions OA articles more or less than paywalled articles by looking at articles published from 2010 through 2018 in journals across all four quartiles of the Journal Impact Factor, using data obtained through Altmetric.com and the Web of Science. Gold, green and hybrid OA articles all had a positive correlation with the number of news mentions received. News mentions for OA articles did see a dip in 2018, although they remained higher than those for paywalled articles. Peer review: https://publons.com/publon/10.1162/qss_a_00139

2020 ◽  
Vol 49 (5) ◽  
pp. 35-58
Author(s):  
Matthias Templ

This article is motivated by the author's work as editor-in-chief of the Austrian Journal of Statistics and contains detailed analyses of that journal's impact. The impact of a journal is typically expressed by journal metrics indicators. One of the most important, the journal impact factor, is calculated from the Web of Science (WoS) database by Clarivate Analytics. Newly established journals, or journals not belonging to large publishers, often face difficulties being included in, for example, the Science Citation Index (SCI), and thus do not receive a WoS journal impact factor, as is the case for the Austrian Journal of Statistics. In this study, a novel approach is pursued: modeling and predicting the WoS impact factor of journals using open-access or partly open-access databases such as Google Scholar, ResearchGate, and Scopus. I hypothesize a functional linear dependency between citation counts in these databases and the journal impact factor. These functional relationships enable the development of a model that may allow estimating the impact factor for new, small, and independent journals not listed in the SCI. However, good results could only be achieved with robust linear regression and well-chosen models. In addition, this study demonstrates that the WoS impact factor of SCI-listed journals can be successfully estimated without using the Web of Science database, and therefore the dependency of researchers and institutions on this popular database can be reduced. These results suggest that the statistical model developed here can be applied to predict the WoS impact factor using alternative open-access databases.
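The robust linear regression the abstract relies on can be sketched as follows. This is a minimal illustration, not the paper's model: the data, function names, and Huber tuning constant (1.345) are assumptions, and the paper's actual predictors and fitting procedure may differ.

```python
import numpy as np

def huber_weights(r, c=1.345):
    """Huber weights: 1 for small standardized residuals, c/|r| for large ones."""
    a = np.abs(r)
    w = np.ones_like(a)
    big = a > c
    w[big] = c / a[big]
    return w

def robust_line_fit(x, y, iters=50):
    """Fit y = b0 + b1*x by iteratively reweighted least squares (Huber loss)."""
    X = np.column_stack([np.ones_like(x), x])
    beta = np.linalg.lstsq(X, y, rcond=None)[0]       # ordinary LS start
    for _ in range(iters):
        r = y - X @ beta
        s = max(np.median(np.abs(r - np.median(r))) / 0.6745, 1e-8)  # robust scale (MAD)
        w = huber_weights(r / s)
        beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
    return beta

# Hypothetical example: citation counts from an open database vs. known
# WoS impact factors, with one contaminated observation.
counts = np.array([100., 200., 300., 400., 500., 600., 700., 800.])
impact = 0.2 + 0.005 * counts
impact[-1] = 10.0                                     # outlier journal
b0, b1 = robust_line_fit(counts, impact)
print(b0, b1)                                         # close to 0.2 and 0.005
```

Once fitted, an impact factor for a journal absent from the SCI could be estimated by plugging its open-database citation count into `b0 + b1 * count`.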


2008 ◽  
Vol 9 (7) ◽  
pp. 582-590 ◽  
Author(s):  
Xiu-fang Wu ◽  
Qiang Fu ◽  
Ronald Rousseau

Author(s):  
Ran Na

Abstract Objectives: Both citations and Altmetrics are indicators of the influence of a publication and are potentially useful, but the extent to which professional-academic citations and media-dominated Altmetrics are consistent with each other is a topic worth investigating. The objective is to show their correlation. Methods: DOI and citation information for COVID-19 research was obtained from the Web of Science, and the corresponding Altmetric indicators were collected from Altmetric.com. The correlation between the immediacy of citations and the Altmetrics of COVID-19 research was studied using artificial neural networks. Results: The Pearson coefficients are 0.962, 0.254, 0.222, 0.239, 0.363, 0.218, 0.136, 0.134, and 0.505 (p<0.01) for the dimensions citation, attention score, journal impact factor, news, blogs, Twitter, Facebook, video, and Mendeley correlated with the SCI citation count, respectively. Citation counts from the Web of Science and those from Altmetric currently deviate substantially, and the Altmetric score is not precise enough to describe the immediacy of citations of academic publications in COVID-19 research. Conclusions: The effects of news, blogs, Twitter, Facebook, video, and Mendeley on SCI citations are similar to that of the journal impact factor. This paper is a pioneering study of the role of academic topics across Altmetric sources in the dissemination of scholarly publications.
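The Pearson coefficients reported above measure linear association between each Altmetric dimension and citation counts. A minimal sketch of the computation, on made-up per-paper counts (not the study's data):

```python
import math

def pearson(x, y):
    """Sample Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical per-paper counts for a handful of COVID-19 articles
wos_citations = [5, 12, 3, 40, 22, 8, 15, 60]
tweets        = [10, 41, 2, 70, 55, 18, 30, 95]
print(round(pearson(wos_citations, tweets), 3))
```

Each reported coefficient in the abstract is this quantity computed between one Altmetric dimension and the SCI citation count across the sampled papers.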


2020 ◽  
Vol 13 (5) ◽  
pp. 723-727
Author(s):  
Alberto Ortiz

Abstract The Clinical Kidney Journal (ckj) impact factor from Clarivate’s Web of Science for 2019 was 3.388. This consolidates ckj among journals in the top 25% (first quartile, Q1) in the Urology and Nephrology field according to the journal impact factor. The manuscripts contributing the most to the impact factor focused on chronic kidney disease (CKD) epidemiology and evaluation, CKD complications and their management, cost-efficiency of renal replacement therapy, pathogenesis of CKD, familial kidney disease and the environment–genetics interface, onconephrology, technology, SGLT2 inhibitors and outcome prediction. We provide here an overview of the hottest and most impactful topics for 2017–19.


2018 ◽  
Vol XVI (2) ◽  
pp. 369-388 ◽  
Author(s):  
Aleksandar Racz ◽  
Suzana Marković

Technology-driven changes, with a consequent increase in the online availability and accessibility of journals and papers, are rapidly changing patterns of academic communication and publishing. The dissemination of important research findings through the academic and scientific community begins with publication in peer-reviewed journals. The aim of this article is to identify, critically evaluate and integrate the findings of relevant, high-quality individual studies addressing trends in the enhancement of the visibility and accessibility of academic publishing in the digital era. The number of citations a paper receives is often used as a measure of its impact and, by extension, of its quality. Many aberrations of citation practice have been reported in attempts to increase the impact of one's papers through manipulation via self-citation, inter-citation and citation cartels. Authors' avenues to legally extend the visibility, awareness and accessibility of their research outputs, raising citations and amplifying measurable personal scientific impact, have been strongly enhanced by online communication tools: networking (LinkedIn, ResearchGate, Academia.edu, Google Scholar), sharing (Facebook, blogs, Twitter, Google Plus), media sharing (SlideShare), data sharing (Dryad Digital Repository, Mendeley, PubMed, PubChem), code sharing, impact tracking and publishing in open access journals. Many studies and review articles in the last decade have examined whether open access articles receive more citations than equivalent subscription (toll access) articles, and most conclude that open access articles likely have a citation advantage over generally equivalent pay-for-access articles in many, if not most, disciplines.
But it is still questionable whether those never-cited papers are indeed “worth(less) papers”, and whether the journal impact factor and the number of citations should be considered the only suitable indicators for evaluating the quality of scientists. The phrase “publish or perish”, usually used to describe the pressure in academia to rapidly and continually publish academic work to sustain or further one's career, can now, in the 21st century, be reformulated as “publish, be cited and maybe you will not perish”.


2018 ◽  
Author(s):  
LM Hall ◽  
AE Hendricks

Abstract Background: Recently, there has been increasing concern about the replicability, or lack thereof, of published research. An especially high rate of false discoveries has been reported in some areas, motivating the creation of resource-intensive collaborations that estimate the replication rate of published research by repeating a large number of studies. The substantial resources required by these replication projects limit the number of studies that can be repeated and, consequently, the generalizability of the findings. Methods and findings: In 2013, Jager and Leek developed a method to estimate the empirical false discovery rate from journal abstracts and applied it to five high-profile journals. Here, we use the relative efficiency of Jager and Leek's method to gather p-values from over 30,000 abstracts and to estimate the false discovery rate for 94 journals over a five-year span. We model the empirical false discovery rate by journal subject area (cancer or general medicine), impact factor, and Open Access status. We find that the empirical false discovery rate is higher for cancer than for general medicine journals (p = 5.14E-6). Within cancer journals, this relationship is further modified by journal impact factor: a lower journal impact factor is associated with a higher empirical false discovery rate (p = 0.012, 95% CI: -0.010, -0.001). We find no significant difference, on average, in the false discovery rate for Open Access vs. closed access journals (p = 0.256, 95% CI: -0.014, 0.051). Conclusions: We find evidence of a higher false discovery rate in cancer journals compared with general medicine journals, especially those with a lower journal impact factor. For cancer journals, a journal impact factor lower by one point is associated with a 0.006 increase in the empirical false discovery rate, on average. For a false discovery rate of 0.05, this would be more than a 10% increase, to 0.056.
Conversely, we find no significant evidence of a higher false discovery rate, on average, for Open Access vs. closed access journals from InCites. Our results identify areas of research that may need additional scrutiny and support to facilitate replicable science. Given our publicly available R code and data, others can complete a broad assessment of the empirical false discovery rate across other subject areas and characteristics of published research.
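The conclusion's arithmetic can be made explicit. A small sketch under the abstract's stated association (a one-point drop in journal impact factor adds roughly 0.006 to the empirical false discovery rate); the function name is illustrative, not from the paper:

```python
def shifted_efdr(baseline_efdr, jif_change, slope=-0.006):
    """Expected empirical FDR after a change in journal impact factor,
    assuming the linear association reported in the abstract
    (about -0.006 per impact-factor point, for cancer journals)."""
    return baseline_efdr + slope * jif_change

new = shifted_efdr(0.05, -1.0)      # impact factor one point lower
print(new, (new - 0.05) / 0.05)     # about 0.056, i.e. a 12% relative increase
```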


BMJ Open ◽  
2021 ◽  
Vol 11 (9) ◽  
pp. e048581
Author(s):  
Fernanda S Tonin ◽  
Ariane G Araujo ◽  
Mariana M Fachi ◽  
Vinicius L Ferreira ◽  
Roberto Pontarolo ◽  
...  

Objective: We assessed the extent of lag times in the publication and indexing of network meta-analyses (NMAs). Study design: This was a survey of published NMAs on drug interventions. Setting: NMAs indexed in PubMed (searches updated in May 2020). Primary and secondary outcome measures: Lag times were measured as the time between the last systematic search and the article submission, acceptance, online publication, indexing and Medical Subject Headings (MeSH) allocation dates. Time-to-event analyses were performed considering independent variables (geographical origin, Journal Impact Factor, Scopus CiteScore, open access status) (SPSS V.24, R/RStudio). Results: We included 1245 NMAs. The median time from last search to article submission was 6.8 months (204 days (IQR 95–381)), and to publication was 11.6 months. Only 5% of authors updated their search after first submission. There is a slight decreasing historical trend in acceptance (rho=−0.087; p=0.010), online publication (rho=−0.080; p=0.008) and indexing (rho=−0.080; p=0.007) lag times. Journal Impact Factor influenced the MeSH allocation process, but not the other lag times. The comparison between open access and subscription journals showed negligible differences in acceptance, online publication and indexing lag times. Conclusion: Efforts by authors to update their search before submission are needed to reduce evidence production time. Peer reviewers and editors should ensure authors' compliance with NMA standards. The accuracy of these findings depends on the accuracy of the metadata used; as we evaluated only NMAs on drug interventions, results may not be generalisable to all types of studies.
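The lag times above are simple date differences between publication milestones. A minimal sketch with hypothetical dates for a single NMA (chosen near the reported 204-day median search-to-submission lag; none of these dates come from the study's data):

```python
from datetime import date

# Hypothetical milestone dates for one network meta-analysis
last_search = date(2019, 3, 1)
submitted   = date(2019, 9, 20)
accepted    = date(2020, 1, 15)
online      = date(2020, 2, 2)

# Each lag is the day count between consecutive milestones
lags_days = {
    "search_to_submission": (submitted - last_search).days,
    "submission_to_acceptance": (accepted - submitted).days,
    "acceptance_to_online": (online - accepted).days,
}
print(lags_days)
```

Repeating this over every included NMA and taking medians (or fitting time-to-event models on the lags) yields summary figures of the kind the abstract reports.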

