Relationship between publication indicators and citation impact indicators for publications in business, management, and accounting listed in Scopus from 2015 to 2019

2021 ◽  
Vol 8 (1) ◽  
pp. 18-25
Author(s):  
Hyunju Jang

Purpose: This study examined whether article-level publication indicators were related to citation impact indicators in the business, management, and accounting categories listed in Scopus. Article-level publication indicators included the number of authors, countries, and keywords, as well as title length, while citation impact indicators included the field-weighted citation impact (FWCI) at the article level and the SCImago Journal Rank (SJR) at the journal level. The optimal values of the four article-level publication indicators for maximizing the FWCI and SJR were calculated. Methods: All publication and citation impact indicators were gathered for articles and reviews in the business, management, and accounting fields published from 2015 to 2019 and listed in Scopus and SciVal. Correlations between the four article-level publication indicators and each citation impact indicator were analyzed. Results: The number of authors was positively associated with the FWCI, while the number of countries and keywords was not associated with the FWCI or SJR. Title length was negatively associated with the FWCI and SJR. The optimal publication indicators for maximizing the FWCI were four authors, three or more countries, six keywords, and a title word count of 14 to 19. The optimal publication indicators for maximizing the SJR were three to four coauthors, three to four collaborating countries, five keywords, and a title word count of two to seven. Conclusion: Authors aiming to receive more citations and to publish in journals with a higher SJR in the business, management, and accounting categories are advised to pay close attention to the design of the research team, the number of keywords, and the title length, so that the publication has a higher likelihood of being accepted and cited.
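As a rough illustration, the correlation part of such an analysis could be reproduced along the following lines in Python, assuming the Scopus/SciVal export has been flattened into a table; the file name and column names below are hypothetical, not those used in the study.

```python
# Minimal sketch: rank correlations between article-level publication indicators
# and citation impact indicators (hypothetical file and column names).
import pandas as pd
from scipy.stats import spearmanr

df = pd.read_csv("bma_articles_2015_2019.csv")  # hypothetical Scopus/SciVal export

indicators = ["n_authors", "n_countries", "n_keywords", "title_word_count"]
impact = ["fwci", "sjr"]

for target in impact:
    for ind in indicators:
        rho, p = spearmanr(df[ind], df[target], nan_policy="omit")
        print(f"{ind} vs {target}: rho = {rho:.2f}, p = {p:.4f}")
```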

2020 ◽  
Vol 30 (1) ◽  
pp. 128-133
Author(s):  
Marina Njire Braticevic ◽  
Ivana Babic ◽  
Irena Abramovic ◽  
Anja Jokic ◽  
Martina Horvat

Introduction: The title creates the first impression on potential readers; therefore, authors should pay attention to title structure. The aim of this study was to establish whether articles written by a smaller number of authors and with shorter, descriptive or declarative titles gain more citations, and whether article title length and number of authors correlate with the number of citations. Materials and methods: A cross-sectional study was conducted on citation data for articles published in 2016 in 30 scientific journals from the Medical Laboratory Technology field of the Web of Science database. The type of article, the type of title, the number of words in the title, and the number of authors were recorded. Results: In the group of original articles (N = 2623), articles with declarative titles (N = 336, 13%) showed a significantly higher number of citations in the multiple comparison analysis than those with descriptive titles (P < 0.001). No correlation was found between the number of citations and title word count (r = 0.07, P < 0.001), or between the number of citations and the number of authors among original articles (r = 0.09, P < 0.001). Original articles with descriptive titles longer than 15 words, or with more than six authors, were cited more often (P = 0.005 and P < 0.001, respectively). Conclusion: Based on the results of our study, titles do matter. Therefore, authors of original articles might want to consider including their findings in the title and having longer titles.
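A simplified sketch of the two kinds of tests reported above (a group comparison of citation counts by title type, and rank correlations with title length and author count); the study used a multiple comparison analysis, whereas this example substitutes a plain two-group Mann-Whitney test, and the file and column names are hypothetical.

```python
# Sketch: citations by title type, plus correlations with title length and
# author count. Hypothetical file and column names; a Mann-Whitney test stands
# in for the study's multiple comparison analysis.
import pandas as pd
from scipy.stats import mannwhitneyu, spearmanr

df = pd.read_csv("mlt_articles_2016.csv")
orig = df[df["article_type"] == "original"]

declarative = orig.loc[orig["title_type"] == "declarative", "citations"]
descriptive = orig.loc[orig["title_type"] == "descriptive", "citations"]
stat, p = mannwhitneyu(declarative, descriptive, alternative="two-sided")
print(f"declarative vs descriptive citations: U = {stat:.0f}, p = {p:.4f}")

for col in ("title_word_count", "n_authors"):
    r, p = spearmanr(orig[col], orig["citations"])
    print(f"citations vs {col}: r = {r:.2f}, p = {p:.4f}")
```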


2018 ◽  
Vol 17 (3) ◽  
Author(s):  
Rizka Rahmaida ◽  
Mia Amelia

Collaboration has become a trend in scientific publication over the last decade. Many previous studies have shown that collaboration affects publication quality in terms of citation impact. This study aimed to investigate the influence of different patterns of collaboration on the citation impact of publications on biodiversity research by Indonesian researchers. A total of 1,699 articles were published by researchers affiliated with institutions located in Indonesia from 1990 to 2012, based on the Scopus database. Different patterns of collaboration were investigated at different levels. Only 4.4% of these publications were single-author publications, 11.4% were intra-institutional publications, and 17.4% were domestic collaborations. Linear regression analysis showed a significant positive relationship between the number of authors and the number of citations in international publications on biodiversity research by Indonesian researchers. In addition, publications involving a higher number of institutions received a higher number of citations, and publications with a higher number of foreign collaborating countries also received more citations.
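The regression step described above could look roughly like the following in Python; the file and column names are placeholders, and a simple OLS model on raw citation counts is only one of several reasonable specifications.

```python
# Sketch: regressing citation counts on collaboration indicators.
# Placeholder file and column names; plain OLS is used purely for illustration.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("id_biodiversity_1990_2012.csv")

predictors = ["n_authors", "n_institutions", "n_foreign_countries"]
X = sm.add_constant(df[predictors])
model = sm.OLS(df["citations"], X).fit()
print(model.summary())  # positive coefficients would mirror the reported findings
```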


2004 ◽  
Vol 60 (6) ◽  
pp. 658-672 ◽  
Author(s):  
A.J. Nederhof ◽  
M.S. Visser

Publications ◽  
2018 ◽  
Vol 6 (3) ◽  
pp. 39 ◽  
Author(s):  
Dmitry Kochetkov

In recent years, more and more countries have entered the global race for university competitiveness. On the one hand, global rankings are a convenient tool for quantitative analysis; on the other hand, their indicators are often difficult to calculate quickly and frequently contradict each other. The author of this paper set out to use widely available indicators for a quick analysis of a university's publication strategy and opted for the normalized citation indicators available in the SciVal analytical tool, namely Source Normalized Impact per Paper (SNIP) and Field-Weighted Citation Impact (FWCI). The author demonstrated the possibility of applying correlation analysis to the impact indicators of a document and a journal on a sample of social sciences and humanities fields at Peoples' Friendship University of Russia (PFUR, "RUDN" in Russian). A scatter plot of university (or country) documents was used to form a two-factor matrix (SNIP and FWCI) that was further divided into four quadrants; such an analysis illustrates the current situation in a given discipline. An analysis of the RUDN university publications revealed problems and prospects in the development of the social sciences and humanities. A serious problem observed was that high-quality results were often published in low-impact journals, which narrowed the results' potential audience and, accordingly, the number of citations. Particular attention was paid to the practical application of the results.
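The quadrant step can be sketched as follows; the threshold of 1.0 is used here because an FWCI of 1.0 is commonly read as the field-weighted world average and a SNIP around 1.0 as an average source, while the file name, column names, and quadrant labels are this sketch's assumptions rather than the paper's exact scheme.

```python
# Sketch: assigning documents to SNIP/FWCI quadrants. Thresholds, labels, file
# and column names are illustrative assumptions, not the paper's exact scheme.
import pandas as pd

df = pd.read_csv("rudn_ssh_documents.csv")  # hypothetical document-level export

def quadrant(row, snip_cut=1.0, fwci_cut=1.0):
    strong_journal = row["snip"] >= snip_cut
    strong_article = row["fwci"] >= fwci_cut
    if strong_article and strong_journal:
        return "high FWCI / high SNIP"
    if strong_article:
        return "high FWCI / low SNIP (strong result, low-impact journal)"
    if strong_journal:
        return "low FWCI / high SNIP"
    return "low FWCI / low SNIP"

df["quadrant"] = df.apply(quadrant, axis=1)
print(df["quadrant"].value_counts())
```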


2018 ◽  
Vol 7 (2.7) ◽  
pp. 28 ◽  
Author(s):  
K Varada Rajkumar ◽  
Yesubabu Adimulam ◽  
K Subrahmanyam

Traditionally, journal quality has been measured using metrics such as the impact factor, SNIP (Source Normalized Impact per Paper), SJR (SCImago Journal Rank), and IPP (Impact per Publication). It is difficult to decide in which journal a research paper should be published. CiteScore is a better way of measuring the citation impact of sources such as journals. CiteScore is a journal metric from Elsevier that uses citation data from the Scopus database to rank journals. It is a comprehensive, current, and free metric for source titles in Scopus. Alongside the impact factor, CiteScore is becoming increasingly important in the context of evaluating journals. CiteScore metrics are available for 37,956 titles in Scopus, and coverage is not limited to journals: conference proceedings, trade publications, and book series are also included. The metrics are available for a six-year period, from 2011 to 2016. For a subset of the CiteScore dataset, clustering and regression algorithms can be implemented to study data points that lie equally distant from one another.
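For orientation, the original CiteScore methodology (used for the 2011 to 2016 values mentioned above) divides the citations a source receives in a given year to items published in the previous three years by the number of items published in those three years; the figures in the example below are made up.

```python
# Sketch of the original CiteScore calculation (three-year window; Elsevier has
# since revised the methodology). The numbers below are illustrative only.

def citescore(citations_in_year: int, docs_prev_three_years: int) -> float:
    """Citations in year Y to items from years Y-3..Y-1, per item published."""
    if docs_prev_three_years == 0:
        return 0.0
    return citations_in_year / docs_prev_three_years

print(citescore(citations_in_year=1200, docs_prev_three_years=400))  # 3.0
```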


2017 ◽  
Vol 1 (3) ◽  
pp. 6-26 ◽  
Author(s):  
Loet Leydesdorff ◽  
Wouter de Nooy ◽  
Lutz Bornmann

Purpose: Ramanujacharyulu developed the Power-weakness Ratio (PWR) for scoring tournaments. The PWR algorithm has been advocated (and used) for measuring the impact of journals. We show how such a newly proposed indicator can empirically be tested. Design/methodology/approach: PWR values can be found by recursively multiplying the citation matrix by itself until convergence is reached in both the cited and citing dimensions; the quotient of these two values is defined as PWR. We study the effectiveness of PWR using journal ecosystems drawn from the Library and Information Science (LIS) set of the Web of Science (83 journals) as an example. Pajek is used to compute PWRs for the full set, and Excel for the computation in the case of the two smaller sub-graphs: (1) JASIST plus the seven journals that cite JASIST more than 100 times in 2012; and (2) MIS Quart plus the nine journals citing this journal to the same extent. Findings: A test using the set of 83 journals converged, but did not provide interpretable results. Further decomposition of this set into homogeneous sub-graphs shows that, like most other journal indicators, PWR can perhaps be used within homogeneous sets, but not across citation communities. We conclude that PWR does not work as a journal impact indicator; journal impact, for example, is not a tournament. Research limitations: Journals that are not represented on the "citing" dimension of the matrix (for example, because they no longer appear but are still registered as "cited", e.g. ARIST) distort the PWR ranking because of zeros or very low values in the denominator. Practical implications: The association of "cited" with "power" and "citing" with "weakness" can be considered as a metaphor. In our opinion, referencing is an actor category and can be studied in terms of behavior, whereas "citedness" is a property of a document with an expected dynamics very different from that of "citing." From this perspective, the PWR model is not valid as a journal indicator. Originality/value: Arguments for using PWR are: (1) its symmetrical handling of the rows and columns in the asymmetrical citation matrix, (2) its recursive algorithm, and (3) its mathematical elegance. In this study, PWR is discussed and critically assessed.
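One possible reading of the recursive procedure described under Design/methodology/approach is the power-iteration sketch below; it illustrates the idea only, not the authors' Pajek or Excel implementation, and the toy citation matrix is invented.

```python
# Sketch: Power-weakness Ratio (PWR) via power iteration on a citation matrix.
# Illustrative reading of the procedure, not the authors' exact implementation.
import numpy as np

def pwr(C: np.ndarray, tol: float = 1e-9, max_iter: int = 1000) -> np.ndarray:
    """C[i, j] = citations from journal i to journal j (rows citing, columns cited)."""
    n = C.shape[0]
    cited = np.ones(n)   # "power" scores, cited dimension
    citing = np.ones(n)  # "weakness" scores, citing dimension
    for _ in range(max_iter):
        new_cited = C.T @ cited
        new_cited /= np.linalg.norm(new_cited)
        new_citing = C @ citing
        new_citing /= np.linalg.norm(new_citing)
        converged = (np.allclose(new_cited, cited, atol=tol)
                     and np.allclose(new_citing, citing, atol=tol))
        cited, citing = new_cited, new_citing
        if converged:
            break
    # Journals with (near-)zero "citing" scores blow up the ratio, which is the
    # limitation noted under Research limitations (e.g. sources that no longer cite).
    return cited / np.maximum(citing, 1e-12)

# Toy 3-journal citation matrix, invented for illustration.
C = np.array([[0.0, 5.0, 2.0],
              [3.0, 0.0, 1.0],
              [4.0, 2.0, 0.0]])
print(pwr(C))
```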

