Evaluation of stakeholder views on peer review of NIHR applications for funding: a qualitative study

BMJ Open ◽  
2018 ◽  
Vol 8 (12) ◽  
pp. e022548 ◽  
Author(s):  
Sheila Turner ◽  
Abby Bull ◽  
Fay Chinnery ◽  
Jeremy Hinks ◽  
Nicola Mcardle ◽  
...  

Objectives: Innovations resulting from research have both national and global impact, so selecting the most promising research studies to fund is crucial. Peer review of research funding applications is part of the selection process, and requires considerable resources. This study aimed to elicit stakeholder opinions about which factors contribute to and influence effective peer review of funding applications to the UK National Institute for Health Research (NIHR), and to identify possible minor improvements to current processes as well as any major changes or potential innovations that could achieve a more efficient peer review process. Design: Qualitative interviews with 30 stakeholders involved in the peer review process. Participants: Participants were drawn from three NIHR coordinating centres and represented four types of stakeholders: board members with responsibility for making funding decisions, applicants, external peer reviewers and NIHR staff. Methods: All interviews were conducted by telephone, apart from three that were face to face with NIHR staff. Data were analysed using a thematic template method. Results: The responses from NIHR staff, board members and reviewers differed from those received from applicants. The first three groups focused on how well the process of peer review did or did not function; the applicants mentioned these points but, in addition, often reflected on how their personal application was assessed. Suggested process improvements included: developing a more proportionate review process; providing greater guidance, feedback, training, acknowledgement or incentives for peer reviewers; reducing the time commitment and amount of paperwork; and asking reviewers to comment on the importance, strengths and weaknesses of applications and on flaws which are potentially ‘fixable’. Conclusions: Overall, participants were supportive of the need for peer review in evaluating applications for research funding. This study revealed which parts of the process are working well and are valued, as well as the barriers, difficulties and potential areas for improvement and development.

2017 ◽  
Vol 33 (S1) ◽  
pp. 102-102
Author(s):  
Sheila Turner ◽  
Judith Lathlean ◽  
Fay Chinnery ◽  
Rebecca Moran ◽  
Eleanor Guegan ◽  
...  

INTRODUCTION: It takes on average 17 years to translate a promising laboratory development into better patient treatments or services. About 10 years of this innovation process lies within the National Institute for Health Research (NIHR) research pathway. Innovations developed through research have both national and global impact, so selecting the most promising studies to fund is crucial. Peer review of applications is part of the NIHR research funding process, but requires considerable resources. The NIHR is committed to improving the efficiency and proportionality of this process. This study is part of a wider piece of work being undertaken by NIHR (1) to reduce the complexity of the funding pathway and thus make a real difference to patients' lives. METHODS: This study elicited the views of various stakeholders concerning current and possible future methods for peer review of applications for research funding. Stakeholder groups included: members of boards with responsibility for making funding decisions; applicants (both successful and unsuccessful); peer reviewers; and NIHR staff. Qualitative interviews were conducted with stakeholders selected from each group, and results were analyzed and integrated using a thematic template analytical method. The results were used to inform a larger online opinion survey, which will be reported separately. RESULTS: The views and insights of thirty stakeholders across the four groups about the peer review process for funding applications will be presented, with emphasis on findings generalizable to funding programs outside the NIHR. The key themes which emerged included: strengths and weaknesses of applications; feedback; and the targeting and acknowledgement of peer reviewers. CONCLUSIONS: The results of our study of peer review processes carried out by one national research funder have relevance for other funding organizations, both within our country and internationally.


Author(s):  
Ann Blair Kennedy, LMT, BCTMB, DrPH

Peer review is a mainstay of scientific publishing and, while peer reviewers and scientists report satisfaction with the process, peer review has not been without criticism. Within this editorial, the peer review process at the IJTMB is defined and explained. Further, seven steps are identified by the editors as a way to improve the efficiency of the peer review and publication process. Those seven steps are: 1) Ask authors to submit possible reviewers; 2) Ask reviewers to update profiles; 3) Ask reviewers to “refer a friend”; 4) Thank reviewers regularly; 5) Ask published authors to review for the Journal; 6) Reduce the length of time to accept a peer review invitation; and 7) Reduce the requested time to complete a peer review. We believe these small requests and changes can have a big effect on the quality of reviews and the speed with which manuscripts are published. This manuscript will present instructions for completing peer review profiles. Finally, we more formally recognize and thank peer reviewers from 2018–2020.


F1000Research ◽  
2016 ◽  
Vol 5 ◽  
pp. 683 ◽  
Author(s):  
Marco Giordan ◽  
Attila Csikasz-Nagy ◽  
Andrew M. Collings ◽  
Federico Vaggi

Background: Publishing in scientific journals is one of the most important ways in which scientists disseminate research to their peers and to the wider public. Pre-publication peer review underpins this process, but peer review is subject to various criticisms and is under pressure from growth in the number of scientific publications. Methods: Here we examine an element of the editorial process at eLife, in which the Reviewing Editor usually serves as one of the referees, to see what effect this has on decision times, decision type, and the number of citations. We analysed a dataset of 8,905 research submissions to eLife since June 2012, of which 2,750 were sent for peer review, using R and Python to perform the statistical analysis. Results: The Reviewing Editor serving as one of the peer reviewers results in faster decision times on average, with the time to final decision ten days faster for accepted submissions (n=1,405) and five days faster for papers that were rejected after peer review (n=1,099). There was no effect on whether submissions were accepted or rejected, and a very small (but significant) effect on citation rates for published articles where the Reviewing Editor served as one of the peer reviewers. Conclusions: An important aspect of eLife's peer-review process is shown to be effective, given that decision times are faster when the Reviewing Editor serves as a reviewer. Other journals hoping to improve decision times could consider adopting a similar approach.
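The comparison described here is, at its core, a between-group contrast of decision times for submissions where the Reviewing Editor did or did not act as a reviewer. The sketch below illustrates that kind of analysis under stated assumptions: a hypothetical CSV with columns named editor_reviewed, decision and days_to_decision, none of which come from the published eLife dataset.

```python
# Illustrative sketch only: compares mean time-to-decision for submissions
# where the Reviewing Editor also acted as a reviewer versus those where
# they did not. The input file and column names are assumptions, not the
# structure of the eLife dataset described above.
import pandas as pd
from scipy import stats

df = pd.read_csv("submissions.csv")  # hypothetical export of reviewed submissions

for decision in ["accept", "reject"]:
    subset = df[df["decision"] == decision]
    editor = subset[subset["editor_reviewed"]]["days_to_decision"]
    other = subset[~subset["editor_reviewed"]]["days_to_decision"]
    diff = other.mean() - editor.mean()
    t, p = stats.ttest_ind(editor, other, equal_var=False)  # Welch's t-test
    print(f"{decision}: editor-as-reviewer faster by {diff:.1f} days (p={p:.3g})")
```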


2019 ◽  
Author(s):  
Malte Elson ◽  
Markus Huff ◽  
Sonja Utz

Peer review has become the gold standard in scientific publishing as a selection method and a refinement scheme for research reports. However, despite its pervasiveness and conferred importance, relatively little empirical research has been conducted to document its effectiveness. Further, there is evidence that factors other than a submission’s merits can substantially influence peer reviewers’ evaluations. We report the results of a metascientific field experiment on the effect of the originality of a study and the statistical significance of its primary outcome on reviewers’ evaluations. The general aim of this experiment, which was carried out in the peer-review process for a conference, was to demonstrate the feasibility and value of metascientific experiments on the peer-review process and thereby encourage research that will lead to understanding its mechanisms and determinants, effectively contextualizing it in psychological theories of various biases, and developing practical procedures to increase its utility.
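The design implied here is a two-factor manipulation (originality of the study and statistical significance of its primary outcome) with reviewers' evaluations as the dependent variable. As a rough illustration of how data from such a design might be analysed, the sketch below fits a two-way model; the data frame, column names and coding are assumptions for illustration, not the authors' materials or analysis.

```python
# Illustrative sketch: two-way analysis of reviewer evaluation scores by
# manipulated originality and statistical significance of the primary outcome.
# The data file and column names are hypothetical.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

reviews = pd.read_csv("reviews.csv")  # assumed columns: evaluation, originality, significance

model = smf.ols("evaluation ~ C(originality) * C(significance)", data=reviews).fit()
print(sm.stats.anova_lm(model, typ=2))  # main effects and their interaction
```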


2019 ◽  
Author(s):  
Damian Pattinson

In recent years, funders have increased their support for early sharing of biomedical research through the use of preprints. For most, such as the cOAlition S group of funders (ASAPbio 2019) and the Gates Foundation, this takes the form of active encouragement, while for others it is mandated. But despite these motivations, few authors are routinely depositing their work as a preprint before submitting to a journal. Some journals have started offering authors the option of posting their work early at the point at which it is submitted for review. These include PLOS, who offer a link to bioRxiv; the Cell journals, who offer SSRN posting through ‘Sneak Peek’; and Nature Communications, who offer posting to any preprint server and a link from the journal page called ‘Under Consideration’. Uptake has ranged from 3% for the Nature pilot to 18% for PLOS (The Official PLOS Blog 2018). In order to encourage more researchers to post their work early, we have been offering authors who submit to BMC Series titles the opportunity to post their work as a preprint on Research Square, a new platform that lets authors share and improve their research. To encourage participation, authors are offered greater control and transparency over the peer review process if they opt in. First, they are given a detailed peer review timeline which updates in real time every time an event occurs on their manuscript (reviewer invited, reviewer accepts, etc.). Second, they are encouraged to share their preprint with colleagues, who are able to post comments on the paper; these comments are sent to the editor when they are making their decision. Third, authors can suggest potential peer reviewers, recommendations which are also passed on to the editor to vet and invite. Together, these incentives have had a positive impact on authors choosing to post a preprint. Among the journals that offer this service, the average opt-in rate is 40%. This translates to over 3,000 manuscripts (as of July 2019) that have been posted to Research Square since the launch of the service in October 2018. In this talk I will demonstrate the functionality of Research Square, and provide demographic and discipline data on which areas are most and least likely to post.


2019 ◽  
Vol 44 (6) ◽  
pp. 994-1019 ◽  
Author(s):  
Lambros Roumbanis

At present, peer review is the most common method used by funding agencies to make decisions about resource allocation. But how reliable, efficient, and fair is it in practice? The ex ante evaluation of scientific novelty is a fundamentally uncertain endeavor; bias and chance are embedded in the final outcome. In the current study, I will examine some of the most central problems of peer review and highlight the possible benefits of using a lottery as an alternative decision-making mechanism. Lotteries are driven by chance, not reason. The argument made in the study is that the epistemic landscape could benefit in several respects by using a lottery, thus avoiding all types of bias, disagreement, and other limitations associated with the peer review process. Funding agencies could form a pool of funding applicants who have minimal qualification levels and then select randomly within that pool. The benefits of a lottery would not only be that it saves time and resources, but also that it contributes to a more dynamic selection process and increases the epistemic diversity, fairness, and impartiality within academia.
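The mechanism outlined in this abstract, screening applications against a minimal qualification threshold and then drawing the awards at random from the eligible pool, is simple enough to state in code. The following is a minimal sketch under assumed inputs (a list of applications with a pre-assigned eligibility flag and a fixed number of awards); the names and data are illustrative and do not come from the article.

```python
# Minimal sketch of a qualification-threshold lottery for funding decisions.
# The applications, eligibility check and number of awards are assumptions
# made for illustration; the article describes the mechanism only in outline.
import random

def lottery_selection(applications, is_eligible, n_awards, seed=None):
    """Return a uniformly random sample of eligible applications."""
    pool = [a for a in applications if is_eligible(a)]   # minimal-qualification screen
    rng = random.Random(seed)                            # fixed seed allows an auditable draw
    return rng.sample(pool, min(n_awards, len(pool)))    # draw without replacement

# Example usage with made-up data:
apps = [{"id": i, "meets_threshold": i % 2 == 0} for i in range(20)]
winners = lottery_selection(apps, lambda a: a["meets_threshold"], n_awards=3, seed=42)
print([w["id"] for w in winners])
```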


2019 ◽  
Vol 9 (2) ◽  
pp. 216-230
Author(s):  
Sabina Siebert ◽  
Stephanie Schreven

This article explores an intervention that practises the ‘art of deception’ in the context of biomedical publishing. Specifically, we explore the science hoax aimed at revealing problems in the peer review process. We pose a question – are science hoaxes based on deception ever justified? Drawing on interviews with biomedical scientists in the UK, we identify the issue of trust as the key element in the scientists’ evaluations of hoaxes. Hoaxes are seen by some to increase trust, and are seen by others to damage trust. Trust in science is thus a Protean concept: it can be used to argue for two completely different, and sometimes contradictory, positions. In this case, the same argument of trust was recognizably invoked to defend the hoaxes, and to argue against them.


2017 ◽  
Vol 33 (1) ◽  
pp. 129-144 ◽  
Author(s):  
Jay C. Thibodeau ◽  
L. Tyler Williams ◽  
Annie L. Witte

In the new research frontier of data availability, this study develops guidelines to aid accounting academicians as they seek to evidence data integrity proactively in the peer-review process. To that end, we explore data integrity issues associated with two emerging data streams that are gaining prominence in the accounting literature: online labor markets and social media sources. We provide rich detail surrounding academic thought about these data platforms through interview data collected from a sample of former senior journal editors and survey data collected from a sample of peer reviewers. We then propound a set of best practice considerations that are designed to mitigate the perceived risks identified by our assessment.
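The abstract does not spell out the best practices themselves, but data-integrity screening for online labor market samples commonly involves checks of the kind sketched below; the specific rules, thresholds and column names are illustrative assumptions, not the authors' recommendations.

```python
# Illustrative sketch of common screening checks for online labor market data.
# The columns, thresholds and rules here are assumptions for illustration only;
# they are not the best practices proposed in the article.
import pandas as pd

def screen_responses(df, min_seconds=120):
    """Flag and drop responses that fail basic integrity checks."""
    flags = pd.DataFrame(index=df.index)
    flags["duplicate_worker"] = df.duplicated(subset="worker_id", keep="first")
    flags["too_fast"] = df["duration_seconds"] < min_seconds
    flags["failed_attention_check"] = df["attention_check"] != "pass"
    flags["any_flag"] = flags.any(axis=1)
    return df[~flags["any_flag"]], flags

# Example usage with a hypothetical response file:
# responses = pd.read_csv("responses.csv")
# clean, flags = screen_responses(responses)
# print(f"Excluded {flags['any_flag'].sum()} of {len(responses)} responses")
```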

