Dental research: where are we now?

2011 · Vol 93 (6) · pp. 207-207
Author(s): Gerry Linden

When I was appointed as a senior lecturer in periodontology in the mid-1980s academic posts were attractive and sought after, with the promise of involvement in research that would underpin teaching. Within a few years regular reviews of research were introduced through the research assessment exercise (RAE), now rechristened the research excellence framework (REF). The RAE assessed the quality of research in all subjects in all UK universities. The results have informed the selective distribution of resources from government to the universities to support research.

2010 · Vol 14 (1) · pp. 11-16
Author(s): Sebastian Macmillan

Like the Research Assessment Exercise (RAE) that preceded it, the UK government's proposed Research Excellence Framework (REF) is a means of allocating funding in higher education to support research. As with any method for the competitive allocation of funds it creates winners and losers and inevitably generates a lot of emotion among those rewarded or penalised. More specifically, the ‘winners’ tend to approve of the method of allocation and the ‘losers’ denigrate it as biased against their activities and generally unfair. An extraordinary press campaign has been consistently waged against research assessment and its methods by those involved in architectural education, which I will track over a decade and a half. What follows will question whether this campaign demonstrates the sophistication and superior judgment of those who have gone into print, or conversely whether its mixture of misinformation and disinformation reveals not just disenchantment and prejudice, but a naivety and a depth of ignorance about the fundamentals of research that is deeply damaging to the credibility of architecture as a research-based discipline. With the recent consultation process towards a new cycle of research assessment, the REF, getting under way, I aim to draw attention to the risk of repeating past mistakes.


2019 · Vol 28 (3) · pp. 209-217
Author(s): Lai Ma, Michael Ladisch

Abstract Evaluative metrics have been used for research assessment in most universities and funding agencies with the assumption that more publications and higher citation counts imply increased productivity and better quality of research. This study investigates the understanding and perceptions of metrics, as well as the influences and implications of the use of evaluative metrics on research practices, including choice of research topics and publication channels, citation behavior, and scholarly communication in Irish universities. Semi-structured, in-depth interviews were conducted with researchers from the humanities, the social sciences, and the sciences in various career stages. Our findings show that there are conflicting attitudes toward evaluative metrics in principle and in practice. The phenomenon is explained by two concepts: evaluation complacency and evaluation inertia. We conclude that evaluative metrics should not be standardized and institutionalized without a thorough examination of their validity and reliability and without having their influences on academic life, research practices, and knowledge production investigated. We also suggest that an open and public discourse should be supported for the discussion of evaluative metrics in the academic community.


2017 · Vol 39 (2) · pp. 48-49
Author(s): Gabriele Butkute

In an age when huge amounts of data are collected on everything we do – from our Google searches to our GPS coordinates – we like to be able to count, measure and assess things. This includes measuring the impact and quality of research in the UK, through an assessment method known as the Research Excellence Framework (REF).


2002 · Vol 24 (2) · pp. 31
Author(s): Mike Withnall

Richard Reece says in his Editorial (see page 3) that the Research Assessment Exercise (RAE) has fulfilled its purpose of driving up research quality, and that there is no need for future exercises. The Director of the Science Policy Research Unit at Sussex University has reached a similar conclusion: after three rounds of the RAE there is little scope for further gains in efficiency within university departments and the cost of the exercise exceeds the benefits. But there will continue to be a need for some form of assessment of research. The Funding Councils are accountable for the quality of the work that they support. A lack of periodic assessment could lead to complacency and ossification, and universities established since 1992 may feel that they have not yet had sufficient time to develop top-ranking departments.


Author(s): Ken Peach

This chapter focuses on the review process, the process of writing a proposal and the evaluation of science. The usual way that science is funded these days is through a proposal to a funding agency; if it satisfies peer review and there are sufficient resources available, it is then funded. Peer review is at the heart of academic life, and is used to assess research proposals, progress, publications and institutions. Peer review processes are discussed and, in light of this discussion, the art of proposal writing. The particular features of making fellowship proposals and preparing for an institutional review are described. In addition, several of the methods used for evaluating and ranking research and research institutions are reviewed, including the Research Assessment Exercise and the Research Excellence Framework.


BioMedica · 2020 · Vol 36 (4) · pp. 327-328
Author(s): Dr. Inayatullah Padhiar

Dental journalism in Pakistan has a very chequered history. There are many reasons for the worsening state of dental research; a lack of research facilities, funding, ownership and training of stakeholders are a few of them. The need of the hour is for eminent and senior dental scholars, research institutes and professional bodies to work together and devise strategies that foster a research-oriented mindset and provide guidance for future students, so that the quality of research, at both the clinical and basic-science levels, is improved and publication in national journals is encouraged, raising their standing in international journalism.


2000 · Vol 8 · pp. 48
Author(s): Orlan Lee

The Research Assessment Exercises (RAEs) in the hugely expanded universities of Britain and Hong Kong attempt mammoth-scale ratings of "quality of research." If peer review on that scale is feasible for "quality of research," is it any less so for "quality of teaching"? The lessons of the Hong Kong Teaching and Learning Quality Process Reviews (TLQPRs), of recent studies on the influence of grade expectation and workload on student ratings, of attempts to employ agency theory both to improve teaching quality and to raise student ratings, and of institutional attempts to refine the peer review process all suggest that we can "put teaching on the same footing as research" and include professional regard for teaching content and objectives, as well as student ratings of effectiveness and personality appeal, in the process.
