Industry Actor Use of Research Evidence: Critical Analysis of Australian Alcohol Policy Submissions

2020, Vol. 81(6), pp. 710-718
Author(s): Julia Stafford, Kypros Kypri, Simone Pettigrew

2013, Vol. 21(78), pp. 101-114
Author(s): Harvey Goldstein

The paper explores some of the issues involved in evaluating educational policy initiatives. It gives examples of how research findings can be evaluated and draws lessons for the ways in which policymakers can interact usefully with researchers. It argues that while central government's use of research evidence is often highly selective and driven by its own perceived short-term interests, a broader view of the research process is more productive and beneficial. The issues of class size, school league tables and the effects of homework are examined in detail, and the often provisional nature of research evidence is emphasised, as is the uncertainty surrounding the findings of individual studies.


2019, Vol. 17(1)
Author(s): Ahmad Firas Khalid, John N. Lavis, Fadi El-Jardali, Meredith Vanstone

Abstract
Background: Humanitarian action in crisis zones is fraught with many challenges, including a lack of timely and accessible research evidence to inform decision-making about humanitarian interventions. Evidence websites have the potential to address this challenge. Evidence Aid is the only evidence website designed for crisis zones that focuses on providing research evidence in the form of systematic reviews. The objective of this study was to explore stakeholders' views of Evidence Aid, contributing further to our understanding of the use of research evidence in decision-making in crisis zones.
Methods: We designed a qualitative user-testing study to collect interview data from stakeholders about their impressions of Evidence Aid. Eligible stakeholders included those with and without previous experience of Evidence Aid. All participants were either currently working in a crisis zone or had worked in one within the previous year. Participants were asked to perform the same user experience-related tasks and to answer questions about this experience and their knowledge needs. Data were analysed using a deductive framework analysis approach drawing on Morville's seven facets of the user experience: findability, usability, usefulness, desirability, accessibility, credibility and value.
Results: A total of 31 interviews were completed with senior decision-makers (n = 8), advisors (n = 7), field managers (n = 7), analysts/researchers (n = 5) and healthcare providers (n = 4). Participants' self-reported knowledge needs varied depending on their role. Overall, participants did not identify any 'major' problems (highest order) and identified only two 'big' problems (second highest order) with using the Evidence Aid website: the lack of a search engine on the home page, and the requirement for payment for some full-text articles linked to/from the site. Participants offered seven specific suggestions for improving Evidence Aid, many of which can also be applied to other evidence websites.
Conclusions: Stakeholders in crisis zones found Evidence Aid to be useful, accessible and credible. However, they experienced some problems with the lack of a search engine on the home page and the requirement for payment for some full-text articles linked to/from the site.


PLoS ONE, 2011, Vol. 6(7), p. e21704
Author(s): Lois Orton, Ffion Lloyd-Williams, David Taylor-Robinson, Martin O'Flaherty, Simon Capewell

The Lancet, 2011, Vol. 378(9804), p. 1697
Author(s): Nancy Cartwright
