Enabling knowledge brokerage intermediaries to be evidence-informed

Author(s):  
David Gough ◽  
Chris Maidment ◽  
Jonathan Sharples

Target audience: What Works Centres; other intermediary brokerage agencies; their funders and users; and researchers of research use.

Background: Knowledge brokerage and knowledge mobilisation (KM) are generic terms for activities that enable the use of research evidence to inform policy, practice and individual decision making. Knowledge brokerage intermediary (KBI) initiatives facilitate such use of research evidence. This debate paper argues that although the work of KBIs is to enable evidence-informed decision making (EIDM), they may not always be overt and consistent in following the principles of EIDM in their own practice.

Key points for discussion: Drawing on examples from existing brokerage initiatives, four areas are suggested where KBIs could be more evidence-informed in their work: (1) needs analysis: evidence-informed in their analysis of where and how the KBI can best contribute to the existing evidence ecosystem; (2) methods and theories of change: evidence-informed in the methods that the KBI uses to achieve its goals; (3) evidence standards: credible standards for making evidence claims; and (4) evaluation and monitoring: evidence-informed evaluation of their own activities and contribution to the knowledge base on evidence use. For each of these areas, questions are suggested for considering the extent to which the principles are being followed in practice.

Conclusions and implications: KBIs work with evidence, but they may not always be evidence-informed in their practice. KBIs could benefit from attending more overtly to the extent to which they apply the logic of EIDM to how they work. In doing so, KBIs can advance both the study and practice of using research evidence to inform decision making.

2019 ◽  
Vol 17 (1) ◽  
Author(s):  
Ahmad Firas Khalid ◽  
John N. Lavis ◽  
Fadi El-Jardali ◽  
Meredith Vanstone

Abstract

Background: Humanitarian action in crisis zones is fraught with many challenges, including a lack of timely and accessible research evidence to inform decision-making about humanitarian interventions. Evidence websites have the potential to address this challenge. Evidence Aid is the only evidence website designed for crisis zones that focuses on providing research evidence in the form of systematic reviews. The objective of this study is to explore stakeholders’ views of Evidence Aid, contributing further to our understanding of the use of research evidence in decision-making in crisis zones.

Methods: We designed a qualitative user-testing study to collect interview data from stakeholders about their impressions of Evidence Aid. Eligible stakeholders included those with and without previous experience of Evidence Aid. All participants were either currently working or had worked within the last year in a crisis zone. Participants were asked to perform the same user experience-related tasks and to answer questions about this experience and their knowledge needs. Data were analysed using a deductive framework analysis approach drawing on Morville’s seven facets of the user experience: findability, usability, usefulness, desirability, accessibility, credibility and value.

Results: A total of 31 interviews were completed with senior decision-makers (n = 8), advisors (n = 7), field managers (n = 7), analysts/researchers (n = 5) and healthcare providers (n = 4). Participants’ self-reported knowledge needs varied depending on their role. Overall, participants did not identify any ‘major’ problems (highest order) and identified only two ‘big’ problems (second highest order) with using the Evidence Aid website: the lack of a search engine on the home page and the requirement for payment for some full-text articles linked to/from the site. Participants made seven specific suggestions for improving Evidence Aid, many of which also apply to other evidence websites.

Conclusions: Stakeholders in crisis zones found Evidence Aid to be useful, accessible and credible. However, they experienced some problems with the lack of a search engine on the home page and the requirement for payment for some full-text articles linked to/from the site.


PLoS ONE ◽  
2011 ◽  
Vol 6 (7) ◽  
pp. e21704 ◽  
Author(s):  
Lois Orton ◽  
Ffion Lloyd-Williams ◽  
David Taylor-Robinson ◽  
Martin O'Flaherty ◽  
Simon Capewell

Author(s):  
Robert Asen ◽  
Whitney Gent

Contributing to the growing scholarly attention to the roles of rhetoric and argumentation in policymaking, we examine how research evidence is used in explicitly argumentative legislative hearings characterised by partisanship and polarisation. Through a rhetorical analysis of three legislative hearings in the US state of Wisconsin, we found that partisanship and polarisation did not influence argument and the use of research evidence uniformly. Instead, legislators and committee witnesses put research evidence to a range of uses. To understand this usage, we developed a framework that foregrounds situations of research use. These situations consist of conditions of polarisation (visibility, bipartisan leadership, familiarity and controversy), modes of interaction (participation, cooperation and (dis)qualification), and conceptions of research use (necessity, relevance and sufficiency). This situational model recognises that symbolic use provides the foundation for the use of research evidence in legislative settings. The model also reconfigures the relationship between research evidence and decision making.


Health ◽  
2018 ◽  
Vol 10 (04) ◽  
pp. 502-515
Author(s):  
Patricia Katowa-Mukwato ◽  
Lonia Mwape ◽  
Mwaba Chileshe Siwale ◽  
Emmanuel Mwila Musenge ◽  
Margaret Maimbolwa

Author(s):  
David Christian Rose ◽  
Caroline Kenny ◽  
Abbi Hobbs ◽  
Chris Tyler

Despite claims that we now live in a post-truth society, it remains commonplace for policy makers to consult research evidence to increase the robustness of decision making. Few scholars of evidence-policy interfaces, however, have used legislatures as sites of study, despite the critical role legislatures play in modern democracies. There is thus limited knowledge of how research evidence is sourced and used in legislatures, which presents challenges for academics and science advisory groups, as well as for others interested in ensuring that democratic decisions are evidence-informed. Here, we present results from an empirical study of the use of research in the UK Parliament, based on a mixed methodology that included interviews and surveys with 157 people in Parliament, as well as an ethnographic investigation of four committees. We are specifically interested in identifying the factors affecting the use of research evidence in Parliament, with the aim of improving its use. We focus on providing advice for the higher education sector, which includes improving knowledge of, and engagement in, parliamentary processes; reforming academic incentives to stimulate the production of policy-relevant information and to assist engagement; and working with trusted knowledge brokers. Implementing this advice should improve the chances that parliamentary decision making is informed by research evidence.


2017 ◽  
Vol 5 (5) ◽  
pp. 1-138 ◽  
Author(s):  
Paul M Wilson ◽  
Kate Farley ◽  
Liz Bickerdike ◽  
Alison Booth ◽  
Duncan Chambers ◽  
...  

Background: The Health and Social Care Act 2012 (Great Britain. Health and Social Care Act 2012. London: The Stationery Office; 2012) has mandated research use as a core consideration of health service commissioning arrangements. We evaluated whether or not access to a demand-led evidence briefing service improved the use of research evidence by commissioners, compared with less intensive and less targeted alternatives.

Design: Controlled before-and-after study.

Setting: Clinical Commissioning Groups (CCGs) in the north of England.

Main outcome measures: Change at 12 months from baseline in a CCG’s ability to acquire, assess, adapt and apply research evidence to support decision-making. Secondary outcomes measured individual clinical leads’ and managers’ intentions to use research evidence in decision-making.

Methods: Nine CCGs received one of three interventions: (1) access to an evidence briefing service; (2) contact plus an unsolicited push of non-tailored evidence; or (3) an unsolicited push of non-tailored evidence. Data for the primary outcome measure were collected at baseline and 12 months post intervention, using a survey instrument devised to assess an organisation’s ability to acquire, assess, adapt and apply research evidence to support decision-making. In addition, documentary and observational evidence of the use of the outputs of the service was sought, and interviews with CCG participants were undertaken.

Results: Most of the requests were conceptual; they were not directly linked to discrete decisions or actions but were intended to provide knowledge about possible options for future actions. Symbolic uses to justify existing decisions and actions were less frequent and included a decision to close a walk-in centre and lending weight to a major initiative to promote self-care already under way. The opportunity to impact directly on decision-making processes was limited to work to establish disinvestment policies. In terms of overall impact, the evidence briefing service was not associated with increases in CCGs’ capacity to acquire, assess, adapt and apply research evidence to support decision-making, in individual intentions to use research findings or in perceptions of CCGs’ relationships with researchers. Regardless of the intervention received, participating CCGs indicated at baseline that they were inconsistent in their research-seeking behaviours, and their capacity to acquire research remained so at follow-up. The informal nature of decision-making processes meant that there was little or no traceability of the use of evidence.

Limitations: Low baseline and follow-up response rates (68% and 44%, respectively) and missing data limit the reliability of these findings.

Conclusions: Access to a demand-led evidence briefing service did not improve the uptake and use of research evidence by NHS commissioners compared with less intensive and less targeted alternatives. Commissioners appear to be well intentioned but ad hoc users of research.

Future work: Further research is required on the effects of interventions and strategies to build individual and organisational capacity to use research. Resource-intensive approaches to providing evidence may best be employed to support instrumental decision-making. Comparative evaluation of the impact of less intensive but targeted strategies on the uptake and use of research by commissioners is warranted.

Funding: The National Institute for Health Research Health Services and Delivery Research programme.

