Research Policy

  • Reference work entry
  • First Online: 01 January 2022
  • pp 2429–2436


  • Haihong Zhang &
  • Yali Cong


Research policies, aimed at excellence in research, are designed and implemented to ensure that good research practices are pursued in balance with ethical considerations. As there is increasing overlap between good research practices and research ethics, and given that other chapters in this book discuss conflicts of interest, research misconduct, and related topics in detail, this chapter focuses mainly on the ethical aspects of research policies aimed at protecting human subjects in research. The history and development of the Declaration of Helsinki, the International Conference on Harmonization Good Clinical Practice (ICH-GCP) guidelines, and the Council for International Organizations of Medical Sciences (CIOMS) guidelines, as well as the ethical principles specified in these documents, are examined. Problems and challenges faced by different layers of research policy, such as international guidelines, national laws and regulations, and institutional policies and procedures, are discussed. We conclude that lessons should be learnt from history and that all stakeholders involved in human subject research bear responsibility for developing new research policies. However, the diversity and uniqueness of a particular context should be considered seriously during policy making, and once a research policy is developed, it still needs regular re-examination to ensure that any required updates are undertaken.


References

Code of Federal Regulations, Title 45 Public Welfare, Part 46, Protection of Human Subjects, 45 CFR 46.102(d). http://www.hhs.gov/ohrp/humansubjects/guidance/45cfr46.html. Accessed 30 Aug 2015.

Data from National Bureau of Statistics of the People’s Republic of China. http://www.stats.gov.cn/tjsj/zxfb/201502/t20150226_685799.html . Accessed 29 Aug 2015.

Emanuel, E. J., Wood, A., Fleischman, A., et al. (2004). Oversight of human participants research: Identifying problems to evaluate reform proposals. Annals of Internal Medicine, 141(4), 282–291.


Ethical Considerations in Biomedical HIV Prevention Trials. http://www.unaids.org/sites/default/files/en/media/unaids/contentassets/documents/unaidspublication/2012/jc1399_ethical_considerations_en.pdf . Accessed 19 Jan 2016.

Ethical Considerations in HIV Preventive Vaccine Research. http://www.who.int/immunization/research/forums_and_initiatives/ethical_considerations_vaccine_research_unaids_en_2000.pdf . Accessed 19 Jan 2016.

Ethical principles and guidelines for the protection of human subjects of research (the Belmont Report). April 18, 1979. http://www.hhs.gov/ohrp/policy/belmont.html . Accessed 29 Aug 2015.

International ethical guidelines for biomedical research involving human subjects. http://www.cioms.ch/index.php/12-newsflash/331-cioms-working-group-of-the-revision-of-the-2002-cioms-ethical-guidelines-for-biomedical-research . Accessed 29 Aug 2015.

International Ethical Guidelines for Epidemiological Studies. http://www.cioms.ch/index.php/publications/available-publications/540/view_bl/65/bioethics-and-health-policy-guidelines-and-other-normative-documents/47/international-ethical-guidelines-for-epidemiological-studies?tab=getmybooksTab&is_show_data=1 . Accessed 19 Jan 2016.

International Conference on Harmonization: E6 Good Clinical Practice (ICH-GCP E6). http://www.ich.org/products/guidelines/efficacy/efficacy-single/article/good-clinical-practice.html . Accessed 30 Aug 2015.

Kass, N., Faden, R., Goodman, S., Pronovost, P., & Beauchamp, T. (2013). The research-treatment distinction: A problematic approach for determining which activities should have ethical oversight. Hastings Center Report, 43 (S1), S4–S15. doi:10.1002/hast.133.

Moreno, J. D. (1997). Reassessing the influence of the Nuremberg Code on American medical ethics. Journal of Contemporary Health Law and Policy, 13 (2), 347–360.

Rosenthal, D., & Perlman, B. (1986). Ethical dimensions of public policy & administration. Polity, 19 (1), 56–73.

United Nations Human Rights, Office of the Commissioner. International covenant on civil and political rights. http://www.ohchr.org/EN/ProfessionalInterest/Pages/CCPR.aspx . Accessed 30 Aug 2015.

United Nations. The Universal Declaration of Human Rights. http://www.un.org/en/documents/udhr/ . Accessed 30 Aug 2015.

World Medical Association. (2013). World Medical Association Declaration of Helsinki: Ethical principles for medical research involving human subjects. JAMA, 310 (20), 2191–2194. doi:10.1001/jama.2013.281053.

Further Readings

Emanuel, E., Wendler, D., & Grady, C. (2000). What makes clinical research ethical? JAMA, 283 (20), 2701–2711.

Faden, R., Kass, N., Goodman, S., Pronovost, P., Tunis, S., & Beauchamp, T. (2013). An ethics framework for a learning health care system: a departure from traditional research ethics and clinical ethics. Hastings Center Report, 43 (s1), S16–S27. doi:10.1002/hast.134.

Presidential Commission for the Study of Bioethics. Moral science: protecting participants in human subjects research. Dec 2011. http://bioethics.gov/sites/default/files/Moral%20Science%20June%202012.pdf . Accessed 30 Aug 2015.


Author information

Authors and Affiliations

Peking University Health Science Center, No. 38 Xueyuan Road, Haidian District, Beijing, China

Haihong Zhang

Institute for Medical Humanities, Peking University, Haidian District, Beijing, China

Yali Cong

Corresponding authors

Correspondence to Haihong Zhang or Yali Cong.

Editor information

Editors and Affiliations

Center for Healthcare Ethics, Duquesne University, Pittsburgh, PA, USA

Henk ten Have


Copyright information

© 2016 Springer International Publishing Switzerland

About this entry

Cite this entry

Zhang, H., & Cong, Y. (2016). Research Policy. In: ten Have, H. (ed.) Encyclopedia of Global Bioethics. Springer, Cham. https://doi.org/10.1007/978-3-319-09483-0_378


DOI: https://doi.org/10.1007/978-3-319-09483-0_378

Published: 19 January 2022

Publisher Name: Springer, Cham

Print ISBN: 978-3-319-09482-3

Online ISBN: 978-3-319-09483-0



  • Open access
  • Published: 28 May 2019

Transforming evidence for policy and practice: creating space for new conversations

  • Kathryn Oliver (ORCID: orcid.org/0000-0002-4326-5258) &
  • Annette Boaz

Palgrave Communications, volume 5, Article number: 60 (2019)



A Correction to this article was published on 29 August 2019


For decades, the question of how evidence influences policy and practice has captured our attention, cutting across disciplines and policy/practice domains. All academics, funders, and publics have a stake in this conversation. There are pockets of great expertise about evidence production and use, which all too often remain siloed. Practical and empirical lessons are not shared across disciplinary boundaries, and theoretical and conceptual leaps remain contained. This means that we are not making the most of the vast and increasing investment in knowledge production. Because existing lessons about how to do and use research well are not shared, funders and researchers are poorly equipped to realise the potential utility of research, and waste resources on, for example, ineffective strategies to create research impact. It also means that the scarce resources available to study evidence production and use are misspent on overly narrow or already-answered questions. Patchy and intermittent funding has failed to build broadly relevant empirical or theoretical knowledge about how to make better use of evidence, or to build the communities required to act on this knowledge. To transform how we as a community think about what evidence is, how to generate it, and how to use it well, we must better capture the lessons being learned in our different research and practice communities. We must find ways to share this knowledge, to embed it in the design of our research systems and practices, and to work jointly to identify genuine knowledge gaps in our understanding of evidence production and use. This comment sets out one vision of how that might be accomplished, and what might result.

Are we investing wisely in research for society?

For decades, conversations between research funders, users, and producers have focused on different aspects of what evidence is, the roles it plays in policy and practice, and the ways in which those roles can be enhanced and supported. Most researchers feel unequivocally that 'more research' is always better, and funders and governments seem to agree (Sarewitz, 2018). Governments are increasingly using investments explicitly to help create the evidence base for better decision-making. For example, funding has been explicitly focused on the United Nations' Sustainable Development Goals (UKRI-UNDP, 2018). The UK government has made several targeted investments, including the £1.5 billion Global Challenges Research Fund to address substantive social problems (Gov.UK, 2016; UKRI, 2017) and, in health, the thirteen (up from nine) Collaborations for Leadership in Applied Health Research and Care, which received £232 million between 2008 and 2019 (NIHR, 2009). This investment looks set to continue, with a further £150 million allocated to the Applied Research Collaborations (NIHR, 2018). In the US, the Trump administration recently signed into law $176.8 billion for research and development, of which $543 million is specifically for translational health research (Science, 2018). These funds are made available to researchers with the effective proviso that the research is targeted towards questions of direct interest to policymakers and practitioners.

There has also been an increase in the infrastructure governments provide, such as scientific advisory posts and professionals (Doubleday and Wilsdon, 2012; Gluckman, 2014), and a range of secondments and fellowship opportunities designed to 'solve' the problem of limited academic-policy engagement (Cairney and Oliver, 2018). The UK Government recently asked departments to produce research priority areas (Areas of Research Interest, or ARIs) to guide future academic-policy collaboration (Nurse, 2015). Yet there has been almost no evaluation of these activities. There is limited evidence about how to build infrastructure that supports impactful evidence use, and about the impact of this investment (Kislov et al., 2018). We simply do not know whether the growth of funding, infrastructure, or initiatives has actually improved research quality, or led to improvements for populations, practice or policy.

Thus, despite our ever-growing knowledge about our world, physical and social, it is not easy to find answers to the challenges facing us and our governments. Spending ever-increasing amounts on producing research evidence is not likely to help if we do not understand how to make the most of these investments. Discussions about wastage within the research system often focus on valid concerns about reproducibility and quality (Bishop, 2019), but until we also understand the broader political and societal pressures shaping what evidence is produced and how, we will not be able to reduce this waste (Sarewitz, 2018). In short, our research systems are not guided by current theory about what types of knowledge are most valuable for addressing societal problems, how to produce useful evidence, or how to use this knowledge in policy and practice settings.

Who knows about how to improve evidence production and use?

Fortunately, even if under-used, there is a significant body of academic and practical knowledge about how evidence is produced and used. Several disciplines take the question of evidence production and use as a core concern, and this inherently transdisciplinary space has become populated by research evidence from different academic and professional traditions, jurisdictions and contexts.

Much of the funded research into knowledge production and use has been conducted in health and health care, and other applied disciplines. Although there are perennial inquiries about the ‘best’ research methods which should inform policy and practice (Haynes et al., 2016 ), this field has offered some very practical insights, from identifying factors which influence evidence use (Innvaer et al., 2002 ; Oliver et al., 2014 ; Orton et al., 2011 ), to identifying types of evidence used in different contexts (Dobrow et al., 2004 ; Oliver and de Vocht, 2015 ; Whitehead et al., 2004 ). Researchers have explored strategies to increase evidence use (Dobbins et al., 2009 ; Haynes et al., 2012 ; Lavis et al., 2003 ), and developed structures to support knowledge production and use—in the UK, see, for example, the What Works Centres, Policy Research Units, Health Research Networks and so forth (Ferlie, 2019 ; Gough et al., 2018 ). Similar examples can be found in the US (Tseng et al., 2018 ; Nutley and Tseng, 2014 ) and the Netherlands (Wehrens et al., 2010 ). Alongside these practical tools, critical research has helped us to understand the importance of diverse evidence bases (e.g., Brett et al., 2014 ; Goodyear-Smith et al., 2015 ), of including patients and stakeholders in decision-making (Boaz et al., 2016 ; Liabo and Stewart, 2012 ), and to contextualise the drive for increased impact outcomes (Boaz et al., 2019 ; Locock and Boaz, 2004 ; Nutley et al., 2000 ).

The social sciences have provided research methods to investigate the various interfaces between different disciplines and their potential audiences. Acknowledging insights from philosophy, critical theory and many other fields (see, e.g., Douglas, 2009), we highlight two particular perspectives. Firstly, policy studies has helped us to understand the processes of decision-making and the (political) role of evidence within them (Dye, 1975; Lindblom, 1990; Weiss, 1979). A subfield of 'the politics of evidence-based policymaking' has grown up, using an explicitly political-science lens to examine questions of evidence production and use (Cairney, 2016b; Hawkins and Ettelt, 2018; Parkhurst, 2017). Political scientists have commented on the ways in which political debate has been leveraged by scientific knowledge, with particular focus on social justice and on the uses of evidence to support racist and sexist oppression (Chrisler, 2015; Emejulu, 2018; Lopez and Gadsden, 2018; Malbon et al., 2018; Scott, 2011).

Secondly, the field of Science and Technology Studies (STS) treats the practice and purpose of science itself as an object of study. Drawing on philosophies of science and sociologies of knowledge and practice, early theorists described science as an esoteric activity creating knowledge through waves of experimentation (Kuhn, 1970; Popper, 1963). This was heavily critiqued by social constructivists, who argued that all knowledge is inherently bound to cultural context and practices (Berger and Luckmann, 1966; Collins and Evans, 2002; Funtowicz and Ravetz, 1993). Although some took this to mean that science is just another way of interpreting reality, of equal status with other belief systems, most see these insights as demonstrating the importance of understanding the social context within which scientific practices and objects are conducted and described (Latour and Woolgar, 2013; Shapin, 1995). Similarly, Wynne showed how social and cultural factors determine what we consider 'good' evidence or expertise (Wynne, 1992). More recently, scholars have focused on how science and expertise are politicised through funding and assessment environments (Hartley et al., 2017; Jasanoff, 2005; Jasanoff and Polsby, 1991; Prainsack, 2018), through the cultures and practices of research (Fransman, 2018; Hartley, 2016), through the modes of communication with audiences, and on the role of scientific advice around emerging technologies and challenges (Lee et al., 2005; Owen et al., 2012; Pearce et al., 2018; Smallman, 2018; Stilgoe et al., 2013).

Are we acting on these lessons?

However, funders and researchers rarely draw on the learning from these different fields; nor is learning shared between disciplines and professions (Oliver and Boaz, 2018). Thus, we have sociologists of knowledge producing helpful theory about the complex and messy nature of decision-making and the political nature of knowledge (e.g., Lancaster, 2014); but this is not drawn on by designers of research partnerships or evaluators of research impact (Chapman et al., 2015; Reed and Evely, 2016; Ward, 2017). This leaves individual researchers with the imperative to do high-quality research and to demonstrate impact, but with little useful advice about how, as individuals or institutions, they might achieve or measure impact (Oliver and Cairney, 2019), leading to enormous frustration and to duplicated and wasted effort. Even more damagingly, researchers produce poor policy recommendations, or naively engage in political debates with no thought about the possible costs and consequences for themselves, the wider sector, or publics.

We recognise that engaging meaningfully with literatures from multiple disciplines is too challenging a labour for many. The personal and institutional investment required to engage with the practical and scholarly knowledge about evidence production and use is—on top of other duties—beyond most of us. Generating consensus about the main lessons is itself challenging, although initial attempts have been made (Oliver and Pearce, 2017 ). Across the diverse literature on evidence use, terms are defined and mobilised differently. Working out what the terms are implying and what is at stake in the alternative mobilisation of these terms is a huge task. Many researchers are only briefly able to enter this broader debate, through tacked-on projects attached to larger grants. There is no obvious career pathway for those who want to remain at this higher level. There are simply too many threads pulling researchers and practitioners back into their ‘home’ disciplines and domains, which prevents people undertaking the labour of learning the key lessons from multiple fields.

Yet the history of research in this area, scattered and patchy though it is, shows us how necessary this labour is if useful, meaningful research is to be done and used (DuMont, 2019). Too much time and energy has been spent investigating questions which have long since been answered: whether RCTs should be used to investigate policy issues; whether we need a pluralistic approach to research design; whether to invest in relationships as well as data production. But governments and universities have also failed to create environments where knowledge producers are welcome and useful in decision-making settings, and where their own staff feel able to freely discuss and experiment with ideas; and universities consistently fail to reward or support those who want to create social change or work at the interfaces between knowledge production and use.

This failure to draw together key lessons also means that the scarce resources allocated to the study of evidence production and use have been misspent. There has been no sustained interdisciplinary funding for empirical research into evidence production and use in the UK, and in the US such funding has existed only over the last 15 years (DuMont, 2019). This has led to a dearth of shared empirical and theoretical evidence, but also to a lack of community, which has had a detrimental effect on the scholarship in this space. All too often, research funding goes towards already-answered questions (such as whether bibliometrics are a good way to capture impact). We must ensure that new research on evidence production and use addresses genuine gaps. That can only be done by making existing knowledge more widely available and working together to generate collaborative research agendas.

An unfortunate side-effect of this lack of community is that many who enter this space do so with the sense that it is a new, 'emerging' field, which will generate silver-bullet solutions for researchers and funders. Because it is new to them, researchers feel it must be new to all, not realising that their own journey has been undertaken by many others before them. For instance, there are many initiatives which claim to be 'newly addressing' the problem of 'evidence use', 'research on research', the 'science of science', 'meta-science', or some other variant. Whether they explore the allocation and impact of research funding and evaluation, the infrastructure of policy research units or the practice of collaborative research, they all make vital contributions. But to claim, as many do, that this is an 'emerging field' illustrates how easy it is, even with the best of intentions, to ignore existing expertise on the production and use of evidence. We must better articulate the difference between these pieces of the puzzle, and the difference those differences make. Too many are claiming that their piece provides the whole picture. In turn, funders feel they have done their part by funding this small piece of research, but remain ignorant of the existing knowledge, and indeed of the real gaps.

Research on evidence production and use is therefore often not as useful as it should be. Because they fail to draw on existing literature, the solutions proposed by most commentators on the evidence-policy/practice 'gap' often do not take into account the realities of complex and messy decision-making, or the contested and political nature of knowledge construction. This leads to a situation where an author synthesising lessons from across the field can end up sharing a set of normative statements that might imply that there has been no conceptual leap in 20 years (see, e.g., French, 2018; Gamoran, 2018).

Evidence and policy/practice studies: our tasks

There are therefore two key tasks for those primarily engaged in researching and teaching evidence production and use for policy and practice: (1) to identify and share key lessons more effectively, and (2) to build a community enabling transdisciplinary evidence to be produced and used, which addresses real gaps in the evidence base and helps decision-makers transform society for the better. We close with some suggestions about possible steps we can take towards these goals.

Firstly, we must better communicate our key lessons. We would like to help people articulate the hard-won, often discipline-specific lessons from their own work for others, and to work with partners to embed them into the design, practice and evaluation of research. For instance, critical perspectives on power can describe the lines of authority and the institutional governance surrounding decision-making (Bachrach and Baratz, 1962; Crenson, 1971; Debnam, 1975); the interpersonal dynamics which determine everything from the credibility of evidence to the placement of topics on policy agendas (Oliver and Faul, 2018; Tchilingirian, 2018; White, 2008); and the practice of research itself, including the ways in which assumed and enacted power leads to the favouring of certain methodologies and narratives (Hall and Tandon, 2017; Pearce and Raman, 2014). How might this translate into infrastructure and funding to support equitable research partnerships (Fransman et al., 2018)? What other shared theory and practical insights might help us transform how we do and use research?

Secondly, we must generate research agendas collaboratively. In our view, the only way to avoid squandering resources on ineffective research on research is to work together to share emerging ideas, and to produce genuinely transdisciplinary questions. We made a start on this task at recent meetings. A 2018 Nuffield Foundation-funded symposium brought together leading scholars, practitioners, policymakers and funders to share learning about evidence use and to identify key gaps. We followed this meeting with a broader discussion at the William T. Grant Use of Research Evidence meeting in March 2019, which has also contributed to our thinking.

We initiated the conversation with a Delphi exercise to identify key research questions prior to the meeting. We refined the list, and during the meeting we asked participants to prioritise these questions. This was a surprisingly challenging process, which revealed that even to reach a common understanding about the meaning of a research question, let alone its importance, discussants had to wade through decades' worth of assumptions, biases, preferences, language nuances and habits.

Based on this analysis, we identify three main areas of work which are required to transform how we create and use evidence (Table 1):

Transforming knowledge production

Transforming translation and mobilisation

Transforming decision-making

The topics below were selected to indicate the broad range of empirical and normative questions which need broader discussion, and are by no means definitive. Of course, much research on some topics has already been done, but we have included them—because even if research already exists, it is not widely enough known to routinely inform research users, funders or practitioners about how to better produce or use evidence. We observe that much of the very limited funding to investigate evidence production and use has gone to either developing metrics (responsible or otherwise, Row 2 column 4) or tools to increase uptake (Row 2, column 4), to the relative neglect of everything else. There are significant gaps which can only be addressed jointly across disciplines and sectors, and we welcome debates, additions, and critiques about how to do this better.

A shared research agenda

As we note above, these topics are drawn from questions proposed and discussed by an interdisciplinary group of scholars, practitioners, funders and other stakeholders. It became clear during this process that many were unaware of relevant research which had already been undertaken under these headings. These topics reflect our own networks and knowledge of the field, so cannot be regarded as definitive. We need and welcome partnership with others working in this space to broaden the conversation as much as possible. We have selected a subset of these topics to illustrate a number of points.

First, that no one discipline or researcher could possibly have the skills or knowledge to answer all of these questions. Interdisciplinary teams can be difficult to assemble, but are clearly required. We need leadership in this space to help spot opportunities to foster interdisciplinary research and learning.

Second, that all of these topics could be framed and addressed in multiple ways, and many have been. Many are discussed, but there is little consensus; or there is consensus within disciplines but not between them. Some topics have been funded and others have not. We feel there is an urgent need to identify where research investment is required, where conversations need to be supported, and where and how to draw out the value of existing knowledge. Again, we need leadership to help us generate collaborative research agendas.

Third, that while we all have our own interests, the overall picture is far more diverse, and that there is a need for all working in this area to clearly define what their contributions are in relation to the existing evidence and communities. A shared space to convene and learn from one another would help us all understand the huge and exciting space within which we are working.

Finally, this is an illustrative set of topics, and not an exhaustive one. We would not claim to be setting the definitive research agenda in this paper. Rather, we are setting out the need to learn from one another and to work together in the future. Below, we describe some examples of the type of initial discussions which might help us to move forward, using our three themes of knowledge production, knowledge mobilisation, and decision-making. We have cited relevant studies which set out research questions or provide insights. By doing so, we hope to demonstrate the breadth of disciplines and approaches which are being used to explore these questions; and the potential value of bringing these insights together.

Firstly, we must understand who is involved in shaping and producing the evidence base. Much has been written about the need to produce more robust, meaningful research which minimises research waste through improving quality and reporting (Chalmers et al., 2014 ; Glasziou and Chalmers, 2018 ; Ioannidis, 2005 ), and the infrastructure, funding and training which surround knowledge production and evaluation have attracted critical perspectives (Bayley and Phipps, 2017 ; Gonzalez Hernando and Williams, 2018 ; Katherine Smith and Stewart, 2017 ). Current discourses around ‘improving’ research focus on making evidence more rigorous, certain, and relevant; but how are these terms interpreted locally in different policy and practice contexts? How are different forms of knowledge and evidence assessed, and how do these criteria shape the activities of researchers?

Enabling researchers to reflect on their own role in the ‘knowledge economy’—that is, the production and services attached to knowledge-intensive activities (usually but not exclusively referring to technological innovation (Powell and Snellman, 2004 ))—requires engagement with this history.

This might mean asking questions about who is able to participate in the practice and evaluation of research. Who is able to ask and answer questions? What questions are asked and why? Who gets to influence research agendas? We know that there are barriers to participation in research for minority groups, and for many research users (Chrisler, 2015 ; Duncan and Oliver, 2017 ; Scott et al., 2009 ). At a global level, how are research priorities set by, for example, international funders and philanthropists? How can we ensure that local and indigenous interests and priorities are not ignored by predominantly Western research practices? How are knowledge ‘gaps’ or areas of ‘non-knowledge’ constructed, and what are the power relationships underpinning that process (Nielsen and Sørensen, 2017 )? There are important questions about what it means to do ethical research in the global society, with honesty about normative stances and values (Callard and Fitzgerald 2015 ), which apply to the practices we engage in as much as the substantive topics we focus on (Prainsack et al., 2010 ; Shefner et al., 2014 ).

It might also mean asking about how we do research. Many argue that research (particularly that funded through responsive-mode arrangements) progresses in an incremental way, with questions often driven by ease rather than public need (Parkhurst, 2017). Is this the most efficient way to generate new knowledge? How does this compare with, for example, random allocation of research funding (Shepherd et al., 2018)? Stakeholder engagement is said to be required for impact, yet we know it is costly and time-consuming (Oliver et al., 2019, 2019a). How can universities and funders support researchers and users to work together long-term, with career progression and performance management untethered from simplistic (or perhaps any) metrics of impact? Is coproduced research truly more holistic, useful, and relevant? Or does inviting different interests to deliberate on research findings, or even research processes, distort agendas and politicise research (Parkhurst and Abeysinghe, 2016)? What are the costs and benefits of these different systems and practices? We know little about whether (and if so how well) each of these modes of evidence production leads to novel, useful, meaningful knowledge, nor how these modes influence the practice or outputs of research.

Transforming evidence translation and mobilisation

Significant resources are put into increasing 'use' of evidence, through interventions (Boaz et al., 2011) or research partnerships (Farrell et al., 2019; Tseng et al., 2018). Yet 'use' is not a straightforward concept. Using research well implies the existence of a diverse and robust evidence base; a range of pathways for evidence to reach decision-makers; users and producers of knowledge who have the capacity and willingness to engage in relationship-building and deliberation about policy and practice issues; and research systems that support individuals and teams to develop and share expertise.

More attention should be paid to how evidence is discussed, made sense of, negotiated and communicated, and to the consequences of different approaches. This includes examining the roles of people involved in the funding of research, through to the ways in which decision-makers access and discuss evidence of different kinds. How can funders and universities create infrastructure and incentives to support researchers to do impactful research, and to inhabit boundary spaces between knowledge production and use? We know that potential users of research may sit within or outside government, with different levels and types of agency, making different types of decisions in different contexts (Cairney, 2018; Sanderson, 2000). Yet beyond 'tailoring your messages', existing advice to academics does not help them navigate this complex system (Cairney and Oliver, 2018). To take this lesson seriously, we might want to think about the emergence of boundary-spanning organisations and individuals which help to interface between research producers (primarily universities, but also civil society) and users (Bednarek et al., 2016; Cvitanovic et al., 2016; Stevenson, 2019). What types of interfacing are effective, and how? How do interactions between evidence producers and users shape both evidence and policy? How might policies on data sharing and open science influence innovation and knowledge mobilisation practices?

Should individual academics engage in advocacy for policy issues (Cairney, 2016a; Smith et al., 2015), using emotive stories or messaging to communicate most effectively (Jones and Crow, 2017; Yanovitzky and Weber, 2018), or rather be 'honest brokers' representing a body of work without favour (Pielke, 2007)? Or should this type of dissemination work be undertaken by boundary organisations or individuals who develop specific skills and networks? There is little empirical evidence about how best to make these choices (Oliver and Cairney, 2019), or about how such choices affect the impact or credibility of evidence (Smith and Stewart, 2017); nor is there good-quality evidence about the most effective strategies and interventions to increase engagement or research uptake by decision-makers, or between researchers and their audiences (Boaz et al., 2011). It seems likely that some researchers may get involved and others stay in the hinterlands (Locock and Boaz, 2004), depending on skills and preference. However, it is not clear how existing studies can help individuals navigate these complex and normative choices.

Communities (of practice, within policy, amongst diverse networks) develop their own languages and rationalities. This will affect how evidence is perceived and discussed (Smallman, 2018). Russell and Greenhalgh have shown how competing rationalities affect the reasoning and argumentation deployed in decision-making contexts (Greenhalgh and Russell, 2006; Russell and Greenhalgh, 2014); how can we interpret local meanings and sense-making in order to communicate better about evidence? Much has been written about the different formats and tailored outputs which can be used to 'increase uptake' by decision-makers (Lavis et al., 2003; Makkar et al., 2016; Traynor et al., 2014), although not with conclusive findings; yet we know very little about how these messages are received. Researchers may be communicating particular messages, but how can we be sure that decision-makers are comprehending and interpreting those messages in the same way? Theories of communication (e.g., Levinson, 2000; Neale, 1992) must be applied to this problem.

Similarly, drawing on psychological theories of behaviour change, commentators have argued for greater use of emotion, narrative and story-telling by researchers in an attempt to influence decision-making (Cairney, 2016b ; Davidson, 2017 ; Jones and Crow, 2017 ). Are these effective at persuading people and if so how do they work? What are the ethical questions surrounding such activities and how does this affect researcher identity? Should researchers be aiming to communicate simple messages about which there is broad consensus?

Discussions of consensus often ask whether agreement is a laudable aim for researchers, or how far consensus is achievable (De Kerckhove et al., 2015; Lidskog and Sundqvist, 2004; Rescher, 1993). We are also interested in the tension between scientific and political consensus, and in how differences in interpretations of knowledge can be leveraged to influence political consensus (Beem, 2012; Montana, 2017; Pearce et al., 2017). What tools can be used to generate credibility? Is evidence persuasive of itself, and can it survive the translation process? Is it reasonable to expect individual researchers to broadcast simple messages about which there is broad consensus, if that is in tension with their own ethical practices and knowledge (even if it is the most effective way to influence policy)? Is consensus required for the credibility of science and scientists, or can an emphasis on similarity in fact reduce the value of research and the esteem of the sector? Is it the task of scientists to surface conflicts and disagreements, and how far does this duty extend into the political sphere (Smith and Stewart, 2017)?

Transforming decision-making, and the role of evidence within it

Finally, we need to understand how research and researchers can support decision-making, given what we know about the decision-making context or culture and how this influences evidence use (Lin, 2008). This means better understanding the roles of professional and local cultures of evidence use, governance arrangements, and public dialogue, so that we can start to investigate empirically-informed strategies to increase impact (Locock and Boaz, 2004; Oliver et al., 2014). This would include empirical examination of individual strategies to influence decision-making, as well as of more institutional infrastructures and roles; case studies of different types of policymaking and the evidence diets consumed in these contexts; and how different people embody different imperatives of the evidence/policy nexus. We need to bring together examples of the policy and practice lifecycles, and examine the roles of different types of evidence throughout those processes (Boaz et al., 2011, 2016).

We want to know what shapes the credibility afforded to different experts and forms of expertise, and how to cultivate credibility to enable better decision-making (Grundmann, 2017 ; Jacobson and Goering, 2006 ; Mullen, 2016 ; Williams, 2018 ). What does credibility enable (greater attention or influence; greater participation by researchers in policy processes; a more diverse debate)? What is the purpose of increasing credibility? What is the ultimate aim of attempting to become credible actors in policy spaces? How far should universities and researchers go—should we be always aiming for more influence? Or should we recognise and explore the diversity of roles research and researchers can play in decision-making spaces?

Ultimately, methods must be found to evaluate the impact of evidence on policy and practice change, and on populations—including unintended or unwanted consequences (Lorenc and Oliver, 2013 ; Oliver et al., 2019 , 2019a ). Some have argued that the primary role for researchers is to demonstrate the consequences of decisions and to enable debate. This requires the development and application of methods to evaluate changes, understand mechanisms, and develop theory and substantive and normative debates, as well as engage in the translation and mobilisation of evidence. It also requires increased transparency to enable researchers to understand evidence use (Nesta, 2012 ), while also allowing others like Sense about Science to check the validity of evidence claims on behalf of the public (Sense about Science, 2016 ).

Next steps and concrete outputs

These illustrative examples demonstrate the vast range of discussions which are happening, and which need to happen, to help us transform how we produce and use evidence. We are not the first to identify the problems of research wastage (Glasziou and Chalmers, 2018) or to emphasise the need to maximise the value of research for society (Duncan and Oliver, 2017). Nor are we the first to note that all parts of the research system play a role in achieving this, from funding (Geuna and Martin, 2003), to research practices (Bishop, 2019; Fransman, 2018), to translational activities (Boaz et al., 2019; Nutley and Tseng, 2014), professional science advice (Doubleday and Wilsdon, 2012) and public and professional engagement (Holliman and Warren, 2017). There have been sustained attempts to build communities and networks to find ways to improve parts of this system (Footnote 1). However, most of these initiatives are rooted in particular disciplines or professional activities. We see a need for a network which bridges these initiatives, helping each to articulate its key lessons for the others, and progressing our conversations about how to do better research about evidence production and use.

Researchers, funders, decision-makers and publics will approach and inhabit this space from different, sometimes very different directions. We do not claim to be writing the definitive account. But we would like to open the door to more critical accounts of evidence production and use which are specifically aimed at multi-disciplinary and sectoral audiences. Our aim is to welcome and support debate, to introduce parts of our diverse community to each other, and to enable our individual perspectives and knowledge to be more widely valued.

We anticipate disagreement and discussion, and support a multitude of ways of approaching the issues we identify above. Some may feel that our energies should be directed to democratising knowledge for all and ensuring that this is mobilised to maximise equality and fairness (Stewart et al., 2018 ). Others may feel that our task is to observe, problematise and critique these processes, rather than engage in them directly (Fuller, 1997 ). Our view is that both normative and critical approaches are vital; as are empirical and theoretical contributions to our understanding of high-level research systems, down to micro-interactions in evidence production and use. Our contention is that we must keep this space vibrant and busy, producing new knowledge together, and learning from each other. This requires investment in research on evidence production and use, in virtual and literal spaces to hold conversations, as well as in capacity and capability. There are significant and important gaps in what we know about evidence production and use, but identifying the particular and specific research agendas for each of these gaps must be a collaborative process.

We also see a need to support those who are new to this space. Many come to the problem of evidence use without any training in the history of research in this space. We see a need to provide an accessible route into these debates, and welcome opportunities to collaborate on textbooks or learning resources to support new students, non-academics and those new to the field.

The Nuffield Foundation meeting which led to this paper demonstrated how valuable such opportunities are for enabling learning and relationship-building through face-to-face interactions. We will continue to create opportunities for greater transdisciplinary and academic-partner conversations, to share learning across spheres of activity and to build capacity, and to use these new perspectives to generate fresh avenues of enquiry, through the new Transforming Evidence collaboration (Footnote 2).

Finally, we argue for increased investment to maximise the learning we already have, and to support more effective knowledge production and use. Too much money and expertise has been wasted, and too many opportunities to build on existing expertise have been squandered. We must find better ways to make this learning accessible, and to identify true knowledge gaps. Indeed, we believe that collaboration across disciplinary and sectoral boundaries is the only way in which this space will both progress and demonstrate its true value. We must stop wasting the limited resources available for understanding how to transform evidence production and use for the benefit of society. Putting what we already know into practice would be an excellent place to start.

Change history

29 August 2019

An amendment to this paper has been published and can be accessed via a link at the top of the paper.

Footnote 1: See, for example, https://www.alliance4usefulevidence.org/, https://www.ingsa.org/, https://4sonline.org/, https://www.metascience2019.org/, http://www.alltrials.net/

Footnote 2: See the Transforming Evidence site, https://transformure.wordpress.com/

References

Bachrach P, Baratz MS (1962) Two faces of power. Am Polit Sci Rev. https://doi.org/10.2307/1952796


Bayley JE, Phipps D (2017) Building the concept of research impact literacy. Evid Policy . https://doi.org/10.1332/174426417x15034894876108

Bednarek AT, Shouse B, Hudson CG et al. (2016) Science-policy intermediaries from a practitioner’s perspective: The Lenfest Ocean Program experience. Sci Pub Policy. https://doi.org/10.1093/scipol/scv008

Beem B (2012) Learning the wrong lessons? Science and fisheries management in the Chesapeake Bay blue crab fishery. Public Underst Sci 21(4):401–417. https://doi.org/10.1177/0963662510374177


Berger PL, Luckmann T (1966) The social construction of reality: A treatise in the sociology of knowledge. Doubleday, Garden City, NY

Bishop D (2019) Rein in the four horsemen of irreproducibility. Nature 568(7753):435–435. https://doi.org/10.1038/d41586-019-01307-2


Boaz A, Baeza J, Fraser A (2011) Effective implementation of research into practice: An overview of systematic reviews of the health literature. BMC Res Notes. https://doi.org/10.1186/1756-0500-4-212

Boaz A, Robert G, Locock L et al. (2016) What patients do and their impact on implementation. J Health Organiz Manag. https://doi.org/10.1108/JHOM-02-2015-0027

Boaz A, Davies H, Fraser A et al. (2019) What works now? Evidence-based policy and practice revisited. Policy press. Available at: https://policy.bristoluniversitypress.co.uk/what-works-now . (Accessed 17 July 2018)

Brett J, Staniszewska S, Mockford C et al. (2014) A systematic review of the impact of patient and public involvement on service users, researchers and communities. Patient 7(4):387–395. https://doi.org/10.1007/s40271-014-0065-0

Cairney P (ed) (2016a) Health and advocacy: What are the barriers to the use of evidence in policy? In: The politics of evidence-based policy making. Palgrave Macmillan, London, UK, pp 51–84. https://doi.org/10.1057/978-1-137-51781-4_3


Cairney P (2016b) The politics of evidence-based policy making. Springer, London, pp 1–137. https://doi.org/10.1057/978-1-137-51781-4

Cairney P (2018) Three habits of successful policy entrepreneurs. Policy Polit 46(2):199–215. https://doi.org/10.1332/030557318X15230056771696

Cairney P, Oliver K (2018) How should academics engage in policymaking to achieve impact? Polit Stud Rev . https://doi.org/10.1177/1478929918807714

Callard F, Des Fitzgerald (2015) Rethinking interdisciplinarity across the social sciences and neurosciences. https://doi.org/10.1057/9781137407962


Chalmers I, Bracken MB, Djulbegovic B et al. (2014) How to increase value and reduce waste when research priorities are set. Lancet 383(9912):156–165. https://doi.org/10.1016/S0140-6736(13)62229-1

Chapman JM, Algera D, Dick M et al. (2015) Being relevant: Practical guidance for early career researchers interested in solving conservation problems. Glob Ecol Conserv 4:334–348. https://doi.org/10.1016/j.gecco.2015.07.013

Chrisler AJ (2015) Humanizing research: Decolonizing qualitative inquiry with youth and communities. J Fam Theory Rev 7(3):333–339. https://doi.org/10.1111/jftr.12090

Collins HM, Evans R (2002) The third wave of science studies. Soc Stud Sci 32(2):235–296. https://doi.org/10.1177/0306312702032002003

Crenson MA (1971) The un-politics of air pollution; a study of non-decisionmaking in the cities. Johns Hopkins Press, Baltimore and London

Cvitanovic C, McDonald J, Hobday AJ (2016) From science to action: Principles for undertaking environmental research that enables knowledge exchange and evidence-based decision-making. J Environ Manage. https://doi.org/10.1016/j.jenvman.2016.09.038


Davidson B (2017) Storytelling and evidence-based policy: Lessons from the grey literature. Palgrave Commun. https://doi.org/10.1057/palcomms.2017.93

De Kerckhove DT, Rennie MD, Cormier R (2015) Censoring government scientists and the role of consensus in science advice: A structured process for scientific advice in governments and peer-review in academia should shape science communication strategies. EMBO Rep 16(3):263–266. https://doi.org/10.15252/embr.201439680


Debnam G (1975) Nondecisions and Power: The Two Faces of Bachrach and Baratz. Am Political Sci Rev 69(03):889–899. https://doi.org/10.2307/1958397

Dobbins M, Robeson P, Ciliska D et al. (2009) A description of a knowledge broker role implemented as part of a randomized controlled trial evaluating three knowledge translation strategies. Implement Sci 4(1). https://doi.org/10.1186/1748-5908-4-23

Dobrow MJ, Goel V, Upshur REG (2004) Evidence-based health policy: context and utilisation. Soc Sci Med. https://doi.org/10.1016/S0277-9536(03)00166-7

Doubleday R, Wilsdon J (2012) Science policy: Beyond the great and good. Nature. https://doi.org/10.1038/485301a .

Douglas H (2009) Science, Policy, and the Value-Free Ideal. University of Pittsburgh Press, Pittsburgh

DuMont K (2019) Reframing evidence-based policy to align with the evidence|William T. Grant foundation. Available at: http://wtgrantfoundation.org/digest/reframing-evidence-based-policy-to-align-with-the-evidence . (Accessed 28 Jan 2019)

Duncan S, Oliver S (2017) Editorial. Res All 1(1):1–5. https://doi.org/10.18546/RFA.01.1.01


Dye TR (1975) Understanding public policy. Prentice-Hall. Available at: http://agris.fao.org/agris-search/search.do?recordID=US201300519645 . (Accessed 18 Jan 2019)

Emejulu A (2018) On the problems and possibilities of feminist solidarity: The Women’s March one year on. IPPR Progress Rev 24(4):267–273. https://doi.org/10.1111/newe.12064

Farrell CC, Harrison C, Coburn CE (2019) What the hell is this, and who the hell are you? role and identity negotiation in research-practice partnerships. AERA Open 5(2):233285841984959. https://doi.org/10.1177/2332858419849595

Ferlie E (2019) The politics of management knowledge in times of austerity. Available at: https://books.google.co.uk/books?hl=en&lr=&id=Ok5yDwAAQBAJ&oi=fnd&pg=PP1&dq=info:XZBJCDoqIowJ:scholar.google.com&ots=Vg1eZHL9e_&sig=fS2Bf8w7VtyDKfZ3InQWq-npbuk&redir_esc=y#v=onepage&q&f=false . (Accessed 14 Feb 2019)

Fransman J (2018) Charting a course to an emerging field of 'research engagement studies': A conceptual meta-synthesis. Res All 2(2):1–49. http://www.ingentaconnect.com/contentone/ioep/rfa/2018/00000002/00000002/art00002#

Fransman J, Hall B, Hayman R et al. (2018) Promoting fair and equitable research partnerships to respond to global challenges. Rethinking research collaborative. Available at: http://oro.open.ac.uk/57134/ . (Accessed 25 Apr 2019)

French RD (2018) Lessons from the evidence on evidence-based policy. Can Public Adm 61(3):425–442. https://doi.org/10.1111/capa.12295

Fuller S (1997) Constructing the high church-low church distinction in STS textbooks. Bull Sci, Technol Soc 17(4):181–183. https://doi.org/10.1177/027046769701700408

Funtowicz SO, Ravetz JR (1993) Science for the post-normal age. Futures. https://doi.org/10.1016/0016-3287(93)90022-L

Gamoran A (2018) Evidence-based policy in the real world: A cautionary view. Ann Am Acad Political Soc Sci 678(1):180–191. https://doi.org/10.1177/0002716218770138

Geuna A, Martin BR (2003) University research evaluation and funding: An international comparison. Kluwer Academic Publishers, Minerva. https://doi.org/10.1023/B:MINE.0000005155.70870.bd

Glasziou P, Chalmers I (2018) Research waste is still a scandal—an essay by Paul Glasziou and Iain Chalmers. BMJ 363:k4645. https://doi.org/10.1136/BMJ.K4645

Gluckman P (2014) Policy: The art of science advice to government. Nature 507(7491):163–165. https://doi.org/10.1038/507163a

Gonzalez Hernando M, Williams K (2018) Examining the link between funding and intellectual interventions across universities and think tanks: a theoretical framework. Int J Polit, Cult Soc 31(2):193–206. https://doi.org/10.1007/s10767-018-9281-2

Goodyear-Smith F, Jackson C, Greenhalgh T (2015) Co-design and implementation research: challenges and solutions for ethics committees. BMC Med Eth. https://doi.org/10.1186/s12910-015-0072-2

Gough D, Maidment C, Sharples J (2018) UK What Works Centres: Aims, methods and contexts. London. Available at: https://eppi.ioe.ac.uk/cms/Default.aspx?tabid=3731 . (Accessed 27 Feb 2019)

Gov.UK (2016) Science and research funding allocation: 2016 to 2020-GOV.UK. Available at: https://www.gov.uk/government/publications/science-and-research-funding-allocation-2016-to-2020 . (Accessed 14 Feb 2019)

Greenhalgh T, Russell J (2006) Reframing evidence synthesis as rhetorical action in the policy making drama. Healthcare Policy|Politiques de Santé. https://doi.org/10.12927/hcpol.2006.17873


Grundmann R (2017) The problem of expertise in knowledge societies. Minerva 55(1):25–48. https://doi.org/10.1007/s11024-016-9308-7

Hall BL, Tandon R (2017) Decolonization of knowledge, epistemicide, participatory research and higher education. Res All 1(1):6–19. https://doi.org/10.18546/RFA.01.1.02

Hartley S (2016) Policy masquerading as science: an examination of non-state actor involvement in European risk assessment policy for genetically modified animals. J Eur Public Policy. https://doi.org/10.1080/13501763.2015.1049196 .

Hartley S, Pearce W, Taylor A (2017) Against the tide of depoliticisation: the politics of research governance. Policy Polit 45(3):361–377. https://doi.org/10.1332/030557316X14681503832036

Hawkins B, Ettelt S (2018) The strategic uses of evidence in UK e-cigarettes policy debates. Evid Policy. https://doi.org/10.1332/174426418X15212872451438

Haynes A, Brennan S, Redman S et al. (2016) Figuring out fidelity: A worked example of the methods used to identify, critique and revise the essential elements of a contextualised intervention in health policy agencies. Implement Sci 11(1). https://doi.org/10.1186/s13012-016-0378-6

Haynes AS, Derrick GE, Redman S et al. (2012) Identifying trustworthy experts: How do policymakers find and assess public health researchers worth consulting or collaborating with?. PLoS ONE 7(3):e32665. https://doi.org/10.1371/journal.pone.0032665 .

Article   ADS   CAS   PubMed   PubMed Central   Google Scholar  

Holliman R, Warren CJ (2017) Supporting future scholars of engaged research. Res All. https://doi.org/10.18546/rfa.01.1.14

Innvaer S, Vist G, Trommald M et al. (2002) Health policy-makers’ perceptions of their use of evidence: a systematic review. J health Serv Res policy 7(4):239–44. https://doi.org/10.1258/135581902320432778

Ioannidis JPA (2005) Why most published research findings are false. PLoS Med 2(8):e124. https://doi.org/10.1371/journal.pmed.0020124

Jacobson N, Goering P (2006) Credibility and credibility work in knowledge transfer. Evid Policy. https://doi.org/10.1332/174426406777068894

Jasanoff S (2005) Judgment under siege: The three-body problem of expert legitimacy. In: Maasen S, Weingart P (eds) Democratization of expertise? Springer-Verlag, Berlin/Heidelberg, pp 209–224. https://doi.org/10.1007/1-4020-3754-6_12

Jasanoff S, Polsby NW (1991) The fifth branch: Science advisers as policymakers. Contemp Sociol 20(5):727. https://doi.org/10.2307/2072218

Jones M, Crow D (2017) How can we use the ‘science of stories’ to produce persuasive scientific stories. Palgrave Commun 3(1):53. https://doi.org/10.1057/s41599-017-0047-7

Kislov R, Wilson PM, Knowles S et al. (2018) Learning from the emergence of NIHR Collaborations for Leadership in Applied Health Research and Care (CLAHRCs): a systematic review of evaluations. Implement Sci 13(1):111. https://doi.org/10.1186/s13012-018-0805-y

Kuhn TS (1970) The structure of scientific revolutions. The physics teacher. https://doi.org/10.1017/CBO9781107415324.004

Lancaster K (2014) Social construction and the evidence-based drug policy endeavour. Int J Drug Policy 25(5):948–951. https://doi.org/10.1016/j.drugpo.2014.01.002

Latour B, Woolgar S (2013) Laboratory life: The construction of scientific facts. 1986. https://doi.org/10.1017/CBO9781107415324.004

Lavis JN, Robertson D, Woodside JM et al. (2003) How can research organizations more effectively transfer research knowledge to decision makers? Milbank Q 81(2):221–248. https://doi.org/10.1111/1468-0009.t01-1-00052

Lee CJ, Scheufele DA, Lewenstein BV (2005) Public attitudes toward emerging technologies: Examining the interactive effects of cognitions and affect on public attitudes toward nanotechnology. Sci Commun 27(2):240–267. https://doi.org/10.1177/1075547005281474

Levinson SC (2000) Presumptive meanings: The theory of generalized conversational implicature. language, speech, and communication series. https://doi.org/10.1162/coli.2000.27.3.462

Liabo K, Stewart R (2012) Involvement in research without compromising research quality. J Health Serv Res Policy. https://doi.org/10.1258/jhsrp.2012.011086

Lidskog R, Sundqvist G (2004) From consensus to credibility: New challenges for policy-relevant science. Innovation 17(3):205–226. https://doi.org/10.1080/1351161042000241144

Lin V (2008) Evidence-Based public health policy. In: Quah, Stella R (eds) International encyclopedia of public health. Elsevier, Oxford. https://doi.org/10.1016/B978-012373960-5.00234-3

Chapter   Google Scholar  

Lindblom CE (1990) Inquiry and change: The troubled attempt to understand and shape society. Yale University Press, JSTOR. http://www.jstor.org/stable/j.ctt1dszwww

Locock L, Boaz A (2004) Research, policy and practice–worlds apart? Soc Pol Soc. https://doi.org/10.1017/S1474746404002003

Lopez N, Gadsden VL (2018) Health inequities, social determinants, and intersectionality. NAM Perspect. 6(12). https://doi.org/10.31478/201612a

Lorenc T, Oliver K (2013) Adverse effects of public health interventions: a conceptual framework. J Epidemiol Community Health 68(3):288–290. https://doi.org/10.1136/jech-2013-203118

Makkar SR, Howe M, Williamson A et al. (2016) Impact of tailored blogs and content on usage of Web CIPHER–an online platform to help policymakers better engage with evidence from research. Health Res Policy Syst 14(1):85. https://doi.org/10.1186/s12961-016-0157-5

Malbon E, Carson L, Yates S (2018) What can policymakers learn from feminist strategies to combine contextualised evidence with advocacy? Palgrave Commun. https://doi.org/10.1057/s41599-018-0160-2

Montana J (2017) Accommodating consensus and diversity in environmental knowledge production: Achieving closure through typologies in IPBES. Environ Sci Policy 68:20–27. https://doi.org/10.1016/J.ENVSCI.2016.11.011

Mullen EJ (2016) Reconsidering the ‘idea’ of evidence in evidence-based policy and practice. European journal of social work 19(3–4):310–335

Neale S (1992) Paul Grice and the philosophy of language. Linguist Philos. https://doi.org/10.1007/BF00630629

Nesta (2012) The red book for evidence. Available at: https://www.nesta.org.uk/blog/red-book-evidence/ . (Accessed 14 Feb 2019)

Nielsen KH, Sørensen MP (2017) How to take non-knowledge seriously, or “the unexpected virtue of ignorance”. Public Underst Sci 26(3):385–392. https://doi.org/10.1177/0963662515600967

NIHR (2009) NIHR collaborations for leadership in applied health research and care (CLAHRCs): implementation plan 5.8. Available at: https://www.nihr.ac.uk/about-us/how-we-are-managed/our-structure/infrastructure/collaborations-for-leadership-in-applied-health-research-and-care.htm . (Accessed 14 Feb 2019)

NIHR (2018) NIHR announces £150m investment in applied health research. Available at: https://www.nihr.ac.uk/news/nihr-announces-150m-investment-in-applied-health-research/8800 . (Accessed 25 Apr 2019)

Nurse P (2015) Ensuring a successful UK research endeavour: A review of the UK Research councils. BIS/15/625, Department for Business, Innovation and Skills, London

Nutley SM, Smith PC, Davies HTO (eds) (2000) What works?: Evidence-based policy and practice in public services. Policy Press, Bristol

Oliver K, Boaz A (2018) What makes research useful? We still don’t know. Available at: https://www.researchresearch.com/news/article/?articleId=1377811 . (Accessed 18 Jan 2019)

Oliver K, Cairney P (2019) The do's and don’ts of influencing policy: a systematic review of advice to academics. Palgrave Commun 5(1):21

Oliver K, Faul MV (2018) Networks and network analysis in evidence, policy and practice. Evid Policy 14(3):369–379. https://doi.org/10.1332/174426418X15314037224597

Oliver K, Pearce W (2017) Three lessons from evidence-based medicine and policy: increase transparency, balance inputs and understand power. Palgrave Commun 3(1):43. https://doi.org/10.1057/s41599-017-0045-9

Oliver K, Innvar S, Lorenc T et al. (2014) A systematic review of barriers to and facilitators of the use of evidence by policymakers. BMC Health Serv Res 14(1):2. https://doi.org/10.1186/1472-6963-14-2

Oliver K, Lorenc T, Innvær S (2014) New directions in evidence-based policy research: A critical analysis of the literature. Health Res Policy Syst 12(1):34. https://doi.org/10.1186/1478-4505-12-34

Oliver K, Tinker J, Lorenc T et al. (2019a) Evaluating unintended consequences: new insights into solving practical, ethical, and political challenges of evaluation. Evaluation (in press)

Oliver K, Kothari A, Mays N (2019) The dark side of coproduction: do the costs outweigh the benefits for health research? Health Res Policy Syst 17(1):33

Oliver A, de Vocht F (2015) Defining ‘evidence’ in public health: a survey of policymakers’ uses and preferences. Eur J Public Health: ckv082. https://doi.org/10.1093/eurpub/ckv082

Orton L, Lloyd-Williams F, Taylor-Robinson D et al. (2011) The use of research evidence in public health decision making processes: Systematic review. PLoS ONE. https://doi.org/10.1371/journal.pone.0021704

Owen R, Macnaghten P, Stilgoe J (2012) Responsible research and innovation: From science in society to science for society, with society. Sci Public Policy. https://doi.org/10.1093/scipol/scs093

Parkhurst J (2017) The politics of evidence: From evidence-based policy to the good governance of evidence. Routledge Studies in Governance and Public Policy. Routledge, London. https://doi.org/10.4324/9781315675008

Parkhurst JO, Abeysinghe S (2016) What constitutes “Good” evidence for public health and social policy-making? From hierarchies to appropriateness. Soc Epistemol 30(5–6):665–679. https://doi.org/10.1080/02691728.2016.1172365

Pearce W, Raman S (2014) The new randomised controlled trials (RCT) movement in public policy: challenges of epistemic governance. Policy Sci 47(4):387–402. https://doi.org/10.1007/s11077-014-9208-3

Pearce W, Grundmann R, Hulme M et al. (2017) Beyond counting climate consensus. Environ Commun 11(6):723–730. https://doi.org/10.1080/17524032.2017.1333965

Pearce W, Mahony M, Raman S (2018) Science advice for global challenges: Learning from trade-offs in the IPCC. Environ Sci Policy 80:125–131. https://doi.org/10.1016/j.envsci.2017.11.017

Pielke RA (2007) The honest broker: Making sense of science in policy and politics. Cambridge University Press, Cambridge. https://doi.org/10.1017/CBO9780511818110

Popper K (1963) Science as Falsification. Conjectures and refutations, readings in the philosophy of science. Routledge, London. https://doi.org/10.2307/3517358

Powell WW, Snellman K (2004) The knowledge economy. Annu Rev Sociol 30(1):199–220. https://doi.org/10.1146/annurev.soc.29.010202.100037

Prainsack B (2018) The “We” in the “Me”: Solidarity and health care in the era of personalized medicine. Sci Technol Hum Values 43(1):21–44. https://doi.org/10.1177/0162243917736139

Prainsack B, Svendsen MN, Koch L et al. (2010) How do we collaborate? Social science researchers’ experience of multidisciplinarity in biomedical settings. BioSocieties 5(2):278–286. https://doi.org/10.1057/biosoc.2010.7

Reed M, Evely A (2016) How can your research have more impact? Five key principles and practical tips for effective knowledge exchange. LSE Impact blog: 1–5. Available at: http://blogs.lse.ac.uk/impactofsocialsciences/2015/07/07/how-can-your-research-have-more-impact-5-key-principles-tips/ . (Accessed 10 July 2018)

Rescher N (1993) Pluralism: against the demand for consensus. Clarendon Press, Oxford University Press, Oxford

Russell J, Greenhalgh T (2014) Being ‘rational’ and being ‘human’: How National Health Service rationing decisions are constructed as rational by resource allocation panels. Health (United Kingdom). https://doi.org/10.1177/1363459313507586

Sanderson I (2000) Evaluation in complex policy systems. Evaluation 6(4):433–454. https://doi.org/10.1177/13563890022209415 .

Sarewitz D (2018) Of cold mice and isotopes or should we do less science? In: Science and politics: Exploring relations between academic research, higher education, and science policy summer school in higher education research and science studies, Bonn, 2018. Available at: https://sfis.asu.edu/sites/default/files/should_we_do_less_science-revised_distrib.pdf

Science (2018) Congress approve largest U.S. research spending increase in a decade. https://doi.org/10.1126/science.aat6620

Scott J, Lubienski C, Debray-Pelot E (2009) The politics of advocacy in education. Educ Policy. https://doi.org/10.1177/0895904808328530

Scott JT (2011) Market-driven education reform and the racial politics of advocacy. Peabody J Educ. https://doi.org/10.1080/0161956X.2011.616445

Sense about Science (2016) Missing evidence. Available at: https://senseaboutscience.org/activities/missing-evidence/ . (Accessed 14 Feb 2019)

Shapin S (1995) Here and everywhere: sociology of scientific knowledge. Ann Rev Sociol. https://doi.org/10.1146/annurev.soc.21.1.289

Shefner J Dahms HF Jones RE (eds) (2014) Social justice and the university. Palgrave Macmillan UK, London. https://doi.org/10.1057/9781137289384

Shepherd J, Frampton GK, Pickett K et al. (2018) Peer review of health research funding proposals: A systematic map and systematic review of innovations for effectiveness and efficiency. PLoS ONE 13(5):e0196914. https://doi.org/10.1371/journal.pone.0196914

Smallman M (2018) Science to the rescue or contingent progress? Comparing 10 years of public, expert and policy discourses on new and emerging science and technology in the United Kingdom. Public Underst Sci. https://doi.org/10.1177/0963662517706452

Article   MathSciNet   PubMed   Google Scholar  

Smith K, Stewart E (2017) We need to talk about impact: Why social policy academics need to engage with the UK’s research impact agenda. J Soc Policy 109–127. https://doi.org/10.1017/S0047279416000283

Smith K, Stewart E, Donnelly P et al. (2015) Influencing policy with research-public health advocacy and health inequalities. Health Inequalities. https://doi.org/10.1093/acprof:oso/9780

Smith KE, Stewart EA (2017) Academic advocacy in public health: Disciplinary ‘duty’ or political ‘propaganda’? Soc Sci Med 189:35–43. https://doi.org/10.1016/j.socscimed.2017.07.014

Stevenson O (2019) Making space for new models of academic-policy engagement. Available at: http://www.upen.ac.uk/blogs/?action=story&id=41 . (Accessed 12 Apr 2019)

Stewart R, Langer L, Erasmus Y (2018) An integrated model for increasing the use of evidence by decision-makers for improved development. Dev Southern Africa. 1–16. https://doi.org/10.1080/0376835X.2018.1543579

Stilgoe J, Owen R, Macnaghten P (2013) Developing a framework for responsible innovation. Res Policy. https://doi.org/10.1016/j.respol.2013.05.008

Tchilingirian JS (2018) Producing knowledge, producing credibility: British think-tank researchers and the construction of policy reports. Int J Polit Cult Soc 31(2):161–178. https://doi.org/10.1007/s10767-018-9280-3

Traynor R, DeCorby K, Dobbins M (2014) Knowledge brokering in public health: A tale of two studies. Public Health 128(6):533–544. https://doi.org/10.1016/j.puhe.2014.01.015

Nutley SM, Tseng V (2014) Building the infrastructure to improve the use and usefulness of research in education. In: Finnigan KS, Daly AJ (eds) Using research evidence in education: From the schoolhouse door to Capitol Hill. Policy implications of research in education, vol. 2. Springer, pp 163–175. https://doi.org/10.1007/978-3-319-04690-7_11

Tseng V, Easton JQ, Supplee LH (2018) Research-practice partnerships: Building two-way streets of engagement. Soc Policy Report. https://doi.org/10.1002/j.2379-3988.2017.tb00089.x

UKRI-UNDP (2018) UKRI-UNDP joint report: ‘How science, research and innovation can best contribute to meeting the sustainable development goals for developing countries’ full application guidance -applications by invitation only. Available at: https://www.ukri.org/research/global-challenges-research-fund/ukri-undp-joint-report-how-science-research-and-innovation-can-best-contribute-to-meeting-the-sustainable-development-goals-for-developing-countries/ . (Accessed 14 Feb 2019)

UKRI (2017) UK strategy for the global challenges research fund (GCRF). Available at: https://www.ukri.org/files/legacy/research/gcrf-strategy-june-2017/%0A%0A

Ward V (2017) Why, whose, what and how? A framework for knowledge mobilisers. Evid Policy. https://doi.org/10.1332/174426416X14634763278725

Wehrens R, Bekker M, Bal R (2010) The construction of evidence-based local health policy through partnerships: Research infrastructure, process, and context in the Rotterdam ‘Healthy in the City’ programme. J Public Health Policy. https://doi.org/10.1057/jphp.2010.33

Weiss CH (1979) The many meanings of research utilization. Public Adm Rev 39(5):426. https://doi.org/10.2307/3109916

White HC (2008) Identity and control: How social formations emerge. Princeton University Press, Princeton. https://doi.org/10.1007/s13398-014-0173-7.2

Whitehead M, Petticrew M, Graham H et al. (2004) Evidence for public health policy on inequalities: 2: Assembling the evidence jigsaw. J Epidemiol Community Health 2004:817–821. https://doi.org/10.1136/jech.2003.015297

Williams K (2018) Three strategies for attaining legitimacy in policy knowledge: Coherence in identity, process and outcome. Public Admin. https://doi.org/10.1111/padm.12385

Wynne B (1992) Misunderstood misunderstanding: Social identities and public uptake of science. Public Understand Sci. 1281–304. https://doi.org/10.1088/0963-6625/1/3/004

Yanovitzky I, Weber M (2018) Analysing use of evidence in public policymaking processes: a theory-grounded content analysis methodology. Evid Policy. https://doi.org/10.1332/174426418x15378680726175


Acknowledgements

We thank the Nuffield Foundation, the Wellcome Trust and the William T Grant Foundation for financial support for a meeting on Transforming the use of Research Evidence, held in London in 2018. We are grateful to both the participants at this meeting and those attending the William T Grant Foundation Use of Research Evidence meeting in Washington 2019. In particular, we very much appreciate the contribution of Kim DuMont, Paul Cairney and Warren Pearce who commented on drafts of this paper before submission. Our thanks to you all.

Author information

Authors and Affiliations

London School of Hygiene and Tropical Medicine, London, UK

Kathryn Oliver

Kingston University, London, UK

Annette Boaz


Corresponding author

Correspondence to Kathryn Oliver.

Ethics declarations

Competing interests

The authors declare no competing interests.


Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/ .


About this article

Cite this article

Oliver, K., Boaz, A. Transforming evidence for policy and practice: creating space for new conversations. Palgrave Commun 5, 60 (2019). https://doi.org/10.1057/s41599-019-0266-1


Received: 19 February 2019

Accepted: 14 May 2019

Published: 28 May 2019

DOI: https://doi.org/10.1057/s41599-019-0266-1






J Clin Transl Sci. 2020 Jun; 4(3)

Bridging the gap between research, policy, and practice: Lessons learned from academic–public partnerships in the CTSA network

Amytis Towfighi

1 Southern California Clinical and Translational Sciences Institute, University of Southern California, Los Angeles, CA, USA

2 Los Angeles County Department of Health Services, Los Angeles, CA, USA

Allison Zumberge Orechwa

Tomás J. Aragón

3 San Francisco Department of Public Health, San Francisco, CA, USA

Marc Atkins

4 Center for Clinical Translational Science, University of Illinois at Chicago, Chicago, IL, USA

Arleen F. Brown

5 University of California Los Angeles Clinical and Translational Science Institute, Los Angeles, CA, USA

6 Northwestern University Clinical and Translational Sciences Institute, Chicago, IL, USA

Olveen Carrasquillo

7 University of Miami Clinical and Translational Sciences Institute, Miami, FL, USA

Savanna Carson

Paula Fleisher

8 University of California San Francisco Clinical and Translational Science Institute, San Francisco, CA, USA

Erika Gustafson

Deborah K. Herman, Moira Inkelas, Daniella Meeker, Doriane C. Miller

9 Institute for Translational Medicine, University of Chicago, Chicago, IL, USA

Rachelle Paul-Brutus

10 Chicago Department of Public Health, Chicago, IL, USA

Michael B. Potter

Sarah S. Ritner

11 Alliance Chicago, Chicago, IL, USA

Brendaly Rodriguez

Anne Skinner, Hal F. Yee, Jr.

Associated Data

For supplementary material accompanying this paper visit http://dx.doi.org/10.1017/cts.2020.23.

A primary barrier to translation of clinical research discoveries into care delivery and population health is the lack of sustainable infrastructure bringing researchers, policymakers, practitioners, and communities together to reduce silos in knowledge and action. As the National Institutes of Health's (NIH) mechanism to advance translational research, Clinical and Translational Science Award (CTSA) awardees are uniquely positioned to bridge this gap. Delivering on this promise requires sustained collaboration and alignment between research institutions and public health and healthcare programs and services. We describe the collaboration of seven CTSA hubs with city, county, and state healthcare and public health organizations striving to realize this vision together. Partnership representatives convened monthly to identify key components, common and unique themes, and barriers in academic–public collaborations. All partnerships aligned the activities of the CTSA programs with the needs of the city/county/state partners by sharing resources, responding to real-time policy questions and training needs, promoting best practices, and advancing community-engaged research and dissemination and implementation science to narrow the knowledge-to-practice gap. Barriers included competing priorities, differing timelines, bureaucratic hurdles, and unstable funding. Academic–public health/health system partnerships represent a unique and underutilized model with potential to enhance community and population health.

Introduction

The translation of research discoveries from "bench to bedside" and into improved health is slow and inefficient [ 1 ]. The attempt to bridge science, policy, and practice has been described as a "valley of death," reflecting few successful enduring outcomes [ 2 ]. Federal investment in basic science and efficacy research dwarfs the investment in health quality, dissemination, and outcomes research [ 3 ]. Although social determinants of health account for approximately 60% of health outcomes [ 4 ], the United States spends a significantly lower percentage of its gross domestic product (GDP) on social services than similar countries with better health outcomes [ 5 ], and only 5% of U.S. national health expenditures are allocated to population-wide approaches to health promotion [ 6 ]. Widespread adoption of evidence into policy and practice is hampered when academic institutions undertake science in controlled settings and conditions. Additionally, evidence-based practices resulting from academic studies often see limited dissemination, even within academic circles. Yet public agencies, including safety-net healthcare systems and departments of public health, must respond to and implement evidence-based policies and health promotion services for populations facing higher burdens of health and healthcare disparities.

More researchers are turning to dissemination and implementation science (D&I) methods to more effectively bridge the research-to-practice gap [ 7 ]. Yet, despite a century of empirical research to advance the translation of research to practice, considerable barriers remain, especially for advancing public health policy and practice [ 8 ].

A primary challenge to addressing the research-policy-practice gap is the lack of sustainable infrastructure bringing researchers, policymakers, practitioners, and communities together to: (1) align the research enterprise with public and population health priorities; (2) bridge healthcare, public health, mental health, and related sectors; (3) engage health systems in research; and (4) develop innovative solutions for health systems. Without a formal mechanism to effectively engage the community, academicians, and public health and healthcare agencies, research fails to address the need among most public health and healthcare agencies to increase the quality of services with existing resources.

Institutions with Clinical and Translational Science Awards (CTSAs) are uniquely positioned to bridge this gap and contribute to care delivery, translation of research into interventions that improve the health of communities, and public health innovation. In 2006, the National Institutes of Health (NIH) launched the CTSA Program to support a national network of medical research institutions, or "hubs," that provide infrastructure at their local universities and other affiliated academic partners to advance clinical and translational research and population health. Hubs support research across disciplines and promote team-based science closely integrated with patients and communities. Their education and training programs aim to create the next generation of translational scientists who are "boundary crossers" and "systems thinkers" [ 9 ]. Through their collaboration with communities, hubs are well situated to identify local health priorities, as well as the resources and expertise to catalyze research in those areas. The CTSA network holds great promise for bridging the research-policy-practice gap.

A number of CTSA hubs have a major emphasis on partnering with city, county, and state health organizations to drive innovations in clinical care and translate research into practical interventions that improve community and population health. We will describe examples from seven CTSA hubs in four cities – Los Angeles, Chicago, Miami, and San Francisco – that have activated their resources toward research, effective service delivery, policy development, implementation, and program evaluation.

Synergy Paper Collaboration

With support from the CTSA Coordinating Center through a “Synergy paper” mechanism, representatives from the seven CTSA hubs and public health/health system partners participated in monthly teleconferences to collaborate on developing a manuscript on this shared topic. The earlier teleconferences included a brief overview by participants of their existing academic–public health/health system partnerships and discussions on shared experiences, lessons learned, and future directions. This led to more in-depth conversations, addressing common themes and both mutual and unique barriers to achieving goals. As the linkage to respective health systems was crucial to this evaluation, authors from each CTSA hub collaborated closely with key public health and health system representatives and received written comments and feedback to integrate into the manuscript. After the elicitation and information sharing processes, members categorized critical factors, challenges, and opportunities for improvement and strategized on recommendations. As a group, members summarized activities and assessed similarities.

CTSA-Public Health System and Health Department Partnerships

The areas of focus spanned the translational spectrum from creation of evidence-based guidelines (T2) to translation to communities (T4) (Table 1). Focus areas included direct research support, program evaluation, implementation research, infrastructure and expertise in data sharing, analytics, and health information technology, community needs assessments, educating or conducting interventions with community health workers (CHWs), community professional development, dissemination science, and policy setting.

Table 1. Partnership activities by city and translational stage

Chicago

Chicago is the third largest city in the United States, with a population of 2.7 million. Approximately 50% of the population is non-white, with one in five people born outside of the United States and 36% speaking a language other than English at home. Twenty percent of the people in Chicago are living in poverty, which includes one in three children. The three CTSA programs in Chicago, at Northwestern University, the University of Chicago, and the University of Illinois at Chicago, formed a formal collaboration over a decade ago to advance community-engaged research across the Chicagoland region. This collaboration, the Chicago Consortium for Community Engagement (C3), is composed of representatives from the community engagement teams of each CTSA, the Chicago Department of Public Health (CDPH), and AllianceChicago, a nonprofit that provides research support to over 60 Federally Qualified Health Centers throughout Chicago and nationally.

In the city of Chicago, there is up to a 17-year gap in life expectancy between community areas that is closely correlated with economic status and race. The CDPH joined C3 in 2016 concurrent with the release of Healthy Chicago 2.0, a citywide, 4-year strategic plan to promote health equity for Chicago's over 2.5 million diverse residents (CDPH HC 2.0) [ 10 ]. The report is a blueprint for establishing and implementing policies and services that prioritize residents and communities with the greatest need. CDPH and the C3 recognized that the success of Healthy Chicago 2.0 would depend, in part, on strengthening the relationship between communities and academic institutions to advance a public health research agenda.

Activities of the C3 include (1) facilitating and supporting university-based research and evaluation of CDPH-sponsored and community-based programs (four to date); (2) jointly developing mechanisms to facilitate dissemination of research opportunities and findings to community audiences; (3) aligning Clinical and Translational Science Institute (CTSI) seed funding opportunities with Healthy Chicago 2.0 priority areas; (4) facilitating collaborations with community-based organizations and community health centers; (5) collaboratively developing and delivering capacity-building workshops on community-engaged research and dissemination strategies; and (6) improving community partner and member understanding of and interest in research. Most notably, the partnership resulted in a new CDPH Office of Research and Evaluation whose lead staff position is jointly funded by the three Chicago CTSA hubs. The staff member in this role currently serves on 11 CTSI research projects and center advisory boards.

The C3 meetings allow for discussion of data analytics related to the Chicago Health Atlas (ChicagoHealthAtlas), which provides public health data for the city of Chicago and aggregated community area data based on Healthy Chicago 2.0 indicators. This provides a unique opportunity to consider social determinants of health by, for example, promoting research examining medical center electronic medical record data in relation to Chicago community-level data at each CTSA program. Moreover, ongoing involvement of CDPH leadership in these discussions provides an opportunity to promote research that will inform Healthy Chicago 2025, the blueprint for Chicago healthcare policy and practices (HealthyChicago2025). Examples include recent discussions with CDPH epidemiologists to add questions regarding attitudes about research participation to the Chicago Health Survey; plans for the Chicago-based rollout of the NIH All of Us research initiative; and local efforts by the three Chicago CTSA programs to drive broad participation in a new local multi-institutional research portal. Lastly, the collaboration includes discussions with representatives from the Alliance for Health Equity (AllianceforHealthEquity), a collaborative of over 30 nonprofit hospitals, health departments, and community organizations that completed a collaborative Community Health Needs Assessment for Chicago and Suburban Cook County to allow partners to collectively identify strategic priorities.
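To make the kind of linkage described above concrete, the sketch below shows one way a research team might join a de-identified electronic medical record extract to community-area indicators of the sort published in the Chicago Health Atlas. It is a minimal illustration only, not the partnership's actual pipeline: the file names, column names (community_area, hardship_index, a1c), and the pandas-based approach are all assumptions made for the example.

```python
import pandas as pd

# Hypothetical de-identified EMR extract: one row per patient, already mapped
# to a Chicago community area (e.g., via address or ZIP geocoding upstream).
emr = pd.read_csv("emr_extract.csv")             # assumed columns: patient_id, community_area, a1c

# Hypothetical community-area indicators in the style of the Chicago Health Atlas.
atlas = pd.read_csv("community_indicators.csv")  # assumed columns: community_area, hardship_index

# Attach area-level social-determinant context to individual-level clinical data.
linked = emr.merge(atlas, on="community_area", how="left", validate="many_to_one")

# Example descriptive check: mean HbA1c by hardship-index quartile.
linked["hardship_quartile"] = pd.qcut(linked["hardship_index"], 4, labels=False)
print(linked.groupby("hardship_quartile")["a1c"].mean())
```

The many-to-one merge keeps the analysis at the patient level while attaching area-level context, which is the basic move behind relating clinical outcomes to community-level social determinants.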

Los Angeles

Los Angeles County (LAC) is the most populous county in the nation, with 10 million residents; more people live in LAC than in 42 states. Three quarters of the county's residents are non-white, more than 30% of residents were born outside the United States, nearly one in five is below the federal poverty line, approximately one in five lacks health insurance, and many speak a language other than English at home. The Los Angeles County Department of Health Services (LAC-DHS), the second-largest municipal health system in the United States, provides care to 700,000 patients annually through 4 hospitals, 19 comprehensive ambulatory care centers, and a network of community clinics. Many physicians serving the DHS facilities are also faculty members at the University of Southern California (USC) and the University of California Los Angeles (UCLA), and DHS hospitals are training sites for physicians at USC and UCLA. The leadership of both the UCLA and USC CTSA hubs works in tandem with the DHS Chief Medical Officer to identify areas of intersection between academic research and the health system.

The parties invest resources in pilot funding for these areas of mutual interest and into two DHS-wide service cores – implementation science and clinical research informatics. Working closely with the DHS Research Oversight Board on policy and procedure development, the DHS Informatics and Analytics Core established new research informatics infrastructure, serving a county-wide clinical data warehouse and supporting 23 research pilot projects to date. The Innovation and Implementation Core facilitates multidisciplinary team science, deploys research methods that are feasible and acceptable in a safety-net health system, supports bidirectional mentoring and training, and develops new academic and public health leaders who can leverage the strengths of both systems. To date, the 18 projects supported by the Innovation and Implementation Core have affected the care provided by over 270 clinicians and outcomes of over 80,000 patients. An exemplary project supported by both cores is a teleretinal screening program that increased diabetic retinopathy screening rates from 41% to 60% and decreased ophthalmology visit wait times from 158 to 17 days [ 11 ]. To incubate and advance such multidisciplinary projects, the USC/UCLA/DHS partnership has created an intramural pilot funding program for projects that test interventions to enhance quality, efficiency, and patient-centeredness of care provided by LAC-DHS. Proposals are evaluated on these criteria, as well as promise for addressing translational gaps in healthcare delivery and health disparities, alignment with delivery system goals, and system-wide scalability. Six pilot grants have been awarded since 2016, addressing topics such as substance use disorders in the county jail, antimicrobial prophylaxis after surgery, and occupational therapy interventions for diabetes.

The Healthy Aging Initiative is an example of a collaborative effort between the UCLA and USC CTSAs, LAC-DHS, the LAC Department of Public Health (LAC-DPH), the City of Los Angeles Department on Aging, California State University, and diverse community stakeholders. The initiative aims to support sustainable change in communities that allows middle-aged and older adults to stay healthy and live independently and safely, with timely, appropriate access to quality health care, social support, and services.

In addition, the Community Engagement cores at both Los Angeles (LA) hubs partner with DHS, DPH, and other LA County health departments in broad-ranging community-facing activities, including community health worker training and outreach, research education workshops based on community priorities, and peer navigation interventions.

Miami

Home to over 6 million people, the South Florida region is the largest major metropolitan area in the State of Florida. Miami-Dade County is unique in that 69% of the county is Hispanic, 20% of persons lack health insurance, and 53% were born outside the US [ 12 ]. Since 2012, the Miami CTSI – comprising the University of Miami, Jackson Memorial Health System, and the Miami VA Healthcare System – has partnered with the Florida Department of Health (FLDOH) to educate and mobilize at-risk communities via the capacity building of culturally and linguistically diverse CHWs. Recognizing that CHWs serve a vital role in bridging at-risk communities and formal healthcare, in 2010 the FLDOH established a Community Health Workers Taskforce (now called the Florida Community Health Worker Coalition (FLCHWC) and incorporated as a nonprofit in 2015). By 2015, the Coalition had developed a formal credentialing pathway for CHWs in the state. As a key member of the task force, the Miami CTSI provided considerable and essential input into that process. Since then, the Miami CTSI has helped develop CHW educational programs that meet training requirements on core competencies and electives for CHW certification or renewal. These programs, developed in partnership with training centers, clinics, local health planning agencies, and the FLDOH, are aimed at expanding the local CHW healthcare workforce's capacity to address health conditions related to health disparities (e.g., social determinants of health, communication skills, motivational interviewing, and oral and mental health awareness, among others).

The Miami CTSI has been partnering with the FLDOH to develop condition-specific or disease-specific training in response to emergent public health concerns of local county and state health departments. In 2016, when the Zika epidemic in Latin America arrived in Florida, the Miami CTSI developed a Zika/vector-borne disease prevention training module for CHWs that was delivered in both English and Spanish across Miami-Dade County in a short timeframe. That partnership also facilitated a Zika Research Grant Initiative that awarded 12 Florida Department of Health (DOH) grants to University of Miami investigators. Totaling over $13M, the grants focused on vaccine development, new diagnostic testing or therapeutics, and dynamic change team science. In 2018, the Miami CTSI also worked with the FLCHWC and the FLDOH to develop opioid epidemic awareness modules for CHWs. The Miami CTSI has also worked with the FLDOH on HIV workforce development. The training modules that the Miami CTSI helped develop are now offered by the FLDOH. In turn, various University of Miami CTSI-sponsored research projects now have their CHWs undergo the FLDOH HIV training, which the Miami CTSI initially helped develop.

The Miami CTSI also partners with the FLDOH and the Health Council of South Florida to perform community health needs assessments and shares data with the One Florida Clinical Research Consortium (spearheaded by the University of Florida CTSA). The FLDOH is a critical stakeholder in this consortium.

San Francisco

San Francisco is a county and city under unitary governance, with an ethnically diverse population of about 850,000 residents. It has many health sector assets, including a local public health department, a health sciences university (University of California, San Francisco [UCSF]), hospitals and health systems, and robust community-based organizations. Nonetheless, San Francisco has prominent health disparities. For example, relative to whites, hospitalization rates for diabetes are seven times higher among African Americans and twice as high among Latinos [ 13 ]. The vision of the San Francisco CTSI Community Engagement and Health Policy Program is to use an innovative Systems Based Participatory Research model, which integrates community-based, practice-based, and policy research methods to advance health equity in the San Francisco Bay Area. This program strengthens the ability of academicians, the community, and the Department of Public Health to conduct stakeholder-engaged research through several strategies.

First, the San Francisco Health Improvement Partnership (SFHIP) is a collaboration between academic, public, and community health organizations of San Francisco, formed in 2010 "to promote health equity using a novel collective impact model blending community engagement with policy change" [ 13 ]. Three backbone organizations – the San Francisco Department of Public Health, the University of California San Francisco CTSI, and the San Francisco Hospital Council – engage ethnic-based community health coalitions, schools, faith communities, and other sectors on public health initiatives. Using small seed grants from the UCSF CTSI, working groups with diverse membership develop feasible, scalable, sustainable evidence-based interventions, especially policy and structural interventions that promote improved longer-term health outcomes. The partnership also includes community health needs assessments and a comprehensive, online data repository of local population health indicators. Results of past initiatives have been powerful. For example, the development of policy and educational interventions to reduce consumption of sugar-sweetened beverages led to new policies and legislation. These included warning labels on advertisements, a new "soda tax," new filtered tap water stations at parks and other venues in low-income neighborhoods, and movement toward healthy beverage policies at UCSF, Kaiser Permanente, and other large hospitals. The partnership also developed environmental solutions for reducing disparities in alcohol-related health and safety problems; outcomes included an alcohol outlet mapping tool that powers health research, routine blood alcohol testing in a trauma center, and influence on a new state ban on the sale of powdered alcohol. This initiative was spearheaded by community members in neighborhoods affected by high rates of alcohol-related violence, health problems, and public nuisance activities, in collaboration with the San Francisco Police Department and other stakeholders. Using the SFHIP model, UCSF CTSI supported the development of the San Francisco Cancer Initiative, which provided science that has been used to support major community-based policy initiatives, such as the banning of menthol cigarettes in San Francisco, and more targeted clinical initiatives, such as an effort to increase colorectal cancer screening and follow-up activities in local community health centers [ 14 ].

UCSF CTSI has also supported the San Francisco Department of Public Health in the development of its Healthy Cities Initiative, funded by Bloomberg Philanthropies, which seeks to link geocoded electronic health records data across multiple health systems with other neighborhood data to identify community-based strategies to address population health challenges across the city.
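As a rough illustration of what linking geocoded health records with neighborhood data can involve, the sketch below assigns patient points to neighborhood polygons with a spatial join and summarizes a clinical indicator by neighborhood. This is a hedged example under stated assumptions, not the Healthy Cities Initiative's actual data architecture: the file names, column names, coordinate reference system, and the use of geopandas (a release in which sjoin accepts a predicate argument) are all assumptions.

```python
import geopandas as gpd
import pandas as pd

# Hypothetical geocoded, de-identified EHR extract (one row per patient).
ehr = pd.read_csv("ehr_geocoded.csv")  # assumed columns: patient_id, lon, lat, condition_flag
points = gpd.GeoDataFrame(
    ehr,
    geometry=gpd.points_from_xy(ehr["lon"], ehr["lat"]),
    crs="EPSG:4326",  # assuming WGS84 longitude/latitude
)

# Hypothetical neighborhood boundaries with attached indicators (e.g., census-derived measures).
neighborhoods = gpd.read_file("neighborhoods.geojson")  # assumed columns: nbhd_id, poverty_rate, geometry
neighborhoods = neighborhoods.to_crs(points.crs)        # align coordinate systems before joining

# Spatial join: tag each patient point with the neighborhood polygon it falls within.
linked = gpd.sjoin(points, neighborhoods, how="left", predicate="within")

# Example descriptive output: prevalence of the condition flag by neighborhood.
prevalence = linked.groupby("nbhd_id")["condition_flag"].mean().sort_values(ascending=False)
print(prevalence.head())
```

The same join pattern extends naturally to pooling extracts from multiple health systems, provided the records are geocoded to a common coordinate system before the neighborhood assignment.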

Critical Factors and Facilitators

The participating hubs share some foundational similarities and facilitators, although their specific goals and activities are diverse. Across multiple cities, numerous factors were commonly recognized as critical to the success of the partnerships (Table 2). First and foremost, in all locales, the needs of the departments of public health and health services shaped the activities of the CTSA hubs. All partnerships were driven by the priorities of the front-line care providers, patients, and/or the public at large, reflecting the specific goals of each health department. Projects originated with problems as identified by healthcare system leaders and clinicians, public health officials, and/or community members. For example, the USC and UCLA CTSAs in Los Angeles collaborated with the LAC-DHS to use implementation science methods to develop, implement, and evaluate sustainable solutions to health system priorities. In San Francisco, the UCSF CTSI initiated the SFHIP program, but leadership and funding responsibilities were turned over to the San Francisco Department of Public Health to ensure that community stakeholders drove the agenda. The CTSAs provided value to the public health/health systems by serving as conveners; offering expertise in informatics, community health needs assessments, implementation, evaluation, and dissemination; providing education and technical support; collaborating on policy development (whether organizational or governmental policy); and leveraging relationships with community organizations.

Table 2. Key critical factors, facilitators, and barriers (CTSA, Clinical and Translational Science Award)

Since the partnerships developed in response to the public health/health systems’ needs, their goals and activities varied. While the Miami partnership focused on developing workforce capacity, the San Francisco partnership collaborated on policy changes, and the Chicago and Los Angeles partnerships concentrated on building research infrastructure and fostering collaborative research opportunities aligned with public health and health system priorities. By using the academic tools of community-engaged research, healthcare delivery science, implementation, and dissemination research in real-world settings, the partnerships are primed for disruptive innovations in healthcare.

Second, each health department had at least one designated "champion" who helped prioritize partnership activities and advocated for the partnerships to promote tangible and immediate real-life impact. For example, the LAC-DHS Chief Medical Officer has been an enthusiastic champion for the Los Angeles partnership. He co-wrote the pilot funding opportunity request for application (RFA) and offered detailed feedback to each applicant. He was instrumental in establishing and facilitating operations and policy development for the two service cores. His perspective and influence have been critical for initiating the program, refining the program each year, and promoting the research resources available to DHS clinicians. In addition, the UCLA hub created a population health program that is co-led by the Director of Chronic Disease and Injury Prevention within the LAC Department of Public Health. In Chicago, the ongoing involvement of health department leadership with the three CTSA programs through their C3 collaboration promoted a substantial shift in C3 priorities and activities to align more closely with health department programs and practices. This ultimately led to an agreement for the CTSA programs to jointly fund a new position at the health agency to serve as a liaison between the health department and the CTSA programs, despite a city-wide hiring freeze due to statewide budget constraints. In Florida, a Centers for Disease Control and Prevention Policy, Systems and Environment Change grant to the DOH Comprehensive Cancer Control Program created a staff position that was critical to establishing consistent community engagement in developing the capacity of the Florida CHW Coalition to create a credentialing program, to ongoing statewide involvement in promoting CHWs, and to elevating the entire south Florida region's effort to incorporate CHWs in prevention practice and access to care. The rest of the state learned from Miami's efforts, and Miami was strengthened with the support of the statewide coalition. The staff member was able to devote three-quarters of her time to Coalition development, which unfortunately did not continue once the grant ended.

On the academic side, CTSA principal investigators and senior administrators also dedicated significant time and effort to the initiatives beyond monetary resources. CTSA leadership collaborated with the public health/health system champions to set the vision for the initiative, viewed the partnership as a priority for their hub, and exerted the influence needed to drive initiatives forward.

Third, the CTSAs needed the capacity to respond rapidly to key stakeholders and requests. The partnerships have been particularly effective when they have been nimble and responsive to the evolving needs of the local health departments, health systems, and communities. For example, in Miami, the CTSA core trained CHWs and was primed to respond with additional disease-specific training in the setting of the Zika outbreak. The UCLA CTSA offered scientific expertise to the Department of Public Health regarding vaping and e-cigarettes.

Fourth, partnerships can ensure that the community's voice is heard. By leveraging CTSAs' Community Engagement Cores and the longstanding partnerships between public health/healthcare systems and community organizations, the community's priorities and concerns can be brought to light. For example, the UCSF CTSA leveraged long-term trusting relationships with community groups to engage in reducing disparities in alcohol-related harms. Similarly, in Chicago, the Department of Public Health provided the CTSA representatives with an early view of a new citywide health initiative, Healthy Chicago 2025, to initiate ongoing CTSA involvement in planning and implementation. By being responsive to initiatives and priorities, CTSA goals can be harmonized with partners' operational objectives.

Fifth, as the healthcare landscape in the United States evolves, these partnerships offer opportunities to enhance translation of evidence to practice, study the effects of various payment models, and inform policy.

Other critical factors and facilitators included a common commitment among all parties to address local health disparities; funding in the form of pilot grants tailored to the needs of the public partners, which several CTSA hubs offered; and the maturity of the partnership. In Los Angeles, responsiveness to the pilot funding opportunity improved with each iteration of the funding cycle. In all cities, longer relationships increased trust among the partners.

Lessons Learned, Barriers, Gaps, and Challenges

Numerous barriers have become evident in the infancy of these academic/public health/health system partnerships (Table 2). When evaluating the programs' experiences, several themes emerged around challenges and the solutions employed to overcome them.

First, there are often competing priorities between the public health/health system and academic partners. All partnerships addressed this by finding areas where the public partners' priorities aligned with academic expertise. In Miami, the partners developed disease-specific training in response to emergent public health concerns of local county and state health departments. In Los Angeles, the CTSA pilot funding criteria and prioritization topics were co-developed with DHS. In Chicago, seed funding projects required alignment with C3/CDPH priorities.

Second, partners' timelines often differ substantially. The public health/healthcare system cannot adjust its pace to accommodate traditional academic endeavors. Individuals making operational decisions typically do not have the luxury of time to collect pilot data and study intervention implementation and outcomes on conventional research timelines. They are given directives to implement changes broadly and swiftly. Nevertheless, integration with academic endeavors can be achieved. One approach is to emphasize underutilized implementation and improvement research methods designed to generate both locally applicable and generalizable knowledge. Another is to embed academicians in the public health or healthcare system to ensure that they are involved in the design, planning, implementation, evaluation, and dissemination of initiatives. Academicians may be frustrated by hasty implementation and limitations in the evaluation of outcomes, yet public health and health systems do want to base their decisions on good science. Funding cycles and grant review criteria are not consistent with business timelines and priority setting, and they often do not value the emerging scientific methods designed for learning in systems (e.g., implementation science, improvement science, design science). It is possible to undertake rigorous science that balances the competing operational needs and cultures of health departments and universities when these partners focus on appropriate methods and problem-solving. In addition, researchers may have difficulty maintaining their academic credentials, as gauged by grant portfolios and publication records. This is an important issue for CTSA program leadership, locally and nationally, to address by advancing changes in university tenure policies that encourage and promote health services, community-based, and community-engaged research [ 15 ]. To that end, sustained and systematic collaboration with local health departments can alleviate logistical barriers to community-engaged research and help fulfill the CTSA mandate to promote research that informs policy and practice.

Third, it is critical to skillfully navigate bureaucratic hurdles when working with government entities. Several CTSAs have found it particularly effective to appoint a liaison to the public health/healthcare system. Liaisons acted as bridges between partners, drawing on expertise in multiple areas and on access to resources across the partnership's sites. As employees of health departments, often with dual appointments at the partnering university, liaisons understand the needs of health departments intimately. With their connections and operational experience, they can act as navigators and advisors to academicians. For example, in Chicago, the new lead of Research and Evaluation at CDPH and co-chair of C3 helps researchers identify funding opportunities, disseminate research findings, and broker relationships. In addition, she serves on the CTSI community governance bodies for all three Chicago CTSIs. In Los Angeles, each of the CTSAs (UCLA and USC) appointed as its liaison an academician who practices in the DHS system. Moreover, the DHS Chief Medical Officer not only served as a supporter and champion internally but also sat on the advisory committees for both the USC and UCLA CTSA hubs, supporting a bidirectional strategic relationship. This is reflected in infrastructure for data services and in provider workgroups promoting institutionally tailored evidence-based practices and tools [ 16 , 17 ]. In Miami, a trusted staff member served as the primary, long-term point of contact and, as an extension of the liaison model, helped train a larger workforce of CHWs. UCSF explored creating a joint position and subsequently developed "Navigator" roles.

Agreements that make programs sustainable often have to be approved by politicians and health department leaders, and the process for obtaining approval may be complex and time-consuming. One strategy for addressing these bureaucratic hurdles is to leverage the tools developed in other partnerships. We have compiled resources, including a Request for Proposals and a position description, that may be helpful to others developing similar collaborations (see Supplementary Materials). Where longstanding educational partnerships and agreements are already in place, agreements and policies devoted to supporting translational research can build on established relationships and on roles that place faculty in research leadership positions.

Fourth, unstable funding threatens the success of these partnerships. Funding is a critical factor in developing informatics and research infrastructure, workforce development, and research and evaluation. Key positions, such as the liaison between the CTSI and the public health/health system, should be prioritized to ensure the success of these partnerships. Strategies to address this barrier include leveraging existing resources, applying for funding from diverse sources, and being creative with resource utilization. On the other hand, mechanisms and policies for accepting grant funding into operating budgets can also prove challenging. Three of the four LAC-DHS hospitals have an established research foundation to administer grant funding for clinician-researchers; however, these entities do not have contact with the healthcare budgeting organizations that would support resources for information technology, space, or support staff. The unpredictability of research funding is reflected in the absence of investment in, or awareness of, procedures for accepting relatively small investigator-initiated awards.

Fifth, for CTSIs collaborating with public entities, navigating the political landscape presents unique challenges. Examples include policy initiatives that could threaten corporations and well-funded industries; projects that span various public entities' purviews (e.g., Public Health vs. Health Services vs. Mental Health); responding to politicians' priorities; and shifting gears when administrations change. Partnerships that rely heavily on a single influential champion, without associated agreements, policies, and procedures, are vulnerable to leadership changes. Strong stakeholder engagement and a well-developed infrastructure are critical to navigating the political sphere successfully and to ensuring sustainability.

Finally, academicians' tools may not be well-suited to the public health systems' needs. For example, in our Los Angeles partnership, although the UCLA and USC CTSIs had knowledge and expertise in implementation science, LAC-DHS was more interested in health delivery science, execution, operationalization, and evaluation. Rather than detailed evaluation of the facilitators and barriers of implementation, they desired broad and swift implementation of interventions that reduced resource utilization while improving quality of care. Academicians have typically used an incremental approach, which often requires additional resources, whereas LAC-DHS was more interested in disruptive approaches. We found that the best way to address this lack of alignment between needs and academic tools was to connect researchers with leaders in the public healthcare system early in proposal development and with methodologists who focus on applied science in public delivery systems. Other potential solutions include expanding educational offerings for academicians, providing mentored hands-on experience, embedding researchers in public health/healthcare settings, training health department leaders in research, training community members in results dissemination, and offering incentives for cost savings.

Unique CTSA hub collaborations with city, county, and state health organizations are driving innovations in health service delivery and population health in four cities. A common element among all partnerships was the CTSA hubs' alignment of activities with the needs of the city/county partners. Other critical factors included having designated "champions" in health departments, the CTSAs' ability to respond quickly to evolving needs, and a common commitment to addressing local health disparities. Most programs encountered similar barriers, including competing priorities, differing timelines, bureaucratic hurdles, and unstable funding. The academic–public partnerships have explored numerous strategies for addressing these barriers. These partnerships offer a model for innovatively disrupting healthcare and enhancing population health.

Finding areas of common ground is key. While universities and public health/healthcare systems differ in their priorities, timelines, and modus operandi, successful partnerships are poised to answer some of the critical questions in health policy, including how to deliver critical services to populations in a cost-effective manner and how to address the needs of the public. Many of these challenges are not unique to partnerships between academic centers and public systems. Some of the experiences apply equally to academic medical centers that are increasingly acquiring large private healthcare organizations without an established culture of education and research. If CTSA programs are to have a substantive impact on population health, significant expansion beyond academic medical centers is needed to address the full range of social determinants of health (e.g., housing instability, concentrated poverty, chronic unemployment). Public health departments are ideal partners to consider the bidirectional relation of social determinants and health disparities [ 18 ].

Limitations

First, this manuscript focused on partnerships between CTSAs and public entities such as Departments of Public Health or Departments of Health Services; CTSAs also have broad-ranging community engagement activities that fall outside this focus. Second, public health and health systems have extensive collaborations with researchers and with local, national, and international foundations beyond the CTSAs. PCORnet, for example, has funded nine Clinical Research Networks, several of which include collaborations between universities and public health systems. Although these partnerships have been impactful, they are beyond the scope of this paper. Third, we have detailed the experiences of seven CTSAs in four large metropolitan areas. These findings and experiences may not be generalizable to other settings, particularly nonurban areas. Fourth, while we have presented the experience of seven CTSAs, other CTSAs may have partnerships with their local city/county/state health departments. Rather than provide a comprehensive review of all CTSA/public health/health system partnerships, we hope to stimulate more discussion around these partnerships.

Future Directions

There are several ways in which collaborations among CTSA programs and public sector health departments can be optimized. First, CTSA programs can prioritize opportunities for workforce development in policy-relevant research through sponsored internships and practica for graduate students and faculty. For students, these training opportunities could be aligned with core program goals across CTSA-affiliated programs in health-related fields (e.g., medicine, public health, psychology, dentistry) to provide a community perspective and promote awareness of public sector needs early in training. For faculty, innovative funding opportunities could be modeled on sabbatical leaves of absence, perhaps aligned with pilot seed funding for promising research proposals.

In addition, formal lines of communication between health departments and CTSA program leadership could be encouraged by the National Center for Advancing Translational Sciences (NCATS) in RFA announcements and program reviews. Encouraging each CTSA program to have at least one public sector representative on its external advisory board could also expedite cross-channel communication. Prioritizing rapid and consistent communication could help bridge the gap between biomedical researchers and public health/health system leadership. This is especially important for early-stage research, to encourage an appreciation of community resources and needs and to anticipate common barriers to implementation research [ 19 ]. Ongoing feedback between CTSA and health department leadership could also open new opportunities for bidirectional exchange, leading to new research opportunities as well as adaptations in ongoing research to improve community-level outcomes.

A related challenge to sustaining change is the paucity of focus on execution and operationalization. Historically, a missing link has been the failure to acknowledge and address the challenges lying between an idea or proven intervention and its implementation. Randomized trials in controlled academic settings can, at best, be considered proofs of concept in other settings. In addition to implementation science, a key focus should be improvements in effective operational management and culture change. The DHS-USC-UCLA partnership has worked to close this gap in two ways. First, DHS has hired, coached, and empowered multiple academically trained physicians from both the UCLA and USC CTSI hubs; these academically trained health services researchers have become key DHS leaders and operational managers within the clinical care delivery system. Second, the partnership has used behavioral economics as an efficient and effective culture change tool in healthcare delivery. The sustainability and retention of these types of programs and partnerships may be less financial and more cultural: a "tipping point" may require organizational dissemination and incentive alignment from the top down to cultivate operational mechanisms and durable pathways to success.

Overall, the goal is to promote research that informs health policy and to encourage health policy that is informed by research. These collaborations show that this goal is best accomplished through a strategic alliance of CTSA programs and health departments. As is evident from the examples of these four cities, new opportunities for shared data and resources emerge from ongoing discussions of shared priorities. The health departments benefit from the allocation of CTSA program trainees and funding, and the CTSA programs gain valuable insight into, and access to, community health and health system needs and resources. Ultimately, these alliances promote the overall goal of translational science: to inform and improve population health.

Acknowledgments

The authors wish to thank the public officials, researchers, administrators, champions, liaisons, and community members who contributed to the success of each partnership. We also wish to thank the staff at the Center for Leading Innovation and Collaboration (CLIC) for their support in developing this manuscript.

This work was funded in part by the University of Rochester CLIC, under Grant U24TR002260. CLIC is the coordinating center for the CTSA Program, funded by the NCATS at the National Institutes of Health (NIH). This work was also supported by grants UL1TR001855, UL1TR001881, UL1TR002736, UL1TR001872, UL1TR001422, UL1TR000050, UL1TR002389 from NCATS. This work is solely the responsibility of the authors and does not necessarily represent the official views of the NIH.

Supplementary material

Disclosures.

The authors have no conflicts of interest to declare.
