2. Overview
• Evolution of evidence-based policy in the UK
• DFID’s approach to evidence-based development practice
• Evidence-aware humanitarian action: what is DFID doing?
3. Evidence-based policy in the UK
• 1999 and 2001 key White Papers ‘modernising government’
• Increased investment in publicly funded research – with the
expectation of practical application
• Learning from medicine
4.–7. [Diagram, built up across slides: elements of an evidence-aware system — agreement on the nature of evidence (RCTs); synthesis of evidence (e.g. Cochrane); effective dissemination of knowledge (training and guidance materials; NICE and centre-specific guidance); commissioning frameworks — feeding into a decision.]
9. Evidence-based development: DFID’s approach
• 2006 White Paper provided for increased investment
• 2008 research strategy published
• Building the systems for evidence-based development
– Invest in primary evidence base
– Invest in synthesis
– Increase incentives for technical excellence and use of evidence
10. Why is evidence-aware humanitarian practice important?
• Principles
• Effectiveness
• Accountability
12.–14. [Diagram, built up across slides: the same elements mapped to key questions — primary evidence base and agreement on the nature of evidence (“What is good evidence?”); commissioning frameworks (“Do we have enough good evidence?”; “What are the main priorities for more evidence?”); synthesis of existing evidence (“What do we really know?”); effective dissemination of knowledge via training and guidance materials (“How can we make evidence accessible to practitioners?”); professional and skilled practice, with incentives to use evidence in decision-making (“What are the incentives to use evidence?”).]
A couple of months back I booked a plane ticket to come here. In making that decision quite a lot of different factors were in play. I like Washington; it’s a beautiful place, and each time I’ve come here I’ve felt privileged to be in a place with so much history and such political vibrancy. I felt too that by attending this meeting, I would be able to catch up with old friends and colleagues, many of whom I’ve not seen for some time. So, at one level, my decision was an emotional one – a fast thought, as Danny Kahneman would say. In slightly slower time, the decision was affected by the fact that the title of the meeting corresponded almost precisely with my role within DFID, which is about how to enable the organisation to adopt an evidenced approach to its humanitarian work. The meeting provided an opportunity to tell people about what we are trying to do, and to test it out. My experience told me that this was a good thing to do. Having straddled academe and public policy for some years, I knew that building networks is key to effecting change. My experiential learning is validated by a large literature regarding the impact of evidence and research on policy. This shows that it is very rare that a single piece of evidence changes policy. Rather, I am a subscriber to what has been called the “enlightenment” model: that policy change can take place when there is a steady aggregation of evidence over a period of time, which percolates into policy spaces through a variety of means. To use a geological metaphor, the influence of research on policy is more akin to the formation of stalactites than it is to earthquakes. In short, then, my decision to book the flight and attend the meeting was informed by a combination of gut and head, a messy mix of feeling, experiential learning and some good evidence. Most decisions, including very big ones about how best to respond to humanitarian need, are made on such a basis.
They are what Davies and Nutley call “evidence-aware” rather than evidence-based decisions.
My contention today is that to date in the humanitarian sector we have had to rely very heavily on instinct and on an apprentice-like model of learning – that is, a model where we do what we saw our more experienced colleagues doing. I would suggest that while this model delivered a lot, in the future we will need to do both better and differently. Specifically, what I would like to do is to share with you how DFID is seeking to support more evidence-aware approaches to humanitarian action. In doing so, I will cover three main things: first, a brief overview of the emergence of evidence-based approaches, with a particular emphasis on UK public policy and how DFID is seeking to apply this in the international development arena; second, a mapping of why evidence-based approaches are likely to be useful in the humanitarian domain; and finally, how DFID is seeking to link its capabilities as a commissioner of research, policy actor and major donor with strong field links in order to foster an evidence-aware approach to humanitarian action.
In 1999, the Government published a White Paper called Modernising Government. This aimed to promote a more evidence-based approach to public policy, encouraging officials to have “…More ideas, more willingness to question inherited ways of doing things, better use of evidence and research in policy-making and better focus on policies that will deliver long term goals”. In 2001, the Cabinet Office launched a further report entitled Better Policy-Making. In defining evidence-based approaches, it had a particular emphasis on ensuring that those responsible for public policy and service delivery were able to access and use evidence not only in policy design, but to guide implementation and change the behaviour of practitioners. In developing evidence-based approaches, public policy makers drew significantly on the experience from medicine, which had undergone a near revolution in terms of its practice from the 1980s onwards.
Davies and Nutley have described the four elements that need to be in place to support evidence-based approaches. We can map these against the system that has evolved to support evidence-based practice, initially in medicine and nursing, but increasingly in other domains such as education and law and order. The first element of evidence-based systems is agreement on methodology. Throughout the 1980s and 1990s there had been growing consensus within medicine about the methodologies that could be used to test the effectiveness of different interventions, with Randomised Controlled Trials at the core. This provided the foundation for what has become an increasingly integrated system to deliver evidence-based policy and practice.
Medical researchers also became increasingly strategic about where to build the evidence base, convening expert panels (particularly through major funding bodies) to prioritise research investment.
This consistency of methodology, together with the demand to know what is known, provided for the ability to synthesise evidence across a range of studies, increasing the power of the analysis and therefore providing decision-makers (in this case doctors) with a higher degree of certainty regarding the robustness of the findings. In other words, this has provided a means of disseminating knowledge of what is known effectively, and, importantly, has given users of that knowledge (medical practitioners in this case) a good understanding of how confident they can or can’t be about the state of knowledge in any particular area.
The final piece of the jigsaw has been the incentives that are in place to encourage usage of information. In the UK, two important things have helped all of this to work in medicine. First, there are central policy-making bodies – particularly the National Institute for Clinical Excellence (NICE) – as well as health-centre-specific requirements to provide guidance (top-down drivers of an evidence-based approach). Second, there are personal incentives for individual practitioners to maintain their professional competence and requirements for them to demonstrate that they are aware of the key evidence in their respective fields (accreditation and licensing). It is this combination of the raw ingredients of evidence, together with the mechanisms to translate this into knowledge and the incentives to enable uptake of evidence, that has proved so powerful in transforming medical practice in the past two decades. While there are important differences across different fields in terms of the characteristics of each component of these evidence-based systems for policy and practice, the key elements are largely consistent.
These moves towards evidence-based policy and practice have attracted their fair share of criticism. This has tended to centre around two main areas of concern. First, there is disagreement about what counts as evidence. This disagreement takes a number of forms in what Davies and Nutley have called the methodological “paradigm wars”. It includes those who reject the positivist assumption that the world can be known objectively, as well as the qualitative/quantitative debate. For the sceptics, it is not possible to know ‘what works’ without first asking: for whom is it working, where, and why? Second, there is disagreement about the degree to which evidence can really influence policy. Black (2001), for example, argues that health policy in the UK has been shaped more by political considerations than by scientific evidence. I will return to these themes in terms of how we might address them in the humanitarian context shortly. But first, I want to look at how DFID sought to apply the principles of evidence-based practice to its development work.
In 2006, the UK government published a White Paper that provided for a doubling of investment in research relating to development by 2010/11. In 2008, a strategy was approved to deliver this investment. This strategy underscored the importance of evidence-based decision-making and the need to invest in new products and processes to tackle the most urgent development challenges. The strategy did not provide for investment in research in the humanitarian domain. In order to deliver on the strategy and the much increased budget (approximately $300 million per year), the department responsible for delivering research expanded considerably and was professionalised, being led by a senior academic (who also happened to be a clinician!). This provided for an investment in the production of primary research. But it also enabled DFID to think more broadly about the other elements of the evidence-based system that would be required: increasing the accessibility of evidence, including through the production of systematic reviews of the evidence for development; and increasing the incentives for those responsible for programme design and implementation to use evidence in their work. As anyone who follows Duncan Green’s blog will know, this approach has attracted similar controversy to that in public policy more generally. Interestingly, in this context there has been a strong link between critiques of evidence-based development and critiques of the results agenda as applied in the development arena by those such as Andrew Natsios.
Three decades ago there were visionaries such as John Seaman and Julius Holt who believed that disasters, and specifically disaster response, like any other field of endeavour, could be the subject of scientific investigation. They founded the journal Disasters and then subsequently the International Relief Institute, which became the Humanitarian Policy Group at ODI. Like all visionaries they were ahead of their times and were, if not sole voices, part of a select minority. So why should all humanitarians take the evidence agenda more seriously? I think there are at least three reasons. First, principles: if we are serious about humanitarian principles, we need to take evidence seriously. Delivering impartial and neutral responses requires good evidence of different types, including strong demographic, epidemiological and nutritional data, as well as political economy analysis. The IPC is one important initiative in this area, corralling good primary evidence into a format that decision-makers can use. Second, effectiveness: we still don’t really know what works in terms of response; rather, there is a tendency, described by Simon Levine amongst others, to provide nails because we have a hammer. The current work by Dan Maxwell among others encourages us to move from needs analysis to response analysis in order to ensure that what we are doing is likely to make the best difference. Third, accountability: the accountability ‘revolution’ in humanitarian practice has tended to focus on mechanisms for improving accountability. Relatively less attention has been paid to the quality of data and the importance of putting it into the public domain. In medicine, patients can now interrogate a range of data, published in multiple forms, that enables them to challenge the hierarchy that has traditionally characterised doctor-patient relations. Mediating this knowledge is a range of patient advocacy groups who enable people to get better information about treatment options.
What would be our equivalent for doing this? Information is power, so how do we enable affected communities to access this kind of information?
The Humanitarian Emergency Response Review recommended the UK government should increase its investment in humanitarian research and innovation commensurate with that in development. So how are we planning to deliver that recommendation?
Using the framework above, we can unpack the approach that we’ve taken. First, in terms of agreement on the nature of evidence: the humanitarian arena is methodologically eclectic and we would aim to be similarly so in terms of method, combining the best of social science with epidemiology, seismology and economics. Indeed, a key ambition is to promote greater integration of hard and social science. Second, we have sought to adopt a strategic approach to the creation of evidence. Our initial review of the literature identified a large number of evidence gaps. Evidence from other sectors suggests that research investment is more likely to yield benefits when targeted around a small number of priority questions. This enables the community to build a body of evidence rather than a fruit salad of bits and pieces of findings on different topics. We recognised the risk of trying to cover too broad a waterfront and have tried to focus on doing a few things well, marshalling our resources around trying to ask and answer four key questions: How is the nature and scale of risk changing? What works in humanitarian action, and can we find better ways of responding? How is the institutional environment changing, and how can we best respond to it? How can we maximise the uptake of high-quality evidence?
In common with the rest of DFID’s research investment, we also recognised the importance of enabling better dissemination of knowledge, including the production of robust syntheses, so that we can better see what we really do (and don’t) know.
Finally, we are looking at the incentives to increase the uptake of evidence in policy and practice. At present, this means looking internally, using the existing business systems that DFID has adopted, which require an ability to demonstrate the use of evidence in programme design and that technical specialists undertake at least 55 hours per year of continuous professional development. Adopting a multi-year approach to our humanitarian programming is helping to provide more space to do this. The above approach is designed to provide an integrated portfolio of investment in high-quality evidence, innovation and knowledge uptake. The programme has been up and running since June 2012. To date we have programmed approximately US$30 million, leveraging a further $7 million from the Wellcome Trust, one of the UK Research Councils and other donors.
Any approach towards more evidence-aware humanitarian practice will need to be conscious of the lessons learned from other sectors attempting to become more evidence-based. There are four main things that concern me. First, there is a concern about what is driving an interest in evidence, particularly when it comes from a major donor. Specifically, there has been a lively debate as to whether arguments in favour of evidence-based approaches are simply a reductionist results agenda in new clothes. The answer is that they could be, but in my view they should not be. Like any other tool, evidence is only as good as the person (or organisation) who uses it. We need to be self-critical and aware about how we use the evidence agenda. Second, it will be important to build greater capacity to produce and use evidence. In part because of historical funding constraints, there are only a very few suppliers of evidence in this sector – perhaps three or four major institutions, all of which are in the global north. We need new players if we are to increase the quality and quantity of evidence. Third, we all need to think about whether and how the use of evidence is valued (or not) in our work. Do we have the time to read? Are we rewarded? Are we taught how to commission, manage and read ‘good’ studies? Finally, we need to be proportionate: we will not necessarily have gold-plated demographic data, nor the ability to demonstrate definitively whether humanitarian advocacy works in terms of increasing access. We will need to come to a consensus about what constitutes good enough, without falling back into the trap of thinking that it is not possible to get good evidence in humanitarian contexts.
At the end of the day, evidence will be imperfect and the kinds of problems that we are trying to address are complex. Evidence will be only one factor that shapes decisions, but I do think that, given the gravity of the problems we are responsible for trying to address, it is arguably more important, rather than less, that we are evidence-aware.