The ‘demand’ side of evidence-based policy making: why and how. Two great days of discussion and sharing in London

Arnaldo Pellini

Lead – K2P Learning at the Knowledge Sector Initiative and Research Fellow at the Overseas Development Institute

Last week, I attended Evidence Works 2016: A Global Forum For Government. The event was convened by four organisations, all of which, in different ways, work on the demand and use of evidence in policy making: the Alliance 4 Useful Evidence and NESTA in the UK, and two initiatives from the US, Results for America and Results for All.

Interestingly, the conference was held at the Royal Society. This was established by philosophers and physicians in 1663 as 'The Royal Society of London for Improving Natural Knowledge', at a time when kings and queens were thought to have a divine right to make decisions. Today, it is the UK's national science academy with a fellowship of some 1,600 of the world's most eminent scientists. Its motto, nullius in verba, means 'take nobody's word for it' and expresses an appeal to facts determined by experiment against the domination of authority.

The conference hosted 150 participants from about 40 countries. Unlike other conferences I have attended over the last few years, it had a distinct ‘demand and use of evidence’ flavour. The majority of the participants were civil servants and policy makers who, in one way or another, are trying to bring evidence to life in governments and parliaments.  

My main reflections on the two days are naturally from the perspective of being based in Indonesia with the Knowledge Sector Initiative (KSI). KSI works on various aspects of the knowledge-to-policy ecosystem: knowledge production with a group of 16 Indonesian think tanks, work with policy analysts, the demand for evidence within some ministries, and reforms to the regulatory framework that governs, for example, the procurement of research by government organisations, the funding of research, and the role and function of policy analysts in government departments.

1) No country has fixed the link between evidence and policy making. High-income countries grapple with the same challenges as middle- and low-income countries: how to synthesise evidence, how to improve access to data and analysis, how to ensure the quality of evaluations, and how to communicate policy research. There is no blueprint for evidence-based policy-making processes and systems. Countries and organisations have to develop systems that suit the specific problems they face and want to solve. What this means is that there are a lot of tested practices out there: things that have worked and things that have not. There is a great pool of ideas into which government organisations, think tanks and development partners can tap to try to solve context-specific problems: the What Works Network in the UK, Sinergia at the National Department for Planning in Colombia, the work on evaluation at the Department of Planning, Monitoring and Evaluation in South Africa, the Productivity Commission in Australia, the Performance Management & Delivery Unit (PEMANDU) at the Prime Minister's Department in Malaysia, and many others.

2) Behavioural science and behavioural economics are a fast-growing area of evidence for governments. They provide new and different insights into the factors that determine why and how people make choices, and help go beyond rational choice theory. Behavioural economics can help policy researchers better understand what encourages policy makers to make a decision, and find openings and entry points for the evidence and policy recommendations they produce. The Behavioural Insights Team in the UK is one example. The team's objectives are to make public services more cost-effective and easier for citizens to use; to improve outcomes by introducing a more realistic model of human behaviour to policy; and, wherever possible, to enable people to make 'better choices for themselves'. In doing so, it develops tests and experiments to redesign public services, drawing on ideas from the behavioural science literature. Importantly, the team has been able to demonstrate that high-quality evidence can be generated and communicated through quick experiments in a matter of weeks, rather than months or years.

3) The evidence is there; the challenge is its quality and how to put it to use. Most countries have infrastructure that produces evidence: universities, think tanks, national statistics offices, and policy and evaluation units at national and local levels. Where they differ is in the quality of the research their universities produce, the resources invested in training researchers, and the strength of the country's research foundations. At the same time, policy makers are inundated with data and information. In India, for example, each village produces an annual development plan after collecting 300 different types of information during the process. That is a lot of information. The development 2.0 challenge is therefore not so much about the infrastructure for producing different types of evidence (that was more a development 1.0 challenge), but about how to improve the quality of the evidence and make it usable, available and useful to policy makers. Coupled with this is the challenge of developing context-specific policy-making systems and processes that are better at demanding and using evidence.

4) For policy makers: investing in evidence means taking risks. Evidence can complicate policy makers' decisions. Investing in a randomised controlled trial can lead to findings that a programme or a policy is having no impact, or even a negative impact. That is a risk for civil servants and elected officials, and the incentive to use evidence is therefore not very strong. The demand for evidence may also look like an admission that there are things policy makers do not know. This is not negative per se and can actually be empowering, particularly when it results in the development of evidence investment strategies that map out policy priority areas, the research questions that a ministry or a local government wants answered, and the budget to answer them. For example, the Prime Minister's Office in Finland leads a consultation process every three years to develop a research strategy and allocate funding for short-term and long-term policy research projects to inform the Government's policy process. The current evidence strategy is worth approximately 10 million euros over three years.

5) The compliance culture that exists in most line ministries is a risk to the development or strengthening of an evidence-based policy-making culture. Administrative data and information are required by the administrative arm of government, but they can also contribute to a culture in which evidence is produced to meet targets and performance indicators, leaving the big questions, such as why services do not work properly, unanswered. The Department of Planning, Monitoring and Evaluation in South Africa is addressing this by working to improve not only the methodology but also the culture around evaluations as a source of evidence and learning in government. Its evaluation and research unit plans and implements evaluations of policies and programmes, and provides training and professional development on evaluation for senior civil servants. So far, 180 senior officials from the top two to three levels of the public service have been trained. To ensure that findings are used, all evaluations are produced in partnership with the department(s) being assessed, and departments are asked to propose evaluations and to participate in the evaluation process. Every evaluation is produced in an accessible 1:5:25 format (i.e. one page of policy implications, up to five pages of executive summary, and up to 25 pages of main text), presented to a cluster of government departments and to Cabinet, and then sent to Parliament and made public. The development of a culture that favours the demand and use of evidence is only partially the result of technocratic solutions, such as regulations or procedures. It also requires leadership skills and good change-management processes.

6) Champions are overworked. I participated in an interesting group discussion on the role of 'champions' within government organisations in changing the way evidence is generated, demanded and used. Champions do exist, but they are usually overloaded by external requests for support and by internal tasks and responsibilities. This means that gaining access to the influence and authority that champions hold is important, but it is equally important to build acceptance of new, easier ways of working at other levels of the bureaucracy. Champions alone cannot guarantee that changes in behaviour will occur. Champions and broader acceptance together considerably increase the chances that change will occur within a bureaucracy.

7) An obvious truth: policy making is fundamentally political, and evidence may be disregarded for political reasons. Parliamentarians, policy makers and senior civil servants do not have time to digest all the evidence that is available. They may have time to read abstracts or summaries, but not always long reports. If they have scientific training, they may be more attuned to what evidence can provide them. Yet that is not a guarantee that evidence will be the main influence on their decisions. There is an emotional side to decision making, as well as a political one, and this can result in the best evidence being rejected. That is how things are, but it is not a failure of an evidence-based approach to policy making. It is just reality.

Jonathan Breckon, who leads the Alliance 4 Useful Evidence, wrote a short paper as an introduction to the conference, Evidence in an era of 'post-truth' politics, in which he describes the difficulty, in this day and age, of being an advocate for evidence in policy making. However, he notes that there are many interesting initiatives, ideas and experiences out there to tap into.

One of the speakers at the conference said that policy makers had the right to ignore evidence, but that they could not be ignorant of the evidence. In the same way, practitioners and researchers like me, who work on and in evidence-based policy making, cannot be ignorant of the experiences and ideas that are out there when testing solutions to problems in a particular country, sector or policy-making organisation.

These may be challenging times for the use of evidence in policy, but to me the conference has shown that evidence and working on the various aspects of evidence-based policy making are more relevant than ever!
