Annual Report Cards: one way for government organisations to synthesise and present complex evidence for policy making

By Louise Shaxson and Josephine Tsui

There are different ways and tools government organisations use to synthesise the evidence they have and the evidence they procure to inform policy processes. One of those tools is the Annual Report Card, which we describe in a working paper titled Synthesising and Presenting Complex Evidence for Policy Making: Experience with Annual Report Cards.

In 2005 in the UK, the Department for Environment, Food and Rural Affairs (Defra) faced many complex challenges when trying to assemble and interpret evidence around the effects of climate change on the marine environment. Many different organisations—government research agencies, universities and non-government organisations from several countries—provided relevant evidence. However, this was not done in a coordinated way, making it difficult for Defra policy makers to gather and interpret the evidence effectively.

An assessment of the state of the UK’s oceans and seas was being carried out at this time, and during its preparation the different government organisations involved noted how difficult it was to assemble evidence on the current impact of climate change. The evidence available to them was deemed too academic and did not clarify what was known and what was not known about different aspects of the marine environment. Because the various sources of evidence were not synthesised, it was difficult to gain an overview of the state of evidence on the marine environment as a whole.

It was decided that a focal point was needed, with the time and resources to collate and interpret the evidence produced by the academic community, government research institutes and international organisations, and to communicate the information effectively to policy makers. In 2006, a group of organisations pooled their resources to create such a focal point, the Marine Climate Change Impacts Partnership (MCCIP). This partnership was to focus on three areas: 1) Assembling the evidence; 2) Assessing the robustness of the evidence; and 3) Presenting the evidence in an accessible way. 

The MCCIP is funded by contributions from a range of government organisations, with the largest contribution coming from central government. It has a small secretariat that is independent of policy makers, and a steering group led by a senior marine policy maker to ensure that the partnership’s work is relevant to current policy priorities. Other members of the steering group come from the academic community, other government departments and non-government organisations. An expert advisory panel, drawn from academia, reports on quality assurance issues.

The format chosen by the partnership to present evidence was an Annual Report Card (ARC). ARCs synthesise evidence so it is accessible to non-specialist policy makers and the public. ARCs enjoy a high profile, which means the organisations involved in the partnership are happy to contribute their own ‘in-kind’ resources, such as travel, attendance and communications.

As the main evidence product of the MCCIP, ARCs are published every two years to inform marine policy makers about how evidence and knowledge of the marine environment are evolving. ARCs present information in a visually appealing way that makes it easy for policy makers to pick up the key messages and use them to inform their decisions. Producing the ARCs involves a very wide range of stakeholders in developing, producing and quality assuring the evidence.

Each ARC is approximately 12 pages and includes charts, graphs, maps and summaries arranged by topic area. Their format has evolved since the first one was produced in 2006, as the MCCIP has changed the topics it covers and has experimented with different ways of presenting the evidence.

The ARCs begin with four or five headline messages on the front page. These are the most important issues policy makers need to be aware of in the current moment. The topics of the headline messages change over time, depending on which issues are most urgent. The next two pages set out key topics that are covered in the rest of the report card, and how the confidence assessments were constructed. The fourth and fifth pages give more detail on the headline messages with graphs of key trends, maps and summary tables. The remainder of the ARC gives details on the individual topics, together with the confidence assessments. This structure makes it easy for policy makers to know where to look for key messages and to find the details they need. Graphs, maps and diagrams are presented with a short analysis of what they mean, as the topics are often quite complex. 

The main innovation in ARCs is the confidence assessment, which helps the reader understand both the volume of evidence on an issue and the extent to which experts in the field agree about what that evidence is telling them. Confidence assessments state whether the experts have high, medium or low confidence in:

  1. The evidence about what is already happening: the state of evidence on the most important issues within the marine environment.
  2. The evidence about what could happen: how climate change may affect each issue in the future. This provides policy makers with an idea of how quickly this area is changing, and the quality and strength of evidence. 

The level of each confidence rating depends on two variables, each rated low, medium or high. The first variable describes the level of agreement within the research literature about the evidence. The second describes the volume of evidence available. By separating the two variables, the reader has more information about how the overall confidence assessment scores were reached. For example, there can be a lot of evidence on an issue but low consensus on what this evidence means; or there can be a high level of agreement on how to interpret a small volume of evidence. For a high confidence assessment, there needs to be both a high volume of evidence and high level of agreement.
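As a minimal sketch (not the MCCIP's actual methodology), the combination rule described above could be expressed in code. The source states only that high overall confidence requires both high agreement and high evidence volume; the rule below, which caps overall confidence at the weaker of the two ratings, is an illustrative assumption consistent with that description.

```python
# Hypothetical sketch of an ARC-style confidence assessment: two variables
# (agreement in the literature, volume of evidence), each rated
# low/medium/high; the overall score is capped by the weaker of the two.

LEVELS = ("low", "medium", "high")

def confidence(agreement: str, evidence_volume: str) -> str:
    """Combine the two ratings into an overall confidence score.

    Assumption for illustration: overall confidence equals the
    weaker of the two ratings, so 'high' requires both to be high.
    """
    if agreement not in LEVELS or evidence_volume not in LEVELS:
        raise ValueError("ratings must be 'low', 'medium' or 'high'")
    # Take the lower-ranked of the two ratings.
    return LEVELS[min(LEVELS.index(agreement), LEVELS.index(evidence_volume))]

# A lot of evidence but low consensus still yields low overall confidence:
print(confidence("low", "high"))   # -> low
# High confidence requires both high agreement and high evidence volume:
print(confidence("high", "high"))  # -> high
```

Separating the two variables in this way is what lets the reader see how a score was reached, rather than being shown only the combined result.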

This format allows a lot of information about the evidence to be presented in a way that is easy to read. It allows policy makers to link knowledge of what is currently happening to knowledge of what could happen in the future. The online versions of the report card provide links to papers that contain the evidence on which these assessments have been based.

The challenge for the MCCIP is to map and synthesise a very rich and complex evidence base and present it in 12 pages. Over the years, the steering group has refined the process of producing the ARCs into three separate areas: (i) deciding what topics to include; (ii) quality assurance and summarising the evidence; and (iii) constructing the confidence assessments. 

The ARCs are highly synthesised presentations of very complex evidence that is contained in supporting evidence papers. A lead author is assigned to each topic, and he or she then works with four or five researchers. The authors all volunteer their time. In return, each paper is peer-reviewed to the same degree as an academic journal article. It is also assigned a DOI, which encourages citation in the academic literature. This gives the paper real academic credibility and provides a strong incentive for the authors to give their time for free.

The quality assurance process enhances the likely uptake of the work in two ways. First, the fact that there is a very structured process reassures policy makers that the ARCs are of high quality and that the synthesis is based on robust evidence. Second, the ARCs provide clear and easy-to-follow links to the source of the evidence so that policy makers (or their staff) can gain a more detailed understanding if necessary.

ARCs have been in existence for almost a decade. They are not the only way in which evidence contributes to policy decisions; rather, they are part of a broader process of evidence-informed policy making to which many organisations contribute.

While it is not possible to say that the MCCIP has had a direct impact on specific policy issues, it is clear that it has made distinct contributions to ongoing policy processes such as policy design, implementation and monitoring. The format of the ARCs has been adopted by other organisations facing similar challenges in presenting complex evidence to policy makers—indicating that they are considered to be best practice in their field.

The evidence needed to inform policies is rarely completely conclusive. ARCs synthesise a broad range of evidence in a structured and rigorous process and present the evidence to policy makers in a visually appealing way. The confidence assessments are an innovative way of ensuring that the limits of the evidence base are well understood.

The process ensures that the MCCIP provides quality academic information, but the partnership’s success also stems from its independent status. The partnership represents the interests of all the devolved administrations, not just one government’s agenda. Furthermore, the MCCIP is not an advocacy organisation: while it is guided by current policy priorities, it does not respond to hot issues in the media. Nor does the MCCIP design policies or become involved in policy formulation. This neutrality encourages trust among policy makers. 

So far, the ARCs have only been used in the environmental arena—though the MCCIP has plans to improve their social science content. There is no reason to suggest that confidence assessments would not work in other policy areas, as long as the process of deriving them is similarly robust and they are presented in a similar way.

 

*************

Louise Shaxson is a research fellow in the RAPID programme at the Overseas Development Institute.  Her work focuses on improving public sector policy and strategy within the broad framework of evidence-informed policymaking. Over the past twelve years she has worked with government departments in several countries to strengthen the ways they source, handle and use evidence. This has included advising on how to take a more strategic approach to evidence planning, on the provision of scientific advice to Ministers, and on creating links between researchers and policymakers. 

Josephine Tsui is a researcher in the RAPID programme at the Overseas Development Institute. With ten years of experience, she has developed expertise on research-to-policy links, examining the power dynamics of knowledge production. She specialises in the monitoring and evaluation of policy influence and advocacy with UN agencies, research institutes and NGOs. Her background spans gender, social development and agriculture issues. She lived and worked with government ministries in Zambia and Ghana for over three years, and now works frequently in South Asia.
