KSI Adaptive Management Learning Brief #2[1]
January 2022
Introduction
This brief is one in a series of learning documents that capture and share key lessons the Knowledge Sector Initiative (KSI) in Indonesia has learned in its efforts to implement an adaptive management approach during Phase 2 of its operations (2017-2022). These briefs are intended to be practitioner-focused; while there is some reference to the broader literature, our aim is to provide a succinct description of the KSI approach and a set of lessons we hope will be useful to others attempting to implement similar adaptive approaches.
As described in the introductory brief, adaptive management has both ongoing ‘adaptive delivery’ components that are a part of a program’s day-to-day ways of working[2], and more formal, deliberate and structured components, i.e. ‘adaptive programming’[3]. In this note we explore the latter, focusing on the specific systems and processes that were intentionally developed and put in place to facilitate adaptation. Companion briefs reflect on the program’s evolving understanding of adaptation, the human side of adaptation, and leadership of adaptive programs.
KSI has built on existing knowledge, expertise and experience in developing its adaptive management systems and processes. A good deal of this knowledge has been captured in case studies, and now increasingly in thematic synthesis and analysis (much of which has been collected and is available in an online library). Of particular interest for this note is the growing literature documenting efforts to develop and implement systems for monitoring, evaluation and learning. The Asia Foundation’s work on Strategy Testing[4] and Development Entrepreneurship[5] is perhaps the most well-known, but by no means the only effort in this area.[6] For KSI, helpful stepping stones included previous program strategies, similar efforts by the same implementer elsewhere, and efforts by other implementers working with the Department of Foreign Affairs and Trade (DFAT) and the Government of Indonesia (GoI).[7]
Core systems and processes: a brief practical description
KSI’s formal systems and processes in support of adaptive management have three core components:
Learning Weeks: The KSI approach is centered around the Learning Week process. Organized and facilitated by the team responsible for performance, monitoring and evaluation (PME), Learning Week sessions were created specifically to provide a forum for internal reflection by the implementing team for each workstream. The sessions are held twice a year, generally lasting 3-4 hours per workstream. They are attended at a minimum by the team responsible for that area of work, the PME team, the Program Team Leader and the Deputy Team Leader, generally resulting in groups of around 10 participants.
At its core, the process and templates ask the team to assess progress since the last Learning Week and prospects for achieving its intended outcomes, and to reflect upon what those assessments mean for its current strategy and any changes that might be necessary. The discussions build on evidence gathered through the team’s reflection logs (see below), formal evaluations (where available), ongoing monitoring by the implementing team (e.g. stakeholder analysis), and additional insights brought out by the facilitator. While the sessions do aim to set the strategic direction for the next six months, they are not intended to be a forum for detailed work planning.
Progress Review: From the outset, the Learning Weeks were linked to a tripartite Progress Review, involving implementer (KSI), donor (DFAT) and partner government (GoI) representatives. The program circulates materials in advance to inform a full day of discussion. This forum is intended to serve as an opportunity not only to jointly take stock of program progress towards outcomes, but also to reflect on the expectations each party holds of the others and on program management practices. This three-part Balanced Scorecard approach provides a means by which the program can be held accountable for learning and adaptation alongside, though not in place of, its progress towards outcomes. Some decisions can be made in this forum, though others require the approval of separate program governance mechanisms.
Reflection logs: One key challenge is to ensure that periodic systems for reflection and adaptation capture the team’s thinking along the way, avoiding the temptation to project current knowledge backwards and overlook learning when it comes time for periodic review. Doing this effectively not only improves the quality of information available in the Learning Week discussions, thereby strengthening the link between adaptive delivery and adaptive programming, but also encourages regular reflection in the moment. While other programs have used tools like timelines and action research to record key moments in reform processes, KSI developed reflection logs in which members of the implementing team could record their thinking on an ongoing basis using simple online templates. These reflections are then reviewed by the PME team and used alongside other sources of evidence to pre-populate Learning Week templates and prompt discussion in the Learning Week session itself.
Missteps corrected: While some version of these core systems for reflection, learning and collective review persisted throughout implementation, other approaches did not have the desired effect and were dropped along the way. Early in Phase 2, KSI experimented with Adaptive Management Worksheets, which had been developed as a means to capture examples of adaptation for communication and meta-analysis. In practice, however, the worksheets proved cumbersome and bogged the team down in documenting smaller tactical changes, so the decision was made to focus on the reflection logs instead.
Key lessons for practitioners:
In the remainder of this brief, we focus on five lessons we feel will be most useful for practitioners in ensuring that systems and processes like those described above actually support adaptation rather than merely serving as adaptive window dressing.
- Develop a shared clarity of purpose: It is absolutely fundamental that the reasons for investing significant time and effort in the systems and processes to support adaptation are clear to all involved. If this is not achieved, there is a risk that implementing teams perceive the process and documentation as serving a donor’s needs or other objectives rather than enabling them to do their jobs better. This can be the case if adaptive approaches are thought to be the purview of M&E specialists rather than of implementing teams or the program as a whole. Others have made the case for the value of dedicated M&E expertise and human resources to help manage the burden of adaptive management systems (e.g. maintaining extensive theory of change documentation).[8] Our experience confirms this, but with a caveat: if adaptive systems and processes are too tightly linked to M&E and reporting needs that are detached from the type of information teams use on a day-to-day basis, formal systems can become mechanistic and generate resentment from those simply wanting to get on with the job. To achieve this shared clarity, it is therefore helpful to…
- Link discussion to specific decision-making and governance arrangements: The links between the Learning Week, work planning processes and forums like the Progress Review help to ensure that thinking can be translated into action. This is absolutely critical to making reflective processes worthwhile and to developing buy-in from those who need to devote scarce time to informing, facilitating and participating in them. However, it is important to note that complicated programs like KSI make a variety of decisions at different levels. Adaptation can come in the form of strategic changes within workstreams or in choices about the portfolio of workstreams. The types of systems described in this note can inform either or both, but changes of different scale, with different budget implications, will likely require approval at different levels. In KSI, certain decisions required only program approval, others donor approval, and still others approval by program governance mechanisms. It is therefore important to…
- Articulate different spheres of decision-making, clarify who has discretion in each sphere, and assert that discretion where appropriate: For programs to be trusted to adapt, it is important that they can convincingly convey their learning and analysis, and that they are assertive about using the space for adaptation afforded to them contractually. We found it useful to clarify in the Progress Review whether the program was: a) informing partners of a change it was making and why; b) asking for insights or input into a decision; or c) asking for approval. While it is important to bring partners along the journey by effectively communicating learning and evidence, being clear about who has authority to make different operational choices can help programs target messaging to key decision-makers and mitigate potential micromanagement by partners.
- Encourage broad participation: Tools like the Learning Week and the Progress Review can create opportunities for thoughtful debate. In our experience, making the most of these opportunities depends on having robust mechanisms to challenge views and a workplace culture that not only accepts, but supports and rewards, those challenges.[9] To prevent sessions from becoming a narrow back-and-forth between the facilitator and a single person, and to build group ownership of ideas (and thereby minimize defensiveness):
- Facilitators must make a concerted effort to bring in a variety of perspectives, including not only the person working most closely on the workstream in question, but also other members of the implementing team (working on related issues or with some of the same stakeholders). This can be done in the course of conversation, but may also involve complementary tools such as simple surveys. Prompted in part by the shift to online meetings following the onset of Covid-19, the KSI team used a number of these tools to facilitate discussion on progress and prospects. Rapid assessments (e.g. traffic light ratings, see Box 1) and feedback via SurveyMonkey, Zoom polling and chat functions helped broaden participation, avoid groupthink and mitigate tendencies to defer to more senior staff or to those seen as ‘owners’ or leaders of a given workstream. There is no reason similar approaches cannot be used in in-person settings, whether through polling or facilitation techniques that draw out perspectives from junior staff before asking leaders; the key is to allow all members of the team to make their own initial judgement before the findings are shared. At that point, a facilitator can probe further into areas of apparent disagreement.
- Senior leadership must engage, but with the right tone and a specific role. On the former, leaders in adaptive programs need to strike a balance: while strong personalities can dominate discussion and unwittingly stifle dissent, leaders who defer entirely to those they manage can fail to challenge their teams to consider alternative ideas. On the latter, the participation of leadership is particularly useful precisely because management cannot get drawn into the full detail of every workstream. There is good reason for keeping learning and strategy sessions largely internal (again, see the companion brief on the soft side of adaptive management), but it helps to have someone a step removed from the day-to-day to help maintain focus on strategy rather than tactics, keeping a given workstream in the broader ‘helicopter view’ of what the program as a whole is trying to achieve.[10], [11]
- Finally, be prepared to evolve: While there are clear benefits to using a consistent process that helps participants know what is expected of them, it is equally important to be open to adjusting the systems and processes themselves. The Learning Week templates and discussion structure evolved over the course of implementation. When it became clear that teams were sometimes tempted to reflect on progress in implementing the strategy or workplan without digging more deeply into whether the strategy was the right one or needed to be adapted in light of learning, the program restructured the sessions by:
- Placing a restatement of the problem being addressed up front as the foundation of the discussion;
- Prioritizing definitions of success and intermediate metrics that clarified goals and reflected theories of change, even where these were harder to measure than outputs (e.g. requiring stakeholder analysis rather than simple output verification); and,
- Articulating contribution to change, while recognizing reform processes are complex and claims of attribution are difficult to make.
While the development community has made important strides in recognizing the importance of adaptation, knowledge about how to translate this recognition into practice is still emerging. The key lessons noted here are not a complete ‘how-to’, but we hope they provide practical guidance that contributes to systems and processes that do more than simply look like adaptive management, and instead deliver on the promise of adaptive approaches.
-----------------------------
[1] These briefs have been written by Daniel Harris with the support of the KSI Performance, Monitoring and Evaluation team, and are based on a program of action research that followed and informed the program’s adaptive management approach.
[2] Christie, A. and Green, D. (2018) 'Adaptive Programming in Fragile, Conflict and Violence-Affected Settings. What Works and Under What Conditions? The Case of Pyoe Pin, Myanmar', Case Study for the Action for Empowerment and Accountability Research Programme, Itad and Oxfam in association with IDS
[3] Punton, M. and Burge, R. (2018) 'Adaptive Programming in Fragile, Conflict and Violence-Affected Settings. What Works and Under What Conditions? The Case of PERL, Nigeria', Case Study for the Action for Empowerment and Accountability Programme, Itad and Oxfam in association with IDS
[4] Ladner, D. (2015) Strategy Testing: An innovative approach to monitoring highly flexible aid programs. Working politically in practice case study 3. San Francisco, USA: The Asia Foundation
[5] Faustino, J. and Booth, D. (2014). Development entrepreneurship: how donors and leaders can foster institutional change. San Francisco / London: The Asia Foundation and ODI
[6] For a summary, see Pasanen and Barnett (2019) Supporting adaptive management: monitoring and evaluation tools and approaches. ODI Working Paper 569. London: ODI
[7] At the program level this would include efforts like a timeline tool used in KSI phase 1. At the organizational level, RTI as the lead implementer has documented some of its own experiences here. In Indonesia and with DFAT specifically, see, e.g. KOMPAK in Teskey and Tyrrel (2017) Thinking and working politically in large, multi-sector Facilities: lessons to date; also Davda and Tyrrel (2019) Monitoring, Evaluating and Learning for Complex Programs in Complex Contexts: Three Facility Case Studies.
[8] Booth, D. (2018) Incubating Policy for Economic Transformation: Lessons from Nepal. London: ODI.
[9] On the latter, see also the companion brief on the soft side of adaptive management.
[10] Note that good managers will, of course, avoid the temptation to micromanage, particularly in programs that are essentially portfolios of a number of different workstreams/interventions/reforms.
[11] Denney, L. (2016) Reforming Solid Waste Management in Phnom Penh. San Francisco / London: The Asia Foundation and ODI.