Routes to resilience: Lessons from monitoring BRACED

This paper shares insights, reflections and lessons learned from designing, implementing and reporting against the BRACED Monitoring and Evaluation framework.

Introduction

This paper* shares insights, reflections and lessons learnt from designing, implementing and reporting against the Building Resilience and Adaptation to Climate Extremes and Disasters (BRACED) programme’s Monitoring and Evaluation (M&E) framework. The BRACED programme aims to build the resilience of up to 5 million people vulnerable to climate extremes and disasters, and supports international, regional and local organisations working in 15 consortia across 13 countries in East Africa, the Sahel and South-East Asia.

To understand what works and what does not in building climate and disaster resilience, the BRACED Knowledge Manager is developing and testing a variety of resilience measurement and monitoring approaches. The BRACED Monitoring and Evaluation (M&E) framework is designed to enable data collection and evidence generation to track, measure and understand the processes of change that lead to climate and disaster resilience.

Developing programme-level M&E frameworks for resilience-building programmes is a relatively new area of work, with limited experience to draw on. Reflection about the BRACED M&E framework is, therefore, a critical learning step for BRACED itself to improve M&E practice and evidence generation within the programme. It also provides an exciting opportunity to contribute to building the knowledge base on resilience monitoring and measurement for the wider community. We hope that the reflections shared in this paper will contribute to ongoing and future resilience-building programmes.

Each year, the BRACED project Implementing Partners and the Knowledge Manager’s Monitoring and Results Reporting team address the critical question: ‘How are BRACED projects contributing to building resilience?’ The answer has been captured in our companion synthesis report – ‘Routes to resilience: insights from BRACED year 1’.

*A short background to M&E in BRACED, summaries of the lessons learnt so far and recommendations moving forward are provided below; please see the full report for much more detail.

Monitoring and evaluation in BRACED

During the first year of BRACED, we have addressed the following M&E challenges:

  • moving from concepts to practice
  • rolling out a programme-level M&E framework to 15 projects working across 13 countries
  • trialling qualitative reporting approaches at project and programme levels
  • aggregating and synthesising highly contextually specific data.

These experiences have generated new insights into how to approach the monitoring and results reporting of a resilience-building programme at the scale of BRACED.

Lessons learnt so far

In year 1, BRACED project Implementing Partners have embraced a new way of monitoring and reporting change. We have learnt a great deal as a result of taking a programme-level view of how resilience is being built in BRACED. The key lessons emerging from our BRACED experience to date include:

  1. Translating concepts into practice: Measuring progress on resilience cannot be done with one ‘simple’ indicator. It requires qualitative and explanatory frameworks that contextualise results against shocks and stresses, as well as the wider context that projects operate within. There is a risk of losing or obscuring critical learning about resilience building if we measure resilience using just one indicator. Understanding the determinants of climate and disaster resilience is complex, and there are no ready ‘yes’ or ‘no’ answers.
  2. Rolling out M&E frameworks: There are different options for rolling out programme-level M&E frameworks and systems, but each comes with its own trade-offs. Options and trade-offs include decisions about the type and level of support to provide to project partners. Rolling out programme-level M&E frameworks and systems means striking a balance between light-touch and resource-intensive options. They also need to allow for continual adjustments based on the emerging body of knowledge and experience regarding the monitoring and measuring of resilience. In BRACED, the Knowledge Manager was set up after the project logframes, theories of change and M&E plans were defined. Establishing the BRACED programme-level M&E framework would have been easier if it had been developed at the same time as the 15 BRACED projects’ M&E.
  3. Reporting on resilience: Qualitative and explanatory frameworks offer an opportunity to complement resilience indicators. However, if we are to truly engage with these frameworks, we need to shift mindsets from accountability to learning-oriented M&E. Engaging with qualitative and explanatory frameworks requires M&E practices to go beyond ‘business as usual’ and accountability-driven exercises. M&E experts and project managers also need to engage in more refined and complex data collection and analysis processes than in a traditional programme.
  4. Aggregating and synthesising data at scale: Synthesising and aggregating data while retaining context specificity requires time, resources and thorough synthesis methodologies. Qualitative and explanatory frameworks call for exhaustive synthesis processes that are able to deal with complex data analysis, varying levels of data quality and self-reporting bias. This lengthens the lead time between project-level annual reporting and programme-level learning, which may limit the findings’ potential impact on programme and project decision-making.

These lessons are discussed in detail in section 3 of the full text.

Recommendations (abridged)

Monitoring resilience-building efforts and reporting on their progress is challenging. M&E for resilience programming is still nascent and BRACED is learning-by-doing. A key message emerging from this paper, together with its companion programme-level synthesis report, is that genuinely understanding resilience in practice means moving away from a logframe-driven and ‘accountability’-focused M&E culture. Moving forward:

  • Project Implementing Partners should enhance their ongoing monitoring and results reporting efforts by taking a more reflective and critical approach. This could challenge project assumptions and will build a better understanding of how to build climate and disaster resilience in fragile and vulnerable contexts.
  • The programme-level Monitoring and Results Reporting team should consider how to encourage this critical Implementing Partner reflection and dialogue. There are limits to what reporting templates alone can achieve in this regard. We therefore plan to provide further training to Implementing Partners, along with light-touch helpdesk support.
  • Programmes like BRACED need to find and resource efficient ways of achieving a sufficient level of reflection and learning for the benefit of both project- and programme-level evidence generation. Ideally, the programme-level M&E framework should be designed in conjunction with the project-level frameworks.

To better understand the stability of outcome-level changes over time and how communities learn and ‘bounce back better’ from disaster events, outcome-level indicators need to be complemented by systematic monitoring and evaluation of resilience in the context of actual shocks. Moving forward:

  • Implementing Partners are in a unique position to contribute to knowledge about how to quantify the number of people whose resilience has been built (KPI4) at the project level. The Monitoring and Results Reporting team, together with wider members of the Knowledge Manager and the BRACED Fund Manager, should further explore outcome-level resilience indicators in different contexts: the advantages and disadvantages, as well as opportunities and trade-offs.
  • When designing and funding similar programmes in the future, the Department for International Development (DFID) should adopt a pragmatic and realistic view on the feasible level of outcome-level data and evidence generation in a three-year programme like BRACED. Resilience-building efforts are not only complex, but also involve processes of change that take time to materialise. Prioritising annual data collection efforts against quantitative indicators may come at the cost of losing critical evidence about what works and what does not in building resilience to climate extremes and disasters.
  • Programmes like BRACED should consider having a diverse set of methodologies and analysis in place for interrogating quantitative outcome-level resilience indicators. They should be pragmatic about what sort of outcome-level data and information can be expected in a three-year period.

While much attention has been given to project-level approaches to monitoring and measuring resilience, programme-level efforts face a unique set of challenges. To date, there is both limited literature and examples from other programmes addressing these challenges. In BRACED, we have been learning-by-doing on an ongoing basis. Moving forward:

  • The Monitoring and Results Reporting team, together with Implementing Partners, should consider ways to further capture their monitoring and results reporting experiences within BRACED. This would benefit both BRACED and other existing and future resilience-building programmes.
  • Programmes like BRACED should also share experiences and contribute to building knowledge in this relatively new area of work.

Website: www.braced.org
Twitter: @bebraced
Facebook: www.facebook.com/bracedforclimatechange

This reflection paper was written by Paula Silva Villanueva and Catherine Gould, based on their experiences of monitoring change in the BRACED programme.

The authors wish to acknowledge critical contributions from Florence Pichon, as well as Emily Wilkinson, Blane Harvey, Katie Peters, Fran Walker and Dave Wilson from the BRACED Knowledge Manager. The authors are also grateful to the M&E and project leads of the BRACED Implementing Partners and to Annie Bonnin Roncerel and Jim Djontu of the BRACED Fund Manager, for openly sharing their experiences and reflections. The paper has benefited from critical review from Robbie Gregorowski and Katie Peters of the Knowledge Manager and Derek Poate (external). The donor, DFID, has also provided feedback and discussed how to ensure the lessons are taken up and applied, both within BRACED and in other similar programmes. Finally, we thank Charlotte Rye and Clare Shaw of the Knowledge Manager for their support in the publication process.

Suggested citation:

Villanueva, P. S., Gould, C. (2016) Routes to resilience: Lessons from monitoring BRACED. BRACED Reflection Paper. Building Resilience and Adaptation to Climate Extremes and Disasters (BRACED): London, UK
