Interventions gone wrong: Learning from the failures we didn’t see coming
Written by Thokozile Munthali
June 15, 2020
About the author: Thokozile Munthali, Monitoring and Evaluation (M&E) Officer in the Government of Malawi. Thokozile was awarded a bursary to attend LIDC’s short course “Evaluation: From Innovation to Impact” in 2019.
No one wants a project to fail. But can anything good come from failure? The government of my home country, Malawi – with support from development partners – has been implementing many projects to improve lives. As an M&E Officer, I want all projects and programmes implemented by the Government and its development partners to have positive impacts. After all, a lot of resources are pumped into interventions to improve the lives of people in need. A solid M&E framework is crucial for us to know which interventions have had the impact we strived for – and which ones have not.
The Impact Evaluation Short Course unpacked methods and ideas for how to carry out a good impact evaluation. The evaluation-based “theory of change” stood out to me, as well as the discussions on why understanding context is crucial when you are conducting an evaluation. I learned in the short course not to feel demotivated when evaluations show that an intervention hasn’t had the positive outcome we expected. Instead, this is an opportunity to look into contributory factors by understanding the context in which the intervention was implemented. This can, in turn, inform programming to make future failures less likely.
One size fits none
The discussions about learning from failures made me think of one programme in particular. In 1996, the Malawi Social Action Fund (MASAF) was introduced with credit from the International Development Association. MASAF was part of the government’s poverty alleviation programme. It aimed to assist and empower individuals and communities to take action so that they themselves could better manage risks associated with education, health, sanitation, food insecurity, climate change and transport. MASAF was implemented in four phases, and it was regularly modified and adapted to address emerging issues.
Unfortunately, the evaluation of MASAF concluded that there was little evidence that it had contributed to improving people’s livelihoods. The project was soon phased out.
After studying the project in depth, I have concluded that MASAF failed in part because of a desire to capture and report only positive impacts, neglecting the negative impacts during implementation. MASAF was also implemented as a one-size-fits-all project, without considering the different contexts and other factors influencing the expected outcomes.
Studying past shortcomings for better outcomes
In December 2019, the government introduced a successor to MASAF, entitled “Enhanced Public Works”. I’ve been lucky enough to be involved. A lot has changed in how “Enhanced Public Works” is being implemented compared to MASAF. This time, the project will be implemented in 10 out of 28 districts, unlike MASAF, which was nationwide. The new project has also tailored interventions to each district.
With the knowledge and skills I gained from the short course, I was well prepared to identify and mitigate potential issues that may have contributed to the failure of the predecessor project. My team and I were able to adapt interventions to local conditions, so that different watershed management activities would be implemented in different areas instead of uniform activities everywhere. For example, one of the activities was afforestation of hills. In one area we opted instead to plant fruit trees in homesteads and carry out natural regeneration activities in the hills to maximise the impact.
These changes may seem small, but they can potentially make a huge difference for the lives of the people who need it the most. Without learning from the past, through a solid evaluation system, we risk making the same mistakes again and again. The price of repeating mistakes is high – and will almost always be paid by the people who can least afford it.