There is only one thing in the world worse than being talked about, and that is not being talked about (Oscar Wilde). Our recent review of community-driven development (CDD) is certainly being talked about. Duncan Green’s blog on our review has sparked an active debate about CDD on social media. But the message being taken away from our review is that ‘CDD doesn’t work’. That is not quite what we say. CDD has been enormously successful in delivering small-scale infrastructure.
CDD funders and implementers should certainly be taking note of these discussions as they decide how to invest precious development money in designing programmes. So, we are keen to get our message straight.
Our evidence synthesis study is not a critique of CDD. The answers to the questions we examined are sometimes nuanced and sometimes straightforward. The question ‘what works?’ needs to be unpacked into ‘what works, for whom and to achieve what, and why does it work or not?’ These are not straightforward yes-or-no questions, as we found in looking carefully at high-quality evidence on programmes from different countries.
Dismissing participatory development on the basis of CDD’s impact would also be a dangerously incorrect conclusion. We reviewed programmes that mainly focused on infrastructure because billions of dollars have been pumped into this approach in several countries. Community engagement programmes in specific sectors, such as school-based management, community-led total sanitation or self-help groups, have their own theories of change, and we need to examine evidence of their effectiveness separately. 3ie in fact has a dedicated evidence programme looking at community engagement approaches for improving immunisation. A review of school-based management finds that it does have positive effects, though not in low-income settings. And a review of self-help groups finds they do lead to women’s economic, social and political empowerment.
What our synthesis finds
What is clear is that CDD programmes do work for building public infrastructure. This is an unequivocal finding of our study. They have often constructed or rehabilitated very large numbers of facilities in communities and have benefited tens to hundreds of thousands of people in each country. In many cases, programmes have exceeded their targets. However, their cost-effectiveness compared to alternative approaches is not clearly established, and further research is needed in different contexts.
But proponents of CDD have claimed more than this impact. They claim that CDD builds communities, not just schools, roads and health clinics. A staggering finding from our work is that CDD programmes have no effect on social cohesion. Irrespective of the country or the type of programme (long- or short-term), the lack of effect is consistent across contexts. This is where meta-analysis is so useful: the forest plot (see graph) illustrates the finding clearly. When a study’s confidence interval crosses the vertical line of no effect – as it does in all cases – the estimated impact is not statistically significant. We find the same lack of effect on improved governance.
We think that part of the gap between the claim and the evidence lies in an unclear theory of change for how CDD improves social cohesion. Social cohesion refers to behaviours and attitudes within a community that reflect members’ propensity to cooperate. If this is used as a working definition, then it is unclear why the inclusive process of community participation in implementing development projects should automatically engender further cooperation. As some studies have suggested, CDD programmes may well be users rather than producers of social capital. For example, it is the communities where social capital is already high that come together to make a successful application for funding.
Moreover, our analysis using the funnel of attrition (see infographic) shows how participation in actual decision-making and project management ends up involving only a small number of community members.
Where the studies do not give such a clear message is on CDD’s impact on social and economic welfare. Here again, CDD’s theory of change relies on assumptions that may not hold. As other commentators have noted, the creation of public infrastructure will not improve health and education outcomes without complementary inputs. So, while most CDD programmes did not improve children’s school enrolment, the Peruvian FONCODES did, simply because a centrally managed school uniform and school feeding programme was implemented alongside school construction and rehabilitation. Similarly, in a few exceptional cases, health outcomes improved because the construction or rehabilitation of facilities was accompanied by investment in health staff, medicines and other supplies.
Some commentators on our study have cited another review by Casey, which does find positive effects on economic outcomes. Casey’s analysis includes fewer studies than our review. But this is not a question of ‘our study is right, your study is wrong’. This points to the need to unpack the evidence further to ask which CDD programme designs may have these positive effects and in which contexts.
Understanding programme variance through a mixed-method review
When we started working on this review, we quickly realised that impact evaluations could not give us all the answers we were looking for (as some commentators have emphasised). We drew on process evaluations and qualitative research to understand how different programme design elements have worked. Here are some things that stood out for us:
- CDD implementing agencies have used a variety of measures to promote the participation of women and of poor and marginalised groups. They have targeted the poorest communities, mandated quotas in project committees and provided facilitation. While impact evaluations provided no information on how these measures worked, project documents have been a rich source of information. For instance, long-running programmes such as the Kecamatan Development Programme (KDP) in Indonesia have a comprehensive set of successful measures to improve women’s participation, whereas several short-term programmes had only limited or one-off measures.
- The institutional set-up of the CDD implementer, as an independent agency or as part of an existing ministry or department, influences the impact of the programme in different ways.
- Evaluations show that communities face numerous challenges in maintaining infrastructure. Implementers should pay explicit attention to the technical, institutional and financial mechanisms for ensuring that infrastructure is maintained and operates properly. Again, long-running programmes such as KALAHI–CIDSS in the Philippines have used good practices that are worth learning from.
Our bottom line
Our bottom line is that CDD has worked to deliver infrastructure and might be the most cost-effective approach to doing so, though that needs testing in different contexts. Agencies should, however, stop claiming that CDD also builds social cohesion. More effort is required to understand participatory approaches, particularly by learning from where they have been successful.
We urge CDD implementers and evaluators to move beyond the definition of a community as a geographic administrative unit. More attention should be paid to local political economy and to gendered power dynamics within communities. For CDD to be development that is truly driven by communities, we need to work towards a different approach.