Reassessing the effectiveness of Tuungane in Democratic Republic of Congo
Context
Community-driven reconstruction (CDR) has its roots in community-driven development, an approach most widely implemented in support of decentralised governance systems. CDR focuses primarily on rural or remote regions affected by conflict and works to build community-led institutions and systems that assist in their recovery. It begins with a donor-funded institution-building phase, in which the donor works with communities to elect local village development committees. These elected committees are mandated to include women and are tasked with designing, implementing and maintaining aid-funded development projects. CDR rests on the principle that exposing communities to good governance can help alter social behaviour over time. The approach was pioneered by the World Bank in various developing countries. Since the turn of the century, CDR has been rapidly scaled up; however, there is little evidence on its impact and effectiveness.
In 2007, the International Rescue Committee (IRC) and CARE International, with financial support from the UK government, launched Tuungane, a major CDR programme in Eastern Democratic Republic of Congo. The programme's three main objectives were economic recovery, improved local governance and stronger social cohesion. In 2010, 3ie collaborated with researchers from Columbia University in the United States to measure Tuungane's impact on social and economic outcomes.
A randomised controlled trial was built into Tuungane’s implementation in South Kivu, Maniema, Nord Katanga and Haut Katanga provinces, with 280 communities randomly chosen to participate in Tuungane and 280 similar communities assigned to the control group.
Evidence
Despite strong implementation, the programme had no significant impact on social cohesion, governance or welfare. Although the evaluation found positive experiences of local governance and transparency under Tuungane, there was no statistically significant difference between the treatment and control groups on these measures.
Villages not subject to the gender parity requirement had, on average, 30 per cent female representation on their village development committees, suggesting that, in this context, the requirement is not a necessary condition for some degree of female representation. Attitudes regarding the roles and responsibilities of women did not differ between villages with and without the gender parity requirement, nor did the projects selected vary significantly. Together, these results indicate that requiring gender parity may have little effect on women's roles or voice in this context.
Evidence impacts
Type of impact: Improve the culture of evidence use
When decision makers or implementers demonstrate positive attitudinal changes towards evidence use or towards information the research team provides. Examples include strengthening monitoring and evaluation systems, increasing understanding of evidence and openness to using it, integrating these systems more firmly into programming or commissioning another evaluation or review.
The null findings from the study and interaction with the field team highlighted the need for better monitoring and evaluation to understand the challenges faced during implementation and to better capture impact. This led IRC to revamp its monitoring and evaluation methods for the programme, to better account for implementation activities and provide detailed information on all aspects of the programme.
Type of impact: Inform discussions of policies and programmes
When subsequent phases of the evaluated programme or policy draw from the findings of the evaluation or review, and/or the study team participates in informing the design of a subsequent phase.
Following the evaluation findings, which suggested a need to reassess the effectiveness of the CDR model and focus on more fundamental questions, IRC launched a programme of research to further investigate the CDR theory of change. IRC intends to apply this learning to future programme and evaluation design. The experience also prompted IRC to embed qualitative components in its evaluations to better understand the implementation context, a lesson it applied to the next phase of Tuungane and to the evaluation of the community-driven development programme in Somalia.
I mean, the findings themselves I think were important, but I think where it had a lot [of] and maybe in some ways more impact was immediately realising that we needed to…have much better M&E (monitoring and evaluation) even as we had impact evaluation, so [we] actually improved on monitoring [and] evaluation of the IRC because we realised that we were lacking that critical aspect of information that would actually complement that impact evaluation.
Suggested citation
International Initiative for Impact Evaluation (3ie), 2020. Reassessing the effectiveness of Tuungane in Democratic Republic of Congo [online summary], Evidence Impact Summaries. New Delhi: 3ie.
Evidence impact summaries aim to demonstrate and encourage the use of evidence to inform programming and policymaking. These reflect the information available to 3ie at the time of posting. Since several factors influence policymaking, the summaries highlight contributions of evidence rather than endorsing a policy or decision or claiming that it can be attributed solely to evidence. If you have any suggestions or updates to improve this summary, please write to influence@3ieimpact.org.