How qual improves quant in impact evaluations


Bridging divides, be they across ethnicities, religions, politics or, indeed, genders, is never easy. There have been many books written about them, including some that made millions – for example, John Gray’s idea that men and women come from different planets, Mars and Venus respectively, is apparently the best-selling hardcover non-fiction book ever. One shouldn’t begrudge the authors, because the payoffs – domestic or planetary peace – are high indeed. I don’t think that essays on quantitative and qualitative techniques in evaluation will ever make anyone rich or resolve existential problems. But they can improve evaluations. How do they do so?

In a recent paper that has just been published by CEDIL, my co-authors and I ask: what are the characteristics of quantitative impact evaluations which have successfully integrated qualitative research into their analysis? How does the integration improve analysis? To get the answer, we chose recently completed impact evaluations and developed a tool which assessed the rigour of their quantitative and qualitative analyses. Some 57 studies were identified from impact evaluation repositories (DFID, 3ie, World Bank and JPAL) spanning 20 countries, including five studies in fragile and conflict-affected contexts.

Developing a rigour assessment tool for qualitative and quantitative analyses integration

Our rigour assessment tool drew on evaluation criteria from a number of sources, including, for quantitative analyses, 3ie’s own quality assurance standards. For the qualitative work, the criteria covered the domains of confirmability, credibility, transferability and utilisation, which are well articulated in several existing studies. The criteria for integration are less well established. Indeed, some studies do the quantitative and qualitative analyses well, but a poorer job of showing how the two can be blended for a better result. A common approach to integrating qualitative data collection in impact evaluations is to use these data to triangulate quantitative results on effects or mechanisms described in an intervention’s causal pathway – checking for mechanisms that are harder to capture through quantitative measurement and documenting any unintended consequences of the intervention. We thus focused the tool on triangulation, complementarity, development, initiation and expansion. The tool was reviewed by subject-matter experts on mixed-method research, tested by independent reviewers and refined with feedback from experts in the field prior to finalisation.

What characterises well-integrated, mixed-method evaluations?

While there was great diversity across the studies, we found that the well-integrated impact evaluations shared four consistent characteristics.

First, studies which scored highly on quantitative and qualitative rigour also scored highly on integration. Moreover, when qualitative rigour was high, it was easier to discern how well a study had integrated its qualitative and quantitative components. In contrast, studies that scored highly only on quantitative rigour did not necessarily integrate the two methods well.

Second, it helps if a clear rationale for the integration of qualitative and quantitative methods is presented.

Third, using a multi-disciplinary team that has a shared framework with a clear delineation of tasks that transcend individual disciplines can help bridge gaps and lead to more robust, fully integrated mixed-method research.

Fourth, a common element among our exemplar studies is the provision of adequate documentation. This could be within a report, or through supplementary reports and/or appendices.

How does successful integration lead to better evaluations?

Successful integration led to better evaluations in four distinct ways.

  1. Integration of qualitative and quantitative lines of enquiry begins with the use of different methods of data collection, and with how those methods inform study design and findings. For example, a study evaluating the impact of humanitarian cash transfers in the Democratic Republic of Congo used participatory data collection techniques in conflict-affected communities to identify target beneficiaries; without the qualitative data, the findings would likely have been far less nuanced.
  2. Successfully integrated studies enabled the teams to validate their results. In some cases, this was simply a matter of ‘ground-truthing’ the quantitative findings. In others, divergence between the qualitative and quantitative findings resulted in more nuanced interpretations than either method could have afforded alone.
  3. Qualitative methods can enhance the understanding of quantitative results by providing the context or background necessary to situate the findings. For instance, a handwashing intervention targeting women successfully reduced child diarrhoea. However, the triangulated qualitative findings highlighted an important negative impact of the intervention: the ultra-poor in the sample were not only unable to take up the intervention, but also suffered social censure from those in the sample who did participate.
  4. Successful integration can help inform contextually relevant policy recommendations. For example, in an evaluation of a nutrition programme in Bangladesh, when the quantitative methods were not able to detect significant impacts of the intervention, the qualitative evidence pointed to specific nodes in the intervention pathway that were problematic. The policy recommendations focused on resolving those issues.

Did these studies resolve all the divides between quants and quals in evaluation? Of course not. But they did show that these ‘methodological tribes’ can resolve their differences in mutually productive ways. 3ie has been advocating for mixed-method studies for some time, as in this blog published five years ago. Relative to then, there are now more examples of what characterises successful integration, which I hope will be helpful for those planning such evaluations. 3ie will continue to encourage and support the production of well-integrated, mixed-method studies that provide useful and policy-relevant evidence.


Authors

Emmanuel Jimenez, Director General, Independent Evaluation Department of the Asian Development Bank (ADB)

About

Evidence Matters is 3ie’s blog. It primarily features contributions from staff and board members. Guest blogs are by invitation.

3ie publishes blogs in the form received from the authors. Any errors or omissions are the sole responsibility of the authors. Views expressed are their own and do not represent the opinions of 3ie, its board of commissioners or supporters.

