At the opening session of 3ie’s recent Measuring Results conference, Jyotsna Puri, Deputy Executive Director and Head of Evaluation at 3ie, said, “It takes a village to do an impact evaluation.” What she meant was that, for an impact evaluation to be successful and policy relevant, research teams need to be diverse, bringing together specialists from a mix of disciplines, such as statisticians, anthropologists, economists, surveyors, enumerators and policy experts, and using the most appropriate mix of evaluation and research methods.

So the key questions are: how can the use of mixed methods and interdisciplinary approaches make impact evaluations more policy relevant? What extra dimensions are added to the standard ‘quantitative and econometric’ impact evaluation when we bring in elements from other methodologies and disciplines? The answer is nicely summed up by Rao and Woolcock, who say that qualitative methods can be crucial to understanding the impact, as opposed to simply measuring it. And understanding why certain ideas, when implemented in a certain way, work while others do not is often exactly what policymakers want to know.

At 3ie, our experience with managing impact evaluation grants has given us a better sense of how qualitative methods work as part of an impact evaluation. What seems clear is that in the quest to use mixed methods to produce high-quality impact evaluations, caution has to be exercised to ensure that qualitative methods do not play second fiddle to quantitative methods. Evaluators need to understand that the token inclusion of qualitative methods, through a few focus group discussions or in-depth interviews here and there, does not necessarily enhance the quality of the impact evaluation, nor does it justify calling the study a mixed methods evaluation. (A disappointing number of technical proposals that 3ie receives have this weakness.) Instead, qualitative methods should be systematically integrated with traditional quantitative methods at various points of an impact evaluation.

There are numerous ways this can be done. Qualitative methods can be immensely useful, for instance, in the development of quantitative survey instruments. By using qualitative methods, such as key informant interviews, focus group discussions and participatory rural appraisal, one can design surveys that take contextual issues into account and clarify the mechanisms through which the programme can benefit its recipients. A good example can be found in an ongoing 3ie-supported evaluation of a government programme that seeks to improve maternal and child care in rural Karnataka by providing poor women with free medical care in private health facilities. Results from qualitative analyses show that a range of factors, over and above the monetary aspects, play a crucial role in the decision of a pregnant woman and her family on whether or not to seek care from private facilities. These include problems of physical access, such as the distance to the facility and lack of transportation, as well as perceptions about the quality of care received. These insights have subsequently helped inform the survey instrument used in the impact evaluation.

Qualitative methods can also be helpful in the last stage of an impact evaluation, once the results have been generated. For example, a 3ie-supported impact evaluation of an intervention that aimed to increase the financial inclusion of people in rural Kenya found that, even after the costs of opening an account (information acquisition, account opening fees and administrative requirements) were covered, financial inclusion remained low. While 60 per cent of the people who were offered this opportunity opened accounts, actual account use was much lower. The study authors conducted interviews to understand why people selected to receive a subsidised account did not actively use it. They found that a significant proportion of people did not trust the bank, harbouring fears about embezzlement, unreliable services and high transaction fees.

It is lamentable that, although mixed methods are often seen as the way forward for conducting more meaningful impact evaluations, their use in impact evaluation is still very limited. For impact evaluations to be truly policy relevant, it is imperative that we use combinations of the most appropriate methods and a range of expertise in order to uncover and understand why and how an intervention works. So yes, it does indeed take a village to do an impact evaluation.
