Putting government in the driver’s seat to generate and use impact evaluations in the Philippines

Impact evaluations are sometimes criticised for being supply-driven. It is hard to know for sure: there is no counterfactual for what would have happened without the impact evaluation.

Whether or not this is true, one way to make impact evaluations more demand-driven is to put the government in the driver's seat. This happened with the evaluation of Mexico's Progresa programme, one example of how evaluation changed the outlook for nutrition. A recent Brookings blog shows how other countries and institutions in Latin America are emulating that example.

The East Asian experience with impact evaluations is more recent. One promising country example is the Philippines, where 3ie supports a government-led impact evaluation programme. The ongoing programme develops and funds rigorous, mixed-method impact evaluations to inform policymaking in the country. It also includes capacity development activities for local researchers and support for the impact evaluation management framework of the National Economic and Development Authority (NEDA), the Philippine government's national planning agency.

At the recently concluded 2nd Asia Pacific Evaluation Association International Evaluation Conference in Manila, we presented challenges and early lessons from this joint programme. Participants from around the region showed strong interest in understanding how the collaboration works in practice and how governments can be encouraged to become strong champions and partners for evaluation. In this blog, we capture some of these reflections.

How we designed the Philippines Evidence Programme

Set up four years ago, the Philippines Evidence Programme (earlier called Policy Window Philippines) has established a strong partnership with the government. The steering committee overseeing the programme's implementation is chaired by NEDA and includes representation from 3ie and the main funder, the Australian government's Department of Foreign Affairs and Trade. NEDA spearheaded the selection of government implementing agencies that needed impact evaluations of key priority programmes. After the agencies were selected, direct engagement among 3ie, the selected research teams and the implementing agencies followed. These consultations ensured that evaluation questions and study designs were demand-driven and relevant to the implementing agencies' needs. As our partner, the government has a stronger sense of ownership of the programme and understands: (i) how the programme can benefit them by providing evidence for decision-making; (ii) the level of commitment required from the implementing agency to carry out an impact evaluation; and (iii) the limitations of the programme, that is, what it can and cannot do.

Four impact evaluations have been funded through 3ie's programme, in collaboration with the Department of Labor and Employment (DOLE), the Department of Social Welfare and Development, and the Supreme Court of the Philippines. The first of these studies, now published, examines the impact of DOLE's Special Program for the Employment of Students. The authors of the 3ie-supported study found that, in the medium run, the programme may be more effective as a work programme than as an education programme. The findings have informed DOLE's decision to shift the programme's focus to youth skills building and to improving the quality of the training provided.

What have we learned?

It has not always been smooth sailing. What have been the most important factors in building trust and alignment amongst government agencies that have multiple priorities? Here are some lessons that we collectively learned.

Early, ongoing, proactive engagement involves patience and planning for failure.

One of the major challenges we have encountered is attrition and (sometimes rapid) turnover in a government agency's leadership between the study design and implementation phases. The 2016 Philippine presidential election meant that we had to carry out a new round of consultations with all the agencies involved in our programme to secure buy-in. We also had to adapt to changes in government priorities. In some cases, the operating guidelines of the programme chosen for evaluation also changed, resulting in implementation delays.

We had to be flexible, addressing new demands as quickly as possible while adhering to the programme's objectives and agreed deadlines. For example, an 18-month engagement between 3ie, the evaluation team and an implementing agency failed to secure support for a study because of the sensitive and political nature of the programme to be evaluated. Despite everyone's best efforts, the politics ultimately proved too complicated to manage. In another instance, despite multiple changes in a department's leadership, we were able to launch an impact evaluation after two years of engagement, with the active support of NEDA.

Continued coordination with decision makers and official agreements help facilitate study implementation.

Closely coordinating with programme implementers and technical staff helps ensure that studies stay on track and, when they do not, that issues are flagged to decision makers, including the steering committee. For example, one of the originally selected implementing agencies took longer than expected to resolve internal challenges, putting the study's feasibility at high risk. Given the time remaining, the steering committee made the deliberate decision to reallocate resources to another agency that was ready to take on an impact evaluation.

Signed agreements help. At the start of each study, 3ie signed a letter of agreement with the head of the relevant implementing agency to document the start of the collaboration. These letters proved invaluable in facilitating the continuity of the studies, especially in the face of rapid turnover within the agencies.

Customised capacity development and learning by doing are important for building a culture of evaluation.

It is important to develop a good understanding of the government's capacity to commission such studies, as well as to use the evidence produced, so that training and other capacity development activities are customised and well targeted. To develop this understanding, each study has a technical working group comprising representatives from all the offices involved in implementing the programme under evaluation. Convening this group as often as necessary has allowed the various programme implementers to be closely involved in each study. They have given feedback on questionnaires and on what is and is not feasible, and have provided inputs that helped the research team interpret some of the results.

We also organised demand generation workshops, where representatives from multiple government departments expressed their evaluation and capacity needs and provided extremely helpful suggestions on where we should be focusing our efforts.

As part of the programme, 3ie also sponsored a two-week impact evaluation course for a group of 30 government representatives, young academics and graduate students. The long-term goal is to create a sustained interest in, and demand for, evidence and have local evaluators (government and non-government) engaged in independent, high-quality evaluations. A strong background in evaluation methods will also be helpful for government partners to translate the findings into action points for policy.

In summary, successful engagement with government requires targeted strategies and responsiveness to the specific needs of government counterparts. Our experience in the Philippines (and many other countries) has affirmed our belief that government-led and demand-driven studies are important for generating evidence that is useful and relevant to decision makers. 

Authors

Emmanuel Jimenez, Director General, Independent Evaluation Department of the Asian Development Bank (ADB)
Tara Kaul, Former Evaluation Specialist, 3ie
Fides Borja, Consultant, 3ie

About

Evidence Matters is 3ie’s blog. It primarily features contributions from staff and board members. Guest blogs are by invitation.

3ie publishes blogs in the form received from the authors. Any errors or omissions are the sole responsibility of the authors. Views expressed are their own and do not represent the opinions of 3ie, its board of commissioners or supporters.
