Latest blogs

The HIV/AIDS treatment cascade

One of the reasons we appreciate international days is that they prompt us to pause and reflect on what we’ve been doing over the past year, as well as to think about what the next year will bring. On this World AIDS Day, our first reflection is realising how much we have grown our HIV/AIDS programming at 3ie in 2013.

A pitch for better use of mixed methods in impact evaluations

At the opening session of 3ie’s recent Measuring Results conference, Jyotsna Puri, Deputy Executive Director and Head of Evaluation at 3ie, said, “It takes a village to do an impact evaluation.” What she meant was that, for an impact evaluation to be successful and policy relevant, research teams need to be diverse and draw on a mix of disciplines, including statisticians, anthropologists, economists, surveyors, enumerators and policy experts, as well as use the most appropriate mix of evaluation and research methods.

Shining a light on the unknown knowns

Donald Rumsfeld, a former US Secretary of Defense, famously noted the distinction between known knowns (things we know we know), known unknowns (things we know we don’t know), and unknown unknowns (things we don’t know we don’t know). In international development research, these same distinctions exist.

How will they ever learn?

The low quality of education in much of the developing world is no secret. The Annual Status of Education Report (ASER), produced by the Indian NGO Pratham, has been documenting the poor state of affairs in that country for several years. The most recent report highlights the fact that more than half of grade five students can read only at grade two level. Similar statistics are available from around the world.

Can we learn more from clinical trials than simply methods?

What if scientists tested their drug ideas directly on humans without first demonstrating their potential efficacy in the lab? The question sounds hypothetical because we all know that using untested drugs is potentially dangerous. By the same logic, should we not be exercising similar caution with randomized controlled trials (RCTs) of social and economic development interventions involving human subjects?

The importance of buy-in from key actors for impact evaluations to influence policy

At a public forum on impact evaluation a couple of years ago, Arianna Legovini, head of the World Bank’s Development Impact Evaluation programme (DIME), declared that ‘dissemination is dead’. But her statement does not mean that we should stop disseminating impact evaluation findings in order to influence policy.

Moving impact evaluations beyond controlled circumstances

The constraints imposed by an intervention can often make designing an evaluation quite challenging. If a large-scale programme is rolled out nationally, for instance, it becomes very hard to find a credible comparison group. Many evaluators shy away from evaluating programmes where a plausible counterfactual is hard to find. And since the findings of such evaluations are also harder to publish, there seem to be few incentives for evaluating these programmes.

Placing economics on the science spectrum

Where does economics fit on the spectrum of sciences? ‘Hard scientists’ argue that the subjectivity of economics research differentiates it from biology, chemistry, or other disciplines that require strict laboratory experimentation. Meanwhile, many economists try to separate their field from the ‘social sciences’ by lumping sociology, psychology, and the like into a quasi-mathematical abyss reserved for ‘touchy-feely’ subjects, unable to match the rigour required of economics research.

Home Street Home

The International Day for Street Children provides a platform to call on governments to act and support the rights of street children across the world. But a recent study shows that we have very little evidence on how we could act most effectively to address the needs of street children. We need more such studies, as they are the only way we can avoid spending on ineffective programmes and channel funding to programmes that do work.

Uganda shows its commitment to evaluation

Uganda’s cabinet has just approved a new monitoring and evaluation policy, which will be officially endorsed and disseminated next month. It comes as a positive signal after several donors suspended aid to the government, and it provides a solid foundation for boosting the country’s commitment to evidence-informed policymaking.

About

Evidence Matters is 3ie’s blog. It primarily features contributions from staff and board members. Guest blogs are by invitation.

3ie publishes blogs in the form received from the authors. Any errors or omissions are the sole responsibility of the authors. Views expressed are their own and do not represent the opinions of 3ie, its board of commissioners or supporters.
