2020 Hindsight: A year of what the evidence has taught us
A little less than a year ago, at the start of what has become an infamous year, we launched a yearlong social media campaign called ‘2020 Hindsight: What Works in Development.’ Introducing the campaign, we explained that 20/20 vision means seeing clearly at 20 feet what a person with normal vision sees at that distance. By extension, 20/20 hindsight means being able to evaluate past choices with a clarity one did not have at the time of making them.
Given the 2020 hindsight we now have about this past year, it is tempting to write a blog about how much better we should have prepared, knowing full well that pandemics can occur. But we will resist that temptation, because it would distract from this blog's main purpose: to reflect on what we learnt from our synthesis work.
3ie was established in 2008 to help address the gap in quality development evidence that existed at the time. Since then, we have supported the production of new and rigorous evidence of what works, where, why, for whom and at what cost. Fast forward to 2020, and thousands of development effectiveness studies have been conducted in low- and middle-income countries by 3ie and many others, and hundreds of systematic reviews have synthesized the bodies of rigorous evidence relevant to development policy research questions. In addition, 3ie has curated the world’s largest searchable development evidence portal.
While the evidence gaps are still large in sectors like climate change or in contexts of fragility and conflict, we need to hold ourselves to account. What have we learnt 14 years after the report asked ‘when will we ever learn?’ We were aware of the possible challenges. As we noted in our launch post:
- A lot of the impact evaluations included in systematic reviews are of relatively low quality, so we should be careful about the conclusions we draw.
- A lot of systematic reviews find conflicting evidence, whether because evidence is lacking or because of moderating factors like context, implementation and intervention design.
- There may be evidence that interventions mostly do not work.
- Even when programs create change, they may not achieve the desired development goals.

Elsewhere, we have also blogged about the fact that impact evidence alone is not enough: cost evidence is essential for deciding how scarce resources can best be used, and it is still largely missing.
The first of these concerns proved easier to manage thanks to the newly updated development evidence portal. Since every systematic review in the portal carries a quality appraisal rating, it was easy to weed out the low-quality reviews.
The remaining points all turned out to be more or less true. Nevertheless, we did find enough clear, interesting and actionable findings of positive effects to fill the year, though evidence on their cost-effectiveness was missing in action.
The COVID-19 pandemic did make us reflect on how existing evidence could help us with something unprecedented. Looking at health and sanitation research took on a new importance, leading us to write about handwashing, vaccination campaigns, and community health workers.
Since it's the holiday season, gifts are on many of our minds. For us, this campaign has been a year of gifts, in a sense, with each piece of evidence representing a gift of knowledge. Here are a few of our favorites:
- How to get people to wash their hands: Only in 2020 could a blog post about handwashing have attracted so much attention and become our favorite post.
- What works to improve vaccination rates in low- and middle-income countries: We wrote this post back in April, hoping that it would be useful before too long. Now, the topic is even more relevant, as the distribution of COVID-19 vaccines begins.
- In humanitarian emergencies, cash-based social assistance is cheaper than food distribution, and it works just as well: With countries around the world initiating or expanding cash transfer programs to help manage the economic effects of the pandemic, we were glad that the evidence showed it to be an effective approach.
- What works to help students learn? Teach the teachers. Feed the students: The most effective solutions are not always high tech. Although this post was written before the pandemic was declared, structured pedagogy programs (new instructional materials combined with teacher training), adapted to the realities of online teaching, are only likely to grow in importance!
- What works to get firms hiring? Support small businesses — but not the smallest ones: This post is not quite what we expected when we set out to write about support for small businesses. Rather than findings about which types of programming were most effective, we found evidence about what size of businesses should be targeted for support, if the goal is to increase hiring.
- Shifting forest management to local control may reduce deforestation, without payments to anyone: Unlike our other posts, this one highlights a policy change rather than a program funded by a government or a donor. In fact, it would not necessarily cost anything!
We hope that our readers have found this series not just interesting, but useful. If our campaign has led to concrete changes in development policies, programs of research, or other examples of evidence use, we invite you to share your story with us at influence@3ieimpact.org. The best examples we receive will be recognized on our website with blog posts in next year's social media campaign, which will focus on evidence impact and use. We look forward to hearing from you!