As we've discussed on this blog before, the development community has made tremendous strides in producing and using rigorous evidence, but it still has a long way to go to build a strong evidence culture. The closing plenary for USAID's Agency Learning and Evidence Month on 27 April reflected on both recent years' progress and the remaining challenges.
The very existence of this first-ever "evidence month," with three dozen events, shows the increased focus on building evidence into development processes. Panelists discussed the many ways development agencies must update how they work to bring evidence to the forefront.
"Transforming development into an empirical science is a revolutionary idea of this century," said Arianna Legovini, director of Development Impact Evaluation (DIME) at the World Bank. "The idea is very simple, although hard to implement … instead of implementing rigid designs, rigid ideas, [or] predetermined ideas, we actually trial and test ideas on the ground during the implementation stages of all development interventions to learn much more about what works and how to make it work."
Designing projects based on rigorous evidence can have tremendous benefits for poverty alleviation, Legovini said.
For project managers to use the evidence, they need access to it. That's where 3ie's Development Evidence Portal comes in, according to 3ie Executive Director Marie Gaarder.
"Free and easy access to curated evidence for all is a necessary condition … for our vision of better evidence-informed programs," Gaarder said.
And while generating and making the evidence available may be necessary conditions, they're insufficient to ensure evidence use.
"In order to really get there and enhance [evidence] use, we need to work on all the levers that we have. We need to build capacity, we need to build incentives, and we need to build in processes for timely evidence use," Gaarder said.
She urged that the measures of "success" within many development organizations need to change.
"Success is often measured by project approvals and disbursements of funds … but not so much by results on the ground," Gaarder added.
It's not a fix that can be made with a single requirement at one point in the process.
"Without addressing and promoting a holistic evidence culture, single measures aren't going to do it," Gaarder cautioned.
At the Millennium Challenge Corporation (MCC), evidence use is already built in at several steps of the project design and approval process.
"We have established clear criteria upfront with requirements for evidence in problem definition and program design that are built into the process," said Tom Kelly, deputy vice president of MCC's department for policy and evaluation. "We have very clearly defined steps with evidence requirements as teams move through the process."
These requirements lead to conversations between project design staff and MCC economists, which lead to better-designed projects, Kelly said.
"It forces discussions of project definitions and sharpens our understanding of the diagnosis of the problem. And while not everyone likes to be asked for evidence to back up their assertions, it's now expected at MCC," he shared.
"Because of the structure we've put in place, no one is ever surprised internally when they're asked for evidence," Kelly said.
Hopefully, in the coming years, this statement will hold true for many more development organizations.