Seven ideas for rapid evidence that is both rigorous and actionable
When policymakers come to us for rapid response evidence, they want it to be immediate, actionable, and reliable, drawing on findings from high-quality evaluations. These requirements can sometimes seem to be at odds – often there are details of a specific policy situation that have not yet been addressed by rigorous research. So how do we balance the competing needs to be both actionable and rigorous?
While there is no precise formula, our work on the WACIE Helpdesk (available in English and French) has led us to identify a set of processes and guardrails we use to provide tailored evidence that is actually useful to practitioners.
- Help practitioners refine their questions. Often, we get a question that existing research, or even existing methodologies, cannot answer. But there is usually a part of the question, or a related question, that can be answered. When we get an unanswerable question, we work with the practitioners to narrow its scope, identify key pieces of information that can still provide the necessary insight, or point to evidence on similar policies. It can take several rounds of refinement, but eventually we arrive at a question that existing research can answer and that can inform the specific policy decision.
- Set minimum standards for evidence quality. We aim to base our rapid responses on existing evidence synthesis work, specifically vetted "high" or "medium" quality systematic reviews as coded in our Development Evidence Portal (you can see our coding standards for systematic review quality here). Without minimum standards like these, it can be very tempting to draw on studies that seem to exactly address a policymaker's need for evidence – but which do so with questionable data.
- Remember that context matters. Not all studies, or all components of all studies, are relevant or appropriate for a given policy context. Knowing what is applicable requires that members of the team assembling the evidence possess in-depth country knowledge.
- Be upfront about what evidence does not exist. We maintain credibility about what the evidence says by being equally direct about what it does not say. Sometimes, even just saying "no one knows for sure on that point" can be helpful for a practitioner.
- Be proactive about suggesting ways to fill evidence gaps. There are often ways to build new evaluations or add data points into existing or upcoming programming that can provide answers down the line. Even if we can't answer all parts of a question immediately, we can say: "Here's how you can learn that during implementation and make fixes along the way."
- Speak the policymaker's language, literally and figuratively. For practitioners working in languages other than English, sometimes there is a dearth of studies published in the local language, requiring literal translation. But even evaluations written in a policymaker's language are often written with terminology that is inaccessible to non-technical audiences, and that's where the figurative translation comes in. In all cases, it is important to write and speak using the vernacular of the policymaker, not the academic researcher.
- Don't tell people what to do. Our role is to provide evidence, not to make the decision. Practitioners have many competing interests to weigh. When we do offer recommendations, they tend to be issues to consider or points to address, not specific policy actions.
Still, we're learning more with each rapid response we provide. And if you have any thoughts, insights, or questions on the process – or if you're a policymaker in need of rapid evidence – reach out to us!