Four Ways to Bring Evidence into Education Policy: Lessons from IPA’s Work in Emerging Markets


Editor's note: This post first appeared on NextBillion.

Evidence-based policymaking. Those in the academic and think tank worlds all say they want to do it, but aren’t quite sure how. We can all agree that for evidence-based policymaking you need evidence, but you also need to “make it easy” to use that evidence, as behavioral economist Richard Thaler would say.

Evidence we have – IPA has done 600-plus randomized evaluations, more than 50 in education alone. We even have summaries of evidence highlighting education policy lessons emerging from multiple studies, such as:

  • Reducing the actual and hidden costs of school helps more kids attend;
  • Targeting instruction at the child’s level improves learning levels.

So how do we make it easy for policymakers to use evidence once we have it? The education sector comes with its own set of challenges and political complications, and they differ from country to country. What we've learned from working with over a dozen ministries of education across the developing world is a resounding "it depends." There is no one-size-fits-all strategy, and making it easy is anything but easy, but wherever we go we've found it always has something to do with relationships and timing.

Here are four strategies we’re pursuing in different contexts, all of which are leading to a greater understanding of evidence – to varying degrees of impact so far – in ministries of education in Peru, Ghana, Kenya and Liberia.

Peru: Institutionalize the Use of Evidence


Making something routine is one way to make it easy, so what better way to ensure evidence is used than to make it a permanent part of the policy process? In Peru, we are approaching this as a long-term collaboration by building an institution: a lab within the Ministry of Education dedicated to innovation and evidence, so that testing policy ideas before implementing them becomes a routine step in policymaking. We also hope that making the lab a permanent part of the government will help it withstand changes in administration. Together with our partners at J-PAL, we seconded a staff member to the lab during its early stages, whose job was to connect program and policy ideas and innovations to academic evaluation. For example, one pilot tested sending text messages to principals to motivate them to carry out school maintenance activities. The messages were successful, and Peru's Ministry of Education is now considering how to use these results to inform its communication strategy with teachers and principals.

Ghana: Adapt With the Changing Landscape and Co-Create Evidence Together


One finding from previous evaluations is that many students in poor countries are in classrooms but not benefiting from the lessons, because they lack the basic skills needed to keep up. Research in India, however, showed that remedial tutoring in basic skills can be a very cost-effective way to help them catch up. We worked with the Government of Ghana's Education Service to design and test a targeted instruction program based on that evidence. To make sure it was adapted to the local context, we first had to take the concerns of the teachers' unions into consideration, which meant adding a teacher-led intervention to the variations of the program tested locally. The Ghanaian context also offered an opportunity to try a new idea: leveraging an existing national youth employment program to recruit teaching assistants. The study showed that using these assistants was more effective than the teacher-led intervention, largely because the teacher-led version was not implemented as successfully. Over the long term, however, the youth employment program became defunct, so we have helped the Education Service adapt the evidence to focus on implementation and teacher accountability. In this case, making it easy to use evidence meant working together to answer questions over the long term and adapting to the existing policy and implementation landscape.

Kenya: It’s About Being Helpful, Not Just About Sharing Evidence


What happens when you have to tell a group that the approach they're taking rarely works? Our Kenya team has worked for years with various actors in the Ministry of Education, and we recently joined the permanent working group on Technical Vocational Education and Training (TVET). The working group aims to advance TVET by bringing together government, the private sector and donors, yet much of the existing evidence suggests that traditional TVET has not been especially effective at improving employment outcomes. Our voice in the group might have been unpopular had we led with that; after all, the group is full of people who have dedicated their lives to TVET. Instead, we led with our strength in convening many stakeholders and put ourselves in the position of co-leading a large TVET conference in January, a slower introduction to the working group's leadership. When we then shared the mixed evidence on TVET, it was better received, and we are now better positioned as partners in figuring out which kinds of TVET innovations do lead to impact. There is much left to do, but we now have an important seat at the table with all the right actors in a sector that might have ignored our original message had we simply dropped a report in their laps and walked away.

Liberia: Build Buy-in From the (Very) Top


In other cases, making it easy means securing commitment from the very top, early on, to let rigorous research evaluate a program's success. Our team is working with researchers and the Ministry of Education in Liberia to evaluate the impact of Partnership Schools for Liberia, a public-private partnership model for delivering public education, on education outcomes. The minister has publicly committed to using this rigorous evaluation to inform Liberia's education sector strategy, which enhances the credibility of both the ongoing pilot program and any policy decisions that come out of it. More importantly, securing this commitment to evidence-based policy decisions before the evaluation, and brokering a relationship between academics and policymakers to design it, means the decision to incorporate that evidence is already made.

There are many paths to effective education policy, but we've found that making it easy comes down to being helpful, whether we're working in the day-to-day, long-term iterative policy process or at the top with the minister himself. It means taking the time not just to come in with the "you need evidence" mantra, but to deeply understand and care about policymakers' needs, work hand-in-hand with them to identify and address those needs, and, when warranted, address them with evidence. This requires a slower, listening-oriented approach, but if time is the key to getting evidence off the shelf and into use, then it is time well spent, and cost-effective at that.

April 25, 2017