October 03, 2017

by Nicola Jones and Danielle Moore

This is the third in a series of blog posts summarizing the discussions from a May 2017 researcher gathering on measuring women’s empowerment in impact evaluations. Read the first post on household decision-making power here.


Last week, Dr. Sarah Baird told us about the benefits and challenges of integrating qualitative methods into impact evaluations on women’s empowerment, particularly a multi-country longitudinal study on gender and adolescence. You can read our conversation here.

Today, we’ll hear from the director of that study, Dr. Nicola Jones, to understand the qualitative researcher’s perspective. Nicola is a Principal Research Fellow at the Overseas Development Institute in London and specializes in qualitative and participatory evaluation approaches. In this post, we discuss why qualitative methods are important for understanding the social norms that shape women’s lives. She also shares her experiences using mixed methods studies to promote evidence-informed policymaking, with qualitative methods helping to illuminate the mechanisms behind quantified impact estimates. As we discuss below, this combination can allow researchers to think more critically about their findings, tailor dissemination to different stakeholders, and explain unexpected results.

Nellie:   Last week, Sarah Baird and I talked about how impact evaluations that are looking at women's empowerment could benefit from incorporating qualitative methods, particularly in the DfID-funded Gender and Adolescence Global Evidence (GAGE) mixed methods study that you're working on with her. Could you tell us why, in your experience, qualitative and participatory methods are particularly useful in evaluations that focus on women and girls?

Nicola:  Qualitative methods—particularly those that allow the participant to lead the conversation—can allow you to unpack the role that deeply-entrenched, often sticky, social norms have in perpetuating the barriers to women’s empowerment. Norms don't necessarily disappear, but they can be quite malleable, as they come to be practiced in more covert or underground ways. Recently in Ethiopia we’ve seen that, while people might be more aware that child marriage is illegal, and even aware of the penalties of going against those legal bounds, they are still finding ways to continue practicing this tradition. Qualitative methods help us understand those nonlinear change pathways that won't necessarily come up in quantitative survey data.

Nellie:   What we're able to capture or not about social norms came up continually throughout the workshop that inspired this blog series. Could you expand on why quantitative methods alone might not be able to capture what is going on in social norms?

Nicola:  There is progress being made in potential survey modules on social norms, but there are limitations to what can be done quantitatively. First, it’s not clear whether surveys administered to individuals are capturing individual attitudes or really capturing norms that are operating at a community level. Understanding those levels remains tricky without making the survey incredibly long and cumbersome. The beauty of qualitative methods is being able to unpack whether it's the sanction of nonconformity that is driving people's behavior or whether it's really a deep-seated belief that they indeed hold.

Nellie:   And what can an economist bring to the table in a mixed methods study, such as GAGE?

Nicola:  Having development economists on the team helps us to think in a much clearer and more precise way about impact, what are we measuring, and what are the claims that we can make about change. For myself, and I think some other colleagues who tend to use more qualitative methods, it’s been valuable to see the creative ways that experienced development economists can exploit multiple treatment arms to tease out the contribution that different program components or variations are having on adolescent lives. We typically don't exploit that kind of multi-arm design within qualitative research. In fact, good examples of qualitative methods that sit alongside multi-armed interventions are very hard to come by in the literature, beyond just tacking on focus group discussions in each group.


Nellie:   Given the potential benefits of integrating qualitative methods into impact evaluations looking at women's empowerment, I naturally wonder why we don't see more partnerships between researchers across disciplines in this space. Is there an example of a study that you worked on that could have benefited from a stronger quantitative component? What were the factors that played into why you weren't able to, or didn't, include more robust quantitative methods in that study?

Nicola:  A couple of cases come to mind. In one study, we examined citizen perceptions of cash transfers for vulnerable populations in the Middle East and Africa. We found that it's critically important to make sure transfer recipients have opportunities to engage in that programming, rather than taking a top-down approach. But while the findings that our country partners unearthed through the in-depth qualitative work were fascinating, receptiveness to those findings among policymakers was relatively limited, partly because we weren't able to quantify some of the impacts that we were seeing. A stronger quantitative component could also have pushed us to think more critically about some of our qualitative findings if the two sets of results hadn't converged.

Nellie:   It sounds like in this case, policymakers may have had a bias toward quantified results. And so, providing context for your qualitative findings with quantitative work would have encouraged take-up of those findings at the local level.

Nicola:  Yes, ministries of finance especially are more receptive to quantitative findings. Also, when you are trying to shift discourse, multiple methods allow you to tailor arguments to different stakeholders. If you're confirming what people want to hear, then they don't interrogate your methods necessarily with the same depth. But when you are bringing new ideas, it becomes particularly pertinent to use robust mixed methods.


Nellie:   That makes a lot of sense. Finally, what would you say to encourage development economists to incorporate qualitative methods into their work? What are some of the challenges that you'd want them to be prepared for?

Nicola:  First, an understanding of the change pathways underpinning the numbers that they're finding will be critical if they need to defend their findings, particularly if the results are disappointing or unexpected. Sometimes the program can have a very robust design but still might fail due to the political context. In those cases, qualitative work with a political science perspective could shed light on why the policy processes in which the programs are situated are either facilitating or hindering program effectiveness.

In terms of the drawbacks, mixed methods research is more time and resource intensive, both in data collection and in analysis, particularly because there's not yet a universal consensus on what constitutes gold-standard mixed methods work. We are still learning how best to weave together different approaches in a way that is most compelling and methodologically robust.

And sometimes participatory approaches, where there may be repeat interactions with individuals or communities, are more transformative for participants than the actual program that's being evaluated. In the Gaza Strip, some of the adolescents involved in our participatory research said that the questions that we were asking had transformed the way they thought about their current situation and their next steps more than the curriculum that we had been evaluating. On the one hand, it's really positive that we can have that kind of effect, but it's important to then disentangle what the intervention is aiming to do from the effects more interactive evaluation methods might have on program beneficiaries.

Nellie:   Thank you. I think it's so important that researchers who want to use mixed methods go into it with eyes open and see the potential challenges they might face. To that end, I really appreciate you making time to join in this conversation.


Read the full interview with Dr. Jones on GAGE's website here.