Toward a better understanding of multiple influences
2019-12-12
This blog is written by Josef Fischer, Data Manager at the Centre. It outlines some of the key distinguishing features of a research methodology called 'QuIP' and considers why it may be well suited to impact measurement and evaluation in the youth sector.
At the Centre for Youth Impact, we are always curious about new approaches and ideas within the arena of impact evaluation. So when a qualitative methodology termed “Qualitative Impact Protocol”, or “QuIP”, came to our attention, we leapt at the opportunity to learn more, and when the chance arose for a couple of days’ training in the beautiful city of Bath, I was delighted to go. This blog outlines some of the reasons I found it fascinating, and why the Centre is now considering using QuIP in our future research.
The QuIP, held under licence by Bath Social Development and Research from the University of Bath, has better impact evaluation as its core purpose. Specifically, it uses contribution analysis to address the question of attribution of impact in complex environments. In other words, it looks at how we as practitioners understand the relationship between the impact we measure and our specific activities. QuIP supports qualitative data analysis by providing tools to explore and confirm causal pathways in a more transparent, structured way, while maintaining the qualitative ethos of listening to the voices of people and communities as narratives.
Here’s a brief summary of how it works.
Firstly, the QuIP addresses confirmation bias – the tendency to seek out information that supports what we already believe – a methodological concern sometimes levelled at qualitative research. It does this by partially or completely ‘blindfolding’ researchers to the aims, and sometimes the commissioner, of the research: researchers don’t know what the project or provision was intending to achieve, or who has funded the research into its impact. Secondly, to address the so-called “black box analysis” problem, where the words of people and communities are somewhat mysteriously reworked to produce research findings, clear coding and visualization on a dashboard allow instant referral back to the source data: the beneficiaries’ stories. The dashboard makes immediately clear not only how many different ‘causes’ of change are mentioned, but also how often. The QuIP can be used as a confirmatory study, testing whether evidence exists for a particular theory, or as an exploratory study, surfacing the causal relationships between concepts in a theory.
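To make the dashboard idea a little more concrete, here is a minimal sketch in Python of how coded narrative excerpts might be tallied into counts of causes and cause-to-outcome links, while keeping a link back to each verbatim quote. This is purely an illustration of the underlying idea, not QuIP's actual coding scheme or tooling; the data structure, field names and example excerpts are all hypothetical.

```python
# Hypothetical illustration only: QuIP has its own coding and dashboard tooling.
# This sketch just shows the idea of tallying coded causal claims while keeping
# every count traceable back to the source quote it came from.
from collections import Counter
from dataclasses import dataclass

@dataclass
class CodedExcerpt:
    respondent_id: str   # anonymised identifier for the interviewee
    cause: str           # driver of change named by the respondent
    outcome: str         # change the respondent attributes to that cause
    quote: str           # verbatim excerpt, so findings stay traceable

excerpts = [
    CodedExcerpt("YP01", "trusted adult", "increased confidence",
                 "My youth worker believed in me, so I started speaking up more."),
    CodedExcerpt("YP02", "safe space", "increased confidence",
                 "The club was somewhere I could just be myself."),
    CodedExcerpt("YP03", "trusted adult", "staying in education",
                 "She helped me sort out my college application."),
]

# How often is each cause mentioned, and which cause-to-outcome links appear?
cause_counts = Counter(e.cause for e in excerpts)
link_counts = Counter((e.cause, e.outcome) for e in excerpts)

print(cause_counts)   # e.g. Counter({'trusted adult': 2, 'safe space': 1})
print(link_counts)

# 'Instant referral back to source data': retrieve the quotes behind a count.
def quotes_for(cause: str) -> list[str]:
    return [e.quote for e in excerpts if e.cause == cause]

print(quotes_for("trusted adult"))
```

The design choice that matters here is keeping the verbatim quote alongside every coded row: that is what lets any count on a dashboard be traced straight back to the words of the people who gave it.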
Resonant with other qualitative techniques, the QuIP does not claim to arrive at universal truths, but rather to build incrementally on existing knowledge. Importantly, it can be used alongside other qualitative and quantitative methodologies, and it balances cost-effectiveness and credibility in a careful manner.
The attribution of impact has long been a significant challenge for youth organisations, in part because of the semi-structured, informal nature of provision, but also because change in the lives of young people has so many other influences. Where provision has a specific aim of reducing young people’s exposure to or involvement in so-called risky behaviours, providers can find it particularly difficult to ‘prove a negative’ – to demonstrate that something that would otherwise have happened didn’t, and that their work was responsible. Many of the same providers have long argued that we should better value the voices of young people in telling us how provision impacts their lives.
Even where attribution feels less significant as a driver, we still have much to learn about how and why youth provision contributes to change for young people. There are multiple theories – trusted adults, safe spaces, experiences of fun and challenge – but we know less about how these experiences interact across different forms of provision, and how young people experience change over longer-term relationships with youth provision.
As I digested the QuIP and spoke with colleagues here at the Centre, there was a sense of excitement about how we could use the methodology to better understand how young people see youth settings in their lives and how they think those settings affect them. It offers new opportunities that rely less on counterfactual data to infer statistical impact, or on clever statistics to tease apart causal pathways. QuIP provides us with another way to better understand impact: if executed properly, we could get a clear visualization, with supporting data, of how youth provision forms a critical part of the web of influences in young people’s lives.