“Cracking the impact nut” or How the Youth Investment Fund learning and impact strand responds to the challenges of evaluation in open access provision
This blog was written by Matthew Hill, Head of Research and Learning at the Centre for Youth Impact.
Our YIF approach to data collection
NPC and the Centre for Youth Impact are leading the learning and impact strand of the £40 million Youth Investment Fund (YIF), a joint programme supported by government funding from DCMS and National Lottery funding from the Big Lottery Fund. Eighty-six youth providers are being supported for three years (2017-2020) to develop and expand their open access youth provision, and we are currently working with them to design an evaluation approach that captures the value of their work and supports their learning and improvement. Our overarching approach is an attempt to crack some of the perpetual practical and methodological nuts (my favourite bar snack) in measuring the impact of open access provision. This blog outlines five ways in which we are confronting these challenges within our work.
Moving away from blanket outcomes measurement
The past decade or so has seen a concerted push to be more outcomes-focused. We continue to support an outcomes-focused approach to service design and delivery (that is why we all do what we do, after all), but our YIF work represents a shift away from blanket outcome measurement (i.e. trying to capture every outcome for every young person for every organisation). This shift is a direct response to many of the perpetual challenges of outcome measurement in open access settings, including fleeting or irregular engagement, defining generalised outcomes for individualised provision, developing robust metrics for broad personal change, and the issue of measuring long-term impacts. Far from abandoning outcome measurement, we are focusing on high-quality targeted measurement with a sub-sample of the YIF cohort, which will ultimately provide us with more robust and meaningful data.
As well as this targeted approach to outcomes, our YIF work places increased emphasis on the experience of young people and the quality of the provision they receive. Crucially, we are aiming to link the data on outputs, user feedback and quality to the targeted outcome data so we can understand not only whether the provision is having an impact on young people – but why.
Focusing on the user experience
Another challenge is that young people often feel overburdened with rather obscure and meaningless (to them at least) surveys. In response, our YIF approach focuses on the elements of delivery that are most relevant and meaningful to young people – namely their experience of services (e.g. feelings of safety, respect and positive challenge). We are working with Keystone Accountability to develop a set of standardised feedback questions around this experience. Instead of large annual surveys, this process uses regular light-touch feedback – perhaps 3-5 questions once a month. This ensures that user feedback is embedded in ongoing reflective practice and, crucially, means that organisations can respond more immediately to the findings. Critically, we are also working with providers to process and act on this feedback, and to tell young people what has changed as a result.
Improving as well as proving
Another nut that needs cracking is practitioners’ sense of dislocation between a lot of impact measurement and their everyday work. Part of our commitment to ‘going with the grain’ of provision is a focus on the quality of youth work practice. This data absolutely has to be linked to outcomes data – as ultimately this dictates what is and what isn’t quality provision – but by emphasising considerations of quality we are focusing on those elements of provision that are most relevant and meaningful to youth workers themselves. Our YIF work is drawing on an established quality improvement framework from the US – the Youth Program Quality Assessment – which relies on peer observation with youth workers identifying ‘markers’ of quality in the delivery of their colleagues. This framework is not a critique of existing quality assessment frameworks but is, in fact, a complement to them – ensuring quality is also monitored and increased as part of ongoing practice improvement rather than just assessed against an existing standard.
Understanding young people’s journey through services
Although most providers collect detailed attendance data, many tell us that they use this for monitoring overall service demand rather than truly understanding the way that individuals engage with their services. By utilising existing data and trialling new digital methods such as Yoti, we aim to build a much more nuanced picture of what young people do with their feet – how often they attend, for how long, and how they move through provision – as a proxy for their levels of engagement and ‘exposure’ to interactions.
Arguably the greatest opportunity presented by the YIF is the potential to collect shared data across 86 grantees for three years. This offers a rare (probably unique) opportunity to build an evidence base across a huge diversity of open access provision (detached/ building-based; structured/ unstructured; universal/ targeted) and, by comparing the results across different types of provision, we will be able to really understand the strengths and weaknesses of different services.
We recognise the many challenges that open access providers face and believe that the dominant paradigm of measurement is not fit for such settings. Our YIF work has the potential to overcome some of these challenges, and to develop approaches that are applicable across the wider sector. It is certainly ambitious and we are trying new things out – some of which will work, but some of which will no doubt fail. We will confront this uncertainty with a pioneering spirit and the humility to admit when things don’t work. As well as grantees, we are committed to working with the wider sector, and we would greatly value your input in testing, refining and reflecting upon the tools and evidence that emerge. Our dedicated YIF learning and impact website will be live soon… so please watch this space… or get in touch with Matthew.Hill@youthimpact.uk or Anoushka.Kenley@thinkNPC.org if you want to find out more in the meantime.
Inspiral Targets: the measurement of everything and the value of nothing
Dan Gregory, of Common Capital, makes the case for ‘uncertainty, complexity and modesty’, as he reflects on his recent keynote presentation at The Centre for Youth Impact Gathering 2017.
Dan has over 10 years’ experience of funding and financing voluntary and social enterprises, through developing policy at the highest level and delivering in practice at the grassroots. He has worked for the Treasury and the Cabinet Office where he led the development of government policy on third sector access to finance, social investment and the role of the sector in service delivery. Dan spends some of his time at Social Enterprise UK and also works independently under the banner of Common Capital.
I’m not much of a natural public speaker. Certainly not an exciting or inspiring one. So when I am invited to speak at a conference or event, I try to be at least informative. To try to satisfy the audience with substance if not style. So lots of facts, evidence, insight or expertise from a field I have worked in for a decade and more. Talking from the solid ground of territory I know well and feel comfortable upon.
Sometimes this goes down well. But other times I sense the audience feels a bit bombarded with wonk grenades. They don’t seem to really warm to me, they feel my facts and evidence are a bit relentless and dry. They don’t ask questions afterwards because, frankly, they’ve had enough already, thanks.
Recently I spoke at The Centre for Youth Impact Gathering 2017: Shaping the future of impact measurement. The event was for practitioners, researchers and funders with an interest in learning, evidence and evaluation in work with young people.
It seemed to go very well. A few people said afterwards that they really valued my presentation. In an unprecedented turn of events, a few even told me they enjoyed it. This was confusing for me because, to be honest, I wasn’t really sure what I was talking about and certainly wasn’t on solid ground. Frankly, I don’t really know much about social impact measurement. I’m not an economist or accountant. I’m not a social impact guru or measurement Maharishi. I haven’t got a social impact measuring stick. So why did it seem to go so well, at least compared to normal?
In short, my presentation was about how social impact measurement is largely a load of rubbish. Although not entirely. I started off admitting I was a bit of a fraud and didn’t have any technical expertise as such. But I do have a broad perspective, having worked in and around this area for 15 years now, in government and outside, and from watching the rise of new fields of social impact measurement, social impact bonds and so on.
So why would someone argue that social impact measurement is a load of nonsense?
Perhaps above all, because the world is just too complex to reduce to the logic of input > output > outcome > impact. Such linearity is a joke. Perhaps this works in the laboratory - in very controlled and restricted conditions. But out in the field, people are more complex than bacteria, social programmes are not vaccines, homeless people are not a disease. As social innovation guru Geoff Mulgan has pointed out, evidence can be as unreliable and contingent as humans are irrational and unpredictable. Mulgan has described how, “Unlike molecules, which follow the rules of physics rather obediently, human beings have minds of their own, and are subject to many social, psychological, and environmental forces…. Very few domains allow precise predictions about what causes will lead to what effects.” Sadly, later in the same article, he subsequently suggests his own, new, social impact measurement methodology as the answer to this problem, immediately undoing all his good work in rising above the fray, and bringing himself down to the level of the more common or garden impact measuring gun for hire.
Second, and related to complexity, is the realisation that the impact of any action is always to some degree context specific – to a particular family, community or individual, for instance. Until the day arrives when we are able to construct multiple realities, it remains impossible to ever really know what would have happened otherwise. Randomised Controlled Trials, for instance, might tell us what happened elsewhere but not what would have happened here. And even RCTs are somewhat problematic. Beyond the issue of their considerable expense, even the World Health Organisation is losing faith in RCTs: “As the complexity of interventions or contexts increases, randomization alone will rarely suffice to identify true causal mechanisms.”
Third, as Zhou Enlai once pointed out, when it comes to impact, it’s just too early to say. When is impact? Perhaps my biggest professional fear is that charities and social enterprise have been doing such a good job for the last few decades in mitigating the worst excesses of capitalism, mopping up the problems and making just enough of a difference to hold together our society that would otherwise rip apart at the seams, that we have kept our prevailing economic system in place just long enough to allow it to do irreparable environmental damage to our planet, putting at risk all life on earth! Add up all those SROI ratios from among our sector and how’s that for impact?
Fourth, and more practically, half the social impact metrics we use are total nonsense. Lives touched, for instance, is one of the most common and sadly, not even the most ludicrous of the currencies we circulate. How do Stalin and the Chuckle Brothers measure up against that yardstick? Even if we did somehow develop less blunt and limited methodologies, they would be inevitably discredited by some other new social impact salesperson within weeks, hawking a supposedly new and improved model. Some of the metrics out there are frankly a total sham. I have listened to panellists at events describe how they have methodologies endorsed by The Pope and Will.I.Am which can calculate your social impact in under 7 seconds. Yet no-one calls these charlatans out.
Fifth, social impact measurement can bring accompanying dangers. Maybe metrics can tell us what happened in the past, but that doesn’t mean they can tell us what to do in future. The world changes. What works changes. Not acknowledging this may bring dangerous consequences. Measurement can be dangerous if it is used to influence behaviour, often creating perverse incentives. Italian firemen paid by results start lighting fires to put out. Big outsourcing companies are rewarded for dead people not reoffending. NHS patients are unable to book appointments with their GP more than two weeks in advance. In fact, as researchers have concluded, “Target based performance management always creates gaming”.
Finally, social impact measurement can be expensive, bringing negative or little benefit while diverting resources away from other work. Metrics can also be demotivating for staff and volunteers at charities and social enterprises, undermining their public service ethos, crowding out creativity, freedom, intuition, trust and the human touch.
So much for social impact measurement then. A waste of everyone’s time?
Well largely, yes. But not entirely. I concluded my presentation by suggesting a different tack, one which seemed to go down quite well. Much of my frustration with this field is that social impact measurement always seems so sure of itself. “You have to get better at measuring your impact,” they say. “Of course it’s possible,” they proclaim. That can be really annoying. So with that in mind, I also admitted where I might, in fact, be wrong.
First, if clinical tests and trials and RCTs have brought us so far forward in medicine and the health profession then perhaps it’s only a matter of time before we develop similar capabilities for better understanding wider fields of human activity. Who knows how far technological advances and artificial intelligence might take us in future?
Second, perhaps our metrics and methodologies will get less stupid. Jeremy Nicholls of Social Value UK - who has done more than anyone to advance this somewhat preposterous cause - is nevertheless doing a fantastic job in building bridges and overcoming divides between competing schools of metrics. Jeremy – and other good folk like Tris Lumley at NPC - have been working hard for many years to get the social auditors to agree to shared principles with the SROI merchants and to find common cause among competing clans. This can only be a good thing. Now we just need to call out the charlatans.
Third, and a point which Jeremy himself makes well, even if the numbers are nonsense, the process of social impact measurement, done well, can serve to empower those who are too often forgotten by charities, social enterprises, government and funders. Maybe the calculations and the spreadsheets and the ratios turn out to be meaningless. But if they were developed in a way which gives voice to previously powerless stakeholders – the beneficiaries – then the process isn’t entirely pointless.
Fourth, maybe there’s money here. Maybe funders and financiers and customers and contractors may also be fooled by these nonsensical numbers. If this brings money in, then this is just as useful to charities and social enterprises as a rebrand, a snazzy website or a shiny annual report.
So where does that leave us? Why was this train of thought slightly less boring for my audience than my usual barrage of wonk?
I think the message here is to be proportionate, to be humble, to even be sceptical. But nevertheless, to keep on trying. Trying to understand your impact is a laudable ambition at least. Often metrics might be too complex, too uncertain, too contingent, of limited or even dangerous practical application. They might be expensive. Sometimes, we might better focus on means, values and behaviours, ownership, co-operation, openness and respect. Sometimes we might just throw the spreadsheets in the bin.
But like emojis (in a text message, not on a gravestone) these metrics may yet have a time and a place. Who really knows for sure? People seemed to like my message that certainty is overrated. Uncertainty, complexity, modesty and admitting fallibility seem all together more human and more popular. Maybe that is exciting and inspiring.
This blog was written by Jack Welch, who is autistic and a youth voice activist, and has been working with the Centre as a young researcher. You can find him on Twitter and Medium.
Throughout the time I have been involved with the youth sector, it has become increasingly necessary for organisations to attempt to evidence a tangible and defined impact for their beneficiaries, in an environment where funding is scarce and demand has only risen. However, what struck me throughout the series of presentations and conversations among delegates at this third annual gathering for the Centre was an increased willingness to share expertise, as well as resources, at a time when a siloed approach is simply not viable in the current landscape.
This message became particularly apparent through Ruth Rickman-Williams’s presentation, which set out Youth Focus West Midlands’ recent challenges, and journey in response to those challenges. As austerity began to bite and resources became ever more limited, a struggle to survive ensued. However, it was clear that Ruth’s approach to restructuring her organisation and network to facilitate a more united agenda across the sector, including commissioners and those delivering services, could lead to a more holistic approach that is more likely to improve outcomes for young people and serve the needs of the wider community.
For local networks that are pooling their resources as part of regional ‘Youth Impact Networks’, there is a detectable sense of just how vital collaboration across the range of organisations in the voluntary sector and public services is. The networks seem to be providing a solid foundation for sharing learning, ideas and resources. From my own recent advisory work in patient participation in the health agenda, I have seen how all areas in England now have a Sustainability and Transformation Plan. These plans go far beyond just how healthcare is provided, but much more about the wider needs of local populations and are showing how voluntary sector organisations working in partnership with one another can improve health and wellbeing outcomes for young people. I am interested to see how more youth sector services might become part of significant pieces of work like this.
I was also struck by Dan Gregory’s emphasis in his keynote that much current impact measurement risks being meaningless to the organisations to whom it relates. I would add that this is a particular risk where young people are not kept at the core of service design, delivery and evaluation.
Within the breakout workshops, I was drawn to the new initiatives led by the Centre and New Philanthropy Capital (NPC) on how data can be effectively captured in open access settings and how an individual’s journey can be tracked, specifically in services funded through the Youth Investment Fund (YIF). The YIF will create a new body of evidence about whether and how services are making a difference to the lives of young people.
From my own experience, having attended the same open access settings for various unrelated projects, I know that young people can be transient, and that a single location can play host to a whole range of diverse services, from housing and welfare support to careers assistance. While the new ‘Footfall’ resource for gathering data via mobile devices will be an innovative means of building a consistent record, I particularly remember a comment from a delegate in the room that many young people in the most disadvantaged circumstances will not have access to smartphones, or in some cases the signal to make use of them. A fellow Young Researcher also made the point that unless young people can see how their role in evaluation has influenced practice later on, their participation is more likely to be tokenistic. I believe these should be important considerations as the YIF evaluation plans are developed.
I began as a volunteer with Dorset Youth Association in 2010. Since then, we have seen a profound upheaval within the sector, and many are still learning how to thrive as well as survive amid new circumstances, structures and networks. We are still in a time of flux and unpredictability, but it looks to me like we are reaching a point where the sector is becoming more resilient against the shocks to its financial security post-2010, and more willing to collaborate. With closer partnerships and even mergers changing the structures through which organisations are able to have an impact on young people’s development, I look forward to seeing what comes next for the sector.