Youth work beyond the measurement imperative? Reflections on the Youth Investment Fund Learning Project from a critical friend
In this blog, Tania de St Croix, Lecturer in the Sociology of Youth and Childhood at King's College London, offers her thoughts on the Youth Investment Fund Learning Project, which the Centre is leading with NPC and others. You can find more information about the YIF Learning Project at https://yiflearning.org.
Many involved in the youth work field are critical of the youth impact agenda, particularly its emphasis on the quantitative measurement of outcomes for individuals, and its neglect of process, group work, and structural inequalities. Those of us involved in ‘In Defence of Youth Work’ have argued that the contemporary emphasis on impact and outcomes cannot be separated from its context, the neoliberal ‘desire to financialise human existence’, and its consequences for which practices are valued and who gets to decide. We have claimed that open access youth work is particularly unsuited to outcomes-based management, and that open youth work's future existence is undermined by an emphasis on impact measurement.
While those of us making a political critique of impact measurement (within and beyond young people’s services) face an uphill struggle against dominant understandings of ‘what works’ and ‘what counts’, there has been a growing recognition of the specific challenges in evaluating open access youth work. In this context, it has been interesting to follow the development of the Youth Investment Fund (YIF), a £40 million government (DCMS) and Big Lottery Fund investment in open access youth work. While we might start by noting that £40 million over 3 years is dwarfed by a decade of youth work cuts, the YIF is nevertheless significant: it suggests that someone, somewhere in policy recognises the potential value of open youth work. The YIF is also significant in relation to impact debates, as it included “an explicit objective to strengthen the evidence base on the impact of non-formal learning opportunities for young people”.
A change of emphasis?
This objective to ‘strengthen the evidence base’ of open access youth work is carried out by the YIF Learning Project, led by NPC and the Centre for Youth Impact. Its tone and approach are encouraging, and some of the significant concerns of the youth work field have been taken on board. This is demonstrated both by a language of learning and openness, and an emphasis on collaboration with young people, practitioners and youth work organisations. The principles of the YIF Learning Project laudably include:
The YIF evaluation approach is more closely aligned to youth work’s approaches and methodologies than it might have been, and this is great to see. And yet, there is still a sense that it attempts to ‘measure the unmeasurable’. As I write this, I imagine the weary sighs of colleagues in the youth impact world; however much they take on board youth workers’ views, it is never enough to stop us complaining! None of what follows is intended to criticise for criticism’s sake, or to take away from the respect with which the Centre for Youth Impact (in particular) has treated those of us who are critical of the very tenets of the youth impact agenda they were set up to promote. The following are five dilemmas that are important to address:
1) Despite moving away from ‘blanket outcomes measurement’, quantitative outcomes measurement continues to play a central role. Given the tendency that ‘what gets counted’ is too often the only thing that ‘counts’, how can the project guard against the preference for more structured, time-limited, ‘project-based’ youth work (that is easier to ‘measure’) over informal, open-ended, open access practice? How can the group processes that are central to youth work be recognised, when it is individual change that tends to be measured?
2) What are the dangers of standardising and quantifying ‘youth work quality’ and ‘young people’s views’, of inventing new tools (or importing them from other countries), and of engaging private sector consultancies and agencies to do this work?
3) It is inevitable that evaluation – especially on behalf of a funding agency – will affect practice, including in unintended ways. How much of a ‘data burden’ will be created for organisations? Will they really feel free to share their experiences, reservations, and honest reflections?
4) Can evaluation be separated from top-down performance management, judgement, comparison and control? Measurement changes how practitioners are perceived, and how they perceive themselves in relation to their work. How can data be used for collective learning without it also being used as evidence of ‘success’ or ‘failure’ by individual practitioners and organisations, and even by the field of open access youth work as a whole?
5) How can ‘footfall’ and other data be collected without unacceptable levels of surveillance, and breaches of confidentiality about young people’s whereabouts and their activities? How can the most marginalised young people, many of whom are (rightly) suspicious of authorities and institutions, be assured that their privacy is respected?
So what? And what next?
The current approach to evaluating the Youth Investment Fund demonstrates thoughtfulness and attention to the special characteristics and challenges of open access youth work. As a result, the experiences of young people and youth workers funded by this scheme will be more meaningful and less onerous than they would have been under a more prescriptive top-down approach. The YIF Learning Project goes some way towards challenging dominant approaches to impact measurement. Yet in other ways it is reinforcing the status quo: continuing to prioritise the measurement of individual change, converting qualitative elements of youth work (its quality and young people’s experiences) into statistics, and aiming towards a financialised ‘value for money’ analysis.
Ultimately, without questioning the broader context – the basis on which measurement is still preferred by most funders and governments, as a neoliberal tool of governance and control – many of these problems remain intractable. Moving beyond such dilemmas, then, is not merely a matter of creating more congruent impact tools, reducing the data burden, and involving young people and practitioners in the process (important though all of these things are). It requires imagining meaningful evaluation beyond a focus on outcomes and measurement, thinking seriously about the social and political purpose of youth work, and the role of young people in creating change. It involves working with others – beyond the youth sector and beyond our national and regional borders – to challenge the global dominance of finance and investment logic in activities that hold to a different version of ‘value’. While such aspirations may seem momentous, there is nothing to stop us dreaming of a different world, and doing what we can to make it real in our everyday lives.
Tania de St Croix has been a youth worker for over 20 years, and is an active part of ‘In Defence of Youth Work’ (IDYW); she thanks IDYW colleagues for helpful feedback on this blog post. She is a lecturer at King’s College London. Her book, ‘Grassroots Youth Work: Policy, Passion and Resistance in Practice’, was published in 2016, and her forthcoming research project is entitled ‘Rethinking impact, evaluation and accountability in youth work’.
“Cracking the impact nut” or How the Youth Investment Fund learning and impact strand responds to the challenges of evaluation in open access provision
This blog has been written by Matthew Hill, Head of Research and Learning at the Centre for Youth Impact.
Our YIF approach to data collection
NPC and the Centre for Youth Impact are leading the learning and impact strand of the £40 million Youth Investment Fund (YIF), a joint programme supported by government funding from DCMS and National Lottery funding from Big Lottery Fund. Eighty-six youth providers are being supported for three years (2017–2020) to develop and expand their open access youth provision, and we are currently working with them to design an evaluation approach that captures the value of their work, and supports their learning and improvement. Our overarching approach is an attempt to crack some of the perpetual practical and methodological nuts (my favourite bar snack) in measuring the impact of open access provision. This blog outlines five ways in which we are confronting these challenges within our work.
Moving away from blanket outcomes measurement
The past decade or so has seen a concerted push to be more outcomes-focused. We continue to support an outcomes-focused approach to service design and delivery (that is why we all do what we do, after all), but our YIF work represents a shift away from blanket outcome measurement (i.e. trying to capture every outcome for every young person for every organisation). This shift is a direct response to many of the perpetual challenges of outcome measurement in open access settings, including fleeting or irregular engagement, defining generalised outcomes for individualised provision, developing robust metrics for broad personal change, and the issue of measuring long-term impacts. Far from abandoning outcome measurement, we are focusing on high-quality targeted measurement with a sub-sample of the YIF cohort, which will ultimately provide us with more robust and meaningful data.
As well as this targeted approach to outcomes, our YIF work places increased emphasis on the experience of young people and the quality of the provision they receive. Crucially, we are aiming to link the data on outputs, user feedback and quality to the targeted outcome data so we can understand not only whether the provision is having an impact on young people, but why.
Focusing on the user experience
Another challenge is that young people often feel overburdened with rather obscure and meaningless (to them at least) surveys. In response, our YIF approach focuses on the elements of delivery that are most relevant and meaningful to young people – namely their experience of services (e.g. feelings of safety, respect and positive challenge). We are working with Keystone Accountability to develop a set of standardised feedback questions around this experience. Instead of large annual surveys, this process uses regular, light-touch feedback – perhaps 3–5 questions once a month. This ensures that user feedback is embedded in ongoing reflective practice, and crucially, means that organisations can respond more immediately to the findings. Critically, we are also working with providers to process and act on this feedback, and to tell young people what has changed as a result.
Improving as well as proving
Another nut that needs cracking is practitioners’ sense of dislocation between much impact measurement and their everyday work. Part of our commitment to ‘going with the grain’ of provision is a focus on the quality of youth work practice. This data absolutely has to be linked to outcomes data – as ultimately this dictates what is and what isn’t quality provision – but by emphasising considerations of quality we are focusing on those elements of provision that are most relevant and meaningful to youth workers themselves. Our YIF work is drawing on an established quality improvement framework from the US – the Youth Program Quality Assessment – which relies on peer observation, with youth workers identifying ‘markers’ of quality in the delivery of their colleagues. This framework is not a critique of existing quality assessment frameworks but is, in fact, a complement to them – ensuring quality is monitored and improved as part of ongoing practice, rather than just assessed against an existing standard.
Understanding young people’s journey through services
Although most providers collect detailed attendance data, many tell us that they use this for monitoring overall service demand rather than truly understanding the way that individuals engage with their services. By utilising existing data and trialling new digital methods such as Yoti, we aim to build a much more nuanced picture of what young people do with their feet – i.e. how often they attend, for how long, and how they move through provision – as a proxy for their levels of engagement and ‘exposure’ to interactions.
Arguably the greatest opportunity presented by the YIF is the potential to collect shared data across 86 grantees for three years. This offers a rare (probably unique) opportunity to build an evidence base across a huge diversity of open access provision (detached/building-based; structured/unstructured; universal/targeted) and, by comparing the results across different types of provision, we will be able to really understand the strengths and weaknesses of different services.
We recognise the many challenges that open access providers face and believe that the dominant paradigm of measurement is not fit for such settings. Our YIF work has the potential to overcome some of these challenges, and to develop approaches that are applicable across the wider sector. It is certainly ambitious and we are trying new things out – some of which will work, but some of which will no doubt fail. We will confront this uncertainty with a pioneering spirit and the humility to admit when things don’t work. As well as with grantees, we are committed to working with the wider sector, and we would greatly value your input in testing, refining and reflecting upon the tools and evidence that emerge. Our dedicated YIF learning and impact website will be live soon… so please watch this space, or get in touch with Matthew.Hill@youthimpact.uk or Anoushka.Kenley@thinkNPC.org if you want to find out more in the meantime.