
Learning from collaborative outcomes reporting approaches to evaluating youth voice

2023-11-23

"Hindsight is notably cleverer than foresight"

US Fleet Admiral Chester Nimitz


Our Evaluating Youth Voice project launched in 2021 with an ambitious brief: to understand how youth voice has been happening in practice within the #iwill Fund and other youth voice opportunities, and what impact it has had.


We applied an approach to evaluation known as 'Collaborative Outcomes Reporting'. This participatory approach collects and collates data alongside continual, iterative reviews by practitioners, participants, and interested bodies.


The Young Evaluators Panel at the College were essential to this process, shaping the evaluation’s delivery and management. The Panel worked alongside our team to analyse existing data, undertake interviews and focus groups, and lead an Outcomes Summit, where emergent findings were considered, scrutinised, and challenged.


For my part, I developed the project while Interim Executive Director at the Centre for Youth Impact (now YMCA George Williams College) and returned to lead the final report and a set of associated ‘How to’ guides that help practitioners take the learning into their day-to-day work.


Being involved in setting up the project and then returning to see it through to a close has given me perspective on how even the ‘best-laid’ plans can collide with the reality of evaluation implementation. Below I explore what I have taken from the collaborative outcomes reporting approach, and the lessons it offers for those interested in participatory evaluation.


  1. Early engagement was essential to later success

The project launched with a series of scoping workshops, bringing together #iwill funders, practitioners, and young people from across the #iwill movement. They reviewed our initial plans and provided input on our evaluation questions and intended activities.


The combined expertise was immensely useful and influential, enabling us to sharpen our research questions and engagement approach. Undoubtedly, this made the process more efficient, and it ensured that the final report and toolkits were more valuable and more directly related to the work practitioners undertake across the sector.


  • Lessons for funders: don't rush to narrowly specify evaluation outputs at the contracting stage; allow opportunity for wider engagement and flexibility for change.
  • Lessons for evaluation practitioners: build in scope for evolution and change with stakeholders at the start and throughout your planned process.


  2. Relationships with young evaluators were key (obviously!)

Youth voice ran through the heart of this evaluation via the participation of the Young Evaluators. It should therefore be no shock that investment in relationship-building with our Panel was a critical factor in supporting their influence and input into the evaluation.


We worked with a gloriously diverse, experienced, skilled, and engaged group of young people, and significant staff time was ringfenced throughout the project to invest in the necessary level of relationship-building and support. Staff needed time to build relationships not just with the young people, but also with other influential adults in young people’s lives, such as teachers, support workers, and parents. We needed to ensure appropriate reflective supervision for staff holding those relationships. Policies and procedures required (re-)development. Residentials brought the Panel together for training and plenty of fun, but required considerable investment.


  • Lessons for funders: if you want genuine participation, be prepared to invest in doing it well. Residentials are not ‘holidays’ but key spaces for relational practice. Invest in building young people’s skills and relationships.
  • Lessons for evaluation practitioners: beyond direct staff time for planning and delivery with young people, consider the time needed to build relationships with other adults in young people’s lives. Ensure managers and leaders who support direct delivery staff have time to develop high-quality support.


  3. Be cautious of what learning you can draw on

A key plank of the evaluation was collecting insight via our Data Trawl: reviewing existing #iwill project evaluations, case studies, and monitoring reports to funders. We wanted to avoid creating new burdens for hard-pressed and busy youth organisations, and instead sought to review what learning practitioners had already generated. We always knew this could be challenging, with risks of a self-selecting sample or of misinterpreting findings when ‘we’ (and young people) reviewed ‘their’ reports.


The level and depth of information within individual reports was much more variable than anticipated. In addition, most reports we received had been submitted to funders rather than shared by projects directly, and didn’t always have the level and depth of information we had hoped for.


However, as a result of our call for evidence, we still received nearly 300 different documents: a significant pile of evaluations, case studies, and reports to analyse. This gave us a broad view of youth voice activities in the #iwill Fund and generated interesting insights into youth voice practice.


  • Lessons for funders: for large-scale programmes, consider how to ensure consistent areas of learning across individual projects, so that common themes can be surfaced.
  • Lessons for evaluation practitioners: we gained precious insight, but do not underestimate the time needed to gather and analyse others’ learning. Plan for a low response rate and consider options for further primary data collection. Triple-check how data will be recorded and stored to enable easy analysis.


  4. Facilitation planning and delivery are essential for participatory engagement

Workshops throughout the project, and our Outcomes Summit, generated a significant amount of insight, but also a ton of flipcharts, jamboards, post-its, observation notes, and lots more in between. The sheer volume of materials and ideas was sometimes difficult to manage and make sense of. The process reaffirmed the importance of planning workshop facilitation carefully, with clear intent about the outputs to be generated.


  • Lessons for funders: programmes with multiple stakeholders are enriched when different people buy into the evaluation process.
  • Lessons for evaluation practitioners: ensure session plans consider how the process will build consensus while creating space for minority views to be surfaced. Be cautious of activities that generate a high volume of comments without prioritisation.


Reflective practice is core to effective relationship-building in youth work. The same can be true of effective evaluation in the youth sector. While much focus goes on final evaluation reports and ‘outputs’, beneath these often sits rich learning about the process that we, and the young people we worked with, went through.


Hindsight is often easy. When shared, however, it can provide foresight to those seeking to apply similar approaches in the future.


----


Tom Burke is a freelance consultant and Research Fellow of YMCA George Williams College. He worked with the College (then known as the Centre for Youth Impact) from March 2021 to June 2022.


Many thanks to the team at The National Lottery Community Fund, who met with Tom to discuss the personal reflections and insights that informed the development of this blog. All views are the author’s own.