Piloting partnerships to hear practitioners' perspectives: Five lessons learnt


In May 2021, the Centre for Youth Impact and UK Youth joined forces to extend the reach of, and build better insight through, Just One Question - the Centre's survey tool that invites youth practitioners to respond to one multiple-choice question about their work each week. As we explained when we launched the pilot, we sought to grow the number of people taking part by experimenting with new recruitment routes, changing the registration process to learn more about who takes part, and broadening the range and format of questions posed.

What happened? 

In many ways, the pilot has been a great success. By working together and increasing our reach into UK Youth's movement of over 7,000 youth organisations, we achieved a 52% increase in the total number of users registered to take part. We rolled out new demographic questions for new and existing registrants, with high response rates. However, while the number of people registered to receive the survey questions grew, the average number of weekly respondents - both as an absolute figure and as a proportion of total registrants - has stayed broadly static. In short, more people can respond each week, but few choose to take part every week. Most weeks, a similar core of people respond with their take on the issue of the day.


What did we learn? 

Whilst the headline is a mixed picture, the process of working together and reflecting on progress has generated broader insight for us to take forward…  

1. Practitioners are happy to tell us who they are, but how we ask needs thought

As the number of respondents increases, we can be more confident in understanding who is expressing which views, and in the potential to analyse perspectives and experiences by individuals' identity and role in youth organisations. We didn't see any significant difference in the number of practitioners taking part when they were asked to provide demographic data.

We tried to align our questions with those used in other sector-wide surveys and wider labour force statistics. Here we found a challenge: different organisations use slightly different wording for demographic information. Insight from the Centre's Practitioner Panel echoed this: how the youth sector asks 'who we are' or 'who we work with' varies. This is a more significant issue for the sector as a whole to address in order to generate shared understanding.


2. Asking questions of all practitioners, not just "youth workers", takes care

Not everyone who works with young people is, or thinks of themselves as, a youth worker. Much of our youth provision is driven by volunteers, by those without formal youth work qualifications, or by those who engage with young people in allied sectors such as health or housing. Framing the questions to be inclusive of all those marvellous people is a challenge. Avoiding assumed knowledge, exclusive language, or an orientation towards only those with formal qualifications (for example) needed to be balanced against ensuring we ask meaningful questions that give us substantive findings.


3. We suspect practitioners are experiencing survey fatigue

Without the ability to get out there, speak to people, run focus groups and see what is happening, our evidence-gathering methods have been limited. Due in part to Covid, surveys have become the sector's best friend - the Centre and NYA tracked over 130 different surveys in the first lockdown in 2020. However, it is probably fair to say that many practitioners are a bit fed up with surveys. While Just One Question is literally just one question a week, it's still a survey. Informal feedback from our contacts shows a real sense of survey fatigue, and we suspect this affected the success of the pilot.


4. Some practitioners are interested in the answers to our questions 

One of the biggest changes during the pilot was the proportion of people who click through from our weekly summary results email to read our analysis. This grew significantly, from just 8% to 30%. Whilst this is still a minority, we also saw growth in people visiting via social media and the wider links sent out by both organisations. We think that many practitioners are interested in, and hungry to learn more about, the state of the sector and the experiences of their peers. We need to consider how we package this insight so that practitioners can access it, absorb it, and consider the implications for their practice.


5. Lack of sector insight drives us and our partnership

The youth sector can feel chronically under-evidenced and, arguably, misunderstood, with little deeper appreciation of the 'art of youth work' or the impact of relationships with and on young people.

It's not just our organisations seeking to build evidence with and for the youth sector; many other organisations play a part too. We risk asking the same people slightly different versions of the same questions to serve subtly different organisational agendas. That is why the partnership on Just One Question is so essential. Together we found much common ground on areas of interest and intrigue. Having a shared space to come together, put those interests into words and then turn words into questions reduced the number of questioning voices - and allowed the answers to become louder.


So, what next? 

We have decided to continue working together and to try out further changes to Just One Question. We will continue to test new ways of recruiting people to take part. We will move to a deeper monthly analysis of results, to better identify trends and see where overall themes on a topic emerge. Perhaps most importantly, we want to keep working together to generate the questions that allow practitioners' views and experiences to be recorded, shared and heard.


Tom Burke is Executive Director at the Centre for Youth Impact. Steph Talbut is Assistant Director of Research and Performance at UK Youth.