

Data collection myths debunked. Yes we can!

Collecting good-quality impact data is now very doable.

In our blog ‘All you have to do is ask’, we explained how simple it can be to measure social impact with a short survey. And the brand-new analysis for parkrun shows the kind of evidence you can produce with good data.

State of Life have worked hard over the last five years to make collecting data as easy as possible, with what is now a Treasury-recommended Green Book methodology and sector-leading progressive web application tech (it uses less memory on your phone and works both online and offline).

You can now share a link via WhatsApp to collect robust data on your social impact. And our clients have collected data from very sensitive groups, including a homeless youth group and adults in social care.

So why is there still resistance to collecting data?

Making it really simple to ask real people doesn’t seem to be enough. We still encounter resistance to collecting data, and numerous reasons come up as to why it’s not possible. Here are the most common barriers we encounter. In our next blog we’ll look at what might really be behind why groups and projects sometimes struggle to collect data.

The people we work with won’t like these questions

Often with young people or vulnerable groups there is a duty of care, and those providing that care feel that asking questions could be problematic. This is obviously an important question of research ethics. But this is why the State of Life methodology only uses questions from large national data sets that have been tested and validated to work with a wide range of ages, abilities and challenges. Of course there will be exceptions where these questions genuinely won’t work for some groups, but we hear this objection far more often than it turns out to be true.

The people we work with don’t use or have access to smartphones

Often, if a project is in a deprived area, we will hear that access to digital technology is limited or that people will be reluctant to use up their own data allowance completing a survey. Again, a legitimate concern. Our tech is a progressive web app, so it uses very little memory on the phone. In our experience smartphone penetration is almost never a problem in the UK - and where it has been, we’ve had success with the group leader letting people use their phone to fill in the survey.

It will take too long and be too disruptive to the project

This often comes from unstructured youth work - drop-in youth clubs or groups where regular attendance is not possible to track. Yes, this lack of structure can make things difficult, but far from impossible. If a person is benefitting from attending and the session is being run by someone with authority, then there is always a very basic set of information to collect (age, gender, ethnicity, attendance etc.) and perhaps a quick survey can also be completed. We have surveys of fewer than 20 questions that take well under two minutes and cover the relevant outcomes.

People are bored of surveys, they have survey fatigue - it’s just a pain in the arse

Sometimes true. But often we find that people actually enjoy the wellbeing questions; they make them think about life. Good survey design enables this, and a survey can be as short as 20 questions. In our experience, this barrier comes more from the person running the group, who for various understandable reasons probably has enough on their plate and might struggle to find time for another survey. We explain why the survey is being done, how it will help them and their project attract more funding, and how it provides robust, meaningful evidence of how valuable their work is. It’s not another survey just for the sake of it. If we don’t do that, then yes - collecting data is just a pain in the arse.

This isn’t ‘real world’ research and you can’t capture everything with a survey

This is more something we hear from those working on or running a project - an ethical or methodological opposition to collecting data, and perhaps a preference for more qualitative methods.

Again, all very fair, and wherever possible we combine survey data with more qualitative methods (as long as they are systematic and representative rather than just cherry-picking). Completing surveys online, privately and anonymously, generates more honest responses in many of the outcome areas we look at. Quantitative wellbeing data hugely complements qualitative methods by representing the whole group or project in what can be a more affordable and accurate way.

So what’s the real reason folks don’t like data?

All of the above are logical arguments that we use all the time, but they may still not cut through. In our next blog we explain why. The opposition to data that makes you accountable for the impact you have is an emotional one. It’s…

FEAR!

So next up, how do we change the language of measurement and evaluation to stop scaring the very people it’s trying to help?

Will Watt