Mary loved art, but at 82 she hadn't picked up a paintbrush for decades. Then she was invited to a painting session with artists in her care home. A few months later, she sifted through dozens of paintings that she had made – beautiful, dramatic pieces. Painting had invigorated her; it made her reconnect with her passion and with herself. Art gave Mary a new lease of life.
The founder of the arts group that went into Mary's care home said in an evaluation report that she had been inspired by the Campaign to End Loneliness. She started a new initiative in her local area, using art to build connections and create change for lonely older people. The report said that their work influenced the local authority to fund a further arts project in another area. This is just one of the heart-warming stories and positive results of our work since 2010.
Finding those stories can be a haphazard endeavour – but learning whether you've created the changes you set out to create is critical for a campaigning organisation.
Evaluation can offer insight into the impact of any civil society organisation. At the Campaign to End Loneliness, we've been thinking hard about our own evaluation recently, and it is proving trickier than any evaluation I have worked on before. Evaluation is a different beast for campaigning organisations than for other charities. But there are lessons in both.
There are five tips I would give to any organisation about to embark on an evaluation:
- Invest cash and time: All of our funders have encouraged us to invest around 10% of our budget in an external evaluation. On top of that, we spend significant staff time on monitoring, reflection, analysis, learning and improvement. Time is critical to allowing creative, 'what next?' learning from your evaluation. So it's important to start early, involve many people, and make even more time for learning from success or failure.
- Richness: think creatively about the rich seam of information you can mine to help your organisation thrive. Surveys and interviews with beneficiaries or partner organisations, and desk-based research to map the policy landscape, are all intensive but often worth it for the case studies and qualitative data that great evaluations can uncover.
- Communicate: don't wait to communicate your findings, even if 'just' internally; learn as you go, not when you write your next strategy.
- Focus: don't try to do too much with your evaluation. Sure, some funders want this, others want that, and your trustees want the other. But ultimately, focus your efforts on the long-term change you're trying to create: your theory of change.
- Integrate learning with all other operations. Here is where I depart from calling it 'evaluation' – it is learning, it is improvement. It is scary and exciting at the same time.
And here are my five tips for campaigning organisations:
- Theory of change / outcomes focus: Knowing the purpose of your evaluation is not enough; you have to know in great detail the change you want to create. This is particularly hard for campaigning organisations because impact can be so far removed from activity – and things can change very rapidly in the space you're working in.
- Flexibility: This is a critical one. You're here to create change – so if things do change, your work plan has to as well, and your evaluation will need to follow suit. So yes, have an evaluation plan, but expect it to flex.
- Impact debate: We have been advised not to invest in broad, long-term, population-wide measurement (i.e. tracking a reduction in the number of lonely people): doing it well would need money far outweighing our entire expenditure on operations since 2010. Time to Change do it: theirs is a robust piece of evaluation, at significant cost, and it has shown some excellent results across a whole population. Their total annual expenditure has, I think, warranted population-wide measurement from the start. We use a different approach – a realist approach. Essentially, it takes account of the wider context of your work and tracks progress towards your intended outcomes using a range of success indicators and data.
- Attribution: Even if you are able to track long-term, population-wide change, you are unlikely to be able to attribute it to your own activity. Frustrating. But hey, control the controllables.
- Wider 'unintended consequences' – picking up positive change and learning from negative change: we always aimed to inspire more action to tackle loneliness, a very broad outcome that is hard to measure. But harder still to measure are any unintended consequences of our work. These are easy to ignore – they might be the things people talk about but don't tell you; the things that niggle at the back of your mind, which you think are happening but have no data to prove; the insights from qualitative data or interviews that point to something else happening out there that is hard to pin down. Think carefully about how you can surface them. It might not be more data that you need – just more honesty among close stakeholders to reflect on your data and learn from it.
We're still learning about how to make the most of our evaluation. Showing the difference that a long-term campaign on a complex issue like loneliness has made to the lives of the 4 million lonely older people in the UK is complicated. It is hard to get your evaluations to do what they are intended to do: prove your impact. But when you can, it's worth everything.