How evaluation helps you learn and improve

Pippa Knott, Head of Delivery at the Centre for Youth Impact, and Alice Thornton, Research Project Manager at Renaisi, reflect on the Centre’s experience of having its own work evaluated.


In November 2015, the Centre for Youth Impact asked Renaisi to undertake a developmental evaluation of the Centre’s work. This was the first time the Centre had opened itself up to the scrutiny of external evaluators, which can be scary enough. But as an initiative that encourages others to use evaluation to improve their services, the team faced the additional pressure of wanting to live their values and principles. So what was their experience of being evaluated, and how has it changed their work for the better?

Developmental evaluation

The team chose to pursue a developmental evaluation, rather than a more traditional ‘summative’ one. A developmental evaluation is designed to support organisations to learn and improve through a process of continuous feedback and development, rather than waiting for a ‘final judgment’ on whether their intervention is effective or not. Unlike summative evaluations – which usually require the intervention or activity to stay the same during the evaluation period – developmental approaches are designed to cope with change and adaptation in delivery as the evaluation progresses. In fact, this is positively encouraged.

An evaluative challenge

The Centre for Youth Impact aims to facilitate a shift in culture across the youth sector. It supports organisations to reflect on their practice in a way that is informed, structured and transparent, and is building a community of organisations who share learning, challenge one another and build consensus to progress conversation and practice around impact.

Pippa Knott (credit: Centre for Youth Impact)

Although critically important to the Centre’s values and approach, culture change is a notoriously difficult outcome to measure: it is both nebulous and contested. Equally, how conversation and practice are ‘progressing’ is difficult to track across a sector of diverse organisations, where there is not even an agreed baseline against which to compare any progress. And as a young initiative in a rapidly changing sector, the nature of the Centre’s work, and the context in which it operates, were constantly evolving. It was clear that in these circumstances evaluation would be a challenge. But this shouldn’t be an excuse to hold back.

Evaluation is never perfect. Organisations often shy away, thinking that it isn’t the right time, or that it isn’t possible to do an evaluation that is robust enough. But it can be incredibly insightful to have someone else’s perspective on a problem, as long as you choose an evaluator who is supportive and understands your values. It is about helping organisations to do what they do as well as possible, without worrying about being judged.

Taking the plunge

The Centre knew it couldn’t rely on the hypothesis – and hope – that its work was somehow having a positive impact. So, in consultation with its partners and others working in the field of evidence production and take-up, it developed an evaluation design appropriate to its circumstances.

A developmental evaluation design is not common. For many organisations, evaluation has been something done ‘after the event’, with findings emerging once thinking and policy have already moved on. But the team at the Centre, along with its key stakeholders, felt impatient to know: are we making a difference? The developmental evaluation process encouraged the team to keep challenging what ‘working’ actually looks like, and how accurately it can expect to know whether this is the case.

Evaluation is never complete, but the Centre is now far more informed about how practitioners and organisations are responding to its work, and how it can best focus its activities. And having been through an evaluation itself, it also has first-hand experience to draw on when supporting other youth sector organisations that are considering an evaluation of their own work.

What has changed?

As a result of Renaisi’s recommendations, the Centre is already changing how it works with networks across the youth sector. In particular, it is evolving how it works with partners to develop more nuanced, comprehensive, and better targeted relationships with youth sector organisations. It is doing some focused thinking about its own outcomes and the activities most likely to achieve them, while examining assumptions made in the early days of designing the initiative about what approaches are most effective.

The developmental evaluation process prompted the Centre for Youth Impact team to keep ‘resetting’, understanding that there is no right answer or perfect model through which to improve impact measurement in the youth sector. The process has not necessarily been about being told that the approach is right or wrong, but about bringing in a new perspective, a rigorous approach and supportively challenging voices that keep the Centre’s work fresh and responsive to the changing environment.

Evaluation isn’t just for organisations that are big or well-established, or projects that are coming to an end – it is a valuable journey, and a new relationship of trust, for everyone.

Alice Thornton
