Session:        1
Room:        Kitchener

Session title : Sharing successful ways to measure (end to end, front to back) whole services

Session leader :

Volunteer to continue conversation after :

Notes taken by : Martin Lugton

Notes

Who wants measurement, and what are they using it for?

Local government context. Different teams all working in siloed ways, which makes it hard to work out the true cost of a service. (Cost of service is a key factor, given the need to save money.)

How do you help measure the cost of a poor user experience, to make the case for investing in improvement?

How do you even get the numbers in the first place to work out whether it’s worth trying to improve things, if you don’t have budget?

Maybe start off by mapping the end to end service, to help people see and identify with the whole service? (Then it will become easier to work out how to measure things.)

“What are the top 3 things that you need from each other as a team?” - a prompt question to use when convening the group of people working on the different parts of the service. They hadn’t conceived of themselves as a team before.

They want to measure the CDDO-mandated metrics (completion rate, etc.), but these are really expensive to measure, so there’s lots of pushback.

Lack of data join up is a problem - e.g. one council was asking people to prove their blue badge eligibility, when it was the same council that issued blue badges.

People worry about opening up their performance data and losing funding and losing control.

What about measuring things one level of abstraction up, looking at overall council-wide outcomes rather than the individual services that contribute to them?

Of course, measuring things incentivises behaviours, so be wary of what the unintended consequences might be.

Metrics can help bring other stakeholders to the table to have conversations about how to make the service better - e.g. bringing policy teams closer to user researchers.

Finance people will likely have their own approach to measurement - work with these established measurement methodologies, and collaborate with them to bring in digital experience-type measurement.

Numbers will start off looking bad. Then you collaborate to work to make them better. Senior leaders need to understand and accept this.

One approach is to keep reporting on the old numbers while starting to add some new, more useful numbers alongside them (e.g. operational cost, user experience), gradually shifting to the new ones over time.

NAO speech this week on the importance of maintaining services. This might be one to quote.