In last year’s roadmap for digital and data, government committed to getting 50 of the Top 75 services to a ‘great’ standard by 2025.
Over the last year, we at the Central Digital and Data Office (CDDO) have worked with service teams across government to baseline all 75 services, to understand their current performance and what improvements are needed to get them to ‘great’. This started with agreeing a clear definition of what a ‘great’ service looks like and establishing a common framework for assessing service performance.
One size does not fit all
Crafting a framework for assessing service performance across such a diverse range of services has been no easy task. The ‘Top 75’ list spans everything from signing up to vote and applying for a passport to recording MOT results. Most of these services meet citizens’ needs directly, but the list also includes services that assist civil servants day to day, such as the Civil Service Learning platform.
Recognising this diversity, and after engaging with teams across government, we devised six distinct service types, each with its own set of benchmarks:
- Requesting information
- Providing information
- Amending / deleting information used in a government service
- Requesting benefits payments or loans
- Making a payment to receive goods or support
- Requesting a licence or consent to carry out an activity
A great service is usable, efficient and accessible
When we started exploring how to define a ‘great’ service, it was clear that there are no silver-bullet criteria for determining if a service is great or not.
Instead, we homed in on three key elements:
- Is the service user-friendly?
- Is the service efficient?
- Does the service comply with accessibility standards?
Underlying each of these elements are specific criteria that allow us to rate services as ‘great’, ‘good’ or ‘requires improvement’. The thresholds vary based on the service type.
A user-friendly service is one where the majority of users can start and complete their whole journey online (though we’d always expect the service to offer a great offline route too), and which receives excellent user feedback. An efficient service has relatively low costs and processes each user’s journey in the most streamlined way possible, saving users’ time and taxpayers’ money. An accessible service meets accessibility standards so that everyone can use the service digitally.
A service earns a ‘great’ label only if it meets the standards across all three elements.
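As a rough sketch (not CDDO’s actual implementation), the all-three rule could be expressed like this; the fallback tiers for partially met criteria are illustrative assumptions, since the published rule only fixes when ‘great’ is earned:

```python
def rate_service(user_friendly: bool, efficient: bool, accessible: bool) -> str:
    """Rate a service as 'great', 'good' or 'requires improvement'.

    Only the first rule reflects the framework as described: 'great'
    requires all three elements to meet their thresholds. The 'good'
    tier for two-of-three is an assumption for illustration.
    """
    elements_met = [user_friendly, efficient, accessible]
    if all(elements_met):
        return "great"
    if sum(elements_met) == 2:  # hypothetical threshold for 'good'
        return "good"
    return "requires improvement"
```

In practice each boolean would itself come from service-type-specific thresholds, and the quantitative rating would then be checked against the qualitative, contextual review described below.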
Context is everything
Those three elements provide the quantitative rating, but numbers can only reveal part of the story.
We know that service provision is complex and involves a huge range of factors. To account for this, we overlay the quantitative rating with a qualitative review. This involves delving deep into each service through validation sessions, supported by additional performance indicators, independent assurance, and insights from the service teams themselves.
By embracing context, we can better ensure a fair assessment that truly reflects the service’s performance and its unique circumstances.
An example of this is a service that collects a penalty. In these scenarios, user satisfaction data collected after the penalty has been paid is likely to be low regardless of how well the service worked. That’s human nature: users may be unhappy at having to pay, and that unhappiness shows up in the user satisfaction survey. Taking into account other research on how users feel about the rest of the journey gives us the confidence to adjust the scores.
The contextual overlay allows us to reassess the quantitative score, and check that it is a fair and accurate reflection of the service's performance.
Looking ahead to 2025
We’re using the Great Framework as a mechanism to support services. It’s helped CDDO and service teams identify areas where services can make big and small changes to improve usability for users, efficiency for government, and accessibility for all. The data and insights collected have been used to make the case for change with data-driven evidence, which in time will lead to measurable improvements in services. Ultimately, we want this work to set the standard for great services across government, including those beyond the Top 75.
We’ve now got a view of service performance across the Top 75 and are working with service teams to ensure they each progress towards the ‘great’ standard by 2025. It is an ambitious aim, but one we’re very excited about working towards.
If you’re interested in learning more, please email us at email@example.com.