While we’re familiar with assessing services in development against the Digital by Default Service Standard, here at HO Digital we’ve also been thinking about how to monitor the quality of services that are already live.
We want all Home Office Digital services to be of a consistently high quality, irrespective of go-live dates. This is life after live: we want services to innovate and respond, not fade away or become unable to change, as has happened in the past.
Getting fast feedback on strategy
We had an established way of assessing new and redesigned services, but no ongoing monitoring of services that were already live. Our assessors had also told us there were issues with our established methods of assessing. In particular, assessments were being seen as ‘gates’, and service teams were responding by doing lots of extra work just before assessments. There was a clear need to avoid this happening with live services and, more generally, to improve how we assess. We decided to draft our strategy for live services iteratively, gathering feedback from as many different areas as possible. We also used that feedback to improve how we assess new and redesigned services.
Among others, we talked to our assessors, heads of profession, a range of service team members, assessors from other departments and our Service Optimisation team, who look after services once they have passed a live assessment. We also talked to GDS about the new, shorter standards and how these might apply to a mature service.
Acting on feedback
Among the feedback, the consistent points were:
- Both assessors and services were concerned about the spikes of work before assessments.
- Services and assessors felt that too much time was being spent on introductory information during assessments, rather than having a conversation about the service.
- Some people wanted a more visual perspective on the standards, rather than a list.
- There was confusion over terminology: ‘live assessment’ is already used to mean assessing a service that is about to go live.
To address these points, we:
- Decided to trial shorter, less predictable sessions, based on triggers such as a live service’s performance dashboard.
- Encouraged services to send information before assessments by designing a preview note.
- Developed a visual representation of the standards.
- Renamed assessments of already-live services to ‘live reviews’.
Our first live review
We held our first live review on 26 June in Croydon, with the Change of Address service within UK Visas and Immigration. The review led to some useful recommendations on how the service can continue to improve and remain responsive to its users. We asked the service team for feedback on the session, and we have already used it to amend the preview note and the way we invite services to a review. We will continue to run live reviews and iterate how we run them, based on feedback from service teams and our panellists.
Let’s share
Do you have a different approach, or thoughts on what we’re doing? We’re open to further improvements in how we approach live services and would love to hear your thoughts, input and experience in the comments below.