The document discusses Service Standard Assessments conducted by GDS. It provides an overview of:
1) Why assessments are conducted: to ensure services meet the Digital Service Standard and protect the quality of GOV.UK, as well as to provide feedback and help teams solve problems.
2) What assessments involve: a four-hour peer review with an assessor panel and service team members to determine if a service meets the Service Standard.
3) How assessment panels aim to create a safe space for discussion by making sure all voices are heard, teams have the context to explain their services, and no judgements are made.
3. Sam Villis @stamanfar
Local Collaboration Lead - MHCLG
Clara Songaila @clarasongaila
Digital Engagement Manager - GDS
Steve McCready @stevemccready
Spend Controls Lead - GDS
5. What we will cover:
● Why and how we assess services against
the Service Standard
● How assessments help teams to deliver
the best services for citizens
● Code of Conduct
9. The Standards Assurance Team in GDS run
the digital and technology Spend Control
and Service Assessment processes on
behalf of the Cabinet Office.
10. The aim is to reduce government waste
and ensure departments make the right
technology decisions to deliver great digital
services to citizens.
11. Maybe we could all talk more about what the service
standard enables: a transparent way to measure progress,
a lodestar for government to work towards, a fair and
reasoned way to avoid departments marking their own
homework, an important sponsor of working in the open
and, most importantly, a catalyst for better services for
users.
12. Senior Technology Advisors (STAs)
work with departments to approve
(...or not) requests to spend public
money.
13. Digital Engagement Managers (DEMs)
work with departments to make sure
that the services they are creating meet
the Digital Service Standard.
19. ● A four-hour peer review of a service,
● using an agenda-guided discussion,
● with up to five assessors on the panel,
● and five members of a service team,
● to understand if the service meets the Service Standard.
54. It’s impossible to know everything about
other departments
Dedicating time to understanding the
context of services is imperative
55. This extends outside the realm of the
assessment…
Service assessors can request more
information from teams before or after the
assessment.
56. 3. Work hard to create a space free of
judgement
57. Looking out for language
Making sure everyone understands the
details
Clarifying
58. 4. Make sure that everyone in the room
feels that the time has been valuable
59. We believe it is important for assessors to
look for opportunities to add value - if we
can
Building connections, making introductions,
suggesting reading or case studies
Important to note we’re not speaking ON BEHALF of GDS - so anything I overcommit to isn’t binding
Spend controls were introduced with the creation of GDS back in 2012.
They exist across multiple functions (property, marketing)
Spend controls are delegated from the Treasury
Team of 12
Individual relationships with departments
The important part of the team.
Guidance - organisation - coordination
STAs and DEMs are reliant on each other to ensure that services are delivered that adhere to the standards - STAs use assessment outcomes and DEM engagement to drive spending decisions, DEMs use spending conditions to help frame an assessment.
Clara
Services are assessed to make sure they meet the Digital Service Standard and to protect the quality of GOV.UK.
This is important as users don’t have a choice when they interact with Government.
They need to be able to carry out the task they need to simply and easily, and have trust in the service they are using.
By creating services that meet the Digital Service Standard, and therefore use GOV.UK design patterns, users can more easily trust that they are using the right service and doing the right thing to get the task done.
They also help you get feedback from a panel of experts and solve problems with your service as you build it.
This is important as we all face many challenges in government.
It can be really easy for us to remain in silos, whether that be our teams, our programmes or our department. Assessments enable us to learn about other contexts and ultimately share how we’ve overcome challenges, passing on that knowledge to more teams, programmes and departments.
Assessments are a great forum for improving our own capability whether you are a service team member or an assessor in the assessment.
Clara
Assessments are positioned as a peer review as that is exactly what they’re supposed to be, a conversation from one colleague to another to provide feedback in a constructive way.
The agenda runs with the service team giving a show and tell for roughly the first 45 minutes to an hour. This gives the team an opportunity to explain any context that the assessors might not know and ‘show the thing’ to the panel.
Then it moves on to sections based on 5 specific themes: user research, team, design (including service design and content design), technology and service performance.
For each of those sections the assessors and service team are given a short list of questions before the assessment so they can get an idea of the types of questions they might ask or be asked.
The 5 assessors reflect the different themes mentioned in the agenda. There is a lead assessor who covers team and leads the assessment, then a user research, design, technology and analytics assessor for the other themes respectively.
The idea is that the service team mirror those on the panel, so we expect the service manager and product owner, the user researcher, a designer (interaction, service or content - sometimes 2 of those, or on the rare occasion all 3!), the technical architect and the performance analyst.
The Digital Engagement Manager works with the service team if they are unsure of who to bring along to an assessment - it might be that a legal or policy person has been working on the team and can bring more relevant knowledge to show how the team have been working collaboratively.
The panel then collectively decide whether the service meets the standard or not. The result and the feedback, written up in the form of a report, are communicated by the Digital Engagement Manager, who also sets up any further conversations that might be needed in more complex situations.
There is information on GOV.UK for service teams to understand if the service they are working on needs to be assessed by GDS.
The Digital Engagement Manager allocated to the department will also help them to do this.
Sam
From Matt Jukes: We need to talk about Service assessments
Matt talks about assessments as being a necessary evil: they build capability and give assurance that government transformation is underway, but he also says that they are problematic, especially when they are linked to funding… and there are other opinions of assessments out there which are far worse…
Panic! If you know anything about assessments this is probably the main thing you have heard about them.
“Bricking it”
Lots of service teams view service assessments as a scary exam marking process.
Eeek!
Sometimes it’s much worse
This is from a weeknote by Ian Ames at HM Land Registry.
It feels personal. People take it personally - this is their work!
That’s why it’s so important to create a safe space.
But if we go back to Jukesie’s blog you can see that the aim has long been to provide a peer review system for services.
Peer review. This is what we want to foster.
And if we think about assessments as aiming for a peer review process it all looks quite different.
We know from showing and telling and working in the open that bringing in new perspectives on a project can help to improve it - diversity of thought can be difficult when you’re in the rhythm of agile. But assessments give a really good opportunity to break that routine.
Communities of practice - an opportunity to build capability and to learn from best practice (for both the service team and the panel!)
That’s one of the principles of the One Team Gov movement - working across boundaries and breaking silos to make better services for the citizen.
Andy - Gambling Commission
A good way to get validation that your team are on the right track!
Colleagues moving from central to local government are taking their experiences of service assessments with them. Local authorities (with increasingly squashed budgets) are starting to see the benefits of a check and challenge process.
Clara:
We’ve made improvements that reinforce this being a peer review and guidance opportunity - we’re doing this by engaging with service teams earlier in the assessment process and development cycle.
We can have conversations earlier about the types of areas that we know service teams find a challenge when trying to solve a problem for users. We’ve seen the value of this in service teams being able to better express how they’ve addressed these challenges during assessments. We’re getting more and more feedback that assessments are a good place to voice those challenges and get cross-government advice. Service teams feel assessments are a safe place to do so because, through early engagement with people like me, they are more comfortable with the idea that the assessment is a peer review!
Clara:
An agenda-guided discussion also helps enable that conversational style of peer review.
I explained earlier that the agenda has 5 different sections, each with questions under them, so that the service team and assessors know what might be asked. We use this as a guide, as historically there have been long lists of prompts used as a checklist. An assessment is about enabling an open conversation, not ticking points off against the standard.
The agenda also allows for change - if the service team feel they haven’t been able to cover something, there is space in the discussion for that to happen, and vice versa with the assessors.
This agenda-guided approach has improved people’s attitude towards assessments.
Sam and Steve
I love finding out what other teams are doing because it’s a great opportunity for me to learn.
When I asked the question on Twitter Jess replied with this gem!
Andy - Land Registry
From Andrew Travers, On service assessments, Medium
We decided to become assessors because it’s a privilege to see great teams in action and learn from them.
From The other side of the table: leading service standard assessments by Kit Collingwood
It’s really important that we create the right environment for a positive, useful, peer review session.
Sam
Steve
It’s not a job interview
It should be a relaxed atmosphere that enables service teams to show the great work they have done
Creating the opportunity for people to have a say (through introductions, explaining)
Often the panel of assessors won’t know the details…
Reinforcing that they can talk openly and won’t be quoted verbatim.
It should be a valuable experience for everyone
Some of our most successful assessments are where relationships continue after the assessment
Really important, and it fosters the cross-govt collaboration that GDS are there to encourage.
Sam
We might not know each other ahead of the day, so we need to quickly build trust with the panel members.
The panel is just like a multidisciplinary team; we complement and challenge each other to reach a consensus.
Once the assessment is over, the Lead Assessor has to translate all of those views into a service assessment report.
Writing a service assessment report requires turning things that could be construed as negative into workable, useful advice
Reports should enable service teams to unblock issues within their organisation. They are supposed to be pragmatic but reinforcing of the standard.
Remember Trilly? Well when we write service reports we also have one eye on the organisation. If we can, we think about how we can support the service team and give them a lever to make changes within their organisation.
Hattie Kennedy, who by rights should really be up here with us, kicked off this work. We first looked for codes of conduct from established places, including One Team Gov, Gov Camp, other conferences and anywhere else that the good people of Twitter suggested.
Here’s one - it’s from One Team Gov. The team reviewed them all and grouped them by theme; there were eight themes in total, which we distilled into the following statements.
Participants are encouraged to be open, honest, respectful and discreet. The aim is to empower everyone to make better government services focused around user needs.
Participants should ensure they respect and are sympathetic to different individuals’ needs. Assessments should be as safe, accessible and open as the services we are building.
Ensure everyone is heard, be aware of those around you, and do your best to avoid talking over your neighbour or monopolising the conversation.
Assessments must be a fair and honest reflection of the evidence presented. Make sure you check or clarify if you don’t understand a question or answer.
Be mindful of how people feel when assessing their work. Ensure you maintain a positive spirit and mutual respect for one another. Avoid making discussions personal, and ensure feedback is positive, encouraging and constructive.
Retrospective prime directive:
"Regardless of what we discover, we understand and truly believe that everyone did the best job they could, given what they knew at the time, their skills and abilities, the resources available, and the situation at hand."