
Failure isn’t the end – it’s where better services begin

Liz Glidewell explains why failing a GDS service assessment is a normal part of the improvement process – and how she supported the Be Part of Research team through multiple assessments to build a better, more accessible service for their users.

Government and NHS services have a duty to work for everyone, including those with low digital confidence and non-digital users. This is why services need to pass a GDS service assessment.

It’s not unusual for a service to fail its first assessment. But what happens next?

For the teams involved, it can be tough. While we understand that failing is a source of learning, often there is emotional baggage attached to the term. Many of us feel fear and shame when we ‘fail’ at something, which can limit what we go on to achieve.

That’s why I embrace ‘failing higher’ – failing smarter and faster – in my role as a GDS assessor.

Failing an assessment does not mean we give up on the service. Instead, it’s an important part of the improvement process. Service assessors give teams constructive, actionable feedback, grounded in the principles of user-centred design (UCD). This supports service teams to iterate and create a better service for everyone.

A great example is Be Part of Research (BPOR), which aims to make it easier for health researchers and potential study participants to find one another.

Briefly, the BPOR team built a website where researchers could register their studies, making them searchable by the public. The public, in turn, learn more about research projects and can apply to participate.

Their first beta service assessment took place in March 2022, but it took 2 more reassessments before the team met every point of the Service Standard in October 2024.

I was part of the assessment team for the second and third assessments. This is how we supported the BPOR team to 'fail higher' and ultimately create a service that better met the needs of all its users.


1. Understand users and their needs: the foundation of any good service

Designing services for a group as diverse as ‘the public’ is challenging. We need to identify all our users and gain a deep understanding of their needs. This is the heartbeat of user-centred design.

It is important to map all users who interact with our services, not just end users. Initially, the team conducted research with ‘healthy volunteers’ who preferred online journeys, as well as user groups with complex co-morbidities and their carers.

They also included public health and social care participants, and researchers, identifying their needs to maximise uptake in studies.

The team went beyond sampling people with different levels of digital literacy to recognise people ‘under-served by research’, such as those with:

  • low digital confidence
  • disabilities, impairments and access needs
  • English as a second language
  • low health literacy
  • ethnic minority backgrounds
  • a distrust of government or healthcare services

The team sampled and screened for these characteristics to build confidence in their findings.  


2. Solve a whole problem: consider the full journey of all users

People want seamless journeys. We need to understand users’ needs across their end-to-end journey.  

Following assessment, the team mapped out the end-to-end user journey for all their users.

They introduced:

  • a self-serve digital journey
  • a facilitated non-digital journey
  • a combination of digital and non-digital touchpoints across the end-to-end journey

Users can now learn about research, register, manage their experience and receive study results via the channel that works for them, including online, phone, paper or in person.

This ensured an equitable experience for everyone.


3. Provide a joined-up experience across all channels: consistency matters

We expect services to work seamlessly, whether we’re accessing a website, using an app, or calling a support service.

Initially, the BPOR service encouraged users to sign up online. Following assessment, they modified the different journeys for offline, web, and app experiences to create a cohesive experience across channels.

They are continuing to research how under-resourced research projects can deliver non-digital options, and have also created feedback loops to optimise across different methods of access.


4. Make the service simple to use: less is often more

Users often engage with government and NHS services with specific goals. Making things simple with as few steps as possible is crucial, particularly for those who are time short, stressed or unwell. Overly complex services lead to frustration and user disengagement.  

BPOR’s initial multi-step registration process confused research teams. Following assessment, the team automated it, freeing up time to support research teams more effectively.


5. Ensure accessibility for all: meeting users where they are

It’s essential to make services accessible to everyone, regardless of their abilities, language skills or digital literacy. Accessibility is not just about meeting the Web Content Accessibility Guidelines (WCAG); it’s about empathy and inclusivity.

While the BPOR service met WCAG standards, it was still difficult for users to explore the large range of research opportunities on the registry.

The team conducted further research with users with different accessibility and language needs, and introduced monthly accessibility audits linked to design updates so the service continuously evolves and remains inclusive for all.

They made inclusion a core part of every design decision.


Conclusion: failing higher and acting on learning along the way

The path to creating effective services for the public is a journey of continual improvement. Service standard assessments show us where services shine and where they could ‘fail higher’. With constructive, clear, and actionable feedback, and a commitment to user centred design, reassessments are opportunities. The BPOR team used failure to build more accessible services to ensure no user gets left behind. 

Thank you to my colleagues Gary Cullen (Senior Product Manager, National Institute for Health and Care Research), Nicola Taylor-Dawson (Senior User Centred Design Manager, National Institute for Health and Care Research), James Higgott (Lead Product Manager, NHS England) and Matthew King (Senior Content Designer, Department of Health and Social Care) who co-created this blog. 


Last edited: 12 September 2025 1:50 pm