Compliance · Form design · Research-led · Pattern decisions

Designing a compliance platform when every pattern breaks

Every user struggled with the corporate relationship mapping section. Not some users. Every user. That's not a domain complexity problem. That's a design problem.

Duration: 2023 – 2024
Year: 2024
Outcome: Resolved a 100% user failure rate through evidence-based pattern decisions

The platform

The service was a supplier information and compliance platform: a system that enables organisations and sole traders to register, declare key information about their business, and share that data with a procurement body for compliance purposes.

Every supplier wishing to participate in major procurement contracts needed to complete the service. That meant the people using it were not casual users. They were business owners, finance directors, and legal representatives who knew their domain but were navigating an unfamiliar digital process under commercial pressure.

The design challenge was substantial: the service collected complex, interdependent business data across multiple journeys, constrained by policy requirements, built on an established design system with governance rules around component usage. Getting the design wrong didn’t just mean a bad experience; it meant suppliers unable to participate in contracts, and commercial processes failing.

The constraint

Two types of constraint shaped every decision.

The first was the design system. The service was built using an established component library with governance rules, not a blank canvas. Decisions about which patterns to use, and when to deviate from them, required justification. Convention had weight.

The second was policy. This wasn’t a product team with full autonomy over requirements. The service existed to implement specific legal and commercial frameworks. Some design decisions couldn’t be changed regardless of user feedback; others could be changed, but only if the change could be shown to still meet the policy intent. Design had to work within that boundary, and sometimes work with it.

The crisis: when 100% of users failed

The most critical part of the service was a section called Connected Persons: the declaration of corporate relationships, ownership structures, and associated individuals relevant to the supplier entity.

In the first round of user research, every single user struggled with this section. Not some. Not most. Every user experienced difficulty of some kind: confusion about what was being asked, uncertainty about how to proceed, errors in their responses.

This is not a data point you can attribute to user error. When 100% of users fail, the design has failed. Full stop.

The problem was structural. The Connected Persons section required users to map relationships between legal entities and individuals, specifying who owns what, who controls what, and who is associated with whom, using an interface that presented this as a sequence of independent questions. But the relationships aren’t independent. They’re a network. Presenting a network as a sequence created a fundamental mismatch between the mental model and the interface.

We went back to the drawing board. We mapped the relationship types, modelled the decision logic, and redesigned the journey to reflect how the relationships actually work, giving users a clearer picture of what they were building before asking them to build it. Subsequent research showed the failure rate dropping significantly with each iteration.
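The structural mismatch can be made concrete with a minimal sketch. All names here are hypothetical, not the service's actual data model: the point is that connected persons form a graph of typed relationships, which a sequence of independent questions cannot represent.

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Relationship:
    """A typed edge between two parties: who owns, controls, or is associated with whom."""
    source: str   # e.g. a parent company or an individual
    kind: str     # "owns", "controls", or "associated_with"
    target: str   # the entity the relationship points at

@dataclass
class SupplierEntity:
    name: str
    relationships: list[Relationship] = field(default_factory=list)

    def connected_persons(self) -> set[str]:
        """Every party reachable through a declared relationship —
        the network that the sequential questions were obscuring."""
        parties = ({r.source for r in self.relationships}
                   | {r.target for r in self.relationships})
        return parties - {self.name}

    def owners_of(self, target: str) -> list[str]:
        return [r.source for r in self.relationships
                if r.kind == "owns" and r.target == target]

supplier = SupplierEntity("Acme Ltd")
supplier.relationships += [
    Relationship("Holdings plc", "owns", "Acme Ltd"),
    Relationship("J. Smith", "controls", "Holdings plc"),
    Relationship("J. Smith", "associated_with", "Acme Ltd"),
]
print(supplier.owners_of("Acme Ltd"))          # ['Holdings plc']
print(sorted(supplier.connected_persons()))    # ['Holdings plc', 'J. Smith']
```

One individual appearing in two relationships at once (controlling one entity, associated with another) is exactly the case a page-by-page sequence of yes/no questions struggles to capture.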

Breaking convention: full disclosure

One of the most instructive decisions on the project involved a section called Financial and Economic Standing, where suppliers declare whether there are prior audit findings and upload supporting evidence.

The initial approach was conventional: one thing per page, staged disclosure, three to five boolean questions depending on responses, file upload, check your answers. This is exactly what the design system was designed for.

Research showed it wasn’t working. The questions were structurally similar: different enough to require individual answers, yet similar enough that users lost their orientation. They couldn’t see where the journey was going, which made it harder to answer the question in front of them. Completion rates and confidence both suffered.

We made a deliberate decision to break the pattern. Instead of staged disclosure, we showed all the questions simultaneously: full disclosure, everything visible at once, so users could see the complete scope of what they were being asked before committing to any answer.

This required justification. Breaking from a design system convention isn’t a small decision. We documented the research finding, articulated the alternative approach, stated the principle behind it (users needed the full picture to answer confidently), and tested it. The second research round confirmed improved understanding and task completion.

The pattern wasn’t wrong in general. It was wrong for this specific context, with this specific set of questions, for these specific users. Knowing the difference, and being willing to make the case, is the job.
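The contrast between the two approaches can be sketched in a few lines. The question wording here is hypothetical, not the service's actual content; what matters is that staged and full disclosure are two renderings of the same underlying question set.

```python
# Hypothetical question set for a Financial and Economic Standing section.
QUESTIONS = [
    ("insolvency", "Has the organisation been subject to insolvency proceedings?"),
    ("audit_qualified", "Have the accounts been qualified by an auditor?"),
    ("tax_noncompliance", "Has there been a finding of tax non-compliance?"),
]

def staged(answers: dict) -> list[str]:
    """One thing per page: only the next unanswered question is shown,
    so the user never sees the full scope of what is being asked."""
    for key, text in QUESTIONS:
        if key not in answers:
            return [text]
    return []

def full_disclosure(answers: dict) -> list[str]:
    """Everything visible at once: the user sees every question
    before committing to any answer."""
    return [text for _, text in QUESTIONS]

print(len(staged({})))            # 1 — a single question, no wider context
print(len(full_disclosure({})))   # 3 — the complete scope up front
```

The design decision is which renderer to use for a given question set; the research finding was that for structurally similar questions, the second renderer served users better.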

The craft: where detail becomes design

Three smaller decisions, each of which illustrates why the details are not small.

Hint text. Form fields are not self-describing. What looks obvious to someone who designed the service is often ambiguous to someone using it for the first time under time pressure. The role of hint text is to close that gap: not to explain what a field is for in abstract, but to answer the specific question a user is likely to be asking at that moment. On this service, hint text was treated as a design decision, not a copy task. Each piece of contextual guidance was written against a specific user need identified in research.

Address replication. Suppliers frequently needed to provide the same address for multiple entities: registered address, trading address, associated individuals’ addresses. Each time, they would re-enter the same data. We introduced an address replication feature: if a user had already provided an address, they could select it rather than re-enter it. Small change, significant friction reduction, measurable improvement in accuracy.
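The replication idea reduces to offering deduplicated, previously entered addresses before the blank form. A minimal sketch, with hypothetical field names rather than the service's actual schema:

```python
def normalise(address: dict) -> tuple:
    """Key used to detect that two entries are the same address,
    tolerant of retyped whitespace and casing."""
    return tuple(str(address.get(f, "")).strip().lower()
                 for f in ("line1", "town", "postcode"))

def reusable_addresses(previous: list[dict]) -> list[dict]:
    """Deduplicated list of addresses the user has already provided,
    offered as one-click choices before the blank entry form."""
    seen, options = set(), []
    for addr in previous:
        key = normalise(addr)
        if key not in seen:
            seen.add(key)
            options.append(addr)
    return options

entered = [
    {"line1": "1 High Street", "town": "Leeds", "postcode": "LS1 1AA"},
    {"line1": "1 High Street ", "town": "leeds", "postcode": "LS1 1AA"},  # same address, retyped
    {"line1": "2 Mill Lane", "town": "York", "postcode": "YO1 7HH"},
]
print(len(reusable_addresses(entered)))  # 2 distinct addresses to offer
```

The accuracy gain follows directly: selecting a stored address cannot introduce a typo, while re-entering one can.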

Postcode search. The UK address entry journey was improved with postcode lookup: enter a postcode, select from matched addresses, confirm or edit. This is a well-established pattern, but the implementation details matter: how the results are presented, how many results surface, how edge cases (partial postcodes, no results, overseas addresses) are handled. We treated each of these as a distinct design problem, not an implementation detail.
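The edge cases named above can be sketched as distinct states of the lookup journey. The lookup itself is stubbed here, and the simplified postcode pattern is illustrative only; the point is that partial postcodes and empty results route to manual entry rather than a dead end.

```python
import re

# Simplified UK postcode shape for illustration (real validation is looser).
UK_POSTCODE = re.compile(r"^[A-Z]{1,2}\d[A-Z\d]?\s*\d[A-Z]{2}$", re.I)

def find_addresses(postcode: str) -> list[str]:
    """Stand-in for a real lookup service — hypothetical, not a real API."""
    data = {"SW1A 1AA": ["Buckingham Palace, London"]}
    return data.get(postcode.upper().strip(), [])

def postcode_search(raw: str) -> dict:
    postcode = raw.strip()
    if not UK_POSTCODE.match(postcode):
        # Partial or overseas input: don't block the user — offer manual entry.
        return {"state": "manual_entry", "results": []}
    results = find_addresses(postcode)
    if not results:
        # Valid format but no match: again, route to manual entry, not a dead end.
        return {"state": "no_results", "results": []}
    return {"state": "select", "results": results}

print(postcode_search("SW1A 1AA")["state"])   # "select"
print(postcode_search("SW1")["state"])        # "manual_entry"
print(postcode_search("LS1 1AA")["state"])    # "no_results"
```

Each returned state corresponds to a different screen, which is why the edge cases are design problems rather than implementation details.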

Aligning with policy

One of the more unusual challenges on this project was making pattern choices that satisfied user needs and policy requirements simultaneously, and, when the two conflicted, being transparent about which constraint was driving the design.

For certain sections, the policy intent prescribed a specific type of interaction: a yes/no declaration, a specific order of questions, a particular confirmation mechanism. The design work wasn’t to override these requirements but to implement them in a way that users could actually follow. Sometimes that meant accepting a design that wasn’t ideal from a pure UX perspective, documenting why, and focusing effort on the elements that were within scope to improve.

Being honest about what can and can’t be changed: with the team, with stakeholders, in the design histories. That’s part of the work.

Outcome

A service that started with 100% user failure on its most critical journey was iteratively improved through documented, research-led design decisions to a point where users could complete their compliance declarations with confidence.

The design histories for this project (nine separate posts documenting specific decisions, their rationale, their testing, and their outcomes) remain published and accessible. They show the full arc: from orientation through crisis through craft through delivery.