Consent is still treated as a one-off checkbox, while organisations’ use of data is constantly changing — including its use in training AI systems.
This post introduces three new patterns to our design catalogue that reframe consent as something visible, revisitable, and meaningful.
These patterns support consentful experiences that help people understand and control how data about them is used. They give organisations confidence that they’re collecting data in the right way and using it in ways people are comfortable with.
Consent is broken
When people join a new service, they’re typically asked to agree to lengthy terms and conditions. These agreements describe what data the service will collect about them, and how that data will be used. Apps often add their own layer of permission-setting, asking people to grant access to device sensors.
In theory, these mechanisms exist to obtain a person’s permission. In practice, people often accept them without meaningful understanding. This form of ‘consent’ is optimised for legal compliance, not giving people control.
Worse still, it’s one-and-done consent. Once the agreement is accepted, data may flow unseen in the background. The details of the agreement and controls for changing it tend to be buried deep in settings menus. People are left with little visibility of what’s happening or how to revise their choices. Consent becomes out of sight and out of mind.
Not only is this broken state bad for people, but it’s also bad for businesses. Confusion about how services may use data — including for training AI — can quickly deter people from using them, as seen in recent reactions to services like WeTransfer.
From consent → consentful
Our Responsible Technology by Design framework emphasises transparency, accountability, and participation. These properties make services more trustworthy, and they’re also drivers for consentful experiences.
Consentful experiences move beyond “one and done” agreements. Instead of collecting permission once and burying it in a settings menu, services actively support people’s ability to understand and shape how data about them is used over time.
When experiences are consentful, people feel both respected and safe. They can understand what they’re agreeing to, monitor what’s happening, and correct course when needed. Our new patterns focus on supporting these qualities.
Make consent decisions meaningful
Designing for meaningful consent is difficult. Interactions need to be clear and accessible, but they also need to engage people to make a considered decision.
Research on digital HIPAA agreements shows this challenge clearly: improving the user experience made forms easier to complete, but didn’t make people more informed. People simply found it easier to click through.
The Trust, Transparency and Control Labs team at Meta argue that effective consent design should slow people down, creating space and time for reflection before agreeing. Small, intentional frictions can help, such as brief task lockouts or short comprehension checks.
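To make the comprehension-check idea concrete, here is a minimal sketch. The question, the correct answer, and the three-attempt flow are all hypothetical illustrations, not a real consent API:

```python
# Hypothetical sketch: a short comprehension check before consent is
# recorded. The question, answer, and retry limit are illustrative only.
def consent_with_check(answer_func) -> bool:
    """Record consent only after a correct comprehension answer."""
    question = "Will your photos be used to train AI models? (yes/no)"
    correct = "yes"  # per the hypothetical terms shown to the person
    attempts = 0
    while attempts < 3:
        if answer_func(question).strip().lower() == correct:
            return True  # the person demonstrated understanding
        attempts += 1  # wrong answer: create space to re-read, try again
    return False  # consent is not recorded without comprehension
```

The small friction here is deliberate: a wrong answer doesn’t block the person permanently, it just slows the click-through reflex.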
Another useful approach is just-in-time consent, already in our pattern library. Rather than gathering permissions up-front, services ask only for what they need in the moment. This makes each decision contextual and easier to understand.
But there’s a balance to strike. Too many prompts, or too much friction, leads to fatigue. People begin to ignore requests entirely.
This is why it’s especially important to focus on the moments where engagement truly matters — such as when data is about to leave a trusted environment. Our first new pattern — consent to data sharing with a third party — addresses this critical decision point.
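As a rough sketch of that decision point, a service might surface the recipient, the purpose, and the exact fields involved at the moment data would leave the trusted environment. All names below are hypothetical, not part of any real product:

```python
# Hypothetical sketch of a third-party sharing consent moment.
from dataclasses import dataclass


@dataclass
class ShareRequest:
    recipient: str        # the third party that would receive the data
    purpose: str          # why the data would be shared
    data_fields: tuple    # exactly which fields would leave


def request_share(req: ShareRequest, ask_user) -> bool:
    """Ask at the moment data would leave the trusted environment,
    naming the recipient, purpose, and fields in plain language."""
    prompt = (
        f"Share {', '.join(req.data_fields)} with {req.recipient} "
        f"to {req.purpose}?"
    )
    return ask_user(prompt)  # True only if the person agrees now
```

Because the prompt is built from the specific request, the person sees what would actually be shared, with whom, and why, rather than a blanket up-front grant.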
Make consent use visible
People’s decisions don’t end once permission is granted. To stay in control, they need visibility into how those decisions are being used.
Today, this visibility usually lives in background tools, including reports, logs, or settings pages that people rarely visit. Apple’s Privacy Report and Meta’s Off-Facebook Activity tool are examples. They show when permissions were granted and how they’ve been used, but only if people know they exist and choose to look.
Our second new pattern, show how consent has been used, captures this category of tools. In our research, people appreciated being able to see what they had authorised and what had been done with it. It gave them confidence, and it made the system feel more accountable. While surfacing these tools creates opportunities for more consentful experiences, the challenge is presenting complex histories in ways that feel meaningful rather than overwhelming.
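One way to sketch such a record is an append-only log of permission uses that can be filtered for display. The structure below is illustrative only, not modelled on any particular product:

```python
# Hypothetical sketch of a consent-usage record: each use of a granted
# permission is logged so it can be shown back to the person later.
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class ConsentLedger:
    events: list = field(default_factory=list)

    def record_use(self, permission: str, action: str) -> None:
        """Append one timestamped use of a granted permission."""
        self.events.append({
            "permission": permission,
            "action": action,
            "at": datetime.now(timezone.utc).isoformat(),
        })

    def history(self, permission: str) -> list:
        """What has been done under this permission, for display."""
        return [e for e in self.events if e["permission"] == permission]
```

The display challenge the post describes then becomes a filtering and summarising problem: the raw log exists, and the design work is deciding which slices of it are meaningful to show.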
Our third new pattern is remind people of their choices. Like just-in-time consent, it surfaces permissions at the moment they matter. For example, notifying someone before a service reuses a data-sharing permission they granted earlier.
The pattern also includes checking in on long-standing permissions. Apple’s iOS reminders about apps with ongoing access to location or photos are a good example.
As always, there’s a balance to strike: if reminders are too frequent, they risk losing their impact.
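A minimal sketch of that balance, assuming a hypothetical check-in interval, might gate reminders on both how old a permission is and how recently the person was last reminded:

```python
# Hypothetical sketch: flag long-standing permissions for a check-in,
# capped so reminders don't fire often enough to cause fatigue.
from datetime import datetime, timedelta, timezone


def due_for_reminder(granted_at, last_reminded, now=None,
                     interval_days=180):
    """A permission is due for a check-in only if it is older than the
    interval AND no reminder has fired within that same interval."""
    now = now or datetime.now(timezone.utc)
    window = timedelta(days=interval_days)
    if now - granted_at < window:
        return False  # too recent a grant to check in on
    if last_reminded is not None and now - last_reminded < window:
        return False  # reminded recently; avoid reminder fatigue
    return True
```

The `interval_days` value is an assumption for illustration; in practice it would be tuned per permission type, since ongoing location access probably warrants more frequent check-ins than, say, a granted calendar export.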
Bringing consent into sight, and into mind
By introducing these three patterns, we’re shifting consent from something hidden and forgettable to something people can see, think about, and change over time.
- Consent to data sharing with a third party: focuses on the key consent moment when data is about to be shared with a third party.
- Show how consent has been used: gives people a reviewable record of what they have authorised and what has been done with it.
- Remind people of their choices: resurfaces consent decisions at opportune moments so their effects stay visible.
Get in touch
We work with private and public sector organisations worldwide on some of the most challenging implementations of AI systems. We make trust actionable, transforming principles into execution that makes an impact.
Get in touch to hear more about our work, or to pilot one of these patterns in your organisation.