Brace yourselves; today you are in for a treat. I have invited the one and only Valeria Adani to share her thoughts on Trust and Technology. I have known Valeria for over 10 years, and she is an outstanding design leader. Valeria has recently made the bold move to join Project by IF, a strategy and design firm that specialises in trust, as a partner. The company is at the forefront of thinking and experimentation on trust and technology, and it is also female-founded, which makes it just impossible to ignore.
Before starting, please take a moment to subscribe to this blog.
Happy reading!
Marzia
It’s Time to Design for Trust
Design has long focused on creating “better” experiences for people and responding “better” to their needs. Over the years, user experience efforts have converged on the pursuit of seamless, effortless, painless interactions. The work of designers centres on making things not just easy, but enjoyable and even delightful for users.
However, as we advance, it’s becoming evident that good UX/CX alone will no longer suffice. We need organisations to design for trustworthiness too [1].
Can you trust a person? Can you trust that a restaurant will serve you food that doesn’t harm you? Can you be sure that the personal details you’ve just entered into a healthcare app will go to your GP, and not a scammer?
Trust is a feeling [2]; it is the liminal space that defines our relationships. It cannot be saved or stored, and it certainly can’t be built. Trustworthiness, the quality of being deserving of trust, is perhaps more important when it comes to organisations. You cannot force someone to trust you, but you can demonstrate that you are trustworthy: for example, you can consistently meet expectations (e.g. always showing up on time), or you can strive to always be accountable (e.g. taking responsibility for your mistakes). Most importantly, trustworthiness can be designed and measured.
In an era marked by growing concerns over security [3], privacy [4], and authenticity [5], being trustworthy emerges as a crucial component of success [6] (and survival too).
This Isn’t a New Problem
We are living in an era where organisations and businesses need to be intentional about trust. This is not just the right thing to do; it’s critical to commercial success too [7]. What is your company doing to reduce the trust gap between you and your customers?
Nowadays there are more and more people and organisations to trust: the world is becoming more connected, more automated and more complex. We are increasingly asked to trust people and organisations that we have never met, and may never know, with very important things.
Technology Is Making Things More Complex
We increasingly rely on, delegate to and depend on technology: it is setting new expectations for what is possible. All of this while we don’t really understand it, don’t know in whose interests it operates, and don’t know how to put up appropriate guardrails.
Trust in technology [8] is needed. But it can also feel vague [9], overwhelming [10], inaccessible [11] and impractical [12]. With artificial intelligence systems permeating various facets of our lives, the intricacies of trust [13] are evolving into a systemic challenge, and discerning between reality and fabrication is becoming increasingly difficult [14].
Mind the (Trust) Gap
There’s a big gap between trustworthiness and the products and services we are designing today. For businesses, a lack of trust translates into very tangible commercial outcomes: reduced customer loyalty, declining sales and damaged reputations [15], to name a few. For society, it can lead to a loss of faith in public institutions [16], reduced civic engagement [17] and polarisation [18].
The trust gap refers to the disparity between the level of trust people place in certain organisations or people and the level of trust those entities actually deserve or have earned. The trust gap [19] widens when there is a significant difference between the perception of trustworthiness and the reality of an entity’s actions or intentions. Broken promises [20], unaligned expectations [21] and a lack of transparency [22] all contribute to this.
This gap manifests in many ways: from the inherent distrust you feel when you click a cookie banner, to an automated fine that you are helpless to correct, to an app that doesn’t work for you because its algorithm is discriminatory.
For businesses, the gap shows up in a myriad of critical risks, from algorithmic bias to regulatory readiness to being left behind by the next technology wave.
For product teams, it’s the blocker of wanting to do the right thing but facing KPIs that prioritise conversion over comprehension, or not knowing how to prioritise trust when the backlog is already full.
For regulators, the trust gap looks like the struggle to stay ahead in a market continuously disrupted by exponential technologies, to understand nuanced and changing user behaviour, and to translate that understanding into impactful guidance.
The Time to Design for Trust is Now
At IF, we believe there has never been a more important time to design responsibly. The reason I joined IF this year [23] is precisely that, as designers, we can and must play a crucial role in shaping a more trustworthy future. As designers and design leaders, we are accountable. We shouldn’t behave as if we were outsiders or victims of the system that creates potentially harmful, unethical and untrustworthy products and services.
Much more needs to be done to ensure that care for people is always at the heart of product and service development.
Responsible Technology by Design Framework
To guide our work in this space, at IF we created the Responsible Technology by Design Framework [24]. It helps us organise our work in designing and creating trustworthy products and services, and it provides us with a shared language around responsible technology needs, principles and patterns.
The framework is a work in progress and continues to evolve as our thinking evolves. It helps us assess and design products and services that deliver human experiences that make people feel safe, respected and empowered.
Filling the Gap by Looking at the Trust Stack
At IF we believe trust influences every aspect of a product or service. We think that building trustworthy experiences requires system change, or ‘full-stack’ [25] change, which is why we work across experience (research and design), tech and data, organisational change, and policy. Building trust also requires clear intention about how the different elements play with each other: from new user interface patterns that give people agency, to innovation in the underlying technical infrastructure, to real accountability.
Trust needs to be embedded in how a product or a service is designed, delivered, maintained and scaled. That requires an evolution in how design teams work with other parts of the organisation.
Regulators Can Be Useful Co-Designers
Following regulations shouldn’t be perceived as a necessary hurdle or a mere compliance tick-box exercise. It’s time to see regulators as co-designers, especially at a time when design affects not only trust but also safety [26]. In the past few years, many regulatory efforts have emerged in the EU, US and UK as a direct result of a lack of trust (for example GDPR [27], the DSA [28], the DMA [29], and the ICO & CMA joint paper on harmful design [30]), and more regulatory action is materialising with the rise of AI (the EU AI Act [31] and the US Executive Order on AI [32], to name a few).
That said, regulation can only provide a certain level of guidance, and it deliberately leaves grey areas. Very often the final user experience is crafted by designers with very little guidance. Most design and product teams lack confidence and need new tools, new design principles [33] and frameworks to cope with this complexity.
Collaborating with regulators as co-designers is an organisational change that we will see more of in 2024. As a design leader, you need to ask yourself:
How well does your design team understand the value of regulation?
Are they equipped with the right tools?
How do you work with your legal, safety and compliance teams from the start?
Closing the Trust Gap Can’t Wait
Trust, transparency and accountability aren’t new problems, but organisations need a new approach, and designers can play a crucial role in it.
We need to solve trust challenges and make trust interventions actionable by looking at the full trust stack, from UI to policy, from organisational change to technical architecture, and be organised for that. This shift is not merely procedural but a cultural transformation, one that strengthens design’s bonds with other departments, and design leaders can play a fundamental role in it. It’s time to start designing for trust, with intention.
Why Trust Is More Important Than Good UX, Sarah Gold, Medium, May 2023
The Power of Trust Is in Your Hands, Rachel Botsman, LinkedIn, December 2023
23andMe Confirms Hackers Stole Ancestry Data On 6.9 Million Users, Lorenzo Franceschi-Bicchierai, TechCrunch, December 2023
Dropbox’s AI Integration with OpenAI Turns into a Messaging Mess – As Amazon’s CTO Apologises over Data Protection Post, Ed Targett, The Stack, December 2023
Google Admits AI Viral Video Was Edited To Look Better, Tom Gerken, BBC News, December 2023
Trustworthy Companies Offer Superior Investment Returns with Less Risk, Barbara Kimmel, Medium, July 2022
Trust: How To Measure It and Why It Matters For Business, Francesca Cassidy, Raconteur, August 2022
Companies Need to Prove They Can Be Trusted with Technology, Daniel Dobrygowski, Harvard Business Review, June 2023
Trust in Tech Has Eroded: Here Are 3 Ways to Rebuild It, Michael Miebach, World Economic Forum, January 2023
Edelman Trust Barometer, Edelman, 2022
Six Ways Tech Leaders Can Build Trust in Their Customers, Francis Dinha, Forbes, February 2023
Big Tech is Still Struggling to Earn the Public’s Trust, Taylor Barkley, The Center for Growth and Opportunity at Utah State University, August 2022
The AI Trust Crisis, Simon Willison, December 2023
AI and Trust, Bruce Schneier, December 2023
The Trust Crisis in Business, David Michels, Forbes, June 2019
Trust in Politicians Reaches its Lowest Score in 40 Years, Michael Clemence & Laura King, Ipsos, December 2023
Lack of Trust is a Barrier to Civic Engagement, Charles Thomas, Knight Foundation, November 2018
Distrust, Political Polarization, and America’s Challenged Institutions; David W. Oxtoby, Henry E. Brady, Tracey L. Meares, Lee Rainie, Kay Lehman Schlozman; American Academy of Arts & Sciences, January 2023
Trust, Populism and the Psychology of Broken Contracts; Eric Beinhocker, Edelman, 2023
Private UK Health Data Donated for Medical Research Shared with Insurance Companies, Shanti Das, The Guardian, November 2023
Sainsbury’s Boss Defends Decision to Sell Customers’ Nectar Card Data, Alex Lawson, The Guardian, December 2023
Adobe Faces Big Fines from FTC over Difficult Subscription Cancellation, Amber Neely, Apple Insider, December 2023
Trust Matters, Valeria Adani, LinkedIn, May 2023
Introducing IF’s Responsible Technology by Design Framework, John Ridpath, Medium, August 2022
Full Stack Service Design, Sarah Drummond
Tesla Recalls More than 2m Vehicles in US over Autopilot System, Reuters, The Guardian, December 2023
EU AI Act: First Regulation on Artificial Intelligence, European Parliament, June 2023
Executive Order on the Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence, The White House, October 2023
Design Principles for a New AI World, writingprincess, Medium, January 2022