Are you designing for the user’s values — or your own?
The future of design will be the negotiation between multiple moral worlds.
It’s not a stretch to suggest that as technology becomes more omnipresent, the designer’s role will shift from “hands-on” making to shaping how humanity interacts with machines. We will evolve from designing screens to defining moral guardrails. And like any profession with the power to influence lives — doctors, lawyers, policymakers — designers will need a strong foundation in ethics.
This is the core reason I developed the Five Pillars of Ethical Interface Design — an evolving framework I actively invite other designers to engage with and offer feedback on. The pillars are meant to push designers to confront the moral weight of their decisions rather than hide behind aesthetics, heuristics, or empathy theater.
The truth is designers rarely operate from a neutral position. Even when they insist they’re “designing for the user,” their own assumptions, values, and biases inevitably slip into the work — usually without them noticing.
This is mainly due to what I call ethical misalignment — the gap between the user values designers think they’re honoring and the internal values that actually shape their decisions.
One of the main drivers of this misalignment is the industry’s habit of treating empathy as a stand-in for values — and sometimes as permission to slip personal priorities into the work.
Designers start assuming that understanding what someone experiences automatically tells them what should matter. It’s a comforting belief. It’s also wrong.
As Don Norman has argued, designers can’t actually have empathy in any literal sense — we can only observe behavior and imagine internal states. And imagination is unreliable.
The worst example of this is applying empathy before any actual research occurs, because imagined empathy is just projection dressed up as insight. We all do this, and it feels like basic human decency, but it is dangerous to assume you know what another person experiences or what their values actually are.
Empathy clarifies what a user experiences, but it does not tell you which experience should matter more. It doesn’t decide which trade-offs are acceptable, or how to weigh one user’s needs against another’s. Those choices come from values, not empathy. Empathy describes; values prescribe.
The difference becomes obvious the moment a team has to choose between competing needs. Users may want granular control in a settings panel, yet the team removes those options because they value simplicity over user sovereignty. Empathy was present—in the assumption that users prefer a smoother experience—but the value driving the decision wasn’t the users’. It was the team’s.
The problem escalates when teams try to empathize with a stereotype — a move that turns empathy into projection. They might imagine an older user base is “not technically savvy,” redesign the entire flow to be slower and more guided, and later find out these users are actually power users who view the simplified experience as condescending.
Once you recognize these patterns, the mythology around empathy falls away, and the real driver of design choices becomes clear—the value system operating underneath every decision.
How designer values sneak into the interface
It shouldn’t be surprising that everyone’s moral compass points in different directions. Flip between Fox and CNN for thirty seconds and you’ll see two completely different moral realities presented as fact.
But this doesn’t stop at politics. Design carries values too. For example, when inclusion rises to the top of a designer’s priorities, the experience naturally becomes more nuanced. Accommodating a wider range of needs often softens the straightforward simplicity a majority audience might expect.
If well-being dominates, the pattern shifts again. Efficient or engaging features get softened or buried to reduce compulsion. That may feel responsible inside the team, but it frustrates users who value speed, momentum, or freedom.
And when transparency becomes the user’s priority, the tension flips. Instead of overwhelming people with disclosures, teams may hide or compress information to preserve a clean, frictionless flow. It satisfies an internal bias for elegance, but it leaves users with less clarity and less agency.
These types of choices rarely come directly from user research. They come from the ethics of the team itself — values that quietly harden into the architecture of the product.
Why explicit ethical structures matter
Ethical value tensions show up in nearly every product decision, yet the industry still struggles to confront them explicitly. Batya Friedman’s Value Sensitive Design (VSD), developed decades ago, was one of the first efforts to systematize this concept. It provided a structured way to examine which human values ought to guide technology — from the conceptual level to empirical research to technical implementation.
However, VSD failed to gain widespread adoption in industry, largely because it remained an academic framework rather than a practical tool designers could apply directly to everyday product decisions.
My framework aims to fill that gap. It offers a more usable system while also addressing a different blind spot — what happens when the team’s stated values and the user’s lived values drift apart once those values are expressed in the interface itself.
But before we can make ethical decisions, we need to understand the user’s values with precision. That’s why I’ve started to develop a simple Ethical Interface Design evaluation tool to sit alongside the Five Pillars of Ethical Interface Design framework.
The tool uses a short 10-question survey. For each of the five pillars, users answer two prompts: one that captures their ethical preference — how important that pillar is to them — and another that evaluates how well the product currently delivers on that value, each rated from 1 to 5.
Together, the five pillars and the evaluation tool are not about prescribing which values you should hold. They expose the gap between what users actually prefer and the moral defaults the product is quietly imposing. The math is simple:
User preference (1–5) minus Product score (1–5) = Gap.
0 → aligned
Positive → you’re not meeting the value the user actually wants
Negative → you’re pushing a value harder than the user prefers
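The arithmetic above can be sketched in a few lines of Python. This is a minimal illustration, not part of the published framework: the pillar name and scores below are made-up examples, and the function names are my own.

```python
def gap_scores(preferences, product_scores):
    """Per-pillar gap = user preference (1-5) minus product score (1-5).

    0 means aligned; a positive gap means the product under-delivers
    on a value the user wants; a negative gap means the product pushes
    a value harder than the user prefers.
    """
    return {p: preferences[p] - product_scores[p] for p in preferences}

def mean_gaps(responses, product_scores):
    """Average the gaps across a group of surveyed users."""
    totals = {p: 0 for p in product_scores}
    for prefs in responses:
        for pillar, gap in gap_scores(prefs, product_scores).items():
            totals[pillar] += gap
    return {p: total / len(responses) for p, total in totals.items()}

# Example: users rate transparency 5, but the product scores 3 on it.
single = gap_scores({"transparency": 5}, {"transparency": 3})
# single["transparency"] == 2 → the product under-delivers on transparency

group = mean_gaps(
    [{"transparency": 5}, {"transparency": 3}],
    {"transparency": 4},
)
```

Averaging across respondents, as in `mean_gaps`, is what surfaces the group-level patterns discussed next: a pillar with a consistently non-zero mean gap is where the team’s values and the users’ values have drifted apart.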
The key to making this survey work is to implement questions that expose ethical design tradeoffs. For example, if we look at the pillar of transparency (note: these questions are still being refined):
Screenshot from Ethical Interface Design Evaluation
Across a group of users, the patterns show you exactly where your design values match the audience — and where your own worldview is leaking into the interface.
The ethical pivot point
Technology is no longer just a source of entertainment, utility, or convenience.
It mediates attention, communication, decisions, even identity. That makes every value baked into an interface more consequential. We’re not arranging screens anymore. We’re shaping the logic of human–machine interaction — and by extension, shaping behavior and culture.
By focusing on Ethical Interface Design frameworks and tools, we can help designers see where their instincts, assumptions, and moral defaults are quietly overriding the people they’re designing for. Not to eliminate value tension — that’s impossible — but to recognize it and make trade-offs consciously instead of unconsciously.
Ethical design isn’t about being virtuous. It’s about being honest with yourself about the values you’re imposing.
And in the end, it comes down to one unavoidable question: Are you designing for the user’s values — or your own?
References
Bias in Computer Systems: https://dl.acm.org/doi/10.1145/230538.230561
Why I Don’t Believe in Empathic Design: https://medium.com/thinking-design/why-i-dont-believe-in-empathic-design-c3dd0a956de9
Value Sensitive Design: https://dl.acm.org/doi/pdf/10.1145/242485.242493
Do Artifacts Have Politics?: https://monoskop.org/images/8/8c/Winner_Langdon_1980_Do_Artifacts_Have_Politics.pdf
Are you designing for the user’s values — or your own? was originally published in UX Collective on Medium, where people are continuing the conversation by highlighting and responding to this story.
