Bridging the Gap: Designing for AI-Human Alignment

Kieran Evans
6 min read · Apr 7, 2025


Illustration of a man and a robot standing on a bridge in the sky.

Designing for Understanding

I’ve been designing alongside AI for the past year, and just as importantly, designing for it. One of the most revealing projects involved building a data analysis product that helped analysts to work with complex datasets and uncover insights they could act on. This experience vividly illustrated that good AI design is less about raw technological capability and more about the shared grounding we create between users and AI.

Context Is the Interface

Chat often feels like the easiest way to introduce AI into a workflow. It’s conversational, flexible, and seemingly intuitive. But in complex scenarios, chat quickly runs into problems if it lacks a strong anchor point.

In our data analysis project, chat was a core part of the initial product experience. We expected analysts to use natural language queries like, “Why did this metric drop last month?” or “Where are my anomalies?” But in practice, chat struggled to deliver meaningful results. The analysts worked with highly bespoke datasets, internal jargon, and nuanced metrics unique to their organizations. The AI, unfamiliar with these specifics, repeatedly asked clarifying questions or failed to respond in helpful ways.

Two user experience wireframes. Each has a data visualization and controls to edit, as well as different manifestations of chat in the UI.
Early concepts integrated chat into data-centric views, revealing how hard it is to drive useful conversation without mutual understanding.

But it wasn’t just the AI that lacked context. Analysts were dropped into dense tables or dashboards without a clear starting point. Faced with hundreds of rows of data, they weren’t always sure what to ask (or how to ask it). That uncertainty, combined with a chat interface that expected them to drive the conversation, created friction from both sides.

Essentially, we were asking end users to teach the AI and to drive the conversation at the same time. That’s the wrong dynamic. The lesson became clear: chat itself isn’t the interface. The context is. Without aligning the AI and the user around shared knowledge, interactions break down rapidly.

If neither user nor AI can answer "What are we talking about?", the conversation is going nowhere fast.

To nurture a common language between users and AI, the way we present AI outputs becomes just as important as the outputs themselves. That’s where design plays an active role. Not just in layout, but in helping shape how meaning is delivered.

Shaping AI Outputs

Once shared understanding was established as a goal, the next challenge was making the AI’s outputs readable, trustworthy, and easy to act on. Even when AI generates useful information, it often arrives in a form that’s hard to parse. That’s why structure, visual clarity, and tone matter.

Early in our project, the AI delivered dense, text-heavy insights that analysts struggled to parse quickly. Our UX team acted like editors, actively influencing how AI outputs appeared. We experimented with structure, length, visual hierarchy, typography, iconography, and formats like cards and bullet points. We recommended shortening overly dense insights. We also provided clearer visual cues to highlight key points.

An example of an unstructured AI output, which renders as a few paragraphs of copy. An example of a structured AI output, which uses typographic hierarchy and visual design to help the user parse the information more easily.
UX designers actively shape how the AI communicates. Good framing builds trust and supports user understanding.

This was more than a presentation layer exercise, though. The way we framed the outputs fed directly into how we structured the AI’s responses. It became a cyclical process: our design decisions produced suggestions about formatting, tone, and content structure, which we refined with the engineers shaping the AI’s responses. This iterative loop between AI development and UX was where real progress happened. We weren’t just designing around the AI; we were helping shape how it communicated in the first place.

Through repeated cycles of design and review, we focused on how to frame outputs clearly, so users could quickly grasp what the AI was saying. We learned that the UX designer’s role was to clarify the AI’s message. Our job was to turn insights into actionable takeaways.
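To make the editing role concrete, here is a minimal sketch of the kind of post-processing this describes: taking a dense, text-heavy insight and restructuring it into a headline plus short bullets that a card UI can render with clear hierarchy. The function name and field names are illustrative assumptions, not the project's actual implementation.

```python
# Hypothetical sketch: restructuring a dense AI insight into a
# card-friendly shape with visual hierarchy. The "headline"/"bullets"
# fields are illustrative, not a real API.

def structure_insight(raw: str, max_bullets: int = 3) -> dict:
    """Split a dense insight paragraph into a headline plus short bullets."""
    sentences = [s.strip() for s in raw.split(".") if s.strip()]
    if not sentences:
        return {"headline": "", "bullets": []}
    # The first sentence carries the key point, so it becomes the headline;
    # supporting sentences become scannable bullets.
    headline = sentences[0] + "."
    bullets = [s + "." for s in sentences[1 : 1 + max_bullets]]
    return {"headline": headline, "bullets": bullets}


card = structure_insight(
    "Revenue dropped 12% in March. The decline was concentrated in EMEA. "
    "Two enterprise accounts churned."
)
# card["headline"] → "Revenue dropped 12% in March."
# card["bullets"] → the two supporting sentences
```

In practice this shaping happened upstream, in how the AI's responses were prompted and formatted, but the design intent is the same: lead with the takeaway, keep supporting detail short and scannable.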

Summary as Context Anchor

The turning point in our data analysis project came when we shifted our UX approach. We began using AI-generated summaries as the starting point for analysis rather than beginning with raw data views. Even the earliest summaries, basic and imperfect, served as alignment anchors. Analysts and AI now had a mutual reference to work from.

Initially, these summaries were rudimentary. But they gave the analyst and the AI something tangible to discuss. Ambiguity dropped drastically. Analysts could now ask questions based directly on what the AI had explicitly stated, rather than guessing how to phrase their queries.

Over time, as AI capability improved, these summaries evolved from a protective measure into a true asset. In addition to deepening alignment with users, they also delivered more meaningful insight. Richer summary analysis helped analysts spot patterns faster and offered smarter paths to explore next. As a result, users were better aligned with the AI, and were also more empowered in their own analysis workflows.
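The anchoring pattern itself is simple to sketch. One plausible shape, assuming a standard chat-completion message format rather than the project's actual implementation, is to inject the shared summary into every turn's system context, so references like "that drop" or "the second anomaly" resolve against text both sides can see:

```python
# Illustrative sketch of the "summary as anchor" pattern: each chat turn
# is grounded in the AI-generated summary. The message structure mirrors
# common chat-completion APIs; it is an assumption, not the real system.

def grounded_messages(summary: str, question: str) -> list[dict]:
    """Build a chat payload anchored to a shared summary."""
    return [
        {
            "role": "system",
            "content": (
                "You are a data-analysis assistant. Ground every answer in "
                "the summary below; it is the shared context for this "
                "conversation.\n\nSUMMARY:\n" + summary
            ),
        },
        {"role": "user", "content": question},
    ]


messages = grounded_messages(
    "Churn rose 4% in Q2, driven by two enterprise cancellations.",
    "Why did it rise?",
)
```

The design point is that the user sees the same summary the model is conditioned on, so "it" in the follow-up question is unambiguous to both.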

Resolved visual design of an analysis summary interface, with chat woven in, allowing the user to have a conversation related to the summary.
Summary grounds the interaction between the user and AI. It’s less about a complete story, and more about a reliable place to begin.

Context and Capability

Once summaries became part of the workflow, chat began to take on a new role. It evolved into a way to explore and extend the summary in real time. Analysts could explicitly reference the summaries. This made follow-up questions clearer and easier for the AI to handle.

This shift moved chat from being frustrating and unclear to genuinely collaborative and useful. The improvement in AI capability magnified the value of a strong anchor point. Analysts explored details confidently, knowing the AI understood their references.

Resolved visual design of the summary UI paired with a data table.
Grounded in the same summary, AI and user are free to explore data together.

Grounding AI Across Verticals

This recent work focused on analysts, but the design challenges weren’t unique. Across industries, from healthcare to education to finance, an integrated AI experience depends on whether the user and AI are aligned.

With data analysis, summaries became that alignment anchor. In other contexts, it could be a health assessment, a transaction history, an events calendar. The format varies, but the need for a shared foundation doesn’t.

Some principles hold across verticals:
Design context into the experience. Don’t rely on users to fill in the gaps.
Promote exploration without disorientation. Clear framing builds trust.
Treat AI output as a design material. Structure, tone, and delivery are just as critical as content.

The surfaces vary, but the shared foundation underneath should not.

AI as a Collaborator

Our data analysis project underscored a deeper shift in how UX designers should approach AI. It’s less about designing interactions with tools and more about facilitating a productive relationship with collaborative intelligence.

When we design for mutual understanding, we’re not just clarifying outputs. We’re building trust. We’re giving users the confidence to explore, question, and co-analyze with the AI. The shift from tool to collaborator fundamentally changes how products behave, and how users behave in return.

AI capability will continue to evolve, fast. But the foundation of alignment, clarity, and structured support is design’s job. It’s how we create not just smarter systems, but more meaningful ones.
