Nike —
Reframing Alignment,
Not Just Dashboards
We weren’t fixing a tool.
We were fixing how teams think about data.
Nike’s U.S. retail partner teams operated in silos
— with different systems, data formats, and definitions of truth.
The product wasn’t broken
— it just didn’t align.
So we didn’t redesign the interface.
We restructured the thinking behind it.


I led end-to-end design for Nike’s new retail insight platform, from framing the problem to shaping the system. I aligned cross-functional teams, translated ambiguity into structure, and turned fragmented data into a shared strategic tool.
Team:
Two Product Designers (incl. myself), Two Product Managers, Three Data Scientists, Two UX Researchers, Partners from Retail Ops & Data Governance
Timeframe:
May – November 2022
Context & Challenge
Data wasn’t the problem.
Consensus was.
Regional teams used different tools, metrics, and mental models. The system showed data — but didn’t guide decisions. There was no shared structure, no common questions, and no clear next step.
We weren’t redesigning a product.
We were realigning perspectives.

Research & Strategy
We didn’t stop at empathy
— we designed for alignment.
Two questions shaped the study:
- What do managers need to make local, confident decisions?
- What breaks their trust in the system?
We interviewed 12 key roles, audited system use, and mapped retail planning cycles.
The problem wasn’t in the interface
— it was in how people thought about data.

One of the Personas:
Stephen Carter
Nike Store Manager
Stephen manages multiple stores, but still relies on Excel — because his tools don’t align, and his system doesn’t guide.
Behavior
Manages planning and reporting across cities. Uses multiple disconnected systems to collect data. Rarely trusts what he sees without manual validation.
Goals
Wants faster clarity: understand what changed, what matters, and what to do next — without switching tools.
Challenges
Conflicting metrics across tools. No shared judgment structure. Frequently loses trust in system outputs.
Opportunities
Design for alignment, not preference. Provide structured views that support recurring decisions — not static reports.
Current Product & Pain Points
Users described the system as:
- "Too dense to scan"
- "Hard to compare"
- "Feels like reporting, not decision-making"
......
So they left, and went back to their own tools.
When insight happens outside the product, the product has failed.


When the same data means different things to different teams
— the system isn’t a tool, it’s a barrier.
User Journey
Managers moved through three recurring tasks:
- See what changed
- Understand what matters
- Share what’s next
But between those steps, the system vanished. There was no connective tissue.
No flow.
No logic.


A journey was mapped
— but decisions kept falling through the gaps.
Strategy & System Design
We didn’t design features.
We designed a thinking scaffold.
Instead of building dashboards, we built structured responses to recurring questions:
- What changed?
- What matters?
- What’s next?
This led to:
- Modular insight cards
- Intent-based filters
- Comparative templates

System Outputs
We translated decision logic into three modular patterns:
- Insight Cards → focused, direct answers
- Intent Filters → show only what matters now
- Comparative Templates → support side-by-side framing
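As a rough illustration of how these three patterns could hang together, here is a minimal data-model sketch. The actual implementation is confidential; every name, field, and value below is hypothetical and stands in only for the ideas described above.

```typescript
// Hypothetical sketch of the three output patterns as a data model.
// Names, fields, and values are illustrative, not the real system.

// Each insight card answers one recurring question with a direct answer.
type RecurringQuestion = "what-changed" | "what-matters" | "what-next";

interface InsightCard {
  question: RecurringQuestion;  // which decision step the card serves
  headline: string;             // the direct answer, stated first
  metric: { name: string; value: number; deltaPct?: number };
  suggestedAction?: string;     // the "what's next" hook, when relevant
}

// Intent filters narrow the view to what matters for the current decision,
// instead of exposing every metric and dimension at once.
interface IntentFilter {
  intent: RecurringQuestion;
  regions?: string[];           // e.g. limit to the manager's own stores
  period: { from: string; to: string };
}

// Comparative templates frame two filtered slices side by side, so
// judgments happen against a shared baseline rather than ad-hoc exports.
interface ComparativeTemplate {
  left: IntentFilter;
  right: IntentFilter;
  cards: InsightCard[];         // the same cards rendered for both slices
}

// Minimal usage example: a single "what changed" card (illustrative data).
const example: InsightCard = {
  question: "what-changed",
  headline: "Footwear sell-through dropped week over week",
  metric: { name: "sell-through", value: 0.62, deltaPct: -8 },
  suggestedAction: "Review replenishment for the affected stores",
};

console.log(example.headline);
```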
Design Principles
Clarity wasn’t about making things simple
— it was about making them make sense.
In a forest of metrics, filters, and modules, minimalism alone wasn’t enough. We needed structure that surfaced relevance, not just reduced clutter. So we emphasized hierarchy over density, comparability over volume, and defaults over customization. Every component was built not to decorate the interface, but to support a clear decision.

Designed for decision, not display.

Structure guides trust.
Wireframes to Fidelity
Wireframes weren’t about layout — they tested logic.
Every flow answered:
— does this next step make sense?
In high fidelity, we built structure through rhythm, spacing, and hierarchy — not visual appeal, but decision support.

Measured Outcomes
After launch, we saw clear, measurable impact:
- Planning cycles became 65% faster, reducing cross-regional delays
- System abandonment dropped by 40%, reflecting regained user trust
- Cross-functional alignment scores rose by 1.5 points, showing better strategic coherence
These numbers told a simple story: teams were not just using the system
— they were relying on it.
Organizational Shift
But beyond metrics, the real impact was cultural.
A new thinking model had taken root
— teams no longer made decisions in isolation, nor treated data as passive reports.
They asked better questions. Aligned faster. Trusted the system more.
The system worked not because it was sophisticated, but because it made people feel more certain, not more burdened.


Hi-fi design emphasized clarity through flow
— structure, not visuals, shaped trust.
As of 2024, the system is still running
— and evolving with AI.
That means something.
It means the foundation we built was resilient enough to last, and flexible enough to grow.
I left the company long ago.
But I still think of this project — not as a success story,
but as a benchmark.

The product interface is confidential; what’s shown here is a simulated version.
[ Click to view ]

This page marks a pause, not the end. © 2025 Simon.