Case Study — PhotoG

The AI was brilliant. Nobody trusted it.

Four AI agents, one marketing platform, and a 62% drop-off rate before anyone published a single post. I slowed the system down on purpose — and that's what made it work.

Role
Lead Product Designer
Focus
User Flow & Trust Architecture
Team
4 Designers @ VSDesign
Timeline
2025 — v3.0 → v3.5
Hero mockup — PhotoG welcome screen
The four agents introduced as a team: 🔍 Research Analyst, 📊 Brand Strategist, 🎨 Creative Director, ⚙️ Ops Manager. Tagline: "An efficient AI marketing team that solves all your marketing problems." Quick actions: Create Post · Account Analysis · Create Image.
0.1
Context

Four agents. One marketing team. Nobody sticking around.

PhotoG promised e-commerce brands something ambitious: a full AI marketing team in one platform. A Research Analyst to study your market. A Brand Strategist to position you. A Creative Director to build the posts. An Ops Manager to publish them. The technology worked. The experience didn't.

62%
Gone before their first post
Users who signed up, started the flow, and left
5 of 8
Lost at the Creative Director stage
The exact moment the flow went opaque
"Where am I?"
The question that defined the problem
Most common feedback in 8 user sessions
User Research Findings
Research synthesis: usability-test heatmap showing the 62% drop-off, session-recording highlights, or an affinity map from 8 user interviews.
Suggested: 1200 × 600 · Research artifacts from Figma
0.2
Problem

The system was smart. The experience was a black box.

I watched eight people use PhotoG in test sessions. Same thing every time. They'd type in their product, the AI would start working, and within thirty seconds their eyes would glaze over. They didn't know which agent was active, what stage they were in, or whether they could change anything. Three root causes, one pattern.

01

Opaque Pipeline

A single "Thinking..." spinner for a four-agent system. No indication of who was working or what they were doing. Five of eight testers reached for the exact same phrase: black box.

02

No Sense of Place

Research, strategy, content creation, and publishing — all dumped into one scrolling chat thread. No stages, no structure, no breadcrumbs. People lost track of where they were.

03

No Control

The flow auto-advanced from one agent to the next. No checkpoints. No approval gates. Users were watching a machine make decisions for them — and they hated it.

1.0
Strategy

I made the AI slower on purpose

Here's the counterintuitive thing about agentic AI: people don't want instant results. They want to feel like they're part of the process. Speed without comprehension is just noise. So I designed friction back into the system — deliberate pauses where users could read, review, and decide before the next agent took over.

Optimal Friction

Zero friction in a multi-agent system means zero trust. In v3.5, a "Continue" button isn't a speed bump; it's the moment a user goes from watching to owning.

01

Flow Audit

Mapped every step of v3.0 across 8 user sessions. Found 12 friction points — and identified exactly which ones mattered.

02

Stage Gates

Introduced three "Continue" checkpoints between agents. Each one is a conscious decision to proceed, not an automatic handoff.

03

Dual Panel

Split the screen. Left side: the conversation with the AI. Right side: the editable deliverable. No more guessing what the system produced.

04

Transparent Pipeline

Added a stepper showing which agent is active, what it's doing, and what comes next. Real-time thinking states replaced the generic spinner.
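The stage-gate model described above can be sketched as a small state machine: each agent stage runs, delivers, and then waits for an explicit approval before the next stage unlocks. This is an illustration of the pattern, not PhotoG's actual code; all names here are hypothetical.

```typescript
// Hypothetical sketch of the stage-gate pipeline. An agent's stage moves
// pending → running → awaitingApproval, and only a user's "Continue"
// (approve) advances the pipeline — there is no automatic handoff.
type Stage = "research" | "strategy" | "creative" | "publish";
type StageStatus = "pending" | "running" | "awaitingApproval" | "approved";

const ORDER: Stage[] = ["research", "strategy", "creative", "publish"];

class Pipeline {
  private status = new Map<Stage, StageStatus>(
    ORDER.map((s): [Stage, StageStatus] => [s, "pending"])
  );

  start(stage: Stage): void {
    if (this.status.get(stage) !== "pending") {
      throw new Error(`${stage} is not ready to start`);
    }
    this.status.set(stage, "running");
  }

  deliver(stage: Stage): void {
    // The agent finished: surface the deliverable, but do NOT auto-advance.
    this.status.set(stage, "awaitingApproval");
  }

  approve(stage: Stage): Stage | null {
    // The "Continue" gate: only an explicit user action unlocks the next stage.
    if (this.status.get(stage) !== "awaitingApproval") {
      throw new Error(`${stage} has nothing to approve`);
    }
    this.status.set(stage, "approved");
    const next = ORDER[ORDER.indexOf(stage) + 1] ?? null;
    if (next) this.start(next);
    return next;
  }

  current(): Stage {
    // The stepper highlights the first stage that isn't approved yet.
    return ORDER.find((s) => this.status.get(s) !== "approved") ?? "publish";
  }
}
```

The key design property is that `approve` is the only transition a user triggers, which is exactly what makes each gate a conscious decision rather than a loading state.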

Flow Audit Map
User flow diagram: v3.0 pipeline with 12 friction points marked and drop-off rates. Shows where the "Continue" gates were inserted for v3.5.
Suggested: 1200 × 700 · Annotated user flow from Figma
2.0
Solution

Four agents, three gates, one story

The redesigned flow treats each agent as a chapter. You don't move to the next one until you've read what the current one produced — and decided it's right.

Agent 01

Research Analyst

Pulls industry data and competitor analysis. Delivers an editable report with source verification — every claim linked back to where it came from.

Agent 02

Brand Strategist

Takes the research and builds positioning. Reference posts, tone guidelines, strategic recommendations. All reviewable before the next step.

Agent 03

Creative Director

Generates post content and product images based on the approved strategy. This was where 5 of 8 users dropped off in v3.0 — now they stay.

Agent 04

Operations Manager

Handles the logistics: timing, hashtags, platform formatting. One final preview before anything goes live.

UI mockup — Research stage (v3.5)
Dual-panel view. Left: the chat panel with the four-stage stepper, a live "⚡ Thinking..." state, the Research Analyst's completion message ("Research complete! I've analyzed your industry and key competitors."), an attached 📄 China Beauty Industry Report 2025, and the "Continue to Strategy →" gate. Right: the editable Research Report with Edit and Download actions, an executive summary on China's beauty market, and a market breakdown.
Full UI Walkthrough
Complete v3.5 interface: 4-panel sequence showing Research → Strategy → Creative → Publish stages. Each panel shows the dual-panel layout with the stepper progressing. The core deliverable of the redesign.
Suggested: 1440 × 900 · Figma node 204:10335 and related frames
Design Decision 01

"Continue" gates as trust architecture

Every agent transition requires a deliberate click. Not because the system needs it, but because the person does. After the redesign, 7 of 8 users said they felt in control. In v3.0, only one of eight did.

Design Decision 02

Left side talks. Right side shows.

The conversational panel follows the AI's reasoning. The deliverable panel shows what it actually produced — editable, downloadable, real. You can read and act at the same time.

Design Decision 03

Parallel systems, linear stories

Under the hood, agents run concurrently. The user never sees that. They experience a clear sequence: research, then strategy, then content, then publish. Systems thinking, translated into narrative.
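That translation can be sketched in a few lines: kick off all the agent tasks at once, but reveal their results strictly in pipeline order. This is an assumed illustration of the idea, not PhotoG's implementation; `runAgents` and `reveal` are hypothetical names.

```typescript
// Concurrent execution, sequential narrative: every agent starts immediately,
// but the user only ever sees deliverables in pipeline order.
async function runAgents(
  agents: Array<() => Promise<string>>,
  reveal: (result: string) => void
): Promise<void> {
  const inFlight = agents.map((run) => run()); // all agents start concurrently
  for (const task of inFlight) {
    reveal(await task); // results surface one stage at a time, in order
  }
}
```

Even if the Brand Strategist finishes before the Research Analyst, the user still sees research first; the concurrency stays invisible, and the story stays linear.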

UI mockup — Strategy stage (v3.5)
Left panel: the Brand Strategist's prompt ("Ready to build your brand strategy? I'll use these insights to position your brand effectively in the market."), the "Continue to Creative →" gate, and an "Ask what's on your mind" input. Right panel: downloadable Market Statistics, including a weekly design-service demand chart (Mon–Sun, peaking at $25k) and category breakdowns ($25,000 / $9,000 / $7,000).
3.0
Evolution

Before and after — the same system, redesigned

Same four agents. Same underlying technology. Completely different experience.

Before — v3.0

One chat thread. No structure.

Everything in a single scrolling pane. The AI auto-advanced between agents. No progress indicator, no control, no reason to trust it. 62% left before their first post.

After — v3.5

Two panels. Three gates. Full visibility.

Conversation on the left, deliverable on the right. A stepper showing exactly where you are. Three "Continue" gates where you decide when to move forward. 87% said they felt in control.

Side-by-side mockup — v3.0 vs v3.5
Left, "v3.0 — Single Thread": one AI chat pane cycling through "Analyzing your industry... / Building strategy... / Creating content..." with no progress indicator and no control. Right, "v3.5 — Dual Panel + Stepper": the stepper, a "📄 Report ready for review" notice, the "Continue →" gate, and the Research Report panel.
4.0
Designing for Failure

AI will fail. The design has to plan for it.

Any system with four AI agents running sequentially is going to break sometimes. The question isn't whether — it's how gracefully. I designed for every failure mode I could find.

Empty Results

When the Research Analyst can't find enough data, it doesn't just say "no results." It shows a "Low Confidence" state with alternative queries and explains what went wrong.

Hallucination Safeguards

Every claim in the research report gets a source tag. Green means verified from a real source. Amber means the AI synthesized it. Users know exactly what to trust.

Mid-Flow Recovery

Each "Continue" gate doubles as a save point. Close the browser mid-flow, come back tomorrow — you're exactly where you left off, with all previous agent work preserved.

Graceful Degradation

If the Creative Director's image generation fails, users see a placeholder with two options: regenerate with adjusted parameters, or upload their own. The flow never dead-ends.
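The hallucination safeguard above is the most codifiable of these mechanisms. As a sketch, each claim can carry a provenance tag that the UI maps to a badge color. This is an assumed model based on the description above, not PhotoG's code; `Claim`, `Provenance`, and `badgeFor` are illustrative names.

```typescript
// Illustrative claim model: every statement in the research report carries
// a provenance tag, so the renderer can show green (verified against a real
// source) or amber (synthesized by the model) without inspecting raw output.
type Provenance =
  | { kind: "verified"; sourceUrl: string } // green: traced to a source
  | { kind: "synthesized" };                // amber: inferred by the AI

interface Claim {
  text: string;
  provenance: Provenance;
}

// The renderer only needs the tag, never the model internals.
function badgeFor(claim: Claim): "green" | "amber" {
  return claim.provenance.kind === "verified" ? "green" : "amber";
}
```

Modeling provenance as a tagged union means a claim can never be rendered without a trust state: the type system forces every claim to be either verified or explicitly marked as synthesized.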

Error State Designs
Empty results, hallucination warnings, mid-flow recovery, and graceful degradation states — the actual UI for each failure mode.
Suggested: 1200 × 700 · Error state screens from Figma
5.0
Leadership

I led the design. Three designers made it real.

My role went beyond screens. I defined the interaction architecture, set the design principles, and made the case to stakeholders that slowing down the AI was the right call.

Design Direction

Defined the entire interaction architecture — the stage-gate model, the dual-panel layout, the transparency system. Led engineering reviews to make sure the frontend could support real-time agent state changes without jank.

Stakeholder Alignment

The hardest sell was convincing stakeholders that adding friction to an AI product was a feature, not a bug. I presented the v3.5 strategy using session recordings and drop-off data. They funded the full restructure.

Team Collaboration

Jia, Danni, and Jingyi owned the visual execution — component design, illustration, motion. We synced weekly. Their constraints around animation performance and component reuse actually strengthened the architecture.

6.0
Impact

Slower AI. Better numbers.

+38%
More people published their first post
Internal beta, same user cohort
87%
Said they felt in control of the AI
Up from 12% in v3.0
Funded
Stakeholders approved the next cycle
Based on the research and beta results
Final Product
Polished hero shot of v3.5: the complete dual-panel interface in a browser mockup with real content. The "money shot" that demonstrates the full redesign before closing reflections.
Suggested: 1440 × 900 · Device mockup with v3.5 UI
7.0
Reflection

Three things I'll carry forward

Friction is a design material

The most impactful thing I designed for PhotoG was a pause. Not a feature, not an animation — a moment where someone stops and thinks before continuing. I'll use that pattern again.

Transparency beats speed

Every user in our tests preferred watching the AI work step-by-step over getting instant results. Understanding builds trust. Trust builds retention. Speed alone doesn't.

Translate systems into stories

Engineers think in parallel processes. Users think in narratives — first this happens, then that. The designer's job is to bridge those two worlds without losing either one.

Next
Continue Reading

LastMinute — When a nurse calls out at 2 AM

A hospital scheduling system that replaced phone trees and spreadsheets with something the staff could actually use at three in the morning.

Read the case study →