Intuit

I led the design of Intuit’s human-in-the-loop (HITL) data labeling system within the Expert Network (EN) platform. The new workflow lets experts label datasets efficiently as part of their existing operations, delivering high-quality annotations without disruption.
ROLE
Product Designer
TYPE
E2E, B2B
AI / ML
Dashboards
STATUS
V1 Shipped
85.7% reduction in turnaround time
50% increase in annotators

Context

Intuit is a financial software company whose customers get help with taxes from financial assistants called Experts. Experts rely on the Intuit Engagement Portal (IEP) to manage their status and track progress across clients.

As part of the FY26 ‘Expert in the Loop’ initiative (Intuit’s HITL system), I designed Expert-Driven Data Labeling to embed AI evaluation tasks directly into IEP and Salesforce.
PROBLEM
All annotations were done across multiple fragmented spreadsheets. This system of assigning, evaluating, and analyzing data is slow and inherently unscalable as Intuit expands and integrates AI across all of its products.
1
Not scalable
Cannot meet projected volume (~15M/year) or adapt to needs.
2
Cluttered interface
Visually heavy, hard to navigate, and disconnected from IEP
3
Workflow inefficiency
Everything is manual, from assigning to inputting data and delivering feedback
4
Inconsistency
Leads to variability in data interpretation and scoring.
GOALS
Integrate annotation tasks into existing workflows on the experts’ main platforms (IEP & Salesforce) at scale, balancing these new tasks without disrupting customer service operations.
Success Metrics
Reach 1M expert-evaluated data points
Expand expert-labeled model coverage
Reduce manual execution time

Solution

Introducing the Expert Annotation System, a platform where experts provide high-quality labeling and model feedback across offline, online, and live workflows. The framework aligns product improvement with the company’s AI innovation strategy and growth.

1
IEP-integrated dashboard
for centralized expert assignment, task navigation, progress tracking, and feedback delivery in one platform
2
Modular design
that supports all experts and future evaluation methods across formats through standardized components

Research &
Telemetry

I combined existing AIMV research with concept and usability testing. My goal was to validate the workflow design, uncover usability issues, and identify ways to optimize the in-platform annotation experience.

SIGNIFICANCE
With no integrated system, annotation volume stays far below what’s needed, making models slow and difficult to improve. There are 37K experts in Intuit’s network, yet only a fraction of their potential is tapped for these AI evaluations.
Opportunity
Integrating AI evaluation tasks into IEP/Salesforce can streamline workflows, scale annotation volume, and strengthen model quality and compliance.

Challenges

(1) Visual load

How might we make wordy definitions easier to reference?

(2) Progressive disclosure

How might we let experts navigate tasks non-linearly?

Outcomes

Handoff, ship, and metrics

DESIGN HANDOFF
I broke down the anatomy of each component and delivered a clickable end-to-end prototype, along with basic design requirements: typographic variations and grid spacing.
KEY TAKEAWAYS
Applying existing patterns
requires understanding their applicability and malleability across different contexts and edge cases.
Design systems
Learning new ones takes time; knowing how to adapt them is necessary for problem solving.
Balancing constraints
Weighing time, technical feasibility, and compromises, while still pushing for designs that are worth the effort.
Key Results
8

Iteration versions

~50%

Increase in annotators

2

Design critique presentations

85.7%

Reduction in turnaround time

Gallery