Case study
Problem
Campaign reviewing was historically one of the most painstaking, manual, and error-prone tasks in the entire workflow. Agencies were tasked with matching campaign data across five different media types. Each media type had its own data format, volume, and quirks, creating an enormous volume of entries to reconcile. This wasn’t just busywork: it accounted for over 40% of agencies’ time, creating a critical bottleneck that threatened both operational efficiency and the company’s bottom line.
Users were forced to link each campaign entry one by one, without meaningful system guidance or an overview of recurring patterns. The process was repetitive, cognitively draining, and highly error-prone. User frustration was growing alongside the risk of costly mistakes that could impact compliance and client satisfaction.
The business challenge was clear: streamline the workflow while maintaining the precision required in a regulated environment. The team’s goal was to radically redesign the flow, reducing manual workload and errors and accelerating throughput, without sacrificing accuracy or compliance.
Research
We shadowed five users closely over multiple sessions, documenting their workflows, pain points, and workarounds. In parallel, we analysed 80 related user feedback tickets.
Mapping the full user journey surfaced multiple bottlenecks and frustration points from initial data loading to the final reconciliation step. Users expressed fatigue and anxiety over potential errors, as the system provided little operational support. The process was highly manual, with users spending excessive time individually linking campaign entries.
We also held workshops with customer success and development teams to gather perspectives on business impact and technical constraints.
Research underscored critical needs:
- AI assistance to identify likely matches, reducing manual linking.
- Bulk action capabilities to handle repetitive entries efficiently.
- Manual overrides to maintain user control.
Ideation & Concept development
Armed with insights, the team kicked off ideation sessions involving the PM, developers, customer success and myself as UX/UI lead. We brainstormed multiple approaches to reduce manual effort while ensuring control and accuracy. The breakthrough concept: the AI would surface a pre-selected list of high-confidence matches based on predefined campaign KPIs, letting users quickly choose a matching entry without re-entering data. Manual input remained possible, but the suggested list covered the needed options, streamlining the process and reducing unnecessary effort. Manually linking the first campaign entry would trigger AI-generated suggestions for matching subsequent entries.
This approach allowed users to guide the system initially, then review and bulk-approve AI recommendations. We had to move fast: with high season approaching, there was no room for open-ended exploration, and every decision needed to be grounded in real user behaviour and data.
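To make the concept concrete, here is a minimal sketch of how such a suggestion step could work. This is an illustration only: the `Entry` structure, the KPI similarity measure, and the 0.9 threshold are all assumptions, not the team's actual model.

```python
# Hypothetical sketch: after the user manually links the first entry,
# score remaining entries against the confirmed campaign's KPIs and
# surface only a short list of high-confidence candidates.
from dataclasses import dataclass


@dataclass
class Entry:
    id: str
    media_type: str
    kpis: dict  # e.g. {"impressions": 120_000, "ctr": 0.8}


def kpi_similarity(a: dict, b: dict) -> float:
    """Average closeness of shared KPIs (1.0 = identical, 0.0 = disjoint)."""
    shared = set(a) & set(b)
    if not shared:
        return 0.0
    scores = []
    for k in shared:
        hi = max(abs(a[k]), abs(b[k])) or 1.0  # avoid division by zero
        scores.append(1.0 - abs(a[k] - b[k]) / hi)
    return sum(scores) / len(scores)


def suggest_matches(confirmed: Entry, candidates: list[Entry],
                    threshold: float = 0.9, limit: int = 5) -> list[Entry]:
    """Return a focused shortlist of high-confidence matches."""
    scored = [(kpi_similarity(confirmed.kpis, c.kpis), c)
              for c in candidates if c.media_type == confirmed.media_type]
    scored = [(s, c) for s, c in scored if s >= threshold]
    scored.sort(key=lambda sc: sc[0], reverse=True)
    return [c for _, c in scored[:limit]]
```

Capping the list with `limit` reflects the later testing finding that a short, focused shortlist beats a long one.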
Technical constraints
We had 2 months to design and ship before the busiest season. This tight timeline, combined with the complexity of campaign data across media types, meant every decision had to be intentional. We prioritised stability, control, and clarity over unproven automation.
Key constraints and how we responded:
- Our initial automation failed when handling mixed media inputs. We switched to a user-driven single-select approach for reliability.
- Campaigns with multiple plans and different KPIs confused the model. To avoid risky mismatches, we let users select campaigns manually.
- Users had to review each campaign’s KPIs and metrics before acting. We replaced this with an AI-suggestion modal that appears only when the data clearly aligns.
- Users had to scroll up and down to verify calculations. We moved metrics and auto-calculations into the table.
- Loading all campaigns at once caused lag. Lazy loading improved responsiveness, especially on older machines.
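The lazy-loading fix in the last point can be sketched as a simple paging generator. This is a minimal illustration, assuming a `fetch_page` callable that stands in for the real campaign API; the actual implementation lived in the product's frontend stack.

```python
# Hypothetical sketch: page through campaigns on demand instead of
# loading everything up front, so the table stays responsive.
from typing import Callable, Iterator


def lazy_campaigns(fetch_page: Callable[[int, int], list[dict]],
                   page_size: int = 50) -> Iterator[dict]:
    """Yield campaigns one page at a time; stop at the first short page."""
    offset = 0
    while True:
        page = fetch_page(offset, page_size)
        yield from page
        if len(page) < page_size:
            return
        offset += page_size
```

Because it is a generator, only the pages the user actually scrolls into are fetched.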
User journey
With concepts in hand, we mapped the new AI-assisted user flow, contrasting it explicitly against the legacy manual process.
The redesigned flow introduced:
- Automated AI suggestions triggered after the first manual match.
- Safe revert options enabling users to undo actions if needed.
- Clear confirmation notifications after each action.
This flow was crafted to ease cognitive strain, enhance efficiency, and ensure integrity through built-in fallback mechanisms.
Design
We focused on simplifying interaction patterns and reducing friction in a data-heavy interface. Following collaborative workshops, we leaned on the design system to speed up delivery while staying within tight budget constraints. Additionally, we explored separating matched and unmatched campaigns into two tabs within the table view. However, testing revealed this added unnecessary navigation overhead. We also explored letting users select multiple campaigns to match in bulk, but this introduced a significant risk. Each campaign had a different data structure, and if incompatible sets were selected, the modal would regularly return errors.
To reduce error friction, we opted for a “one-at-a-time” entry point:
- Plus icon opened a modal with suggestions based on reusable KPIs.
- Users could select from the list or manually search.
- After confirming a match, a second modal appeared with bulk suggestions.
- Users could multi-select campaigns to apply the same match or skip the step.
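The “one-at-a-time” flow above can be summarised as a small state machine. The states and transitions here are illustrative only, a way to show the intended interaction model rather than production code.

```python
# Hypothetical sketch of the "one-at-a-time" entry flow: plus icon opens
# a suggestion modal; confirming a match leads to a bulk-suggestion modal;
# applying or skipping returns to the table.
FLOW = {
    "table": {"open_modal": "suggest_modal"},
    "suggest_modal": {"confirm_match": "bulk_modal", "cancel": "table"},
    "bulk_modal": {"apply_bulk": "table", "skip": "table"},
}


def step(state: str, action: str) -> str:
    """Advance the flow; unknown actions keep the current state."""
    return FLOW.get(state, {}).get(action, state)
```

Modelling the flow this way makes the "skip" escape hatch explicit: every path leads back to the table.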
User tests
We ran four user interviews in a live demo environment to gather qualitative feedback. Based on those sessions, we adjusted microcopy and reduced the number of match suggestions in modals. Rather than overwhelm users, the refined UI presents a focused shortlist of high-confidence matches, making decisions faster and more reliable.
User feedback highlights:
“We can finally focus on verifying results.”
“This version feels way more intuitive, fast and trusting.”
Impact & Next steps
The redesigned flow delivered dramatic improvements:
- 80% reduction in task completion time.
- 40% increase in user throughput.
- 90% adoption rate of AI-assisted matching features.
- Internal cost savings estimated at 30%.
- User satisfaction rose significantly.
The upcoming roadmap includes:
- Proactive AI prompts on import: once media data is uploaded, the system will suggest matches automatically.
- Improved confidence scoring, to help users quickly gauge the reliability of AI-suggested matches.
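One way the planned confidence scoring could surface to users is as a simple label derived from a raw model score. The thresholds below are illustrative assumptions, not the team's actual tuning.

```python
# Hypothetical sketch: bucket a raw match score (0-1) into a label
# users can gauge at a glance.
def confidence_label(score: float) -> str:
    if not 0.0 <= score <= 1.0:
        raise ValueError("score must be between 0 and 1")
    if score >= 0.9:
        return "High"
    if score >= 0.7:
        return "Medium"
    return "Low"
```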