
AI @ Work
How might we turn scattered AI hacks into focused internal tools?
Team
5 members
Role
UX Researcher
Strategist
Skills
Mixed-Methods Research
Workflow Mapping
Use Case Prioritisation
Duration
6 months
TL;DR
Challenge
Teams across a creative agency were using AI tools, but with no structure, no shared model, and no visibility into what worked.
Insight
What teams needed wasn’t access to more AI tools, but clearer direction on where AI made sense for their role.
Solution
I led cross-team research to map real AI usage and surface repeatable needs. This informed two internal tools now in testing: a general-purpose assistant and a set of role-specific agents.
Note: For confidentiality reasons, many details have been blurred or summarised at a high level.
CONTEXT
Teams were using AI, but no one was learning from each other
In late 2024, I joined a cross-functional group at a creative agency exploring how AI could support internal workflows and new service offerings.
I led research to understand how people were actually using AI day-to-day — and where it made the most sense to support them.
THE CHALLENGE
Everyone was trying something, but in isolation
Strategy teams used ChatGPT to rewrite slides or speed up research. Creatives tested GenAI tools like Midjourney. Account teams used AI for notes or formatting. Others weren’t using anything at all. No one had a shared method, model, or reason. Prompts weren’t widely shared. Workarounds stayed private.

RESEARCH APPROACH
I mapped daily AI use across Strategy, Creative, and Accounts
Our focus was three departments with very different needs:

Strategy: insight gathering and planning

Creative: concepting and production

Account Management: client communication and coordination
Across 3 months, I ran 12 cross-department interviews, shadowed workflows, and documented key moments of friction. The goal wasn’t to evaluate tools, but to understand how people were working, where help was wanted, and what kind of support would feel natural.

FINDINGS BLURRED FOR CONFIDENTIALITY PURPOSES
PATTERN RECOGNITION
Across teams, the same small pain points kept stacking up
Strategy teams struggled with conducting and synthesising large volumes of research. Creatives dabbled with GenAI tools but lacked guardrails. Account teams leaned on AI for surface tasks like emails or formatting, but not deep work.
Despite different workflows, the patterns were clear: people were stuck doing repetitive, high-volume tasks — often manually. Tools helped, but wins weren’t shared, and pain points persisted.


FINDINGS BLURRED FOR CONFIDENTIALITY PURPOSES
DEFINING OPPORTUNITIES
We narrowed down to what was frequent, painful, and easy to solve
Rather than tackle every friction point, I filtered use cases through three simple criteria:

Is the task common?

Is it painful or tedious?

Can AI help without changing behaviour too much?
I used this lens to identify the most valuable use cases to hand off to the Development team.
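To make the lens concrete, here is a minimal sketch of how such a prioritisation pass could be expressed as a simple scoring exercise. The use case names echo the patterns above, but the 1-5 scale and the scores themselves are illustrative assumptions, not project data.

```python
# Hypothetical prioritisation sketch: score each use case against the
# three criteria on a 1-5 scale, then rank. Scale, scores, and use case
# names are illustrative stand-ins, not actual project data.
from dataclasses import dataclass

@dataclass
class UseCase:
    name: str
    frequency: int  # 1-5: how often the task comes up
    pain: int       # 1-5: how painful or tedious it is today
    fit: int        # 1-5: how little behaviour change AI support would need

    def score(self) -> int:
        return self.frequency + self.pain + self.fit

candidates = [
    UseCase("Synthesise research transcripts", frequency=5, pain=5, fit=4),
    UseCase("Reword client emails", frequency=4, pain=2, fit=5),
    UseCase("Generate concept variations", frequency=2, pain=3, fit=2),
]

# Highest-scoring use cases are the strongest candidates for handoff.
for uc in sorted(candidates, key=lambda u: u.score(), reverse=True):
    print(f"{uc.score():>2}  {uc.name}")
```

The actual filtering was qualitative, grounded in the interviews; the sketch simply makes the three-question lens explicit: high frequency plus high pain plus low behaviour change wins.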
THE SOLUTION
Insights informed two AI tools now in testing
The research led to two focused development tracks:

A general-purpose assistant, designed to support all teams with universal needs like summarising, rewording, and internal lookups.

Role-specific agents, built for team-level use cases.
For the Strategy agents, I collaborated with the Development team as an internal tester: providing feedback, identifying edge cases, and refining features through ongoing iterations.
Creative agents are being developed in parallel, informed in part by early research insights I shared to guide their direction.
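As a rough illustration of how the two tracks relate (the actual implementation is internal and blurred here), a minimal routing layer might look like the sketch below; every name in it is a hypothetical stand-in.

```python
# Hypothetical sketch of the two-track split: universal tasks go to the
# general-purpose assistant, team-level tasks to a role-specific agent.
# All names are illustrative stand-ins, not the real internal tools.

GENERAL_TASKS = {"summarise", "reword", "internal-lookup"}

ROLE_AGENTS = {
    "strategy": {"research-synthesis"},  # tested with the Development team
    "creative": {"concept-support"},     # in parallel development
}

def route(team: str, task: str) -> str:
    """Route a request to the general assistant or a role-specific agent."""
    if task in GENERAL_TASKS:
        return "general-assistant"
    if task in ROLE_AGENTS.get(team, set()):
        return f"{team}-agent:{task}"
    # Fall back to the general assistant for anything unmapped.
    return "general-assistant"

print(route("accounts", "reword"))              # -> general-assistant
print(route("strategy", "research-synthesis"))  # -> strategy-agent:research-synthesis
```

The design point this split encodes: universal needs get one shared tool, while anything tied to a single team's workflow gets its own agent rather than bloating the general assistant.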
A SHIFT FROM ACCESS TO INTENTION
From "can we use AI?" to "where does AI actually help?"
AI was already in people’s hands. But it wasn’t always useful or usable. This project helped the company shift focus from tool access to tool relevance.
By surfacing existing behaviour and unmet needs, we gave the company a way to focus its AI development with purpose and a shared model for future experimentation.

SCREENSHOT OF THE GENERAL ASSISTANT'S INPUT FIELD