You ran the training. Everyone attended. The feedback was positive. People said things like “that was really useful” and “I can see how I would use this.”
Six weeks later, almost nobody has changed how they work.
This is the most common outcome of AI training in agencies. It is not because the training was bad. It is because training is not the same as adoption. Knowing how to use a tool and actually using it every day are separated by a gap that training alone cannot bridge.
That gap is a change management problem. And most agencies do not treat it as one.
The gap between “trained” and “using it daily”
When someone finishes AI training, they have knowledge. They know what the tools can do. They have seen demos. They have even tried a few prompts themselves.
But then they go back to their desk, and their inbox is full, their project list is overdue, and the fastest way to get through the day is to do things the way they have always done them. The new AI workflow requires thought, experimentation, and a willingness to be slow before they get fast. Under pressure, people default to the familiar.
This is not resistance. It is human nature. The solution is not more training. It is designing the environment so the new way is easier than the old way.
The three barriers: fear, friction, forgetfulness
Every person in your agency who has been trained on AI but is not using it is stuck behind one (or more) of three barriers:
1. Fear. “If I use AI and the output is bad, I will look incompetent.” “If I use AI and the output is good, people will think I am not needed.” “If I admit I am using AI, clients will think the work is not original.”
Fear is the most powerful barrier because it operates below the surface. People will not tell you they are afraid. They will tell you they are “too busy to try it” or “it does not really apply to my work.” Either excuse usually means “I am afraid of what happens if I use it.”
How to address it: Make AI usage visible at every level. When leadership shares that they used AI for a presentation, it gives permission. When a senior creative says “AI helped me with the research on this,” it removes the stigma. Create safety by going first. We covered the cultural side of this in detail in what AI-first culture actually looks like.
2. Friction. The tool is hard to access. The prompt is hard to write. The output needs too much editing. The workflow requires five extra steps. Any friction, no matter how small, is enough to make someone revert to their old process.
How to address it: Remove every possible step between “I have a task” and “I am using AI for it.” Pre-built prompt templates. Bookmarked tools. AI integrated into the platforms they already use (Notion, Google Docs, Slack), not sitting in a separate browser tab. The fewer clicks between their current workflow and AI, the more likely they are to use it. This is where having AI champions makes a real difference, because they can build these shortcuts for their teams.
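A prompt library can be as simple as a shared dictionary of fill-in-the-blank templates, so nobody starts from a blank page. A minimal sketch in Python (the template names and placeholder fields here are illustrative examples, not a standard):

```python
# Minimal prompt-template library sketch. Template names and
# placeholder fields are illustrative, not a standard format.
PROMPT_TEMPLATES = {
    "competitor_analysis": (
        "Summarise the top 3 competitors of {client} in {market}. "
        "For each, list positioning, pricing, and one weakness."
    ),
    "brief_summary": (
        "Summarise this brief in 5 bullet points, then list any "
        "missing information we should ask the client for:\n\n{brief_text}"
    ),
}

def build_prompt(template_name: str, **fields) -> str:
    """Fill a stored template so the user only supplies the specifics."""
    return PROMPT_TEMPLATES[template_name].format(**fields)

# One function call instead of writing a prompt from scratch.
prompt = build_prompt("competitor_analysis", client="Acme", market="UK retail")
```

The point is not the code, it is the friction removed: the team member supplies two facts and gets a complete, proven prompt.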
3. Forgetfulness. People simply forget that AI is an option. They have been doing tasks manually for years. The neural pathway is deeply worn. When a brief lands, they start working on it the way they always have because it does not occur to them to do it differently.
How to address it: Build triggers into existing workflows. Add “AI assessment” as a checkbox on every brief template. Put a reminder in the project kickoff agenda. Create a Slack bot that asks “Did you consider AI for this?” when someone starts a new task. You are not asking people to remember. You are making the environment remember for them.
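The Slack bot idea above reduces to one decision: when a task is created, does it fall into a category where AI has a known use? A sketch of that trigger logic (the categories and message wording are placeholder assumptions; in a real integration this function would be called from your task tool's webhook and the message posted to Slack):

```python
from typing import Optional

# Trigger sketch: decide whether a new task should get an "AI
# assessment" nudge. Categories and wording are placeholders.
AI_SUITABLE_CATEGORIES = {"research", "reporting", "first_draft", "meeting_notes"}

def ai_nudge(task_category: str) -> Optional[str]:
    """Return a reminder message for AI-suitable tasks, else None."""
    if task_category.lower() in AI_SUITABLE_CATEGORIES:
        return ("Did you consider AI for this? "
                "Check the prompt library before starting manually.")
    return None
```

The environment does the remembering: the nudge fires on task creation, not on anyone's memory.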
Embedding AI into existing workflows
The most effective change management strategy is not creating new AI workflows. It is embedding AI into workflows that already exist.
Do not say: “Here is a new AI-powered research process.” Say: “We have added an AI research step to the existing briefing process. Before you start your research, run the brief through this prompt template. It takes 5 minutes.”
The difference is crucial. The first requires people to learn a new process. The second adds one step to a process they already follow. The adoption rates are dramatically different.
Practical examples:
- Briefing: Add an AI pre-analysis step to your existing brief template. Before the team reads a new brief, AI summarises the key points, identifies potential challenges, and suggests reference material. This is integrated into the brief document itself, not a separate tool.
- Reporting: Replace manual data pulling with AI-assisted report generation. The report template stays the same. The data collection becomes automated. The human writes the insight and recommendations, just as before.
- Meeting follow-up: Integrate AI transcription and summary into your existing meeting workflow. The meeting happens as usual. The notes write themselves. The action items are extracted automatically. For tool recommendations, see our guide on AI meeting tools for agencies.
Each of these changes one step, not the whole process. That is manageable. That gets adopted.
Handling resistance from senior staff
Junior and mid-level staff typically adopt AI faster. They have less to unlearn and less identity wrapped up in the old way of working. Senior staff are harder, and they matter more because their behaviour signals what is acceptable.
Senior resistance usually comes from one of three places:
“My experience is what clients pay for.” True. And AI does not replace experience. It replaces the grunt work that sits underneath the experience. A senior strategist who uses AI for research and analysis has more time for the strategic thinking that clients actually value. Frame AI as something that lets their experience shine brighter, not something that diminishes it.
“I have tried it and the quality is not good enough.” Often true, because senior staff have higher standards. The problem is usually prompt quality, not tool quality. Pair them with someone who writes better prompts. When they see AI produce output that meets their standards, resistance drops.
“If AI can do my job, why do they need me?” This is the fear barrier wearing a rational disguise. Address it directly: AI cannot do their job. It can do the 30% of their job that is not actually their job (admin, formatting, data gathering, first drafts of routine communications). They are still needed for the 70% that matters.
Celebrating early wins (properly)
When someone uses AI and it works, make noise about it. But do it right:
Be specific, not generic. Not “Great job using AI!” Instead: “Sarah used AI to cut the competitor analysis from 6 hours to 90 minutes. The quality was the same. Here is the prompt she used.” Specificity makes it real and replicable.
Quantify the benefit. “Saved 4 hours” is more compelling than “worked well.” Time saved, quality improved, client feedback received. Numbers make wins tangible.
Share the method, not just the result. The purpose of celebrating wins is not to make someone feel good (although that helps). It is to show others how to do the same thing. Every shared win should include the prompt, the workflow, or the approach so others can replicate it.
Do it regularly. A weekly “AI win” in your team meeting or Slack channel. Consistency matters. One celebration is an event. Weekly celebrations build a culture.
Measuring adoption, not training completion
Most agencies measure the wrong thing. They track: how many people attended training, how many have AI tool accounts, how many prompts have been written. These are activity metrics. They tell you nothing about whether AI is actually being used in daily work.
Measure these instead:
- Active usage rate: What percentage of your team used AI for client work this week? Not “has an account.” Used it. For actual work.
- Workflow integration: How many of your standard processes now include an AI step? Not “could include.” Actually include.
- Time reallocation: Are people spending less time on tasks AI can handle and more time on strategic, creative, or relationship work?
- Self-reported confidence: Ask your team quarterly: “On a scale of 1-5, how confident are you using AI in your daily work?” Track the trend.
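The active usage rate is simple to compute if your AI tools or a weekly check-in give you usage records. A sketch, assuming each record is a (user, used-AI-for-client-work) pair for the week (the log format is an assumption, not a specific tool's export):

```python
# Active usage rate sketch. Assumed log format: one record per
# user per week, (user_id, used_ai_for_client_work: bool).
def active_usage_rate(weekly_log: list[tuple[str, bool]]) -> float:
    """Percentage of distinct team members who used AI this week."""
    users = {u for u, _ in weekly_log}
    active = {u for u, used in weekly_log if used}
    if not users:
        return 0.0
    return 100.0 * len(active) / len(users)

log = [("ana", True), ("ben", False), ("cara", True), ("ana", True)]
rate = active_usage_rate(log)  # 2 of 3 team members, about 66.7%
```

Tracked weekly, this one number tells you whether the change is landing, long before revenue or utilisation figures move.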
If active usage is below 60% after three months, your change management is not working. Go back to the three barriers and figure out which one is dominant.
The 90-day plan
Days 1-30: Remove friction. Set up templates, integrate tools into existing platforms, build prompt libraries. Make AI the path of least resistance. Address skills gaps with targeted support.
Days 31-60: Build habits. Embed AI triggers into workflows. Run weekly win-sharing. Champions provide peer support. Leadership models AI usage visibly.
Days 61-90: Reinforce and measure. Track active usage. Identify people who are not adopting and understand why. Adjust the approach. Celebrate progress publicly.
After 90 days, you should see meaningful adoption, not from training, but from making AI the easiest, most natural way to work.
This is part of Delivery Notes, a series on implementing AI inside your agency. Subscribe to the newsletter to get new articles weekly.