Every design agency has had the conversation. A client sees a Midjourney image on LinkedIn and asks: “Can you do this instead of a photoshoot?” A junior designer starts generating concepts with DALL-E. Someone suggests replacing the mood board process entirely with AI.
None of this is a design strategy. These are reactions to a new tool, made without understanding where it actually fits.
AI is genuinely useful for design agencies. But image generation is the least interesting part. The real value is in the unglamorous work that eats your margins: asset production, layout iteration, resizing, and the repetitive production tasks that keep your senior designers away from creative direction.
What Midjourney is (and is not)
Midjourney produces striking images. It does not produce design. Design is solving a communication problem within constraints: brand guidelines, user needs, business objectives, technical requirements. Midjourney knows nothing about any of these.
Use it for exploration and inspiration. Generate 20 visual directions in 10 minutes. Use those as conversation starters with clients or as mood board inputs. But the moment you try to use Midjourney output as a final deliverable, you hit a wall: it cannot match exact brand colours, it cannot maintain consistency across a campaign, and it cannot respond to the specific feedback that design work requires.
The same applies to DALL-E and Stable Diffusion. They are concept tools, not production tools.
What actually works for design agencies
Mood boards and visual exploration. This is where AI image generation earns its place. Instead of spending two hours pulling references from Behance and Pinterest, generate 30-40 visual options that explore different directions. Present a curated selection to the client. The quality of the early creative conversation improves because you are reacting to more options, faster.
Layout variations. Figma’s AI features and tools like Galileo AI can generate layout variations from a design system. Feed it your component library and content structure, and get multiple layout options to evaluate. This is not replacing your designer. It is giving them more starting points.
Asset generation and adaptation. Background removal (Remove.bg, Adobe Firefly), image upscaling (Topaz), and asset resizing (Adobe’s generative fill for extending images) save hours of production time. A social campaign that needs 15 format variations across platforms used to take half a day of production work. AI tools cut that to under an hour.
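Much of that format-adaptation work is mechanical geometry, which is why it automates so well. As a minimal sketch of the logic a batch-resizing script would use, here is a centre-crop planner in plain Python: the format names and pixel sizes are illustrative, not official platform specs, and the output boxes would be fed to whatever image library or tool your pipeline actually uses.

```python
from typing import Dict, Tuple

# Illustrative target formats; substitute your platforms' real specs.
FORMATS: Dict[str, Tuple[int, int]] = {
    "instagram_square": (1080, 1080),
    "instagram_story": (1080, 1920),
    "linkedin_banner": (1584, 396),
}

def center_crop_box(src_w: int, src_h: int,
                    target_w: int, target_h: int) -> Tuple[int, int, int, int]:
    """Largest centred crop of the source that matches the target aspect ratio.

    Returns (left, top, right, bottom) in source-pixel coordinates.
    """
    target_ratio = target_w / target_h
    src_ratio = src_w / src_h
    if src_ratio > target_ratio:
        # Source is wider than the target: trim the sides.
        crop_w = round(src_h * target_ratio)
        crop_h = src_h
    else:
        # Source is taller than the target: trim top and bottom.
        crop_w = src_w
        crop_h = round(src_w / target_ratio)
    left = (src_w - crop_w) // 2
    top = (src_h - crop_h) // 2
    return (left, top, left + crop_w, top + crop_h)

def plan_adaptations(src_w: int, src_h: int) -> Dict[str, Tuple[int, int, int, int]]:
    """One crop box per format, ready for an image library's crop-then-resize step."""
    return {name: center_crop_box(src_w, src_h, w, h)
            for name, (w, h) in FORMATS.items()}

# Example: a 4000x3000 master asset planned against every format at once.
plan = plan_adaptations(4000, 3000)
```

A human still reviews the results, because a centre crop can behead a subject; the point is that fifteen variations become one loop plus a review pass, not fifteen manual edits.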
Copy generation for design work. Designers spend a surprising amount of time writing placeholder copy, button labels, headlines for mockups, and microcopy. AI handles this well. Feed it the design context and ask for 10 headline options or 5 CTA variations. It is faster than waiting for the copywriter and good enough for client presentations. For the full approach, see how agencies are using AI content production across the workflow.
Presentation and pitch decks. Tools like Gamma and Beautiful.ai use AI to turn content into polished presentations. For agency pitches and client-facing decks, this saves the designer from doing layout work that is necessary but not creative.
Pattern and texture generation. For packaging, branding, and environmental design, AI generates seamless patterns, textures, and background elements that are genuinely useful as production assets. This is one area where AI output can go directly into final deliverables with minimal adjustment.
The tools worth knowing
Figma AI. Native AI features within your existing design tool. Auto-layout suggestions, component recommendations, and increasingly useful design assistance that works within your design system rather than outside it.
Adobe Firefly. Integrated into Photoshop and Illustrator. Generative fill, text effects, and image generation that respects Adobe’s licensing model (trained on licensed content). The integration into existing workflows is the key advantage over standalone tools.
Midjourney. Best image quality for concept work. Steep learning curve for consistent results. Version 6 and beyond handle style references better, making it more useful for maintaining visual consistency across a set of images.
DALL-E (via ChatGPT). Lower image quality than Midjourney but faster iteration and better at following specific instructions. Useful for quick concept sketches and social media visuals where perfection is not required.
Canva AI. If your agency uses Canva for social media or quick-turn assets (no judgement), the AI features handle resizing, background removal, and copy suggestions within the platform.
Remove.bg / Clipping Magic. Single-purpose tools that do background removal better than general AI image tools. Essential for product photography and composite work.
Where AI fits in the design process
Map it to your existing workflow:
- Discovery and research. AI helps with competitive visual analysis, trend identification, and generating reference material. Human-led, AI-assisted.
- Concept development. AI generates visual options and layout variations. Human curates, directs, and decides.
- Design development. Human-led. This is where craft, brand knowledge, and design thinking live. AI assists with copy, component suggestions, and iteration speed.
- Production. AI handles the bulk of resizing, adaptation, asset generation, and format conversion. Human reviews for quality.
- Delivery. AI assists with presentation formatting and asset organisation.
The pattern is clear: AI adds the most value at the beginning (exploration) and the end (production) of the process. The middle, where the actual design happens, stays human.
What stays human
Creative direction. The ability to look at 30 AI-generated options and know which three are worth developing. The instinct that a particular visual direction will resonate with a specific audience. The taste that distinguishes good design from technically competent output. AI cannot do this.
Brand strategy. Understanding why a brand should look and feel a certain way. Connecting visual identity to business objectives. Building a design system that scales. These require strategic thinking that AI does not possess.
Client relationships. Presenting creative work, handling feedback, navigating the politics of design decisions within client organisations. This is people work. Agencies that try to remove the human from the creative conversation lose clients.
Craft and detail. The pixel-level refinement that separates professional design from “good enough.” Typography, spacing, colour relationships, visual hierarchy. AI gets close but never quite right. Your designers’ ability to see what is wrong and fix it is the product you sell.
If you are exploring where AI fits across your creative agency’s toolkit, the honest answer is: everywhere except the parts that make you worth hiring.
This is part of Tool Drop, a series reviewing AI tools and approaches through an agency lens. Subscribe to the newsletter to get new articles weekly.