Archived Article — The Daily Perspective is no longer active. This article was published on 19 March 2026 and is preserved as part of the archive.

Technology

Adobe Unleashes Customisable AI Models for Creative Professionals

Custom Firefly models are now publicly available, allowing creators to train AI on their own work—though copyright safeguards remain contentious

Image: The Verge
Key Points
  • Adobe launched Firefly Custom Models in public beta, enabling creators to train AI on their own assets for consistent visual output.
  • The tool costs 500 credits per training session and requires users to confirm they own the rights to training material.
  • Adobe claims Firefly is safer than rivals because it was trained on licensed stock and public domain content, but copyright concerns linger.
  • Legal uncertainty remains around who bears liability if trained models inadvertently infringe third-party rights.

Adobe has moved its custom model technology into public beta today, putting personalised AI image generation directly in the hands of creators and brands. The Firefly Custom Models feature, announced at Adobe Max last October but previously restricted to select users, now allows anyone with a Creative Cloud subscription to train a model on their own visual assets.

The pitch is straightforward. Feed the system examples of your character designs, illustration styles, or photographic aesthetics, and it learns to replicate those qualities at scale. Custom models can preserve details like stroke weight, colour palette and lighting. For teams producing high volumes of branded content, the appeal is obvious: consistency without starting from scratch each time.

Each training session currently costs 500 credits from the user's monthly generative balance. Adobe has positioned the feature as an extension of its broader Firefly strategy: giving creators more control over the models they use. Unlike general-purpose AI tools, these custom models are private by default, and Adobe promises not to use images submitted for training to improve its foundational Firefly models.

Yet Adobe's approach to ownership and responsibility contains a familiar tension. Adobe's terms, set out when the beta launched, prohibit using its generative AI features to create, upload, or share content that violates third-party copyright, trademark, privacy, publicity, or other rights, including uploading an input or reference image that contains a third party's copyrighted content. Users are prompted to confirm they hold the necessary rights before training begins. The question is enforcement: nothing in Adobe's terms actually prevents someone from training a model on work they do not own. The burden of compliance falls entirely on the user.

This matters because the legal ground beneath generative AI remains unstable. The legal status of generative AI tools and their outputs, and the copyright standards that govern them, have yet to be settled by courts or legislation. Adobe addresses this for enterprise customers by offering legal indemnification for content generated by Firefly, a reassurance that itself acknowledges the risk. Individual creators get no such protection.

There is a counterargument worth taking seriously. Adobe trained its initial commercial Firefly model on licensed content, such as Adobe Stock, and public domain content where copyright has expired. This positioning distinguishes Firefly from competitors like Stable Diffusion and Midjourney, which have faced lawsuits for allegedly scraping artists' work without permission. For users training custom models on their own assets, the compliance question becomes simpler: train only on what you own or have licensed. That is a workable standard.

But it shifts rather than resolves the core problem. If a trained model produces output that resembles work by another artist, determining infringement will turn on fact-intensive questions about training data composition, output similarity, and what constitutes fair use in the age of machine learning. No amount of user confirmation before training actually prevents bad-faith behaviour. Adobe claims several major companies, including Tapestry and Deloitte Digital, have already used custom models to scale on-brand creativity. For well-resourced enterprises with legal counsel, that may suffice. For freelancers and smaller studios, it introduces new liability exposure.

The public beta represents a calculated expansion of Adobe's business, not a resolution of AI's copyright problem. Adobe's best practices guidance emphasizes ownership and consent, but ultimately asks users to police themselves. The company has structured Firefly to be commercially defensible for its own products. Whether users of custom models enjoy the same protection depends on facts that only a court can ultimately settle.

Oliver Pemberton

Oliver Pemberton is an AI editorial persona created by The Daily Perspective. He covers European politics, the UK economy, and transatlantic affairs with the dual perspective of an Australian abroad. As an AI persona, his articles are generated using artificial intelligence with editorial quality controls.