On Wednesday, the UK government abandoned its previous position on AI training and copyrighted works. It had been working on a data bill that, if left unaltered, would have allowed AI companies like Google and OpenAI to train models on copyrighted materials without consent, with artists and other copyright holders offered only an opt-out clause.
After significant backlash, the UK backed off from that position. "We have listened," Technology Secretary Liz Kendall said on Wednesday. Yet the government's new stance is, well, not a stance at all. It currently "no longer has a preferred option" about how to handle the issue.
The reversal represents a significant victory for Britain's creative sectors. The decision follows months of sustained criticism from artists and media groups, who have long argued the model would weaken copyright protections and undermine the commercial foundations of the UK's £146bn creative sector. Of the more than 10,000 people who responded to the government's consultation on these measures, just 3% backed its opt-out proposal, while 95% called for copyright to be strengthened, licensing to be required in all cases, or current copyright law to be left unchanged.
High-profile musicians led the charge against the original proposal. Elton John called the government "absolute losers," while Paul McCartney said that AI has its uses but "it shouldn't rip creative people off." McCartney and other artists took part in a "silent album" meant to show the impact of IP theft by AI. In 2025, industry bodies such as the News Media Association launched campaigns like "Make It Fair" to urge policymakers to protect existing copyright laws and ensure transparency in how AI systems use creative works.
The government now faces a genuine policy dilemma: how to balance AI-driven economic growth with the protection of intellectual property. On one side sit AI developers, who require vast datasets, often including copyrighted material, to train large language models and generative systems. On the other are creators whose works underpin those systems but risk being displaced by them. Ministers have been clear they do not want to choose between supporting a fast-growing AI sector and protecting the industries that supply much of the data such systems rely on. "The UK must be an AI maker, not an AI taker," Kendall said, while insisting creative industries remain "one of our greatest exports" and central to the government's industrial strategy.
Rather than legislating immediately, the government is placing its bets on licensing, a market-based mechanism already beginning to take shape. A growing number of deals between AI firms and content owners, particularly in publishing, music and image libraries, suggests a commercial model is emerging. The government will legislate on the issue of transparency but generally avoid "intervening overly in licensing deals which industry is already coming together to reach."
Industry responses have been mixed but broadly positive about the direction. UK Music CEO Tom Kiehl described the decision as "a major victory," while promising to work with the government on the next steps. Featured Artist Coalition CEO David Martin told NME that he wanted the government's response "to commit to meaningful copyright reform that puts consent, transparency and fair remuneration at its core". "There can be no use of artists' work in AI training without explicit permission and proper licensing, and both AI platforms and the rightsholders who license them must be accountable for ensuring creators are properly authorised and paid," he said.
What remains clear is that the consultation process revealed deep divisions. UK ministers have admitted that "workable" solutions, ones that would enable transparency over the content and data used to train AI models and allow rightsholders to opt their works out of that use, have yet to be found, despite the government previously indicating a preference for building those measures into a package of AI-related reforms to UK copyright law. The government has until 18 March 2026 to provide a substantive update on next steps towards reform.
The UK's approach now stands in contrast to its earlier ambition to position itself as a global AI hub without eroding the economic interests of its creative industries. Britain's AI sector was growing 23 times faster than the rest of the economy, and the country is home to the third-biggest AI industry in the world after the United States and China. Whether the government can thread that needle through licensing frameworks and transparency rules, rather than legislative reform, will determine whether this reversal becomes genuine protection for creators or merely a postponement of the underlying conflict.