
Archived Article — The Daily Perspective is no longer active. This article was published on 6 March 2026 and is preserved as part of the archive.

Technology

AI-Assisted Code Rewrites Create Legal Minefield for Software Licensing

A popular Python library's licence change raises hard questions about copyright, IP rights, and the future of open source

Image: The Register
Key Points
  • A Python character encoding library called chardet switched from restrictive LGPL to permissive MIT licence after an AI rewrite, sparking licensing disputes.
  • Legal experts disagree whether AI-assisted code counts as derivative work requiring same-licence inheritance, or represents genuinely new creation.
  • The US Supreme Court recently declined to hear an appeal on the question, leaving intact the rule that works need human authorship to qualify for copyright protection — a legal paradox for AI rewrites.
  • Tech industry figures argue the economics of software development face disruption as AI makes code generation increasingly cheap and fast.
  • Current copyright frameworks designed for human creativity are struggling to adapt to machine-assisted development workflows.

Dan Blanchard, the maintainer of chardet, a Python character encoding detection library, released version 7.0 this week under an MIT licence, switching from the previous GNU Lesser General Public License (LGPL). The move appears routine on the surface. But the method used to accomplish it raises questions that could reshape software licensing for years to come.

Blanchard used AI, specifically Anthropic's Claude, to create what he describes as a clean room implementation. Rather than manually rewriting the code, he says he fed the library's specifications into Claude, which generated entirely new code from scratch. The result was a 48-fold increase in detection speed for a library that receives approximately 130 million downloads per month.
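For readers unfamiliar with the library, chardet's job is character-encoding detection: given raw bytes of unknown provenance, it guesses which encoding will decode them. The real library scores statistical models of byte frequencies; the sketch below is a deliberately naive trial-decoding illustration of the same idea, not chardet's algorithm and not anything from Blanchard's rewrite.

```python
def guess_encoding(data: bytes,
                   candidates=("ascii", "utf-8", "cp1251", "latin-1")):
    """Return the first candidate encoding that decodes the bytes cleanly.

    A toy fallback chain only: real detectors such as chardet rank
    statistical language/encoding models rather than trial-decoding.
    """
    for enc in candidates:
        try:
            data.decode(enc)
            return enc
        except UnicodeDecodeError:
            continue
    return None  # no candidate decoded cleanly


# A UTF-8 byte string fails the ASCII attempt, then decodes as UTF-8.
print(guess_encoding("héllo".encode("utf-8")))  # utf-8
```

Note the ordering matters: latin-1 accepts any byte sequence, so it must come last or it masks every other candidate — one reason real detection needs statistics rather than trial and error.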

This sounds like a breakthrough. In reality, it has exposed a dangerous gap in how copyright law treats software. An individual claiming to be Mark Pilgrim, the original library creator, argued that Blanchard had no right to change the licence, citing the LGPL requirement that modified code must remain under the same licence. The disagreement is not academic.

The Copyright Paradox

The US Supreme Court declined to hear an appeal regarding copyright protection for AI-generated material on 2 March 2026, effectively solidifying a "human authorship" requirement. This creates a legal bind: if AI-generated code cannot be copyrighted, then Blanchard may not have legal standing to license version 7.0 under any licence at all. Conversely, if the code is considered derivative of the original LGPL work, the new MIT licence would violate open source obligations.

Zoë Kooyman, executive director of the Free Software Foundation, told The Register there is nothing "clean" about an LLM that has ingested the code it is being asked to reimplement. She added that undermining copyleft is "a serious act".

Not everyone agrees. According to some observers, the evidence is clear that version 7.0 is an independent work, not a derivative of the LGPL-licensed codebase, and the MIT licence applies to it legitimately. Flask creator Armin Ronacher welcomed the licence change, arguing that copyleft code depends on friction to enforce it, but because it is fundamentally open, "you can trivially rewrite it these days".

Disruption Beyond Open Source

What matters most is what happens next. Under US Copyright Office guidance, the outputs of generative AI can be protected by copyright only where a human author has determined sufficient expressive elements, not through the mere provision of prompts. This standard is meant to protect human creativity. But it creates a problem: if chardet v7.0's code does not qualify for copyright protection, then neither does much of the AI-generated software now entering production systems across the tech industry.

The implications extend well beyond open source. AI-generated code may unknowingly replicate open source material released under restrictive licences, exposing companies to liability for improper use or distribution. The question is hitting the open source world first, but similar disputes may soon surface commercially, echoing Compaq's famous clean-room reimplementation of the IBM PC BIOS, and once commercial companies see their closely held IP under threat, well-funded litigation is likely.

The uncomfortable truth is that no jurisdiction has settled how to treat AI-assisted code. The chardet dispute is a microcosm of a larger problem: software licensing frameworks designed for human programmers struggle with tools that operate at machine speed. Fix this and the entire software industry gains clarity. Ignore it and the legal uncertainty deepens, potentially creating dormant liability for thousands of companies building with AI assistance.

Darren Ong

Darren Ong is an AI editorial persona created by The Daily Perspective, writing about fintech, property tech, ASX-listed tech companies, and the digital disruption of traditional industries. As an AI persona, his articles are generated using artificial intelligence with editorial quality controls.