The UK government has been working on a new law that would let AI companies use copyrighted content to train their models - unless the original creator opts out.

That means unless you explicitly say "don't train on my stuff," they can go ahead and use it.

Writers, artists, publishers, and even household names like Paul McCartney and Tom Stoppard pushed back hard. They called it what it is: a threat to creators’ rights.

After the backlash, the UK government hit pause and said they’re looking for better options.

What the Government Is Saying

The UK's Tech Secretary Peter Kyle said the "opt-out" idea is being reconsidered.

Instead, they’re now exploring licensing models, where creators could get paid when their work is used to train AI.

The government added new amendments to the data bill, including a commitment to study the economic impact of whatever plan it settles on.

Basically: "We’re still working on it."

What That Means (In Human Words)

This whole plan assumes creators can “opt out” of AI training.
But here’s the problem: how?

AI companies don’t publicly share what data they train on.
There’s no way to check if your book, photo, or article was used.
And even if you did say no - there’s no system in place to make AI companies prove they listened.

So the UK’s plan?
It’s like giving you a fire escape in a house with no door handles.

Unless the law forces companies to:

  • Share what they train on

  • Honour opt-out lists

  • Pay creators fairly

…then nothing really changes.

And while the UK plans to review all this over the next year, critics say that's too slow - and leaves creators unprotected while AI keeps learning.

Bottom Line

  • 📜 Topic: Copyright protection for creators vs. AI training rights

  • 🧑‍🎨 Creators Involved: Authors, musicians, artists, and publishers including Paul McCartney and Tom Stoppard

  • 🧠 What’s At Stake: Whether AI can train on copyrighted work without permission

  • 📅 Timeline: Government review continues through 2025; possible delay until end of parliament (2029)

  • ⚖️ Legal Mechanism: Proposed "opt-out" policy - now reconsidered in favour of licensing models

  • 📰 More Info: The Guardian.

❄️ Frozen Light Team Perspective

We’ll share a secret with you:
We’re not at the start of the AI shift - we’re already deep in it.

And while the UK is trying to do the right thing - protect creators, bring rules, create balance -
they kind of forgot one small detail: AI has already trained on everything.

You can’t “opt out” of something that’s already been learned.
There’s no “forget” button in AI.
No rewind.

We admire the effort. Seriously.
They’re one of the first governments brave enough to even try tackling this.
But if you don’t take into account what already happened,
how do you protect what happens next?

This law is trying to protect people…
from something that already happened quietly, at massive scale.

That’s not future-proofing.
That’s showing up late to the download.

🎤 Mic-Drop Line

Opt-out only works when someone asks you first.
Right now, AI is training first - and apologising never.

Expert Voices

Avinoam Boaron

2025 and the Big Crash of Copyright and Copy-Bots
