Denmark wants to give every citizen legal ownership over their face, voice, and body - but legal rights don’t always stop real-world harm.

I never imagined we’d reach the point where we’d need to copyright our own face.
But here we are.

Denmark’s proposed anti-deepfake law would make it illegal to use someone’s face, voice, or body in AI-generated content without their permission. It’s being described as one of the strongest legal responses to deepfakes in the world.

On paper, it’s a major step forward.
But if we look closely, it raises a much harder question:
Does copyright really stop the damage deepfakes can create?

What Denmark Is Trying to Do

The law aims to treat your face, voice, and body like intellectual property. Just like a book or a song, your likeness would be protected. That means if someone creates a deepfake of you without your consent, you could ask for it to be removed, seek compensation, and even trigger platform-level takedowns backed by fines.

It’s not about banning deepfakes altogether - it’s about giving you ownership over yourself in the digital world. It’s a bold step. One that says: Your identity matters.

What Copyright Gives You - and What It Doesn’t

This might sound empowering. And in many ways, it is. But we have to be honest about what copyright really does - and what it doesn’t.

Copyright doesn’t stop a deepfake from being made.
It doesn’t stop it from being uploaded, shared, or believed.
It doesn’t prevent the emotional, reputational, or social damage that might follow.

It gives you a legal right to act after the fact - when the harm may already be done.

So yes, you can take it down. You can even sue. But the impact may already be out there. That’s the gap we need to talk about - because reactive protection is not the same as prevention.

What Real Protection Would Look Like

If we’re serious about reducing the harm caused by deepfakes, legal ownership is just one layer. The rest of the system needs to move too.

We need:
– Real-time detection tools
– Accountability from platforms
– Technical standards for watermarking and transparency
– Public awareness that AI content can be completely fake

Agency is important. But agency without infrastructure is like giving people keys to a house that doesn’t exist. If we don’t build the systems that support the law, it can’t really do its job.

The Twilight Zone

Let’s push this further - because there’s a strange side to this law that we need to talk about too.

If I legally own my face, what happens when someone makes a funny meme of me for my birthday and I don’t like it?
If it wasn’t approved by me - is it now a “harmful deepfake”?
Can I sue someone over a joke? A filter? A sketch?

What if someone makes a cartoon with a character that looks kind of like me?
Or uses my face in a parody? Or edits a group photo from a wedding and changes the lighting or angle?

These might sound ridiculous, but legally, we’re stepping into murky waters.
Because once identity becomes property, the law has to decide where ownership ends and creative freedom begins.

And that’s where things get tricky.
What starts as protection could easily slide into control.
If everyone owns their face - does that mean we need permission to imagine, draw, joke, or even remember?

The law might mean well - but the edge cases are real. And we’re going to need more than good intentions to sort them out.

The Bigger Picture: Identity in the Age of AI

This isn’t just a copyright issue. It’s a new chapter in the story of identity.

We’re used to thinking of our face, voice, and presence as things that “belong” to us in the emotional sense. But now they belong to us in a legal, digital, and commercial sense too.

That’s a big shift. It means identity has become data - and like all data, it can be copied, manipulated, monetised, or stolen. The law is trying to keep up. But AI moves fast. And the more we digitise ourselves, the more we’re going to face questions about what can’t be undone.

It’s a Start - Not a Solution

Denmark’s proposal is bold. It puts the conversation where it should be - in the hands of people, not platforms.

But let’s not pretend this is the fix.

This law won’t stop deepfakes from being made.
It won’t stop the speed of viral harm.
And it won’t prevent false memories or emotional fallout.

It’s a good beginning - but the real work is still ahead.
Because protecting identity in the age of AI isn’t just about ownership.
It’s about truth, speed, consent, and systems that can act before damage becomes permanent.

So yes - let’s copyright ourselves.
But let’s not stop there.

Frozen Light Team
