[Guest post] Deepfakes: Can you “copyright” a person?

The IPKat has received and is pleased to host the following guest post by Katfriends Jakob Plesner Mathiasen, Johannes Stilhoff and Astrid Christine Bay (all Gorrissen Federspiel) on a recent draft bill intended to amend the Danish Copyright Act and introduce a new protection for individuals whose likeness is reproduced in deepfakes. Here’s what they write:

Deepfakes: Can you “copyright” a person?

by Jakob Plesner Mathiasen, Johannes Stilhoff and Astrid Christine Bay

Which is the real IPKat?

A bill to amend the Danish Copyright Act is currently under consultation. If adopted, it would introduce a self-standing and expanded protection for individuals whose likeness is reproduced in deepfakes. Until now, global protection against deepfakes has followed a general and fragmented pattern. Serious abuses, especially pornographic deepfakes, have typically been pursued under criminal laws. Other uses, including commercial exploitation, have mostly been handled through case law and personality rights.

Outside of Europe, there are a few statutory outliers, such as Tennessee’s ELVIS Act (proving, perhaps, that even legislation can’t help falling in love with a good acronym). The result is an uncertain and uneven protection framework, with no clear answer on the exact scope of rights or remedies.

Denmark has now moved first in Europe. Under the Danish EU Presidency, the government has introduced a draft bill that places deepfake protection directly inside the Copyright Act. The aim is to create a defined and predictable legal basis against unauthorized uses of one’s persona.

Can you tell the difference?

It is becoming increasingly straightforward to generate AI content, including deepfakes depicting a natural person. The services are readily accessible to anyone with a computer or a smartphone, and usability is high. The line between real and artificial has become so thin that even The IPKat briefly considered scheduling an appointment with an optician.

Technological advances have led to a significant increase in the volume of deepfakes on digital platforms. For example, the Danish Prime Minister, Mette Frederiksen, was the subject of a political deepfake video created and uploaded to the platform X by two opposition parties. Another example concerns the voice clone of the actor David Bateson, who voices the main character in the computer game Hitman. His cloned voice had been used by more than 7,500 people before the Danish Rights Alliance had it removed from the internet.

Some imitations may be humorous and harmless. Others are designed to mislead viewers into believing the content is real. Deepfakes can even include fraudulent, pornographic or other disturbing content against the will of the individual depicted. Regardless of purpose or intention, AI-generated content can be highly lifelike and leave the public uncertain about what is real and what is not. Not least, such content may violate the depicted individual’s personal integrity and damage their reputation and standing.

From unwritten principles to codification

Personality rights are recognised in various parts of European and non-European laws. In Denmark, they derive from the Danish Criminal Code and the Danish Marketing Practices Act. In addition, a number of important general principles of personality rights have been developed through case law.

The lack of codification of these principles in an actual statute, however, has at times created uncertainty regarding the scope, duration, and enforcement of the legal protection. The intention of the new Danish bill is to codify the protection of personality rights in relation to deepfakes and thereby make it easier for imitated individuals to pursue infringements. Even though most previous cases have concerned public personas, the Danish draft bill grants protection to every natural person, not just those in the public eye.

The protection has been placed in the Danish Copyright Act. The appropriateness of this placement has been debated, since the subject matter, after all, belongs to personality rights rather than classic copyright. However, the placement does not turn the new right into an actual copyright; the choice of the Copyright Act mainly reflects that the protection touches on interests that already play a central role in copyright law.

The Danish bill and its subject matter

As mentioned, the Danish bill introduces a general protection for all people (famous or not) against realistic, digitally generated imitations of natural persons’ personal, physical characteristics. It also includes protection against realistic, digitally generated imitations of performing artists’ artistic performances.

Think of it as a publication restriction rather than a creation ban. The bill only addresses the act of making deepfakes available to the public. The bill does not address the creation of deepfakes, nor the responsibility of AI-services through which deepfakes are created.

The criterion “realistic” refers to a likelihood that the digitally generated imitation may be confused with the actual individual: the bill underscores the importance of enabling the public to distinguish digital content from reality. As described in the bill, manipulated content may pose a democratic problem if it contains false statements or if the imitated person is depicted in a manner that significantly departs from contexts with which they are normally associated. This is both because such imitations may infringe the person’s integrity or amount to free-riding on the economic goodwill the person has built up, and because it is undesirable in a democracy to spread misinformation that may confuse or mislead citizens.

Deepfakes and Article 10 of the European Convention on Human Rights (ECHR)

Copyright must be balanced against freedom of expression under Article 10 of the ECHR. This balance has been tested numerous times by courts across jurisdictions, including by the Danish Supreme Court in the case concerning the sculpture of The Little Mermaid. The Supreme Court ruled that a major newspaper did not infringe copyright by publishing a photo of the sculpture wearing a face mask during the Covid pandemic.

Not a deepfake ...

This balance has been considered in the bill, which contains an exception for imitations that are primarily expressions of caricature, satire, pastiche, criticism of power, social criticism or the like. Thus, to the extent a deepfake qualifies as such, its publication does not constitute an infringement. In addition, the proposed provisions should be interpreted consistently with Article 10 of the ECHR. This means that you may still use deepfakes to have a laugh at your prime minister’s expense, provided the deepfake clearly constitutes a caricature and does not contribute to misinformation at a level that could entail a serious risk to the rights or significant interests of others.

Till death do us part and 50 years more

Post-mortem personality rights are generally recognized, but how long these rights can be maintained has not always been clear. The duration of protection often varies from case to case, from country to country and, in the U.S., from state to state.

The Danish draft bill proposes a protection lasting until 50 years after the year of the imitated individual’s death. The rationale for protection post‑mortem is that publishing digitally generated imitations of a deceased person without consent may offend surviving relatives. The 50‑year limit is intended to avoid an unduly restrictive impact on freedom of expression by preventing protection from lasting indefinitely.

The bill considers that 50 years is sufficient to reduce the risk that realistic, digitally generated imitations will be perceived as genuine in a manner that could harm democratic discourse, the deceased person’s reputation, or surviving relatives.

Easier enforcement on the horizon?

One of the central aims of the bill is to make enforcement easier. In the most serious deepfake cases, such as pornographic material, illegality is rarely in doubt. Here, Danish law already offers a clear and direct provision in the Criminal Code that allows quick removal and sanctions. The bill seeks to create the same level of clarity for other realistic, digitally generated imitations that fall outside the criminal threshold.

Only time will tell whether the new rules work as intended. Technology has a habit of moving faster than lawmakers, something The IPKat observes with the same suspicion it reserves for robotic vacuum cleaners.

Do you want to reuse the IPKat content? Please refer to our 'Policies' section. If you have any queries or requests for permission, please get in touch with the IPKat team.