A holiday tale of AI replicas and ghosts of performers past

Georgia Jenkins
Dec 22, 2025
Photo by Jessica Lewis 🦋 thepaintedsquare on Unsplash

2025 has been nothing short of eventful for copyright law, with artificial intelligence (AI) wreaking total chaos (carnage?) especially over the last few months. Yet, if you thought we were in for a quiet (much-needed) AI-break, this Kat reckons we’ll continue to hear updates right through the Holiday season, which is, of course, wonderful news for all of your friends and family. If you needed an opener, why not refer them to the growing trend of integrating synthetic performers into brand marketing without disclosure?

The list continues to grow: from Levi’s using ‘fake’ AI models to create ‘hyper-realistic’ fashion imagery, to Mango Teen, Guess and H&M adopting cost-effective digital clones. Even the Queen of Christmas, Mariah Carey, was the subject of accusations of ‘digital overlays’ that went beyond retouching, although the director, Joseph Kahn, said it was purely CGI. That did not stop the internet from debating whether it even matters.

All this provides the backdrop for New York Governor Kathy Hochul signing two bills at the beginning of the Holiday season to ‘protect consumers and boost AI transparency in the film industry’. Appealing to a ‘common sense’ approach, Hochul explained that ‘[i]n New York State, we are setting a clear standard that keeps pace with technology, while protecting artists and consumers long after the credits roll’.

Synthetic performers in advertising

The first, Senate Bill 8420-A/S.8887-B, amends section 396-b of the General Business Law related to false and deceptive advertising. It requires advertisers to identify when an advertisement includes AI-generated people. Alongside defining AI and generative AI, the amendment defines a synthetic performer as:
a digitally created asset created, reproduced, or modified by computer, using generative [AI] or a software algorithm, that is intended to create the impression that the asset is engaging in an audiovisual and/or visual performance of a human performer who is not recognizable as any identifiable natural performer. 
The emphasis on ‘synthetic performers’, not digital replicas, mirrors the 2023 TV/Theatrical Agreement, supported by the Screen Actors Guild-American Federation of Television and Radio Artists (SAG-AFTRA). While digital replicas, sometimes referred to as digital twins, use specific characteristics (voice and likeness) of a real performer, synthetic performers are completely new digital characters.

The bill only applies to uses that imply that these synthetic performers are human. It requires those producing or creating an advertisement using synthetic performers to ‘conspicuously disclose’ such use. However, there is a small caveat: the producer or creator must have actual knowledge of the use. If they do, a first failure to comply carries a penalty of $1,000, and a second, $5,000.

This threshold for liability likely excludes platforms, publishers, broadcasters and advertising intermediaries, unless the producer or creator of the advertisement informs them directly. This approach seems to incorporate intense broadcaster lobbying as a previous version ‘required local stations to include a disclosure label on any advertisement that included a computer-generated character or synthetic performer’.

Beyond intermediaries, the law additionally excludes advertisements or promotional materials for expressive works including:
[M]otion pictures, television programmes, streaming content, documentaries, video games, or other audiovisual works, provided that the use of a synthetic performer in the advertisement or promotional material is consistent with its use in the expressive work. 
So if a video game features an AI character, a trailer for that game can feature the same AI character without disclosure.

Lastly, the amendment does not affect Section 230 (47 U.S.C. § 230), nor does it apply to audio advertisements or to AI used solely for language translation of a human performer.

Post-mortem right of publicity bill

Senate Bill 8391/A.8882, also backed by SAG-AFTRA, amends section 50-f of the Civil Rights Law relating to the right of publicity. Enacted in 2020, that section established postmortem rights for ‘deceased personalities’ and ‘deceased performers’, and banned certain deceptive uses of ‘digital replicas’ (highly realistic computer-generated likenesses) in sound recordings or audiovisual works. It introduced a 40-year postmortem protection period, with registration requirements for successors in interest.

The amendments redefine a digital replica as:
A newly created, computer-generated, highly realistic electronic representation that is readily identifiable as the voice or visual likeness of an individual that is embodied in a sound recording, image, audiovisual work, including an audiovisual work that does not have any accompanying sounds, or transmission in which: (i) the actual individual did not actually perform or appear; or (ii) the actual individual did perform or appear, but the fundamental character of the performance or appearance has been materially altered. 
It also removes a lower threshold which previously excluded recordings or audiovisual works that merely ‘imitate or simulate the voice of the individual’; the exclusion now simply covers authorized sampling of a sound recording or audiovisual work. Additionally, it extends infringing use of a deceased performer’s digital replica beyond use as a fictional character or in a live performance of a musical work, to now cover audiovisual works, sound recordings, or live performances of a musical work, where the user knows the use was of a digital replica and was not authorised.

In contrast to the first bill, the amendment removes the deception criterion and introduces a form of strict liability: there is no requirement to prove that the public would likely be deceived into thinking the use was authorized, even where a disclaimer is present.

Comment

As evidenced by the SAG-AFTRA backing, these bills respond to the overarching theme of creative labour precarity and displacement as ‘synthetic workers’ tend to de-risk labour from an operational, financial and legal perspective. Both complement Section 5 of the FTC Act in seeking to prevent consumer deception through targeted disclosure and transparency measures.

There are parallels with article 50 of the EU AI Act which requires mandatory labelling of synthetic content. Significantly, there are exemptions for functional tasks (e.g. editing or where the input data is not substantially altered) and content that forms part of 'evidently artistic, creative, satirical, fictional or analogous works'. This is meant to be achieved through both human-visible and machine-readable technical markers, namely watermarking. Again, before most break for the Holidays, the EU Commission introduced its first Draft Code of Practice on Marking and Labelling of AI-generated content, proposing interwoven watermarking and fingerprinting.

However, this Kat’s attendance at the GenAI & Creative Practices Conference highlighted the dominance of Western perspectives in regulating generative AI content. In particular, the relationship between Gen Z and AI-generated virtual influencers in China, where livestreaming is ‘big business’, has already displaced ‘lower-tier’ livestreamers. Indeed, China’s Measures for the Labelling of AI-Generated and Synthesized Content, together with its review of industry practices, potentially form the most comprehensive approach so far.

It would seem that these new laws might simply be a drop in the ocean, especially following Donald Trump’s recent Executive Order that envisages an AI Litigation Task Force to challenge state AI laws, alongside restrictions on state funding related to ‘burdensome’ state AI regulation. How this complex legislative landscape plays out alongside Congress’s proposed No FAKES Act (Senate Bill 1367 and House Bill 2794) and the First Amendment is anyone’s guess.

All in all, a lot of food for thought, but perhaps it’s best to simply stick to enjoying the festivities. For now, this Kat is happy to let the tinsel on the tree be the only shiny, synthetic distraction she tries to untangle.
Do you want to reuse the IPKat content? Please refer to our 'Policies' section. If you have any queries or requests for permission, please get in touch with the IPKat team.