UK Report on Copyright and Artificial Intelligence published

Oliver Fairhurst

Mar 18, 2026

The UK's Departments for Science, Innovation and Technology and Culture, Media and Sport have published their long-awaited Report on Copyright and Artificial Intelligence. The Government was required to publish the report under s.136 of the Data (Use and Access) Act 2025 as part of a compromise in Parliamentary debates over transparency and the use of copyright works by AI providers. While provided under s.136, it appears also to be the Government's response to the AI and Copyright consultation, or is at least the Government's 'current thinking'.

The Government has essentially binned all of its options, opting instead to see how things play out with litigation, the economy, and other countries' regulation. No immediate changes will be made to the complicated issue of computer-generated works, nor will there be any new text and data mining (TDM) exemption. But will the UK finally get a personality right?

Copyright subsistence in AI-created works

The report contains a broad summary of copyright law, and its history in the UK, much of which will be familiar to readers - even the Statute of Anne gets a name check. Much of the focus is on the balancing of the interests in protecting creativity and a wider interest in permitting the access and use of information. 

Those who have considered the interaction between AI and copyright in more detail will be aware of the somewhat contentious section 9(3) of the UK's Copyright, Designs and Patents Act 1988, which states that "In the case of a literary, dramatic, musical or artistic work which is computer-generated, the author shall be taken to be the person by whom the arrangements necessary for the creation of the work are undertaken."

That provision deals with authorship and not, at least explicitly, subsistence. One view is that s.9(3) has no effect on subsistence, i.e. the work must still be original, and to be original the work must reflect the author's own intellectual creation, expressing their free and creative choices, which cannot be satisfied by a computer making something using stochastic methods. Another view is that s.9(3) represents an exception to the originality requirement, part of a deliberate attempt by Parliament to "deal specifically with the advent of artificial intelligence" (Hansard, 12 November 1987, here).

The Report seems to take the view that the latter is correct, and that the person who made the arrangements necessary will be, in the case of a prompt-based AI system, the user that entered the prompt (p.104). However, at p.105 the Report notes the need for the creative expression of a human author, and gives an example of an AI-edited photograph (i.e. AI-assisted, not AI-generated). The Report also notes that some consultation responses had highlighted the contradiction between s.9(3) and the originality requirement as expanded upon in cases such as Painer.

Rather than deciding whether to leave s.9(3) as it is (with all its possible contradictions), reform it to make it clearer, or repeal it entirely, the Government has said it will keep the provision under review, albeit with a stated preference for removing it. Given that the stated aim of such a move would be to "incentivise and protect human creativity", it seems likely that such a step would be carried out in order to deny protection for purely AI-generated works, while leaving open the possibility of protection for AI-assisted works.

PM Kat's 'Reading Time'

The Report points out that the USA and China both have significantly higher levels of investment in AI than the UK does, and that neither of those countries specifically provides for copyright protection for AI-generated works (albeit that Chinese courts have found copyright to subsist in some cases). This is taken as a signal that protection of AI-generated works is not what is holding back the UK AI industry.

Copyright infringement in training

The UK has no clear exception to the owner's exclusive rights for training AI, or for harvesting the data needed to do so. The UK has a TDM exception, but only for non-commercial research. Other exceptions may apply, but will be highly fact-specific. The report canters through the Government's understanding of exceptions to copyright in a number of other jurisdictions, including the EU, USA, Japan, Singapore and India, concluding that the UK's approach is more rights holder-friendly. 

The Government's original preferred option was to introduce a data mining exception to copyright, subject only to an opt-out. Concerns were raised about the effectiveness of any opt-out mechanisms, and the creative industries complained that this was unfair and undermined their industry. Only 3% of respondents to the consultation supported this option, and those were mainly respondents representing the technology sector. 

The Government had already announced that it would not "take forward" this approach, and the Report repeats that position. The Report goes through the other options, dismissing each of them, and concluding that more data and evidence is needed, and more developments are awaited. In short, the Government will see where this goes, not with any noticeable hand on the tiller. Most significantly, the Report notes that the appeal in Getty Images v Stability AI is outstanding (IPKat here), and seems to be leaving the resolution of extra-territorial training of AI models that are then deployed in the UK to the courts to decide based on legislation drafted in the 1980s. 

Transparency

The Report notes that the EU and the State of California have both introduced transparency requirements for AI developers to disclose what works were used in training. An overwhelming number of respondents favoured introducing enhanced transparency in the UK, specifically requiring developers to identify the sources of training material. The Government has declined to do so, proposing to "work with a range of industry and other experts to develop best practice on input transparency to help right holders assert their rights." This will be welcomed by AI developers who had well-reasoned concerns over a need to repeatedly check for opt-outs, albeit concerns that will garner little sympathy among rights holders. The same approach (wait-and-see / working groups) is proposed on labelling AI-generated materials, which will at least be some relief to the advertising industry, which had been particularly concerned over labelling requirements.

The can is thus kicked a long way down the road. 

Digital replicas

This is one area where the Government does seem keen to act. The Report records that the rise of digital replicas is causing a lot of concern, but that they also provide opportunities. The Report states that the Government will consider whether "a new personality right may be appropriate". This seems like an achievable reform, being something that exists in many jurisdictions. However, the proposal risks having much broader impacts than AI alone, and could easily become bogged down in Parliament.

Conclusion

It is disappointing, if perhaps unsurprising, that the Government has decided to do nothing to resolve the conflict between rights holders and AI. That approach is likely a reflection of the Government having a lot to think about at the moment, something of a lack of direction on whether AI could help or hinder the UK economy, and the complexity of the issues and differing, conflicting interests. Instead, it is left to the courts, business and other countries to determine how this plays out.

The current relatively pro-rights holder position on copyright exceptions, combined with the jurisdictional challenges facing those rights holders in enforcing their rights in the UK, means that the status quo serves no one. Despite the huge amount of talent in the UK, AI developers will not want to train their models in the country. This is for a number of reasons, including the legal framework as well as practical and commercial factors (among them the availability of infrastructure and capital, and the cost of land and energy). Meanwhile, rights holders cannot enforce their relatively strong rights, with their works being used and commercialised, often without compensation.

While this Kat has every faith that the Court of Appeal will reach the right legal view on whether or not deploying an AI model in the UK that has been trained on copyright works without consent is lawful (to the extent that is the question before it), the Government's decision to leave it to the courts reminds one of Lord Sumption's 2019 BBC Reith Lecture:

"It is true, politics do not always perform that function [taking account of the divergent interests and opinions of citizens] very well but judges will never be able to perform it. Litigation can rarely mediate differences. It’s a zero sum game.  The winner carries off the prize, the loser pays. Litigation is not a consultative or  participatory process, it is an appeal to law. Law is rational. Law is coherent. Law is  analytically consistent and rigorous. But in public affairs these are not always virtues."

It would surely be better for Parliament to forge a compromise than for the courts to decide that one party's case, on a particular set of facts, carries the day.

Do you want to reuse the IPKat content? Please refer to our 'Policies' section. If you have any queries or requests for permission, please get in touch with the IPKat team.