Should Copyright Exceptions Apply to AI Mined Data? And Other Questions Raised Under the UKIPO Consultation on Artificial Intelligence and Copyright and Patents

On Friday 29 October, the UK’s Intellectual Property Office (the “UKIPO”) launched a consultation entitled “Artificial Intelligence and IP: copyright and patents” (see here), which closes at 11:45pm on 7 January 2022 (London time). The consultation forms part of the UK government’s ‘National Artificial Intelligence (AI) Strategy’ (the “Strategy”), which followed the government’s 2017 Industrial Strategy publication.

The aim of the consultation is to determine the right incentives for Artificial Intelligence (“AI”) development and innovation, while continuing to promote human creativity and innovation.

In particular, the consultation looks in detail at three areas, which arose out of the Call for Views on AI and Intellectual Property (see here):

  1. Copyright protection for computer-generated works without a human author. Such works are currently protected in the UK for 50 years, but should they be protected at all and, if so, how? Read our earlier thoughts on whether an AI-generated work gives rise to a copyright claim (see here).
  2. Licensing or exceptions to copyright for text and data mining, which is often significant in AI use and development.
  3. Patent protection for AI-devised inventions. Should we protect them, and if so, how should they be protected?

AI is, without a doubt, a transformative technology with the potential to have an enormous impact on human life, and in some cases it is already having a profound effect. We may be decades away from a truly independently functioning AI, but legislators are already wrestling with the very thorny questions that arise as a result (see our article on the World Intellectual Property Organisation’s AI Issue Paper (see here)).

For the moment, there are more questions than answers, and often any proposed answer only reveals further questions: who is responsible for an autonomous AI’s actions? The owner or the creator? And who is the owner or creator?

Across the world governments are trying to establish how to allow for safe and effective testing, development and implementation of AI without hampering its advancement. The UK government in particular is jockeying for position as THE jurisdiction for AI creation, as evidenced in the Strategy. The Strategy sets out the government’s ten year plan to make the UK a “global AI superpower”.

The Strategy sets out three broad objectives it hopes to achieve:

  • Invest and plan for the long-term needs of the AI ecosystem to continue UK “leadership as a science and AI superpower”
  • Support the transition to an AI-enabled economy, capturing the benefits of innovation in the UK, and ensuring AI benefits all sectors and regions
  • Ensure the UK gets the national and international governance of AI technologies right to encourage innovation, investment, and protect the public and our fundamental values.

The Strategy shows that the UK intends to position itself as the best place to live and work with AI by providing clear rules, applied ethical principles and a pro-innovation regulatory environment. The government’s AI Council has played a central role in gathering evidence to inform the development of the Strategy, including through its AI Roadmap published at the beginning of the year, which outlined a set of recommendations reflecting views from the wider AI community in the UK.

However, the current AI framework (or lack thereof) has been labelled a wild west of testing and creation without any oversight, particularly in comparison to pharmaceutical industry legislation. Some suggest that, given the damage an AI could potentially cause, it should be regulated as medicines are, with closely monitored trials, licences and authorisations, and that there should be certain legal requirements for all AI (such as an “off” or “kill” switch). Others say that this would stifle innovation and slow progress, and that such tight control is not necessary (and, incidentally, that such ‘switches’ are impossible in an internet-connected world). Certainly, the House of Lords has taken the view that “blanket AI-specific regulation, at this stage, would be inappropriate”, but it remains to be seen how the UK intends to regulate such a far-reaching and fast-moving technology.

Importantly, the Strategy acknowledges that no single definition of AI is suitable for every scenario, but recommends the following definition: “Machines that perform tasks normally performed by human intelligence, especially when the machines learn from data how to do those tasks.” This definition could show how future UK legislation may define AI. By contrast, the National Security and Investment Act 2021 defines AI more narrowly (for the purposes of foreign direct investment analysis) as:

“technology enabling the programming or training of a device or software to—
(i) perceive environments through the use of data;
(ii) interpret data using automated processing designed to approximate cognitive abilities;
(iii) make recommendations, predictions or decisions.”

There is also the question of which department would regulate AI in the UK, or whether its disparate elements will be regulated individually. Certain aspects of AI are already regulated, for example: data protection (Information Commissioner’s Office), competition (Competition & Markets Authority), and human rights and equality (Equality & Human Rights Commission), in addition to ‘sector-specific’ legislation and regulators, such as the Financial Conduct Authority and the Medicines and Healthcare products Regulatory Agency.

There have been some intriguing developments in the AI legal sphere over the last few months. Dr Stephen Thaler’s AI system, DABUS (which stands for “device for the autonomous bootstrapping of unified sentience”), was named as the inventor of a patent at the South African and Australian patent offices, marking the first successes in Dr Thaler’s long-running inventorship battle (having previously failed at the UK, European and US patent offices).

It is certainly an area to watch. The consultation closes at 11:45pm on 7 January 2022, and we urge all companies with an interest in this space to submit their responses so that their views are considered.

By Sunny Kumar, Georgina Rigg and Niall Lavery

Copyright © 2020, K&L Gates LLP. All Rights Reserved.