
AI in higher education: legal insights


On 17 October, David Copping and Ethan Ezra hosted a webinar discussing the legal implications of artificial intelligence (AI) within the higher education sector, billed as Cutting Through the Noise. The key takeaways were as follows:

Introductory points

  • The existing legal framework: There are clearly gaps in some areas (such as the law around deepfakes, or the development / deployment of AI tools which pose existential risks) and areas of policy which need to be better developed (such as around plagiarism). However, we do have an existing legal framework which enables many questions around the use of AI to be answered without any new law or regulation being implemented.
  • Avoid treating AI as a technological monolith: Whilst much of the current focus relates to generative AI, it is important to remember the breadth and variety of AI as a technological phenomenon. This certainly rings true in the HE sector: current AI examples include intelligent tutoring systems, personalised feedback software, and educational data mining platforms.

Student and academic outputs

  • Differing views on (generative) AI: Currently, there is a huge discrepancy in how students and academics view the use of generative AI. Students are overwhelmingly favourable towards its use, whilst academics largely view AI as a risky plagiarism tool. Indeed, student use of generative AI does present challenges, from the misrepresentation of one’s academic abilities to infringement of third party-owned IP.
  • Integration rather than prohibition? Given the existing prevalence and popularity of generative AI within the student community, it may be difficult to enforce an outright ban on its use. Universities may be better off implementing strategies to manage and regulate the use of (generative) AI. Some of these include:

    1. Internal and external liaison: Consider setting up an oversight committee or board to discuss and evaluate the risks of generative AI. Also, keep a close eye on what other universities, industry bodies and regulatory groups are saying. The Russell Group’s July 2023 principles on the use of generative AI tools in education are illustrative of this, with principle 5 noting the need for collaboration between various industry bodies.
    2. Stay on top of initiatives which counter the risks: For example, Imperial College Business School’s IDEA Lab has proposed a "Generative AI stress test" for teachers to assess the vulnerability of their modules to generative AI tools.
    3. Implement an AI use policy: Consider addressing: (i) what AI the policy covers, (ii) how and when students can use it, (iii) how the policy ties in with your existing plagiarism policies, (iv) ways of verifying student learning (eg when to require signed declarations and full source citations), (v) restrictions on disclosing proprietary information and on infringing third party IP, and (vi) when faculty can use (generative) AI.

IP issues with AI

  • Can AI invent / own patents? As a reminder, patents protect novel inventions which have industrial application. The UKIPO has previously refused two 2018 applications under sections 7 and 13 of the Patents Act 1977 (which require the involvement of “a person”) because the AI machine in question, known as DABUS, was deemed not to be “a person” and so could not be the inventor or owner of the patent. The case is currently before the Supreme Court and we are awaiting the decision. Note that a UKIPO consultation on the matter concluded that no change to the existing law was needed.
  • What about copyright? Under the Copyright, Designs and Patents Act 1988 (CDPA), the “author” of a work is the person who creates it. In the case of a computer-generated work, s.9(3) CDPA provides that “the author shall be taken to be the person by whom the arrangements necessary for the creation of the work are undertaken”. Quite who this is in the case of AI-generated art is up for debate: it could be the creator of the AI system itself, and / or the person who instructed that system to generate the artwork. Clearly, though, the current copyright framework in the UK (and elsewhere) is capable of applying to AI-generated work. Ownership is also unclear: the rule of thumb is that the author of a work will be its owner (or their employer, where the work is created in the course of employment). AI platforms adopt differing approaches, with OpenAI, for example, assigning ownership of the generated output to the user.
  • Infringement: As noted above, the use of AI carries an IP infringement risk primarily because AI tools like ChatGPT are "trained" on vast third-party datasets across the internet to generate output material. There are infringement exceptions, eg text / data mining for non-commercial research, but these are unlikely to apply in larger scale data extraction scenarios in a commercial context.

Data protection issues and safeguarding against bias

  • Data protection: The risks here relate to: (i) users inputting personal information into AI tools (given many tools retain / make onward use of input data), and (ii) AI tools which target publicly available personal data (eg profiles and registries). HE institutions should mitigate these issues with suitable data protection impact assessments, AI policies regulating permissible input data (eg limiting personal details), and technical measures to ward off data scrapers.
  • Bias: Multiple studies have demonstrated various social and political biases inherent in AI tools like ChatGPT. HE bodies should consider which types of work and tasks they would permit AI to be used for, eg queries on sensitive political issues, student admissions, and welfare queries.

Government and regulatory pronouncements

These include:

This publication is a general summary of the law. It should not replace legal advice tailored to your specific circumstances.

© Farrer & Co LLP, November 2023


About the authors


David Copping

Partner

David has significant experience advising clients on a broad range of complex issues relating to intellectual property, technology and data. Typically, David advises clients looking to harness and exploit IP, including international commercial opportunities. David also helps clients on a range of commercial transactions and joint ventures.


Email David +44 (0)20 3375 7485

Ethan Ezra

Associate

Ethan advises clients on a variety of intellectual property (both contentious and non-contentious), commercial contracts, and information law matters. His clients include higher education institutions, cultural organisations, businesses, and schools.


Email Ethan +44 (0)20 3375 7169