AI in schools

Overview

The use of AI in schools, and the impact it will have on teachers and pupils, is something the education sector is still coming to terms with. There are many benefits to using AI, but there are also significant risks.

At Edapt we support school staff in England and Wales with individual employment disputes and allegations. We are apolitical, independent and do not participate in political lobbying or strike action.

We understand there could be risks for teachers around exam maladministration and how pupil data is used with generative AI, as well as other concerns that cannot yet be foreseen.

In this support article, we summarise guidance from the Department for Education (DfE) on generative artificial intelligence in education, outline guidance from the Joint Council for Qualifications (JCQ) on protecting the integrity of qualifications and look at what the potential risks could be for teachers when using AI in the classroom.

We will update this article with more information when further guidance is published.

AI in schools: guidance from the DfE

The DfE has published its current position on the use of generative AI in the education sector.

Generative AI refers to technology that can be used to create new content based on large volumes of data that models have been trained on. This can include audio, code, images, text, simulations, and videos.

The DfE explains that, when used appropriately, technology (including generative AI) has the potential to reduce workload across the education sector and free up teachers’ time, allowing them to focus on delivering excellent teaching.

It explains the education sector must continue to protect its data, resources, staff and pupils, in particular:

  • Personal and sensitive data must be protected and therefore must not be entered into generative AI tools
  • Education institutions should review and strengthen their cyber security, particularly as generative AI could increase the sophistication and credibility of attacks
  • Education institutions must continue to protect their students from harmful content online, including that which might be produced by generative AI

The DfE suggests that, whatever tools or resources are used to produce administrative plans, policies or documents, the quality and content of the final document remain the professional responsibility of the person who produces it and the organisation they belong to.

Schools and colleges may wish to review their homework policies, adjusting their approach to homework and other forms of unsupervised study as necessary to account for the availability of generative AI.

Minimising harm: sensitive data should not be entered into AI tools

The DfE explains that generative AI stores and learns from the data entered into it. To protect the privacy of individuals, personal and sensitive data should not be entered into generative AI tools. Any data that is entered should not be identifiable and should be treated as if it has been released to the internet.

All staff in the education sector should be aware that generative AI can create believable content of all kinds. This includes, for example, more credible scam emails requesting payment. 

While this is not substantially different from content people might find elsewhere online, it is worth being aware that people interact with generative AI differently, and the content it produces may seem more authoritative and believable.

AI in schools: guidance from the JCQ

The JCQ explains that while the potential for student AI misuse is new, most of the ways to prevent it and mitigate the associated risks are not. Centres will already have established measures in place to ensure that students are aware of the importance of submitting their own independent work for assessment, and to identify potential malpractice.

Examples of AI misuse include, but are not limited to, the following: 

  • Copying or paraphrasing sections of AI-generated content so that the work is no longer the student’s own
  • Copying or paraphrasing whole responses of AI-generated content
  • Using AI to complete parts of the assessment so that the work does not reflect the student’s own work, analysis, evaluation or calculations
  • Failing to acknowledge use of AI tools when they have been used as a source of information
  • Incomplete or poor acknowledgement of AI tools
  • Submitting work with intentionally incomplete or misleading references or bibliographies

Teachers must not accept work which is not the student’s own. Ultimately, the Head of Centre is responsible for ensuring that students do not submit inauthentic work.

If AI misuse is detected or suspected by the centre and the declaration of authentication has been signed, the case must be reported to the relevant awarding organisation.

If AI misuse is suspected by an awarding organisation’s moderator or examiner, or if it has been reported by a student or member of the public, full details of the allegation will usually be relayed to the centre. 

The relevant awarding organisation will liaise with the Head of Centre regarding the next steps of the investigation and how appropriate evidence will be obtained. 

AI in schools: what are the risks to teachers?

As outlined above, there are risks involved in using generative AI; however, it can be a useful tool to support your teaching, lesson planning and admin. Currently, a ‘common sense’ approach prevails.

We would always advise you to read and adhere to your school’s ICT and computing policies, safeguarding policies and data protection policies.

If you are an Edapt subscriber and you receive an allegation relating to the misuse of AI in your role, you can contact us for further support and advice.



The information contained within this article is not a complete or final statement of the law.
While Edapt has sought to ensure that the information is accurate and up-to-date, it is not responsible and will not be held liable for any inaccuracies and their consequences, including any loss arising from relying on this information. This article may contain information sourced from public sector bodies and licensed under the Open Government Licence. If you are an Edapt subscriber with an employment-related issue, please contact us and we will be able to refer you to one of our caseworkers.