Generative AI

Artificial Intelligence (AI) is likely to have an increasing impact on education systems over the coming years. Te Tāhuhu o Te Mātauranga is developing policies and advice about this to ensure the system is well prepared. In the interim, here are some initial points that schools and teachers should be aware of.

What is ChatGPT?

ChatGPT has been all over the media in recent times. So what exactly is it? 

ChatGPT is a type of artificial intelligence called ‘generative AI’.

Its creators, OpenAI, used very large mathematical models to process vast amounts of text available on the internet, so that ChatGPT can respond to prompts in many forms, such as answers to questions, essays, poems, or even lines of code. ChatGPT creates text that appears useful and accurate. The complexity of the model, and an element of randomness in how it chooses each word, is why the same prompt can produce slightly different results.

There are other free-to-use large language models similar to ChatGPT, such as Chatsonic, Youchat, and BingAI, as well as paid versions, and more are being developed. Other AIs generate images, like Midjourney and Fotor. Some AI tools will be embedded in software that we use regularly, like search and word-processing tools.

Concerns have been raised both within Aotearoa and internationally about the implications of large language models and image-generation AI for education, cyber security, and data privacy. ChatGPT is being reviewed by European and Canadian privacy regulators.

An emerging concern is that the use of data from the internet to train generative AI systems may breach copyright laws.

Using Generative AI in school

Many educators are already using AI tools effectively to support teaching and learning, and we will be sharing examples of these as we learn more.

Some examples are in links on this page. PLD providers and tertiary institutions are offering short courses and learning opportunities for teachers about genAI tools, both general introductions and how to use them for specific learning areas or approaches.

ChatGPT: How teachers are bringing AI tech into the classroom – Stuff.co.nz(external link)

ChatGPT resources – Kit Willet (Selwyn College teacher)(external link)

Considering the issues below will help you keep yourself and your students safe as you develop expertise with these exciting new tools.

Be aware before you use

AI tools provide great opportunities for teaching and learning, but educators need to be aware of the limitations and risks before they jump in.

These tools are still experimental, and their risks and potential harms are not yet well understood. New Zealand’s Office of the Privacy Commissioner has released expectations for agencies using generative AI, which principals and teachers should be mindful of.

Office of the Privacy Commissioner – Generative Artificial Intelligence(external link)

It is worth being aware of the guidance to the New Zealand public service and considering it when making decisions about genAI in your school.

Understanding the benefits of GenAI to the public service – NZ Digital(external link)

As noted above, there are concerns about genAI and intellectual property. The onus is on you to ensure any generated content you use does not infringe others’ intellectual property rights or breach copyright law.

Lack of knowledge of non-dominant languages and contexts

Generative AIs like ChatGPT have been trained on content that can be freely accessed on the internet, and most of this content reflects contemporary, dominant cultures and languages.

This means that the tools may not be able to provide results that reflect indigenous knowledge, and in the context of Aotearoa New Zealand, will likely be weak on Mātauranga and Te Reo Māori.

These tools may produce results with errors and biases, which can reinforce existing inequities.

Speaking my indigenous language with new AI – Te Ao Māori News(external link)  

Māori culture and language observations with ChatGPT – Dr Karaitiana Taiuru PhD, JP, ACG, MInstD, RSNZ(external link) 

Age restrictions on use

As with any digital tool, teachers must abide by the terms and conditions of the AI tools they want to use with their learners.

For instance, to use ChatGPT you must be at least 13 years old, and if you are under 18 you must have your parent or legal guardian’s permission. The terms of use for ChatGPT and any services using its API are clear that you must not send it any personal information of anyone under the age of 13.

Requiring students to use ChatGPT or another AI tool must be consistent with the terms and conditions of the tool.  

Terms of use – OpenAI(external link)

Avoid using personal data in genAI tools

Our advice is that the safest option for now is not to input or use personal data in genAI tools.

There is a risk that personal information entered into genAI tools will be retained or disclosed by the genAI provider. The responsibility for complying with the Privacy Act 2020 lies with schools.

Large language models like ChatGPT are unreliable

Large language models compile text convincingly, but they do not understand what they have created.

They can invent facts and details, because services like ChatGPT are designed to create answers that are grammatically correct compilations of text, based on predicting which word should come next in a sentence.
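To illustrate the idea of next-word prediction, here is a purely illustrative Python sketch. The tiny probability table is made up for this example and is not how OpenAI’s models are actually built, but it shows how choosing each word by weighted chance can produce fluent text that is sometimes wrong, and why the same prompt can give different answers:

    import random

    # Toy probability table: for each piece of text so far, the likely next words.
    # Illustration only - not OpenAI's actual model or data.
    next_word_probs = {
        "The capital of New Zealand": {"is": 0.9, "was": 0.1},
        "The capital of New Zealand is": {"Wellington": 0.7, "Auckland": 0.3},
        "The capital of New Zealand was": {"Wellington": 0.6, "Auckland": 0.4},
    }

    def continue_text(prompt, steps=2):
        text = prompt
        for _ in range(steps):
            choices = next_word_probs.get(text)
            if not choices:
                break
            words = list(choices.keys())
            weights = list(choices.values())
            # Weighted random choice: this is why the same prompt can give
            # different results, and why a plausible but wrong word
            # ("Auckland") is sometimes chosen.
            text = text + " " + random.choices(words, weights=weights)[0]
        return text

    print(continue_text("The capital of New Zealand"))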

OpenAI, the developers of ChatGPT, caution that ChatGPT sometimes writes plausible-sounding but incorrect or nonsensical answers and that fixing this issue is challenging. 

ChatGPT 4 generates responses from a pre-existing dataset of information and patterns drawn from a variety of sources available up to September 2021. ChatGPT doesn’t have real-time internet access or the ability to browse the web. However, the subscription-based ChatGPT Plus integrates with Microsoft’s Bing search engine, which allows ChatGPT to provide real-time information to users.

Introducing ChatGPT – OpenAI(external link) 

An introduction to the role of AI in classrooms and schools – The Education Hub(external link)

Don’t use AI to make decisions about a learner’s work

Using ChatGPT or similar tools for marking work can look promising, but teachers should be cautious of this.

Without understanding the basis for the judgements (i.e. seeing inside the algorithm), this can be unfair and discriminatory. It is the responsibility of teachers and the school to make final decisions on learners’ work. Also, because of the way these tools ‘invent’ answers, they can sometimes be simply wrong.

AI systems trained on content from the internet have not seen enough work by children or young people to have a good understanding of what is appropriate for or expected of young people.

Can ChatGPT provide feedback? – No More Marking(external link)

NCEA and ChatGPT

NZQA have provided advice specifically about reducing the risk of cheating in NCEA using ChatGPT or other AI tools.

Chat Generative Pre-trained Transformer (ChatGPT) – NZQA newsletter February 2023(external link)

Using AI checkers for assessments

As noted above, teachers should not rely on AI tools to make decisions about student work.

There are many AI detecting or checking tools available, some of which are marketed heavily to teachers. These may have some value, but anyone using them needs to be aware of the risks and caveats.

AI plagiarism checkers are not foolproof. AI detection tools will struggle to keep up with the ability of AI to draft text in the style of a student. They can also generate false positives, identifying work as AI-generated when it is not.

False positives and true positives cannot be told apart without using a different method, such as discussions with the student. This means you should not rely on AI-based detection tools to judge whether a student has breached assessment conditions.
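A simple worked example shows why false positives matter. The numbers below are assumptions for illustration only, not figures from any particular detection tool:

    # Illustrative arithmetic only: the numbers are assumptions,
    # not figures from any particular detection tool.
    human_written_essays = 200    # essays in a cohort, all genuinely written by students
    false_positive_rate = 0.02    # the tool wrongly flags 2% of genuine work as AI-generated

    expected_wrong_flags = human_written_essays * false_positive_rate
    print(f"Genuine essays expected to be wrongly flagged: {expected_wrong_flags:.0f}")
    # Even a small false positive rate means several students could be wrongly
    # flagged, which is why a flag on its own is not evidence of misconduct.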

Teachers are best placed to make judgments about the authenticity of their students’ work and should employ a range of strategies to support such judgements. School assessment policies should be clear about these strategies and the processes to follow if a teacher suspects unauthorised use of generative AI.

Understanding false positives within our AI writing detection capabilities – Turnitin(external link)

A Hitch in Accurate Detection of AI-Written Content – Psychology Today NZ(external link)

Turnitin says its AI cheating detector isn’t always reliable – The Washington Post(external link)

So how might we use AIs like ChatGPT in schools?

As we have seen, there are opportunities and risks in harnessing AI like ChatGPT in the classroom.

However, through deliberate and discerning use by teachers there are opportunities to teach learners critical literacy, particularly supporting ākonga to question the accuracy of what they read as well as recognising bias. The curriculum is clear about the importance of critical literacy learning, including being literate in a digital space.

Although children and young people may not be able to access ChatGPT themselves, teachers can generate texts and use these with ākonga to support the development of their critical literacy skills. Teachers could also use a series of texts to support understanding of the effective use of prompts.
