Generative artificial intelligence (AI) continues to grow in many aspects of society, but it’s staying out of Canberra’s public schools for the time being.
The ACT Education Directorate has released its interim position statement on the use of AI in schools, stating it will continue to “restrict” student access to AI tools on devices within department networks.
These include large language models, such as ChatGPT, GrammarlyGo, Bing Chat and Google Bard, as well as multimodal models that generate images, music or videos, such as Midjourney and Dall-E.
“Generative AI tools have the potential to engage learners, support teachers to personalise learning, provide real-time feedback and support student-centred pedagogies,” the statement read.
“It is, however, crucial that the Directorate approach the specific implementation of generative AI with caution, and through adhering to responsible and ethical usage practices in alignment with national guidelines.”
The Education Directorate cited duty of care to students as a driver for restricting student access to the emerging technology.
“The decision to restrict access is due to the lack of reliable safeguards preventing these tools from exposing students to potentially explicit and harmful content,” the statement read.
A recent report from The Research Society found 65 per cent of Gen Zs are using AI, with almost one in two using ChatGPT.
“Those in high school are more likely to use AI for school assignments – however, here we must be careful not to interpret this as signs of plagiarism, as this data does not provide evidence for cheating,” report author Anna Denejkina wrote.
The Federal Government has identified AI as a critical technology in the national interest, releasing a framework for generative AI in schools.
The framework seeks to guide the ethical and responsible use of these tools and will be implemented from Term 1, 2024.
“To fully harness the potential of high quality and safe generative AI, schools will need to be supported in understanding and appropriately managing a range of privacy, security and ethical considerations,” the framework document outlined.
“Risk management should also be appropriate for the potential consequences. These consequences include the potential for errors and algorithmic bias in generative AI content; the misuse of personal or confidential information; and the use of generative AI for inappropriate purposes, such as to discriminate against individuals or groups, or to undermine the integrity of student assessments.”
The Education Directorate has set up a working group to explore the benefits, ethical considerations and safe use of AI tools by educators, school staff and students, in order to guide how the Commonwealth framework can be implemented and used in Canberra.
Its work will ultimately result in AI-specific guidelines for ACT public schools next year.
Some have urged policymakers to ensure guidelines around AI in schools are designed with the future in mind.
Trellis Data is a Canberra-based artificial intelligence innovation company, and its head of communications, Tim McLaren, said the goal should be preparing students to build emerging technologies, not just use them.
“Any education policy on artificial intelligence should be geared towards creating the next generation of machine learning and AI specialists, who can go on to create the next ChatGPT in Australia,” he said.
“Ideally, students should have the tools they need to deploy critical thinking when it comes to using AI. That is, they need to understand the risks of inappropriate use by themselves or others and to take steps to avoid the dangers.”
In the meantime, public school staff are allowed to use generative AI, but only in line with existing acceptable ICT use and privacy policies.
Under these policies, education staff must:
- not use or enter into generative AI a student’s personal information, including (but not limited to) their name, age, location, assessment work, behavioural information or learning capacity
- not encourage students to use unapproved tools by requesting they do their work off school networks (for example, instructing students to use ChatGPT as part of homework)
- ensure third-party permission forms are in place for any software application that requires a sign-in (for example, students signing in to use Canva using their schoolsnet account)
Mr McLaren had this warning for teachers planning to use AI for tasks such as lesson plans: “Teachers should be cautious about creating items of curriculum for learning using generative AI.
“GenAI learns by referring to real-world examples, both correct and incorrect – and so it doesn’t always get the answer right.”