A parliamentary inquiry into the use of generative AI in the nation’s education system said the dangers from the tools needed to be properly mitigated if they are to be used successfully.

The 126-page report, titled Study Buddy or Influencer, laid out 25 recommendations, including that AI tools be incorporated into the national curriculum for use as a study tool.

Inquiry chair, Labor MP Lisa Chesters, said AI could help boost educational outcomes for students, particularly those from vulnerable backgrounds.

“When generative AI is used appropriately as an education tool, it can provide equitable access to students and educators alike,” she told federal parliament on Tuesday.

“The uptake of generative AI in the education sector should be a national priority.”

While the report found AI had educational potential, it said safeguards needed to be created for the technology.

It recommended that primary school students have access to AI, but said certain features would need to be restricted.

IEUA NSW/ACT Branch Secretary Carol Matthews said that while the report is crucial to guiding the safe and ethical use of GenAI in classrooms, the reality is that GenAI is outpacing efforts to provide a comprehensive regulatory response on behalf of teachers, school staff and students.

The IEU provided a parliamentary submission and appeared before the House Standing Committee.

“We emphasise collaborating with teachers and their unions to develop guidelines that uphold academic integrity and embed ethical concerns and IT literacy,” Matthews said.

“Inequitable access to technology and GenAI opportunities for disadvantaged students are an urgent priority.”

In its submissions, the IEU particularly highlighted the impact of GenAI on teacher workload and staff wellbeing.

“While the technology has the potential to reduce some administrative tasks, there is also the possibility of adverse impacts on staff,” Matthews said.

“These threats need to be identified and carefully managed.”

The IEU suggested that training must be done on paid time and that teachers and school leaders should not be considered ‘AI enforcement police’.

Any new assessment and/or anti-plagiarism measures to uphold academic integrity, it said, must not add to already excessive workloads.

The union said that with reported cases of AI ‘deepfake’ harassment of female teachers and students, there needs to be a decisive and zero-tolerance response in every school to stamp out such behaviour.

Chesters said the Government would need to work with the eSafety Commissioner to determine what guardrails would need to be in place.

“(There is a) need to protect users, especially children’s data, to ensure that educational providers do not select generative AI products or tools that store students’ data offshore, or sell them to a third party,” she said.

“Generative AI presents an exciting opportunity, yet is a high-stakes risk for the Australian education system.”

The inquiry’s chair also said universities had been grappling with how to deal with AI at tertiary institutions.

“AI has broad implications for the design and implementation of assessments, and for academic and research integrity,” Chesters said.

“The higher education sector is struggling to address the misuse of AI in assessments.”

(with AAP)