Professor Leslie Loble AM, chair of the Australian Network for Quality Digital Education, is calling on governments to urgently adopt draft national standards that set out which AI tools are safe and educationally sound, and that guide teachers’ use of AI in classrooms.

Loble says emerging research is showing the detrimental impacts caused by outsourcing too much cognitive effort to AI.

“Unfortunately, there's growing evidence that excessive and unstructured use of AI may be undermining the development of our brain's mental structures that not only underpin learning, but sustain us through schooling and in life with the ability to learn new things and to deeply understand them.

“There's been an increasing number of studies that have pointed to this quite concerning challenge that we face,” Loble tells EducationHQ.

Global research is starting to paint a clear picture of the learning illusion AI is creating, the expert elaborates.

“There was a large randomised controlled trial, so the gold standard of research, that allowed some students to use ChatGPT to assist them in problem solving compared to other students who didn't have that.

“And what it showed is that there was a short-term learning gain for those who used the tool, but once that tool was removed... the gain wasn't there, and in fact, the others had [improved more].”

Other alarming research has found that more frequent use of AI erodes critical thinking capacity, Loble adds.

“And what was worse is that the study showed this increasing AI dependency. So, once we start using the tool to substitute for our own thinking, we then become more dependent on the tool.”

Rather than being used to deepen or enhance learning, Loble is concerned AI is instead robbing students of the ‘virtuous effort’ that lies at the heart of long-term progress.

“It gives us this surface-level sense of mastery, because we can look it up and it produces something; it makes us, and certainly students, feel that they've then mastered it.

“And the risk is that they accept the AI output uncritically, and more importantly, perhaps, that they bypass the mental effort that is absolutely required for that long-term lasting memory and is the foundation from which we build our further knowledge and understanding.”

This speaks to a clear performance paradox that is also surfacing in studies, Loble says.

“That paradox is that you do get this sort of sugar hit of a short-term gain, but you lack the sustainable learning that you really need.

“And teachers know this. This is why we structure things, and why we have scope and sequence in curriculum.

“This is why teachers are trained and know about the zone of proximal development. So, how to structure teaching so that you're stretching your students to push them into new areas of knowledge and learning, but without overwhelming them.

“That's the skill and the art of being a teacher.”

The design behind many AI tools being used for learning is problematic, according to Loble.

While they ought to be facilitating deeper learning and structuring student engagement, too many are spitting out simple and easy answers that allow the user to bypass any meaningful cognitive effort.

“In fact, UNESCO, in a global review of AI products, found that a very large proportion were not well designed. A recent study showed that the five most popular AI tools have minimal, if any, connection to the Australian Curriculum.”

Without clear national guidelines on AI, the educational equity divide will likely widen, Loble flags.

Professor Jason Lodge from The University of Queensland agrees.

According to the psychological scientist, AI will “almost certainly widen existing equity divides and mean a widening learning gap for disadvantaged students and schools if left unstructured”.

“Students who already possess high levels of domain knowledge and strong metacognitive skills will be able to leverage AI for beneficial offloading and accelerate their learning, while students without these skills, often those already experiencing disadvantage, will be susceptible to detrimental offloading and bypassing the very learning they need...” he says.

Earlier this year prominent educational neuroscientist Dr Jared Cooney Horvath warned that we risk further eroding children’s cognitive development if the expansion of edtech into classrooms continues as it is currently.

Over the past two decades, he argued, the cognitive development of children across much of the developed world has stalled and, in many domains, reversed.

“Literacy, numeracy, attention, and higher-order reasoning have declined despite increased school attendance and expanded public investment,” Horvath said.

He noted that although digital tools now consume a significant share of instructional time, assessment, homework and student attention, the evidence to date from international assessments and large-scale research studies shows that increased screen exposure in classrooms is linked with poorer learning outcomes.

“We know that there are ways you can structure the use of AI so that it is enhancing the learning process,” Loble says.

“Firstly, to be really clear and explicit in the teaching about what is beneficial offloading to AI and what is detrimental offloading to AI.

“So, if you have written an essay and you've got a paragraph that you don't think really quite works, and you ask AI to give you some suggestions about how to make it clearer.

“That's very, very different to asking AI to write your essay.”

Another technique is to have students reflect on and assess their use of AI after a session, she says.

“[The aim is for students to] use it in a very intentional way that allows them to keep control of their learning. That's really the most important thing.”