Jason Lodge, Professor of Educational Psychology at the University of Queensland (UQ), says he’s watched on with a degree of concern and bemusement as the seemingly urgent push for ‘AI literacy’ curriculum in schools rolls out.

An enduring edtech cycle

He warns that teaching highly AI-specific skills is largely wasted effort, and that you only have to look back at our blighted history with edtech to see why.

“A lot of these things tend to go through a very similar cycle. So … remembering what it was like as a kid, you’d get all these messages about how it was really important that you learn DOS or it’s really important that you learn Windows 1, [and then] social media, and that’s just gone on and on,” Lodge, who’s also the director of the Learning, Instruction, and Technology Lab at UQ, says.

The reality, Lodge adds, is that none of these tech-specific capabilities is talked about any more as something that has to be learnt at all.

“They’re sort of a broad set of digital literacy skills. So, I get concerned when the conversation about AI becomes, ‘oh, you’ve got to understand prompt engineering’, or ‘you’ve got to understand the highly specific way in which this particular tool works’.

“And it’s even worse now because a lot of this stuff has really accelerated – the changes that we’ve seen, and the way in which we interact with machines are accelerating all the time.”

AI-specific skills: a wasted mission

Following the lead of China or the US, which are “investing a huge amount of effort” into deeply understanding the current suite of AI tools on the market, would be a largely fruitless endeavour, Lodge argues.

“It means that we’re not necessarily going to be preparing students for whatever tomorrow looks like.”

Lodge’s own research work explores the cognitive, metacognitive, and emotional aspects of learning with AI.

He says we’ve clearly got our focus wrong. 

Given AI will always be better than humans at executing rigid processes and structured approaches, and increasingly so, Lodge asserts we should be zeroing in on what makes us uniquely valuable as embodied and social beings who are compelled to be with, empathise with, and learn from one another.

It’s our ability to navigate human networks and our role within them that will be our real superpower here, Lodge says.

“There are numerous international examples where they’re really doubling down on ‘you have to learn about these specific tools right now’.

“And the research that we do, and the research that a lot of other people have been doing for a long time, suggests that it’s not the technical skill, it’s not the pressing the button and the clicking stuff (that matters).

“It’s all of the skills that sit underneath that…”

The human skills that will matter

Students’ capacity to self-regulate and co-regulate their learning is what’s going to stand them in good stead for the long run as technology quickly shifts, Lodge proposes.

“Can I adapt? Am I OK with trialling stuff and making mistakes and coming back from that? Can I problem solve? These are the skills that I think stand the test of time.

“Not that ‘I know exactly how to use this specific tool that’s going to disappear in five years’.”

Far from being an abstract buzzword, self-regulated learning needs to be made a core part of a school’s curriculum if students are going to work effectively with AI into the future, the expert says.

“Self-regulated learning is really about how do I understand my own learning? How do I recognise when I need to get help? How do I make good decisions about what I do next while I’m learning?

“And teachers do a great job in helping with this, but it’s not a core part of the curriculum, it’s not inbuilt.

“There’s still such a strong emphasis in many kinds of syllabus, and the national curriculum and other places, on ‘students need to know stuff’.”

We know that the acquisition of domain-specific knowledge via explicit instruction is essential in order to think critically about something, but the curriculum now needs to go further than this, Lodge argues.

“We also need to have built [in many] opportunities for students to learn what to do when they fail or make a mistake or reflect on their learning or think about their thinking.

“And these are the sorts of skills that then provide that foundation for being able to work with machines.”

When you observe people who excel at using AI, they are drawing on the “really fundamental skill” of co-regulated learning that helps them to get the most out of the technology, Lodge suggests.

“…you’re almost treating it like it’s some sort of helper or a peer. And then as a result of that, you’re getting way more out of the kind of interactive capabilities that generative AI in particular introduces.

“That’s not about a technical skill: that’s about ‘I know how to co-regulate my learning with another agent’.”

Invest in our researchers 

Lodge sees “enormous possibilities” for AI in education.  

“I sound a bit negative and like, ‘God, this is all going horribly wrong’, but at the same time I think that this is obviously an enormously powerful technology.

“It’s something that I use myself fairly regularly, but I’m using it to do things that I already have a fair amount of expertise in and I can make judgments about whether or not what it’s telling me, what it’s giving me, is actually useful or not … I think that’s part of the challenge here.”

As for claims that AI will inevitably replace teachers or act as a salve for the workforce shortage, Lodge says neither is feasible.

“I think that that’s nonsense, but what they could do is operate like a tutor or a peer who sits beside students and kind of helps them to develop their own skills as they go.

“The idea would be to get this technology to help students to become better learners themselves, rather than do the work of deciding which pathway through their learning they need to take…”

A huge issue for this country is the lack of national-level investment in research on AI in education, Lodge warns.

“We just don’t have the capacity in Australia to be able to answer the questions that are coming up about these technologies.

“The parliamentary inquiry last year recommended a national research centre (be established), but nothing’s happened … and we don’t have any way to share what we’re learning about all this to move the agenda forward – and we desperately need it.

“Frankly I spend a lot of my time running around begging for money to do this work, it’s just insane.”