That’s according to Western Sydney University researcher, Associate Professor Joanne Orlando.

Orlando, from the university’s School of Education, is a media commentator and policy advisor on digital wellbeing, with particular expertise in technology use by children and families. She says that until now, schools have rigidly focused their digital literacy education on cyber safety.

“Some schools are still saying for younger people, ‘the internet is dangerous, be careful, there’s lots of risk, people reaching out to you, they can be there in a predatory way’ – and of course it is.

“But digital literacy is more than about just shutting down young people’s uses – for example, it’s also understanding AI, where the risks are, where it can be used well, but understanding that kind of relationship we develop with AI, helping people to unpack that.”

Orlando emphasises that it’s about social media literacy, too.

“While we’re assuming all under 16s are off social media, they’re only partially off 10 platforms that have been banned,” she shares.

“There’s still a lot of other social platforms and kids are easily finding ways around age verification efforts and other restriction measures.

“So it’s about helping them to understand algorithms, echo chambers, why people upload things in the way they do, the kind of emotive language, the hidden advertising, just really reading social media for what it is.”

Orlando says the current viral trend of users posting photos of themselves from 2016 and 2026, ‘comparing what I looked like then to now’, is a good teaching opportunity.

“They’re really cool, and fun, but the reality is when you’re uploading two photos of yourself like that, AI is constantly scanning all of our social feeds, so being able to kind of detect what you as a user look like and how you age.

“That kind of data might be sold to advertisers, might be sold to governments – so it’s just being aware that what you’re uploading is being scanned now constantly by AI. I think that’s important.”

Orlando says newer forms of data collection that we encounter daily, such as biometric face scans, and the privacy issues around them should also be part of any solid digital literacy curriculum.

“It’s moving beyond the old school cyber-safety to this really complex online space that we’ve got,” she says.

“And there’s many arms to it and understanding each of those arms, not just in terms of risk, but actually just understanding how they work and the implications for us, so our under 16s can make good decisions, informed decisions.”

Orlando is currently undertaking detailed in-school research, and says it is clear that teachers nationwide operate in different contexts, in different ways and in different subjects, with a mixed range of approaches.

“One of the issues holding us down at the moment, in a critical, creative, problem-solving way, is debate in the community and society about kids getting too much screen time.

“A lot of parents are saying things like, ‘my kid gets enough screen time. I don’t want them getting even more in schools’. And we’re seeing some schools, particularly in Victoria, completely removing screen use or really limiting screen use.”

In June last year, EducationHQ spoke to Professor Jason Lodge, an educational psychologist at the University of Queensland, who said that deeply understanding the current suite of AI tools on the market, which are being pushed in many schools as AI literacy curriculum, is in his eyes largely a fruitless endeavour.

He likened it to the rush to teach students how to harness the intricacies of the floppy disk and other technologies of bygone tech eras, and said that given AI will always be better than humans at executing rigid processes and structured approaches, we should be “zeroing in instead on what makes us uniquely valuable as embodied and social beings who are compelled to empathise with and learn from each other”.

“I agree with Jason, we are unique, we are creative, we have emotions, but those things are kind of at risk with AI as well,” Orlando offers.

“So this literacy with AI is about understanding all of those, how do you use and navigate the AI world, but understanding the human interaction with AI and how it impacts us in terms of learning emotionally, mentally, socially, relationship wise, there’s a lot of implications there.”

Orlando says AI has been at the broader public’s disposal now for almost three years, and these days children and adults are using it on a day-to-day basis “to make every and any decision”.

“I think we need to understand how AI works, and how it can help us, but also its limitations,” she says.

“There’s a lot of health advice which young people are tapping into as a decision making tool … it’s that outsourcing decision making, which is what they’re mostly using it for, and now we’re getting things like ChatGPT Health, so you can upload all of your test results, your scans, your ultrasounds to ChatGPT Health for assessment.”

This, Orlando warns, besides being incredibly foolhardy in terms of medical advice, also raises “massive, massive privacy issues”.

“The kind of information that it might give us, accurate or not, it’s still far from 100 per cent perfect, any medical person will tell you that.

“Clearly there are a lot of great uses, a lot of really practical uses, but the privacy issue is something all young people need to know about.

“It’s about being aware of giving away your skills in a way, giving away your writing skills, giving away your thinking skills. I think, just that kind of awareness.”

Orlando says that despite a piecemeal approach to curriculum nationally, she’s extremely optimistic about the future of digital literacy in our schools.

“We’ve all talked about social media more than we ever have because of the ban, and so opening up those conversations with adults and young people, across every [part] of Australia, that’s a really good thing. The discussion has really started something.

“At this point, if we want to have a really good, safe, competitive, successful generation using screens, we need to look at our policies.

“We’ve got education policies around this, we’ve got health policies, we’ve got equity policies, but they don’t all speak to each other.

“They’re all cancelling each other out at this point. I think a more cohesive approach to policy is really important to really get this on track.”