RMIT’s Miriam Reynoldson has flagged concern over the use of the term ‘AI literacy’, arguing that generative AI models do nothing to deepen our communication skills and ultimately drive a wedge between educators and learners.
The learning design specialist says she’s frustrated that calls for embedding ‘AI literacy’ in education most often carry a bigger agenda.
“…they’re often used as a way of selling the tools.
“And that might not be financially selling them, but trying to drive adoption and trying to build ubiquity.
“So, particularly in my area in higher education, universities are falling over themselves to adopt particular tools…”
Whether it’s ChatGPT or Microsoft Copilot, educational institutions are sending a “really strong” message to students and staff by pushing particular AI tools, Reynoldson says.
Secondly, ‘AI literacy’ is something of an oxymoron, she suggests.
“I also have this frustration with the word ‘literacy’.
“Essentially, literacy is about developing the ability to communicate and advocate for oneself. It’s quite a complex socio-cultural practice, but essentially, it’s for necessary functioning in the world...
“[The use of AI literacy], I think, is a dangerous way of making AI adoption into a truism.”
The fact is that each time we use generative AI, we are ceding our powers of expression and communication, Reynoldson says.
“The use of large [language] models specifically – so we’re talking about written English – allows people to bypass the process of thinking through what they mean, or what’s being meant, or how a concept is being constructed.
“And so the ease (of generating text) comes without that layered comprehension.”
With her own students in the tertiary sphere, Reynoldson says she has seen a growing number of concerning examples where individuals don’t realise the extent to which they have offloaded a task to AI.
“I don’t think that I’ve seen any really cynical use from my students yet, anyway,” she says.
Frequently, students attempt to integrate generative AI into their workflows and achieve ‘regressive’ results, Reynoldson reports.
Critical literacy skills are declining, and some cases border on academic misconduct – yet students are using AI in good faith, she says.
Here, handing down poor grades communicates that the AI integration was unsuccessful, but it also punishes students for earnestly trying to participate in a practice increasingly touted as essential, she flags.
Communication between university students and teachers is deteriorating as AI is used on both sides, Reynoldson says.
“Worst case scenario, if we have a person, say a student, submitting a piece of work that is fully completed by the AI or offloaded to that significant degree, and then we have a teacher providing feedback that is significantly offloaded to the AI, then both people have failed to connect with one another.
“We’re meeting each other further and further away every time that these are used.”
When it comes to schools’ approach to generative AI, Reynoldson is quick to concede her views are offered as those of an armchair critic.
“But the thing that really scares me is that we do classify the skills to use AI or even the knowledge about what artificial intelligence technologies are as a literacy.
“This is already happening with the OECD. They have developed some kind of measure for AI literacy – it’s actually starting to get baked in at an international level.
“And that really frightens me.”
Although we do need to be talking through the risks generative AI poses and ‘turning it upside down and shaking it and finding out what’s falling out the back end’, it’s simply wrong to tell children that knowing how to power AI is an essential skill if they want to succeed in this lifetime, Reynoldson says.
“I really just draw the line at implying that it is an essential skill for the world outside.
“From my own vantage point, I’m seeing employers don’t want it. Obviously, when you move into university, your teachers really don’t want you to be using it and are quite exhausted by it.
“And nobody really wants to be hanging out with a bot.”
As Monash University notes, there are wider debates about whether “literacy” is the appropriate term to capture the kinds of knowledge, understanding and capability required to safely, responsibly and effectively navigate the evolving AI scene.
Yet the institution opts to roll with the term AI literacy “as a practical and commonly-used way of talking about these issues”.
It suggests there are many overlapping dimensions to AI literacy including:
- Knowing about AI and understanding how AI works
- Experience using and applying AI
- Developing critical abilities to evaluate AI
- Skills in creating responsible and explainable AI
- Understanding of ethical issues related to AI.
Meanwhile, a new ‘AI Across the Curriculum’ program at the University of Florida (UF) is seeking to engage students in “an interdisciplinary manner reflecting real-world workplace environments”.
"With thousands of new jobs being created that need AI talent, UF advocates for every college student, regardless of major, to graduate with a basic understanding of AI literacy. Beyond that, UF encourages students willing to put in the time and the effort to become AI competent or an AI expert," an infomation page outlines.
The program reportedly features newly-developed courses and academic programs featuring AI, innovative partnerships and increased internship opportunities focused on AI.
“This approach will enhance workforce development and career readiness to prepare students for their future…” two academics have previously explained.
“The ultimate goal is to prepare students with real-world marketable AI skills, awareness of AI implications in society, and improved communication about AI.”
For Reynoldson, actively selling students ‘AI literacy’ as a ticket to their future success is a very harmful move.
“It’s not really a very strong argument, but it’s somehow being made incredibly compelling and it’s absolutely a narrative of risk.
“It becomes a sense that ‘you’re going to need to do this to catch up – you’re already behind, and if you continue not to run harder than you possibly can, you will be behind forever’.”