The emergence of generative AI has put how we collaborate and communicate with each other firmly in the spotlight.
Dr Colin Webber, whose background before entering academia largely involved music composition, sound design, theatre and film, and who is now researching curriculum reform in relation to generative AI in higher education, says the technology is forcing educators in both primary and secondary schools to grapple with numerous questions about their teaching.
“How do you communicate with another person to get them to work with you on something? How do you recognise the skills that they have and give them work that is appropriate to their skills?” he queries.
“How do you acknowledge the skills that come from a human partner or a group member? How do you extend that work if it’s not quite right? How do you talk about that work as a collaboration?
“So ‘I did this, but my partner didn’t do anything’. How do you find ways of extending your own thinking by having another brain in the room, for example? How do you recognise that the work that’s been done by somebody else in your working group is good? So, what makes it good?”
Webber says SAE University College, as a learning institution, has always been a big fan of what it calls ‘connoisseurship’, the idea that in any given field, there’s good work and there’s not so good work.
And sometimes they can look or sound very different.
“So to give you an example, if we’re training students in recording studio techniques, and one student brings you a jazz recording and another student brings you a heavy metal recording.
“As an assessor, how do you know which of those is good? They involve potentially different sets of techniques, but the same set of standards applies, right?
“So connoisseurship is one of those things that also extends to critical thinking about how sentences are constructed, how information is gathered, assimilated, synthesised, checked, fact checked, all of that kind of stuff.”
Dr Webber says school leaders need to ‘go in deep’ with AI and understand it from a business context. IMAGE: generated in MidJourney
Webber says he’s always considered that the main role of education is the creation of good citizens, using the things students are interested in as a vehicle for that.
“That’s never been more true, that we have an opportunity to develop the human skills of young people before they get too set in their ways and to work on our brain plasticity, for the older ones among us.
“But to be always looking at how can we use this for a good thing rather than for a destructive thing.”
It is clear generative AI is re-shaping society, education and the way we work, but are schools preparing students with better AI literacy for higher education?
The short answer is ‘no’, according to Webber.
“What we’re seeing is that students are learning on their own, and in some cases, they’re using it to cheat the system or to bypass the system,” he says.
He says schools need to be very careful not to think of ChatGPT or Claude or Gemini as ‘that’s what AI is’. “That's one version of it, one form of it,” he says.
Discussions around AI have also been more about avoiding it, not just in schools but in business and elsewhere, he says.
Webber has autism and likens some of the reactions to AI to those many have in response to people like him, who are neurodivergent.
“So you see lots of gotchas, it’s like, ‘oh OK, so GPT is so smart but it can’t count the ‘r’s in strawberry’, right?
“But GPT was not designed for that. GPT was designed for something else.
“‘So it’s so smart but it’s stupid at this’, or ‘it’s so smart but it’ll never be able to understand human emotions’, right? Or ‘it’s really good at pretending to be empathic’.”
These are things that Webber has heard on many occasions.
“So it’s interesting that when we look at the outputs of things but they’re different to the way that we think, we sometimes judge those in a negative way, they’re ‘gotchas’,” he suggests.
“But if we can learn how to understand the differences that we encounter in thinking, even if it’s somebody who comes from a different cultural background, who comes from a different neurodiversity, those are actually tools which are going to help us deal with what is effectively an alien intelligence as we move forward.”
Dr Webber says some of the reactions to AI are similar to those many have in response to people who are neurodivergent. “…when we look at the outputs of things but they’re different to the way that we think, we sometimes judge those in a negative way.”
Webber says while AI is a technology that we have to learn how to use, it’s also something which can help us develop our skills for dealing with each other.
“So everything that I’m doing at the moment is focusing on how do we bring those things into a holistic understanding of what education is for, because education is no longer about knowing things, education is about knowing how to work with what is known.”
It’s about adopting a very different mindset, he indicates.
“Really good leaders surround themselves with people who are better than them in all of these various areas of expertise – that’s why you have a Chief Executive Officer and a Chief Financial Officer,” Webber explains.
“So there’s this idea that we can learn to collaborate with people who know more, or things that know more than we do, but we can access that and work with it in that way.
“I don’t see a bad future in relation to AI. I see some major teething problems as humans understand their own relationships with themselves, because historically one dominant culture coming across a less dominant culture hasn’t gone well for one of them.”
Webber says leaders in all areas, but particularly school leaders, need to have somebody that they work with closely who is dedicated to understanding how AI works within their own context.
“So if you’re the principal or the business manager of a school, you need someone who actually understands how AI works in education, so that the principal can then be talking to the teachers and saying, ‘hey you know you could do this in the classroom’, because it’s the context that’s everything.
“You need those people in the leadership team who are capable of understanding where the security risks actually occur, because, for example, uploading personal information to your personal GPT account is probably not a good idea, but you’re not going to be able to write into it ‘who’s the worst performing student at my school?’, and expect an answer, because it’s not a database, it doesn’t work that way.”
There is clearly a great deal of misinformation about what uploading things to GPT actually does, Webber says, so understanding security and data privacy is essential; some policy provisions even look like they’re there to appease misinformation rather than control it.
“Leaders need people who are prepared to dive in deep, and … need to actually go in deep themselves and understand the core business context.”