Professor Sam Sellar, Dean of Research in Education Futures at the University of South Australia, has argued we must start making collective decisions around just how far we are willing to automate teachers’ roles. 

“At present, the way we use data erodes the social trust we place in teachers as professionals,” Sellar said in an MCERA media release. 

“Teachers often feel conflicted, because their educational values and professional knowledge would see them teaching in different ways but, if the results are not easily translated into data, their views may be discounted.”

Big data and big tech are now defining the future of education, but it’s up to us to determine just how much educational decision-making is left to algorithms, the expert warned.

Increasing automation may work to further disempower teachers, curtailing their judgement, autonomy and creativity, Sellar said. 

“The question being raised is whether it is desirable, or even possible, for the role of the teacher to be replaced by artificial intelligence,” he posed. 

“The growth of education technologies is disconnected from teachers’ professional judgment. 

“Educators do not generally get to inform how these systems and platforms are designed or implemented. 

“Instead, it is teachers who are often represented as old-fashioned, deficient, and in need of replacement by new, commercially profitable, hi-tech solutions.” 

Technology itself is not the problem here, Sellar noted. 

“The issue is how technology gets dropped into classrooms as ‘black boxes’ that influence, or even substitute for, aspects of professional judgement and decision-making – without that judgement, in turn, shaping the technology.”

This has opened up public schools to new private actors who have been granted influence over the design, management and delivery of education, Sellar said. 

“While the automation of education is still emerging, one common idea is that we can personalise learning to resemble Netflix, Facebook or Amazon – automating content based on an algorithm’s reading of previous activity. 

“This prescriptive ‘personalisation’ can really silo each student into their own, narrowed curriculum, where there is very little shared or collaborative learning.”

Sellar said current requirements around data collection were already limiting teachers’ decision-making.

“The focus on recording data pushes teachers towards what is easiest to measure, not necessarily what is best,” he said.

The expert warned this could create a picture of students that omits their individuality and whole selves. 

“Each child ends up with a ‘data double’ that often substitutes, in decision-making and assessment, for the child themselves… 

“Students must be given the freedom to be more than their data profiles.” 

Sellar highlighted the fact that algorithms are often trained on biased data sets, which can lead to biased decisions about students’ performance and potential. 

“This is an opportunity to be mindful that biases are common both to human beings and to artificial intelligence," he said. 

But according to the expert, opposing data-driven technology in education was of little use. 

“What is needed are creative ways to push developments in beneficial directions for educators, students, and societies,” he said. 

"Researchers and teachers must explore and promote alternative ways to use data and AI; ways that enhance the work of human educators.

"This is how we can challenge the tendency to automate teaching just because we can, or because it might be convenient to.”

Furthermore, most of the risks could be avoided if teachers’ judgements were factored into the design and development of technologies, Sellar said. 

"We can – we have to – put teachers in the driving seat when it comes to education technology.”

  • Alongside co-author Matthew Thorpe of Manchester Metropolitan University in the UK, Sellar has authored a new chapter on Datafication, published in the International Encyclopedia of Education.