International Centre for Missing and Exploited Children chief executive Colm Gannon said the organisation had recorded a 1325 per cent spike in reports involving AI-generated child sexual abuse material in a single year.
The centre received more than 67,000 reports on the matter in 2024.
Experts and Federal Government officials have convened at Parliament House for a round table to address the increasing use of AI in the sexual exploitation of children.
Child safety advocates called for the explicit criminalisation of the use and possession of software designed to generate child sexual exploitation material.
“I have been involved in investigations where there is active trading and profiteering from using these models, it’s a pay-as-you-use design that’s happening within child sexual offender communities,” Gannon, a former specialist investigator who has helped in national and international child sexual exploitation cases, told reporters in Canberra on Thursday.
“There is no social-positive reason why people are going to be in possession of this software except to generate child sexual abuse material.”
A 10-year government plan to address child protection, released in 2021, needed to be updated to capture new technology because it did not mention AI or its associated harms, he said.
Child abuse survivor Grace Tame, who joined fellow advocates at Parliament House yesterday, said a broader review into tackling child sexual exploitation was needed, as the royal commission into institutional child sexual abuse more than a decade ago had failed to examine key issues.
“It was very specifically focused on institutional child sexual abuse and the responses of institutions,” the former Australian of the Year said.
“Incest accounts for the overwhelming majority of all child sexual abuse.
“A lot of this is taking place in the home, a lot of the online content that we’re seeing is often filmed by parents and distributed by parents, and there’s no institution involved in that.”
Tame said the government’s response had for some time not been up to scratch, and that the country must criminalise the possession of freely available child exploitation apps.
“I don’t think previous governments and, unfortunately, the current Government, have acted swiftly enough when it comes to child safety online,” she said.
Prior to the round table, Attorney-General Michelle Rowland said the use of AI to facilitate the creation of child sexual abuse material was sickening “and cannot continue”.
“I am committed to working across government to further consider how we can strengthen responses to evolving harms,” she said in a statement.
“This includes considering regulatory approaches to AI in high-risk settings.”
Tame explained that perpetrators were currently able to purchase AI tools and download them for offline use, where the offensive material they created could remain undetected.
“It is a wild west, and it doesn’t require much sophistication at all,” she said.
Jon Rouse, who spent nearly four decades in law enforcement tackling online child exploitation, called for authorities to be given greater resources and new tools to quickly identify victims and combat the crime.
“The tragedy about that is that if we don’t find them quickly, they get buried in a landslide of new content,” he said of child abuse content.
Rouse also demanded risk assessments for new technology, saying social media algorithms pushed users toward disturbing and harmful content.
“The tragedy is we’re at a point now where we’re having to ban our kids from social media, because we can’t rely on any sector of the industry to protect our kids, which is pretty sad,” he said.
One social media app kept suggesting AI-generated content of scantily clad mothers with young children, he said, showing reporters a series of photos.
“They’re not sexually explicit but they are telling you something about the people that created them,” Rouse said.
There also needed to be community-wide education on how to spot problem behaviours and precipitating actions from offenders, Tame said.
“We’ve been talking about early childhood education – these kids are pre-verbal, so they’re even more vulnerable,” she said.
While possession of child sexual abuse material is a criminal offence, advocates say Australia should be following other nations, including the UK and European Union, in outlawing the AI tools themselves.
(with AAP)
Lifeline 13 11 14
Kids Helpline 1800 55 1800 (for people aged 5 to 25)
1800 RESPECT (1800 737 732)
National Sexual Abuse and Redress Support Service 1800 211 028