The Plan includes programs to boost AI literacy and skills across schools, TAFEs and community organisations, alongside partnerships with the education sector to prepare an AI-ready workforce.
It also aims to support the responsible use of AI in educational settings and prepare students for an AI-enabled economy.
Dannielle Kelly, Head of Government Affairs and Law Enforcement Outreach at ICMEC Australia, said yesterday that the task ahead is to ensure innovation progresses safely.
“Children are already growing up in an AI-enabled world. Our job is to make sure they can do that safely – not by shutting down innovation, but by putting clear guardrails and strong regulation in place, and ensuring the right tools are in the hands of those who protect them. Today’s actions are a positive step towards that future,” Kelly said.
ICMEC (International Centre for Missing and Exploited Children) Australia, an independent not-for-profit that works to prevent technology-facilitated child sexual exploitation and abuse (CSE), said it has played "a central role in shaping national thinking on the safe use of artificial intelligence to prevent child sexual exploitation and abuse" over the past two years.
The organisation advocates for child protection, builds the capability of professionals who detect and report CSE, and collaborates with stakeholders such as law enforcement, financial institutions and policymakers through data-driven initiatives, training and research.
As leader of the SaferAI for Children Coalition, ICMEC Australia has brought together experts from technology, law enforcement, academia and other not-for-profits to develop practical, evidence-based measures that prioritise children’s rights and safety.
Figures from the US National Center for Missing & Exploited Children show a 1325 per cent surge in AI-related child sexual exploitation reports, rising from 4700 in 2023 to more than 67,000 in 2024.
Through its parliamentary roundtables in recent months, ICMEC Australia has convened national leaders across government, industry and child protection, including key discussions that contributed to the Government’s ban on nudify apps.
The National AI Plan announcement represents meaningful progress and a welcome response to this collective effort, ICMEC said.
In parallel, the organisation is working with police across the country to ensure frontline officers have the tools, training and specialist expertise needed to respond to AI-enabled offending.
Kelly said this work is becoming increasingly urgent as generative technologies reshape criminal behaviour.
"AI has become a core tool for offenders, and it now must be part of the response for police," she explained.
“We are focused on making sure officers have practical, current training and access to AI-enabled tools that help them identify harm faster, support victims better and hold offenders to account.”
ICMEC Australia said the implementation of the National AI Plan will provide the coordinated direction needed to adopt and develop new technologies with confidence while placing children’s rights and safety at the centre of an AI-informed future.
It said it looks forward to continuing its collaboration with the Australian Government to ensure that AI is used for children’s safety, not against it.
The Government's National AI Plan will guide investment in data centres and worker training, marking a shift away from the former industry minister's plan for "mandatory guardrails" to protect against AI's worst harms.
The Government said it will use "strong existing, largely technology-neutral legal frameworks" and "regulators' existing expertise" to manage artificial intelligence in the short term.
ABC News reported that from next year, a $30 million AI safety institute will monitor AI development and advise industry, agencies and ministers when required, while the Government continues what it is calling "ongoing refinement" of its AI plan.