Guidelines for social media platforms to comply with the age restriction laws will be released by Communications Minister Anika Wells and eSafety Commissioner Julie Inman Grant today (Tuesday).

Platforms will be expected to find and deactivate or remove underage accounts, in addition to preventing users under the age of 16 from circumventing the rules.

Reliance on self-declaration alone will not be considered sufficient to meet the legal obligation.

Social media sites will not be expected to verify the age of all users, as blanket checks might be considered unreasonable.

The guidance is also “principles-based”, meaning platforms are not required to use specific technologies, including those tested in the age assurance trial.

Under the laws, social media companies are prohibited from forcing users to hand over their government ID to prove their age online.

Platforms that fail to take “reasonable steps” to comply with the laws from December 10 risk fines of up to $49.5 million.

Wells said the Australian community was relying on social media companies to keep young people safe online.

“This industry guidance makes clear our strong expectations that social media platforms step up to the plate to implement the minimum age in a way that is effective, private, and fair on Australian users,” she said.

“The Government has done the work to ensure that platforms have the information they need to comply with the new laws, and it’s now on them to take the necessary steps.”

Wells said eSafety’s guidance made clear platforms must provide “transparent and accessible information to their users about their age assurance systems”.

The age assurance trial evaluated more than 60 tools and found that technology could be used to prevent children from accessing explicit and inappropriate content.

The trial’s report found that available technologies could ensure the laws were enforced “privately, efficiently and effectively”.

But the report warned that tech giants anticipating future regulation might retain data unnecessarily, raising concerns about an increased risk of privacy breaches.

Roblox commits to better protect kids

Meanwhile, popular online gaming and content creation platform Roblox has agreed to introduce a new suite of safety measures following concerns raised by Australia’s eSafety Commissioner about child grooming risks on the platform and its compliance with Australia’s industry codes and standards.

Roblox has committed to implementing the safety measures in Australia by the end of 2025.

The new safety measures include making accounts for users aged under 16 private by default and introducing tools to prevent adult users from contacting under 16s without parental consent.

A number of key features, such as direct chat and ‘experience chat’ within games, will be switched off by default for children in Australia until the user has gone through age estimation.

After a child aged under 16 has gone through age estimation and has chat enabled, they will be unable to chat with adults.

Parental controls will also be introduced to allow parents to disable chat for 13- to 15-year-old users, on top of existing protections for under 13s.

In a statement, Inman Grant said Australia’s world-leading codes and standards are designed to raise the safety bar across the entire online ecosystem, and that the new commitments from Roblox are an example of this safety uplift.

“We know that when it comes to platforms that are popular with children, they also become popular with adult predators seeking to prey on them,” she said.

“Roblox is no exception and has become a popular target for paedophiles seeking to groom children.”

Inman Grant said she had recently met with senior Roblox executives, including its Chief Legal Officer and Chief Safety Officer, to outline eSafety’s compliance concerns and what the regulator expects of the company when it comes to tackling harms as serious as grooming, sexual extortion and other forms of child sexual exploitation.

“We want platforms to view safety as a high ceiling rather than a dirt floor with companies doing more than just the bare minimum.”

eSafety will closely monitor the implementation of these commitments and may consider regulatory action if they are not fully delivered, are subject to delays, or if other instances of non-compliance are identified.

Last week, the eSafety Commissioner also registered a second phase of industry codes, focused on age-inappropriate content such as online pornography and content dealing with suicidal ideation, self-harm and disordered eating.

These codes will apply to platforms including Roblox and a broad range of other services.

Inman Grant said that, as the digital landscape continues to evolve, eSafety will use all available powers to ensure that Roblox, and all regulated services, meet their obligations and prioritise the safety of Australian users.

The Government has also committed to introducing a duty of care for online services, reinforcing the principle that safety must be built into platforms from the ground up.

“The time has come for platforms to take real responsibility for the safety of their users. We will continue to use every tool at our disposal to hold them accountable,” Inman Grant said.

(with AAP)