eSafety has issued legally enforceable transparency notices to the gaming services after concerns were raised that online games are being used by sexual predators to groom children and by extremist groups to spread violent propaganda and radicalise young people.

Roblox is facing more than 140 lawsuits in the US alleging that it has failed to stop the sexual exploitation of children.

Last week Roblox agreed to settlements with the US states of Alabama and West Virginia for more than $23 million. A fortnight ago the company announced tailored accounts for young users.

The transparency reporting notices require the providers to explain how they are identifying, preventing and responding to these harms, as well as cyberbullying and online hate.

The notices ask how the providers' systems, staffing and safety-by-design choices align with the Australian Government's Basic Online Safety Expectations.

eSafety Commissioner Julie Inman Grant said that in cases of serious online harms such as grooming, sexual extortion and youth radicalisation, online games and gaming-adjacent platforms, such as encrypted messaging services, could serve as a point of first contact between children and offenders.

“What we often see is that after these offenders make contact with children in online game environments, they then move the children to private messaging services,” Inman Grant said.

“Gaming platforms are amongst the online spaces most heavily used by Australian children, functioning not only as places to play, but also as places to socialise and communicate. 

The eSafety Commissioner said eSafety's research into children and gaming showed around 9 in 10 Australian children aged 8 to 17 had played online games.

“Predatory adults know this and target children through grooming or embedding terrorist and violent extremist narratives in gameplay, increasing the risks of contact offending, radicalisation and other off-platform harms.

“We’ve seen numerous media reports about grooming taking place on all four of these platforms as well as terrorist and violent extremist-themed gameplay.

“This includes Islamic State-inspired games and recreations of mass shootings on Roblox, as well as far-right groups recreating fascist imagery in Minecraft.

“Media reports have also pointed to games in Fortnite gamifying the horrific events of the WWII Jasenovac concentration camp and the January 6th US Capitol Building riots, while Steam is reportedly a hub for a number of extreme-right communities.

“These online game and gaming-adjacent platforms are used by millions of children and so it is imperative that they take every possible step to protect them and continue to improve safeguards.

“These companies must take meaningful steps to prevent their services becoming on-ramps to abuse, extremist violence, radicalisation or lifelong harm.”

eSafety said its aim is to ensure all users, especially children, can enjoy the benefits these platforms have to offer without experiencing avoidable harms.

In addition to responding to the transparency reporting notices, which ask how they comply with the Expectations, online game platforms are also required to comply with minimum obligations under the Online Safety Codes and Standards.

A breach of a direction to comply with a code or standard can result in penalties of up to $49.5 million per breach.