The codes will focus on preventing young children from encountering material they are not ready to see and are too young to understand, while also empowering Australian internet users with options to manage their exposure to certain online material.
While the codes will focus on pornography, they are also expected to cover other high-impact material, including themes of suicide and serious illness, which could extend to self-harm and disordered eating.
The codes will cover app stores and apps; websites – including porn websites; search engines; social media services; hosting services; internet service providers; instant messaging, SMS, chat, multi-player gaming and online dating services; and equipment providers.
eSafety Commissioner Julie Inman Grant said that in today’s online world, pornography is so pervasive and invasive that children are often exposed to it by accident and at increasingly younger ages.
“Our own research shows that while the average age when Australian children first encounter pornography is around 13, a third of these children are actually seeing this content younger and often by accident,” Inman Grant said.
“We know kids will always be curious and will likely seek out porn as they enter adolescence and explore their sexuality, so, many of these measures are really focused on preventing unintentional exposure to young children.”
Inman Grant said children are not only accessing the material via porn sites – 60 per cent of young people have indicated they have been exposed to pornography on social media.
“This exposure was often unintentional and happened on popular services including TikTok, Instagram and Snapchat,” she said.
“The last thing anyone wants is children seeing violent or extreme pornography without guidance, context or the appropriate maturity levels because they may think that a video showing a man aggressively choking a woman during sex on a porn site is what consent, sex and healthy relationships should look like.”
Inman Grant said that while exposure to violent and extreme pornography is a major concern for many parents and carers, who have a key role to play from both a protective and an educative standpoint, they cannot be left to shoulder the entire load.
"... we also need industry to play their part by putting in some effective barriers to protect children,” she said.
This could include reasonable efforts to check users’ age, and complementary measures including default safety measures and parental controls, as well as user empowerment tools to filter or blur unwanted sexual content.
These measures could be applied at a range of levels, from the connected devices children use to access the internet, to app stores, messaging and social media services, and search engines, creating multi-layered protection across the technology stack.
Industry bodies must present a preliminary draft of the codes to the eSafety Commissioner by October 3 and then provide final codes for registration no later than December 19. They must also hold a public consultation on the codes.
eSafety has also published a Position Paper to assist industry with the development of the codes and to ensure a clear, shared understanding of eSafety’s expectations for how industry should provide appropriate protections for children.
“We want industry to succeed here and we will work with them to help them come up with codes that provide meaningful protections for children,” Inman Grant said.
“However, if any code should fall short, under the Online Safety Act I have the power to set the rules for them by moving to standards.”
eSafety has also published an Age Assurance Tech Trends Paper considering recent technical and international developments in age assurance technology to provide additional context for the Position Paper.
The codes will complement protections and measures already in place under the Online Safety Act, including the Restricted Access System Declaration, the Basic Online Safety Expectations Determination, and the first phase of industry codes and standards, which require industry to take meaningful steps to tackle illegal content such as online child sexual abuse material.
They will also work alongside significant efforts underway to bolster this toolkit, including the Government’s Age Assurance Trial, ongoing Privacy Act reforms, the statutory review of the Online Safety Act and cross-governmental initiatives to foster respectful relationships under the National Plan to End Violence Against Women and Children 2022-2032.