Australia’s systemic online safety framework reaches a critical milestone on Monday with the full commencement of the Age-Restricted Material Codes, introducing common-sense measures similar to those that have protected children in the physical world for generations.
The codes cover most corners of the online ecosystem – from device manufacturers to app stores, social media and websites – requiring the online industry to put in place meaningful protections that prevent children’s exposure to age-inappropriate content.
This includes high-impact violence, pornography, self-harm material, and dangerous content promoting suicide and disordered eating.
Crucially, the codes will also apply to the AI-powered chatbots and companions that have become increasingly popular with children, preventing these services from engaging minors in conversations that are sexually explicit or that encourage self-harm or suicide.
eSafety Commissioner Julie Inman Grant said that for decades society has agreed there are certain things children are not physically, developmentally or emotionally equipped to deal with, and so age barriers must be put in place to protect them.
“We don’t allow children to walk into bars or bottle shops, adult stores or casinos, but when it comes to online spaces where they are spending a lot of their time, there are no such safeguards,” Ms Inman Grant said.
“But that changes for Australian kids with these codes, which simply bring those same, commonsense protections we all grew up with to the online world of today to ensure children are having age-appropriate experiences and not being exposed to potentially harmful content too early.
“Industry must now apply consistent standards across their services so children are not accidentally exposed when they search or scroll online.”
Under the codes, search engines, social media platforms, pornography websites, app stores, gaming providers, and generative AI systems – including companion chatbots – must take meaningful steps to prevent children from being exposed to age-inappropriate content.
The six new Age-Restricted Material Codes join three that are already in force, covering search engines, internet service providers and hosting services.
“Under these codes, if a young person searches the internet for suicide or self-harm content, the first result they see will be a helpline – not a harmful online rabbit hole,” Ms Inman Grant said.
“These obligations will help prevent exposure to potentially harmful content and direct at-risk children to real, lifesaving support. Children’s emotional and psychological development and wellbeing are at stake, and so I feel very proud of what we’ve been able to achieve with the industry in Australia.
“These industry-developed codes shift that responsibility back where it belongs – onto the companies designing these digital platforms and profiting from their users – and will give children back a little more of their childhoods.”
Under the codes, adults will continue to have full access to legal adult content, but some services will now require proof of age. The forms of age assurance used must be accurate, robust, fair and reliable. Importantly, any age assurance measures must comply with Australian privacy laws and be managed solely by the service being used – not the Australian Government.
These are some of the main changes under Australia’s Age-Restricted Material Codes:
- AI companion chatbots – AI companion chatbots capable of generating sexually explicit material, high-impact violence or self-harm material must confirm a user is 18 or older before allowing access to that material. This check may occur either when a person logs onto the service or at the point of access or generation for that material.
- App stores – App stores must take appropriate steps to prevent users under 18 from purchasing or downloading apps rated R18+ and ensure apps are appropriately rated. If the app store doesn’t already know a user’s age, the user may be asked to confirm it through age assurance.
- Messaging – No age checks are required for widely used general messaging services or those attached to social media platforms, such as Facebook Messenger. Users may be asked to verify their age on adult messaging services that specialise in distributing sexually explicit content, pornography or self-harm material.
- Online gaming – Users will have to complete some form of age assurance to access online games classified R18+ by the Australian Classification Board. For all other games, no age checks are required.
- Pornography sites – Users will be asked to confirm their age when accessing age-restricted material on pornography websites and services. Clicking a button that says “I am 18 years or older” is no longer sufficient. This is consistent with similar efforts being implemented internationally.
- Search engines – For users who are not logged into an account, for example a Google account, search results containing pornography and high-impact violence will be blurred by default. Logged-in children (under 18s) will also have these same safety defaults in place, while for logged-in adults the material will be unblurred unless they choose to opt in to these safety defaults. For anyone entering searches related to suicide or disordered eating, the first result returned will be a referral to appropriate mental health support services.
- Social media – Social media services that allow pornography or self-harm material must ensure users are 18 or older before giving them access to that material. This may involve age assurance when someone logs in, or at the point of access to that material. For services whose own Terms of Service do not allow pornography, such as Facebook, there will be no change.
The new Age-Restricted Material Codes will sit alongside the existing Unlawful Material Codes and Standards already in force, which tackle the worst-of-the-worst online content, including child sexual exploitation and abuse material as well as pro-terror content.
eSafety will monitor and assess compliance with the Age-Restricted Material Codes, including through our investigative powers, and will take enforcement action where there is systemic non-compliance.
“No piece of regulation will eliminate all risks and harms all at once, but these codes create meaningful protections for children across the tech ecosystem. The Government’s commitment to implementing a digital duty of care will also further strengthen protections in the future,” Ms Inman Grant said.
“The codes also ease the burden on parents, carers and schools, ensuring the online environment aligns more closely with offline protections and they complement other recent reforms, including the social media minimum age obligation.
“But make no mistake, where we see failures or foot-dragging, we will hold companies to account.”
A breach of a direction to comply with a code can result in penalties of up to $49.5 million per breach.
To find out more about the codes go to www.esafety.gov.au/industry/codes