A child cannot walk into a bar and order a drink. They cannot stroll into a strip club, browse an adult shop or sit down at a blackjack table in a casino.
These safe boundaries are enforced not because we are prudish and over-reaching, but because we recognise something simple, profound and universal: children are not adults. Their future selves are still under construction.
Their emotions, judgment, identity, perceptions of relationships and consent, impulse control and understanding of risk are far from fully formed.
When it comes to the online world, however, these commonsense protections inexplicably dissolve.
It’s not because the risks are fewer online. If anything, the potential for harm is far greater.
A child who could never enter a physical adult venue can, within seconds, access pornography more extreme than anything sold behind the counter of a bricks-and-mortar adult store. They can stream high-impact violent footage far worse than anything they might see by sneaking into an R-rated movie.
That’s not to mention the new wave of AI companion chatbots entrapping and entrancing impressionable young minds with human-like, sycophantic and often sexually explicit conversations, some even going as far as encouraging self-harm and suicide.
We have tacitly allowed the growth of a digital world that ignores principles and rules we enforce in the physical one.
But this is changing. Australia is drawing a line in the digital sand in the form of enforceable industry codes, which effectively take many of the protections applied to children in the offline world for generations and install them in the online one.
Known as the Age Restricted Material Codes, these new rules cover most corners of the online ecosystem, from device manufacturers, gaming services and app stores to social media, messaging services, generative AI systems, websites and search engines.
They require the entire online industry to put in place meaningful protections preventing children’s exposure to content they are not ready to see – and cannot unsee.
We’re talking about high-impact violence, pornography, self-harm, suicide and disordered eating content.
These new codes complement existing rules covering material such as child sexual exploitation and pro-terror content.
But despite its involvement in writing the codes, Aylo, the Canadian owner of Pornhub, has already announced it will offer only “safe for work” content on its free services in Australia.
Access to its more explicit content, the company says, will now sit behind age checks on its paid, age-restricted services.
But this is a business decision for Aylo rather than a technical one. I am confident companies have the capability to develop and deploy the technologies that protect children from age-inappropriate material and still allow adults to access this content.
Sites that provide adult material will be required to conduct privacy-preserving age checks, but the precise methods are up to them. It is the protective outcomes that matter here.
We know more and more young people are unintentionally encountering age-inappropriate content, and at very young ages.
Our own research supports this; one in three young people told us that their first encounter with pornography, for example, was before the age of 13, and this exposure was “frequent, accidental, unavoidable and unwelcome”. Many described this exposure as being disturbing and “in your face”.
No one argues that asking for ID at a bar is an unacceptable burden on a business, or that the responsibility should fall upon the alcohol distributor.
The entity that controls the entryway to the risky environment carries the duty of care. You control the room, you own the door.
For years, the dominant and often self-serving narrative has been that digital spaces are too complex, too global or too technologically fluid to regulate in the same way as physical ones.
But complexity should never be used as an excuse for inaction. Financial systems are complex and global, and yet we still regulate them. Pharmaceutical supply chains are complex and global, and we still regulate them.
And we require almost every consumer good, from cars to electronics, imported into Australia to be built to our safety standards. Complexity demands innovation.
So why should the world’s biggest tech platforms – many of which generate enormous revenue from Australian users – be exempt from the responsibility to protect children that we expect of the local pub or cinema?
If we agree that a minor cannot enter an adult shop because of the psychological impact of explicit adult content they might encounter there, why is streaming explicit material to a smartphone treated as inevitable?
The idea behind these codes is that these principles should be consistent: environments designed for adults should not be freely accessible to children, regardless of whether they are built with bricks or computer code.
If we all agree children deserve protection in the physical world, then it stands to reason they deserve it online, too.