eSafety Commissioner Julie Inman Grant: Senate Estimates Opening Statement

It’s fitting to be here to speak with you on Safer Internet Day. And in the world of online safety, a lot can change in just a day. Even more in the two months since we were here last.  

Just last week, we released a transparency report demonstrating that eight of the world's tech giants are continuing to fail to prevent their platforms and services from being weaponised by predators for online child sexual exploitation and sexual extortion.

These are the most egregious forms of online harm imaginable. If there isn't the will to prevent the sexual abuse of children, what hope do we have for addressing other persistent, or more hidden, harms?

Also last week, our regulatory counterparts at the European Commission issued a preliminary finding that TikTok had failed to mitigate the risks posed by its addictive design. Recently we saw YouTube announce it would roll back policies to allow the monetisation of harmful content, including content relating to self-harm and suicidal ideation. And Grok continued to attract worldwide attention for the concerning content it is generating – eSafety has two related pending investigations into xAI and illegal content on the X platform.

But some of this backsliding has been met with forceful pushback. Our Australian partners in law enforcement uncovered a sadistic child sexual abuse ring. France, Spain, Malaysia and Ireland have followed our lead and are pursuing minimum age restrictions for social media.

And we wrote to Roblox indicating we will actively test whether the nine safety commitments they made to us last year have been effectively implemented, particularly those changes designed to prevent adults and children intermingling on their platform.

We are also pleased to announce today that as a result of our engagement with Apple, the company has taken further action against chat-roulette-style apps, which can actively pair sexual predators with children.

Since late last year, Apple has removed or issued warnings to over 100 of these apps and terminated the accounts of dozens of app developers. The company has also updated its policy in response.

This demonstrates how app stores can act as effective gatekeepers against harm. We will be reminding app store providers of their obligations under the App Code to take the same action against nudifying apps that allow the creation of exploitative deepfakes of children and adults, just as we have already done with one of the most prolific nudifying sites, which was being used with devastating impact in Australian schools.

I want to turn to the implementation of the social media delay. For more than two decades, the major social media platforms have been built for adults but inhabited by children. They were never designed to optimise for safety, but for revenue and reach.

The social media delay is more than a world-first digital reform; it also signals a major cultural reset. No such reset – whether speed limits, sun and water safety, or smoking – has reached full fruition and impact in two months.

There is no precedent for a reform of this scale to take hold that fast.

Our goal is to normalise 16 as the age of access to the dangerous algorithms and features of social media. It also means social media companies do not have access to our children while we take this vital time to further build their digital literacy and hone their critical reasoning skills.

You can be assured we are being rigorous in ensuring the platforms comply with the law.

As an AI-generated Mark Twain may have extrapolated today: "Reports of the death of the social media ban have been greatly exaggerated."

Indeed, we are still in both the earliest and most complex phase of the regulatory process – ensuring companies are effectively deploying their technologies, policies and processes to prevent circumvention of their age assurance systems. 

Remember, while some of these companies have highly accurate age-inference tools, others previously had no age gating at all beyond self-declaration and are effectively starting from scratch.

It is also important to remember that this is complex technical regulation, rather than instant gratification.  

But I will reiterate: the first stage of this process, the restriction of 4.7 million social media accounts, cannot be considered anything but a stunning success. To put this in perspective, there are 2.5 million 8- to 15-year-olds in this country. By any standard, that is a good start.

What eSafety is doing now in this next early stage – which is not an end state – is conducting intensive investigations, testing claims, building an evidence base, and determining whether reasonable steps are genuinely being taken.

This is not dalliance; it is due diligence. And it’s how enforcement will succeed – not just in the media cycle or before Parliament, but in courtrooms and boardrooms and across an entire global digital ecosystem.

Real, meaningful change must be implemented properly. That is the only way it will remain durable.