Australia’s eSafety Commissioner has served legal notices on social media giants Twitter and TikTok, and one of the world’s largest technology companies, Google, requiring them to answer tough questions about how they are tackling online child sexual abuse.
The notices have been issued under the Australian Government’s new Basic Online Safety Expectations, a key part of the Online Safety Act 2021. Notices were also issued to livestreaming site Twitch and online chat and instant messaging service Discord.
The questions will also extend to how these companies are dealing with the growing issue of online sexual extortion as well as the role their algorithms might play in amplifying seriously harmful content.
eSafety Commissioner Julie Inman Grant said the Expectations set out the safety measures expected of tech companies to protect Australian users, particularly children, from harm.
“The creation, dissemination and viewing of online child sexual abuse inflicts incalculable trauma and ruins lives. It is also illegal. It is vital that tech companies take all the steps they reasonably can to remove this material from their platforms and services,” Ms Inman Grant said.
“What we discovered from our first round of notices sent last August to companies including Apple and Microsoft is that many are not taking relatively simple steps to protect children and are failing to use widely available technology, like PhotoDNA, to detect and remove this material.
“Back in November, Twitter boss Elon Musk tweeted that addressing child exploitation was ‘Priority #1’, but we have not seen detail on how Twitter is delivering on that commitment.
“We’ve also seen extensive job cuts to key trust and safety personnel across the company – the very people whose job it is to protect children – and we want to know how Twitter will tackle this problem going forward.
“These powers are designed to shine a light on these platforms and lift online safety by compelling them to give some straight answers to some straight questions.”
If the companies do not respond to the notices within 35 days, they could face financial penalties of almost $700,000 a day.
The spread of child sexual exploitation material online is a global scourge, and in 2021 reports by tech companies to the US National Center for Missing & Exploited Children topped 29 million.
Ms Inman Grant said these reports appear to be the tip of a very large iceberg. eSafety has handled more than 76,000 investigations concerning child sexual exploitation material since 2017 and believes far more of this material lies beneath the surface.
The Expectations are intended to work hand in hand with proposed new mandatory codes being developed by the online industry to address the risk of class 1 content, including child sexual exploitation material, on their services. These dual regulatory approaches are key to tackling this issue at scale and will compel online service providers to harden their platforms against predators and paedophiles.
The eSafety Commissioner recently asked industry associations to respond to eSafety’s areas of concern in their latest draft of the codes and resubmit them with the appropriate community safeguards included.
The eSafety Commissioner will then make a final decision on whether to register the industry codes or determine industry standards. The requirements contained in industry codes or industry standards will be enforceable once in place.
eSafety has published its preliminary views on the draft codes at eSafety.gov.au/industry/codes