More and more young men are being blackmailed for money after sending sexual images or videos of themselves to criminals posing as attractive young women, with sexual extortion reports to eSafety almost tripling in the first quarter of 2023.
eSafety Commissioner Julie Inman Grant said the number of sexual extortion reports increased from almost 600 reports in the first quarter of 2022 to more than 1,700 in the first quarter of this year.
“The majority of these reports, almost 1,200, were from young people aged 18 to 24, and 90 per cent of all reports were from males,” Ms Inman Grant said.
“Thousands of Australians are coming to us in crisis as blackmailers threaten to send their sexual images or videos to family, friends, and colleagues unless they pay up.
“The cost to these young people is significant. Not only have many paid out thousands of dollars, but countless others have suffered deep distress.
“Our reports show that these scammers initially make contact on social media services, with Instagram and Snapchat the most frequently reported to our investigators.
“These criminals are well-versed in the dark arts of manipulation and coercion, often sending their targets very explicit content to encourage compliance and win trust – but the content is fake.
“They’re also ruthless but targeted in their demands, applying more and more pressure to people who respond to their threats. If you pay once, they keep coming back with more demands. And because they’re mainly based overseas, it’s virtually impossible for people to get their money back.
“We need to educate young people, especially young men, to be very wary of attractive strangers approaching them out of the blue who get sexual straight away. There’s a big risk it's an impersonator account being managed by organised crime, looking to blackmail that young person once they share sexual content.
“If anyone finds themselves being blackmailed online after sharing sexual content, or after having that content screen-captured or screenshotted, they should not engage or pay. Instead, take screenshots of the threats and the user profile, and record the user profile URL. Then, report it to the platform before you block the account.
“In cases where these images or videos are shared, report them to eSafety at eSafety.gov.au. Our investigators can help get that content quickly removed through our image-based abuse scheme. If you’re under 18, report this serious crime to specialist investigators at the Australian Centre to Counter Child Exploitation.”
Ms Inman Grant said eSafety investigators have also seen a concerning jump in the amount of reported material featuring children being sexually abused.
Between January and March, eSafety received almost 8,000 reports of child sexual abuse material.
“For the same period in 2022, we received just under 3,000 reports of child sexual abuse material. That’s almost a three-fold increase in reports of the most damaging, most harmful of online content,” Ms Inman Grant said.
“While this increase can be partly attributed to more Australians understanding our role as an online safety net, we believe it’s predominantly being driven by increased global demand and supply of this horrific material – a trend we’d contend has been supercharged since the onset of the pandemic.
“While our bolstered powers under the Online Safety Act allow us to address some of the most prolific and damaging online harms at a systemic level with new transparency and accountability tools, we continue to call on industry to take greater steps to build in safety features and systems from the outset.
“With the next evolution of technology almost upon us, we need industry to work much harder to actively prevent current and new technologies, such as generative AI, from being weaponised against children and young people. These new technologies have the potential to cause harms that are more visceral and more difficult to detect.”
To report image-based abuse or illegal online content, visit: eSafety.gov.au/report.