What you can report to eSafety

eSafety helps remove serious online abuse, and illegal and restricted online content.

Serious online abuse is when the internet is used to send, post or share content that is likely to harm the physical or mental health of the person targeted. This includes:

  • cyberbullying of a child or young person (under 18)
  • adult cyber abuse (18 years and older)
  • image-based abuse (sharing intimate images or videos without the consent of the person shown).

The harmful content could be a post, comment, text, message, chat, livestream, meme, image, video or email. It can be sent or shared via an online or electronic service or platform, including a:

  • social media service
  • email service
  • chat app
  • interactive online game
  • forum
  • website.

Illegal and restricted online content ranges from seriously harmful material, such as images and videos showing the sexual abuse of children or acts of terrorism, through to content that should not be accessed by children, such as simulated sexual activity, detailed nudity or high-impact violence.

On this page:

  • Cyberbullying
  • Adult cyber abuse
  • Image-based abuse
  • Illegal and restricted online content

Cyberbullying

eSafety treats all reports of cyberbullying seriously. But for eSafety to investigate cyberbullying of a child or young person under 18, the harmful content must meet a legal ‘threshold’. This means it must be serious enough to be covered by the definition in the Online Safety Act, which is the law that gives eSafety the power to direct online and electronic services and platforms to remove content. 

For eSafety to investigate cyberbullying, the harmful content must have first been reported to the service or platform used to send, post or share it – at least 48 hours before it is reported to eSafety. This is often the fastest way to have the content removed. The eSafety Guide explains how to report complaints to common services and platforms, including social media sites, online games and other apps.

Checklist – Cyberbullying of a child or young person

If the cyberbullying fits this checklist, and the service or platform does not remove the harmful content within 48 hours, then it can be reported to eSafety for investigation. 

  • The person targeted: A specific child or young person under 18 who ordinarily lives in Australia.
  • The threshold for investigation: It must be likely to harm the physical or mental health of the child or young person because it is seriously threatening, seriously intimidating, seriously harassing or seriously humiliating.
  • When to report it to eSafety: When the harmful content was reported to the service or platform at least 48 hours ago, but they did not remove it.
  • Who can report it: The child or young person targeted, their parent or guardian, or a person authorised by them (for example, a carer, teacher or police officer).
  • Possible outcomes: Removal of harmful content, issuing a notice requiring the person responsible to refrain from further cyberbullying and/or apologise, issuing fines or penalties for services or platforms that don’t remove content, further legal action.

Check other information you need to know before reporting a complaint to eSafety, including the steps to follow, how to fill out the online form and what happens after reporting.

Find out more about cyberbullying or read detailed information about the Cyberbullying Scheme regulated by eSafety.


Adult cyber abuse

eSafety treats all reports of adult cyber abuse seriously. But for eSafety to investigate and direct an online or electronic service or platform to remove harmful content, the adult cyber abuse must meet a legal ‘threshold’. This means it must be serious enough to be covered by the definition in the Online Safety Act. Under the Act, 'adult cyber abuse' is reserved for the most severely abusive material that has likely been posted with the intention to cause a person serious harm. The very high threshold for adult cyber abuse complaints helps to make sure that free speech and legitimate expressions of opinion are protected.

For eSafety to investigate adult cyber abuse, the harmful content must have first been reported to the service or platform used to send, post or share it – at least 48 hours before it is reported to eSafety. This is often the fastest way to have the content removed. The eSafety Guide explains how to report complaints to common services and platforms, including social media sites, online games and other apps.

Checklist – Adult cyber abuse

If the adult cyber abuse fits this checklist, and the service or platform does not remove the harmful content within 48 hours, then you can report it to eSafety for investigation.

  • The person targeted: A specific person who is 18 or older and ordinarily lives in Australia.
  • The threshold for investigation: It must be severely abusive online content that was sent, posted or shared with the likely intention of harming the person targeted, and the content must be menacing, harassing or offensive.
  • When to report it to eSafety: The harmful content must have been reported to the service or platform at least 48 hours ago.
  • Who can report it: The person targeted, or a person who is authorised to report it on their behalf.
  • Possible outcomes: Removal of harmful content, fines or penalties for services or platforms that don’t remove content, fines or penalties for the person responsible if they don’t remove the content, further legal action.

Check other information you need to know before reporting a complaint to eSafety, including the steps to follow, how to fill out the online form and what happens after reporting.

Find out more about adult cyber abuse or read detailed information about the Adult Cyber Abuse Scheme regulated by eSafety.


Image-based abuse

Image-based abuse is when a person shares, or threatens to share, an intimate image or video of a person without their consent. The intimate image or video can show, or appear to show:

  • a person’s genital area or anal area (whether bare or covered by underwear) 
  • a person’s breasts (if the person identifies as female, transgender or intersex)
  • private activity (for example getting undressed, using the toilet, showering, having a bath or engaging in sexual activity)
  • a person without attire of religious or cultural significance that they would normally wear in public (such as a niqab or turban).

The images or videos can be real, or they can be altered or faked to look like the person. They can also be shared in a way that makes people think the person is shown, even when they are not (such as a nude image of someone else tagged with the targeted person's name).

Checklist – Image-based abuse

Use this checklist to find out if eSafety is likely to be able to investigate your complaint.

  • The person shown: A child or adult of any age who ordinarily lives in Australia, or a person targeted by someone who ordinarily lives in Australia.
  • The threshold for investigation: Someone must have shared, or threatened to share, a real or fake intimate image or video of a person without their consent.
  • When to report it to eSafety: Immediately.
  • Who can report it: The person shown in the intimate image, their parent or guardian, or another person who is authorised to report on their behalf.
  • Possible outcomes: Removal of intimate images and videos, fines, penalties or other regulatory action against the person responsible.

Check other information you need to know before reporting a complaint to eSafety, including the steps to follow, how to fill out the online form and what happens after reporting.

Find out more about image-based abuse or read detailed information about the Image-Based Abuse Scheme administered by eSafety.


Illegal and restricted online content

eSafety also investigates complaints from people and law enforcement agencies in Australia about illegal and restricted online content.

This includes online content that shows or encourages the sexual abuse or exploitation of children, terrorist acts, other types of violent crimes or extreme violence – including murder, attempted murder, rape, torture, violent kidnapping – or content that shows self-harm or suicide or explains how to do it.

Anyone of any age who lives in Australia can report illegal and restricted online content to eSafety. 

Check other information you need to know before reporting a complaint to eSafety, including the steps to follow, how to fill out the online form and what happens after reporting.

You can find out more about illegal and restricted online content or read detailed information about the Online Content Scheme and the Abhorrent Violent Conduct Powers regulated by eSafety.

