How to report adult cyber abuse
Every situation is unique and eSafety is committed to helping all Australians who experience adult cyber abuse.
Harmful content can often be removed by the online or electronic service or platform that was used to send, post or share it. In the most serious cases, if a service or platform does not help you within 48 hours, eSafety can direct them to remove the harmful content.
Even if we can’t investigate your particular case, eSafety still has lots of information to help you protect yourself, deal with adult cyber abuse, and find counselling and support.
What you can report
For eSafety to investigate adult cyber abuse, the harmful content must have first been reported to the service or platform used to send, post or share it – at least 48 hours before it is reported to eSafety. This is often the fastest way to have the content removed. The eSafety Guide explains how to report a complaint to common services and platforms, including social media, online games and other apps.
Also, the harmful content must meet the legal definition of 'adult cyber abuse'. This means it must be both:
- intended to cause serious harm, and
- menacing, harassing or offensive in all the circumstances.
‘Serious harm’ means that the content is likely to have a severe negative impact on the physical or mental health of the person targeted. The impact can be temporary, long lasting or permanent. It includes serious distress that goes beyond ordinary emotional reactions such as anger, fear or grief.
The ‘serious harm’ threshold for adult cyber abuse investigations is set deliberately high so that it balances freedom of speech, or legitimate expressions of opinion, against the need to protect everyone’s ability to participate online. The Adult Cyber Abuse Scheme is not intended to regulate hurt feelings, purely reputational damage, bad online reviews, strong opinions or banter.
Harmful adult cyber abuse content can be sent, posted or shared in many ways. For example, it can be a post, comment, text, message, chat, livestream, meme, image, video or email. It can use:
- a social media service, or
- a relevant electronic service such as an email service, chat service, instant messaging service or an online game where users play against each other, or
- a designated internet service such as a website.
Find out more about what adult cyber abuse is and how it is defined under the Online Safety Act 2021.
Types of adult cyber abuse that are likely to meet eSafety's threshold for investigation include:
- being harassed and threatened with violence because of your physical appearance, religion, gender, race, disability, sexual orientation or political beliefs
- finding your personal contact details have been made public on a social media service or other online platform in order to scare, harass or attack you
- being threatened with serious harm and other people online being encouraged to join in
- being stalked and threatened online, particularly in the context of domestic and family violence
- being encouraged to harm yourself, especially if you are known to be at particular risk (for example, if you have a mental health condition)
- repeatedly being sent obscene and threatening messages as part of ongoing harassment.
Who can report
A complaint about adult cyber abuse may be reported by the person targeted by the abuse, or another person who is authorised to report it on their behalf.
The harmful content must target a person who is 18 years or older, who ordinarily lives in Australia. eSafety cannot help people who live in other countries.
Also, the harmful content must target one specific person, not a broad range or group of people. For example, eSafety cannot act on behalf of an organisation that is experiencing racist or misogynist abuse online, as this abuse is directed at a group rather than a specific person. In cases like this, the Australian Human Rights Commission may be able to help you.
If the person targeted is a child or young person under 18, you may be able to report a cyberbullying complaint instead.
This checklist will help you work out if eSafety can investigate your experience.
- The person targeted: A person ordinarily living in Australia who is 18 or older.
- The threshold for investigation: It must be severely abusive online content that was sent, posted or shared with the likely intention of harming the person targeted, and the content must be menacing, harassing or offensive.
- When to report it to eSafety: The harmful content must have been reported to the service or platform at least 48 hours ago.
- Who can report it: The person targeted, or a person who is authorised to report it on their behalf.
- Possible outcomes: Removal of harmful content, fines or penalties for services or platforms that don’t remove the content, fines or penalties for the person responsible if they don’t remove the content, further legal action.
Steps to report adult cyber abuse
To make a report to eSafety, an online service or the police, you will need to collect evidence of what has happened and where. This can include noting information like the web page address (URL) and the other person’s user profile, or taking screenshots.
It is important to collect the evidence first, because it can be difficult to prove what happened once the person who targeted you is blocked.
Find out more about how to collect evidence.
Report harmful content
Report the harmful content to the service or platform used to send, post or share it. This is often the fastest way to have the content removed. You can find reporting links for many services and platforms, including social media, online games and other apps in The eSafety Guide.
If the harmful content is serious enough to meet the legal definition of adult cyber abuse, and the service or platform does not remove it within 48 hours, you can report to eSafety using our online form and we will help to have it removed. We will ask you for evidence that you have complained to the service or platform first, such as a receipt, reference number or report number.
You can also report serious online abuse to the police. This is very important if someone is threatening you, or your family or friends. Find out more about how to get police and legal help.
Prevent further contact
Use in-app functions to ignore, mute or block the other person and update your privacy settings. The eSafety Guide has advice on key online safety functions for many online services, including social media, online games and other apps.
Get more help
Experiencing serious online abuse, or helping someone who has experienced it, can be very distressing.
You may find it helpful to use the strategies we recommend for managing the impacts of adult cyber abuse, including tips for taking care of your wellbeing.
You can also find counselling and support that is right for you.
What eSafety does next
You can expect to hear from us within two business days, but in many cases we will contact you sooner – especially if we need further information.
One of eSafety’s investigators will assess your complaint to check if it fits the legal definition of adult cyber abuse and meets the threshold for further action. When we have completed our assessment of your complaint, we will notify you of our approach using the contact details you supplied. We will also notify you if we decide not to investigate your report.
If we can investigate
When helping to have adult cyber abuse content removed, eSafety may make informal requests or take formal actions.
eSafety will often approach the online or electronic service or platform informally to ask them to remove the adult cyber abuse content. We have found this generally results in faster removal of harmful content, especially when it breaches their own terms of service.
We may also report the accounts of the person who targeted you to the online service or platform to take appropriate action – this could result in deletion of the person’s accounts.
We generally adopt a graduated approach to formal actions, starting with the least severe.
- Service provider notification – this puts the service or platform on notice, letting them know that eSafety is aware adult cyber abuse content is available on their service.
- Removal notice – this can be sent to the person responsible for the adult cyber abuse and/or the service or platform. It requires the recipient to take all reasonable steps to remove the adult cyber abuse content within 24 hours (or a longer timeframe specified by eSafety).
- Enforcement action – we can issue a formal warning to the person responsible for the adult cyber abuse, fine them or seek a civil penalty order against them if they do not comply with a removal notice. If the service or platform does not comply with a removal notice, we can take legal action such as requesting an enforceable undertaking or court injunction.
Find out more about the full range of actions eSafety can take by reading about the Adult Cyber Abuse Scheme.
If we can’t investigate
We will notify you if we decide we can’t investigate your report, and we will try to help you in other ways such as:
- providing tips and information for avoiding or minimising the impact of abusive content
- directing you to resources and other organisations or agencies that may be able to provide further support.
Find out more about other options if eSafety can't investigate.
Can I have eSafety’s decision reviewed?
You can request a review if eSafety decided not to issue a removal notice for content that meets the definition of adult cyber abuse. The review will first be handled internally, by another eSafety staff member. If you are still dissatisfied, you can seek a review by the Administrative Appeals Tribunal.
Find out more about review rights, including who can seek a review.
Report serious adult cyber abuse to eSafety
If your experience meets the legal definition of adult cyber abuse, and the service or platform does not help you within 48 hours, you can report the harmful content to eSafety using our online form.