Report to popular sites

Generally, sites with well-articulated policies, terms of use and reporting mechanisms will also have safety agents, sometimes referred to as moderators, who will triage and assess the reports and take action.

Because of this, reporting image-based abuse to the website or social media site where the images or videos are posted can be the quickest way to have the images taken down.

We have identified many of these services and have provided direct reporting links for you to use below.

Direct reporting links

After you report or request removal, the service may remove the content straight away. They may also block the person who posted the image or delete their account. Sometimes a service may ask you for proof that you are the person in the picture, for example, by sending them a scanned copy of your ID. If you’re uncomfortable with providing ID to the service, consider reporting to us.

Staying safe

If the image-based abuse is part of domestic violence or other abuse, staying safe is your number one priority. Learn more and connect with support.

What do I do if my image is not taken down?

If you have reported to one of the sites or social media services listed below and your image has not been removed, you can make a report to the Office of the eSafety Commissioner and we will do our best to help you get the image removed.

Report to the Office

Remember to preserve evidence

It is understandable that victims of image-based abuse want the images or video to be taken down immediately. But it is important to preserve evidence before you request removal. You may need this evidence if you wish to take legal action and it can also assist when you report the images and video using the links below.

How to collect evidence

Direct reporting links for popular websites and social media services

The websites and social media services listed below have formal removal processes for image-based abuse. Find the site or service on which the image or video of you was posted to make a report.

Bing, OneDrive, Xbox Live (Microsoft)

Microsoft will remove links to intimate images and videos that have been shared without consent when they are notified by the individual. The content is removed from search results in Bing globally and will also be removed if it is shared via OneDrive or Xbox Live.

Blogger

Non-consensual sharing of private, nude or explicit images and videos can be reported by the individual who is shown in the content or by their legal representative.

Facebook

Facebook will remove images or videos that have been shared in an act of revenge or without permission from the people in the images. Facebook will also take action if threats are made to share intimate images. Facebook uses photo-matching technology to help stop future attempts to share the image on Instagram, Facebook and Messenger.

Flickr

Non-consensual pornography is not permitted on Flickr, which is one of Yahoo’s services. Yahoo will investigate and remove private images and videos that have been posted without consent on Flickr.

Google

Google will remove non-consensual sharing of nude and explicit images and videos from Google’s search results. They will only remove images or videos that meet certain requirements, so please review their reporting process and include all the requested information. You will find more information about the reporting process under ‘Tips for filling out the image removals form’ at the links below.

Grindr

Grindr does not allow video, audio, photographs or images of any person under 18, or of any other person without their express permission, on personal profile pages. It also does not allow offensive or pornographic materials on profile pages. Grindr will also remove impersonation accounts.

Imgur

Imgur will remove nudity and sexual content, including images taken of someone without their knowledge or consent. Sexually explicit comments may also be removed. You can email Imgur at abuse@imgur.com or use their online reporting form.

Instagram

Instagram does not allow the sharing of sexual content, nudity, or threats to post intimate images of others. Instagram uses photo-matching technology to help stop future attempts to share certain images on Instagram, Facebook and Messenger, which are all owned by Facebook.

Pinterest

Pinterest will remove content that sexually exploits people. Content will be removed from both secret and public boards.

Reddit

Reddit prohibits the posting of photographs, videos, or digital images of users in a state of nudity or engaged in any act of sexual conduct, taken or posted without their permission.

Snapchat

Snapchat will remove content that invades people’s privacy, including pornographic content and content that has been taken of people without their consent or knowledge.

Tinder

Tinder may investigate or terminate an account that impersonates any person, posts images of another person without their consent or posts content that is threatening, sexually explicit or contains nudity.

Tinder has an in-app reporting function: to report someone, go to their profile, tap the Menu icon (ellipsis icon) and tap Report.

Tumblr

Tumblr will take action against non-consensual pornography or private photos or videos that have been taken or posted without a person’s consent.

Twitter

Posting intimate photos or videos that were taken or distributed without a person’s consent is a violation of Twitter’s rules. Content that is reported will be removed.

Whisper

Nude or intimate pictures are not permitted on Whisper. Whisper will ban users, suspend accounts and remove content that violates their guidelines.

Yahoo

Non-consensual pornography is not permitted on Yahoo’s services and they will investigate and remove private images and videos that have been posted without consent.

YouTube

There are a number of categories that are relevant when flagging videos for removal on YouTube in the context of image-based abuse. YouTube does not allow pornographic or sexually explicit content on its site and will remove such content when made aware of it. Content that has been uploaded to YouTube with the intent to humiliate and/or personally identify an individual will also be removed under YouTube’s harassment policies. To flag a video for removal from YouTube, use the flag option underneath the specific video.

What if the site the image or video was posted on is not included here?

You can check to see whether the site is included in the Cyberbullying Research Center’s regularly updated list of contact information for social media apps, gaming networks, and related companies. This includes links to support and resources for bullying, harassment and threats.

Cyberbullying Research Center

If the site your image or video was posted on is not listed above, you will find information and safety advice on how to report to other sites at the link below.

How to report to unlisted sites

If the image-based abuse is part of domestic violence or other abuse, staying safe is your number one priority

If image-based abuse is being used to threaten, blackmail or control you or someone else, seek support before you request removal of an image. This behaviour is known as sextortion and may have legal consequences.

We understand the urgency in requesting the removal of the content. However, if the perpetrator is potentially violent or abusive, you may want to speak to a support service such as 1800RESPECT, police or a legal service to ensure you have a safety plan in place before you request the removal of images and video.

You will find more options for support and counselling services in the support section of this website.