Frequently asked questions about access to online porn and other adult content

The industry codes aim to better protect children from a range of harmful and age-inappropriate content, including pornography, extreme violence, disordered eating and self-harm.

These frequently asked questions (FAQs) will help you understand how the codes will work in practice, including how they were developed, the age checks required, and what it means for searching or viewing adult content.

Learn more about how the industry codes and standards help to protect Australians from illegal and restricted online content.

The Australian Government has determined that pornography is legal to view in Australia, but access to it is restricted to adults.

Age restrictions for pornography and other high-impact material are a lot like those for cigarettes, alcohol and public gambling. They allow the community to protect children from things that might endanger their immediate safety or harm their long-term health and development.

Through the Online Safety Act, Parliament required the online industry to provide Australian children greater protection from harmful and age-inappropriate content.

This is exactly what new industry codes registered by eSafety set out to achieve: reducing the risk of children being exposed to material they’re not ready to see.

No. You’ll still be able to use a search engine without logging in.

However, if you’re not logged into an account and your search returns pornographic or extremely violent images, these will be blurred by default.

If you enter a search relating to suicide or self-harm, any material promoting this will be downranked, while reliable health information and support services will be promoted.

No. These steps formalise many practices already adopted by major search engine providers.

Other countries are also implementing age-assurance measures across a range of online platforms and services to protect children from exposure to pornography and other harmful and age-inappropriate content.

No. Age checks are managed by the service, not by the Australian Government or eSafety.

The codes do not require people to be identified or to have their identity linked to their activity, nor do they require or allow any such data to be shared with the Government.

All companies subject to codes must continue to comply with any applicable privacy laws, including the Privacy Act and its privacy principles.

No. Not all services are required to check a person’s age.

Age checks will be required on certain ‘high-risk’ services and platforms featuring harmful and age-inappropriate content, including:

  • Pornography and other adult websites.
  • App stores when you want to download R18+ apps.
  • Social media services that allow online pornography, self-harm material or high-impact violence.
  • Artificial intelligence (AI) chatbots or generative AI services that are capable of generating sexually explicit, self-harm or violent material without appropriate safeguards.
  • Messaging services or online games rated R18+.

For those services and providers deemed ‘medium’ or ‘low risk’, there’s minimal or no change to how you access or use them.

In some cases, you might be asked to state your age so extra safety protections and filters can be automatically applied. For example, mobile phones, tablets and laptops will prompt you to provide the age of the person using them. However, you or your child will not need to provide any proof of age. It’s an opt-in safety system.

Beyond age checks, there are further requirements placed on most services and providers to improve safeguards and reporting tools for the benefit of all Australians, especially under-age users. These safety improvements can include:

  • improving their safety tools and features over time
  • providing online safety resources that are clear and accessible to Australians
  • ensuring Australians can easily report or complain about any perceived breaches of the codes.

Under the codes, you may be asked to confirm your age when:

  • Accessing adult websites (for example, pornography sites).
  • Downloading or using apps that are rated R18+, including simulated gambling apps.
  • Using social media services or features that allow online pornography, self-harm material or high-impact violence.
  • Downloading or using AI chatbots or generative AI services that could generate sexually explicit, self-harm or violent material without appropriate safeguards.
  • Downloading or using messaging services or online games rated R18+.

For ‘high risk’ services or providers required to carry out age checks under the codes, the method they use is up to them, so long as they meet the definition of ‘appropriate age assurance’.

Methods they might use include:

  • confirmation of age by a parent
  • photo identification
  • facial age estimation
  • credit card checks
  • digital identity wallets or systems
  • use of artificial intelligence technology to estimate age based on relevant data inputs
  • a third-party age-assurance vendor.

Whatever the chosen method, it must minimise the collection of personal information and comply with applicable privacy laws, including the Privacy Act.

No. There are a number of ways platforms or services can check you're over 18 under the codes.

It's up to the individual platforms to decide which methods they use. The only requirement is that they be effective and comply with any applicable privacy laws, including the Privacy Act and its privacy principles.

The codes specify that any measures introduced by services must comply with applicable privacy laws, including the Privacy Act.

The Australian Government’s independent Age Assurance Technology Trial also found age-assurance technology can be both effective and safeguard privacy.

The Office of the Australian Information Commissioner is the Australian privacy regulator. It can take action whenever it considers a party is in breach of Australian privacy law.

Yes.

The only time you would need to prove your age is if you’re using a messaging service specifically designed for sharing sexual content or sexual activity.

In all other cases, you can message or text sexual content to your partner or another person with their consent. Most messaging services also offer tools that can automatically blur sexual content if you want that protection in place.

The internet wasn’t designed with children in mind, even though children and adults both rely on the internet and digital devices for study, work, connection, relaxation and self-expression.

eSafety research shows children and young people are regularly exposed to harmful and age-inappropriate content, including pornography, high-impact and extreme violence, disordered eating and deliberate self-harm.

For example, our report ‘Accidental, unsolicited and in your face’ found around 10% of children have accidentally stumbled across online pornography by the age of 10. This climbs to almost 30% by age 13. Two in five young people also said their first exposure to pornography happened while they were doing something else online, such as searching for other content, visiting a gaming site or checking their social media feed.

In addition to pornography, many children report encountering other harmful and age-inappropriate content online:

  • 44% of children aged 10 to 17 years had seen content encouraging unhealthy eating or exercise habits.
  • 27% had seen content showing or encouraging illegal drug taking.
  • 22% had seen extreme real-life violence.
  • 19% had seen material suggesting ways to self-harm or suicide.
  • 12% had seen violent sexual images or videos.

The purpose of these codes is to require certain high-risk online services featuring this content to check people’s ages before they access or view it.

While the codes will provide stronger protections for children, they also require services to give all Australians information, tools and options to limit their exposure to this sort of content.

The codes apply across the following sections of the online industry:

  • App distribution services – also called app stores.
  • Designated internet services – a broad category covering online services (mainly websites) that provide entertainment, education or information. It also covers some generative AI services and AI model distribution platforms.
  • Equipment providers – including operating software providers. This typically covers phones, tablets, laptops and other portable devices that allow direct interaction between people and let you search the internet.
  • Hosting services – covers the servers and infrastructure that make websites or online services accessible on the internet.
  • Internet service providers – including phone and home broadband services.
  • Relevant electronic services – covers email, messaging or online chat (including dating services), as well as services for playing online games together.
  • Search engines – such as Google or Bing.
  • Social media services.

You will start to see some changes from December 2025.

For search engine services, internet carriage services and hosting services, most changes will take effect from 27 December 2025.

For designated internet services, relevant electronic services, social media services, app distribution platforms and equipment providers, most changes are expected to take effect from 9 March 2026.

The Australian Government’s independent Age Assurance Technology Trial found geolocation technology has the potential to strengthen age assurance by helping detect when someone might be trying to mask their real location, for example by using a Virtual Private Network (VPN).

Services could also consider using other signals or data they already access or collect to determine if a person is likely to be in Australia.

This depends on the service and whether it’s designated as ‘high risk’ by industry.

Some services may allow one-off age checks that apply across future sessions or across connected services.

Others may require session-by-session or service-by-service confirmation.

Providers must balance effectiveness with usability and privacy.

Industry bodies that drafted the codes said this was the best approach to implementing age assurance measures at this time.

No. These codes were drafted by industry bodies representing the online industry.

In June 2024, eSafety published a Position Paper setting out expectations for what it hoped the codes might achieve, but ultimately industry was responsible for deciding what to include or exclude in its submitted codes.

As required under Australia’s Online Safety Act, the eSafety Commissioner consulted on the development of the codes and assessed the drafts to ensure they provided appropriate community safeguards against harmful and age-inappropriate content for children under 18 years.

The Commissioner found they did and then registered the codes, making them enforceable under the Act.

Social media services that allow online pornography or self-harm material must ensure people are 18 years or older before allowing access to this material.

Services that do not allow online pornography, high-impact violence or self-harm material according to their own terms of service must detect and remove this material, as well as keep improving their detection systems over time.

Services must provide better options for all Australians to reduce their risk of exposure to online pornography, self-harm material and high-impact violence.

Services with AI companion chatbot features must assess the risk of children generating sexually explicit material, self-harm material and high-impact violence. Chatbots with the highest risk must ensure people are 18+, while those with moderate risk must put safety guardrails in place.

The codes protect children under 18 from harmful and age-inappropriate content across a wide range of services, while the social media age restrictions primarily protect children under 16 from harmful social media features, such as algorithms and recommender systems.

The codes complement the social media age restrictions by adding an extra layer of protection on services not covered by that framework, as well as making age-restricted social media services safer for under 16s who bypass controls to gain or keep accounts. They also give Australians of all ages tools to avoid pornographic and other high-impact material if they don’t want to see it.

You can find more information about the social media age restrictions on the eSafety website.

When the Australian Parliament legislated the Online Safety Act with bipartisan support in 2021, it didn’t give itself the power to review industry-drafted codes in the same way it can review eSafety-drafted standards.

As provided for under the Act, the codes were written by industry and accepted by eSafety. This is a form of industry co-regulation, which is common in Australia, along with self-regulation. Both can often occur without Parliament reviewing the text of every code. Examples include workplace health and safety, and the telecommunications and traditional media industries.

What the Act does require is for eSafety to ask the online industry to draft codes that will provide Australians with greater protections from online harms. If the eSafety Commissioner is satisfied these codes provide appropriate community safeguards, the Act states the Commissioner can register them. That is what occurred with these codes.

In cases where the Commissioner is not satisfied codes provide appropriate safeguards, the eSafety Commissioner can draft a standard. Any such standard will then be subject to a period of Parliamentary scrutiny and the potential for disallowance. That did not occur in this case.

This is the process Australia’s Parliament legislated when it passed the Act with bipartisan support in 2021.

First, report the content directly to the service where you saw it. This is often the fastest way to get it removed. If the platform fails to act, report it to eSafety.

You can also make a complaint to eSafety if you suspect a breach of an industry code or standard that is in place. The codes and standards are mandatory and enforceable.

Please note: a single instance, or a handful of instances, of content doesn’t necessarily amount to a breach. The codes and standards are about services having systems and processes in place, not about individual items of content.

A breach of a direction to comply with a code may result in civil penalties of up to $49.5 million.

The existing industry codes and standards deal with illegal and restricted material, including child sexual abuse and exploitation material, pro-terror content and extreme crime and violence. They can also cover crime, violence and drug-related content.

These new codes focus on harmful and age-inappropriate content. This includes pornography and sexually explicit content, crime and violence, drug use and drug-related content, suicide and self-harm, eating disorders and sexual violence.