Basic Online Safety Expectations

The Basic Online Safety Expectations, known as ‘the Expectations’ or ‘BOSE’, are a key element of the Online Safety Act.

They outline the Australian Government’s expectations that social media, messaging and gaming service providers and other apps and websites will take reasonable steps to keep Australians safe.

Under the Online Safety Act, eSafety can require online service providers to report on how they are meeting any or all of the Expectations. These notices can require non-periodic reporting (one-off reports) or periodic reporting (multiple reports at regular intervals). The obligation to respond to reporting requirements is enforceable and backed by civil penalties and other mechanisms. The requirements are designed to improve providers’ safety standards and improve transparency and accountability. eSafety can also publish statements about the extent to which services are meeting the Expectations. 

eSafety has issued eight periodic notices, seven information requests and 23 non-periodic notices to online service providers about how they are meeting the Expectations on their services. These have focussed on child sexual exploitation and abuse material and activity (CSEA), online hate, the use of social media by Australian children, terrorist and violent extremist material and activity (TVE) and other illegal and harmful material such as self-harm and pornography.

Read the full reports on our Responses to transparency notices page.

Find out more about the regulatory guidance for providers, how to comply with the Expectations and respond to mandatory reporting requirements. 

Transparency notices given

eSafety has given transparency notices under the Online Safety Act to four companies providing the artificial intelligence (AI) companion services Character.AI, Nomi, Chai and Chub.AI.

The notices require these providers to report on how they’re meeting the Basic Online Safety Expectations and the steps they’re taking to protect children from exposure to a range of harms. This includes sexually explicit conversations and images, suicidal ideation and self-harm content, as well as child sexual exploitation and abuse material on their services.

Further information will be shared once this regulatory process has concluded. 

Summary of the Expectations

Some of the Expectations for providers include ensuring that:

  • all end-users can use online services safely
  • children’s best interests are a primary consideration in the design and operation of services likely to be used by children
  • certain features of a service, such as encrypted services, anonymous accounts, generative artificial intelligence (AI) and recommender systems, can be used safely
  • the provision of unlawful and harmful material and activity is minimised
  • end-users can make reports and complaints about unlawful and harmful material and activity and that the service will review and respond to these reports
  • the service has terms of use, policies and procedures to ensure safe use, and that it enforces these terms.

Reasonable steps

The Online Safety (Basic Online Safety Expectations) Determination 2022 (‘the Determination’) includes examples of reasonable steps that online service providers may take to meet the Expectations. These include:

  • undertaking assessments of safety risks and impacts, and implementing safety review processes, throughout the design, development and deployment of the service
  • making sure the default privacy and safety settings of services used by children are robust and set to the most restrictive level
  • continually improving technology and practices relating to the safety of end-users
  • providing educational and explanatory tools to end-users
  • working with other online service providers to detect high volume, cross-platform attacks (also known as ‘volumetric’ or ‘pile-on’ attacks)
  • incorporating processes that require verification of identity or ownership of accounts
  • implementing appropriate age assurance mechanisms
  • publishing regular transparency reports that outline the steps the service is taking to ensure safe use of the service.

Providers should be prepared to report on the steps they have taken, why they are reasonable, and how they help to meet the relevant Expectation(s) and keep people safe.

Further steps that service providers can take can be found in our Regulatory Guidance.

Guidance

The Basic Online Safety Expectations Regulatory Guidance was updated in July 2024 to reflect the amendments as a result of the Online Safety (Basic Online Safety Expectations) Amendment Determination 2024.

It was updated again in January 2025 to reflect that the new Administrative Review Tribunal (ART) replaced the Administrative Appeals Tribunal (AAT).

Where there is a connection between the Expectations and other eSafety work streams – such as the industry codes and the age verification roadmap – we will aim to ensure alignment and consistency across these elements and to apply learnings from the different engagement processes.

eSafety encourages providers to review these resources.

Reporting

There are three different ways eSafety can seek information from providers regarding compliance with the Expectations:

  1. Giving a reporting notice to an online service provider requiring them to produce a report about their compliance with any or all of the Expectations. These notices are enforceable, backed by civil penalties and other enforcement mechanisms. They can require non-periodic (one-off) reporting or periodic reporting over a specified time frame of six to 24 months.
  2. Making a reporting determination – a legislative instrument – requiring periodic or non-periodic reporting for a specified class of services. These determinations are enforceable and backed by civil penalties and other enforcement mechanisms if the provider fails to report.
  3. Requesting information about terms of use breach complaints, the time frame for responding to removal notices, measures taken to make sure people can use the service in a safe manner, the performance of online safety measures and the number of active end-users of a service in Australia. Failure to comply would give the Commissioner discretion to prepare a statement.

Why does eSafety publish this information in summaries?

By highlighting what we have learned from transparency notices and information requests, eSafety aims to give researchers, academics, the media and the public the information they need to scrutinise industry’s efforts, encourage implementation of the Expectations, and lift safety practices, protections and standards across the industry.

eSafety recognises that each provider is different, with different architectures, business models and user bases. This means an intervention, or the use of specific tools, that is proportionate on one platform may not be proportionate on another.

However, seen together, these reports represent a significant step towards greater transparency and understanding of what providers are and are not doing to protect Australians online.