Social media 'ban' or delay FAQs

Find out the facts about the social media age restrictions that will help keep Australians under 16 safer. 

These frequently asked questions will be added to and updated throughout 2025.

Download our poster, flyer, presentation and social tile, and help share accurate and helpful information about the social media age restrictions in your community or workplace.

You can also find advice on how to talk with your child about the changes.

Why are under-16s being ‘banned’ from social media?

It’s not a ban, it’s a delay to having accounts. 

Age-restricted platforms won’t be allowed to let under-16s create or keep an account. That’s because being logged into an account increases the likelihood that they’ll be exposed to pressures and risks that can be hard to deal with. These come from social media platform design features that encourage them to spend more time on screens and make it more likely that they will see negative, upsetting or manipulative content. 

For example, the pressure to respond to streams of notifications and alerts, and to view disappearing content, has been linked to health harms, including reduced sleep and attention and increased stress levels. Over-exposure to harmful content can also impact immediate and long-term health and wellbeing.

While most platforms currently have a minimum age of 13 for account holders, delaying account access until 16 will give young people more time to develop important skills and maturity. It’s breathing space to build digital literacy, critical reasoning, impulse control and greater resilience. 

It also means there’s extra time to teach under-16s about online risks and the impacts of harms, as well as how to stay safer online and seek help when they need it. This will give young people a better chance to prevent and deal with issues once they turn 16 and can have full social media accounts.

As the law will apply to all under-16s, parents and carers will no longer need to choose between allowing them to set up accounts on platforms that may negatively affect their health, or making sure they are not socially excluded. No under-16 has to feel like they’re ‘missing out’. Parents and carers won’t have to say ‘yes’ or ‘no’ to social media accounts – instead, they can say ‘not yet’.

What will age-restricted social media platforms have to do to comply with the law?

Age-restricted platforms will be expected to take reasonable steps to:

  • find existing accounts held by under-16s, and deactivate or remove those accounts
  • prevent under-16s from opening new accounts
  • prevent workarounds that may allow under-16s to bypass the restrictions
  • have processes to correct errors if someone is mistakenly missed by or included in the restrictions, so no one’s account is removed unfairly.

Platforms should also provide clear ways for people to report underage accounts, or to request a review if they have been age-restricted by mistake.

Age-restricted platforms are also expected to give users under 16 information about how to download their account information in a simple and seamless way before their account is deactivated or removed, or how to request access to their information within a reasonable period after deactivation. The information should be provided in an easily accessible format. Platforms should consider formats that allow end-users to transfer their information and content to other services, or to upload it to the same platform if they sign up again after turning 16.

The full expectations of platforms are set out in the Social Media Minimum Age Regulatory Guidance.
 

When will the age restrictions start?

After the law takes effect on 10 December 2025, Australians can expect to see age-restricted social media platforms taking steps to stop under-16s setting up or continuing to use accounts.

eSafety recognises this is a complex task, so we’re already consulting with social media platforms about their preparations for introducing effective methods to prevent and remove underage account holders.

We expect age-restricted social media platforms to comply with the requirements once they take effect. We will continue to work with industry to ensure age-restricted social media platforms implement reasonable steps.

Which platforms will be age-restricted?

eSafety has informed Facebook, Instagram, Snapchat, Threads, TikTok, X, YouTube, Kick and Reddit of its view that they are age-restricted platforms and therefore required to take reasonable steps to prevent Australians under 16 from having accounts on their services from 10 December 2025. 

Services that eSafety considers do not currently meet the criteria for being an 'age-restricted social media platform', or that fall within an exclusion in the legislative rules, include Discord, GitHub, Google Classroom, LEGO Play, Messenger, Roblox, Steam and Steam Chat, WhatsApp and YouTube Kids.

These lists reflect eSafety’s views as at 5 November 2025. We will continue to update them prior to the Social Media Minimum Age obligation coming into effect on 10 December 2025. In addition, eSafety may assess new services that emerge or reassess existing ones if they evolve to the extent that their purpose changes, so these lists may continue to change. Find the latest details about which platforms are age-restricted.

General conditions for age restrictions

More generally, age restrictions will apply to social media platforms that meet four specific conditions, unless they are excluded based on criteria set out in legislative rules made by the Minister for Communications in July 2025.

The conditions for age restriction are:

  • the sole purpose, or a significant purpose, of the service is to enable online social interaction between two or more end-users
  • the service allows end-users to link to, or interact with, other end-users
  • the service allows end-users to post material on the service
  • material on the service is accessible to, or delivered to, end-users in Australia.

Platforms that have the sole or primary purpose of enabling messaging or online gaming are among a number of types of services that have been excluded under the legislative rules.

Multiple purpose platforms

It is important to note that many platforms have multiple purposes.

For example, some messaging services have social-media style features that allow users to interact in ways other than messaging. If widespread use of these features changes the primary purpose of the service, it may be included in the age restrictions.

Also, online gaming services that enable online social interaction through features and functions such as direct messaging, group chats and livestreaming may be included in the age restrictions if the service’s sole or primary purpose changes.

The way online services are used can change over time, many services have multiple purposes, and new services are constantly being developed. So the platforms which are age-restricted may change depending on whether they start to meet, continue to meet or no longer meet the legislative rules for exclusion.

Compliance and enforcement

eSafety does not have a formal role in declaring which services are age-restricted social media platforms. However, platforms we believe to be age-restricted will be informed of this by eSafety, to help them understand their legal obligations under the Online Safety Act. We will also share our view publicly. These platforms must comply with the law when the minimum age requirements take effect on 10 December 2025.

eSafety’s view will also underpin our decisions to use enforcement powers under the Online Safety Act if a platform does not take reasonable steps to implement the age restrictions. Where a service disagrees with eSafety’s use of these powers, it has the usual legal rights to challenge and seek review of our decisions.

As noted in our Regulatory Guidance, eSafety will take a proportionate and risk-based approach to compliance, initially focusing on services with the greatest number of end-users and the highest risks of harm, while accounting for the steps providers are implementing to prevent the youngest users from having accounts.

Which platforms have been excluded from the age restrictions?

Legislative rules excluding certain types of online services were made by the Minister for Communications following advice from the eSafety Commissioner and consultation with youth groups, parents, carers, the digital industry and civil society groups, as well as experts in child development, mental health and law.

The exclusions apply to:

  • services that have the sole or primary purpose of messaging, email, voice calling or video calling
  • services that have the sole or primary purpose of enabling users to play online games with other users
  • services that have the sole or primary purpose of enabling users to share information about products or services
  • services that have the sole or primary purpose of enabling users to engage in professional networking or professional development
  • services that have the sole or primary purpose of supporting the education of users
  • services that have the sole or primary purpose of supporting the health of users
  • services that have the sole or significant purpose of facilitating communication between educational institutions and students or student families
  • services that have the significant purpose of facilitating communication between health care providers and people using those services.

As of 5 November 2025, services that eSafety considers do not meet the criteria for being an 'age-restricted social media platform', or that fall within an exclusion in the legislative rules, include Discord, GitHub, Google Classroom, LEGO Play, Messenger, Roblox, Steam and Steam Chat, WhatsApp and YouTube Kids. We will continue to update this list prior to the Social Media Minimum Age obligation coming into effect on 10 December 2025. Find the latest details about which platforms are age-restricted.

Multiple purpose platforms

It is important to note that many platforms have multiple purposes. For example, some messaging services have social-media style features that allow users to interact in ways other than messaging. If widespread use of these features changes the primary purpose of the service, it may be included in the age restrictions.

Also, online gaming services that enable online social interaction through features and functions such as direct messaging, group chats and livestreaming may be included in the age restrictions if the service’s sole or primary purpose changes. 

The way online services are used can change over time, many services have multiple purposes, and new services are constantly being developed. So the platforms which are age-restricted may change depending on whether they start to meet, continue to meet or no longer meet the legislative rules for exclusion. 

The legislative rules are supported by an explanatory statement, which provides some details about how eSafety should assess a platform’s sole, primary or significant purpose. The factors eSafety is to consider include:

  • the features and functions of the platform
  • how they are deployed and influence user engagement and experiences
  • the actual use of the platform, in addition to what the platform may say its intended purpose is.

Will under-16s who already have accounts be allowed to keep using them?

No. Age-restricted social media platforms will have to take reasonable steps to find and remove or deactivate accounts held by under-16s.

'Reasonable steps' means platforms have to implement the restrictions in a way that is just and appropriate in the circumstances. eSafety has developed regulatory guidelines to help platforms deactivate accounts using an approach that is as safe and supportive as reasonably possible. The guidelines are informed by a broad evidence base, including lessons learned through the Australian Government’s Age Assurance Technology Trial and the outcomes of stakeholder consultations. The Office of the Australian Information Commissioner has provided guidance on privacy.

Will underage users be able to reactivate their old accounts when they turn 16?

Possibly, but it depends on the platform. Age-restricted social media platforms will have to take reasonable steps to find and remove or deactivate accounts held by under-16s.

Instead of removing accounts, some platforms may deactivate them so they can be reactivated with all their existing data when the user turns 16. However, users should not rely on platforms to provide this option. It’s best for under-16s to download any data they want to save, including connections, posts, chats, photos and videos, before 10 December 2025.

Will there be penalties for under-16s if they get around the age restrictions?

There are no penalties for under-16s who access an age-restricted social media platform, or for their parents or carers. 

This is about protecting young people, not punishing or isolating them. The goal is to help parents and carers support the health and wellbeing of under-16s.

On the other hand, age-restricted social media platforms may face penalties if they don’t take reasonable steps to prevent under-16s from having accounts on their platforms. 

What are the penalties for age-restricted platforms that allow under-16s to have accounts?

A court can order civil penalties for platforms that don’t take reasonable steps to prevent underage users from having accounts on their platforms. This includes court-imposed fines of up to 150,000 penalty units for corporations – currently equivalent to a total of $49.5 million AUD.
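
For context, the value of a Commonwealth penalty unit is currently $330, so the maximum fine works out to 150,000 penalty units × $330 = $49.5 million.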

'Reasonable steps' means platforms have to act to enforce the restrictions in a way that is just and appropriate in the circumstances. They will be in breach of the law if they show an unreasonable failure to prevent underage access to accounts. 

eSafety is already working with the key platforms where we know Australian children are present in large numbers, and where there are features associated with risks to children. By working with platforms now, eSafety is taking steps to ensure they are getting ready for the social media age restrictions.

eSafety will monitor compliance and enforce the law. This will be done through a range of regulatory powers provided in the Online Safety Act. 

Won’t the age restrictions stop under-16s from accessing important benefits of being online?

Under-16s will still be able to use online services, sites and apps that are not covered by the social media age restrictions. 

The Australian Government is mindful of the need to balance safety with a broader range of digital rights. Under-16s will still be able to explore and express themselves on platforms that are not age-restricted, allowing connection, creativity, learning, health advice and entertainment. 

In addition, platforms that have the sole or primary purpose of enabling messaging or online gaming are among a number of types of services that have been excluded from the age restrictions under the legislative rules.

Under-16s will also continue to have access to online services that provide crucial information and support if they are experiencing distress. 

My child has never had a problem on social media, why should they miss out?

We know that young people are not all the same. They use a range of social media platforms in varying ways and with different exposure to risks of harm.  

However, the Australian Parliament voted for the restrictions for the good of all Australians under 16. The delay is similar to other age-based laws, such as restrictions on the sale of alcohol and cigarettes.

As the law will apply to all of them, parents and carers will no longer need to choose between allowing their under-16s to set up accounts on platforms that may negatively affect their health, or making sure they are not socially excluded. No under-16 has to feel like they’re ‘missing out’. Parents and carers won’t have to say ‘yes’ or ‘no’ to social media accounts – instead, they can say ‘not yet’.

Won’t under-16s still be able to see social media feeds without accounts?

Under-16s will still be able to see publicly available social media content that doesn’t require being logged into an account. As they won’t be logged in, they are less likely to be exposed to some of the harmful design features of social media.

For example, most content is currently available to view on YouTube without holding an account. 

Another example is that anyone can see some of Facebook’s content, such as the landing pages of businesses or services that use social media as their host platform. 

It’s the Australian Government’s intention that under-16s will also continue to have access to online services that provide crucial information and support if they are experiencing distress, which is why legislative rules have been made to exclude certain services.

How will under-16s be stopped from finding a way around the age restrictions?

Most social media services currently have a minimum age requirement for account holders, but often they don’t enforce it. That won’t be acceptable anymore. The new law requires age-restricted social media platforms to take reasonable steps to make sure under-16s can’t create or keep accounts.

There are systems and technologies that make this possible while preserving the privacy of users. Some are already being used by social media platforms.

Of course, no solution is likely to be 100% effective all of the time. We know that some under-16s may find their way around the age restrictions, just as some get around restrictions on cigarettes and alcohol.

But age-restricted platforms will have to take steps to stop under-16s getting around the law. This includes having ways to prevent under-16s from faking their age by using false identity documents, AI tools or deepfakes. It also means trying to stop under-16s from using VPNs to pretend to be outside Australia.    

Platforms may assess age-related signals which can help work out if someone is under 16. These signals can include:

  • how long an account has been active
  • whether the account holder interacts with content targeted at children under 16
  • analysis of the language level and style used by the account holder and the people they interact with
  • visual checks, such as facial age analysis of the account holder’s photos and videos
  • audio analysis, such as age estimation of the voice of the account holder
  • activity patterns consistent with school schedules
  • connections with other users who appear to be under 16
  • membership in youth-focused groups, forums or communities.

Platforms may also use location-based signals which can help work out if an account holder usually lives in Australia and could be using a VPN to pretend they don’t. These signals can include:  

  • IP address(es)
  • GPS or other location services
  • device language and time settings
  • a device identifier
  • an Australian phone number
  • app store, operating system or account settings
  • photos, tags, connections, engagement or activity.  

Evidence of these age and location signals is expected to trigger the age assurance process, or review of an account if it has already been checked. 

Will every Australian now have to prove their age to use social media?

No. eSafety does not expect a platform to make every account holder go through an age check process if it has other accurate data indicating the user is 16 or older.

For example, if someone has had an account since Facebook started in Australia in 2006, Meta could reasonably assume they are older than 16, so no further check is needed.
 

For Australians who do have to prove their age, what methods will be allowed?

There is a range of technologies available to check age, both at the point of account sign-up and later. It will be up to each platform to decide which methods it uses.

There are systems and technologies that make this possible while preserving the privacy of users. Some are already being used by social media platforms. 

eSafety has published regulatory guidance to help platforms decide which methods are likely to be effective and comply with the Online Safety Act. The guidelines draw on the Australian Government’s Age Assurance Technology Trial as well as stakeholder consultations, including our ongoing engagement with social media platforms that are likely to be restricted. The regulatory guidance also draws on our existing knowledge base, and includes principles that are consistent with similar international frameworks. The Office of the Australian Information Commissioner has provided guidance on privacy.

No Australian will be forced to provide a government-issued ID or use an Australian Government accredited digital ID service to prove their age. Age-restricted social media platforms will have to offer reasonable alternatives to users.

Will Australians who do have to prove their age be forced to use a government ID?

No. In fact, the Social Media Minimum Age legislation specifically prohibits platforms from compelling Australians to provide a government-issued ID or use an Australian Government accredited digital ID service to prove their age. 

Platforms may offer it as an option but must also offer a reasonable alternative, so no one who is 16 or older is prevented from having a social media account because they choose not to provide government ID. This includes situations where other age check methods return a result the user does not accept. 

eSafety can seek penalties of up to $49.5 million if a platform makes Australians use a government ID.
 

How can people be sure any identity information they use to prove they are 16 or older won’t be misused or stolen?

The Social Media Minimum Age legislation builds on the existing privacy protections contained in the Privacy Act. As part of these protections, platforms have to ensure any personal information they collect to check that a user is 16 or older is not used for other purposes, such as marketing, without the user’s consent.

The Australian Government’s Age Assurance Technology Trial has confirmed that a variety of methods provide effective age checks while also preserving privacy. In addition, the Office of the Australian Information Commissioner has provided guidance on privacy.
 

What if the account of someone who's 16 or older is removed or deactivated by mistake?

There is a chance that some users who are 16 or older may have their accounts removed or deactivated in error.

Age-restricted platforms are expected to have processes to correct errors if someone is mistakenly missed by or included in the restrictions, so no one’s account is removed or deactivated unfairly.

Platforms should also provide clear ways for people to report underage accounts, or to request a review if they have been age-restricted by mistake.

What can I do now to help my family prepare?

The age restrictions are likely to mean a big change for many under-16s, so they may feel a range of emotions – including being upset, worried, frustrated, confused, sad or angry.  

Some may binge on social media now, before the restrictions start, and find it harder than usual to switch off. Others may become more secretive about their social media use and less likely to ask a trusted adult for help if things go wrong.

As a parent or carer, you can support your child by talking calmly and openly about the age restrictions. This includes asking how they use social media now, helping them understand how the law might affect them, and guiding them to prepare for the change.  

eSafety has developed specific FAQs for parents and carers, including advice on: 

  • dealing with conflict over the social media age restrictions
  • supporting under-16s who may currently rely on social media for connection with important communities and services
  • alternative platforms and their risks
  • what to do if something goes wrong on an age-restricted platform and your child is under 16.

There are sample conversation starters for parents and carers, as well as a Get-ready guide for helping under-16s prepare for the change. The guide has tips for helping under-16s find other ways to connect with friends, keep up with their interests, express themselves, learn about things and be entertained.

We understand families and households have their own agreements and approaches to being online and using devices, and that every child is different.  

You know your child best. Keep in mind their age, developmental stage, emotional readiness and individual needs when talking about the age restrictions.  

Also, if you’re caring for children of various ages you may need to handle the issue in different ways. Discussing it with everyone together and setting shared expectations can be helpful, or you may find it’s better to have separate conversations.  

What matters most is creating a safe space where under-16s feel heard and supported, and helping to make ‘switching off’ age-restricted social media accounts as stress-free as possible.

Here are some tips:

  • Lead with empathy, letting them know you understand their feelings.
  • Ask them how they currently use social media.
  • Talk about the new law and what it means.
  • Explain that the restrictions are to protect them, not punish them.
  • Talk about the sorts of risks the law aims to help them avoid. These include spending too much time on screens and being over-exposed to negative and harmful content – which can impact their sleep, stress levels, attention and wellbeing.
  • Focus on what they can still do online and offline.
  • Reassure them they can always come to you or another trusted adult to talk about their concerns.
  • It’s OK to try again later if the talk has not gone very well. Lots of little chats are often more effective than one big conversation.

Remember, the aim of delaying account access until 16 is to give young people more time to develop important digital, social and emotional skills before facing the risks of age-restricted social media accounts.  

You can use the extra time to teach them about healthy online habits and the importance of responsible online behaviour – and model them yourself (see our tips in the conversation starters). That way they will be better prepared for social media account access when they turn 16.  

You can also explore our content for parents and carers on a range of topics, including using parental controls and managing screen time.

Other helpful advice about discussing the social media age restrictions is provided by headspace (Australia’s National Youth Mental Health Foundation) at Information for family about the social media ban.

How will the age restrictions impact schools that use social media platforms?

Age restrictions may apply to platforms that some schools currently use for educational purposes and to communicate with their students and community, so they may need to explore alternatives. 

eSafety has informed Facebook, Instagram, Snapchat, Threads, TikTok, X, YouTube, Kick and Reddit of its view that they are age-restricted platforms and therefore required to take reasonable steps to prevent Australians under 16 from having accounts on their services from 10 December 2025. 

However, learning management systems that allow educators to share course materials, manage assignments and facilitate communication, and which allow students to access classroom resources, submit work and collaborate with peers, will be excluded from the age restrictions.

While these services are often integrated with other tools such as video conferencing, messaging and the ability to post content on the service, if their sole or primary purpose is to support the education of users, the exclusion will apply.

Some of these services allow teachers to embed public video content from other platforms onto the learning management system, such as YouTube videos. If the content is publicly available, and does not require the student to log into another platform, students will still be able to watch this content.

As of 5 November 2025, services that eSafety considers do not currently meet the criteria for being an 'age-restricted social media platform', or that fall within an exclusion in the legislative rules, include Discord, GitHub, Google Classroom, LEGO Play, Messenger, Roblox, Steam and Steam Chat, WhatsApp and YouTube Kids. Find the latest details about which platforms are age-restricted.

For further guidance, including if educators will be able to use their own accounts to share age-appropriate education materials, please refer to school or sector policies and procedures.

Find specific FAQs and resources for educators.

How will children’s digital rights be protected under the age restrictions?

Respect for children’s rights underpins eSafety’s principles-based approach to implementing the social media age restrictions for under-16s.

We have developed a Statement of Commitment to Children’s Rights that sets out how we are upholding children’s rights throughout the process. This commitment is guided by the United Nations Convention on the Rights of the Child and General Comment No. 25 on children’s rights in the digital environment.

We also expect those rights to be respected by age-restricted social media platforms when complying with their obligations.

Our approach involves:

  • consulting directly with children and young people
  • ensuring information about the restrictions is accessible and age appropriate
  • working with other independent regulators to ensure a human-rights based approach
  • evaluating the impact of the restrictions to identify benefits, as well as emerging risks or unintended consequences
  • ensuring children still have access to safe and supportive digital environments.

Where can I get more information?

Throughout 2025 eSafety will provide webinars and publish new and updated resources for parents and carers – and for educators and other youth-serving professionals – to help them understand the age restrictions and how to support young people through the change.

All resources were informed by extensive consultation and feedback from key partners including mental health and support organisations, such as headspace, Kids Helpline, Beyond Blue, Raising Children Network and ReachOut Australia.

Meanwhile, you can find eSafety tips for parents and carers on managing parental controls and screen time, as well as how to start hard-to-have conversations.

In addition, eSafety will continue to provide education support and professional learning to schools and education sectors. New advice for educators will also be delivered through the National Online Safety Education Council, Trusted eSafety Providers Program and eSafety Champions Network.

Subscribe to eSafety's newsletter to receive updates about the social media age restrictions straight to your inbox.

You can also read more information about the Social Media Minimum Age law and how it fits into the Australian regulatory environment.

Talking with your child about the age restrictions

Help prepare your children for the social media age restrictions

How the social media age restrictions will help keep under-16s safer

Download the shareable assets

The resources include a poster, flyer, presentation and social tile with accurate and helpful information about the social media age restrictions.