Social media 'ban' or delay FAQs
Find out the facts about the social media age restrictions that will help keep Australians under 16 safer.
These frequently asked questions will be added to and updated throughout 2025.
Download our poster, flyer, presentation and social tile, and help share accurate and helpful information about the social media age restrictions in your community or workplace.
Which platforms will be age-restricted?
UPDATED: 21 November 2025
From 10 December 2025, Facebook, Instagram, Kick, Reddit, Snapchat, Threads, TikTok, Twitch, X and YouTube will be required to take reasonable steps to prevent Australians under 16 from having accounts on their platforms.
Services that eSafety considers do not currently meet the criteria for being an 'age-restricted social media platform' (including those that fall within an exclusion in the legislative rules) include Discord, GitHub, Google Classroom, LEGO Play, Messenger, Pinterest, Roblox, Steam and Steam Chat, WhatsApp and YouTube Kids.
These lists reflect eSafety’s views as at 21 November 2025. There are no further assessments planned in the lead-up to 10 December. After that, eSafety may assess new services that emerge, or reassess existing ones if they evolve to the extent that their purpose changes, so these lists may continue to change. Find the latest details about which platforms are age-restricted.
General conditions for age restrictions
More generally, age restrictions will apply to social media platforms that meet four specific conditions, unless they are excluded based on criteria set out in legislative rules made by the Minister for Communications in July 2025.
The conditions for age restriction are:
- the sole purpose, or a significant purpose, of the service is to enable online social interaction between two or more end-users
- the service allows end-users to link to, or interact with, other end-users
- the service allows end-users to post material on the service
- material on the service is accessible to, or delivered to, end-users in Australia.
Platforms that have the sole or primary purpose of enabling messaging or online gaming are among a number of types of services that have been excluded under the legislative rules.
Multiple purpose platforms
It is important to note that many platforms have multiple purposes.
For example, some messaging services have social-media style features that allow users to interact in ways other than messaging. If common use of these features changes the primary purpose of the service, it may become subject to the age restrictions.
Also, online gaming services that enable online social interaction through features and functions such as direct messaging, group chats and livestreaming may be included in the age restrictions if the service’s sole or primary purpose changes.
The way online services are used can change over time, many services have multiple purposes, and new services are constantly being developed. So the platforms which are age-restricted may change depending on whether they start to meet, continue to meet or no longer meet the legislative rules for exclusion.
Compliance and enforcement
eSafety does not have a formal role in declaring which services are age-restricted social media platforms. However, platforms we believe to be age-restricted will be informed of this by eSafety, to help them understand their legal obligations under the Online Safety Act. We will also share our view publicly. These platforms must comply with the law when the minimum age requirements take effect on 10 December.
eSafety’s view will also underpin our decisions to use enforcement powers under the Online Safety Act if a platform does not take reasonable steps to implement the age restrictions. Where a service disagrees with eSafety’s use of these powers, it has the usual legal rights to challenge our decisions and seek review of them.
As noted in our Regulatory Guidance, eSafety will take a proportionate and risk-based approach to compliance, initially focusing on services with the greatest number of end-users and higher risks of harm, while taking into account the steps providers are implementing to prevent the youngest users from having accounts.
Will every Australian now have to prove their age to use social media?
Published: 26 September 2025
No. eSafety does not expect a platform to make every account holder go through an age check process if it has other accurate data indicating the user is 16 or older.
For example, if someone has had an account since Facebook became available in Australia in 2006, Meta could reasonably assume they are older than 16, so no further check is needed.
For people who do have to prove their age, what methods will be allowed?
UPDATED: 21 November 2025
Various technologies can be used to check age, when someone is signing up for an account or later. The methods used by age-restricted social media platforms have to meet the regulatory requirements and respect privacy laws and digital rights.
To reduce the risk of scams, age-restricted platforms should clearly communicate the age-check steps account holders have to take, the information they have to give, and whether the platform is using a third-party age assurance provider – their ‘Help’ or ‘Support’ sections are a good place to look for the latest details. We will also be updating The eSafety Guide with relevant links.
It’s important to know that age-restricted platforms can no longer just rely on a user providing a birthdate at account sign up. The platforms are expected to have ways to stop users faking their age using false identity documents, AI tools or deepfakes. They are also expected to try to stop under-16s from using VPNs to pretend to be outside Australia.
If someone offers to sell you a fake ID or direct access to an age-verified account, it’s probably a scam. DO NOT PAY or give them personal information. Check Scamwatch for more information.
I've been asked to prove my age. How can I tell if it's a scam?
NEW: 2 December 2025
Age-restricted social media platforms, including Facebook, Instagram, Kick, Snapchat and Threads, are already asking users for proof of age. But be careful, because a scammer trying to steal money or personal information may send you a fake request. They are likely to create a sense of urgency, threatening to delete your account if you don’t act quickly.
Different social media platforms are using different methods to check age, so it’s important to trust only the information that each platform provides. They should let you know:
- the age-check steps their account holders have to take
- the information their account holders have to give
- whether the platform is using another company to do its checks.
Go directly to each platform’s ‘Help’ or ‘Support’ section, via your browser or the app. If you receive a text, email or any other message about proving your age, stop and think carefully. Don’t click on links and don’t download any attachments or apps unless you can verify their source.
We will also update The eSafety Guide with relevant links to age assurance information when platforms provide them.
Impersonation scams
A scammer may pretend to be from a social media platform, another company doing the checks for a platform, a government department, or a law enforcement agency such as the police or security services.
They may ask you to:
- click a link directing you to a fake website
- provide your account username and password
- send or upload identity information such as a passport, driving licence or proof of age card
- record videos of yourself to prove your age
- pay a fine for being on a social media platform while under 16.
Buying and selling scams
A scammer may offer to sell you a fake ID or direct access to an age-verified account. They may collect personal information and request payment. But they may not give you what they’ve promised, or may give you access to something that doesn’t meet the real age check requirements.
‘Hi Mum’ style scams
A parent or guardian may receive a message from a scammer pretending to be their child. For example, they may claim they’ve lost their phone, so they’re using someone else’s number.
They may tell you:
- you need to click a link to verify their age
- they need you to send copies of their identity information (such as a passport, proof of age card and/or birth certificate) so they can verify their age (or get back into their account).
Further support
To find out more about the latest scams and how to protect yourself, check the Australian Government’s Scamwatch website. Being scammed is a horrible experience, and it can happen to anyone. If you need someone to talk to, you can reach out to Kids Helpline (for 5- to 25-year-olds) or another counselling or support service.
You can also reach out to IDCare, a not-for-profit organisation that can help you recover from scams, identity theft and other cybercrimes.
How can people be sure any identity information they use to prove they are 16 or older won’t be misused or stolen?
UPDATED: 21 November 2025
The Social Media Minimum Age legislation builds on the existing privacy protections contained in the Privacy Act. As part of these protections, platforms must ensure that any personal information they collect to check a user is 16 or older is not used for other purposes (like marketing) except in certain circumstances, such as with the user’s consent.
The Office of the Australian Information Commissioner has provided further guidance on privacy.
Will people who do have to prove their age be forced to use a government ID?
Published: 26 September 2025
No. In fact, the Social Media Minimum Age legislation specifically prohibits platforms from compelling Australians to provide a government-issued ID or use an Australian Government accredited digital ID service to prove their age.
Platforms may offer it as an option but must also offer a reasonable alternative, so no one who is 16 or older is prevented from having a social media account because they choose not to provide government ID. This includes situations where other age check methods return a result the user does not accept.
eSafety can seek penalties of up to $49.5 million if a platform makes Australians use a government ID.
Will under-16s who already have accounts be allowed to keep using them?
UPDATED: 21 November 2025
No. If an age-restricted platform suspects or knows that a user is under 16, it should stop them from using their account on that platform or creating a new one. It is expected to do this in a way that is as safe and supportive as possible, as set out in eSafety’s regulatory guidance.
eSafety has suggested that platforms give clear instructions on how under-16s can download, transfer or access their data, like saving their favourite posts, photos and important contacts.
To find out more about how to prepare, under-16s can:
- visit the ‘Help’ or ‘Support’ section on the platform
- check The eSafety Guide for advice on downloading content and protecting their personal information on common platforms, games, apps and sites
- follow the tips in eSafety’s Get-ready guide and action plan for under-16s, including how to save their most precious data and content.
Will underage users be able to reactivate their old accounts when they turn 16?
UPDATED: 21 November 2025
Some platforms could allow under-16s to deactivate their accounts, so they can start using them again with all their existing data when they turn 16. But young people shouldn’t rely on platforms to provide this option. It’s best that they download any data they want to save – including connections, posts, chats, photos and videos – before 10 December.
To find out more about how to prepare, under-16s can follow the same steps listed in the previous answer: visit the platform’s ‘Help’ or ‘Support’ section, check The eSafety Guide, and follow the tips in eSafety’s Get-ready guide and action plan for under-16s.
What if the account of someone who's 16 or older is removed or deactivated by mistake?
Published: 21 November 2025
There is a chance that some users who are 16 or older may have their accounts removed or deactivated in error or due to false reporting.
Age-restricted platforms are expected to have processes to correct errors if someone is mistakenly missed by or included in the restrictions, so no one’s account is removed or deactivated unfairly.
Platforms should provide clear instructions about how to request a review if a user has been age-restricted by mistake, as well as easy ways for people to report underage accounts.
How will under-16s be stopped from finding a way around the age restrictions?
UPDATED: 2 December 2025
We know that some under-16s may find their way around the age restrictions, just as some get around restrictions on cigarettes and alcohol.
But age-restricted platforms will have to take steps to stop under-16s getting around the law. This includes having ways to prevent under-16s from faking their age by using false identity documents, AI tools or deepfakes. It also means trying to stop under-16s from using VPNs to pretend to be outside Australia.
Platforms may assess age-related signals which can help work out if someone is under 16. These signals can include:
- how long an account has been active
- whether the account holder interacts with content targeted at children under 16
- analysis of the language level and style used by the account holder and the people they interact with
- visual checks, such as facial age analysis of the account holder’s photos and videos
- audio analysis, such as age estimation of the voice of the account holder
- activity patterns consistent with school schedules
- connections with other users who appear to be under 16
- membership in youth-focused groups, forums or communities.
Platforms may also use location-based signals which can help work out if an account holder usually lives in Australia and could be using a VPN to pretend they don’t. These signals can include:
- IP address(es)
- GPS or other location services
- device language and time settings
- a device identifier
- an Australian phone number
- app store, operating system or account settings
- photos, tags, connections, engagement or activity.
Evidence of these age and location signals is expected to trigger the age assurance process, or a review of an account that has already been checked.
Which platforms have been excluded from the age restrictions?
UPDATED: 21 November 2025
Legislative rules excluding certain types of online services were made by the Minister for Communications following advice from the eSafety Commissioner and consultation with youth groups, parents, carers, the digital industry and civil society groups, as well as experts in child development, mental health and law.
The exclusions apply to:
- services that have the sole or primary purpose of messaging, email, voice calling or video calling
- services that have the sole or primary purpose of enabling users to play online games with other users
- services that have the sole or primary purpose of enabling users to share information about products or services
- services that have the sole or primary purpose of enabling users to engage in professional networking or professional development
- services that have the sole or primary purpose of supporting the education of users
- services that have the sole or primary purpose of supporting the health of users
- services that have the sole or significant purpose of facilitating communication between educational institutions and students or student families
- services that have the significant purpose of facilitating communication between health care providers and people using those services.
As of 21 November 2025, services that eSafety considers do not meet the criteria for being an 'age-restricted social media platform' (including those that fall within an exclusion in the legislative rules) include Discord, GitHub, Google Classroom, LEGO Play, Messenger, Pinterest, Roblox, Steam and Steam Chat, WhatsApp and YouTube Kids. Find the latest details about which platforms are age-restricted.
As explained under ‘Which platforms will be age-restricted?’ above, many platforms have multiple purposes and the way they are used can change over time. A service that currently falls within an exclusion may become age-restricted if its sole or primary purpose changes, so the list of age-restricted platforms may also change.
The legislative rules are supported by an explanatory statement, which provides some details about how eSafety should assess a platform’s sole, primary or significant purpose. The factors eSafety is to consider include:
- the features and functions of the platform
- how they are deployed and influence user engagement and experiences
- the actual use of the platform, in addition to what the platform may say its intended purpose is.
When will the age restrictions start?
Published: 11 August 2025
After the law takes effect on 10 December 2025, Australians can expect to see age-restricted social media platforms taking steps to stop under-16s setting up or continuing to use accounts.
eSafety recognises this is a complex task, so we’re already consulting with social media platforms about their preparations for introducing effective methods to prevent and remove underage account holders.
We expect age-restricted social media platforms to comply with the requirements once they take effect. We will continue to work with industry to ensure age-restricted social media platforms implement reasonable steps.
Will there be penalties for under-16s if they get around the age restrictions?
Published: 11 August 2025
There are no penalties for under-16s who access an age-restricted social media platform, or for their parents or carers.
This is about protecting young people, not punishing or isolating them. The goal is to help parents and carers support the health and wellbeing of under-16s.
On the other hand, age-restricted social media platforms may face penalties if they don’t take reasonable steps to prevent under-16s from having accounts on their platforms.
If you’re asked to pay a fine for being on social media while you’re under 16, it’s a scam. DO NOT PAY. Check Scamwatch for more information.
What are the penalties for age-restricted platforms that allow under-16s to have accounts?
Published: 11 August 2025
A court can order civil penalties for platforms that don’t take reasonable steps to prevent underage users from having accounts on their platforms. This includes court-imposed fines of up to 150,000 penalty units for corporations – currently equivalent to $49.5 million, based on the Commonwealth penalty unit value of $330.
'Reasonable steps' means platforms have to act to enforce the restrictions in a way that is just and appropriate in the circumstances. They will be in breach of the law if they show an unreasonable failure to prevent underage access to accounts.
eSafety is already working with the key platforms where we know Australian children are present in large numbers, and where there are features associated with risks to children. Working with these platforms now helps ensure they are ready for the social media age restrictions.
eSafety will monitor compliance and enforce the law. This will be done through a range of regulatory powers provided in the Online Safety Act.
How are accounts handled when they’re used for several services, including one with age restrictions?
NEW: 3 December 2025
In these circumstances, it is up to companies to determine how to deal with accounts that are used to log into multiple services. eSafety's focus is on ensuring companies are preventing under-16s from using their accounts to log into age-restricted social media platforms, as opposed to removing accounts that are used to access a broader range of services.
Won’t under-16s still be able to see social media feeds without accounts?
UPDATED: 17 October 2025
Under-16s will still be able to see publicly available social media content that doesn’t require being logged into an account. As they won’t be logged in, they are less likely to be exposed to some of the harmful design features of social media.
For example, most content is currently available to view on YouTube without holding an account.
Another example is that anyone can see some of Facebook’s content, such as the landing pages of businesses or services that use social media as their host platform.
It’s the Australian Government’s intention that under-16s will also continue to have access to online services that provide crucial information and support if they are experiencing distress, which is why legislative rules have been made to exclude certain services.
For example, they can still go to these platforms:
- My Circle is a free, private, safe and confidential social forum for 12- to 25-year-olds that supports mental health.
- Beyond Blue forums are open to under-16s – there’s even one where they can discuss how they’re feeling about the social media age restrictions.
Will I be able to report an under-16 for being on social media?
NEW: 4 December 2025
Reports can be made to the platforms.
Age-restricted social media platforms should provide easy pathways for people to report that they believe an account holder is under 16, to trigger an age check. But platforms should also provide a way for users who are 16+ to appeal if they are flagged or removed by mistake or due to a false report.
If an under-16 has an account on an age-restricted social media platform, they are not breaking the law, and no criminal charges or fines apply to them or their family because of this. It’s only age-restricted social media platforms that face penalties if they fail to take reasonable steps to stop under-16s creating or having accounts.
This means that if you (or someone else) receive a request to pay a fine for being under 16 or for not having your account verified, it’s a scam – DO NOT PAY. (See Scamwatch)
This also means there is no mandatory reporting of under-16 users by parents, educators or police. However, reporting use by under-16s may help the platform understand how they are getting around age checks, so it can tighten safety protections for all.
What will age-restricted social media platforms have to do to comply with the law?
Published: 26 September 2025
Age-restricted platforms will be expected to take reasonable steps to:
- find existing accounts held by under-16s, and deactivate or remove those accounts
- prevent under-16s from opening new accounts
- prevent workarounds that may allow under-16s to bypass the restrictions
- have processes to correct errors if someone is mistakenly missed by or included in the restrictions, so no one’s account is removed unfairly.
Platforms should also provide clear ways for people to report underage accounts, or to request a review if they have been age-restricted by mistake.
Age-restricted platforms are also expected to give users who are under 16 information about how they can download their account information in a simple and seamless way prior to account deactivation or removal, or request access to their information within a reasonable period after account deactivation. The information should be provided in a format that is easily accessible. Platforms should consider formats that could allow end-users to transfer their information and content to other services, or to upload the information on the same platform if they sign up again after turning 16.
The full expectations of platforms are set out in the Social Media Minimum Age Regulatory Guidance.
Why are under-16s being ‘banned’ from social media?
UPDATED: 17 October 2025
It’s not a ban – it’s a delay to having accounts.
Age-restricted platforms won’t be allowed to let under-16s create or keep an account. That’s because being logged into an account increases the likelihood that they’ll be exposed to pressures and risks that can be hard to deal with. These come from social media platform design features that encourage them to spend more time on screens and make it more likely that they will see negative, upsetting or manipulative content.
For example, the pressure to respond to streams of notifications and alerts, and to view disappearing content, has been linked to harms to health, including reduced sleep and attention and increased stress levels. Over-exposure to harmful content can also impact immediate and long-term health and wellbeing.
While most platforms currently have a minimum age of 13 for account holders, delaying account access until 16 will give young people more time to develop important skills and maturity. It’s breathing space to build digital literacy, critical reasoning, impulse control and greater resilience.
It also means there’s extra time to teach under-16s about online risks and the impacts of harms, as well as how to stay safer online and seek help when they need it. This will give young people a better chance to prevent and deal with issues once they turn 16 and can have full social media accounts.
As the law will apply to all under-16s, parents and carers will no longer need to choose between allowing them to set up accounts on platforms that may negatively affect their health, or making sure they are not socially excluded. No under-16s have to feel like they’re ‘missing out’. Parents and carers won’t have to say ‘yes’ or ‘no’ to social media accounts; instead, they can say ‘not yet’.
What can I do now to help my family prepare?
UPDATED: 17 October 2025
The age restrictions are likely to mean a big change for many under-16s, so they may feel a range of emotions – including being upset, worried, frustrated, confused, sad or angry.
Some may binge on social media use now, before the restrictions start, and find it harder than usual to switch off. Others may become more secretive about their social media use and less likely to ask for help from a trusted adult if things go wrong.
As a parent or carer, you can support your child by talking calmly and openly about the age restrictions. This includes asking how they use social media now, helping them understand how the law might affect them, and guiding them to prepare for the change.
eSafety has developed specific FAQs for parents and carers, including advice on:
- dealing with conflict over the social media age restrictions
- supporting under-16s who may currently rely on social media for connection with important communities and services
- alternative platforms and their risks
- what to do if something goes wrong on an age-restricted platform and your child is under 16.
There are sample conversation starters for parents and carers, as well as a Get-ready guide for helping under-16s prepare for the change. The guide has tips for helping under-16s find other ways to connect with friends, keep up with their interests, express themselves, learn about things and be entertained.
We understand families and households have their own agreements and approaches to being online and using devices, and that every child is different.
You know your child best. Keep in mind their age, developmental stage, emotional readiness and individual needs when talking about the age restrictions.
Also, if you’re caring for children of various ages you may need to handle the issue in different ways. Discussing it with everyone together and setting shared expectations can be helpful, or you may find it’s better to have separate conversations.
What matters most is creating a safe space where under-16s feel heard, supported and empowered, so that ‘switching off’ age-restricted social media accounts is as stress-free as possible.
Here are some tips:
- Lead with empathy, letting them know you understand their feelings.
- Ask them how they currently use social media.
- Talk about the new law and what it means.
- Explain that the restrictions are to protect them, not punish them.
- Talk about the sorts of risks the law aims to help them avoid. These include spending too much time on screens and being over-exposed to negative and harmful content – which can impact their sleep, stress levels, attention and wellbeing.
- Focus on what they can still do online and offline.
- Reassure them they can always come to you or another trusted adult to talk about their concerns.
- It’s OK to try again later if the talk has not gone very well. Lots of little chats are often more effective than one big conversation.
Remember, the aim of delaying account access until 16 is to give young people more time to develop important digital, social and emotional skills before facing the risks of age-restricted social media accounts.
You can use the extra time to teach them about healthy online habits and the importance of responsible online behaviour – and model them yourself (see our tips in the conversation starters). That way they will be better prepared for social media account access when they turn 16.
You can also explore our content for parents and carers on a range of topics, including using parental controls and managing screen time.
Other helpful advice about discussing the social media age restrictions is provided by headspace (Australia’s National Youth Mental Health Foundation) at Information for family about the social media ban.
How will the age restrictions impact schools that use social media platforms?
UPDATED: 21 November 2025
Age restrictions may apply to platforms that some schools currently use for educational purposes and to communicate with their students and community, so they may need to explore alternatives.
However, learning management systems that allow educators to share course materials, manage assignments and facilitate communication, and which allow students to access classroom resources, submit work and collaborate with peers, will be excluded from the age restrictions.
While these services often integrate other tools – such as video conferencing, messaging and the ability to post content on the service – the exclusion will apply if their sole or primary purpose is to support the education of users.
Some of these services allow teachers to embed public video content from other platforms onto the learning management system, such as YouTube videos. If the content is publicly available, and does not require the student to log into another platform, students will still be able to watch this content.
As of 21 November 2025, services that eSafety considers do not meet the criteria for being an 'age-restricted social media platform', or that fall within an exclusion in the legislative rules, include Google Classroom and YouTube Kids, along with the other services listed under ‘Which platforms will be age-restricted?’ above. Find the latest details about which platforms are age-restricted.
For further guidance, including on whether educators will be able to use their own accounts to share age-appropriate education materials, please refer to school or education sector policies and procedures.
Can under-16s access YouTube using their school email address?
NEW: 4 December 2025
Google will have a responsibility to prevent under-16s from having their own accounts for the purpose of accessing YouTube, regardless of whether those accounts are ‘condoned’ or ‘filtered’ by schools.
Some learning management systems allow teachers to embed public video content from other platforms, such as YouTube. If the content is publicly available and does not require the student to log into an age-restricted social media platform, students will still be able to watch this content.
How do the age restrictions impact overseas citizens in Australia, including international students?
NEW: 4 December 2025
The platforms will be responsible for finding and removing accounts held by users under the age of 16 who are ‘ordinarily resident in Australia’ from 10 December, when the law comes into effect.
The Online Safety Act does not define ‘ordinarily resident in Australia’ and there is no stated time threshold that platforms must apply. International students under the age of 16 who are living in Australia should be aware that their accounts may be flagged for age checks, deactivation or removal if platforms receive signals indicating they are in Australia for a significant period of time or indefinitely.
Platforms may check various signals to assess if a user intends to live or stay in Australia, such as the use of an Australian device and/or network provider, and updates to country settings within user accounts.
Platforms should have mechanisms in place for users to appeal if they believe their account has been flagged, removed or deactivated in error, or if the user’s age or ordinary residence changes.
How will children’s digital rights be protected under the age restrictions?
NEW: 17 October 2025
Respect for children’s rights underpins eSafety’s principles-based approach to implementing the social media age restrictions for under-16s.
We have developed a Statement of Commitment to Children’s Rights that sets out how we are upholding children’s rights throughout the process. This commitment is guided by the United Nations Convention on the Rights of the Child and General Comment No. 25 on children’s rights in the digital environment.
We also expect those rights to be respected by age-restricted social media platforms when complying with their obligations.
Our approach involves:
- consulting directly with children and young people
- ensuring information about the restrictions is accessible and age appropriate
- working with other independent regulators to ensure a human-rights based approach
- evaluating the impact of the restrictions to identify benefits, as well as emerging risks or unintended consequences
- ensuring children still have access to safe and supportive digital environments.
Won’t the age restrictions stop under-16s from accessing important benefits of being online?
UPDATED: 17 October 2025
Under-16s will still be able to use online services, sites and apps that are not covered by the social media age restrictions.
The Australian Government is mindful of the need to balance safety with a broader range of digital rights. Under-16s will still be able to explore and express themselves on platforms that are not age-restricted, allowing connection, creativity, learning, health advice and entertainment.
In addition, platforms that have the sole or primary purpose of enabling messaging or online gaming are among a number of types of services that have been excluded from the age restrictions under the legislative rules.
Under-16s will also continue to have access to online services that provide crucial information and support if they are experiencing distress.
My child has never had a problem on social media, why should they miss out?
UPDATED: 26 September 2025
We know that young people are not all the same. They use a range of social media platforms in varying ways and with different exposure to risks of harm.
However, the Australian Parliament voted for the restrictions for the good of all Australians under 16. The delay is similar to other age-based laws, such as restrictions on the sale of alcohol and cigarettes.
As the law will apply to all of them, parents and carers will no longer need to choose between allowing their under-16s to set up accounts on platforms that may negatively affect their health, or making sure they are not socially excluded. No under-16s have to feel like they’re ‘missing out’. Parents and carers won’t have to say ‘yes’ or ‘no’ to social media accounts; instead, they can say ‘not yet’.
What are eSafety’s expectations of small online services that can’t afford sophisticated age assurance measures?
NEW: 4 December 2025
eSafety will take a proportionate and risk-based approach to monitoring providers’ compliance with the law. We will focus initially on providers with the greatest numbers of Australian children under 16 prior to 10 December, and on platforms that use persuasive design features associated with risks of harm to children.
Whether a provider has taken reasonable steps depends on the context, so it is relevant for eSafety to consider the size, scope, experience and expertise of the provider. As a starting point, however, providers of platforms may wish to:
- review eSafety’s regulatory guidance, the Behind the Screen report, and the final report of the Australian Government-sponsored Age Assurance Technology Trial
- update the relevant minimum age in their terms of use and communicate this to users
- require users to declare their age when creating an account, for example through a neutral age gate
- consider various signals, technologies, systems and processes that may be available to confirm that age
- implement accessible mechanisms for people to report underage account holders, and systems and processes to review and action those reports.
Where can I get more information?
Throughout 2025, eSafety will provide webinars and publish new and updated resources for parents and carers – and for educators and other youth-serving professionals – to help them understand the age restrictions and how to support young people through the change.
These are the latest resources:
- A dedicated online hub with tailored FAQs explaining what is happening, and how to prepare.
- Practical guidance for parents and carers, including conversation starters and get-ready guides.
- Information for educators, explaining what the new restrictions mean for schools, and how to prepare students.
- Youth-friendly content outlining what the new restrictions mean for young people, a get-ready guide and action plan, and information on where to get help and support.
All resources were informed by extensive consultation and feedback from key partners including mental health and support organisations, such as headspace, Kids Helpline, Beyond Blue, Raising Children Network and ReachOut Australia.
Meanwhile, you can find eSafety tips for parents and carers on managing parental controls and screen time, as well as how to start hard-to-have conversations.
In addition, eSafety will continue to provide education support and professional learning to schools and education sectors. New advice for educators will also be delivered through the National Online Safety Education Council, Trusted eSafety Providers Program and eSafety Champions Network.
Subscribe to eSafety's newsletter to receive updates about the social media age restrictions straight to your inbox.
You can use our form to provide information to eSafety about whether and how age-restricted social media platforms are implementing and complying with the Social Media Minimum Age obligation.
You can also read more information about the Social Media Minimum Age law and how it fits into the Australian regulatory environment.
Download the shareable assets
The resources include a poster, flyer, presentation and social tile with accurate and helpful information about the social media age restrictions.
Last updated: 04/12/2025