
What do the social media age restrictions mean for educators?


As of 10 December 2025, age-restricted social media platforms must take reasonable steps to prevent Australians under 16 from having accounts. Find out how the social media age restrictions affect educators and how you can support your students’ online safety.


Why the social media delay?

The changes aim to protect under-16s from pressures and risks they can be exposed to while logged in to social media accounts. These come from design features in the platforms that:

  • encourage them to spend too much time on screens – for example, by prompting them with “streaks” or streams of notifications and alerts, and pressuring them to view disappearing content
  • increase the likelihood of exposure to negative, upsetting or manipulative content served up in their feeds by algorithms.

These features have been linked to harms to health and wellbeing – including increased stress levels, and reduced sleep and concentration.

The delay provides under-16s with the time to:

  • learn about social media’s benefits and risks
  • build digital, social and emotional skills
  • understand the importance of reaching out for help if things go wrong.

While the responsibility rests with the platforms themselves to take reasonable steps to prevent under-16s from having accounts, we can all play a part.

You can help students deal with these changes and build their general online safety at the same time. The keys are clear communication, effective teaching, and modelling of safe and positive online behaviours. 

Quick facts

  • As of 10 December 2025, Facebook, Instagram, Kick, Reddit, Snapchat, Threads, TikTok, Twitch, X and YouTube are required to take reasonable steps to prevent Australians under 16 from having accounts on their platforms. See the latest list.
  • Under-16s are still allowed to see publicly available social media content that doesn’t require logging into an account.
  • Schools may need to explore alternative methods for communicating with parents and students.  
  • There are exclusions for a number of platforms, including educational tools such as learning management systems. For example, Google Classroom will not be age restricted.
  • YouTube videos that can be seen without logging into an account can still be shared by teachers.
  • For further guidance, including whether educators will be able to use their own accounts to share age-appropriate education materials, please refer to school or education sector policies and procedures.  

 

Find out more general information at eSafety’s social media age restrictions hub.   

FAQs

In eSafety’s consultations about the social media age restrictions, educators agreed that the delay will allow time to build and strengthen digital literacy and resilience in under-16s.

We understand that you, as educators, can play an important role in helping to explain the age restrictions to students and their families, while continuing to support their online safety.

These FAQs will help you address some immediate concerns.


Age restrictions may apply to platforms that some schools currently use for educational purposes and to communicate with their students and community, so they may need to explore alternatives.  

Learning management systems that allow educators to share course materials, manage assignments and facilitate communication, and which allow students to access classroom resources, submit work and collaborate with peers, will be excluded from the age restrictions.  

Although these services are often integrated with other tools, such as video conferencing, messaging and the ability to post content on the service, the exclusion will apply if their sole or primary purpose is to support the education of users.

Some of these services allow teachers to embed public video content from other platforms onto the learning management system, such as YouTube videos. If the content is publicly available and does not require the student to log into another platform, students will still be able to watch this content.  

For further guidance, including whether educators will be able to use their own accounts to share age-appropriate education materials, please refer to school or education sector policies and procedures.

Google will have a responsibility to prevent under-16s from having their own accounts for the purposes of accessing YouTube, regardless of whether those accounts are ‘condoned’ or ‘filtered’ by schools.

Some learning management systems allow teachers to embed public video content from other platforms, such as YouTube. If the content is publicly available and does not require the student to log into an age-restricted social media platform, students will still be able to watch this content. 

Under-16s are able to use most apps and platforms that have the sole or primary purpose of enabling messaging or online gaming, as well as online services that share health and educational information and support.

Legislative rules excluding certain types of online services were made by the Minister for Communications following advice from the eSafety Commissioner and consultation with youth groups, parents, carers, the digital industry and civil society groups, as well as experts in child development, mental health and law.

The exclusions apply to:

  • services that have the sole or primary purpose of messaging, email, voice calling or video calling
  • services that have the sole or primary purpose of enabling users to play online games with other users
  • services that have the sole or primary purpose of enabling users to share information about products or services
  • services that have the sole or primary purpose of enabling users to engage in professional networking or professional development
  • services that have the sole or primary purpose of supporting the education of users
  • services that have the sole or primary purpose of supporting the health of users
  • services that have a significant purpose of facilitating communication between educational institutions and students or student families
  • services that have a significant purpose of facilitating communication between health care providers and people using those services.

As of 21 November 2025, services that eSafety considers do not meet the criteria for being an 'age-restricted social media platform' (including those that fall within an exclusion in the legislative rules) include Discord, GitHub, Google Classroom, LEGO Play, Messenger, Pinterest, Roblox, Steam and Steam Chat, WhatsApp and YouTube Kids. Find the latest details about which platforms are age-restricted.

Multi-purpose platforms

It is important to note that many platforms have multiple purposes. For example, some messaging services have social-media style features that allow users to interact in ways other than messaging. If the primary purpose of a service changes due to noticeably common use of these social-media style features, it may be included in the age restrictions.

Also, online gaming services that enable online social interaction through features and functions such as direct messaging, group chats and livestreaming may be included in the age restrictions if the service’s sole or primary purpose changes.  

The way online services are used can change over time, many services have multiple purposes, and new services are constantly being developed. The platforms which are age-restricted may change depending on whether they start to meet, continue to meet or no longer meet the legislative rules for exclusion.  

The legislative rules are supported by an explanatory statement, which provides some details about how eSafety should assess a platform’s sole, primary or significant purpose. The factors eSafety is to consider include:

  • the features and functions of the platform
  • how they are deployed and influence user engagement and experiences
  • the actual use of the platform, in addition to what the platform may say its intended purpose is. 

Reports about suspected underage accounts can be made to the platforms.

Age-restricted social media platforms should provide easy pathways for people to report that they believe an account holder is under 16, to trigger an age check. But platforms should also provide a way for users who are 16+ to appeal if they are flagged or removed by mistake or due to a false report.

If an under-16 has an account on an age-restricted social media platform, they are not breaking the law, and no criminal charges or fines apply to them or their family because of this. It’s only age-restricted social media platforms that face penalties if they fail to take reasonable steps to stop under-16s having accounts.

This means there’s no mandatory reporting of users under 16 – for parents, educators or police. However, reporting under-16 use may help the platform to understand how under-16s are getting around age checks, so it can tighten safety protections for all.

Please note: if someone receives a request to pay a fine for being under 16 or for not having their account verified, it’s a scam. They should NOT PAY. (See Scamwatch)

Responsibility for preventing under-16s from creating or keeping accounts lies with the platforms. There are no penalties for under-16s who access an age-restricted social media platform, or for their parents or carers.

However, school staff may want to model positive behaviour in the use of social media (and other technologies) by supporting students to adhere to the age restrictions. The most effective way to do this is to implement a whole-school approach. At the classroom level, this could involve establishing an online safety classroom agreement that outlines a shared understanding of rights and responsibilities for technology use.

Age-restricted platforms will be expected to provide clear ways for people to report underage accounts.  

eSafety’s Toolkit for Schools supports schools to design or strengthen policies and procedures to create safer online environments. 

No matter how old your students are, if they have a harmful experience online they should reach out for support – even if they are under 16 and it happens on an age-restricted social media platform.

Depending on what has happened, there are different ways to get support from eSafety or other services. They won’t get into trouble. 

Cyberbullying, such as harmful posts or profiles, should be reported to the platform first. If it’s very serious and the platform doesn’t help, it can be reported to eSafety.

If someone shares, or threatens to share, a nude or sexual image or video of a person without their consent this is called ‘image-based abuse’ and it can be reported to eSafety. If someone is trying to use the image or video to blackmail a person, this is called ‘sexual extortion’ (or 'sextortion'). It’s best to follow our specific advice in the Toolkit for Schools Guide to responding to image-based abuse, including sexual extortion.  

eSafety’s 'I need help' page for young people has more detailed information and guidance on what they can do if something goes wrong online.

From 10 December 2025, when the law comes into effect, the platforms are responsible for finding and removing accounts held by users under the age of 16 who are ‘ordinarily resident in Australia’.

The Online Safety Act does not define ‘ordinarily resident in Australia’ and there is no stated time threshold that platforms must apply. International students under the age of 16 who are living in Australia should be aware that their accounts may be flagged for age checks, deactivation or removal if platforms receive signals indicating they are in Australia for a significant period of time or indefinitely.  

Platforms may check various signals to assess if a user intends to live or stay in Australia, such as the use of an Australian device and/or network provider, and updates to country settings within user accounts.  

Platforms should have mechanisms in place for users to appeal if they believe their account has been flagged, removed or deactivated in error, or if the user’s age or ordinary residence changes. 

You can support young people by referring them to eSafety’s information and resources specifically for under-16s. These explain the changes and how they might affect under-16s, provide a get-ready guide that students can fill out, and cover related health and wellbeing issues.

Support for parents and carers can be found in Social media age restrictions and your family. There are answers to frequently asked questions, as well as tools to support parents and carers, including a get-ready guide and a guide for starting conversations with young people about the social media age restrictions and their impacts.

Tools

Explore these practical tools to support the online safety of your students and school community. They follow the key elements of eSafety’s Best Practice Framework for Online Safety Education and Toolkit for Schools: prepare, engage, educate and respond.

 

Prepare

  • Webinars: Social media age restrictions explained – a guide for educators and youth-serving organisations
  • eSafety’s Toolkit for Schools supports schools to design or strengthen policies and procedures to create safer online environments (updates in relation to social media age restrictions coming soon).

Engage

Educate

Respond

Information to share

eSafety has developed information and resources to support young people, as well as parents and carers, to understand the social media age restrictions. You can share them with your school community.  

You can also download and share the educator stakeholder kit. The resources include posters, flyers, presentation slides, a social media tile, Get-ready guides for parents and carers and for under-16s, and guidance on how to use these assets in your communications.