Videos showing extreme violence are no longer rare – they are often viral. This guide explains how gore spreads, the risks for children and young people, and practical steps parents, carers and schools can take to reduce harm.
In this online safety advisory:
- When violent content goes viral
- What is gore content?
- How gore content spreads
- Why this matters now
- What eSafety is doing to help
- What parents and carers can do
- How schools can help
- What tech platforms should do about gore content
- Helpful resources
When violent content goes viral
Violent incidents are now filmed, uploaded and replayed within minutes – often landing in the social media feeds of young people without warning.
Graphic clips can surface via autoplay, recommendations, and reposts.
This advisory explains what ‘gore’ means, how it spreads, why it matters for young people, and what parents, carers, schools and platforms can do to minimise its harm.
What is gore content?
‘Gore’ refers to the graphic portrayal of violent acts across a variety of media. This can include actual footage of real events as well as dramatised live‑action scenes, and even animation.
In this guide, our focus is on unfiltered, actual violence captured on devices and then posted online.
Examples include bombings, murders, dismemberment or decapitation, graphic war and conflict footage (such as battlefield or drone videos), terrorist propaganda (such as ISIS beheadings), mass‑casualty events (such as the Christchurch massacre), and political assassinations.
Unlike fictional scenes in movies or games, this material could normalise real-world violence for some children.
How gore content spreads
Gore spreads online fast.
It might look harmless at first, disguised by innocent captions or thumbnails. This content can show up in general feeds or “For You” pages even when a young person is not searching for it.
It can also be tagged with popular hashtags to appear in unrelated searches or even be presented as recommended content, increasing exposure for young people.
Livestreams and bystander clips of violent acts are cut, reposted and boosted by algorithms across platforms. Some users share gore for shock value, while others do it to stir outrage, gain likes, or raise ‘awareness’ – all of which accelerates its rapid spread online.
Once uploaded, the same clips can circulate across mainstream platforms such as X, Facebook, Instagram, Snapchat, TikTok, and YouTube. They can also be easily shared in private messages and group chats (Telegram, Messenger, and WhatsApp). This is often to impress or outdo peers – with young users not fully grasping the nature of the material, how it affects developing minds, or the deeper consequences.
Copies persist even when one upload is removed, reappearing through private groups, cross‑posts, or by slipping through search loopholes.
While many online platforms have systems to block gore and extreme violence, altered copies make it more difficult for platforms to detect and remove. It becomes a classic game of whack-a-mole.
Dedicated gore websites with searchable libraries sorted by categories have also evolved into fringe social networks. These less mainstream, often hidden platforms for sharing extreme content have features such as follower tools, chat and recommendation loops. Such features can further amplify the reach and impact of this material.
Many of these gore sites operate within ‘permissive hosting environments,’ using complex hosting arrangements to evade removal by authorities.
While looking for connection, a sense of belonging, or ‘edgy’ content, some children and young people are targeted by predatory algorithms and steered toward more extreme material and sites.
These services ‘churn’ (when one disappears, another pops up) and their clips are frequently pushed back onto mainstream platforms.
Why this matters now
Public fascination with the macabre is not new; what’s changed is its scale, speed, reach, and saturation of exposure. Increased use of social media means it is now more likely users will inadvertently come across gore content.
High‑definition footage reaches personal devices within minutes, is replayable on demand and sometimes promoted by recommendation systems.
With cameras, editing tools and the internet now bundled into a phone, capturing content, uploading it and viewing it all happen on the same device.
Most social media networks mandate the use of sensitive content warnings or blur filters to shield innocent eyes from such visceral and damaging content. However, with reduced investment in trust and safety teams and technology, and slower detection systems, these safeguards often trail behind the rapid spread of viral gore.
eSafety’s latest research shows 22 per cent of children aged 10 to 17 have seen extreme real-life violence online.
Exposure to gore can harm mental health, particularly with repeated or unexpected viewing. Children and teens may experience anxiety, nightmares, intrusive thoughts, avoidance, desensitisation or distress triggered by reminders.
The speed of spread and the difficulty of ‘un‑seeing’ graphic clips make prevention and early support vital.
What eSafety is doing to help
We work to reduce the impact of online harm, including exposure to violent content. This is what eSafety can do:
- Require removal or geo-blocking of extreme or illegal material under the Online Content Scheme.
- Enforce codes and standards that require online platforms to reduce exposure to violent content.
- Work with global platforms to accelerate removal of illegal content and prevent re‑uploads.
- Provide reporting options so anyone can flag harmful content.
- Offer trauma-informed advice through our guidance on dealing with distressing content, with links to resources for parents and carers, young people and kids, and educators.
- Run education programs and campaigns to raise awareness about online harm and build digital wellbeing.
Enforcement can be time consuming and complex, particularly where content is hosted overseas. It is also reactive, meaning exposure has already occurred. This is why prevention and early intervention remain essential.
What parents and carers can do
Here’s a practical checklist for before and after exposure to gore content.
Before exposure
Talk early and often. Ask what they’re seeing; keep discussions open and non‑judgemental. Explain that not everything online is safe, true or appropriate.
Set up protections. Use age-appropriate parental controls and platform filters to reduce violent content in feeds.
Tighten privacy. Limit who can contact your child or share content with them.
Model healthy habits. Be mindful of what you view, share and say about gore online.
If exposure happens
Stay calm and reassure. Your response sets the tone; let your children know they’re safe.
Let children lead. Ask open questions: ‘What did you see?’ ‘How did it make you feel?’
Limit your child’s re‑exposure. Pause the platform or device for a period and stop your child replaying the clip.
Report. Use the in‑app or platform tools, or report to eSafety if appropriate.
Seek support. If distress continues, contact services such as Kids Helpline or your school’s wellbeing team.
Ongoing protection
Keep the conversation going. Check in regularly about what your children are seeing online.
Build digital resilience. Practise critical thinking and coping strategies with your child (such as mute, report, block, step away, talk to a trusted adult).
Stay informed. Follow updates from eSafety and trusted child‑safety organisations.
How schools can help
Most exposure to gore content happens outside school, but the impact often shows up in the playground and classroom. This is how schools can respond quickly and support recovery:
Include digital literacy education. Teach students about unsafe, untrue and disturbing content and strategies for handling exposure.
Teach digital resilience early. Embed online safety, emotional regulation and critical thinking. Show students how to respond, and who to report to, if they see harmful content.
Create safe reporting pathways. Make it easy for children to report their concerns. When they speak up, respond with care and support.
Provide trauma‑aware support. Offer check‑ins, counselling and short‑term adjustments to school workloads for students exposed to gore content.
Train staff. Help teachers support trauma-affected students and stay ahead of emerging online risks with calm, practical strategies from eSafety professional learning webinars.
Engage families. Share information about emerging risks and practical steps at home via newsletters and other communication channels, and suggest eSafety parent webinars.
Review device protections. Ensure school‑issued/BYOD devices have appropriate settings and filters without blocking access to learning.
eSafety has many resources to help schools create safer online environments.
What tech platforms should do about gore content
Online platforms have the tools to stop gore from spreading among children and young people. The problem is not just how fast content spreads, it’s also how platform algorithms push it further, especially on social media.
Algorithms reward engagement, even when that engagement is driven by shock, fear, or outrage.
Priorities for platforms should include the following:
- Act swiftly to prevent the upload and spread of gore content by combining advanced technology, such as AI, with human oversight.
- Adjust algorithms to deprioritise extreme imagery and dial down the amplification of gore content in recommender systems to prevent involuntary exposure.
- Implement stricter age checks and restrict autoplay features, ensuring children are protected from gore content.
- Make it simple for users to report gore content, with these reports triggering thorough, timely reviews instead of automated responses.
- Be transparent about the amount of gore content removed and the speed of responses by regularly sharing relevant data with the public.
- Deploy content warnings and blurring functionality and consult experts to design safer experiences for all users.
- Collaborate with regulators around the world to block gore content and prevent its spread across platforms.
Helpful resources
eSafety – Distressing or violent content – How to get help
eSafety – Disturbing content – advice for young people
eSafety – What is illegal and restricted online content?
eSafety – Parent guide to online safety
Kids Helpline – Dealing with trauma
Step Together – Encountering violent extremism