
Drifting into an ‘echo chamber’? Take control of the algorithms that shape your feed

Every time you go online, algorithms influence what you see. They can gradually push you into an ‘echo chamber’ of narrow content. But you have the power to stop the drift – and break free if you’re feeling stuck.

In short:

  • Algorithms and recommender systems decide what you see online based on what you do, who you are, and who you engage with.
  • They’re designed to keep you on the platform for as long as possible, which means it can be hard to stop scrolling.
  • Over time, the algorithms can push you towards an ‘echo chamber’, where your feed becomes narrow, repetitive or more extreme without you realising. This can give you an unbalanced view of what’s ‘normal’ or what most people think.
  • Understanding how these systems work helps you spot warning signs, so you can rebalance your content and protect your wellbeing.
  • You can stay in control of your feed by mixing up your viewing habits, using in-app tools to hide certain types of content or see less of them, and applying critical thinking skills.

What are algorithms and recommender systems?

Whether you’re watching videos, browsing websites, scrolling through socials, gaming, or talking to an AI chatbot, the content you see online and in your feed is not chosen at random.

Behind almost every site, app or platform are invisible systems that decide what to suggest, what to show you, and sometimes what to hide. Together, these algorithms and recommender systems quietly shape what you see and engage with, influencing how you think, feel and behave.

Algorithms are the rules or instructions that help platforms sort and organise huge amounts of content. They look at your activity – like what you watch, search for, follow, pause on or click – to work out what to show you. For example, you like one puppy video and your feed is suddenly full of them.

Recommender systems use algorithms, and other information like your age, gender and location, to decide what content you might like to see or interact with next. Platforms can also program recommender systems to suggest posts, videos and ads that help them meet business goals like growing the number of users or making more money from advertising.
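The mechanics described above can be sketched in a few lines of Python. This is a toy illustration under simplifying assumptions – a real recommender uses machine-learned models over many signals, not a simple topic count – but it shows the core idea: content matching what you’ve recently engaged with gets ranked higher.

```python
# Toy sketch (not any real platform's code) of engagement-based ranking:
# score each post by how often its topic appears in your recent activity,
# then show the highest-scoring posts first.

from collections import Counter

def rank_feed(posts, engagement_history):
    """posts: list of (post_id, topic) pairs.
    engagement_history: topics the user recently watched, liked or paused on."""
    interest = Counter(engagement_history)            # more engagement = more weight
    scored = [(interest[topic], post_id) for post_id, topic in posts]
    scored.sort(reverse=True)                         # highest-interest topics first
    return [post_id for _, post_id in scored]

# You like one puppy video a few times...
history = ["puppies", "puppies", "gym", "puppies"]
posts = [("p1", "news"), ("p2", "puppies"), ("p3", "gym"), ("p4", "puppies")]
print(rank_feed(posts, history))                      # puppy posts rise to the top
```

Note how posts on topics you never engaged with (like news) sink to the bottom, even though nothing about their quality or accuracy was considered.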

These systems are built into almost everything you use online, including:

  • social media apps (‘for you’ feed, videos, stories, friend suggestions)
  • streaming services like Netflix (recommended movies and TV shows)
  • music apps like Spotify (curated playlists and podcast suggestions)
  • gaming platforms (new contacts, ads, curated content)
  • online stores like Amazon (product recommendations and personalised search results)
  • search engines and news sites (ranked results and ‘top stories’)
  • AI chatbots (advice about what’s ‘best’ for you).

How they shape your online experience

Algorithms and recommender systems learn from what you do online and boost content you react to or spend time on. This can make your feed feel more fun and personal – like making it easier to discover music, games, creators, and information or advice you’re interested in. But it can also become harmful.

These systems are usually designed to keep you engaged, not to support your health and wellbeing. This means they often promote attention-grabbing content. Then once you’ve shown interest by clicking, liking, commenting or even just pausing for longer than usual, they serve you more of the same type of content – without considering how accurate it is or how it makes you feel.

Combined with features like constant notifications, disappearing content and infinite scrolling (no page end), these systems can make it hard to take a break. Over time, they can also narrow your feed and push you towards more emotional, shocking or extreme content, increasing your risk of ending up in an ‘echo chamber’.
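The narrowing effect described above can be simulated as a simple feedback loop. This is a hypothetical sketch, not any platform’s real algorithm: each round, the topic you engage with gets a boost, and everything else shrinks to make room.

```python
# Toy simulation of the engagement feedback loop: the topic the user keeps
# engaging with is boosted each round, so the feed narrows over time.

def narrow_feed(weights, engaged_topic, rounds=5, boost=2.0):
    """weights: dict of topic -> share of the feed (shares sum to 1).
    Each round, the engaged topic's weight is boosted and all weights
    are renormalised so they still sum to 1."""
    for _ in range(rounds):
        weights[engaged_topic] *= boost
        total = sum(weights.values())
        weights = {t: w / total for t, w in weights.items()}
    return weights

# A balanced feed to start with...
feed = {"sport": 0.25, "music": 0.25, "news": 0.25, "gym": 0.25}
final = narrow_feed(feed, "gym")
print({t: round(w, 2) for t, w in final.items()})
```

After only five rounds, ‘gym’ content takes up over 90% of the feed, even though it started as an equal quarter – which is why a few clicks on one theme can reshape what you see so quickly.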

What is an echo chamber?

An echo chamber is like being in a room where you mostly hear things that back up what you already think, or what others in the room are saying. This can make certain ideas feel more common, normal or acceptable than they really are, while cutting you off from other points of view. People in echo chambers don’t just ignore outside opinions – they often put them down or reject them, and may bully or threaten others for having opposing views.

As a result, echo chambers can promote harmful ideas and stereotypes, including:

  • bias and discrimination against someone because of their gender (sexism)
  • hatred towards women (misogyny)
  • dislike of LGBTIQ+ people (homophobia/transphobia)
  • racism and fear of ‘outsiders’ (xenophobia)
  • normalising violent porn and other disturbing content
  • radical political, personal or social views
  • dangerous conspiracy theories
  • negative body image and unhealthy eating.

How this can play out for young people

When you’re young, there’s a greater risk of being affected by algorithms and echo chambers because you’re still working out who you are, what you believe, and where you fit in.

So when your friends follow the same creators, trends or opinions, it can feel easier to go along with them, especially if you’re worried about being teased or left out. This social pressure, combined with the effect of algorithms, can pull you deeper into content loops or spaces even when things feel ‘off’.

Then your thoughts and attitudes can gradually become more extreme without you realising it – affecting how you see yourself, other people and the world.

What to look out for

Because of the way algorithms work, anyone can get caught in an echo chamber – it’s not your fault. What might start as experimenting with ideas or doomscrolling negative content on a bad day can send your feed out of balance.

The best way to protect yourself is to recognise the warning signs of echo chambers early. Here are some of the ways an echo chamber can take shape:

For example, you watch a video criticising the government’s vaccine advice and scroll the comments, then gradually your feed fills with anti-science conspiracies.

Algorithms often promote narrow ideas about what young men and women ‘should’ be like because these themes get a lot of engagement.

  • Girls often get pushed towards makeup, ‘glow-ups’, ‘trad wife’ roles, cleaning hacks, beauty rules and how to be desirable.
  • Boys often get pushed towards toughness, muscles, dominance, money, dating ‘rules’, or content that mocks women or emotions.

You can start to believe these roles are expected of you – or that everyone else thinks this way – even when it’s not true. This can leave you feeling pressured, judged, or like you’re ‘failing’ at being who you’re ‘supposed’ to be.

  • Watching self-improvement or confidence videos: some algorithms push creators with strict ideas about relationships, gender roles or success – sometimes slipping into sexist or homophobic messages.
  • Engaging with political opinion posts: reading or watching a few strong opinions can lead to more one-sided content being pushed into your feed. This can promote biased views, or frame certain groups as threats – making unfair ideas feel more common or justified than they really are.
  • Looking up makeup or beauty tips: recommended content becomes more extreme, like comparison videos, pressure to be ‘perfect’, or creators pushing that girls should focus on looks over school, hobbies or goals.
  • Searching for gym workouts: you may start seeing unrealistic male body ideals, extreme dieting, or ‘grind harder’ content that shames rest or normal body types.
  • Being into gaming or streaming: gaming forums often recommend channels where jokes turn into hate, stereotypes or ‘us vs them’ thinking. Over time it can make bullying, harassment or extreme views seem normal.

These shifts usually happen slowly, which makes them harder to notice. That’s why it’s important to look out for them early.

Specific risks for young men and boys

Depending on the platform’s programming, recommender systems can quickly pick up on themes like fitness, dating, success, or confidence and start promoting more extreme content. This can lead to narrow ideas of masculinity and drag you into the ‘manosphere’ with unbalanced views about girls and women.

This doesn’t happen because there’s something wrong with you. It happens because the system is trying to keep your attention.

Here’s how things can escalate:

Many boys first encounter harmful content like ‘looksmaxxing’ videos after exploring content about improving their appearance. It can start as:

  • gym routines
  • skin tips
  • posture or style videos.

But the algorithm may gradually push:

  • extreme dieting or ‘starvation’ cuts
  • obsession with jawlines, height or bone structure
  • unsafe ‘hacks’ like bone smashing
  • constant self-criticism or comparing yourself to influencers.

This can fuel anxiety, low self-esteem, body image issues and unhealthy habits.

You might click on a few videos about confidence or dating advice. Soon your feed is full of:

  • ‘real men don’t show emotion’
  • ‘women must be dominated’
  • ‘alpha vs beta’ talk
  • creators mocking girls, LGBTIQ+ people or ‘weak men’.

At first, it might seem like harmless humour or ‘straight talk’, but over time it pushes a view that being a man means being tough, angry or superior – and that girls and women are inferior.

Some boys and young men get drawn into what’s called the ‘manosphere’ – a group of influencers, forums and communities that promote harmful ideas about masculinity. They often blame women, feminism or society for men’s struggles and encourage unhealthy ideas about relationships, status and power.

You might be in a manosphere echo chamber if:

  • most content blames women or feminism for men’s problems
  • you hear terms like ‘Stacey’, ‘Chad’, ‘red pill’ or ‘black pill’
  • humour is used to shame or degrade women
  • violence or humiliation is framed as taking control
  • extreme opinions are treated like ‘facts’
  • you feel pressure to agree or risk being excluded.

These spaces can feel empowering at first – like someone finally ‘gets it’ – but they often promote anger, hopelessness, distrust, and violence.

If you’re seeing content that promotes harming women or yourself, leave the site or app and unfollow the account.

It may help to talk with a free, confidential support service like Kids Helpline – these are available 24/7.

In our 2024 research [1], some young men told us how their views are influenced online.

'[Male influencers] just don’t necessarily represent what being a man is all about... what you should be striving to be as a person, you know? [I]t’s not promoting emotional kind of sensitivity or an understanding of... your mental health.' – Felix (20)

[1] Find out more in the research report, Being a young man online.

How to take back control of what you see online

Small steps can make a big difference.

Use critical thinking

Ask yourself:

  • Is this fact or opinion?
  • Who benefits if I believe this?
  • Is this creator trying to get clicks or money?
  • Are there other sides to this story?

You can find more tips on our critical thinking page.

Actively look for different viewpoints

Algorithms follow your behaviour, so give them something new to learn from:

  • Follow creators with different backgrounds or ideas.
  • Search topics using neutral words.
  • Use trusted, evidence-based news sources.

You don’t need to agree with everything – you just need a variety of information.

Adjust your feed

These steps help ‘reset’ what you’re shown:

  • Clear your watch or search history.
  • Tap ‘not interested’ or ‘see less of this’.
  • Unfollow or mute accounts that stress you out.

Visit The eSafety Guide to find out more about using your settings on different apps and platforms.

Set boundaries online

  • If you’re seeing content that’s violent, or encourages you to harm someone else or yourself, close it and unfollow the site. Report it to the platform or to eSafety.
  • Track your screen time and take breaks (aim for a short break every 30 minutes).
  • Turn off notifications or put your phone on ‘do not disturb’ sometimes, to reduce distractions.
  • Spend time offline with people you trust.

Find out more about balancing your time online.

Stay open-minded and kind

  • Listen when someone challenges what you’ve shared.
  • It’s OK to change your mind.
  • It’s also OK to respectfully disagree.

Something has happened

Look for new content.
Most platforms use recommender systems that guess what you want to see next, based on your past activity. You may need to actively search for different content to rebalance your feed.

Use in-app functions.
Report, mute or block content you don’t want to see. Many apps let you choose ‘See less of this type of content’.

Take a break from being online.
Set time limits and balance online and offline activities.

Get support.
If what you’re seeing affects your daily life, talk to someone you trust or contact Kids Helpline.

Have an open conversation.
If you’re worried about someone else, share alternative perspectives and ask them where their views are coming from.

Check the source.
Research the creators they follow – some are paid to get reactions or push certain ideas.

Protect your wellbeing.
Don’t feel pressured to argue online. Use ‘see less of this content’ or take a break if you need to.

If something goes wrong

An important step is reminding yourself that it’s OK to reach out for help instead of trying to cope on your own. You’re never alone.

If anything makes you feel confused, stressed, pressured, uncomfortable or unsafe online, you can:

  • talk to a trusted adult – someone like a parent, auntie, teacher, coach or counsellor
  • use problem solving steps or critical thinking to figure out what’s happening and what your options are
  • reach out to support services like Kids Helpline (for ages 5 to 25) or MensLine Australia – they’re free, confidential and available 24/7
  • report harmful or illegal content to the platform or to eSafety – check I need help: Something has happened online
  • use your privacy and safety settings to mute comments and limit contact from anyone who pressures or scares you.

More useful resources

Countering violent extremism

Developing resilience