Dark social may sound like the grim reaper is having conversations with your kids online, or anyone for that matter, but thankfully, it’s a little less ominous.
The terms 'clear web', 'deep web', 'dark web', 'dark social', and 'disappearing media' tend to be tossed around interchangeably, so we've created a handy little glossary below to help you understand the lingo.
What is Dark Social?
Dark social refers to the private features of a social networking service. This includes closed groups, Facebook Messenger or Twitter Direct Messages, messaging apps like WhatsApp and Snapchat, and in-game messaging features found in games like Minecraft and Roblox.
Think of it like this: you see a great-looking restaurant and send a website link about the venue to a friend, using Messenger. Because the link is shared privately, it can’t be tracked using web analytics (the tools that measure and analyse web traffic). Voilà, you’ve just shared using dark social.
While dark social is a term used predominantly in the marketing industry, it also matters from an online safety perspective: this type of communication offers real benefits, but it carries a number of risks as well.
Moving into the light
The term ‘dark social’ may sound a little menacing, and perhaps a better term for this communication is ‘hidden social’ — because, in reality, it’s simply communication that is hidden from public view.
The good news is that there are many benefits. Most notably, these services:
- are free and provide instant, easy-to-use ways to communicate with friends and family, similar to email and text messaging, and are often used for group chat;
- allow links to be shared without being tracked by platform analytics; and
- offer the ability to share information that is not for public consumption.
What are the risks?
While there are many benefits, there are also some real risks.
The hidden nature of these platforms makes it difficult to monitor and regulate the content that’s being shared between users. Dark social can also facilitate conversations with strangers, which could lead to grooming, or exposure to content that is inappropriate for younger children. Unfortunately, children and teens, who may be suspicious of an anonymous text on their mobile, or a call from an unknown number, might be more receptive to communication with a ‘random’ when using a messaging app, or via IM in a game.
In addition, it’s hard for parents and educators to keep an eye on content that’s being shared in this way, and there is a higher chance that prohibited content will pass through without the service being able to detect it.
On dark social channels, cyberbullying and image-based abuse are also potential risks, particularly when the abuse is shared instantly with a group of users.
How you can help
There are some simple approaches parents can use to help children and teens navigate their communications using hidden social.
- Teach them. Remember that when you hand a child a mobile phone or tablet, they need to know the safety rules. Just like you’d teach them road rules, be sure to teach them how to behave online. And, should they run into trouble, support them in working out how to deal with it, including where to find help.
- Help children to ‘read’ messages critically, paying attention to tone and meaning. By being savvy about how language is used they will be better able to recognise when other users are not behaving appropriately.
- Use technology tools. Check the privacy and safety settings for the services and apps they use and learn how to set these to suit you. Many of these services are currently improving their reporting functions for users, making it easier to report abuse.
- Talk. Have open conversations with your children about the devices and platforms they like to use. Join up yourself and learn the ins and outs so you’re well informed and can help guide them when they need.
What else you can do
Most messaging platforms offer blocking functions and have some reporting capabilities.
Remember, if your child is seriously cyberbullied, you can contact the social media platform and ask them to remove the content. If this is not resolved within 48 hours, follow it up with a complaint to the Office of the eSafety Commissioner, including the offending content and any communication from the service provider about your report.
Reporting abuse on ‘dark social’ services is not always straightforward, and encryption can make it harder for a service to access the content of messages. Find out how to report something that might be upsetting your child on services like WhatsApp, Facebook Messenger and Snapchat at Games, Apps and Social Networking.
If your child sees offensive or illegal content, guide them through it by talking about the sites they visit. Be aware of how they use the internet and discuss the sites and apps that are okay to explore and those that are not. You can teach your child strategies about how to deal with offensive material but remember to be vigilant, especially if your child is prone to taking risks or is emotionally or psychologically vulnerable.
If your child experiences image-based abuse, help them to report it, teach them how to collect evidence, and learn how to look after both yourself and your child.
For more, please visit esafety.gov.au.