Imagine a powerful new technology that takes children back in time to experience the sights, sounds and sensations of ancient Rome or, conversely, subjects them to horrific new forms of predatory abuse.
The metaverse could be both, offering potential for enormous good alongside great harm. It all depends on the architecture we create to enable and govern it.
No one knows yet exactly what that will be. If the major platforms achieve a degree of interoperability, the metaverse could become one extended 3D world. Or it could look more like a multiverse with a range of “walled garden” offerings.
What we do know is that a range of tech-enabled, hyper-realistic and high-sensory experiences are coming together to form a totally new environment tipped to be worth $800 billion by 2024.
For all its novelty, though, the metaverse presents business with an oddly familiar choice: pursue fast profits now and fix the damage later? Or work to build sustainable enterprises that promote safety and wellbeing in tandem with growth?
Recognising this dilemma, the World Economic Forum (WEF) recently launched a new initiative, Defining and Building the Metaverse, to support more ethical outcomes.
As the world’s first regulatory authority established to tackle existing and future online harms, Australia’s eSafety Commissioner has a seat at this table, helping to shape governance of the coming extended reality world alongside 60 of the world’s largest tech providers.
We also look forward to joining the Responsible Metaverse Alliance, an Australian-led movement instigated by Dr. Catriona Wallace of the Gradient Institute. This new advocacy network brings together artists, activists, academics, entrepreneurs and government to promote a safer, more transparent and more human-focused metaverse.
How should industry leaders respond? In retrospect, it is clear Big Tech faced a similar moment a generation ago at the dawn of Web 2.0. The emerging tech leaders of the day chose, as Mark Zuckerberg put it, to “move fast and break things”.
Unfortunately, among the things they broke were the lives of many people we at eSafety now seek to assist every day, not to mention much of the promise of Web 2.0 itself.
From the early 2000s, rapid growth in platforms leveraging new interactive possibilities created complex and often unforeseen consequences for society: a tech wreck that governments around the world are only now beginning to address.
Business has suffered too. Just recently, Zoom provided another example of what can go wrong when proper safeguards aren’t applied proactively, resulting in serious security, privacy and safety vulnerabilities.
The platform scaled spectacularly, growing from 10 million daily meeting participants in December 2019 to 300 million by April 2020. It then failed just as spectacularly to contain the resulting surge in “Zoom bombing”, where uninvited guests gate-crash meetings and expose participants to an array of inappropriate content.
Among those affected was a group of Queensland primary school children subjected to horrific videos they will never be able to unsee.
Reputational damage from events like these is difficult to measure, but the mass exodus of organisations to alternative platforms must surely have been felt.
One of the positive things about bad experiences, however, is that we can use them to avoid repeating past mistakes. And we are seeing companies like Zoom making great safety strides.
At eSafety we support this process through our Safety by Design initiative. The aim is to make safety a priority rather than an afterthought, assessing risk upfront and using innovation to bake safety protections in at the start.
There is no time to lose. As I write, companies are developing haptic suits that enable people to feel pain from experiences such as simulated gunshot wounds. It doesn’t take much imagination to see how that could be abused. We have already seen allegations of virtual rape reported in relation to prototype metaverse applications.
And while the metaverse will seem an enticing virtual playground to children, it will also be permeated with dark corners and private spaces. The Center for Countering Digital Hate found that users, including children, are exposed to abusive behaviour every seven minutes on VRChat, one of the most popular social apps accessed through Facebook’s virtual reality platform.
Unfortunately, many metaverse platforms have yet to build in functions that automatically restrict adult users from approaching children, or that proactively filter or block harmful experiences. When children are wearing VR headsets or AR glasses, parents will no longer be able to peer over their shoulders to see what they are experiencing in real time.
The Institution of Engineering and Technology recently highlighted that some VR apps today enable users to simulate intimate contact with school children, normalising this behaviour and desensitising users to various forms of sexual abuse, coercion and assault.
These issues are deeply concerning but also completely foreseeable. They are a continuation of harms we’ve seen manifest and proliferate in today’s online world. Hyper-realistic, potentially invasive experiences in the metaverse can only amplify the trauma they inflict.
And with history as a guide, we cannot assume all technology players across the ecosystem will apply a proactive safety lens.
Australia’s new Online Safety Act ensures that our interlocking regulatory powers reinforce their responsibility to do so. We are in the process of assisting industry to develop detailed mandatory codes, underpinned by Safety by Design principles.
Sitting alongside the codes are the Basic Online Safety Expectations, which platforms will be required to report against. These create a potent transparency function, helping eSafety surface systemic failures that contribute to user harm.
These tools will help us supplement and leverage the intelligence we have already gleaned in remediating serious online harms for our citizens. They will apply not only to today’s shortcomings but also to those of the coming metaverse and a decentralised Web 3.0 world.
However, companies can also choose of their own volition to make safety an integral part of architecting the metaverse. Doing so is an act of enlightened self-interest.
It requires innovation, investment, advanced technology and creative mindsets, but these are all attributes the tech industry has in spades. And a safer, more civil and positive metaverse will lead to better user retention, plus more opportunity for further innovation.
Let’s stop breaking things. The new tech catchcry needs to become, “First, do no harm”.