The misogyny of the metaverse: is Mark Zuckerberg’s dream world a no-go area for women?


Everybody knows that young women are not safe. They are not safe in the street, where 86% of those aged 18 to 24 have experienced sexual harassment. They are not safe at school, where 79% of young people told Ofsted that sexual assault was common in their friendship groups and almost a third of 16- to 18-year-old girls report experiencing “unwanted sexual touching”. They are not safe in swimming pools or parks, or at the beach. They are not even safe online, with the children’s safety charity the NSPCC reporting that social media sites are “failing to protect girls from harm at every stage”.

This will come as no surprise to any woman who has ever used social media. But it is particularly relevant as Meta, the operator of some of the biggest social platforms on the internet, is busily engaged in constructing a whole new world. The company is pumping billions of dollars a year into building its metaverse, a virtual world that it hopes will become the future not just of socialising, but of education, business, shopping and live events. This raises a simple question: if Meta has utterly failed to keep women and girls safe in its existing online spaces, why should we trust it with the future?

Mark Zuckerberg has grandly promised: “In the metaverse, you’ll be able to do almost anything you can imagine.” It’s the sort of promise that might sound intensely appealing to some men and terrifying to most women.

Indeed, the deeply immersive nature of the metaverse will make the harassment and abuse so many of us endure daily in text-based form on social media feel 100 times more real and will simultaneously make moderation 100 times more difficult. The result is a perfect storm. And I am speaking from experience, not idly speculating: I spent days in the metaverse researching my book, The New Age of Sexism.

There is no single definition of the metaverse, but most people use the term to describe a shared world in which virtual and augmented technologies allow users (represented by avatars) to interact with people, objects and environments. Most of Meta’s virtual world is accessible only to those who pay for the company’s Quest headsets, but a limited number of metaverse spaces can be accessed by any device connected to the internet. Advanced technology such as 3D positional audio, hand tracking and haptic feedback (vibrations from the controllers that coincide with the actions you take) combines to make virtual worlds feel real. Your avatar moves, speaks and gestures when you do, allowing users to interact verbally and physically.

Less than two hours after I first entered the metaverse, I saw a woman’s avatar being sexually assaulted. When I approached her to ask her about the experience, she confirmed: “He came up to me and grabbed my ass.”

“Does that happen a lot?” I asked.

“All the time,” she replied, wearily.

I used my haptic controller to “pick up” a bright-yellow marker and moved towards a giant blackboard. “HAVE YOU BEEN ASSAULTED IN THE METAVERSE?” I wrote.

The response was near instantaneous. “Yeah, many times,” someone shouted.

“I think everybody’s been assaulted in the damn metaverse,” one woman replied immediately, in a US accent.

“Unfortunately, it is too common,” a British woman added, nodding.

Both women told me they had been assaulted multiple times.

During my time in the metaverse, sexual harassment and unwanted sexual comments were almost constant. I heard one player shout at another: “I’m dragging my balls all over your mother’s face,” and witnessed male players making claims about “beating off”, as well as comments about “gang bangs”. My virtual breasts were commented on repeatedly. I did not witness any action taken in response – whether by a moderator or by another player.

A damning TechCrunch report from 2022 found that human moderators were available only in the main plaza of Meta’s metaverse game Horizon Worlds – and that they seemed more engaged in giving information on how to take a selfie than moderating user behaviour.

More worryingly still, I visited worlds where I saw what appeared to be young children frequently experiencing attention from adult men they did not know. In one virtual karaoke-style club, the bodies of the singers on stage were those of young women in their early 20s. But based on their voices, I would estimate that many of the girls behind the avatars were perhaps nine or 10 years old. Conversely, the voices of the men commenting on them from the audience, shouting out to them and following them offstage were often unmistakably those of adults.

It is particularly incumbent on Meta to solve this problem. Of course, there are other companies, from Roblox to Microsoft, building user-generated virtual-reality gaming platforms and virtual co-working spaces. But NSPCC research found that 150 apps, games and websites were used to groom children online between 2017 and 2023 – and, where the means of communication was known, 47% of online grooming offences took place on products owned by Meta.

These are not isolated incidents or cherry-picked horror stories. Research by the Center for Countering Digital Hate (CCDH) found that users were exposed to abusive behaviour every seven minutes in the metaverse. During 11 and a half hours of recorded user behaviour, the report identified 100 potential violations of Meta’s policies. These included graphic sexual content, bullying, abuse, grooming and threats of violence.

A woman wearing VR goggles … Meta’s virtual world has been plagued with reports of sexual harassment and abuse. Photograph: AntonioSolano/Getty Images

In a separate report, the CCDH found repeated instances of children being subjected to sexually explicit abuse and harassment, including an adult asking a young user: “Do you have a cock in your mouth?” and another adult shouting: “I don’t want to cum on you,” to a group of underage girls who explicitly told him they were minors.

Since its inception, Meta’s virtual world has been plagued with reports of abuse. Users have reported being virtually groped, assaulted and raped. Researchers have also described being virtually stalked in the metaverse by other players, who tail them insistently, refuse to leave them alone and even follow them into different rooms or worlds.

In December 2021, a beta tester of the metaverse wrote in the official Facebook group of the Horizon platform: “Not only was I groped last night, but there were other people there who supported this behaviour.”

What was even more revealing than the virtual assault itself was Meta’s response. Vivek Sharma, then vice-president of Horizon at Meta, responded to the incident by telling the Verge it was “absolutely unfortunate”. After Meta reviewed the incident, he claimed, it determined that the beta tester didn’t use the safety features built into Horizon Worlds, including the ability to block someone from interacting with you. “That’s good feedback still for us because I want to make [the blocking feature] trivially easy and findable,” he continued.

This response was revealing. First, the euphemistic description of the event as “unfortunate”, which made it sound on a par with poor sound quality. Second, the immediate shifting of the blame and responsibility on to the person who experienced the abuse – “she should have been using certain tools to prevent it” – rather than an acknowledgment that it should have been prevented from happening in the first place. And, finally, most importantly, the description of a woman being abused online as “good feedback”.

Much subsequent discourse has focused on the question of whether or not a sexual assault or rape carried out in virtual reality should be described as such; whether it might have an impact on the victims similar to a real‑life assault. But this misses the point. First, it is worth noting that the experience of being sexually harassed, assaulted or raped in the metaverse has had a profound and distressing impact on many victims.

When it was revealed in 2024 that British police were investigating the virtual gang-rape of a girl below the age of 16 in the metaverse, a senior officer familiar with the case told the media: “This child experienced psychological trauma similar to that of someone who has been physically raped”.

Second, technology to make the metaverse feel physically real is developing at pace. You can already buy full-body suits that promise to “enhance your VR experience with elaborate haptic sensations”. They have sleeves, gloves and vests with dozens of different feedback points. Wearable haptic technology will bring the experience of being virtually assaulted much closer to the physical sensation of real-life victimisation. All the more reason to tackle it now, regardless of how “realistic” it is or isn’t, rather than waiting for things to get worse.

But most importantly, regardless of how similar to or different from physical offline harms these forms of abuse are, what matters is that they are abusive, distressing, intimidating, degrading and offensive and that they negatively affect victims. And, as we have already seen with social media, the proliferation of such abuse will prevent women and girls from being able to fully use and benefit from new forms of technology.

A 3D rendering of a metaverse city. Photograph: Kinwun/Getty Images

If Zuckerberg’s vision comes to fruition and the boardrooms, classrooms, operating theatres, lecture halls and meeting spaces of tomorrow exist in virtual reality, then closing those spaces off from women, girls and other marginalised groups, because of the tolerance of various forms of prejudice and abuse in the metaverse, will be devastating. If we allow this now, when the metaverse is (relatively speaking) in its infancy, we are baking inequality into the building blocks of this new world.

At the time of the aforementioned virtual-reality rape of an underage girl, Meta said in a statement: “The kind of behaviour described has no place on our platform, which is why for all users we have an automatic protection called personal boundary, which keeps people you don’t know a few feet away from you.”

In another incident, when a researcher experienced a virtual assault, Meta’s comment to the press was: “We want everyone using our services to have a good experience and easily find the tools that can help prevent situations like these and so we can investigate and take action.”

The focus always seems to be on users finding and switching on tools to prevent harassment or reporting abuse when it does happen. It is not on preventing abuse and taking serious action against abusers.

But in the CCDH research that identified 100 potential violations of Meta’s VR policies, just 51 of the incidents could be reported to Meta using a web form created by the platform for this purpose, because the platform refuses to examine policy violations if it cannot match them to a predefined category or username in its database.

Worse, not one of those 51 reports of policy violation (including sexual harassment and grooming of minors) was acknowledged by Meta, and as a result no action was taken. It’s not much good pointing to your complaints system as the solution to abuse if you don’t respond to complaints.

Meta’s safety features will no doubt continue to evolve and adapt – but, once again, in a repeat of what we have already seen happen on social media, women and girls will be the canaries in the coalmines, their abuse and suffering providing companies with useful data points with which to tweak their products and increase their profits. Teenage girls’ trauma: a convenient building material.

There is something incredibly depressing about all this. If we are really talking about reinventing the world here, couldn’t we push the boat out a little? Couldn’t we dare to dream of a virtual world in which those who so often face abuse are safe by design – with the prevention and eradication of abuse built in – instead of being tasked with the responsibility of protecting themselves when the abuse inevitably arises?

None of this is whining or asking too much. Don’t be fooled into thinking that we are all lucky to be using Meta’s tools for nothing. We are paying for them in the tracking and harvesting of our data, our content, our photographs, our ideas and, as the metaverse develops, our hand and even eye movements. All of it can be scraped and used to train enormously powerful AI tools and predictive behavioural algorithms, access to which can then be sold to companies at gargantuan prices to help them forecast how we as consumers behave. It is not an exaggeration to say that we already pay Meta a very high price for using its platforms. And if the metaverse really does become as widely adopted and as ubiquitous in the fundamental operation of our day-to-day lives as Zuckerberg hopes, there won’t be an easy way to opt out.

We can’t let tech companies off the hook because they claim the problem is too big or too unwieldy to tackle. We wouldn’t accept similar excuses for dodging regulation from international food companies, or real-life venues. And the government should be prepared to act in similar ways here, introducing regulation to require proven safety standards at the design stage, before products are rolled out to the public.

“Hold on, just building the future here,” Horizon Worlds tells me as I wait to access the metaverse. As we battle to eradicate the endemic harassment and abuse that women and girls face in real-world settings, the metaverse presents a risk of slipping backwards. We are sleepwalking into virtual spaces where men’s entitlement to women’s bodies is once again widespread and normalised with near total impunity.

The Guardian invited Meta to reply to this article, but the company did not respond.
