The Real Risks to Children in VR

January 31, 2023
[Image: Child using a VR headset with a digital network overlay, representing the real risks children face in virtual reality]

As part of our work protecting users online, ActiveFence’s Child Safety Researcher, Rafael Javier Hernández Sánchez, led a deep analysis and on-platform research of the threats to children from predators in the VR world.

Virtual reality (VR) games and platforms appeal to minors of all ages, and they have become attractive targets for predators seeking children to groom and abuse. As discussed in Trust & Safety in the Metaverse, VR is in its infancy. As a result, Trust & Safety teams do not yet fully understand the risks users face and have yet to develop effective approaches to mitigating them. In this article, we review key findings that provide essential insights for this new arena of Trust & Safety.

The Rise and Risks of Virtual Reality

Virtual reality spaces mimic the physical world.

  • Many VR platforms host massive multiplayer virtual worlds where users can interact through 3D avatars with friends and strangers, in public and private.
  • The created environments have the spatial attributes of physical locations, i.e., a nightclub with tables and seating booths, a bar with a main room, and a restroom. Just as in the physical world, there are also blind spots and hidden spaces in VR spaces.
  • The audio communications simulate reality, with “proximity chats” that allow users to hear each other better the closer their avatars are to one another. Interactions in one room are barely audible or utterly inaudible in the next.

However, as on other online platforms, users in VR exhibit reduced behavioral inhibitions. This is particularly dangerous because VR is immensely popular with young users, including significant numbers of under-18s.

The simulated-world features of VR platforms, combined with the naivety of young users, raise the risk that predators will take advantage and engage inappropriately with minors who lack the safeguards of their physical surroundings.

Trust & Safety teams tasked with maintaining child safety in VR face unique behavioral and technological challenges.

Minor Access to VR Spaces

Although VR platforms and hardware vendors follow the tech industry's minimum age requirement, restricting their services to users 13 and older, younger children abound. On one VR platform that allows users to create virtual worlds, we found underage users in every world despite the platform's 18+ age limit. Our evaluation of user voices and general behavior in VR lobbies suggests a high prevalence of accounts operated by users aged ten or younger. These children easily bypass age requirements by lying about their age or by using the pre-approved headsets of parents or older siblings.

Parents are often part of the problem. Unaware of the risks, many have taken to using VR technology as a babysitter. We have identified numerous online posts in which parents ask au pairs to spend time with their children inside VR worlds after the children return home from school.

The Risk to Children

Minors join VR lobbies without any oversight. Lacking caution, they run around, talk loudly, and approach adult users. Some adults leave, annoyed by the interaction, while others engage the minors in conversation.

Our investigations found that minors frequently share personal details within earshot of other players. They reveal their names, location, and other personal identifiable information.

Children's behavior in VR is no different from that seen on 2D gaming platforms. However, the VR medium amplifies the risk, as it is easier to reveal personal details when speaking "in person." This trusting disposition leaves young users open to abuse.

VR Child Sexual Exploitation

In VR, child predators are emboldened by the operational freedom that the anonymity of online interactions affords. They take advantage of new ways to reach minors and build relationships with them.

Some predators carry out child sexual exploitation by listening for the voices of minors. They encroach on these children's virtual personal space and "touch" and grope their avatars.

Others use this access as another method to groom minors. Our research identified predators seeking to convince children to participate in simulated sexual acts or erotic role play (ERP) in order to build sexual trust with them. Predators leverage this trust to move communications off-platform, where they request or demand sexually explicit self-produced recordings from the child. Child predators carry out this behavior for sexual gratification, while scammers use it to conduct sextortion against these minors.

The relative privacy afforded by the physical dimensions of VR spaces facilitates these inappropriate interactions. Threat actors "hide" their attempts to groom children even in public lobbies by approaching minors out of view of others and then taking them to hidden spaces. They also exploit public worlds, such as virtual nightclubs or theaters, that contain "private" rooms which can be entered when empty and then locked.

In addition to luring minors into private spaces, other predators bring vulnerable users into fully private worlds. The risks are reminiscent of physical world predator-child encounters, where adults offer minors rewards in exchange for accompanying them. There have been reported cases of groomers offering minors real money in exchange for entering their private VR worlds.

A Pathway to Offline Sexual Abuse

A broader concern is that VR can provide a pathway for off-platform child sexual exploitation. As mentioned above, minors have been groomed by adults in VR and coaxed into sending real-world pictures of themselves to their abusers.

There have already been cases of child predators who met children in VR and built exploitative relationships with them over time. Eventually, they visited the minors in person and sexually abused them. In one instance in 2020, a 36-year-old man met a 15-year-old girl on a VR platform. He groomed her and crossed state lines from Louisiana to Florida to meet her. He proceeded to live in her bedroom, unknown to her parents, and sexually abused her for four weeks before his arrest.

Our research into dark web child predator communities uncovered similar activity. We detected adult users describing how they were currently using VR platforms to speak and spend extended periods with underage boys. These men expected to gain a physical relationship with these minors.

A Paradigm Shift for Trust & Safety

VR is not only a technological revolution; it requires a paradigm shift for Trust & Safety. This shift is required because, when users put on a VR headset, they enter places carefully created to imitate reality, complete with many of its physical limitations.

Unlike 2D online spaces, where exchanges are textual and archived on the platform or in search engine records, VR allows children to find themselves locked in rooms with adults, where no one else can hear or see them. Child predators in VR leave no traceable footprint, as suspicious interactions are fleeting: an inappropriate spoken comment, question, or invitation leaves no digital evidence that traditional OSINT mechanisms can detect.

If Trust & Safety teams are to secure VR for minors, education is essential.

  • Users must understand the complexities of the worlds to which they have access.
    • They should understand why restrictions or supervision on minors are needed.
    • VR platforms should provide resources to users whose behavior suggests that they are at risk of harm and should ensure that their customers understand the safety features in place.
  • Trust & Safety teams should build an in-platform culture of communal responsibility, giving users the ability and security to intervene and report suspicious activity, especially activity involving minors. Fostering this collective ethos is fundamental: just as in the physical world, we need witnesses to identify inappropriate interactions.

Where to next?

To support these efforts, Trust & Safety operations must use off-platform intelligence, monitoring threat actor communities for mentions of VR in their discussions of illicit activities. This monitoring is crucial for identifying VR abuses, as it provides actionable insights and greater visibility into abusive trends.

For a sample view of ActiveFence’s work in VR, access our exclusive research into VR exploitation for child abuse, hate speech, and terrorist group promotion.
