Meta is opening up its Horizon Worlds online VR platform to younger users, specifically those aged 10 to 12, with structured parental oversight: parents manage these accounts and control what their children can explore in the virtual space.
In a recent announcement, Meta revealed that parents will soon have the ability to approve various VR experiences that are deemed suitable for preteens. Among these are interactive environments like The Space Station and The Aquarium, along with a racing game called Spy School. Kids can express interest in specific worlds, prompting their parents to grant access, or parents can proactively select which worlds their children can explore.
To bolster safety, Meta has rolled out additional protective measures. A new rating system categorizes VR spaces as 10+, 13+, or 18+, clearly indicating which environments suit different age groups. Parents can approve all 10+ content at once, while 18+ worlds are hidden from preteens entirely. Additional precautions include disabling follower suggestions and defaulting preteens’ status and visibility to “offline” unless parents change it.
One significant feature is the persistent “Personal Boundary” setting, which maintains a two-foot safe zone around an avatar that other users can’t breach.
This update follows Meta’s earlier steps towards safety — introducing parent approvals for contacts their children can interact with in VR and requiring Meta Quest 2 or 3 users to verify their birthdate before using the headset.
Parent-managed accounts for preteens have been available since June 2023, aiming to create a safer virtual experience. Despite these efforts, however, concerns linger among parents: past allegations suggest Meta hasn’t always succeeded in protecting young users, fueling skepticism about the effectiveness of the new measures.
Earlier this year, internal documents revealed in a lawsuit by the New Mexico Department of Justice showed that Meta intentionally promoted its messaging platforms to minors, despite knowing about inappropriate exchanges between adults and children. Furthermore, a lawsuit from 42 state attorneys general accuses Meta of designing its products to captivate children, subsequently impacting their mental health adversely.