The UK’s data watchdog is seeking clarity from Mark Zuckerberg’s Meta over parental controls on its popular virtual reality headset, as campaigners warn that the device may breach an online child safety code. The Information Commissioner’s Office said it was considering further discussions with the owner of Facebook and Instagram about the £300 Oculus Quest 2 system, which was a popular Christmas gift. Child safety experts have warned that the headset’s lack of parental controls – settings that would allow parents to block content that could be harmful to children – exposes younger users to the risk of abuse on the platform.
According to research conducted by the Center for Countering Digital Hate (CCDH), a campaign group, there have been a number of instances of abuse on VRChat, a social app popular among Oculus users. In one case, an adolescent’s avatar – the digital representation used by people on virtual reality platforms – was approached by two males breathing closely, while in another a man joked in front of an under-18 that he was a “convicted sex offender”.
The ICO said it would contact Meta to ask about the system’s compliance with the age-appropriate design code, also known as the children’s code, which stipulates that “the best interests of the child should be a primary consideration” for online services likely to be accessed by those under the age of 18.
According to an ICO spokeswoman, “online services and products that utilize personal data and are likely to be accessed by children are expected to comply with the standards of our children’s code.”
“We look forward to continuing our conversations with Meta about its children’s privacy and data protection by design approaches to Oculus products and virtual reality services. Parents and children who are concerned about how their data is handled can raise a complaint with the ICO.”
The code aims to prevent websites and apps from misusing children’s data, and it also applies to “connected devices”, although it does not regulate content. A breach of the code could result in a fine of up to £17.5 million or 4% of an organisation’s global turnover – which in the case of Meta could be about £8 billion – although formal warnings and reprimands are also possible.
“The concerns regarding the Oculus VR headset highlight why we need to see ‘safety by design’ as the norm for new technology,” said Beeban Kidron, the crossbench peer who championed the children’s code. “Kids who use virtual reality headsets like Oculus can access chatrooms and other potentially dangerous services simply by ticking a box stating that they are of legal age. This is insufficient to prevent minors from accessing services that are known to contain child abuse, harassment, racism and pornography.” Andy Burrows, the NSPCC’s head of child safety online policy, said there were “substantive” concerns about Meta’s adherence to the children’s code.
Burrows said the CCDH investigation also raised concerns about Zuckerberg’s ambitions for the “Metaverse”, a catch-all term for an immersive VR world in which people interact socially and professionally. “If this is the beginning of Mark Zuckerberg’s Metaverse, it suggests that he isn’t committed to building it safely from the outset, and that important lessons have yet to be learned,” Burrows said.
“We’re committed to honoring our obligations under the code and offering age-appropriate experiences for young people,” a Meta spokesperson said, noting that under-13s were not permitted to create accounts or use the system under the Oculus terms of service.
The spokesperson added that Meta was also committed to developing the Metaverse responsibly and had already announced a $50 million (£37 million) funding programme to ensure the concept met regulatory and legal requirements, distributing the funds to organisations and educational institutions such as Seoul National University and Women in Immersive Tech.