Roblox is facing renewed scrutiny as it attempts to address long-standing concerns about child safety on its platform. Following years of reports involving child predators and mounting pressure from both the public and government bodies, the company announced a new age-verification framework on 18 November. The system aims to protect younger users, but early reactions suggest it also foreshadows a more cumbersome digital future for many online platforms.
In a detailed blog post, Roblox outlined two key elements of its new safety initiative: a voluntary Facial Age Estimation system and optional ID verification. Users can technically opt out of both, but until they provide the necessary information they lose access to essential features such as basic text chat. Once verified, players are placed into one of six age groups: under 9, 9–12, 13–15, 16–17, 18–20, or 21 and older. Interactions are then limited to people within the same age range.
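Roblox has not published how the bracketing or chat gating is implemented, so the sketch below is purely illustrative: it assumes a simple lookup from a verified or estimated age to the brackets named in the blog post, with cross-bracket chat allowed only for a Trusted Connection. All function and variable names are hypothetical.

```python
# Illustrative sketch only; Roblox's actual implementation is not public.
# Maps an age to the brackets named in the announcement and gates chat
# to same-bracket users unless a Trusted Connection exists.

AGE_BRACKETS = [
    (0, 8, "under 9"),
    (9, 12, "9-12"),
    (13, 15, "13-15"),
    (16, 17, "16-17"),
    (18, 20, "18-20"),
    (21, 200, "21+"),  # upper bound is arbitrary for this sketch
]

def bracket_for(age: int) -> str:
    """Return the label of the bracket containing the given age."""
    for low, high, label in AGE_BRACKETS:
        if low <= age <= high:
            return label
    raise ValueError(f"age out of range: {age}")

def can_chat(age_a: int, age_b: int, trusted_connection: bool = False) -> bool:
    """Same-bracket users can chat; cross-bracket chat needs a Trusted Connection."""
    return bracket_for(age_a) == bracket_for(age_b) or trusted_connection

# A verified 13- and 15-year-old share a bracket; a 14- and 19-year-old
# do not, and can only chat if they have confirmed a real-world relationship.
assert can_chat(13, 15) is True
assert can_chat(14, 19) is False
assert can_chat(14, 19, trusted_connection=True) is True
```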
Those wanting to interact across age groups must use the Trusted Connections feature. This system is intended to support communication only between users who have a confirmed real-world relationship. Roblox plans to enforce this through QR codes, although it says it is exploring additional methods to ensure that players can continue engaging with individuals they know offline.
New verification measures and emerging concerns
The Facial Age Estimation tool runs inside the Roblox mobile app using the device’s camera, and the resulting scan is processed by an external company called Persona. According to Persona’s website, user data is deleted immediately and never sold. Persona also handles ID checks, though Roblox notes that ID verification requirements may vary depending on local regulations. In many cases, the process will likely resemble systems such as ID.me, which require a photo of a government-issued document.
Roblox will expand these measures early next year by monitoring all chat, restricting the sharing of social media links, and offering more tools to block or report other users. While these efforts are designed to create a safer environment, many fans already find them intrusive. Reports have surfaced that selfie verification is assigning incorrect ages, creating hurdles for users who must then provide identification to correct the mistakes.
One Reddit user expressed frustration by stating, “I AM A GROWN WOMAN, AND APPARENTLY I’M 12.” Another user commented, “My friend’s brother, who is 14, got passed as 18+,” suggesting that the system could still allow minors to interact with adults despite the new rules. An X user, @Stargenix19, warned, “Whoever has a baby face, plz be aware that Roblox might estimate your age lower than your real age & you have to use your ID as extra proof.”
Roblox’s challenge is significant. Past incidents involving predators have brought legal action and widespread criticism from parents and players who feel the platform has not done enough. The new verification measures attempt to address these concerns but introduce their own complications.
A wider shift toward real-world identity checks
Roblox argues that its approach aligns with the industry’s direction. The company stated that systems such as age verification are “establishing what we believe will become a new industry standard,” and encouraged other platforms to adopt similar methods. “We invite others in the industry to join us and follow our comprehensive approach to help make the entire online world safer for children and teens,” the blog post stated.
This shift is not happening in isolation. Discord recently launched similar verification checks in some regions, while Australia is preparing requirements for major tech platforms like Google to verify users’ ages. Apple has introduced its Digital ID system within the Wallet app, and several US states, including Illinois, have begun rolling out digital identification options. For years, companies such as Blizzard have asked users to submit ID for specific account issues, suggesting that the trend is not new, merely accelerating.
However, critics point out that even if companies promise to delete or protect personal data, breaches remain a risk. Discord suffered a security breach within a year of introducing verification systems. Apple’s strict privacy stance does not prevent law enforcement from pressuring users to unlock devices. Government agencies also share biometric data despite assurances that such information is not used for surveillance. These concerns stretch across industries, including gaming platforms and adult websites, which now face political pressure to enforce age checks.
The rapid spread of verification requirements raises questions about how such systems will function in a complex real-world landscape. The lack of clear legislation adds to users’ anxiety about sharing sensitive data for everyday online activities. Many argue that providing identification to play a simple game with a friend should not require the same level of scrutiny as passing through airport security.
For now, Roblox users may continue to resist these changes. Yet what is happening on Roblox, and increasingly on other platforms such as Steam and Google, is likely to become a standard part of digital life.