Roblox is facing scrutiny from concerned parents as it rolls out an expanded age-verification system designed to enhance child safety on its popular gaming platform. The company, which boasts 144 million daily users worldwide, is implementing new account categories—Roblox Kids and Roblox Select—that tailor content and communication features based on estimated user age.
However, some parents report that the system has incorrectly classified their children as adults, potentially exposing them to less-protected versions of the service. Matt Kaufman, Roblox's chief safety officer, defended the technology in an interview, stating that the facial analysis system estimates age "within about 1.4 years, plus or minus" for users under 18.
"When you ask users that simple question about their age," Kaufman explained, "they're going to tell you whatever they want to tell you in order to get access."
The new system represents a significant expansion of Roblox's existing safety measures. Roblox Kids accounts, intended for younger children, feature simplified interfaces with no communication tools and access only to curated games. Roblox Select, for users aged nine to 15, allows limited communication and a broader but still restricted content library. Users who don't complete age checks will be restricted to children's content and barred from platform communication.
Kaufman acknowledged that errors can occur but suggested many complaints arise when parents complete age checks on behalf of their children or misunderstand the process. The company offers options to reset age checks, submit appeals, or use ID verification to correct errors, and users may be prompted to re-verify if their behavior appears inconsistent with their estimated age.
Despite these measures, concerns persist. Professor Sonia Livingstone of the London School of Economics noted that while Roblox's response was "encouraging," there is still "mounting evidence its platform continues to pose real risks to children's safety."
"Parents deserve independent confirmation that the moderation is sufficient, that help systems are effective, and that age checks aren't used for commercial profiling," Livingstone emphasized.
The changes come amid growing global pressure on tech firms to protect children online. In the UK, platforms face new duties under the Online Safety Act, while several countries have introduced restrictions or proposals to limit social media use for under-16s.
Roblox's safety approach relies on multiple signals to determine which experiences are available to younger users, including how long a game has been on the platform and its developers' track record. More than two million developers create content for the platform, though games with social or free-form elements won't be available by default on Kids and Select accounts.
Kaufman maintained that Roblox is "going above and beyond what any other gaming platform is doing" regarding child safety, while CEO Dave Baszucki has previously stated that parents should ultimately decide whether they're comfortable letting their children use the platform.