Roblox has announced new account types aimed at ensuring age-appropriate access to chat and games for young users. The move builds on the company’s mandatory age verification for users wishing to engage in chat, a measure introduced in January; the same technology will be used to assign the new account types.
Under the new structure, users aged five to nine will be categorised under a “Roblox Kids” account, while those aged nine to 15 will fall under the “Roblox Select” account type. Standard Roblox accounts will be available for users aged 16 and above. Notably, only users aged 18 and older will have access to “Restricted Content,” which includes material with strong violence, explicit themes, and mature language.
The rollout of these account types is set to begin globally in June, with existing users afforded a transition period to complete the necessary age verification process.
Roblox Kids accounts will permit access to games rated as “Minimal” or “Mild”, featuring mild violence and humour. For this age group, chat functionalities will be disabled by default, with parents needing to grant approval for any chat interactions. Users with Roblox Select accounts will have access to games rated “Moderate” and can chat with peers in their age group.
Parents will have the option to selectively approve games not available under default account restrictions, allowing for supervised gameplay experiences. Both account types will only include games that have undergone Roblox’s stringent three-step screening process to ensure suitability for young players.
For their games to be considered for Roblox Kids or Select accounts, developers must first complete an ID verification process, particularly if they are under 16. Additional requirements include enabling two-step verification and maintaining a connection with a parent account. An active subscription to Roblox Plus, which offers exclusive benefits and discounts, is also a prerequisite.
The evaluation of games will involve real-time testing by users aged 16 and above, who will provide feedback before games become accessible to younger users. Furthermore, Roblox will utilise its multimodal moderation system to monitor compliance with community guidelines.
These new measures follow legal action from several attorneys general raising concerns about child safety on the platform. Recent lawsuits have highlighted issues related to grooming and exposure to explicit content, underscoring the necessity of Roblox’s renewed focus on user safety.
Through these developments, Roblox aims to create a safer and more enjoyable environment for its youthful audience while responding to ongoing concerns regarding online safety.