Roblox to ban messaging for young children
Roblox has announced new safety measures that will prevent children under the age of 13 from sending direct messages to other users on the platform. This decision is part of the company’s ongoing efforts to protect young users and create a safer online environment.
Under the new guidelines, child accounts will be unable to send private messages by default unless a verified parent or guardian provides explicit permission. Parents will also gain enhanced tools to monitor and manage their child’s account, including viewing their child’s friends list and setting daily playtime limits.
Roblox is extremely popular among younger children, especially those aged 8 to 12, and is considered the leading gaming platform for this age group in the UK, according to research by Ofcom. However, the company has faced calls to strengthen its safety features to protect children’s well-being. Roblox said the new changes will begin rolling out on Monday, with full implementation expected by the end of March 2025.
What the Changes Mean for Young Users
While under-13 users will no longer be able to message others privately, they will still be able to take part in public conversations visible to all players within games, allowing them to interact with friends in a social setting. Private messaging, however, will remain switched off unless parents provide consent.
Matt Kaufman, Roblox’s Chief Safety Officer, highlighted the company’s ongoing commitment to safety, noting that Roblox is played by 88 million users daily. He emphasized that over 10% of Roblox’s workforce is dedicated to maintaining the platform’s safety features. Kaufman explained that as Roblox continues to grow, its approach to safety must evolve to keep pace with its expanding user base.
Strengthened Parental Controls
The new measures also give parents more control over their child’s activities. To access parental permissions, guardians will be required to verify their identity and age using a government-issued ID or a credit card. This step is intended to ensure that only someone with the appropriate authority can manage a child’s interactions on the platform.
However, Kaufman acknowledged that verifying user age remains a challenge for many tech companies. He urged parents to work with their children to ensure that they provide accurate information when creating accounts. “Our goal is to keep all users safe, no matter their age,” Kaufman said, stressing the importance of honest age information during the sign-up process.
Industry Response
The NSPCC (National Society for the Prevention of Cruelty to Children), a leading child safety charity in the UK, welcomed the changes, calling them a “positive step in the right direction.” Richard Collard, Associate Head of Policy for Child Safety Online at the NSPCC, emphasized the need for robust systems to verify user age to ensure the effectiveness of these changes. “Roblox must make this a priority to address harm on the platform and protect young children,” Collard added.
Simplified Content Labels and Age-Appropriate Guidelines
In addition to restricting messaging, Roblox also announced plans to introduce clearer content labels for games and experiences on the platform. These labels will replace age-based recommendations and focus on outlining the nature of the content, enabling parents to make decisions based on their child’s maturity level rather than just their age.
Content labels will span four tiers: “minimal,” “mild,” “moderate,” and “restricted.” “Minimal” content may include light violence or fear, while “restricted” content could contain more mature themes such as strong violence, explicit language, or graphic depictions. By default, children under the age of 9 will only be able to access “minimal” or “mild” experiences, though parents can grant consent for access to “moderate” content. “Restricted” games will remain off-limits until a user is at least 17 and has verified their age using the platform’s tools.
Additional Measures and Legal Requirements
These updates come as part of a broader effort by Roblox to comply with upcoming regulations, including the UK’s Online Safety Act, which imposes stricter rules on platforms used by children. The Act will require companies to prevent and address illegal and harmful content, and Ofcom, the UK’s communications regulator, has warned that companies could face penalties if they fail to keep children safe online.
In November, Roblox also announced that it would block children under 13 from accessing “social hangouts,” where players can communicate via text or voice messages. Starting December 3, game developers will be required to specify whether their games are suitable for children, and any games that do not provide this information will be blocked from under-13 users.
These changes underline Roblox’s effort to balance child safety with preserving the platform’s social and interactive features. As the platform continues to grow, ensuring the safety and well-being of its younger users remains a top priority for the company.