Roblox Steps Up Safety Measures for Young Users: New Features to Protect Under-13s

Roblox is ramping up its efforts to enhance safety and protect younger users on its platform, announcing a series of new features aimed at improving moderation and limiting access to certain content for users under the age of 13. These changes come amid growing concerns over online safety and the need for stricter controls in virtual spaces frequented by children and teens.

Key Points:
Unrated experiences to be restricted: From December 3, 2024, unrated games will be unplayable and undiscoverable by users under 13.
Age-restricted features: Social hangouts and free-form 2D content will be limited to players aged 13 and older starting November 18, 2024.
Stronger parental controls: Roblox has updated its parental controls, offering more oversight of children’s activity on the platform.
Roblox defends its efforts: The company responds to criticism, reinforcing its commitment to user safety and moderation.

New Safety Features Rolling Out Soon
In a statement released on the Roblox Creator Hub, the company outlined two major changes that will be rolled out over the coming months. These updates are designed to provide greater protection for younger players while ensuring parents have more control over their children’s in-game experiences.

Stricter Moderation for Unrated Experiences
Starting December 3, 2024, creators will be required to complete a content questionnaire indicating whether their experiences are suitable for users under 13. If an experience is left unrated or deemed inappropriate for younger audiences, it will be made unplayable, unsearchable, and undiscoverable for children under the age of 13. This move is a direct response to concerns that some games on the platform may expose young users to inappropriate content or interactions.

The new system aims to provide clearer guidance to creators, ensuring they understand the requirements for making their experiences safe for younger audiences. This is part of Roblox’s larger initiative to foster a safer environment by preventing harmful or age-inappropriate content from slipping through the cracks.

Age-Restricted Access to Social Hangouts and User-Generated Content
Another significant change is the restriction of certain social features for users under 13. From November 18, 2024, social hangouts and free-form 2D content (such as drawing and writing) will only be accessible to players aged 13 and older. These features, which often allow for unmoderated communication and content creation, have raised concerns over the potential for bullying, inappropriate language, and other risky behavior in environments where younger users may be present.

By restricting these tools to older users, Roblox is aiming to reduce exposure to such risks and offer a more controlled, age-appropriate social experience for younger players. The move reflects growing concerns over the potential dangers of unfiltered communication and creative freedom in online spaces.

Strengthening Parental Controls and User Protection
These new safety measures are part of Roblox’s broader push to improve its parental control systems and protect younger users. Last month, the company introduced updates to its parental controls, including the addition of content maturity settings and default safety configurations. Parents can now more closely monitor their children’s activities on the platform, gaining insights into the types of games their kids are playing, who they are interacting with, and what content they are exposed to.

In addition, Roblox unveiled a new type of user account that gives parents more oversight, allowing them to actively manage their child’s gaming experience. The updates add an extra layer of security and give parents greater peace of mind about what their children encounter while playing.

Roblox Responds to Criticism Over Minor Protection
The move to implement more robust safety measures comes after growing criticism of Roblox’s ability to protect minors. A report released by Hindenburg Research in October 2024 accused the company of not doing enough to safeguard young users from inappropriate content and interactions. The report highlighted concerns over the platform’s content moderation practices, particularly regarding user-generated content that might not be suitable for children.

In response, Roblox firmly rejected the accusations, stating that it takes issues of inappropriate behavior and content “extremely seriously.” The company emphasized that it is continuously working to improve its systems, including moderating user-generated content, enhancing reporting tools, and expanding its safety features.

A Step in the Right Direction for Child Safety Online
While Roblox has faced significant scrutiny over its platform’s safety in the past, these new changes represent a substantial step forward in the company’s commitment to protecting younger users. By implementing stricter content moderation, enhancing parental controls, and limiting access to certain features based on age, Roblox is taking concrete steps to address the risks that come with having such a large, young audience.

As online spaces become more integrated into daily life, ensuring the safety of children in digital environments is more important than ever. Roblox’s latest moves reflect a growing awareness of this responsibility and underscore the company’s ongoing efforts to maintain a safe and positive space for players of all ages.
