Roblox reported over 13,000 incidents of child exploitation in 2023


US police have reportedly arrested at least 24 people since 2018 who have abducted or abused victims they met on Roblox.

That’s according to data compiled by Bloomberg as part of an eight-month investigation into the popular online game platform, which has some 77 million daily users, over 40% of whom are under the age of 13.

The report, which Bloomberg claims shines a light on the platform’s “predator problem”, reveals that in 2022, Roblox reported almost 3,000 incidents of child exploitation to the National Center for Missing and Exploited Children, a figure which jumped to over 13,000 last year.

While Roblox has been enhancing its child safety measures in recent years, a number of current and former employees claimed that the company is prioritising growth over the wellbeing of its users.

They told Bloomberg that prior to 2022, Roblox didn’t have automated systems to search for grooming behaviour beyond simple text filters.

Calls for additional resources were reportedly ignored, leading to a backlog of safety incident reports – claims which a Roblox spokesperson disputed, while adding that the company has a “robust pipeline” of safety features in development.

Roblox’s chief safety officer, Matt Kaufman, also rejected the claim that child endangerment is widespread or a systemic problem on Roblox.

In a lengthy blog published to coincide with the release of Bloomberg’s report, Kaufman claimed: “Roblox has spent almost two decades making the platform one of the safest online environments for our users, particularly the youngest users. Our guiding vision is to create the safest and most civil community in the world.

“As our platform evolves and scales, forging a new future for communication and connection, our investment in preventative safety measures remains fundamental. To be the best in the world at delivering safe and civil online experiences, this is essential. With each passing year, we implement new strategies and technology to achieve gains in speed and effectiveness of our safety and moderation systems.”

In recent years, Roblox has reportedly hired new child safety investigators, established a child exploitation moderation team, and appointed a child safety officer who reports directly to the company’s CEO.

Last week, it also announced changes “intended to update how our youngest users access experiences and to provide parents and users more clarity into the types of content available on Roblox”.

Starting this autumn, experiences will be labelled by “the type of content users can expect” to find in them, rather than by age.

By default, users under the age of nine will only be able to access experiences with a content label of ‘Minimal’ or ‘Mild’, unless parental controls are used to let them access more mature content.

‘Mild’ content “may contain repeated mild violence, heavy unrealistic blood, mild crude humor, and/or mild fear,” Roblox said.