Update 16/08/25 – 10:15 am (IST): Roblox has released an official response to the lawsuit from the Louisiana Attorney General, countering what it describes as “erroneous claims and misconceptions” about its platform’s safety. The company’s blog post asserts that it would not intentionally put users at risk and that it continuously works to improve its moderation systems. Roblox cited its recent introduction of over 40 new safety features, including updated parental controls, stricter defaults for users under 13, and new age estimation technology that uses video selfies to confirm a user’s age.

The response also highlights a multi-layered safety system that combines a large team of human moderators with advanced AI tools, such as the recently open-sourced Roblox Sentinel. This AI system is designed to detect subtle, long-term patterns of potential grooming. Roblox also mentioned its partnerships with law enforcement and child safety organizations like the National Center for Missing and Exploited Children (NCMEC), noting that it proactively reports harmful content and works with others in the industry to improve online safety.


Original article published on August 15, 2025, follows:

Louisiana’s Attorney General Liz Murrill made headlines Thursday by filing a bombshell lawsuit against Roblox. The wildly popular gaming platform boasts nearly 82 million daily users. But this isn’t just another run-of-the-mill corporate lawsuit. The case against the $90 billion company shows that governments around the world are taking age verification seriously – and gaming platforms had better take notice.

Murrill alleges that Roblox “creates an environment where children are susceptible to being groomed by predators without any warnings or safeguards.” The state’s case centers on what it calls fundamental design flaws: users can easily create accounts with fake birthdays, there’s no meaningful age verification process, and parental consent requirements are essentially non-existent.

You can read the complete lawsuit in the embed below or by heading here.

This criticism sounds remarkably similar to concerns raised about traditional social media platforms that have already faced stricter age verification requirements.

The timing couldn’t be more telling. While social media giants have been scrambling to implement age checks – Reddit rolled out age verification in the UK, and YouTube just deployed AI age-estimation tools for teens in the US – gaming platforms like Roblox have largely flown under the regulatory radar. That free pass appears to be ending.

The global momentum behind age verification laws is undeniable. Australia recently included YouTube in its social media ban for teenagers, while Canada’s Bill S-209 pushes for comprehensive online age verification. Even Google now adjusts its search results based on the user’s age, and the EU is actively testing age verification apps to protect minors from harmful online content.

Unlike static social media posts, games offer real-time interaction, voice chat, and immersive experiences that can blur the lines between virtual and real relationships. Roblox alone reports (via Variety) that “20% of users are under age 9, 20% are ages 9 to 12, 16% are ages 13 to 16 and 44% are 17 or older.” That’s a massive young user base operating in largely unregulated territory.

The Louisiana case includes a particularly disturbing example that illustrates the stakes involved. Just last month in Livingston Parish, law enforcement arrested an individual who “was actively using the online platform Roblox” and “was in possession of and had employed voice-altering technology designed to mimic the voice of a young female, allegedly for the purpose of luring and sexually exploiting minor users of the platform.”

The state argues that “Roblox is overrun with harmful content and child predators because it prioritizes user growth, revenue, and profits over child safety.” This accusation strikes at the heart of how gaming platforms operate – grow first, moderate later. But the pressure isn’t limited to government intervention.

Even payment processors like Visa and Mastercard are turning away from gaming platforms hosting “adult games.” Earlier today, I also highlighted how PayPal has joined that bandwagon by dropping support for the vast majority of currencies on Steam.

The writing is on the wall for gaming platforms worldwide. As governments continue tightening age verification requirements and holding tech companies accountable for child safety, the distinction between social media and gaming platforms is becoming increasingly meaningless. Both facilitate online interaction between strangers, both attract young users, and both face similar risks of exploitation.

So it seems that social media and gaming platforms alike will have to drastically step up moderation and protections for minors, or risk both trouble with the law and the loss of their payment processors.


Dwayne Cubbins