Roblox Faces Lawsuits Over Child Safety Concerns
This week, “Roblox” is contending with a surge of lawsuits alleging that the gaming platform isn’t adequately protecting children from predators and sexual content.
A recent complaint in federal court for the Northern District of California claims that a predator posed as a child and exploited a 10-year-old in Michigan. The unnamed man reportedly persuaded the child to share explicit images.
Filed on Thursday, the complaint details how the anonymous 10-year-old met the predator last year on “Roblox” and subsequently suffered from anxiety and other mental health issues.
The lawsuit, brought by Dolman Law Group, describes the situation as “a digital and real-life nightmare for children” under the guise of a safe, child-friendly environment.
As “Roblox” continues to grow, with an average of 111.8 million daily users, the platform is under pressure to enhance child safety measures. The company’s stock closed down more than 6% at $117.34 on Friday.
Roblox spokesperson Kadia Koroma denied the allegations, stating, “The claim that Roblox intentionally puts users at risk of exploitation is untrue. The system isn’t perfect, yet we have strict safeguards in place, like limits on sharing personal information and images.” Koroma added that some users attempt to bypass these protections.
In early August, Roblox announced it would use artificial intelligence to identify early signs of dangerous communications involving children and notify law enforcement.
This lawsuit is part of a broader increase in complaints this year, with critics arguing that gaming platforms prioritize profits over safety. On Thursday, Louisiana Attorney General Liz Murrill also filed a lawsuit against Roblox over these concerns.
The platform is experiencing political scrutiny as well; California Congressman Ro Khanna has encouraged social media users to sign a petition demanding that Roblox do more to safeguard children.
Since July, Dolman Law Group has filed five lawsuits against “Roblox” in California, Georgia, and Texas, with a sixth filed on Friday by managing partner Matthew Dolman.
The latest lawsuit mentions various safety measures Roblox has implemented, such as age verification through facial recognition and notifications to parents about potential threats.
Dolman described the platform as a “Wild West,” comparing it to a hunting ground for predators, and he argued that the company’s portrayal of safety is misleading for users and investors alike.
According to a federal complaint filed Thursday, Roblox profits from fees associated with its virtual currency, Robux, which predators reportedly exchange for explicit images. The lawsuit claims that predators also manipulate children into handing over Robux by threatening to release sensitive material.
The complaint references a Hindenburg Research investigation that described “Roblox” as unsafe for children, reporting that researchers who registered as a child were able to access inappropriate content, including material referencing convicted child sex offender Jeffrey Epstein.
Roblox countered these claims, emphasizing its significant investment in trust and safety initiatives and asserting that the platform enforces rules against child exploitation.
Child advocacy groups are increasingly worried about the risks children face in online spaces. A report from Thorn, a nonprofit focused on child safety, revealed that one in five teenagers has experienced sextortion, with perpetrators using platforms like “Roblox,” “Minecraft,” and “Fortnite” to intimidate victims.