
Lawsuit Accuses Roblox and Discord of Enabling Sexual Exploitation of 11-Year-Old Girl by Adult Predator

A new lawsuit is accusing Roblox and Discord of enabling the sexual exploitation of an 11-year-old girl by an adult predator.

The lawsuit also alleges that Snapchat and Meta, the parent company of Instagram, caused the child to become addicted to their products, leading to severe mental health issues and suicide attempts.

The Seattle-based Social Media Victims Law Center filed the lawsuit on behalf of the family.

According to the lawsuit, the Long Beach, California girl, identified only as S.U., began playing Roblox when she was 9 or 10 years old. Her mother set firm rules about screen time and use, which she believed her daughter was following.

The mother also assumed the game had safeguards to protect the children using it, but now says there were none.

In 2020, a man named "Charles" befriended S.U. on Roblox and began messaging her, encouraging her to contact him on Discord.

"S.U. was hesitant at first, but Discord offered a 'Keep me safe' setting, which provided Discord would – if selected – monitor S.U.’s activities and protect her from harm," the attorneys for the family explained. "S.U. selected 'Keep me safe' and, believing that Discord would keep her safe, decided to not tell her mother about her Discord account."

Charles immediately began exploiting S.U., eventually introducing her through Discord to several of his adult friends, who also abused, exploited, and manipulated her, according to the lawyers. Charles encouraged her to drink alcohol and take prescription drugs, and the "Keep me safe" setting did not prevent her from receiving the harmful material.

Discord, which requires users to be at least 13, did not verify her age or obtain parental consent.

The men began to encourage the child to open Instagram and Snapchat accounts, which she initially hid from her parents as well. Both platforms also require users to be at least 13 years old, but neither had any meaningful age verification. The legal team says that S.U. then "fell victim to the social media platforms’ algorithms and became addicted, to the point where she would sneak up to access those products in the middle of the night and became sleep deprived."

At this point, the child's mental health quickly declined.

In July 2020, a Roblox user named "Matthew" began contacting the girl while pretending to be a moderator. He used the game's direct message feature to "exploit and groom" the child as well. Claiming to work for the game, he convinced her to give him Robux (in-game currency) and move their conversation to Snapchat.

Eventually, the predator convinced the child to send him sexually explicit images. She agreed, believing that he could not save the images because of Snapchat's "My Eyes Only" and disappearing-content features.

However, Matthew did save the images and allegedly sold them to other predators.

The Social Media Victims Law Center says that by the end of that month, the shame of the sexual exploitation and addiction to social media led the child, who was 11 years old at the time, to attempt suicide for the first time. One month later, she tried again. Her condition continued to decline throughout the fall and winter.

In March 2021, S.U. refused to continue participating in online school and became inconsolable, according to the law firm. She was hospitalized for another five days after a third plan to kill herself by mixing alcohol and sleeping pills.

By June, the child was self-harming.

"Reluctantly, S.U.’s parents put her in a residential program but were forced to withdraw her after a fellow patient sexually assaulted her," the lawyers say. "Her mother, already more than $10,000 in medical debt from 2021 alone, was forced to quit her job to maintain constant vigilance of her daughter in an effort to prevent additional suicide attempts."

The lawyers representing the family allege that "Snapchat enabled, facilitated, and profited from her abuse."

"S.U.’s exploitation and addiction led to multiple suicide attempts and self-harm, driving her parents deep into medical debt and causing her to withdraw from school," the Social Media Victims Law Center said in a press release. "Her mother was forced to quit her job to constantly monitor S.U. and prevent further suicide attempts. The lawsuit attempts to hold Roblox, Discord, Meta, and Snapchat responsible for their inherently dangerous and defective product design, failures to warn, and failures to act on known harms their products were causing to children like S.U."

The lawsuit is attempting to hold the social media companies financially responsible for the harms they have caused S.U. and her family. It is additionally seeking an injunction requiring the social media platforms to make their products safer, which, the lawsuit claims, "all these defendants could do via existing technologies and at a minimal time and expense to the companies themselves."

“These companies have failed to put in place basic, easily-implemented safeguards that could have kept S.U. and other children like her safe,” said Matthew P. Bergman, founding attorney of SMVLC. “I commend S.U. and her mother for their bravery in sharing their story. It truly demonstrates that these products can harm even the most loving, attentive families. I believe this suit will help bring them justice, and I hope that it leads these companies to improve their products.”
