Roblox fails to protect child gamers from predators, sexual content, lawsuit claims

After Quisha Smith allowed her young daughter to open an account on the hugely popular gaming platform of San Mateo-based Roblox, she was surprised to learn that when the girl went into a bathroom in a role-playing game, someone asked her to take off her pants, Smith alleges in a lawsuit against the company.

Roblox claims more than 66 million daily users globally, with users aged 17 to 24 its fastest-growing group, and estimated in 2020 that three-quarters of U.S. kids aged 9 to 12 had accounts. While its promotional materials tout a safe and “welcoming environment for all ages,” Smith’s lawsuit alleged the platform exposes children to “rampant sexual content, including the presence of child predators posing as innocuous avatars in an effort to groom child-users.”

Such content, the lawsuit claimed, “is available — and, in some cases, recommended — to children of all ages on the platform, including the youngest of audiences who are barely old enough to be in elementary school.”

Smith, identified in her lawsuit as a California parent of a daughter, 12, and son, 7, is seeking class-action status to represent potentially millions of other parents who “have been lulled into a false sense of security and allowed their young children to sign up for Roblox.” According to the lawsuit, Smith saw a social media advertisement from Roblox in 2021 that “indicated Roblox was a gaming platform for children,” so she let her daughter open accounts for herself and her brother. The alleged incident with her daughter in the virtual bathroom occurred last year, the lawsuit said.

Roblox said Thursday it disputes the allegations and will respond to them in court.

“We have an expert team of thousands of people dedicated to moderation and safety on Roblox 24/7, and we act swiftly to block inappropriate content or behavior when detected, including sexual content which violates our Community Standards,” the company said, adding that it works with 20 organizations focused on child safety and online safety.

“We have a number of features specifically designed to keep kids safe including filtering text chat on the platform to block inappropriate content or personal information and offering parental controls and features to limit or turn off chat. We have invested in building tools to give parents visibility into their children’s activity.”

The lawsuit noted that Roblox, along with Instagram, erotic-content site OnlyFans, the Apple App Store and other platforms, made the “Dirty Dozen” list of the National Center on Sexual Exploitation. The center alleged in a June report that not only can kids on Roblox be exposed to highly sexualized content, but “countless children have been sexually abused and exploited by predators they met on Roblox,” and that children’s avatars — their representational images in game worlds — have been raped.

Central to the lawsuit is a fundamental problem confronting major Silicon Valley social media companies: how to screen and filter users and content on a massive scale. Roblox says that, like other large platforms, it uses a mix of automated and human moderators. Legal actions against social media firms over child safety are not infrequent; this week, New Mexico sued Meta, accusing it of creating a “breeding ground” for child predators on its Facebook and Instagram apps and claiming young users are exposed to sexual content and contact from adult users.

The Roblox lawsuit, filed Wednesday in San Mateo County Superior Court, cited alleged incidents including the arrest of a 40-year-old California man who traveled across the U.S. to sexually abuse a 14-year-old girl he met on Roblox, and the case of another 14-year-old girl who was sexually assaulted by a man posing on the platform as a 17-year-old boy.

“In 2022, a 13-year-old girl in Topeka was rescued from a man she met on Roblox who was sex trafficking her,” the lawsuit claimed, citing the center’s report. “In 2022, an 8-year-old girl in North Carolina was targeted by an online predator on Roblox who asked her to send him ‘hot videos.’ The girl’s mother said she had parental controls on all the devices her kids used.”

Roblox, the lawsuit and center claim, “still allows adult strangers to direct message, chat, and ‘friend’ children. These are the default settings when any child opens an account.”

The lawsuit refers to a 2020 regulatory filing in which Roblox said: “We have faced allegations that our platform has been used by criminal offenders to identify and communicate with children and to possibly entice them to interact off-platform, outside of the restrictions of our chat, content blockers, and other on-platform safety measures.” Despite “considerable resources” devoted to preventing such use, Roblox said in the filing, “we are unable to prevent all such interactions from taking place.”

The company, the lawsuit claimed, “failed to disclose that information to parents.”

Also cited in the lawsuit is a warning in April from Canadian police that predators were contacting kids on Roblox, then directing them to apps like Facebook, Instagram or Snapchat to manipulate them into sending nude photos or performing sexual acts.

Smith, who is seeking a court order barring Roblox from the alleged illegal conduct and false advertising, said in the lawsuit that she cut her kids off from the platform after the alleged incident in the online bathroom. However, she “remains in the market for digital entertainment for her children” and would let them back on Roblox if the company accurately promoted its platform and improved its moderation and other protective systems.

Source: Orange County Register
