Roblox in race against time to tackle abusive language before public listing

Image by Roblox.

Published Nov 19, 2020

By Munsif Vengattil and Joseph Menn

Bangalore - Profanities and other offensive content that basic word-filtering tools are designed to catch can be found in some game titles and user profiles on children's gaming platform Roblox, searches of the website show, despite the company’s "no tolerance" policy and assurances it has safeguards to enforce it.

Powered by user-created games, Roblox is on course for a multibillion-dollar stock market debut before year end, riding the lockdown entertainment boom with its appeal as a place for safe fun and interactions for the youngest gamers.

But parenting groups and investors alike said they were concerned about whether the company's automated content-moderation systems could effectively remove offensive language and images that pop up on the platform.

Simple Google keyword searches of its site - conducted twice by Reuters since the company announced its stock market plans in October - turned up more than 100 examples of abusive language or imagery. One profile, for example, included "shut up and rape me daddy" in the profile description line, while another had "MOLESTINGKIDSISFUNTOME."

In response to written questions, company spokeswoman Teresa Brewer said in a statement that Roblox “has no tolerance for inappropriate content, which is why we have a stringent safety system, including proprietary text filtering technology, third-party machine learning solutions, and customized rules on what to block, which we update daily."

Last month, Roblox removed the examples within hours of Reuters sharing them with the company. Roblox has said it has 1,600 people working full time to eliminate inappropriate content on the platform.

Roblox offers account controls for parents to restrict how their kids can interact with others on the site. It also lets parents limit children to a curated list of games vetted for kids under the age of 13. Reuters did not find any inappropriate content in such games.

All sites that rely on users to create material must grapple with how much effort to expend policing that content, and even enormous effort cannot catch every inappropriate post. Unlike Twitter <TWTR.N> and Facebook Inc <FB.O>, which publish quarterly transparency reports about the types and volumes of content they purge, Roblox does not provide such data, making it difficult to tell how common inappropriate material is on its platform.

Roblox is a free platform which offers millions of games, many of them created by its own young users through a simple programming tool that the company provides. It has been credited with developing kids’ logic and creativity. Like Microsoft Corp's <MSFT.O> Minecraft, Roblox allows users to create and share 3D gaming content via simple tools and send messages to others.

The simplicity of many Roblox games stands in contrast to popular videogame hits like Fortnite or Apex Legends, which depict killing competitions and target teens. Roblox advisor Larry Magid said that three-fourths of U.S. children between nine and 12 used the platform.

Reuters picked about 20 words commonly considered offensive and looked for them using the site's own search tool, and also through Google's system for searching within a specific site. Roblox's tool returned no hits, as filters prevent users from actively looking for problematic content while playing Roblox games. However, the content surfaced by the Google searches could still be seen by kids in a variety of ways, including through friend invitations and group activities.

Most of the examples Reuters found on Roblox included deliberately misspelled obscenities or the n-word, which industry veterans say should not make it past standard filtering software.

NBC reported last year that it found neo-Nazi and racist profiles on the site, which Roblox later removed.

Yet early this year, an industry expert, who asked not to be named, sent Roblox head of safety Remy Malan a dozen examples of games with racial slurs or the word "Jew" in the title, including some with concentration camp uniforms or other imagery, according to screenshots of the email seen by Reuters.

The examples were confirmed by Reuters and dated as far back as 2009. Some of them were deleted after Reuters described them to Roblox in October. Malan did not respond to the expert or to a Reuters request for comment.

Over the last few years, tech and entertainment watchdog Common Sense Media has raised its suggested minimum age for Roblox players to 13, after abusive language in profiles and sexual content in games kept reappearing despite the company's pledges to remove such material, according to Jeff Haynes, who oversees video gaming coverage for the nonprofit.

Five online safety experts who reviewed the examples found by Reuters said they were surprised such profiles and wording managed to slip through when rudimentary filtering systems can catch and remove such content.

Magid, a Roblox advisor and CEO of the nonprofit ConnectSafely.org - which takes funding from Roblox and other companies to promote safety guidelines for parents - said the examples Reuters had found showed the safeguards did not fully work.

"I think scale is part of it. What I don't understand is why the software didn't pick it up," he told Reuters.

As its stock listing draws near, the company could come under closer public scrutiny from Wall Street, said John Streur, chief executive of Calvert Research and Management, which focuses on socially responsible investing.

"From an investor perspective, it will be a major problem if the headlines months from now reveal that the company is unable to manage the risk of its platform," Streur told Reuters. Roblox declined to respond to comment on that view.

Reuters
