Facebook, Twitter and YouTube urged to stop outsourcing content moderation

A New York University report is calling for social media companies to stop outsourcing content moderation. File picture: IANS

Published Jun 9, 2020

A New York University report published Monday is calling for social media companies to stop outsourcing content moderation.

The report says big social media companies like Facebook, Twitter and YouTube need to use more of their own employees - instead of the outside contractors on which they currently largely depend - to make calls about what posts and photos should be removed. Misinformation has become an increasingly serious problem on tech platforms during the protests against racial injustice and the novel coronavirus pandemic, both of which are unfolding during an election year in which the industry is already bracing for interference by bad actors.

Currently, many of those charged with sifting through the reams of content posted to social media platforms are contractors, without the same salaries, health benefits and other perks as full-time employees at Silicon Valley companies.

Paul M. Barrett, deputy director of the NYU Stern Center for Business and Human Rights and author of the report, says it's time for tech companies to reevaluate that system - which he argues results in the moderators being a marginalized class of workers.

Barrett says outsourcing has continued because it saves the industry money, but also because there's a psychological factor at play.

Content moderators are tasked with sifting through what Barrett calls the "worst that the Internet has to offer." Their work often centers on rooting out violence, hate speech, child exploitation and other harmful content. Facebook has developed a separate program for fact-checking, where it partners with news organizations to debunk hoaxes and other widely shared posts that could confuse people about sensitive topics like elections or the pandemic.

"Content moderation isn't engineering, or marketing, or inventing cool new products. It's nitty-gritty, arduous work, which the leaders of social media companies would prefer to hold at arm's length," he told me. "Outsourcing provides plausible deniability."

Content moderation is the latest battleground for the social media giants in Washington.

The high-profile debate over how social media companies handle President Trump's inflammatory content is one of the most politically perilous issues for tech companies. Twitter's recent decision to label a few of the president's comments has escalated an intense debate over how much responsibility the tech companies have to police their platforms - and whether they could go too far in censoring speech online.

"The recent controversy over how Facebook and Twitter handled President Trump's posts underscores how central content moderation is to the functioning of the social media platforms that billions of people use," Barrett said.

The tech companies have taken divergent approaches to addressing these issues, with Facebook leaving the president's incendiary posts alone. Facebook chief executive Mark Zuckerberg's decision not to take any action against a Trump post has enraged employees internally. Zuckerberg last week met with black executives at the company to discuss their objections to the Trump post, Elizabeth Dwoskin and Nitasha Tiku report. Employees questioned whether Facebook was in an "abusive relationship" with the president, according to a trove of documents including more than 200 posts from an internal Facebook message board.

Now the company's content moderators are revolting too.

A group of current and former Facebook content moderators today released a letter criticizing Facebook's decision, and expressing solidarity with full-time Facebook employees who recently staged a virtual walkout.

"We know how important Facebook's policies are because it's our job to enforce them," the moderators wrote, in a letter published on Medium. "Our everyday reality as moderators is to serve as the public square's first responders."

They write that their status as contractors makes it more difficult for them to participate in the employee-driven activism against the company's decisions. They also said they lack financial security, which makes it harder to speak out, especially as the pandemic creates broad economic uncertainty.

"We would walk out with you - if Facebook would allow it," they wrote. "As outsourced contractors, nondisclosure agreements deter us from speaking openly about what we do and witness for most of our waking hours."

Strong content moderation isn't just necessary in the high-profile showdowns.

Not every decision about content on Facebook is as high-profile. Zuckerberg and top executives make these calls only in the most prominent situations. Barrett warns that strong teams need to be in place to deal with the millions of posts and tweets that regularly violate the companies' policies.

"Given the importance of both levels of moderation, it seems odd and misguided that the platforms marginalize content moderation by outsourcing the bulk of it to third-party vendors," he said. "Instead, the companies should be pulling this vital function in-house and investing more in its expansion."

Barrett also laid out the following recommendations for social media companies to improve their content moderation efforts:

- Increase the number of human content moderators: As a starting point, Barrett argues the companies should double their moderator staffs to keep up with the deluge of problematic content on their services. He says this would also allow moderators to rotate more frequently, so they wouldn't repeatedly be exposed to the same sometimes traumatic material.

- Appoint a senior official to oversee content moderation: Barrett says responsibility for content moderation is currently stretched across disparate teams. He argues there should be a central, senior official at each company who is responsible for both fact-checking and content moderation.

- Invest more in moderation in "at-risk nations": The companies need moderators with an understanding of local languages and culture in the countries where they operate, Barrett says. This is especially essential in times of instability. Barrett says the tech companies should have offices on the ground in every country where they do business.

- Improve medical care for content moderators: The companies should expand mental-health support and access to psychiatric professionals to assist workers with the psychological effects brought on by repeatedly viewing alarming content, Barrett says.

- Sponsor research into the health risks of these jobs: A third-party content moderation vendor, Accenture, has said that PTSD is a potential risk of content moderation work. But little is known about how often it occurs, and whether there should be time limits on how long content moderators do this work. Barrett says the companies could play a role in funding research into these issues.

- Consider "narrowly tailored" regulation: Trump in recent days has renewed debate over how the tech industry should be regulated by threatening to revoke Section 230, a key shield that protects tech companies from lawsuits for the posts, videos and photos people share on their platforms. The report expresses wariness of politically charged proposals to revoke that shield, but suggests considering a proposal from Facebook to create a "third-party body" to set standards governing the distribution of harmful content.

- Debunk more misinformation: Barrett suggests the companies should more frequently fact-check posts on their services - a job they've long resisted. Though Facebook's decision to not fact-check the president has seen intense pushback in recent days, Barrett notes the company currently has the most robust partnerships with journalism organizations in place to do this work.

The Washington Post
