Graphic images: What is Twitter to do?


Published Aug 22, 2014


San Francisco - Twitter decided last year to make images more prominent on its site. Now, the social network is finding itself caught between being an open forum and patrolling for inappropriate content.

The pattern goes like this: graphic images from a high-profile death spread across Twitter, users express outrage, and the company is forced to decide what to remove.

Two recent incidents illustrate the difficulty of the choice. While Twitter is taking pains to remove images of the death of James Foley, the journalist who was beheaded by Islamic militants, some photos of the body of Michael Brown, the teenager who was killed by police in Ferguson, Missouri, remain on users' streams.

To many on Twitter, images of violence against Foley can be seen as spreading a terrorist's message, while publicising Brown's death shines a light on a perceived injustice.

“They're letting the masses decide what should be up and what should not be up,” said Ken Light, a professor of photojournalism at the University of California, Berkeley. “When it's discovered, it needs to be dealt with promptly. The beheading video should never go viral.”

The dilemma faced by Twitter, a proponent of free speech and distributor of real-time information, isn't much different from that of a newspaper or broadcaster, according to Bruce Shapiro, executive director of the Dart Center for Journalism & Trauma at Columbia Journalism School.

“Twitter's situation is exactly like that of a news organisation,” Shapiro said. “Freedom of the press and freedom of expression doesn't mean that you should publish every video no matter how brutal and violent.”

The incidents also happened just after Robin Williams' daughter, Zelda, said she was quitting Twitter after receiving abusive messages following his death.

“In order to respect the wishes of loved ones, Twitter will remove imagery of deceased individuals in certain circumstances,” the San Francisco-based company said in a policy that was enacted last week. “When reviewing such media removal requests, Twitter considers public interest factors such as the newsworthiness of the content and may not be able to honour every request.”

Twitter's software isn't designed to automatically filter all inappropriate content. The company's Trust and Safety team works in all time zones to stamp out issues once they're discovered, according to Nu Wexler, a spokesman for the company. Twitter uses image-analysis technology to track and report child exploitation images, Wexler said.

Twitter doesn't specifically prohibit violent or graphic content on its site -- only “direct, specific threats of violence” and “obscene or pornographic images,” according to its terms of service. It may need to go further, if Facebook's experience is any guide.

In October, around the time Twitter started displaying images automatically in people's timelines, Facebook was dealing with an uproar over a separate beheading video that was spreading around its site. The company resisted taking it down until user complaints intensified, including from British Prime Minister David Cameron. Then Facebook changed its policies.

“When we review content that is reported to us, we will take a more holistic look at the context surrounding a violent image or video,” the Menlo Park, California-based company said at the time. Facebook said it “will remove content that celebrates violence.”

Now that Twitter is encouraging images and video, it will also need to take another look at its rules, according to Columbia's Shapiro.

“I don't think a blanket rule is the point,” Shapiro said. “You do need a company policy that recognises that violent images can have an impact on viewers, can have an impact on those connected to the images, and can have an impact on the staff that have to screen this stuff. You can't ignore Twitter's role in spreading these images.” - Bloomberg/Washington Post
