INTERNATIONAL - Facebook explained in a blog post earlier this week how the social networking site decides what is allowed on the platform and what users should consider before posting.
Monika Bickert, VP of Global Product Management for Facebook, explained that the company will use a set of community standards that will ultimately give an idea of what will be allowed and what will be taken down.
“We decided to publish these internal guidelines for two reasons,” Bickert said.
“First, the guidelines will help people understand where we draw the line on nuanced issues. Second, providing these details makes it easier for everyone, including experts in different fields, to give us feedback so that we can improve the guidelines – and the decisions we make – over time.”
Bickert added that Facebook had set up a dedicated content policy team to help develop these community standards.
“We have people in 11 offices around the world, including subject matter experts on issues such as hate speech, child safety and terrorism. Many of us have worked on the issues of expression and safety long before coming to Facebook,” she said.
“What has not changed – and will not change – are the underlying principles of safety, voice and equity on which these standards are based. To start conversations and make connections, people need to know they are safe,” she said.
Users are also given the right to appeal decisions on posts they believe Facebook has taken down by mistake.
Facebook users will also have access to the guidelines.
While Bickert said that the company was on the right track with these guidelines, she noted that its enforcement is still not perfect.
“One challenge is identifying potential violations of our standards so that we can review them. Technology can help here. We use a combination of artificial intelligence and reports from people to identify posts, pictures or other content that likely violates our Community Standards.
“These reports are reviewed by our Community Operations team, who work 24/7 in over 40 languages. Right now, we have 7,500 content reviewers, more than 40% more than the number at this time last year,” she said.
“Another challenge is accurately applying our policies to the content that has been flagged to us. In some cases, we make mistakes because our policies are not sufficiently clear to our content reviewers; when that’s the case, we work to fill those gaps. More often than not, however, we make mistakes because our processes involve people, and people are fallible.”
Bickert outlined the process for appealing posts that were removed for nudity/sexual activity, hate speech or graphic violence. It is as follows:
- If your photo, video or post has been removed because it violates the Community Standards, you will be notified, and given the option to request an additional review.
- This will lead to a review by the Facebook team (always by a person), typically within 24 hours.
- If Facebook has made a mistake, it will notify you, and your post, photo or video will be restored.
Image: An example of a post that could have been incorrectly removed and can now be appealed for restoration. (Image by Facebook).
- BUSINESS REPORT ONLINE