Facebook unveiled a new program that will help users report inappropriate posts and afford them the opportunity to appeal if Facebook decides not to remove a flagged item.
Facebook updated its community guidelines on Tuesday and launched an appeals option. In the past, users could report a piece of content for review, but there was no appeal if Facebook declined to remove that content. That changed this week under the new policy. There is also an appeal option for individuals who think their own content was unfairly removed by Facebook.
NBC News reported that Monika Bickert, the head of global policy management at Facebook, told reporters last Thursday: “At our scale, we receive millions of reports every week in dozens of languages around the world. Even if we are operating at 99 percent accuracy, we are still going to have a lot of mistakes every day. That is the reality of reviewing content at this scale.”
“We want to get it right, which is why we want to make sure we are giving people the option to ask us to reconsider,” she added.
Continuing, NBC News reported:
Under the new policy, users can file an appeal if they believe a piece of their content has been unfairly removed or if they’ve flagged a piece of content that Facebook’s team of content reviewers decided not to remove. Their appeal will be sent to a new human moderator, who will issue a decision within 24 hours.
Bickert, the Vice President of Global Policy Management at Facebook, made the announcement on Tuesday, writing:
One of the questions we’re asked most often is how we decide what’s allowed on Facebook. These decisions are among the most important we make because they’re central to ensuring that Facebook is both a safe place and a place to freely discuss different points of view. For years, we’ve had Community Standards that explain what stays up and what comes down. Today we’re going one step further and publishing the internal guidelines we use to enforce those standards. And for the first time we’re giving you the right to appeal our decisions on individual posts so you can ask for a second opinion when you think we’ve made a mistake.
“Our policies are only as good as the strength and accuracy of our enforcement – and our enforcement isn’t perfect,” she continued, adding:
One challenge is identifying potential violations of our standards so that we can review them. Technology can help here. We use a combination of artificial intelligence and reports from people to identify posts, pictures or other content that likely violates our Community Standards. These reports are reviewed by our Community Operations team, who work 24/7 in over 40 languages. Right now, we have 7,500 content reviewers, more than 40% the number at this time last year.
Another challenge is accurately applying our policies to the content that has been flagged to us. In some cases, we make mistakes because our policies are not sufficiently clear to our content reviewers; when that’s the case, we work to fill those gaps. More often than not, however, we make mistakes because our processes involve people, and people are fallible.
How It Works
Getting into the nuts and bolts of how the new system will work, Bickert explained: “We know we need to do more. That’s why, over the coming year, we are going to build out the ability for people to appeal our decisions. As a first step, we are launching appeals for posts that were removed for nudity/sexual activity, hate speech or graphic violence.”
She then outlined the process for individuals who have had a piece of content removed:
- If your photo, video or post has been removed because it violates our Community Standards, you will be notified, and given the option to request additional review.
- This will lead to a review by our team (always by a person), typically within 24 hours.
- If we’ve made a mistake, we will notify you, and your post, photo or video will be restored.
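To make that sequence concrete, here is a minimal, purely hypothetical sketch in Python of the appeal flow Bickert describes. The class, field, and function names are illustrative assumptions for this article only and do not reflect Facebook's actual systems or code; the sketch simply mirrors the steps above: a post removed for one of the launch categories can be appealed, a human reviews it (typically within 24 hours), and the post is restored if the removal was a mistake.

```python
# Illustrative sketch only -- not Facebook's actual code or API.
from dataclasses import dataclass
from enum import Enum, auto


class Status(Enum):
    REMOVED = auto()           # taken down for a Community Standards violation
    APPEAL_REQUESTED = auto()  # user asked for additional review
    RESTORED = auto()          # a human reviewer found the removal was a mistake
    REMOVAL_UPHELD = auto()    # a human reviewer confirmed the removal

# Appeals initially launched only for these violation types.
APPEALABLE_REASONS = {"nudity/sexual activity", "hate speech", "graphic violence"}


@dataclass
class RemovedPost:
    post_id: str
    removal_reason: str
    status: Status = Status.REMOVED

    def request_review(self) -> None:
        """User is notified of the removal and opts to request additional review."""
        if self.removal_reason not in APPEALABLE_REASONS:
            raise ValueError("Appeals are not yet available for this violation type.")
        self.status = Status.APPEAL_REQUESTED

    def human_review(self, removal_was_mistake: bool) -> None:
        """A person (not automation alone) reviews the appeal, typically within 24 hours."""
        if self.status is not Status.APPEAL_REQUESTED:
            raise ValueError("No pending appeal to review.")
        self.status = Status.RESTORED if removal_was_mistake else Status.REMOVAL_UPHELD


# Example: a photo removed as "graphic violence" is appealed and restored.
post = RemovedPost(post_id="12345", removal_reason="graphic violence")
post.request_review()
post.human_review(removal_was_mistake=True)
assert post.status is Status.RESTORED
```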
She concluded the announcement, writing:
We are working to extend this process further, by supporting more violation types, giving people the opportunity to provide more context that could help us make the right decision, and making appeals available not just for content that was taken down, but also for content that was reported and left up. We believe giving people a voice in the process is another essential component of building a fair system.
[…]
As our CEO Mark Zuckerberg said at the start of the year: “we won’t prevent all mistakes or abuse, but we currently make too many errors enforcing our policies and preventing misuse of our tools.” Publication of today’s internal enforcement guidelines – as well as the expansion of our appeals process – will create a clear path for us to improve over time. These are hard issues and we’re excited to do better going forward.