In May 2017, The Guardian published a leaked copy of Facebook’s content moderation guidelines, detailing the company’s policies for deciding whether posts should be removed from the service. Facebook has now prepared a public version of those guidelines and released it to gather the views of its international users. In addition, the company is introducing an appeal process that allows users to request a review if they believe their posts have been removed unfairly.
The guidelines cover bullying, violent threats, self-harm, nudity, and similar issues. Monika Bickert, Facebook’s Head of Global Policy Management, says, “These are real world issues. Those using Facebook and other such similar social media networks reflect the real world community. Most using Facebook do so with good reasons. So, we are being realistic about that. However, we know that there will always be people who will post abusive content or engage in abusive behavior. Report them to Facebook and we will remove them as they will not be tolerated.”
The guidelines will apply in every country where Facebook operates and have been translated into more than 40 languages. They will be updated in all of those languages as and when required, and they extend to Facebook’s other services, including Instagram.
Amid a series of worldwide crises, Facebook has faced growing pressure to improve its moderation. The United Nations, for instance, faulted Facebook for enabling the spread of hatred against the Rohingya minority. Access to Facebook was also temporarily suspended in Sri Lanka recently over allegations that messages on the platform had incited violence against Sri Lankan Muslims. The New York Times, too, recently reported links between Facebook and killings in countries including Indonesia, India, and Mexico.
In response, Facebook has decided to double its safety and security team from its current 10,000 employees.
Facebook hopes to improve its services through the feedback it receives. The company is also building a stronger process for handling appeals against posts removed unreasonably. Through it, users can ask the company to review removed posts: when a post is taken down, the person who posted it receives a notification along with the option to request a review. Once a review is requested, Facebook will carry it out within 24 hours, and if the company finds it made an error, the post will be restored and the person notified.