Edition 29 May 2018, by Bárbara Luque
The Netherlands has recently acquired a dubious reputation on Facebook thanks to the numerous messages filled with hate, racism, and cruel wishes that Dutch users send daily. According to De Volkskrant, the current Dutch team of moderators for the platform receives approximately eight thousand reports each day, significantly more than the number of posts reported to moderators from other European countries. The reports, which typically involve complaints about discrimination, racism, and hate, increase noticeably during the Sinterklaas season and around political elections, when moderators have to assess messages, photos, and videos at a rapid pace in order to process the large inflow quickly. The Dutch moderators are part of a group of about 1,100 people working from Berlin to handle these messages, which have also included reports of suicide and sexual abuse.
With no end to the problem in sight in recent years, the European Union has put social media platforms under pressure to assess and remove such messages more quickly. It was for this reason that, back in 2016, Facebook, along with Twitter, YouTube, and Microsoft, signed a code of conduct promising to respond within one day to reports of racist or hate speech. The Dutch subgroup of Facebook moderators was understaffed during 2017, which left many reports standing in a queue, unanswered. During this period, Greek moderators pitched in to help, given that they had relatively little work of their own in this area. Regarding the problem, Facebook has stated that "the spread of hatred is a global issue", explaining that it is not specific to the Netherlands. The company has been known in the past to take little action against the hatred, racism, child pornography, terrorist propaganda, and bullying attacks that users spread every day through the platform. In fact, by the end of 2017, in an effort to deflect criticism on this matter, Facebook issued a statement apologizing for its mistakes in dealing with hate messages.
In addition, CEO Mark Zuckerberg announced that the company would take on three thousand new moderators to address these issues. As of the spring of 2017, there were a total of 4,500 moderators looking out for Facebook's 2 billion users. Experts say this number should grow to at least 9,000 moderators for the problem to be assessed as needed. In April 2018, Facebook published an online guideline on how it would deal with hate messages. This manual includes a specific list of the kinds of messages that are not welcome on the social network. Although the network already used general guidelines that set boundaries for users, the new set goes deeper into the subject across a total of 22 pages. Monika Bickert, head of global policy management at Facebook, stated that the guidelines were drawn up so that users understand where the company sets the limits on these issues that require greater attention. The manual details the rules in the areas of violence, security, offensive content, integrity, and authenticity, and in each category Facebook explains how it plans to deal with messages of that nature.
However, according to a former Facebook moderator interviewed by De Volkskrant, the guidelines remain quite broad about what constitutes hate speech on the platform. For example, if someone were to post a photo of a black person with the comment "lazy black", that would be acceptable to Facebook; if the same photo carried the comment "all blacks are lazy", that would go against the guidelines. This illustrates how difficult it is to draw the borderlines of hate speech on the platform. And while the problem is growing worldwide, that does not quite explain why the Dutch top the list in hate-message reports. Whatever the reason behind it, Facebook should get on top of its game in order to assure its users that the platform is a safe community, free of hate speech.