
A disproportionate restriction on freedom of expression – Fails to meet the legality principle – Gap between publicly available information and internal rules. 

These are the conclusions of the Facebook Oversight Board in one of its first decisions on content removal by the platform. Facebook's action was assessed against human rights standards. The decision paves the way for users whose content is unfairly targeted by enforcement actions to argue for a rights-based approach. Yet it also highlights some of the difficulties of automated content moderation. It contains much that is to be welcomed, although ultimately, the Board will have to demonstrate that it has real teeth. 

I explore the rationale, drawing on my own research where I encountered a similar post to the one in this case.  

The case concerned content related to 1930s Germany, described by Facebook as ‘Nazi’ content (Case Number 2020-005-FB-UA). It was a quote incorrectly attributed to Joseph Goebbels, Hitler’s propaganda minister. In the case description, the quote was described as: “there is no point in appealing to intellectuals, as they will not be converted and, in any case, yield to the stronger man in the street...arguments should appeal to emotions and instincts... truth does not matter and is subordinate to tactics and psychology.” The quote was unilaterally removed by Facebook on the basis that it violated Facebook's Community Standard on ‘Dangerous Individuals and Organisations’. However, the user argued that it was political commentary on the former US President Donald Trump, posted during the 2020 US Presidential Election campaign. 

The quote, and the context, resonated with a post that I had come across in my research, and I responded to the Oversight Board’s Call for Public Comments. It featured the same quote, incorrectly attributed to Joseph Goebbels. The quote had been shared from the Ad Sinistram Facebook Page and posted on the Leeds for Europe Facebook Page (see image below). In my response, I drew comparisons between the post I had in front of me and the information that was published about the post in the case. This forms the basis of my comments in this blog post. 

Above: Post on the Leeds for Europe Facebook Page, September 2019. The text of the Ad Sinistram post shared by Leeds for Europe resonates with the case addressed by the Facebook Oversight Board. 

The Decision, which is binding on Facebook, overturned the removal of the post. It stated that the post did not support Nazi party ideology, and concluded that the removal was a disproportionate restriction on freedom of expression. Facebook has been instructed to restore the post. The Board made additional recommendations for Facebook, notably that users should be informed of the grounds for content removal. That is positive for users and to be welcomed. However, a close reading of the Board's reasoning in this case also highlights the complexity of content removal cases. 

The decision assesses Facebook’s actions according to human rights standards under Article 19 of the International Covenant on Civil and Political Rights (ICCPR), and with specific reference to the United Nations Guiding Principles on Business and Human Rights. 

The Decision issues a reminder that any rules which seek to restrict freedom of expression must be clear, precise and publicly accessible, so that users can adjust their conduct accordingly (the legality principle). The Board assessed that Facebook’s Community Standard on Dangerous Individuals and Organisations falls short of what is required. It is imprecise and lacks clear examples to explain what it means. Users would find it hard to know how the standard would be applied, and it could seem that enforcement actions were arbitrary. 

The Decision is critical of the ‘gap’ between the rules that Facebook uses internally to make decisions and the information that it makes public. Facebook told the Board that it treats a quote attributed to a dangerous individual or organisation as an endorsement or expression of support or praise, unless the user includes a comment indicating that they do not support it. Facebook also said that it only looks at the quote or image, and does not take account of the context. The Board felt that the publicly available information provided by Facebook did not make this clear. 

The decision states that Facebook should have regard to contextual clues. These could include the timing of a post during an election campaign, the user’s location, and other users’ responses to a post, as well as the content of the post itself. It concludes that Facebook’s failure to do so reflects an unnecessary and disproportionate restriction on freedom of expression.

Indeed, context is key. From a user perspective, a genuine user would not understand how their content could be mistaken as ‘dangerous’. The user in this case was using the historical quote as a commentary on the US political situation in the run-up to last November’s US Presidential election. The Leeds for Europe case that I reviewed was contextually situated in contemporary British politics. The meaning of the post would be understood in a similar way, as political commentary. Indeed, these are not isolated instances, and this may well have been a reason why the Board chose this case. 

The Board gave three recommendations. It asked Facebook to notify users of the specific reasons for removing content. The notification should include the specific rule that is being used as the basis for the removal. This may seem obvious. However, based on the evidence in my research, it is a necessary condition. Facebook frequently does not specify the basis for content enforcement actions, and users are left wondering what they have done wrong. It is difficult to appeal against a removal without knowing why it was made. It will be a vast improvement if this recommendation is applied.

The Board called on Facebook to define what it means by ‘Dangerous Individuals’ and to provide a list of the entities covered. There certainly is a lack of precision in Facebook’s rules on Dangerous Individuals and Organisations, and in their application. It would be helpful for users to be more clearly informed of the criteria that Facebook would use when taking action. However, the Board’s recommendation may also be problematic to implement. Compiling a list of living individuals deemed dangerous carries a risk of defamation. 

The user also said that “their ability to use Facebook was restricted after they posted the content.” This is interesting. Facebook frequently applies ancillary sanctions when taking down content. It may block the user from posting for a period of time, which in two cases that I’ve seen was up to 30 days. In the Leeds for Europe case that I reviewed, a warning was applied in addition to the content being taken down. This is also known as a ‘strike’, and to me it recalls the failed 3-strikes policies for copyright enforcement**. A user is given warnings, and increasingly strong restrictions are applied with each new warning, the ultimate sanction being the unpublishing of a Page or suspension of an account. Facebook regularly applies a strikes system alongside content removal. It does not reveal the number of strikes or how the enforcement actions are determined. The Decision does not ask for the ‘strike’ to be removed. In my opinion, if a removal is overturned, strikes should also be removed from the user's account or Page. 

It is not known what the Decision will mean for similar posts, such as the Leeds for Europe post, but I would sincerely hope that any restrictions applied to similar posts will also be overturned, the posts restored, and any strikes removed from the users' accounts. 

As a final comment, this type of content is difficult. The Board accepted that there is a massive challenge in moderating what it terms ‘neo-Nazi’ content at a global scale. The contextual data points recommended by the Board are helpful, but ultimately, these decisions will have to be reviewed with a cultural and political understanding that an automated system is unlikely to have. It should also be clear that this decision does not legitimise politically divisive, disruptive or abusive content, or content inciting violence or insurrection. The law allows for the interests of protecting democracy and security to be balanced against freedom of expression. This is not a defence of Goebbels’ views; rather, it is a defence of the right to discuss his views, in order to explain to people the insidious ways in which democratic politics can be manipulated.

 ---

Addendum: This is not a commentary on the establishment or role of the Oversight Board, about which I do share some of the concerns that have been expressed by others. It relates to research that I have been working on for the past year, investigating take-downs and other restrictions, limits and blocks applied to Facebook Pages by the platform, and it is an output of that project. I will soon be releasing a paper on the main findings.

All of the Decisions released this week can be found here.

**3-strikes policies refer to the Hadopi law in France, the UK's Digital Economy Act 2010, and the debates in the European Parliament from 2008-2009 over the Telecoms Package and amendments concerning copyright, as well as the ACTA debates - see my books The Copyright Enforcement Enigma (Palgrave 2012) and A Copyright Masquerade (Zed 2013). 

---

Iptegrity is made available free of charge. You may cite my work, with attribution. If you reference the material in this article, kindly cite the author as Dr Monica Horten, Iptegrity.com, and link back to this page.

About me: I’ve been analysing European Union policy on the Internet and online content for more than 12 years. I hold a PhD in EU Communications Policy as well as a post-graduate diploma in marketing. I've worked with the Council of Europe on Internet governance issues, and I was on the Committee that drafted the CoE Recommendation on Internet Freedoms. For many years I was a telecoms journalist, writing for the FT among others, and I was an early adopter of the Internet. My current research is on platform responsibility. Please get in touch if you'd like to know more about my work. 



