
Culture Committee of the European Parliament in session 2019

Will they dump the upload filter? Moves are afoot in the European Parliament to protect free speech and reject the imposition of upload filters in two key pieces of legislation before the current sitting. The Copyright Directive has been stalled in the Council, with a scheduled trilogue cancelled. Meanwhile, amendments to delete the requirements for proactive monitoring in the Terrorism Content Regulation are being tabled in two committees.

The Copyright Directive, which incorporates a controversial proposal for Internet platforms to police entertainment content, has been put on ice following a meeting of the Council of Ministers last Friday, in which 11 Member States voted against a compromise proposal put forward by the Romanian Presidency. A trilogue meeting with the European Parliament and Commission, scheduled for Monday, was subsequently cancelled. The cancellation was publicised yesterday on Twitter by MEP Julia Reda, and confirmed by a Reuters report. It seems that no alternative dates have been set for further trilogues. The upshot is that the Copyright Directive may not be adopted before the European elections, as had been the plan.

The Terrorism Content Regulation, just beginning its journey in the European Parliament, is being targeted with amendments to remove the requirement for proactive measures (which implies an upload filter), as well as other amendments that tighten up referrals by law enforcement bodies and insist on a judicial order for content removals. These amendments are important because they would bring the proposed Regulation into line with rule of law and human rights standards.

It seems that the controversial Article 13 – the upload filter proposal – was the reason the Copyright Directive was rejected. Among the countries that voted against it were Germany, Sweden and Poland. This does not necessarily mean the end of the Copyright Directive, but it does look as though Article 13 is unlikely to be revived in the near term. One option would be to carve out the two controversial articles and adopt the remainder of the Directive.

The notion of an upload filter has been widely criticised. It would entail the filtering of all content uploaded by users onto platforms. It puts freedom of expression at risk and heightens the surveillance of user activity, as outlined in a report by three UN Special Rapporteurs (dated 7 December 2018). Such a system provides all the necessary tools for mass censorship, and it is frequently described as a ‘censorship machine’. Additional concerns arise from the fact that such a system would be in the hands of private companies.

The upload filter would also serve to entrench the monopolies of big platforms at the expense of smaller players,  who would struggle to bear the costs.

Rights-holders have lobbied heavily for the upload filter, but recently changed their position, writing to senior EU officials to call for suspension of the Article 13 trilogues. Evidence is emerging that the inclusion of the requirement for an upload filter was linked to corporate lobbying by vendors of filtering systems.

It’s curious, then, that the upload filter also arises in a completely separate piece of legislation, the Terrorism Content Regulation. Article 6 of the proposed Regulation calls for proactive measures by platforms and hosting providers:

(a) preventing the re-upload of content which has previously been removed or to which access has been disabled because it is considered to be terrorist content;

(b) detecting, identifying and expeditiously removing or disabling access to terrorist content.

This text implies the installation of a filter at the point where the user uploads content. The Regulation also seeks to require providers to disable or remove content within one hour of receiving a referral from a competent authority. This too is problematic.

Further criticisms of the proposed Terrorism Content Regulation relate to the vagueness of referrals for content that is to be blocked, and the nature of the authority asking for the block. This is why MEPs are tabling amendments to impose greater transparency, more clarity on the definition of “terrorist content”, and a requirement that content removal orders may only be issued by judicial authorities. These amendments are contained in the opinions drafted by the Culture Committee (CULT) and the Internal Market Committee (IMCO).

Other  amendments  seek to delete the reference to proactive measures. This would have the effect of  removing the requirement for an ‘upload filter’.

This means it is unlikely that the Terrorism Content Regulation will be adopted during this Parliament – that is, before the European elections. The lead committee is the Civil Liberties Committee (LIBE), chaired by British MEP Claude Moraes; the rapporteur is also British – Dan Dalton of the ECR Group (British Conservatives). He must now work through these amendments and propose his Report for MEPs to vote on.

It is not yet clear how far these amendments reflect a consensus in the Parliament, but the fact that the notion of an upload filter is being shelved by the Council suggests that the amendments might gain support, at the expense of the  Commission’s original proposal.  

Ultimately, this is a question of how the law balances different sets of interests – those of rights-holders, law enforcement, Internet intermediaries, and users. It has been established that proactive filtering runs counter to EU law under the E-Commerce Directive. The European Court of Human Rights has confirmed that freedom of expression online is protected under Article 10 of the European Convention on Human Rights. If the upload filter is defeated in both pieces of legislation, it will set a clear line as to where the balance lies in EU law between the policing of content online and the protection of free speech.


If you are interested in my work, please see my books advertised on this site, or contact me via Contact Us page or Twitter.

Media: If you cite any part of this article, please state the author as Dr Monica Horten, and link back to

Photos: screen shot of European Parliament Culture Committee

Iptegrity in brief is the website of Dr Monica Horten. I’ve been analysing digital policy since 2008. Way back then, I identified how issues around rights can influence Internet policy, and that has been a thread throughout all of my research. I hold a PhD in EU Communications Policy from the University of Westminster (2010) and a post-graduate diploma in marketing. I’ve served as an independent expert on the Council of Europe Committee on Internet Freedoms, and was involved in a capacity-building project in Moldova, Georgia, and Ukraine. I am currently (from June 2022) Policy Manager - Freedom of Expression with the Open Rights Group. For more, see About. Iptegrity is made available free of charge for non-commercial use. Please link back and attribute Monica Horten. Thank you for respecting this.

Contact  me to use  iptegrity content for commercial purposes


