
The UK is ‘clearly a target for Russia’s disinformation campaigns’. Protecting our democratic discourse from a hostile state is the role of the intelligence agencies. Integral to that process are the social media platforms, which are private actors. What role should platforms have in a national security context? The Russia report, released on 21 July, exposes some of the issues.

The Russia report* confirms that the UK is a target for online political interference by the Russian State (para 31), but it exposes a gaping hole in the ability of the UK authorities to tackle the problem. It paints a worrying picture of the intelligence agencies abrogating their responsibility to protect the discourse and processes of the UK against the activities of foreign powers. Despite the known interference on social media, including with the 2016 referendum, there seems to be little understanding of what happened or what to do about it.

The Russia report was published on 21 July by the UK Parliament’s Intelligence and Security Committee,  after a delay of many months. The release follows a petition to get it into the public domain.

It describes Russia as a highly capable cyber actor (para 13)*, a hostile State (para 33) that is targeting the UK with campaigns to undermine our democratic discourse, either by promoting its own agendas or simply by sowing confusion. It is covertly using online methods, including on social media platforms, to spread false, distracting and distorting narratives (para 31). Specific tactics include bots and trolls, and the use of State-owned international media (para 28), which typically generate high levels of influence, with social media posts attaining a high reach.

One concern raised in the report is that the UK’s security services claim that they are not responsible (para 33) for tackling this hostile State interference.

The intelligence agencies suggest that government responsibility lies not with them but with DCMS – the Department for Digital, Culture, Media and Sport. DCMS says it is responsible only for policy regarding the use of disinformation, not for protecting the public against hostile state attacks. There seems to be total confusion in government about who is responsible for cyber policy overall. The report describes an unnecessarily complicated wiring diagram of responsibilities (para 18).

The intelligence agencies conducted no threat assessments regarding the interference via social media, either before or after the 2016 referendum. The report suggests that this was a failure to protect our democracy and that ‘it is important to establish whether a hostile state took deliberate action with the aim of influencing a democratic process, irrespective of whether it was successful or not’ (para 39).

The lack of a threat assessment is particularly shocking given the evidence that is not only in the public domain, but held by the UK Parliament’s DCMS Select Committee. This evidence (I have sifted through quite a bit of it**) reveals not only the bot and troll activity, but also how a range of covert online techniques were used to influence the 2016 referendum.

As the report rightly says, the intelligence agencies are responsible for safeguarding the democratic processes (paras 31, 33, 34) against interference from a hostile foreign State and from ‘actions intended to undermine our democracy’ (paras 34, 66). In that context, they are responsible for protecting democratic discourse. In the 21st century, the main venue for democratic discourse is provided by the social media platforms.

The Russia report suggests that the intelligence agencies could have acted in this regard. They could have “stood on the shoulders” (para 46) of this evidence to find the owners of suspicious social media accounts and disrupt malicious activity.

The report then slips in a paragraph about social media platforms (para 35). It says they “hold the key” and are “failing to play their part.” It calls on the government to “establish a protocol to ensure they take covert hostile use of their platforms seriously and have clear timescales within which they commit to removing such material”.

This reflects the UK government’s policy on social media (well-intentioned but in need of more work, as outlined by Graham Smith on his Cyberleagle blog). However, the issue at stake here is national security. As the Russia report has established, this is the role of the intelligence agencies. Asking social media platforms to take on a national security role is a problematic approach.

Social media platforms are private actors. The largest players, such as Facebook, Google, Twitter and Microsoft, are global corporations. They have no public accountability. There is no regulatory oversight.

They do not police their platforms according to the law, but according to their own internal policies***. Disinformation poses a particular problem because it is usually context-sensitive, and it is not always obvious what is and is not in the category to be taken down. Figuring that out requires a complex understanding of the political, social and cultural context.

How, therefore, should the hostile use of platforms be defined? A platform needs a definition to know what to seek out and take down. How should a platform know when something is intentionally false, creating a distorting or distracting narrative, with the aim of either influencing UK politics directly or sowing discord and division in this country?

Social media platforms are likely to make decisions about content take-downs according to criteria that include corporate risk factors, which will be assessed across many countries. We cannot expect them to take decisions according to (what to them will be) narrow UK criteria.

The mechanisms put in place by the platforms to identify what they call ‘false news’ are inappropriate for this task. They rely on a pedantic fact-checking exercise, whereas hostile States seeking to disrupt our democracy use highly sophisticated techniques that would fall straight through that net. For example, they involve dropping pieces of false information into a text that otherwise appears legitimate.

Content moderation is not the solution either. To a content moderator on the other side of the world, with no contextual knowledge of the UK political situation, a post or a meme may not seem to violate the platform policies, yet it may be deeply destructive from a UK perspective. By contrast, innocent, lawful memes and posts do get caught up in trawls by the platforms’ automated systems.

Social media platforms are the vehicle, but they are not the perpetrators creating false narratives. They hold data that could help uncover the perpetrators, but they are not law enforcers. They have no mandate to address issues of UK national security, and neither should they.

The question that should be asked is about the kind of co-operation that could help the intelligence services protect democratic discourse. However, this also raises the tricky issue of how far the state can demand, for example, to see data, without intruding too far on individual rights. There are already issues with the bulk powers given to the intelligence services under the Investigatory Powers Act.

It is not going to be easy to find the right balance. A threat assessment and an enquiry into the 2016 referendum would seem to be a necessary first step. Aligning responsibility within government would help too. There will be a need to call on social media platforms to assist. However, outsourcing national security to a private actor would be an abrogation of duty.

As the Russia report establishes, it is the duty of the intelligence agencies to maintain national security and to safeguard our democratic processes. That is surely the case now, as it ever was, back in the days of James Bond.



*Paragraph numbers refer to the Russia report:

Intelligence and Security Committee of Parliament: Russia (HC 632), presented to Parliament pursuant to section 3 of the Justice and Security Act 2013. Ordered by the House of Commons to be printed on 21 July 2020.

The Intelligence and Security Committee of Parliament (ISC) examines the policies, expenditure, administration and operations of the seven Agencies and Departments which form the UK Intelligence Community (UKIC): MI5 (the Security Service); MI6 (the Secret Intelligence Service); GCHQ (Government Communications Headquarters); Defence Intelligence in the Ministry of Defence; the Joint Intelligence Organisation (JIO) in the Cabinet Office; the National Security Secretariat (NSS) in the Cabinet Office; and the Office for Security and Counter-Terrorism (OSCT) in the Home Office.

**I presented on micro-targeting in the 2016 referendum at a seminar hosted by MEP Alexandra Geese in the European Parliament on 12 November 2019.

***I'm just completing a preliminary study looking into Facebook's enforcement policies and their impact on a small fleet of pages.  


Iptegrity is made available free of charge. You may cite my work, with attribution. If you reference the material in this article, kindly cite the author as Dr Monica Horten, Visiting Fellow, London School of Economics and Political Science, and link back to Iptegrity.

About me: I’ve been analysing European Union policy for more than 10 years. I hold a PhD in EU Communications Policy as well as a post-graduate diploma in marketing. I've worked with the Council of Europe on Internet governance issues, and I was on the committee that drafted the CoE Recommendation on Internet Freedoms. For many years I was a telecoms journalist, writing for the FT among others, and I was an early adopter of the Internet. My current research is on platform responsibility. Please get in touch if you'd like to know more about my work.


