Big tech accountability? Read how we got here in The Closing of the Net

TL;DR They say they do, but the Bill is not clear. The government has been quite shifty in its use of language, obscuring a requirement for encrypted messaging services to monitor users' communications. If the services comply with this requirement, they will have to break the encryption that protects users' privacy, and users risk being less safe online. At the same time, the services will be conflicted in their legal duties to protect users' privacy, as will the regulator Ofcom. Private messaging services are important to millions of UK users. Their obligations under the Online Safety Bill need clarification and amendment.

***UPDATE 24 May 2022: Quietly, behind the scenes, there is confirmation that this is exactly what the government wants to do.***

Read more...

TL;DR A puzzling feature of the UK Online Safety Bill is the special protection it gives to 'content of democratic importance'. It asks the large online platforms to give special treatment to such content, in cases where they are taking a decision to remove the content or restrict the user who posted it. However, the term appears to have been coined by the government for the purpose of this Bill, and what it means is not clear. There is no statement of the policy issue that it is trying to address. That makes it very difficult for online platforms to code for this requirement.

Read more...

TL;DR The UK government's Online Safety Bill creates a double standard for freedom of expression that protects large media empires and leaves ordinary citizens exposed. It grants special treatment to the large news publishers and broadcasters, who get a carve-out from the measures in the Bill, so that headlines like the notorious "Enemies of the People" get special protection from automated content moderation systems. They even get a VIP lane to complain. Foreign disinformation channels, including Russia Today, would also benefit from this carve-out. Content posted by ordinary British people could be arbitrarily taken down.

Read more...

 

TL;DR Social media companies will be required by the government to police users' posts by removing content or suspending accounts. Instead of a blue-uniformed policeman, it will be a cold, coded algorithm putting its virtual hand on the user's shoulder. The imprecise wording offers the companies huge discretion. They have a conflicted role: to interfere with freedom of expression and simultaneously to protect it. Revision is needed to protect the rights of those who are speaking lawfully, and doing no harm, but whose speech is restricted in error.

Read more...

Draft Online Safety Bill committee, 4 November 2021

TL;DR Key decisions will be taken behind Whitehall facades, with no checks and balances. The entire framework of the Bill is loosely defined and propped up by Henry VIII clauses that allow the Secretary of State (DCMS and Home Office) to implement the law using Statutory Instruments. This means that Ministerial decisions will get little or no scrutiny by Parliament. This will include crucial decisions about content to be suppressed and compliance functions required of Internet services. Standards for automated detection of illegal content will be determined by the Home Secretary. The concern is whether these powers could ever be used to block lawful but inconvenient speech.

 

Read more...

TL;DR The government's Impact Assessment calculates that this Bill will cost British businesses over £2 billion to implement. By its own admission, 97 per cent of the 24,000 businesses in scope are at low risk of having illegal or harmful content on their systems. Only 700-800 are likely to be high risk, and the real target, the big global platforms, number only around half a dozen. It is hard to see how the draft Bill of May 2021 could be justified on this basis. The Bill should focus on the real aim of tackling the global mega-platforms and high-risk issues like child sexual abuse. For the 97 per cent of the 24,000 small British businesses, there is no evidence that they entail any risk, and the cost and regulatory effort are disproportionate to the aims.

Read more...

 

TL;DR A website blocking order is a modern form of censorship. In the wrong hands, it is a dangerous weapon. Blocking orders provided for in Clauses 91-93 of the Online Safety Bill could be used in the most egregious cases to block overseas Internet services that refuse to comply with the Bill. They are not suitable for targeting 'big tech' social media platforms. Blocking orders have been used in the UK for copyright enforcement since 2011, and there is a body of case law to draw on. If these orders are used, they should be precise and specify the exact locations of the content, site or server to be blocked.

Read more...

Nadine Dorries, Secretary of State, 4 November 2021, screenshot from Parliamentlive.tv

TL;DR The Online Safety Bill is a major piece of legislation intended to tackle the very difficult and troubling issues around social media. However, in its desire to remove the bad stuff, the Bill sets up a legal and technical framework that mandates and enforces the automated suppression of online content and social media posts. The lack of a precise aim has enabled it to be moulded in a way that raises a number of concerns. Government Ministers will have unprecedented powers to define the content to be removed. They will be able to evade Parliamentary scrutiny through the use of secondary legislation. Social media platforms will have a wide discretion to interpret the rules and to determine whether content stays up or goes down. These factors, combined with the overall lack of precision in the drafting and the weak safeguards for users, mean that the Bill is unlikely to meet human rights standards for protecting freedom of expression online.

UPDATED to reflect the Bill as introduced to the House of Commons on 17 March 2022

Read more...

I'm delighted to introduce my new working paper, 'Algorithms patrolling content: where's the harm? An empirical examination of Facebook shadow bans and their impact on users'.

READ THE FULL ABSTRACT & DOWNLOAD it via SSRN

Read more...

UK intelligence services have been taking advantage of gaps in the international rules to conduct bulk interception of Internet traffic. That practice came under scrutiny in the European Court of Human Rights, in a ruling released this week.

The case of Big Brother Watch and Others v the United Kingdom was brought to the Court by human rights activist groups concerned about the mass online surveillance carried out by UK intelligence services. It has resulted in a ruling that lays out essential ground rules for protecting privacy.

Read more...

The European Parliament has voted to ratify the EU-UK Trade and Cooperation Agreement (TCA). It did so despite a litany of reservations. But why?

Today, the European Parliament gave its consent to the so-called Brexit deal, formally known as the EU-UK Trade and Cooperation Agreement (TCA). There was an overwhelming majority: 660 in favour, 5 against, and 32 abstentions. The agreement now needs only to be adopted by the European Council, whereupon it will formally enter into force.

It entailed a simple yes/no vote under the consent procedure but, as with all things Brexit, it wasn't really that simple.

Read more...

A disproportionate restriction on freedom of expression – Fails to meet the legality principle – Gap between publicly available information and internal rules. 

These are the conclusions of the Facebook Oversight Board in one of its first decisions on content removal by the platform. Facebook's action was assessed against human rights standards. The decision paves the way for users whose content is unfairly targeted by enforcement actions to argue for a rights-based approach. Yet it also highlights some of the difficulties of automated content moderation. It contains much that is to be welcomed, although ultimately the Board will have to demonstrate that it has real teeth.

I explore the rationale, drawing on my own research where I encountered a similar post to the one in this case.  

The case concerned content related to

Read more...

As talks on a UK-EU post-Brexit trade deal enter their tense final stages, a vital agreement on security co-operation is hanging in the balance. A bespoke proposal has been tabled by the EU. It would facilitate ongoing access to the cross-border data that police and intelligence services need. If it cannot be agreed, there are serious risks for law enforcement and individual privacy. A reluctance on the part of the UK government to commit to future support for the European Convention on Human Rights puts it in jeopardy.

The security co-operation agreement is needed so that UK law enforcement

Read more...

The UK is ‘clearly a target for Russia’s disinformation campaigns’. Protecting our democratic discourse from a hostile state is the role of the intelligence agencies. Integral to that process are the social media platforms, who are private actors. What role should platforms have in a national security context? The Russia report, released on 21 July, exposes some of the issues.

The Russia report* confirms that the UK is a target for online political interference by the Russian state (para 31), but it exposes a gaping hole in the ability of the UK authorities to tackle the problem. It paints a worrying picture of the intelligence agencies abrogating their responsibility to protect the discourse and processes of the UK against the activities of foreign powers. Despite the known interference on social media, including with the 2016 referendum, there seems to be

Read more...

 

The Closing of the Net (Polity Press, 2016)


"takes the pulse of the open web" Journal of IP Law & Practice


PAPERBACK & KINDLE FROM £15.99

 

If the open Internet is an essential precondition for democracy, should governments or corporations be allowed to restrict it? This is the question at the heart of my book ‘The Closing of the Net’, which discusses the backdrop to today's political controversies around such issues as fake news, terrorist content online, and misuse of data, controversies that result in calls for ‘responsibility’ by online companies. The book argues that any regulation of these companies must enshrine public interest criteria, balancing the competing rights at stake.

Read more...


Iptegrity in brief

 

Iptegrity.com is the website of Dr Monica Horten. I've been analysing digital policy since 2008. Way back then, I identified how issues around rights can influence Internet policy, and that has been a thread throughout all of my research. I hold a PhD in EU Communications Policy from the University of Westminster (2010) and a post-graduate diploma in marketing. I've served as an independent expert on the Council of Europe Committee on Internet Freedoms, and was involved in a capacity-building project in Moldova, Georgia, and Ukraine. I am currently (from June 2022) Policy Manager - Freedom of Expression with the Open Rights Group. For more, see About Iptegrity

Iptegrity.com is made available free of charge for non-commercial use. Please link back and attribute Monica Horten. Thank you for respecting this.

Contact me to use Iptegrity content for commercial purposes

 

States v the 'Net? 

Read The Closing of the Net, by me, Monica Horten.

"original and valuable" Times Higher Education

"essential read for anyone interested in understanding the forces at play behind the web." ITSecurity.co.uk

Find out more about the book here: The Closing of the Net

PAPERBACK / KINDLE

FROM £15.99

Copyright Enforcement Enigma launch, March 2012

In 2012, I presented my PhD research in the European Parliament.

The politics of copyright

A Copyright Masquerade - How corporate lobbying threatens online freedoms

'timely and provocative' Entertainment Law Review


 

Don't miss Iptegrity! Iptegrity.com RSS / Bookmark