
TL;DR The government's Impact Assessment calculates that this Bill will cost British businesses over £2 billion to implement. By its own admission, 97 per cent of the 24,000 businesses in scope are at low risk of having illegal or harmful content on their systems. Only 700-800 are likely to be high risk, and the real target, the big global platforms, only number around half a dozen. It is hard to see how the draft Bill of May 2021 could be justified on this basis. The Bill should focus on its real aim of tackling the global mega-platforms, and on high-risk issues like child sexual abuse. For 97 per cent of the 24,000 mostly small British businesses in scope, there is no evidence that they entail any risk, and the cost and regulatory effort is disproportionate to the aims.

British Internet businesses will be asked to foot a bill of some £2.12 billion to implement the Online Safety Bill, currently in draft [1], if it is passed by Parliament. That is according to the calculation in the government's own Impact Assessment [2], produced in April 2021, which is intended to assess the economic effects of the measures in the Bill.

Some 24,000 British businesses are said to be in scope. The £2.12 billion is the cost that the Internet companies themselves will have to pay over a 10-year period, in present-value terms, to implement the systems and processes needed for compliance with the Bill.

In addition to these implementation costs, these businesses will also have to pay licence fees to Ofcom, totalling £346.7 million over 10 years in present value (£46.0 million per year) [p60 S.219]. Ofcom has already been given £100 million of public money to begin work, even though the Bill has not yet been laid before Parliament [3]. It is not clear how this budget has been justified.

The lion's share of the implementation cost is for content moderation, estimated at £1.7 billion over a 10-year period in present value [p50 S.182-183]. The remainder is administrative: reading the requirements, updating terms of service, and producing the dozen or so risk assessments that the Bill calls for.

Content moderation is the industry jargon for monitoring and restricting content posted by users of a platform. When online platforms are asked, for example, to remove hate speech, abuse or disinformation from social media feeds, that is content moderation. The £1.7 billion is the overall cost to all businesses in scope of the Bill. The annual content moderation costs per business are calculated at £13.4 million for the very large businesses (those likely to be in Category 1 for the purposes of the Bill), £3.3 million for large businesses, £255,662 for medium-sized businesses, £45,058 for small businesses, and £2,540 for micro-businesses. The last two apparently apply only where a business is deemed 'high risk' of having illegal content on its platform.

The administrative costs include £9.2 million to cover time to read the requirements. This is stated as two to six hours' reading per company, at an hourly rate of £20.66 [S.127-129] - that is, between about £41 and £124 per company. How many Internet lawyers would work for £20.66 per hour?

Ofcom Codes of Practice are not easy reading, and companies will need legal advice. For example, the Code of Practice for the 2010 Digital Economy Act [4], which addressed alleged copyright infringement, ran to 129 pages, with a further Code on cost allocation of 74 pages. And that was 'just' for copyright. Imagine a dozen different types of illegal content, each with its own Code.

Then there is £12.4 million for a user reporting system [S.143], and £31 million to conduct the dozen or so risk assessments required by the Bill [S.160]. The risk assessments are expected to cost £3.6 million in the first year.

The cost of the user reporting system needs closer examination. This system is mainly to facilitate users who want to report illegal or harmful content, but also to handle complaints about erroneous content removals.

A comparator can be found in the 2010 Digital Economy Act, which had an appeals system for users who believed a restriction had been wrongly imposed. Both of the industries concerned - copyright holders and Internet services - were unhappy with Ofcom's costings, and the law was never implemented for this reason. In fact, the government had considerably underplayed the costs (see 'The £84 million-a-year bill for DE Act').

The costs do not account for age verification systems [Summary, Policy Option 2, p5]. Given that these are a requirement of the Bill - not stated specifically, but implied by what the services are obliged to do - this would seem to be a critical omission.

The Impact Assessment is somewhat unclear about the types of services encompassed in the 24,000 businesses in scope of the Bill. It confuses 'platforms' with Internet Service Providers and virtual private networks (VPNs), which are technically and legally different entities. Platforms are within the Bill's scope, but Internet Service Providers and VPNs are out of scope. It also refers to peer-to-peer filesharing and 'consumer cloud storage' [S.385 p107], which by the Bill's own definition of 'user-to-user services' are excluded.

However, the Impact Assessment is clear about the risks for the different types of businesses. It states that 49% of the 24,000 businesses are at low risk of having illegal or harmful content [S.118]. That's 11,760 of the businesses in scope. Another 48% (11,520) are medium risk. Therefore, 23,280 (97%) of the businesses that this Bill will require to undertake measures are at low-to-medium risk, with little likelihood of having the targeted content on their servers. A majority of those 23,280 businesses (80%) are micro-businesses [S.119]. Just 3% (720) are considered high risk, and less than 0.1% are likely to be classified as Category 1 services (very large platforms, with a lot of illegal or harmful content).
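For readers who want to check those proportions, here is a minimal sketch in Python that reproduces the arithmetic. The 24,000 total and the 49% and 48% risk shares come from the Impact Assessment; the script itself is purely illustrative.

    # Illustrative check of the risk-tier arithmetic in the Impact Assessment.
    # Inputs taken from the document: 24,000 businesses in scope,
    # 49% low risk [S.118], 48% medium risk; the remainder is high risk.
    TOTAL_IN_SCOPE = 24_000

    low_risk = TOTAL_IN_SCOPE * 49 // 100        # 11,760 businesses
    medium_risk = TOTAL_IN_SCOPE * 48 // 100     # 11,520 businesses
    low_and_medium = low_risk + medium_risk      # 23,280 businesses
    high_risk = TOTAL_IN_SCOPE - low_and_medium  # 720 businesses

    print(f"Low/medium risk: {low_and_medium:,} ({low_and_medium / TOTAL_IN_SCOPE:.0%})")
    print(f"High risk: {high_risk:,} ({high_risk / TOTAL_IN_SCOPE:.0%})")
    # Output:
    # Low/medium risk: 23,280 (97%)
    # High risk: 720 (3%)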

This is what one would expect to find. It reflects the highly skewed shape of the Internet content services industry, in which scale is concentrated in the largest services. There are around half a dozen social media platforms with a global user base, and a similar number of search services, in an ecosystem of thousands of smaller services serving national and international user bases. The global platforms have literally billions of users: Facebook 2.895 billion and YouTube 2.291 billion. Twitter is small by comparison, with 436 million. These figures are monthly active users as of October 2021, sourced from Statista [5].

Government Ministers have already identified the mega-platforms as the key targets, by virtue of the size of their user bases in the UK and the volume of content and usage that they represent. The Secretary of State, Nadine Dorries, speaking to the Parliamentary Draft Bill Committee on 4 November 2021, said: 'This Bill is not to fix the internet. This Bill is solely aimed at platforms that do harm to children.' [6] It has also been identified that some of the very harmful content may reside on other services. If we assume the Impact Assessment has reached a reasonable conclusion, then an appropriate target might be the assumed 720 or so high-risk services, in addition to the global platforms.

In the face of these numbers, which are in the government's own Impact Assessment, the Bill should focus on the real aim of tackling the harmful content on global mega-platforms. It should also address the relatively small number of high risk platforms hosting illegal content, notably child sexual abuse.

The targeting of 23,000 low-to-medium risk services should be dropped. Given the high costs that will be incurred, there is little basis for charging all of these businesses a licence fee, or for imposing on them the full weight of this Bill. It is also hard to justify the £100 million in public money that appears to be being spent on this. There is no evidence that these businesses entail any risk, and the cost and regulatory effort is disproportionate to the aims.

End-note: the last time the government forced through a law regulating Internet content without good evidence, the law was never implemented (see 'We had no evidence for DEAct, UK gov't confesses').

---

Iptegrity is made available free of charge under a Creative Commons licence. You may cite my work, with attribution. If you reference the material in this article, kindly cite the author as Dr Monica Horten, and link back to Iptegrity.com. You will also find my book for purchase via Amazon.

About me: I've been analysing digital policy for over 14 years. Way back then, I identified the way that issues around rights can influence Internet policy, and that has been a thread throughout all of my research. I hold a PhD in EU Communications Policy from the University of Westminster (2010), and a post-graduate diploma in marketing. For many years before I began my academic research, I was a telecoms journalist and an early adopter of the Internet, writing for the Financial Times and Daily Telegraph, among others.

Please get in touch if you'd like to know more about my current research.

If you liked this article, you may also like my book The Closing of the Net which discusses the backstory to the Online Safety Bill. It introduces the notion of structural power in the context of Internet communications. Available in Kindle and Paperback from only £15.99!


[1] Draft Online Safety Bill, May 2021

[2] Unless otherwise stated, the figures and section numbers refer to the Online Safety Bill Impact Assessment of 26 April 2021.

[3] Draft Online Safety Bill Joint Committee Oral Evidence transcript, 4 November 2021, Q284, Nadine Dorries

[4] Online Infringement of Copyright and the Digital Economy Act 2010: Notice of Ofcom's proposal to make by order a code for regulating the initial obligations

[5] Statista, Most popular social networks worldwide as of October 2021, ranked by number of active users https://www.statista.com/statistics/272014/global-social-networks-ranked-by-number-of-users/

[6] Draft Online Safety Bill Joint Committee Oral Evidence transcript, 4 November 2021, Q284, Nadine Dorries

