Understanding the UK Online Safety Bill


~1,400 words

Published:

Last modified: June 12th, 12,021 HE

The Internet Society’s UK England Chapter recently held a webinar to discuss the newly-published draft Online Safety Bill.

Olivier Crépin-Leblond—the Chapter’s chair—introduced the event and speakers before promptly handing over.

First up was Open Rights Group (ORG) Policy Manager Heather Burns, who provided an overview of the proposed bill and its many concerning provisions. Citing the government’s 2017 Internet Safety Strategy green paper (which I discussed previously here) and 2019 Online Harms white paper, Burns said that this draft bill had been a long time coming and that ORG et al. were not disappointed: all of the what-ifs that could have been in it are now confirmed to be absolutely in it, all in service of the stated goal of making the UK the safest place in the world to be online.

The bill, in its current form, adds content moderation and service design duties for anybody who offers user-to-user contact or user-generated content. In practice, this means: a flood of new compliance paperwork; service providers having to complete 1–3 risk assessments covering both illegal and subjectively-harmful content, for both adults and children; Ofcom receiving a range of new powers and responsibilities; criminal sanctions, as opposed to just civil ones; and a sweeping change to the intermediary liability regime that has existed up until now.

To demonstrate this last point, Burns contrasted the current legal regime with the proposed new one. Currently, service providers are responsible for providing transparency in their terms & conditions, performing due diligence on the content shared on their platforms, and respecting user freedoms; crucially, there is no general obligation for them to monitor content, a principle inherited from the EU’s 2000 eCommerce Directive. The content currently in scope is obviously-illegal content, moderated by human moderators, with Good Samaritan provisions shielding those moderators from liability for such content that they discover. The current due process is regulated by the courts, ensures (theoretically) consistent treatment, and guarantees conditional immunity from liability (provided a provider plays the game).

The bill, on the other hand, is hoped to be a world-leading experiment in shifting from the above model to one built around a duty of care (as transposed from the world of health & safety regulation). In its current form, it would impose a new general monitoring obligation and a requirement to automatically moderate (and be able to modify) content. It also introduces a slew of new business disruption measures, expands the scope of moderation to include legal but subjectively-harmful content, and shifts from the existing notice-then-takedown approach to one of action-then-notice.

Burns then covered a few areas of the bill in more detail. First up was its attempt to regulate private messaging, which serves to further this government’s singular goal of prohibiting and criminalising the use of end-to-end encryption (E2EE). The bill, said Burns, would make private messages subject to the same duty of care on illegal content, with the follow-on effect that any such automated message-scanning will necessitate the prohibition of E2EE. Though not spelled out explicitly in the bill, Burns noted that it is not unreasonable to expect this to extend to scanning private messages for legal but subjectively-harmful content, referencing one think-tank’s recent proposal that the new duty of care be used to impose criminal charges retroactively on developers who currently use E2EE (although, ingeniously, they neglected to mention any exceptions for the use of E2EE in financial transactions, etc.).

Another worrying element of the bill is its introduction (via the back door, of course) of mandatory age and ID verification. Companies within the scope of the bill would be required to assess whether children are able to access (not use, or register on) the service. Evidently, this would require the introduction of age gating to guarantee compliance, and Burns showed an example of the kind of lobbying material that MPs are currently receiving from the age verification technology lobby; chillingly, the same material also touts the possibility of collecting nationality information from users, which cannot possibly be justified on the grounds of child protection (but wouldn’t Priti Patel love to have a database of the nationalities of everyone who accesses online content in the country?).

Finally, Burns addressed the new business disruption measures that the bill would introduce. The power to impose such measures would be granted both to the Secretary of State for Digital, Culture, Media & Sport (DCMS) and to Ofcom, to be used in cases of either illegal content or a failure to carry out any of the 25 new subjective duties that the bill would impose on all content. For example, they would be able to target ancillary services (such as payment providers), impose access restrictions, impose fines, etc. Such measures would have to go through Ofcom and the courts, unless the matter is deemed to be one of national security or interest.

Burns concluded by framing the bill as an attempt to create a British Web for British people—i.e., a British splinternet—with the costs outsourced to business (and, of course, the government is currently attempting to position the whole thing as a lucrative business opportunity). Pre-legislative scrutiny begins this month, with MPs being heavily lobbied by vested interests and primarily viewing the bill through the lens of social media and the abuse directed at them and their constituents; however, Burns suggested, there is certainly a split on the topic, both across Whitehall and within the Tory party.

I missed the subsequent segment from Richard Allan, so next up was Robin Wilton of the Internet Society to discuss the technical incoherence of the bill.

Wilton pointed out that the word encrypt is only actually mentioned twice in the bill—in ss 22–23, making it an offence to reply to an Ofcom request with encrypted information with the intent of making it impossible to read—which he proposed was the government acknowledging that it cannot possibly win the argument on mandating backdoors (the last attempt to do so being the largely-derided ghost proposal in 2018). Despite dire warnings from law enforcement that the widespread use of E2EE will result in a going dark scenario, such claims are routinely proven baseless—Wilton presented the examples of Silk Road, AlphaBay, EncroChat and the barely-a-week-old revelations about the hugely successful AN0M sting operation. These case studies, said Wilton, are proof that networked criminal activity produces vulnerabilities that law enforcement can exploit, regardless of whether E2EE is used or not. Technically-mediated communications may amplify some crimes, he added, but they don’t create them; despite this, the draft bill is silent on the causes and origins of content relating to crimes, yet is nonetheless being proposed as a solution to such societal problems.

Wilton quoted former GCHQ Director Robert Hannigan, who argued against such an approach in 2017, and added that there had been no assessment of the economic harm that the bill would cause if introduced. Whilst the bill makes a stab at cost analysis by guessing at compliance costs and the like, a recent Internet Society report on a similar bill introduced in Australia found that it had resulted in billions in economic impact.

Wilton concluded by quoting a law firm, which had stated that the bill is so radical and onerous in parts that, if it survives and passes through the legislative system intact, the Internet and social media as we know it will be hard to recognise.

Last up was Chris Yapp, who presented the example of the aforementioned EU eCommerce Directive as a largely-uncontroversial measure at the time of its adoption that, alongside benefitting small companies by enabling them to sell to the entire EU market, also had the unforeseen consequence of allowing large tech firms to largely avoid paying tax. Unintended consequences, he said, are as important to watch as a bill’s intended effects. Presenting another example, this time of a bill that saw a large amount of controversy around its introduction, he moved on to the Regulation of Investigatory Powers Act. At the time, Yapp’s suggestion that the then-bill would allow local authorities to spy on parents to see if they lived within a school catchment area was apparently laughed at, only for it to happen less than two years later.

There then followed a Q&A.