A plethora of high-profile examples in recent months have emphasised the damage that illegal and harmful online content can do, both at an individual and a societal level. Since becoming Collins Dictionary’s word of the year in 2017, “fake news” has been at the forefront of discussion, challenging the democratic backbone of societies. The question of whether we can trust the images that we see and the words that we read is sharpened further by the emergence of regional “splinternets”, and by the realisation that our neighbours’ reality may differ from our own.

There has never been greater scrutiny of online content, and with it increasing pressure from society and regulators on online platform providers to act as “editor” and police the content on their platforms, focussing particularly on the need to protect the vulnerable. Digital regulation across Europe is now gathering momentum, with divergent legislation proposed under the UK’s Online Safety Bill and the EU’s Digital Services Act. This legislation is extensive, with aspects applying to all businesses that produce content, not just online platform providers. In this article, we examine some of the key elements of the UK’s Online Safety Bill, presented to Parliament on 17 March 2022 (the “Bill”), as a prologue to a series of more wide-ranging commentaries as legislation around the globe develops.

What aspects of online safety does the Bill focus on? 

The Bill is wide-ranging, with multiple facets aiming to knit together the current patchwork quilt of related legislation in the UK, whilst treading a fine line between protection and infringement of human rights.

At its core, the Bill focuses primarily on addressing the proliferation of illegal content and online activity, and on ensuring that both children and adults using online services are not exposed to harmful content, even where that content is not illegal, by placing a duty of care on social media platform providers and search engines. These businesses will need to put in place a series of measures to improve user safety, including by seeking to combat trolling, online abuse and unwanted contact on social media. To demonstrate that they are meeting the new requirements, social media platforms and search engines will be required to implement systems and processes that improve user safety. It is these provisions that form the focus of the remainder of this article.

It should be noted, however, that additional measures covered by the Bill include:

  • a requirement for age verification on all websites that host pornography;
  • the criminalisation of cyber-flashing;
  • regulation of services targeting UK consumers (even if the service is not based in the UK) and a duty to protect against fraudulent advertising; and
  • the appointment of Ofcom (the UK’s communications regulator) as regulator, with power to enforce the offences contained in the Act against companies, senior executives and employees. As part of this supervisory role, Ofcom will oversee the transition to the use of improved systems and processes by social media platforms and search engines.

Protecting children

The Bill imposes a duty on platforms that are likely to be accessed by children, obliging platform operators to protect young people and proactively take measures against activities that threaten children’s safety online. This duty is being introduced against a wider UK regulatory backdrop of the protection of children online, alongside the introduction of the UK ICO’s Age Appropriate Design Code that seeks to both protect children’s privacy online and ensure that services accessed by children are compliant with UK data protection law (read more about the Age Appropriate Design Code here). From a practical perspective, this additional duty adds another layer of compliance for businesses, giving them an increasingly long checklist when undertaking reviews of the user experience as they look to assess compliance with the Bill. The protection of children has been a major political driver behind the introduction of the Bill and one that has, understandably, received the widespread support of the UK public.

Which businesses does the Bill apply to? 

Aspects of the Bill apply to all businesses, but the provisions on which we focus in this article are largely applicable to search engine services, services that host user generated content (“UGC”) and services that enable online interaction between users, all of which are subject to a new duty of care to UK consumers imposed by the Bill. All businesses whose services host UGC such as images, videos and comments, or that allow UK users to talk with other people online through messaging, comments and forums, will need to comply with the Bill when it becomes law. This includes messaging apps, some online games, cloud storage platforms and pornography sites.

For a business based outside the UK to be caught, the Bill specifies that the service must either have a significant number of UK users, or have UK users as a target market, and that “there are reasonable grounds to believe that there is a material risk of significant harm to individuals in the United Kingdom presented by user generated content / search content (as applicable)”.

What happens if a business does not comply?

The enforcement powers conferred on Ofcom by the Bill range from imposing fines and instructing companies to take corrective action on their non-compliant platforms, through to imprisonment of senior managers who fail to comply with Ofcom’s information requests or who deliberately destroy or withhold information. In severe cases of non-compliance, platforms could be blocked, presenting a substantial business interruption risk. Looking at the potential financial penalties introduced by the Bill, parallels can be drawn with the approach taken under the EU’s General Data Protection Regulation (“GDPR”). Failure to comply with the Bill could result in companies being fined up to 10% of worldwide group revenues for their most recent complete accounting period or £18 million, whichever is greater. This is significantly higher than the GDPR’s cap of 4% of annual global turnover (or €20 million, if higher), seemingly reflecting the social and political pressure for greater control in this area.
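
By way of illustration only, and using hypothetical revenue figures rather than anything drawn from the Bill itself, a minimal sketch of how the “whichever is greater” cap operates:

    # Illustrative sketch only (hypothetical figures, not legal advice):
    # the Bill caps fines at the greater of £18 million or 10% of worldwide
    # group revenue for the most recent complete accounting period.

    def maximum_fine_gbp(worldwide_group_revenue_gbp: float) -> float:
        """Return the maximum fine under the cap described above."""
        return max(18_000_000, 0.10 * worldwide_group_revenue_gbp)

    # A group with £500m of worldwide revenue faces a cap of £50m, while a
    # group with £100m of revenue falls back to the £18m floor.
    print(maximum_fine_gbp(500_000_000))  # 50000000.0
    print(maximum_fine_gbp(100_000_000))  # 18000000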

In a bid to toughen the deterrents against companies breaching their obligations, the Bill goes beyond fines and could result in criminal liability for those involved in several of the proposed offences. Many of these offences are applicable to employees at every level of the company, but there is particular focus on the newly introduced “senior manager” role. All that would be required for a senior manager to be caught by the provisions of the Bill is a failure on their part to take “all reasonable steps” to prevent an offence under the Bill. Furthermore, should an employee or senior manager be found complicit or to have engaged in offending behaviour, the Bill provides for up to two years’ imprisonment, a high liability burden for even the most experienced of senior managers to stomach.

What burden does the Bill impose in terms of user generated content?

Earlier we mentioned UGC, liability for the monitoring of which has for many years been imposed under UK law dealing with copyright, trade marks, privacy rights and defamation (amongst other legal risks). Faced with increased liability risks relating to UGC under the Bill, one measure that companies could implement to ease the compliance burden is the development or deployment of algorithms to monitor harmful content. It may also be sensible to implement a secondary review system, possibly manual. Some businesses may already use this kind of system to address existing liability for the display of UGC on their sites, but the Bill will add a level of complexity that may take considerable time and expense to overcome. In terms of assessing liability in the event of non-compliance, we will have to wait and see how far liability can be reduced through the implementation of (for example) additional manual systems if an algorithm fails to pick up harmful content and/or a human reviewer fails to spot an offending comment online.
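
By way of a purely illustrative sketch, the kind of two-tier process described above might look something like the following, where classify_harm stands in for whatever automated model a business chooses to deploy and the thresholds are entirely hypothetical assumptions:

    from dataclasses import dataclass, field
    from typing import Callable, List

    # Illustrative sketch only: classify_harm is a hypothetical stand-in for an
    # automated harm-scoring model; the thresholds are arbitrary assumptions.

    @dataclass
    class ModerationPipeline:
        classify_harm: Callable[[str], float]   # returns a harm score in [0, 1]
        remove_threshold: float = 0.9           # automated removal above this score
        review_threshold: float = 0.5           # route to human review above this score
        manual_review_queue: List[str] = field(default_factory=list)

        def handle(self, content: str) -> str:
            score = self.classify_harm(content)
            if score >= self.remove_threshold:
                return "removed"                # first tier: algorithmic decision
            if score >= self.review_threshold:
                self.manual_review_queue.append(content)
                return "queued_for_review"      # second tier: manual review
            return "published"

    # Example: a trivially simple length-based "model", purely for illustration.
    pipeline = ModerationPipeline(classify_harm=lambda text: min(len(text) / 1000, 1.0))
    print(pipeline.handle("a short, innocuous comment"))  # published

The point of the second tier in such a design is that content the algorithm is uncertain about is held for human judgement rather than published or removed automatically; whether layering tiers in this way would in practice reduce liability where both tiers miss offending content is, as noted above, something that remains to be seen.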

What should businesses do next?

Firstly, we recommend that businesses assess whether they are caught by the provisions of the Bill before undertaking a careful audit of exactly which provisions will apply to their operations, the parts of their websites this relates to and the potential liability exposure this presents. Rather than wait for the Bill to become legislation, now is the time to assess the impact of the Bill and consider what can be done to address potential liability. DLA Piper’s team of legal advisors will keep you updated as the Bill progresses and are on hand to provide any legal support required regarding the impact of the Bill.

 

If you would like advice in relation to the Online Safety Bill or more information about any of the points raised in this article, please contact one of the authors or your usual DLA Piper contact.