
2 November 2023

The UK's Online Safety Act – 8 of 8 Insights

The Online Safety Act's approach to protecting fundamental rights and freedoms

Timothy Pinto asks whether the OSA has found the right balance between protecting freedom of expression, privacy, journalistic content and content of democratic importance, and protecting online users.

Author

Timothy Pinto

Senior Counsel

A key weakness of the 2019 Online Harms White Paper – the precursor to the Online Safety Bill (OSB) – was the lack of concrete protection for the right to freedom of expression. This right, as enshrined in Article 10 of the European Convention on Human Rights, is "one of the essential foundations of a democratic society and one of the basic conditions for its progress and for each individual's self-fulfilment" (Lingens v Austria).

The focus on protecting users from illegal and certain types of harmful user-generated content (UGC) will result in systems, measures, policies and practices designed to identify and prevent certain types of UGC from being published, and to facilitate their prompt removal if already published. This potentially clashes with the right to freedom of expression. By monitoring and taking action against user content and users, the service provider will also potentially be interfering with users' right to privacy. As a result of concerns that the original OSB did not do enough to protect fundamental rights and freedoms, protections for freedom of expression and the right to privacy, as well as for related content, were enhanced over the course of the Bill's passage.

Following Royal Assent, what is now the Online Safety Act (OSA) has specific provisions covering:

  • freedom of expression and privacy (for all services)
  • content of democratic importance (for Category 1 services)
  • news publisher content (for Category 1 services), and
  • journalistic content (for Category 1 services).

Are these provisions enough to protect these fundamental rights and freedoms?

Freedom of expression and privacy

There is a duty on all user-to-user services, when deciding on and implementing safety measures and policies, to have particular regard to the importance of protecting:

  • users' right to freedom of expression within the law, and
  • users from a breach of any statutory provision or rule of law concerning privacy.

Search services have similar obligations which also extend to protecting the freedom of expression of "interested persons" – a person (or business) located in the UK and responsible for a searchable website or database. 

Category 1 services have additional duties to carry out and publish up-to-date assessments of the impact their safety measures and policies may have on the rights to freedom of expression and privacy, and to specify in a public statement the positive steps taken in response to those assessments to protect users' rights.

There is only a duty to have particular regard to these fundamental rights. Therefore, if a service provider can show that it has considered them with sufficient thought and at the appropriate time, it will probably have complied. Good record keeping when making decisions which may affect users' free speech or privacy may help to demonstrate that the duty has been fulfilled.  Service providers will be treated as complying with their duties regarding freedom of expression and privacy if they take or use the relevant recommended measures to incorporate safeguards to protect users' rights.
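Purely by way of illustration – the OSA does not prescribe any format for such records, and every field name below is hypothetical – a structured decision record along the following lines could help evidence that these rights were considered at the appropriate time:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical sketch only: a structured record of a moderation decision,
# capturing how freedom of expression and privacy were weighed.
# The OSA does not mandate any such format; all field names are illustrative.
@dataclass
class ModerationDecisionRecord:
    content_id: str
    action: str                 # eg "take_down", "restrict_access", "warning_label", "no_action"
    policy_rule: str            # the term of service or policy relied on
    free_expression_notes: str  # why the action is proportionate despite Article 10
    privacy_notes: str          # any Article 8 / data protection considerations
    decided_by: str
    decided_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
```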

Freedom of expression

The OSA does not explain what "freedom of expression within the law" means. The explanatory notes merely say that this includes the common law; the relevant law must be English law. Service providers will need to know what that law is in order to have regard to it.

There is a substantial body of case law in England (and from the European Court of Human Rights, which the English courts must take into account under the Human Rights Act 1998) on the nature and value of freedom of expression, including that:

  • "Article 10 [ECHR] is applicable, subject to paragraph 2, not only to 'information' or 'ideas' that are favourably received or regarded as inoffensive or as a matter of indifference, but also to those that offend, shock or disturb" (Socialist Party v Turkey (1998)).
  • "…free speech includes not only the inoffensive but the irritating, the contentious, the eccentric, the heretical, the unwelcome and the provocative provided it does not tend to provoke violence. […] From the condemnation of Socrates to the persecution of modern writers and journalists, our world has seen too many examples of state control of unofficial ideas. A central purpose of the European Convention on Human Rights has been to set close limits to any such assumed power"… "'freedom only to speak inoffensively is not worth having'" (Sedley LJ in Redmond-Bate v DPP (1999)).

Furthermore, Article 10 ECHR includes the freedom to hold opinions and to receive and impart information and ideas. Therefore, it is not only the poster's right to post their content which comes into play, but also the right of the community of users as a whole to receive that content, which forms part of this fundamental right.

Privacy

The right of privacy under Article 8 ECHR includes "correspondence" and can potentially include content posted online, depending on such things as whether the poster has a reasonable expectation of privacy.

Of course, some user-to-user communications will be deliberately public, but others might be private communications to one person or to a small group. In any event, user-generated content will also fall within the UK GDPR and Data Protection Act 2018 as it is likely to be personal data. To be able to have particular regard to privacy, service providers will therefore need to understand the relevant laws, including the tort of misuse of private information and data protection law which is expressly mentioned in the OSA.

Content of democratic importance

Category 1 service providers have additional duties to protect political speech, in particular "content of democratic importance" (CDI). CDI is content that:

  • is news publisher content (NPC) or regulated user-generated content, and
  • is, or appears to be, specifically intended to contribute to democratic political debate in the UK or a part of the UK.

NPC is either content directly generated on the service by a recognised news publisher (defined in s50), or user content reproducing or linking to a full article, written item or recording published by a recognised news publisher.
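As a minimal sketch of how the two conditions combine (assuming the hard judgement – whether content appears specifically intended to contribute to UK democratic political debate – has already been made and is supplied as an input):

```python
# Hypothetical sketch of the two-limb CDI test described above. Whether content
# "appears to be specifically intended to contribute to democratic political
# debate" is a judgement call, so it is supplied as an input, not computed.
def is_content_of_democratic_importance(
    is_news_publisher_content: bool,
    is_regulated_ugc: bool,
    appears_intended_for_uk_democratic_debate: bool,
) -> bool:
    first_limb = is_news_publisher_content or is_regulated_ugc
    return first_limb and appears_intended_for_uk_democratic_debate
```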

The key question for the second condition is whether the content is or appears to be intended to contribute to democratic political debate.

It is bound to cover debate about party politics and promoting or opposing political parties and national or local government policies. But how wide does it go? And what about more extreme views? Would a controversial remark calling for immigration from certain countries to be banned contribute to democratic political debate, or be so antithetical to democratic values that it does not even fall within the definition (notwithstanding the broad scope of the right to freedom of expression)? If the former, Category 1 services have certain duties (see below).

CDI appears to require that the debate takes place in the UK (or a part of it), but it would seem to cover debate about non-UK politics (eg US politics); the wording is, however, ambiguous.

If the second condition is satisfied, then a wide range of content will be caught since the definition of "regulated user-generated content" is broad, covering any UGC which is not explicitly exempt.

Category 1 service providers have the following duties regarding CDI:

  • to operate proportionate (taking into account size and capacity of the service provider) systems and processes to ensure that the importance of the free expression of CDI is considered when making decisions about how to treat, or whether to take down or restrict access to content or take action against users generating CDI
  • to ensure these systems and processes for decision making about content and users apply in the same way to a diversity of political opinion, and
  • to specify in their terms of service the policies and processes which take into account CDI and to ensure such terms are clear, accessible and applied consistently.

It may not be straightforward for service providers to ensure that their systems and decision making apply in the same way to a diversity of political opinion. For example, should they treat online discussions in favour of arguably xenophobic/nationalistic political parties in the same way as mainstream centrist parties who are anti-discrimination?

Service providers will probably need a list of principles to try to ensure they treat diverse parties the same way. They will also need to be careful to be consistent with what they allow or take down.
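One way to support consistency – sketched here purely for illustration, with a made-up rule set that is not drawn from the OSA or any Ofcom code – is to key decisions to behaviour-based rules and deliberately exclude political viewpoint from the decision logic:

```python
# Hypothetical sketch: decisions turn on behaviour-based flags alone, so the
# same conduct receives the same treatment whatever the political viewpoint.
# The rule set is illustrative only.
BEHAVIOUR_RULES = {
    "incites_violence": "take_down",
    "harassment": "take_down",
    "misleading_claim": "warning_label",
}

def decide_action(content_flags: dict[str, bool], political_viewpoint: str) -> str:
    # political_viewpoint is accepted (eg so it can be logged for later
    # consistency audits) but deliberately plays no part in the decision.
    for flag, action in BEHAVIOUR_RULES.items():
        if content_flags.get(flag):
            return action
    return "no_action"
```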

News publisher content

Category 1 service providers have specific duties to protect NPC. The relevant forms of action against content are taking it down, restricting users' access to it and adding warning labels (except warning labels normally encountered only by child users); other actions are permitted where the provider is acting on a relevant term of service. Providers can also take action against a person, including by warning, suspending or banning them from using a service or restricting their ability to use it.

Before taking such action in relation to NPC or against a recognised news publisher, service providers are required to comply with various notification and information obligations. If they fail to do so, they must act swiftly to make the required notifications and allow a reasonable period in which an application to reverse the action can be made. There are exemptions from the notification requirements where the service provider reasonably considers it may incur criminal or civil liability, where the NPC amounts to a relevant offence, or where the news publisher or the NPC in question has already been banned from the relevant service.
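A minimal sketch of that exemption logic (each input is a judgement the provider must make; none of this is drawn from Ofcom guidance):

```python
# Hypothetical sketch of the exemptions from the pre-action notification
# requirements described above. Each input reflects a judgement call for the
# provider; this does not reproduce the OSA's actual drafting.
def notification_exemption_applies(
    risk_of_criminal_or_civil_liability: bool,
    content_amounts_to_relevant_offence: bool,
    publisher_or_content_already_banned: bool,
) -> bool:
    return (
        risk_of_criminal_or_civil_liability
        or content_amounts_to_relevant_offence
        or publisher_or_content_already_banned
    )
```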

Journalistic content

Category 1 service providers have duties to protect journalistic content. This is defined as content which is:

  • news publisher content or regulated user-generated content
  • generated for the purposes of journalism, and
  • UK-linked (ie UK users are targeted, or the content is likely to be of interest to a significant number of UK users).
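As a minimal sketch of how the three limbs combine (with the contested limb – whether content was generated for the purposes of journalism – supplied as an input rather than computed):

```python
# Hypothetical sketch of the three-limb definition of journalistic content set
# out above. "Generated for the purposes of journalism" is the contested limb,
# so it is passed in as an input (see the discussion that follows).
def is_journalistic_content(
    is_news_publisher_content: bool,
    is_regulated_ugc: bool,
    generated_for_journalism: bool,
    uk_linked: bool,
) -> bool:
    return (
        (is_news_publisher_content or is_regulated_ugc)
        and generated_for_journalism
        and uk_linked
    )
```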

What is content generated for the purposes of journalism? Does it only mean "news-related material", a term defined as part of the definition of a "recognised news publisher"? That term covers material consisting of news, opinions or information about current affairs, and gossip about celebrities, other public figures or other persons in the news.

Journalistic content is likely to be wider than that, as journalism generally encompasses more than news, current affairs and gossip about public figures. It is not clear whether, or to what extent, it includes citizen journalism (posts by individuals who are not professional journalists making information available to the public about current events – eg when they happen to be present at an earthquake, terrorist attack or riot – or simply providing information or comment about current affairs). The UK ICO's draft code on journalism and data protection says that the more something resembles the activities traditionally carried out by the mainstream media or other clear sources of journalism, the more likely it is to be journalism. The same is likely to be true in the context of online safety. This issue may be developed in Ofcom's codes of practice and/or by the courts.

The duties to protect journalistic content include:

  • to operate proportionate systems and processes designed to ensure the importance of the free expression of journalistic content is taken into account when making decisions about how to treat content and whether to take action against users generating, uploading or sharing such content
  • duties around complaints procedures in response to a decision to take down or restrict access to content
  • to specify in the terms of service how journalistic content is identified and considered, how free expression is taken into account when making decisions, and the policies for handling complaints about this content, and
  • to ensure the terms of service are clear, accessible and applied consistently.

As with CDI, the obligation regarding journalistic content when deciding whether to take down content or take action against users is to ensure that the importance of the free expression of journalistic content is properly considered. Again, good record keeping before and at the decision-making stage may help demonstrate that the obligation has been fulfilled.

Codes of practice

Ofcom is required to produce codes of practice and guidance to assist with OSA compliance. One or more codes must be produced on all duties which apply to regulated service providers, and will therefore need to address the duties concerning freedom of expression, privacy, CDI and journalistic content. In preparing the codes, Ofcom must consult various persons who represent different interests and/or have certain expertise, including persons whom Ofcom considers have relevant expertise in equality issues and human rights, in particular the right to freedom of expression under Article 10 ECHR and the right to privacy under Article 8 ECHR.

In addition, Ofcom has a duty to produce and publish a report assessing various factors around age assurance. In doing so, it must consider a service provider's need to protect users from a breach of any statutory provision or rule of law concerning privacy.

The OSA also includes an obligation on Ofcom to state in its annual report the steps it has taken and the processes it has operated to ensure its online safety functions have been exercised in a manner compatible with Articles 8 and 10 ECHR.

Are fundamental rights and freedoms sufficiently protected?

The OSA ensures that freedom of expression and privacy and, for Category 1 services, CDI, NPC and journalistic content, are given attention. Whether this is enough to protect these fundamental rights may depend on the relevant codes of practice and ultimately on the service providers in question.

The key test will be whether these rights are sufficiently protected at the decision-making stage in relation to potential action against allegedly illegal content and the types of allegedly legal but harmful content covered by the OSA, as well as against the users who posted it.

The obligations to "have particular regard to" these rights or to ensure they are "taken into account" do not seem onerous. Weighed against the duties to protect children and adults from illegal content and specified harms, it is possible that freedom of expression and user privacy will take second place, at least in borderline cases. Service providers will need a good understanding of these fundamental rights in order to uphold them.

It is not clear why CDI, NPC and journalistic content are only given protection within Category 1 services. Perhaps it was thought that smaller platforms do not have the resources to cope with these additional duties, or the power and influence to make a big difference. However, genuine political speech and journalism should arguably be adequately protected across the board, not only on the bigger platforms. Having said that, as these types of speech are manifestations of the right to freedom of expression – to which the duties apply across all services – the distinctions may be academic.
