19 September 2022

Digital Services Act (DSA) - an overview

The EU's DSA and the UK's OSB: a comparison of their approaches to online safety

Adam Rendle looks at the differences and similarities in the approach of the EU and UK to online safety under incoming legislation.

Two parallel pieces of legislation in the UK and EU have similar aims but create almost completely separate regimes with which digital services carrying user generated content must comply.

The Digital Services Act (DSA) has "the objective to ensure a safe, predictable and trusted online environment, addressing the dissemination of illegal content online and the societal risks that the dissemination of disinformation or other content may generate". (Recital 9).

The Online Safety Bill (OSB) "delivers the government’s manifesto commitment to make the UK the safest place in the world to be online while defending free expression" by requiring platforms "to tackle and remove illegal material online, particularly material relating to terrorism and child sexual exploitation and abuse."  

These may sound very similar; however, the way the legislation attempts to achieve these stated aims is very different in practice. Here we look at some of the key overlaps and differences digital service providers will face when looking to comply with both. For more detail on the DSA, please see the remaining articles in this month's Interface. For more on the OSB, please see the set of articles we published in June's Interface.

This article is written on the basis of the OSB as it stood on 28 June 2022 (available here). At the time of writing (early September 2022), the UK parliamentary procedure had been put on hold, but the new Prime Minister confirmed in Parliament on 7 September that her government would be proceeding with the OSB. She also noted that there are some issues the government needs to deal with: while ensuring that under-18s are protected from harm, she wanted to make sure that free speech is "allowed". She said that some "tweaks" to the current drafting may therefore be required.

Services in scope

The definitions of services in scope are very broad and apply generally to all services which are involved in the sharing and dissemination of user generated content. While the definitions are drafted very differently, in practice they will capture many of the same services which are currently reliant on the eCommerce Directive's hosting immunity (especially those defined as "online platforms" in the DSA), and the different sets of obligations will apply in parallel to them. Where the definitions diverge is in the categorisation of the larger/higher risk services to which enhanced obligations apply: it is quite possible that a very large online platform (VLOP) under the DSA won't be a Category 1 service under the OSB.

The DSA is built on the foundations of the well-established classes of intermediary services known from the eCommerce Directive: mere conduits, caches and hosts. There is then a subset of host services, online platforms, which involve the dissemination of content to the public, and a further subset of online platforms, the very large online platforms or VLOPs, which have at least 45 million monthly active users and are designated as such by the Commission. Search engines are also covered. The safety obligations are based on mechanisms which derive from the well-known notice and take down approach introduced by the eCommerce Directive. Indeed, the intermediary safe harbour protections are repeated and recast at the start of the DSA, as is the prohibition on general monitoring. See here for more.

The OSB introduces and applies to new categories of regulated services: those which enable users to encounter UGC generated directly on, uploaded to or shared on the service by other users, and search engines. There are then enhanced obligations on larger and potentially more harmful services (Category 1 and Category 2A and 2B services), which will be determined by criteria to be published at a later date. There are some detailed definitions of exempt services, eg services with limited functionalities such as those whose only user content is below-the-line comments on the provider's own content. Those sorts of exemptions aren't replicated in the DSA. The existing safe harbour regime in the UK (derived from EU law) isn't referenced in the OSB.

In both cases there must be a link to the UK or EU (as applicable), and the legislation will apply to services even if the providers are not established in those countries.

Content in scope

The DSA uses a very broad concept of illegal content, essentially covering anything which is illegal offline and treating all illegal content equally. Information about the sale of illegal goods and services is also covered. The OSB takes a multi-layered approach to illegal content and keeps some content out of scope (ie intellectual property, consumer protection and the sale of goods and services), applying different obligations to different content depending on whether it relates to terrorism, child sexual exploitation and abuse, or other specified priority illegal content. There is also a catch-all category of illegal content where individuals are the victims of the offence. The OSB also has categories of content which are lawful but harmful to adults or children, which will be within the duties of, respectively, Category 1 services and services likely to be accessed by children. What is within those categories will, in part, be determined by secondary legislation.

Different approaches to safety obligations

This is where the differences between the regimes are most stark. The OSB expects much more of services by imposing overriding safety duties on them, whereas the DSA's obligations are narrower and more specific.

Under the OSB, all services have to carry out illegal content risk assessments and then have an overriding duty of care to mitigate and manage the risks identified. Risk assessments are likely to be burdensome and will require services to understand the possible incidence of a large number of priority offences. Services likely to be accessed by children and Category 1 services also have to carry out additional assessments.

Services will then need to take proportionate measures to mitigate and manage the risks identified, including by having in place systems and processes designed to:

  • prevent all users from encountering priority illegal content and remove access to all illegal content once made aware of it
  • prevent or protect children from encountering harmful content, and
  • for Category 1 services, allow adult users to have more control over the content they see. 

These steps may entail putting in place or adjusting measures in relation to proactive and reactive content moderation, recommendations, age gating, and both user-facing and internal policies (including upskilling moderation teams on the broad illegal and harmful content definitions in the OSB).

In order to fulfil the OSB's safety duties, services will need to take measures (if proportionate) in areas including regulatory compliance and risk management arrangements, the design of functionalities, algorithms and features, policies on terms of use and user access, content moderation, functionalities allowing users to control the content they encounter, user support measures, and staff policies and practices.

As a parallel to these risk mitigation measures, the DSA requires only VLOPs to produce an annual assessment of the systemic risks stemming from the design, functioning and use of their services, have measures in place to mitigate those systemic risks, be subject to independent annual audits, and establish an independent compliance function.

For all other services in scope of the DSA, the safety obligations are more limited than in the OSB, and centre (for hosting services) on the removal of illegal content on notification (see here for more). Building on the process inherent in and implied by the eCommerce Directive's immunity for hosts, hosting services are subject to a granular set of requirements about the "notice and action" mechanisms they now need to put in place, which are similar to the well-known DMCA requirements. Online platforms need to provide an enhanced service under these mechanisms to "trusted flaggers" (bodies certified as having particular expertise and competence in identifying and notifying illegal content). They will also need to suspend their services to users who frequently provide manifestly illegal content.

Other measures in the DSA go more towards transparency and user empowerment than safety; the DSA is envisaged to be a consumer protection measure, as well as enhancing safety. For example, services have to provide a point of contact for regulators and users and need to publish annual reports on their content moderation. Online platforms and VLOPs have to publish transparency reports every six months, reporting on topics such as number of suspended accounts and removed content. They also have 'know your business customer' obligations in relation to traders and are required to design and organise their interfaces to enable traders to comply with their consumer protection obligations (see here for more).

The OSB also requires the larger services to produce annual transparency reports, covering information specified by Ofcom and contained in a long list in the OSB, including the incidence and dissemination of illegal content, reporting systems, and measures taken to comply with safety duties.

In addition, the OSB introduces user empowerment tools for Category 1 services, where proportionate to do so, giving adult users more control over legal but harmful content and the ability to filter out non-verified users. That would give adults the ability to reduce the likelihood of encountering that content or to be alerted to the harmful nature of the content they may encounter.

Notwithstanding these differences, there is one key similarity between the two approaches. Neither of them regulates specific pieces of content. So, for example, regulators will not be able to challenge decisions or failures to act around a piece of content. Instead, the regulatory focus is on systems, measures, policies and practices.

Compliance similarities

Both the DSA and OSB require services to review their policies, procedures and user terms. However, the changes needed will be very different in practice.

For example, the OSB requires greater detail in terms and conditions about how particular types of content will be dealt with, how individuals are to be protected from that content, and reporting and takedown processes. Providers also need to include information about any proactive technology used to comply with their safety duties. Category 1 services need to summarise the findings of their most recent adults' risk assessment and set out how priority content that is harmful to adults is to be treated. Services likely to be accessed by children also have various disclosure obligations in their terms.

Likewise, the DSA requires service providers to include information about content restrictions in their terms, including information on steps taken on content moderation (including algorithmic decision-making) and their complaint handling system. Terms have to be clear, plain, intelligible, user-friendly and unambiguous.  For services primarily directed at minors, conditions for and restrictions on use of the service have to be explained in a way that minors can understand. VLOPs have to provide concise summaries of their terms in clear and unambiguous language, and publish their terms in the official languages of all Member States in which they offer their services.

Both require reporting/notice mechanisms for content in scope, eg allowing users to report illegal content.

For the first time, services will be required to operate complaints procedures (albeit only online platforms under the DSA) through which users affected by take down/moderation decisions can challenge them. This will lead services to develop their own dispute resolution procedures (or rely on third party providers). Indeed, under the DSA, users will be able to select a certified out-of-court dispute settlement body to resolve content moderation and service suspension disputes.

The OSB frames part of this redress as a matter of contract: services need to inform users of their right to bring a breach of contract claim if their content is taken down, or access to it restricted, in breach of the terms of service. The OSB also makes services subject to user or affected party complaints that they are not complying with, for example, their safety or freedom of expression duties.

Advertising

Advertising is covered in different ways by the OSB and DSA.

Under the OSB, Category 1 services will need to put in place proportionate systems and processes to prevent individuals from encountering fraudulent adverts on the service, minimise the length of time for which such adverts are present, and swiftly take them down when aware of or alerted to them.

Under the DSA, the focus is more on advertising transparency: online platforms need to identify for each specific advert, in a clear, concise and unambiguous manner and in real time, that the ad is an ad (including through prominent markings), the advertiser (or, if different, the person who paid for it), and meaningful information about the main parameters used to determine to whom the ad is presented and how to change those parameters. VLOPs have to make available a repository of this, and other, information for a year after the ad was last presented on their interfaces (see here for more).

Freedom of speech

The DSA requires services to mitigate how their content moderation systems impact free speech. When applying restrictions on use in their terms, services have to pay due regard to the rights and legitimate interests of all parties involved, including the fundamental rights of users, like freedom of expression, freedom and pluralism of the media, and other Charter fundamental rights and freedoms. The OSB is similar, in requiring services to have regard to the importance of protecting users' right to freedom of expression when deciding on, and implementing, safety measures and policies.

Under the OSB, Category 1 services have to carry out and publish assessments of the impact on freedom of expression of their safety measures and policies, and specify in a public statement the positive steps taken in response to their impact assessments to protect users' rights of freedom of expression. They also have additional duties to protect political speech, in particular, "content of democratic importance" and journalistic content.  Under the DSA, VLOPs have to consider actual or foreseeable negative effects of the exercise of fundamental rights (including freedom of expression and the freedom and pluralism of the media) as part of their annual systemic risk assessments.

Areas with no/few overlaps

The DSA:

  • has intellectual property infringement within its scope, which is expressly excluded from the OSB
  • deals only with illegal content.  Conversely, the OSB requires certain services to take action in relation to lawful but harmful content
  • requires service providers to appoint a contact for communication with authorities and a point of contact for users of the services. There is no such requirement in the OSB
  • requires services to provide a statement of reasons why they took action in relation to content or the user's activities, including removing content or suspending use of the service. There are no equivalent provisions in the OSB
  • introduces prohibitions on "dark patterns" applicable to providers of online platforms. They are prohibited from designing, operating or organising their interfaces "in a way that deceives, manipulates or otherwise materially distorts or impairs the ability of recipients of their service to make free and informed decisions." The UK government is, separately from the OSB, considering legislating on dark patterns, in a way which would apply to all digital services and not just those in scope of the OSB, although it held back from including this in incoming consumer protection reforms
  • requires disclosure by online platforms of the parameters used in their recommender systems and, for VLOPs, the ability for users to turn off systems based on profiling. Recommender systems are not specifically regulated in the OSB
  • imposes crisis response obligations on VLOPs, should the functioning and use of their services significantly contribute to extraordinary circumstances leading to a serious threat to public security or public health in the EU or a significant part of it.  

Navigating the framework

Many UGC services are likely to be within the scope of both the OSB and the DSA and they will have a complex task ahead of them to understand which obligations apply and then comply in a complementary way.  This is made all the more complicated by the fact that the legislation is not coming in at the same time, and because much of the detail of the OSB will be set out in secondary legislation, guidance and codes of practice.  Service providers within the scope of both sets of rules will need to look at their compliance holistically rather than treating the obligations separately, despite the many differences in approach.
