25 April 2024
The UK's Online Safety Act (OSA) was passed on 26 October 2023. Much of the detail around compliance will only become fully clear once Ofcom guidance and codes of practice are finalised.
Under the OSA, additional obligations apply to Category 1, Category 2A and Category 2B services, which are defined as follows:
Category 2A – search services meeting threshold conditions relating to the number of users and any other relevant factors.
Services in scope of the OSA must wait not only for the relevant codes of practice and guidance to understand the nature of their duties and how to comply with them, but also for secondary legislation setting the threshold conditions, which will determine whether or not they are classified as a categorised service and therefore subject to additional requirements.
Ofcom launched phase three of its online safety regulation plan on 25 March 2024. This covers additional duties for categorised services including transparency reporting, user empowerment, fraudulent advertising and user rights.
Phase three will follow a three-step process: first, a call for evidence; then a consultation, planned for early 2025; and finally, following consultations, publication of final codes and guidance.
Phase three began with a call for evidence to inform the planned early 2025 consultation. The call seeks input from stakeholders on:
duties relating to access to information about a deceased child's use of a service.
The call closes on 20 May 2024.
Ofcom has also published its advice to the government on the thresholds which will determine categorisation as a category 1, 2A or 2B service as follows.
Category 1: should apply to services which meet either of the following conditions:
Condition 2 – allows users to forward or reshare user-generated content; and uses a content recommender system; and has more than 7 million UK users on the user-to-user part of its service, representing around 10% of the UK population.
Category 2A: should apply to services which meet both of the following criteria:
has more than 7 million UK users on the search engine part of its service, representing around 10% of the UK population.
Category 2B: should apply to services which meet both of the following criteria:
has more than 3 million UK users on the user-to-user part of its service, representing around 5% of the UK population.
The Secretary of State must now set out threshold conditions in secondary legislation, taking Ofcom's advice into account. Once the legislation is passed, Ofcom will gather information as needed from regulated services and produce a published register of categorised services.
It is not certain that Ofcom's threshold recommendations will be accepted without amendment but the recommendations are helpful indicators for organisations trying to understand where they are likely to fit into the online safety regime.
Ofcom published information on its powers to gather information relating to the death of a child under s101 OSA on 27 March 2024. This section came into force on 1 April 2024. It gives Ofcom powers to request information from the services set out in s100(5)(a-e) of the OSA to support a coroner's or procurator fiscal's investigation into the death of a child. Ofcom can request:
Content generated, uploaded or shared by the child.
The news that WhatsApp has changed its terms and conditions to allow children aged 13-15 to use its services prompted renewed debate in the media about access by children to social media, messaging services and even to smartphones.
Reports suggest that the government plans to try to persuade some of the biggest social media businesses to voluntarily alert parents when their children access unsuitable content, and may also be considering mandating greater parental controls or even banning under-16s from social media.
The government may be watching recent developments in Florida, whose ban on social media for children under 14 is scheduled to come into effect on 1 July 2024. A new Bill bans under-14s from social media and requires minors aged 14 and 15 to get explicit parental consent to create a user account. Existing accounts which become unlawful on 1 July will need to be closed, and their data deleted by the relevant companies unless retention is required by law.
The Florida Bill also states that companies meeting set criteria which intentionally publish or distribute content harmful to minors must use age verification to block access to that content by under-18s. The legal framework is, of course, very different in the USA, and the Bill (an outlier in the US) is expected to face legal challenges on the basis that it breaches the First Amendment right to free speech. If it survives those challenges, it will be instructive to see whether it is enforceable in practice.
If the UK government were to proceed with restricting children's access to social media, there would be an additional set of online safety (not to mention data protection) considerations for in-scope businesses and, no doubt, still more for regulators like Ofcom and the ICO to do in this area. The ICO has set out its priorities for protecting children online during 2024-25 which, unsurprisingly, include working closely with Ofcom in its capacity as regulator of the Online Safety Act. It seems unlikely, however, that the OSA will fully resolve the online safety debate.
By Debbie Heywood and Louise Popple