
Online Safety Act: is the UK a safer place to go online?

The UK’s landmark Online Safety Act received Royal Assent on 23 October 2023 following a lengthy and controversial legislative journey. The reach and content of the Act were heavily scrutinised by government, committees, charities, campaigners, coroners and families, including the family of Molly Russell, a teenager who took her own life after viewing online content and whose inquest marked a turning point in the Act’s journey. Charlotte McClelland, Associate, and Danny McShee, Partner, at Kennedys tell us more.

The Online Safety Act aims to facilitate the UK government’s ambition to make the UK “the safest place in the world to go online, and the best place to grow and start a digital business”. Businesses that provide online services will now have a legal obligation to identify, mitigate and manage the risks of harm from illegal content and activity, particularly that which is harmful to children.

The Act has similar aims to the EU’s Digital Services Act (DSA), which seeks to reduce harms, combat risks online and protect users’ fundamental rights. The DSA came into force on 16 November 2022 and will apply in EU Member States from 17 February 2024.

Services in scope
The Act applies to UK businesses that provide the following regulated services:

  • A ‘user-to-user service’ – a service through which content is generated, uploaded and shared by service users. Examples include online social media platforms and online forums.
  • A ‘search service’ – a service that includes a web search engine. As search functionality of this kind is a common feature of digital platforms, many services are likely to fall within the scope of the Act.


Service providers that publish or display pornographic content will also be covered. By contrast, platforms and websites whose only search functionality is an internal site search fall outside the scope of the Act.

The Act will also apply to non-UK service providers if the service has a significant number of UK users, or if UK users form the target market, or one of the target markets, for the service.

Certain regulated service providers will be categorised as Category 1, 2A or 2B. The most onerous obligations, including those relating to transparency, accountability and freedom of expression, are to be placed on providers of Category 1 services.

Category 1 providers will no longer be required to assess risks from, and set terms of service in relation to, legal but harmful content accessed by adults. Instead, they will be required to set clear terms of service on the restriction or removal of user-generated content, and on the suspension or banning of users on grounds related to user-generated content. They will also be required to provide optional user empowerment tools that give users greater control over the content they see.

The Act’s long journey through Parliament has meant that many businesses have already started preparing to regulate content on their platforms proactively. This includes protecting users from exposure to harmful material by: a) removing all illegal content; b) removing content banned by the service provider’s terms and conditions; and c) enabling users to tailor the type of content they see, thereby avoiding potentially harmful content.

Looking ahead, in-scope providers should also put in place additional measures to protect children from harmful or inappropriate content, such as bullying, pornography and the promotion of self-harm, and should tackle legal but harmful material, such as content promoting eating disorders.

Ofcom's role
Ofcom has come under pressure, including from Molly Russell’s father, to be “bold” and “fast” in taking immediate action under its new powers.

The regulator is expected to take a phased approach to enforcement. On 9 November 2023, it opened the first of four consultations that will inform the development of future codes of practice and guidance for in-scope companies. The Act’s new rules will come into force once the codes and guidance are approved by Parliament.

The codes of practice will set out measures that regulated services can take to mitigate the risk of illegal harm. The consultation also includes:

  • A register of risks relating to illegal content, and risk profiles of service characteristics that Ofcom’s assessment has found to be associated with heightened risk of harm.
  • Draft guidance on how service providers can conduct their own risk assessments and on how they can identify illegal content.
  • Draft guidance on record keeping.
  • Ofcom’s draft enforcement guidelines.

Once Ofcom’s increased responsibilities and enforcement powers take effect, it will be able to levy fines of up to £18 million or 10% of global annual revenue, whichever is greater. Senior employees of in-scope services could also be held personally liable for non-compliance with the legislation and face criminal charges carrying custodial sentences.

Comment
Many service providers have already strengthened their policies in an attempt to pre-empt the legislative framework under the Act and to avoid hefty fines and custodial sentences for senior members of their businesses. Given the controversy and high-profile nature of the Act’s journey through Parliament, implementation and enforcement of the new provisions are unlikely to be smooth sailing.