Ofcom’s protection of children consultation: a quick guide.

Ofcom, the UK’s new regulator for online safety, have released their second major consultation as part of their implementation of the Online Safety Act. Where the first major consultation focused on illegal content, this one focuses on helping services understand how to comply with their duties to protect children from harmful content. The two consultations differ in focus, but they have one thing in common: both are vast. This one runs to hundreds of pages, covering the causes and impacts of online harms to children, how to carry out children’s risk assessments, and much more.

That’s a lot of reading. Where do I start?

In brief, there are four steps.

Firstly, services need to complete a “child access assessment” to determine whether children are likely to access their service, or a part of it. A service can only conclude that children are not likely to access it if it uses highly effective age assurance to prevent them from doing so.
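
To make that concrete, here is a minimal sketch of an access gate built on age assurance. The `check_age_assurance` function and its fields are hypothetical stand-ins for whichever highly effective age assurance method a service actually adopts (for example, a third-party verification provider); Ofcom do not prescribe a particular API.

```python
from dataclasses import dataclass

@dataclass
class AgeCheckResult:
    verified: bool     # did the age assurance step complete successfully?
    is_over_18: bool   # did it indicate the user is an adult?

def check_age_assurance(session: dict) -> AgeCheckResult:
    # Hypothetical stand-in for a highly effective age assurance method,
    # e.g. a call out to a third-party verification provider.
    return AgeCheckResult(
        verified=session.get("age_check_completed", False),
        is_over_18=session.get("age_check_over_18", False),
    )

def allow_access(session: dict) -> bool:
    """Only admit users whom age assurance has positively confirmed as adults;
    everyone else is treated as a potential child."""
    result = check_age_assurance(session)
    return result.verified and result.is_over_18
```

The important point is the default: anyone who has not been positively confirmed as an adult is treated as a potential child.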

Secondly, if children are likely to access a service, the service provider needs to complete a “children’s risk assessment.” To do this, services need to understand the kinds of content that may be harmful to children, and assess the risk of harm to children given the nature of their service. Services need to report the results of their children’s risk assessment to senior governance bodies, and review the assessment at least every 12 months. Some services also need to publish the results of their risk assessments in their terms of service.
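
As a rough illustration of the record-keeping this implies, the sketch below models a risk assessment as a simple data structure with a built-in review date. The field names are ours, chosen for illustration; Ofcom’s guidance sets out what an assessment actually needs to cover.

```python
from dataclasses import dataclass, field
from datetime import date, timedelta

@dataclass
class ChildrensRiskAssessment:
    """Illustrative record of a children's risk assessment (field names are ours)."""
    service_name: str
    content_risks: dict   # e.g. {"primary priority content": "medium"}
    mitigations: list     # safety measures adopted in response
    reported_to: str      # senior governance body that received the results
    completed_on: date
    next_review_due: date = field(init=False)

    def __post_init__(self):
        # Ofcom propose reviewing the assessment at least every 12 months.
        self.next_review_due = self.completed_on + timedelta(days=365)

assessment = ChildrensRiskAssessment(
    service_name="ExampleApp",
    content_risks={"primary priority content": "medium"},
    mitigations=["highly effective age assurance", "safer recommender settings"],
    reported_to="Board safety committee",
    completed_on=date(2025, 6, 1),
)
print(assessment.next_review_due)  # 2026-06-01
```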

Thirdly, services need to implement appropriate safety measures. These measures fall into three broad areas, which we expand on below. In brief, they are:

  • Robust governance and accountability;

  • Safer platform design choices; and

  • Providing children with information, tools and support.

Fourthly, services need to keep their risk assessments and safety measures under review, revisiting them at least every 12 months.

The proposed regulatory framework for protecting children online follows a basic pattern that is recognisable across digital regulation: understand the causes of harm, assess risk, implement measures to mitigate these risks, and regularly review the effectiveness of these measures.

Do even small services need to comply?

Yes. Ofcom will take a proportionate approach to regulation, and specific measures are recommended according to certain criteria, such as the size of a service, its functionalities, and the outcome of its risk assessment.

However, this does not mean smaller services do not need to comply with regulation. Ofcom state explicitly that even the smallest services will need to take action as a result of their proposals, and that cost or inconvenience is not an excuse to neglect the protection of children.

Okay. So, what does this look like in practice?

There is no silver bullet or one-size-fits-all technology that a service can use to protect children online. Ofcom highlight that safety measures need to work together to create a safer online environment for children. Below, we provide some detail on each of the three broad areas of safety measures proposed by Ofcom, and highlight a few specific measures for each.

It’s important to understand that Ofcom propose many more measures than we have space to include in this article. If you would like to have a discussion about the broad range of measures, or understand a specific measure in more detail, reach out to illuminate tech and we’d be happy to arrange a call.

Robust governance and accountability.

Service providers need to implement internal structures and processes to ensure that:

  1. there is adequate oversight of decision-making processes;

  2. clear roles and responsibilities exist within the organisation; and

  3. effective reporting and review mechanisms are in place.

For example, all services should name a person accountable to the most senior governance body for compliance with children’s safety duties. Large services and those that pose multiple risks to children should also have internal monitoring and assurance processes to provide independent assurance that the measures they implement are effective.

Safer platform design choices.

Broadly, these measures include:

  1. implementing age assurance;

  2. making recommender systems (algorithms) safer; and

  3. making sure content moderation systems are effective.

For example, services that don’t prohibit the most harmful kinds of content (known as “primary priority content”) need to prevent children from encountering such content by implementing highly effective age assurance. Services that use algorithms to recommend content to users (“recommender systems”) need to ensure these systems do not show harmful content to children.
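
As a sketch of how that might be enforced at the recommender stage, the function below filters candidate items before ranking for any user who has not been confirmed as an adult. The `harm_label` field is our assumption: in practice a label like this would come from whatever content classification the service already runs.

```python
def filter_candidates(candidates: list, user_is_child: bool) -> list:
    """Drop content labelled as harmful to children before the ranking step.
    Each candidate is a dict with a 'harm_label' key set upstream (our assumption)."""
    if not user_is_child:
        return candidates
    blocked = {"primary_priority", "priority"}  # kinds of content harmful to children
    return [item for item in candidates if item.get("harm_label") not in blocked]
```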

The codes do not, at this stage, expect services to implement automated content moderation systems. However, all user-to-user services must have some form of content moderation in place that is designed to “swiftly take action against content harmful to children.” Large services, or those that pose multiple risks for content that is harmful to children, need to set and record internal content policies.
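
To illustrate what “swiftly take action” might mean operationally, here is a minimal sketch of a moderation queue that surfaces reports of content harmful to children ahead of everything else. The priority labels and ordering are our own assumptions, not something the draft codes specify.

```python
import heapq

# Illustrative priority ordering: lower number = handled sooner.
PRIORITY = {"primary_priority_content": 0, "priority_content": 1, "other": 2}

class ModerationQueue:
    """Toy moderation queue that actions reports of content harmful to children first."""

    def __init__(self):
        self._heap = []
        self._counter = 0  # tie-breaker so equal-priority reports keep arrival order

    def add_report(self, report_id: str, harm_type: str) -> None:
        priority = PRIORITY.get(harm_type, PRIORITY["other"])
        heapq.heappush(self._heap, (priority, self._counter, report_id))
        self._counter += 1

    def next_report(self):
        """Return the next report to review, or None if the queue is empty."""
        return heapq.heappop(self._heap)[2] if self._heap else None
```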

Providing children with information, tools and support.

Broadly, these measures include:

  1. having clear terms of service in place;

  2. making sure children can easily report content and make complaints; and

  3. providing children with tools and support to stay safe.

For example, terms of service should be clear and accessible to children. Complaints systems should be easy to access and use, acknowledge receipt of a complaint, and give the child an indicative timeframe for a response. Finally, children should be able to disable comments on their own posts, and certain services should signpost children to support at key points in the user journey.
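
As a small sketch of the acknowledgement step, the function below builds a confirmation message with an indicative response date. The five-day window is purely a placeholder we have chosen for illustration; a real service would set it in line with its own complaints policy.

```python
from datetime import datetime, timedelta

def acknowledge_complaint(complaint_id: str, received_at: datetime) -> dict:
    """Build an acknowledgement for a child's complaint with an indicative
    response date. The 5-day window is a placeholder for illustration only."""
    respond_by = received_at + timedelta(days=5)
    return {
        "complaint_id": complaint_id,
        "status": "received",
        "message": (
            "Thanks, we've received your report. "
            f"We aim to respond by {respond_by.date().isoformat()}."
        ),
    }
```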

What’s the next step?

This is a consultation, rather than a final set of proposals. Anyone can respond via Ofcom's website until 17th July. If you have views on keeping children safe online, safeguarding user rights, or the impact of regulation, responding to Ofcom’s consultation is a significant opportunity to play a role in shaping the UK’s digital regulatory landscape. If you’d like support in delivering a response, get in touch with illuminate tech today.

In terms of complying with online safety duties, all services within scope of the Online Safety Act will need to complete an illegal harms risk assessment towards the end of 2024, and will need to comply with all illegal content safety duties from the beginning of 2025, according to current timelines. Services will then need to complete the children’s access assessment in Q2 of 2025, before, if relevant, completing a children’s risk assessment and complying with the duties regarding the protection of children.

The UK’s new online regulatory framework may feel complex, but with research showing that platforms that feel safe will reap financial rewards, it also presents a huge opportunity. Online services that lean into cultivating safer, more trusted platforms won’t just be doing the right thing: they will help distinguish themselves from the competition, build a loyal user base, and develop a positive brand image.

Illuminate tech are uniquely placed to help you unlock these benefits. We combine first-hand experience of the UK’s digital regulatory landscape with deep technical expertise to help your organisation make the most of regulation. Get in touch today to find out more.
