Age checks to protect children online

Latest News, Legislation, Technology

Philip Young — February 21, 2025

Ofcom has announced new guidance on age assurance for messaging apps and search engines. The move is the latest step in the communications watchdog’s response to the Online Safety Act 2023, which aims to create a safer online environment in the UK, particularly for children.

The new guidance explains how social media platforms and search engines must use “highly effective” age assurance to provide better protection for children. These providers are described in Part 3 of the Act. Similar advice – which we explain separately – has also been issued for providers who display pornographic content (described under Part 5 of the Act).

Highly effective age assurance

Messaging apps and search engines are required to carry out a children’s access assessment by 16 April 2025 to establish whether their service is likely to be accessed by children. Providers will need to comply if their services are used by people in the UK, regardless of where the provider is based.

Ofcom has said: “we anticipate that most of these services will need to conclude that they are likely to be accessed by children within the meaning of the Act.”

Providers whose services are likely to be accessed by children will need to adopt an age assurance ‘method’ that Ofcom considers reliable. A method is the technology that underpins the process of determining whether or not a user is a child.

Ofcom has made clear that it won’t allow online platforms to pay lip service to the guidelines. It’s not enough simply to adopt the tech. Providers need to clear two important hurdles. Firstly, they must show that their overall process is highly effective. Secondly, since any such process will likely involve personal data, providers must operate within the UK’s data protection laws.

Performance criteria

In meeting the guidelines, providers can either develop their own in-house age assurance or buy in third-party technology. Platforms hoping to avoid the cost of starting from scratch can integrate the services they need from an age assurance specialist, such as Luciditi, allowing them to easily embed AI-powered age assurance into their website.

Alternatively, providers may choose to rely on wider system-level age assurance methods. These may eventually be built into devices, operating systems, app stores, or web browsers.

Ofcom says that “regardless of where the age assurance occurs in the ecosystem”, whether developed in-house or by a third party, providers are responsible for ensuring that their age assurance performs to the standard required by Ofcom.

To be sure that their age assurance method is highly effective, providers must ensure it meets four criteria:

• It is technically accurate – evaluated against appropriate metrics.
• It is robust – can correctly determine the age of a user in a range of real-world settings.
• It is reliable – is shown to be consistent, particularly when involving AI or machine learning.
• It is fair – minimises bias and discriminatory outcomes.

Age assurance from a third-party supplier certified against the UK Digital Identity and Attributes Trust Framework (such as Luciditi) won’t automatically be considered compliant. But certification may help to show that a provider is working to meet the four criteria and that its tech is highly effective.

Approved methods of age assurance

The guidelines offer a non-exhaustive list of options that Ofcom considers capable of providing effective age assurance. These include:

Open banking

A user could allow their bank records to be used for assurance. The bank does not reveal the user’s date of birth or any other personal information; it simply tells the provider that the user is aged 18 or over.
 
Photo-identification matching

A provider can use tech that reads an image from an uploaded photo-ID document and then compares this to a selfie of the user to verify that they are the same person.
 
Facial age estimation

AI-powered tech can analyse the features of a user’s face to estimate their age. While highly accurate at distinguishing adults from children, facial age estimation (FAE) often triggers a ‘step-up’ requirement for young adults. This occurs because a buffer is applied to the estimate: if the required age is, say, 18 with a buffer of five years, the user would need to be estimated as at least 23 to pass the check on FAE alone. Younger adults who fall within the buffer must verify their age another way (see digital identity services below).
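The buffer logic described above can be sketched in a few lines. This is purely illustrative: the threshold and buffer values are assumptions used as an example, not figures mandated by Ofcom, and `fae_decision` is a hypothetical function name.

```python
# Sketch of the FAE step-up rule described above (illustrative only;
# the required age and buffer are example values, not Ofcom figures).

REQUIRED_AGE = 18   # minimum age for the service
BUFFER_YEARS = 5    # safety margin applied to the AI's age estimate

def fae_decision(estimated_age: float) -> str:
    """Decide the outcome of a facial age estimation (FAE) check.

    Returns "pass" when the estimate clears the buffered threshold,
    and "step-up" when the user must verify their age another way
    (e.g. photo-ID matching or a digital identity service).
    """
    if estimated_age >= REQUIRED_AGE + BUFFER_YEARS:
        return "pass"
    return "step-up"

print(fae_decision(30.2))  # well above the buffered threshold -> "pass"
print(fae_decision(21.0))  # an adult, but within the buffer -> "step-up"
```

Note that a user estimated at 21 is almost certainly an adult, yet still fails the FAE-only check; the buffer deliberately trades convenience for confidence that no under-18s slip through.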
 
Mobile-network operator (MNO) age checks

Each of the UK’s MNOs has agreed to apply a content restriction filter (CRF) by default, preventing children from accessing age-restricted websites via pay-as-you-go and contract SIMs. Age checks rely on looking for the CRF on a user’s account. Because users can only remove the CRF by proving to their MNO that they are an adult, the absence of the CRF allows the MNO to assure providers that the account holder is over 18.
 
Credit card checks

Credit card issuers must verify that applicants are 18 or over. Providers relying on credit card age checks ask users to enter their card number and a payment processor then checks that the card is valid. Approval by the issuing bank can be taken as evidence that the user is over 18.
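The flow above can be sketched as follows. Everything here is a hypothetical stand-in: `credit_card_age_check` and the `authorise` callback are invented for illustration, not part of any real payment processor’s API, and a production integration would use the processor’s own SDK.

```python
# Illustrative sketch of a credit-card age check (hypothetical API).
# UK credit card issuers must verify applicants are 18+, so a
# successful authorisation is taken as evidence of adulthood.

def credit_card_age_check(card_number: str, authorise) -> bool:
    """Return True if the card authorises, treating that as an 18+ signal.

    `authorise` stands in for a payment-processor call that returns
    True when the issuing bank approves the card as valid.
    """
    # Reject inputs that cannot be a card number before calling out.
    if not card_number.isdigit() or len(card_number) not in (15, 16):
        return False
    return authorise(card_number)

# Example with a stubbed processor that approves any well-formed card:
print(credit_card_age_check("4111111111111111", lambda n: True))  # True
print(credit_card_age_check("not-a-card", lambda n: True))        # False
```

The key design point is that the provider never learns the user’s age or identity; it learns only that a credit card (necessarily held by an adult) authorised successfully. Debit cards cannot be used this way, as the unapproved-methods list below notes.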
 
Email-based age estimation

This is another estimation method, this time analysing the online services where the user’s email address has been used. An email address associated with services typically used by adults, such as mortgage lenders, indicates the user is likely to be over 18.
 
Digital identity services

Users can confirm their identity via an app such as Luciditi. Verification of that identity is held securely in their digital wallet. Typically, the user’s full digital identity is not shared with a provider; the app simply uses the data to confirm that the user is 18 or over.
 

Unapproved methods

Ofcom warns that some methods will fall short of the guidelines if used without any additional form of age assurance. These include:

• asking a user to enter their date of birth without any further evidence to confirm it.
• asking a user to tick a box to confirm that they are 18 years of age or over.
• relying on payment methods which do not require a user to be 18, for example debit cards.
• relying on a clause in the terms of service that prohibits children from using the service.
• general disclaimers asserting that all users should be 18 years of age or over.
• warnings that the content is only suitable for over 18s.

Technology may also fail to measure up to Ofcom’s expectations if it falls short on two further considerations: it must be accessible (easy for all users to operate) and interoperable (able to exchange information with other tech systems).

Protecting privacy

Ofcom notes that all age assurance methods use personal data and are therefore subject to the UK’s data protection regime. The guidance suggests that data protection should be designed into a provider’s chosen method from the outset.

Ofcom also urges providers to familiarise themselves with relevant laws on data protection and privacy, including:

  1. Data Protection Act 2018
  2. Privacy and Electronic Communications Regulations (PECR) 2003. The PECR will apply to anyone who stores information on or gains access to information on a user’s device, for example, by using cookies or other similar technologies
  3. UK GDPR – under which data protection principles include:
    o Lawfulness, fairness and transparency
    o Purpose limitation
    o Data minimisation
    o Accuracy
    o Storage limitation
    o Security
    o Accountability

 
Further details on data protection are available from the Information Commissioner’s Office (ICO). In particular, the ICO’s Children’s code is a statutory code of practice which sets out 15 standards that internet services have to follow if they are likely to be accessed by children.

To comply with Ofcom’s Protection of Children Codes of Practice (due to come into effect in July 2025), providers are required to keep records of the steps they’ve taken to protect children. Records concerning privacy will help to demonstrate a provider’s commitment to data protection. Providers failing to measure up to expectations on data protection may be referred to the ICO.

At first glance, Ofcom’s update may seem like a benign set of guidelines, but it will be enforced under the Online Safety Act. Melanie Dawes, Ofcom’s Chief Executive, said: “We’ll be monitoring the response from industry closely. Those companies that fail to meet these new requirements can expect to face enforcement action.” Ofcom describes its approach as “flexible, tech-neutral and future-proof”. The guidelines have the potential to create a safer life online for people in the UK, especially children.

Want to know more?

Luciditi technology can help meet some of the challenges presented by the Online Safety Act. If you would like to know more, contact us for a chat today.
