
Adult content sites given summer deadline on age checks

Online platforms that show adult content to UK users must introduce age checks by July 2025, under tough new guidelines from Ofcom. The measures explain in practice how websites and apps will have to comply with the Online Safety Act (2023) to stop children finding online pornography.

Children are exposed to adult content from an early age. Research suggests that, among those who have seen it, the average age they first encounter it is 13 – although more than a quarter have seen it by age 11 (27%), and one in 10 as young as nine (10%).

Creating a safer online environment

According to Ofcom, the vast majority of adults (80%) are broadly supportive of age assurance measures to stop children seeing pornography. Age checks will be enforced under the Online Safety Act (OSA), which aims to make the internet a safer environment for UK users, particularly children.

Providers of adult content who publish or produce pornography are covered under Part 5 of the OSA. Under Ofcom’s new guidelines, they must implement highly effective age assurance and restrict access to their sites as soon as possible, and by July at the latest.

Services covered by the Part 5 guidelines will principally be platforms that publish pornographic content under their own control, such as studios and pay sites (as opposed to user-generated content). Services covered by Part 3 of the OSA – social media platforms, user-to-user sites and search engines – must comply with a separate set of guidelines from Ofcom, which we explain in our accompanying article.

Identifying relevant providers

The guidelines for Part 5 services apply to providers who meet three conditions:

1. pornographic content is published or displayed on their service.
2. the service is not exempt from the OSA.
3. the service has links to the UK.

Condition 1: Under the OSA, a provider of adult content means the entity or individual who controls what is seen on an internet service. Content is pornographic if it is “reasonable to assume that it was produced solely or principally for the purpose of sexual arousal.” This can include still and moving images, audio and audio-visual content, and artificial images whether animated or created by AI.

Condition 2: Pornographic content is not covered by Part 5 if it:

• is user-generated (in which case it is subject to Part 3).
• consists only of text, or text accompanied by a GIF or emoji (which are not pornographic).
• is an on-demand programme service, for example a pornographic subscription channel.
• is an internal business service (ie an intranet service) that meets specific requirements.

Condition 3: Part 5 only applies if a provider “has links with the United Kingdom.” A link with the UK exists if either:

• the service has a significant number of UK users; or
• UK users form one of the target markets for the service, or the only target market.

The Act does not define what is meant by a “significant number” of UK users. But Ofcom is likely to reject exemption claims made on the basis that a provider only has a relatively small user base. Under the guidelines, Ofcom suggests providers should “err on the side of caution” when assessing whether they have a significant number of UK users.

While “target market” is not defined by the Act, the guidelines confirm that a service would be seen to be targeting the UK if any of the following apply:

• it is marketed toward UK users.
• it generates revenue from UK users.
• it includes content that is tailored for UK users.
• it has a UK domain, or provides a UK contact address and/or phone number.

An online service may include adult material that falls under Part 3 (eg user-generated content) along with other material (perhaps shot in a studio) that is covered by Part 5. The separate guidelines covering Part 3 services require platforms to carry out a children’s access assessment by 16 April 2025 to establish whether their service is likely to be accessed by children.

Ofcom has confirmed that “services such as tube, cam and fan sites will be covered by both Part 3 and Part 5 guidance. These services must carry out children’s access assessments by 16 April.” Providers unsure whether Part 5 of the OSA applies to them can use Ofcom’s online tool.

Enforcement programme

Part 5 providers must use age assurance (verification, estimation, or both) to ensure that children cannot see pornographic content. Whatever technology is chosen, it must be “highly effective” at determining whether a user is a child.

All Part 5 services must have highly effective age assurance processes in place by July 2025. The same deadline also applies to Part 3 providers that allow user-generated adult content.

For Part 5 services, specifically, the requirement to adopt age assurance came into effect on 17 January 2025. Ofcom immediately opened an “enforcement programme”, examining progress towards age assurance across the adult content sector. It is also writing to all Part 5 providers, asking for an update on the age assurance measures they’re considering.

Ofcom said, “We will contact a wide range of adult services – large and small – to advise them of their new obligations and monitor their compliance. We will not hesitate to launch investigations and take enforcement action against services that do not comply.”

Determining highly effective age assurance

Providers can either develop their own in-house age assurance or they can buy third party tech. Partnering with an age assurance specialist, such as Luciditi, would allow them to easily embed AI-powered age assurance into their website. Providers must ensure that no pornographic content can be seen before users verify their age.

An age assurance method is highly effective only if it meets each of four criteria:

• It is technically accurate – evaluated against appropriate metrics.
• It is robust – can correctly determine the age of a user in a range of real-world settings.
• It is reliable – is shown to be consistent, particularly when involving AI or machine learning.
• It is fair – minimises bias and discriminatory outcomes.

Providers will need to adopt an age assurance method that Ofcom considers to be reliable, such as:

• Open banking
• Photo-identification matching
• Facial age estimation
• Mobile-network operator age checks
• Credit card checks
• Email-based age estimation
• Digital identity services

These options are explained in more detail in our accompanying article on the Part 3 guidelines, along with methods that Ofcom regards as unreliable. Whichever method is adopted, Ofcom says it should be easy to use without unduly preventing adult users from accessing legal content.

Keeping accurate records

Any reliable age assurance process – whether based on estimation, verification or both – will involve processing personal data, and as such will be subject to the UK’s data protection laws. For this reason, Part 5 providers must keep accurate records, detailing:

• the kinds of age verification or estimation used, and how they are used.
• how UK users will be protected from a breach of privacy (for example relating to personal data).

Providers must summarise their records in a publicly available statement, which must explain their chosen method of age assurance. Ofcom’s guidelines on privacy are further explained in our article on Part 3 providers.
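For illustration only, a provider’s internal record might be structured along the lines of the sketch below. The field names and values are our own assumptions for the example – Ofcom’s guidelines describe what must be recorded, not how records must be formatted.

```typescript
// Hypothetical structure for the records a Part 5 provider must keep.
// Field names are illustrative assumptions, not prescribed by Ofcom.
interface AgeAssuranceRecord {
  methods: {
    kind: string;   // eg "facial_age_estimation", "photo_id_matching", "open_banking"
    usage: string;  // where and how the method is applied on the service
  }[];
  privacySafeguards: string[]; // how UK users are protected from a breach of privacy
  publicStatementUrl: string;  // where the publicly available summary is published
  lastReviewed: string;        // ISO date of the most recent review
}

const exampleRecord: AgeAssuranceRecord = {
  methods: [
    { kind: "facial_age_estimation", usage: "Run before any content is shown to a new visitor" },
    { kind: "digital_identity", usage: "Offered as a step-up check where estimation is inconclusive" },
  ],
  privacySafeguards: [
    "Facial images deleted immediately after estimation",
    "No date of birth or identity document stored by the service",
  ],
  publicStatementUrl: "https://example.com/age-assurance-statement",
  lastReviewed: "2025-06-01",
};
```

Keeping the record in a structured form like this makes it straightforward to generate the public statement and to respond to an information request from Ofcom.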

Ofcom aims to ensure that the internet in the UK will soon be safer for children, which means providers need to start thinking now about the changes required by law. The new guidelines can be seen as fleshing out the OSA, under which Ofcom has the power to fine repeat offenders up to £18 million, or 10% of qualifying worldwide revenue (whichever is greater).

Melanie Dawes, Ofcom’s Chief Executive, said: “For too long, many online services which allow porn and other harmful material have ignored the fact that children are accessing their services. Either they don’t ask or, when they do, the checks are minimal and easy to avoid. That means companies have effectively been treating all users as if they’re adults, leaving children potentially exposed to porn and other types of harmful content. Today, this starts to change.”

Want to know more?

Luciditi technology can help meet some of the challenges presented by the OSA. If you would like to know more, contact us for a chat today.



Age checks for social media and search engines

New guidance on age assurance for messaging apps and search engines has been announced by Ofcom. The move is the latest step in the communications watchdog’s response to the Online Safety Act (2023), which aims to create a safer online environment in the UK, particularly for children.

The new guidance explains how social media platforms and search engines must use “highly effective” age assurance to provide better protection for children. These providers are described in Part 3 of the Act. Similar advice – which we explain separately – has also been issued for providers of pornographic content (described under Part 5 of the Act).

Highly effective age assurance

Messaging apps and search engines are required to carry out a children’s access assessment by 16 April 2025 to establish whether their service is likely to be accessed by children. Providers will need to comply if their services are used by people in the UK, regardless of where the provider is based.

Ofcom has said: “we anticipate that most of these services will need to conclude that they are likely to be accessed by children within the meaning of the Act.”

Providers whose services are likely to be accessed by children will need to adopt an age assurance ‘method’ that Ofcom considers to be reliable. A method is defined as the technology that underpins the process of determining whether or not a user is a child.

Ofcom has made clear that it won’t allow online platforms to pay lip service to the guidelines. It’s not enough simply to adopt the tech. Providers need to clear two important hurdles. Firstly, they must show that their overall process is highly effective. Secondly, since any such process will likely involve personal data, providers must operate within the UK’s data protection laws.

Performance criteria

In meeting the guidelines, providers can either develop their own in-house age assurance or they can buy third party tech. Platforms hoping to skip the cost of starting from scratch can integrate the services they need from an age assurance specialist, such as Luciditi, allowing them to easily embed AI-powered age assurance into their website.

Alternatively, providers may choose to rely on wider system-level age assurance methods. These may eventually be built into devices, app stores, browsers or operating systems.

Ofcom says that “regardless of where the age assurance occurs in the ecosystem”, whether developed in-house or by a third party, providers are responsible for ensuring that their age assurance performs to the standard required by Ofcom.

To be sure that their age assurance method is highly effective, providers must ensure it meets four criteria:

• It is technically accurate – evaluated against appropriate metrics.
• It is robust – can correctly determine the age of a user in a range of real-world settings.
• It is reliable – is shown to be consistent, particularly when involving AI or machine learning.
• It is fair – minimises bias and discriminatory outcomes.
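As a rough illustration of what evaluating “technical accuracy” might look like in practice, the sketch below computes two simple metrics from labelled test results: the share of children wrongly passed as adults, and the share of adults wrongly blocked. The metric names, sample data and code are our own illustrative assumptions – Ofcom does not prescribe a particular test harness or threshold.

```typescript
// Illustrative evaluation of an age assurance method against labelled test data.
// The metrics and sample data are assumptions for the sketch, not Ofcom-mandated values.
interface TestCase {
  trueAge: number;        // known age of the test subject
  passedAsAdult: boolean; // did the method let them through as 18 or over?
}

function evaluate(results: TestCase[]) {
  const children = results.filter(r => r.trueAge < 18);
  const adults = results.filter(r => r.trueAge >= 18);

  // Children wrongly passed as adults – the critical error for child safety.
  const childPassRate = children.filter(r => r.passedAsAdult).length / children.length;
  // Adults wrongly blocked – relevant to not unduly restricting access to legal content.
  const adultBlockRate = adults.filter(r => !r.passedAsAdult).length / adults.length;

  return { childPassRate, adultBlockRate };
}

console.log(evaluate([
  { trueAge: 16, passedAsAdult: false },
  { trueAge: 17, passedAsAdult: true },
  { trueAge: 25, passedAsAdult: true },
  { trueAge: 19, passedAsAdult: false },
]));
// => { childPassRate: 0.5, adultBlockRate: 0.5 }
```

Checking fairness would mean running the same evaluation separately across different groups of users – for example across age bands or skin tones – and comparing the results, rather than relying on a single aggregate figure.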

Age assurance from a third-party supplier that has been certified against the UK Digital Identity and Attributes Trust Framework (such as Luciditi) won’t automatically be considered compliant. But certification may help to show that a provider is working to meet the four criteria to ensure its tech is highly effective.

Approved methods of age assurance

The guidelines offer a non-exhaustive list of options that Ofcom considers capable of providing effective age assurance. These include:

Open banking

A user could allow their bank records to be used for assurance. The bank does not reveal the user’s date of birth, nor any other information. It simply tells the provider that the user is aged 18 or over.
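From the provider’s side, an open banking check might boil down to something like the sketch below: a consented request that returns a single yes/no attribute. The endpoint and response shape are hypothetical – real open banking integrations vary by bank and run through the bank’s own consent and authorisation flow.

```typescript
// Minimal sketch of an open-banking style age check from the provider's side.
// The URL and response shape are hypothetical, for illustration only.
interface AgeConfirmation {
  over18: boolean;  // the only attribute returned – no date of birth, no account data
  issuedAt: string; // timestamp of the confirmation
}

async function checkAgeViaBank(consentToken: string): Promise<boolean> {
  const res = await fetch("https://bank.example.com/age-confirmation", {
    headers: { Authorization: `Bearer ${consentToken}` },
  });
  const confirmation: AgeConfirmation = await res.json();
  return confirmation.over18;
}
```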
 
Photo-identification matching

A provider can use tech that reads an image from an uploaded photo-ID document and then compares this to a selfie of the user to verify that they are the same person.
 
Facial age estimation

AI-powered tech can analyse the features of the user’s face to estimate their age. While highly accurate at determining ‘adult or not’, for young adults it often results in a step-up requirement. This occurs where the required age is, say, 18 with a buffer of five years, meaning the user would need to be estimated as at least 23 to pass the check on facial age estimation (FAE) alone. Younger adults would instead need to verify their age (see digital identity services below), as the sketch that follows illustrates.
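A minimal sketch of that buffer logic, assuming the 18-plus-five example from the text (providers set their own thresholds, and the function name is ours):

```typescript
// Sketch of the "challenge age" buffer described above. The five-year buffer is
// the example from the text; real providers choose their own values.
const REQUIRED_AGE = 18;
const BUFFER_YEARS = 5;

type CheckOutcome = "pass" | "step_up_required";

function facialEstimationOutcome(estimatedAge: number): CheckOutcome {
  // Only a clear margin above the required age passes on estimation alone.
  if (estimatedAge >= REQUIRED_AGE + BUFFER_YEARS) {
    return "pass";
  }
  // Users estimated below the challenge age are routed to a verification
  // method (such as a digital identity service) rather than rejected outright.
  return "step_up_required";
}

console.log(facialEstimationOutcome(27)); // "pass"
console.log(facialEstimationOutcome(20)); // "step_up_required"
```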
 
Mobile-network operator (MNO) age checks

Each of the UK’s MNOs has agreed to automatically apply a content restriction filter (CRF), preventing children from accessing age-restricted websites via pay-as-you-go and contract SIMs. Age checks rely on looking for the CRF on a user’s phone. Users can remove the CRF by proving they are an adult. If the CRF has been removed, the MNO assures providers that the recorded user is over 18.
 
Credit card checks

Credit card issuers must verify that applicants are 18 or over. Providers relying on credit card age checks ask users to enter their card number and a payment processor then checks that the card is valid. Approval by the issuing bank can be taken as evidence that the user is over 18.
 
Email-based age estimation

This is another estimation method, this time analysing online services where the user’s email address has been used. An email address associated with financial institutions such as mortgage lenders indicates the user is likely to be over 18.
 
Digital identity services

Users can confirm their identity via an app such as Luciditi. Verification of that identity is held in their digital wallet, protected by high-grade security. Typically, the user’s digital identity is not shared with a provider. The app simply uses the data to confirm that the user is 18 or over.
 

Unapproved methods

Ofcom warns that some methods will fall short of the guidelines if used without any additional form of age assurance. These include:

• asking a user to enter their date of birth without any further evidence to confirm it.
• asking a user to tick a box to confirm that they are 18 years of age or over.
• relying on payment methods which do not require a user to be 18, for example debit cards.
• relying on a clause in the terms of service that prohibits children from using the service.
• general disclaimers asserting that all users should be 18 years of age or over.
• warnings that the content is only suitable for over 18s.

Technology may also fall short of Ofcom’s expectations if it fails two further tests: it needs to be accessible (easy for all users to operate) and interoperable (able to communicate with other tech systems).

Protecting privacy

Ofcom notes that all age assurance methods use personal data and are therefore subject to the UK’s data protection regime. The guidance suggests that data protection should be designed into a provider’s chosen method from the outset.

Ofcom also urges providers to familiarise themselves with relevant laws on data protection and privacy, including:

1. Data Protection Act 2018.
2. Privacy and Electronic Communications Regulations (PECR) 2003 – PECR applies to anyone who stores information on, or gains access to information on, a user’s device, for example by using cookies or similar technologies.
3. UK GDPR – under which data protection principles include:
• Lawfulness, fairness and transparency
• Purpose limitation
• Data minimisation
• Accuracy
• Storage limitation
• Security
• Accountability

 
Further details on data protection are available from the Information Commissioner’s Office (ICO). In particular, the ICO’s Children’s code is a statutory code of practice which sets out 15 standards that internet services have to follow if they are likely to be accessed by children.

To comply with Ofcom’s Protection of Children Codes of Practice (due to come into effect in July 2025), providers are required to keep records of the steps they’ve taken to protect children. Records concerning privacy will help to demonstrate a provider’s commitment to data protection. Providers failing to measure up to expectations on data protection may be referred to the ICO.

At first glance, Ofcom’s update may seem like a benign set of guidelines, but they will be enforced under the Online Safety Act. Melanie Dawes, Ofcom’s Chief Executive, said: “We’ll be monitoring the response from industry closely. Those companies that fail to meet these new requirements can expect to face enforcement action.” Ofcom describes its approach as “flexible, tech-neutral and future-proof.” The guidelines have the potential to create a safer life online for people in the UK, especially children.

Want to know more?

Luciditi technology can help meet some of the challenges presented by the OSA. If you would like to know more, contact us for a chat today.



Digital ID to be accepted as proof of age for alcohol sales in the UK

Digital ID to be accepted as proof of age in alcohol sales. Apps like Luciditi offer a safer alternative to physical ID documents.

It will soon be possible to buy alcohol using a digital identity to confirm your age, the government has announced. The decision means that consumers will be able to use trusted ID apps like Luciditi rather than take their passport or driving licence to pubs, clubs, and supermarkets.

The move comes in response to a consultation that ran from January to March 2024. This asked stakeholders whether young people should be allowed to use a digital identity service to prove they’re old enough to buy alcohol.

A clear majority of respondents (72%) said that existing legislation should be updated to allow consumers to use digital ID in retail settings such as supermarkets, off-licences, restaurants, pubs, and clubs. Digital identity apps like Luciditi can facilitate a quick and easy process at the point of sale, similar to contactless payments or scanning a QR code.

Luciditi relies on evidence scanned from personal identity documents uploaded by the user. These are stored in a digital wallet and protected by high-grade security. Typically they’re not shown to a third party; the app simply assures a retailer that the user has been verified as 18 or over.

Digital identity trust framework

Age assurance depends on trust. A digital identity service will only be accepted in the sale of alcohol if it has been certified against government standards.

At the moment, trusted digital ID providers, including Luciditi, are voluntarily certified against the UK’s digital identity trust framework, a set of rules defining a good digital ID service.

The framework will soon be put on a statutory footing by the Data (Use and Access) Bill (DUA), currently passing through Parliament. After the DUA Bill receives royal assent, expected sometime later this year, the framework will underpin a new Digital Verification Service. This will oversee a public register of certified digital ID providers.

The government has also said that DUA will be amended once it comes into law, expanding the scope of digital identities so they can be used as proof of age in alcohol sales.

Verification, rather than estimation

The consultation process, initiated by the previous government, focused on whether the Licensing Act 2003 should be updated to allow the use of digital ID. A total of 251 complete responses were received from licensing authorities, the alcohol and hospitality industries, policing, trading standards, technology companies, delivery partners, civil society organisations, and members of the public.

Most respondents thought the proposed changes would have a positive impact, though some raised concerns about data protection and the potential for digital identities to be hacked or faked.

Similar concerns raised in the past prompted the decision to create the Digital Verification Service, putting the trust framework on a more formal footing. Only verification will be acceptable in the sale of alcohol. Other technologies, such as age estimation, currently fall outside the framework and won’t be accepted.

Next steps

The government is currently looking at the minimum level of service required from a digital ID provider. At the moment, anyone checking an ID document when selling alcohol needs to confirm three things:

1. Does it show that the person is over 18?
2. Does it belong to the person presenting it?
3. Is it a genuine document?

In practice, this means that a retailer must look at the date of birth to work out the person’s age, compare the photo to the person standing in front of them, and look for security features like holograms or ultraviolet marks to make sure the document is genuine.

Digital identity tech needs to do the same, quickly, reliably, and securely. The Department for Science, Innovation and Technology (DSIT) is working with the Home Office on three main requirements.

Firstly, to be certified and included on the new register, a provider must be able to securely use ID data to verify that someone is 18 or over. Secondly, the tech must confirm that the digital identity belongs to the person presenting it. On this, the Office for Digital Identities and Attributes (part of DSIT) said that:

“Digitally, this can be done using biometric authentication. For example, a user can scan their face with their smartphone to access their digital identity. The scan of their face is bound to the photo on the original document. This allows them to securely prove that the identity belongs to them. By logging into the app in this way, the person can prove that the identity belongs to them.”

Thirdly, the identity must be verified as genuine. This means that a digital ID has to be scanned by a device rather than simply assessed by a person, similar to an e-ticket being scanned at a venue. Age assurance may involve scanning a QR code or using NFC technology similar to contactless payments.
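As a rough sketch of how those three checks might translate into code when a signed digital proof of age is scanned, consider the example below. The credential format, field names and key handling are our own illustrative assumptions, not a specification from DSIT, the Home Office or any certified scheme.

```typescript
// Sketch of the three checks applied to a scanned digital proof of age.
// The credential shape is a made-up example; certified schemes define their own formats.
import { createVerify } from "node:crypto";

interface AgeCredential {
  over18: boolean;         // check 1: is the holder over 18?
  holderVerified: boolean; // check 2: the app has bound the credential to the holder (eg via a face scan)
  payload: string;         // the signed data in canonical form
  signature: string;       // base64 signature from the issuing provider
}

function acceptCredential(cred: AgeCredential, issuerPublicKeyPem: string): boolean {
  // Check 3: the credential is genuine – the issuer's signature verifies.
  const verifier = createVerify("SHA256");
  verifier.update(cred.payload);
  const genuine = verifier.verify(issuerPublicKeyPem, cred.signature, "base64");

  return genuine && cred.over18 && cred.holderVerified;
}
```

The point of the sketch is that the device, not the person behind the bar, does the checking – the scanner only ever learns a yes/no answer.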

Remote sales of alcohol

The Licensing Act was passed in 2003; since then, however, the way that people buy alcohol has changed. The consultation also looked at using digital IDs in remote sales, when alcohol is bought in a setting that’s not face-to-face.

Drink can be bought online, or in other ways that don’t involve face-to-face contact, for example at supermarket self-checkout tills or in a restaurant that accepts orders via an app. Age checks are currently required at the point of sale, but not at the point of delivery.

The consultation asked whether age checks at the point of delivery should be introduced. This is complicated by the fact that it’s an offence to sell alcohol to a person who’s drunk, and so the consultation also asked whether there should be mandatory checks at the point of delivery to determine whether someone is already intoxicated.

These questions raise practical difficulties, for example in the ability to leave goods in a safe place for the customer to collect. While a majority of respondents (58%) agreed that the Licensing Act should be updated to make it an offence to deliver to someone who’s drunk, there were concerns about how this would work in practice.

On these complicated issues, the government said “this is an area that requires further consideration. We will undertake further work in this area in due course.”

Safer checks for young people

A key priority is to safeguard the aims of the Licensing Act 2003 including the need to protect children from harm. This led to a requirement for age checks, which in practice means that young people have to carry valuable personal documents into pubs, clubs, and restaurants.

When they hand over their document to be checked by whoever’s at the door, people are potentially exposing their name, sex, current address and date of birth. Digital ID promises to maintain the integrity of the Licensing Act while allowing faster, safer and more secure verification.

James Hawkins, from the British Beer and Pub Association, said “this welcome change brings the Licensing Act in line with current technology and will make a visit to the pub easier for both customers and staff.”

Digital ID services generated £2.05 billion in 2023/2024, and employed over 10,000 people – half of them outside London. ID tech, from trusted providers, helps to boost the economy, protect young people, and perhaps even cut queues at the bar.


New Data Bill set to radically expand use of digital ID in the UK

A potential £10 billion boost to the UK economy over 10 years

Digital identity providers are to be regulated in a sweeping overhaul of the law. The Data (Use and Access) Bill will introduce oversight of digital ID, replacing the current voluntary system with a government register that could bring a £10 billion boost to the economy over 10 years.

Digital identities are an increasingly common way of accessing age-restricted online services or products. Providers like Luciditi offer security at the level demanded by banks, allowing individuals to securely verify their identity and/or age and helping to minimise the risk of identity fraud.

Trust is a key feature of an industry that serves as a ‘middle man’ between online consumers and suppliers. At the moment, digital ID providers – including Luciditi – may be voluntarily certified against the UK’s digital identity trust framework, a set of rules defining a good digital ID service.

However, in some cases where identity needs to be verified – for example, employers checking someone’s criminal record or right to work – trust is needed at a deeper level. Similarly, public agencies would be able to operate far more efficiently if identity verification were better regulated.

Setting standards, regulating providers

The Data (Use and Access) Bill – abbreviated to DUA – will transform certification, putting it on a more formal footing. DUA proposes wide-ranging reforms that will expand trust in four main areas:


1. UK digital identity and attributes trust framework
DUA will bring legal standing to the trust framework, expanding it and transforming it into the Digital Verification Service, giving it a broader and more structured regulatory foundation.

2. Register of digital identity services
DUA will establish a publicly available register of digital ID providers, listing organisations that have been independently assessed and certified against the trust framework. Under DUA, ministers will assess applications to join the register and potentially refuse applications or de-list providers.

3. Trust mark
Under DUA, registered providers will be able to display a designated ‘trust mark’ to distinguish their services in the market.

4. Information sharing
DUA will lay the foundations of a new information gateway, which in time will allow public authority data to be shared with registered services to enable identity and eligibility to be checked.  

Reusable ‘smart data’

Key parts of DUA are inherited from a Bill that began life under the previous government but failed to get through Parliament before the general election. In the King’s Speech in July, the new Labour government introduced the Digital Information and Smart Data Bill (DISD).

DISD was updated over the summer, and the Bill was renamed as DUA to reflect its broader scope. Nevertheless, ‘smart data’ remains at its heart – a reference to standard identity data managed in a smart way.

For example, Luciditi’s app allows individuals to upload their personal data to a digital identity wallet. After they’ve done this once, they can then reuse the data as many times as they want whenever they sign up for online services or products that require identity or age checks.

While online suppliers may ask for identity verification or proof of age, Luciditi’s app typically does not release personal data to a third party. It simply gives them basic assurance that the individual is who they say they are and the age they claim to be. The app gives users a choice. If they choose to, they can release the data to a third party that needs to see it, for example when opening a bank account. The key point here is that Luciditi allows users to choose whether to share information, every time.

Standardised reliance on digital identity throughout a particular industry is known as a smart data scheme, though at the moment only online banking comes close to this.

Outside banking, suppliers need to believe that assurance is reliable. For them, digital identity providers who are voluntarily certified against the trust framework can be expected to securely support online access to restricted services and products. Extending regulation of this system into broader areas of the economy will speed up processes such as renting a home or starting a new job.

Embedding ID tech into public agencies

Under DUA, identity assurance will only be available to an authorised third-party supplier upon an individual’s request. But it paves the way for something more.

The government has said it wants to “harness the power of data for economic growth, to support a modern digital government, and to improve people’s lives.”

Government agencies, managing the data of millions of individuals, are currently bogged down in paperwork. By embedding smart data and identity assurance in public services, DUA will make it easier and quicker to manage people’s personal data, cutting down on laborious bureaucratic procedures.

The government believes that DUA “will free up 1.5 million hours of police time and 140,000 NHS staff hours every year speeding up care and improving patients’ health outcomes.” The Bill would allow for healthcare information – like a patient’s pre-existing conditions, appointments and tests – to be easily accessed in real time across all NHS trusts, GP surgeries and ambulance services, no matter which IT system they were using.

The new Bill will also allow births and deaths to be recorded online instead of via the current paper-based system. Registrations could also be carried out over the phone, not just in person.

For some people, making it easier to release their personal health data will be a cause for concern. In Luciditi’s case, the developers’ previous product already safeguards health data for millions of people across the UK. The Luciditi app benefits from this same level of trusted security.

The government has confirmed that the Bill does not include a mandatory national digital ID card, or any requirement to possess a digital identity. Instead, by creating a legislative structure of standards, governance and oversight, DUA will require all registered digital ID providers to meet a consistent standard.

Managing oversight of DUA


The trust framework and register of providers will be overseen by a newly created team known as the Office for Digital Identities and Attributes (OfDIA), which sits within the Department for Science, Innovation and Technology (DSIT). DUA’s provisions will be carried out under the authority of the DSIT Secretary of State.

OfDIA staff created the trust framework in collaboration with industry, academia and civil society groups, and intend to review and refresh it every year. OfDIA believes that, under the framework, hundreds of thousands of digital identity checks are already taking place each month.

Beyond digital ID, the new Bill also supports a national chart of the UK’s underground infrastructure. The National Underground Asset Register is a new digital map that will shape the way pipes and cables are installed, operated and repaired. It will give planners and excavators standardised, secure, instant access to the information they need, reducing excavation accidents that can quickly disrupt business.

Next steps


DUA was introduced in the House of Lords on 23 October and will take a year or so to work its way through Parliament. Welcoming its introduction, Technology Secretary Peter Kyle said: “This Bill will help us boost the UK’s economy, free up vital time for our front-line workers, and relieve people from unnecessary admin so that they can get on with their lives.”

The new legislation won’t force anyone to use a digital identity. But a legally-protected structure of standards will give consumers and online suppliers new confidence in the broader use of smart data schemes.

Data is critical for UK business: 77% of UK companies handle some form of digital data, increasing to 99% for businesses employing more than 10 people. DUA has the potential to unlock much needed national growth and ensure that, in the push towards digitally-based business, Britain isn’t left behind.