
Top Shelf 2.0 – Keeping Adult Content Out of Children’s Reach

Why Highly Effective Age Assurance is the Digital Equivalent of Keeping Adult Content Out of Children’s Reach.

Before the Internet, access to adult content was governed by a simple physical rule: children couldn’t reach what they couldn’t see. Adult video tapes were sold inside adult-only stores with covered windows, and magazines containing adult content weren’t banned or hidden completely but placed out of reach on the top shelf of newsagents. It was a pragmatic solution – not entirely foolproof, but widely accepted.

With the introduction of the UK Online Safety Act, and the debate surrounding age assurance, we are essentially returning to this principle, updated for the digital age. Yet critics argue that age verification technology is inherently flawed or ripe for abuse. If we step back from the binary rhetoric of privacy versus protection, it becomes clear that the proposed digital safeguards mirror measures we’ve already deemed necessary in the physical world.

A Brief History of the Top Shelf

The idea of placing adult magazines on the top shelf of shops emerged from growing public concern in the 1970s and 1980s about children’s exposure to sexually explicit content. Though there was no specific legislation mandating the practice, it was reinforced by voluntary codes of conduct and the influence of the Indecent Displays (Control) Act 1981. This Act prohibited the public display of indecent material that could be seen by children or non-consenting adults, and retailers adopted the “top shelf” as a practical compromise.

Newsagents, supermarkets, and convenience stores agreed to keep pornographic magazines above children’s eye level, often with covers partly obscured by modesty boards. This system relied not on surveillance or identity checks but on design and societal consensus. We accepted the idea that adult content was legal but should be restricted to adults. It worked – perhaps imperfectly – but well enough to be accepted in society for decades.

The Challenge of the Digital Shelf

Fast forward to 2025. Children now navigate digital environments far more freely than physical ones. The internet has no shelf height. Age gates – simple pop-ups asking users to confirm they’re over 18 – are effectively useless, as any child can click “Yes”. And unlike a newsagent, the web doesn’t come with a shopkeeper who can intervene.

The Online Safety Act 2023, overseen by Ofcom, is the UK’s legislative attempt to replicate these social norms online. At its core, the Act requires platforms that host user-generated or pornographic content to implement Highly Effective Age Assurance technologies. According to Ofcom’s detailed guidance, released in 2024 and early 2025, this doesn’t simply mean checking a tick-box. It means deploying proven, privacy-preserving technologies that can reliably estimate or verify a user’s age.

What is ‘Highly Effective Age Assurance’?

According to Ofcom’s official guidance (Part 3 of its age assurance statement), a “highly effective” system is one that:

• Minimises the risk of children accessing harmful content.
• Is accurate and resistant to circumvention.
• Protects user privacy and data security.
• Does not unduly restrict adult users from accessing legal content.

These systems range from age estimation using facial analysis (without retaining biometric data), to checks against government-issued ID or Open Banking records, to one-tap digital identity solutions such as those provided by the Luciditi App. They are presented to relying parties and end users as GDPR-compliant, privacy-preserving, multi-option solutions, leveraging zero-knowledge proofs or equivalent methods to confirm age without disclosing personal identity or other traceable information.

Furthermore, “orchestration service providers” like Luciditi can offer end-users multiple ways to prove their age – all of which are designed to preserve privacy.
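To make that selective-disclosure idea concrete, here is a minimal sketch in TypeScript of what a relying party might receive at the end of such a flow: a signed assertion carrying a single age attribute, the method used, and nothing that identifies the user. The token format, field names and HMAC scheme are illustrative assumptions, not Luciditi’s actual API; a production system would more likely use asymmetric signatures or true zero-knowledge proofs.

```typescript
// Illustrative sketch only: a hypothetical "age token" as an orchestration
// provider might issue. Field names and the HMAC scheme are assumptions.
import { createHmac, timingSafeEqual } from "node:crypto";

// The relying party receives only an age attribute, never identity data.
interface AgeAssertion {
  over18: boolean;       // the single disclosed attribute
  method: "facial_estimation" | "photo_id" | "open_banking" | "digital_id";
  issuedAt: number;      // epoch seconds, so stale tokens can be rejected
}

// Verify the provider's signature over the payload, then read the attribute.
function verifyAgeToken(payload: string, signatureHex: string, sharedSecret: string): AgeAssertion | null {
  const expected = createHmac("sha256", sharedSecret).update(payload).digest();
  const received = Buffer.from(signatureHex, "hex");
  if (expected.length !== received.length || !timingSafeEqual(expected, received)) {
    return null; // forged or tampered token: no access, but also no identity leaked
  }
  return JSON.parse(payload) as AgeAssertion;
}
```

Note that nothing in the assertion says who the user is – only that some trusted process concluded they are over 18, and by which method.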

This is not surveillance – it’s selective access control and disclosure. Just as we did not demand names or addresses at the newsagent’s counter, the goal here isn’t to build dossiers on citizens, but to keep adult content where it belongs: accessible to adults, not children – and to ensure that the content cannot be stumbled upon inadvertently while casually browsing the web. In 2023, the Children’s Commissioner reported that a quarter of young people had encountered pornography by the age of just 11.

Critics and Concerns

Critics of age assurance suggest that these measures are a Trojan Horse for censorship and state control. They warn that mandatory verification could drive users to unregulated or offshore sites or expose them to data breaches.

These concerns are not entirely unfounded – early age verification efforts in the UK, such as the attempted implementation of the Digital Economy Act 2017 (abandoned in 2019), failed due to poor planning and data protection issues. But technology has moved on, as has our understanding and implementation of solutions built on the principle of privacy by design. We now have biometric estimation tools that don’t store images, encrypted ID checks, and robust oversight and enforcement from Ofcom to ensure transparency.

Moreover, the comparison to censorship falls short. Censorship is about restricting content for everyone. Age assurance is about restricting content for children. The distinction is fundamental.

Conclusion: From Paper to Pixels

The principle underpinning Ofcom’s guidance on highly effective age assurance is not new – it’s simply a modern reapplication of the values that have governed adult content for decades. The physical world used shelf height and obscured covers; the digital world uses encrypted ID checks and biometric age estimation.

In both cases, society accepts that legal adult content should be restricted to adults. The Online Safety Act makes that principle enforceable online, where shelf height is no longer a barrier.

Far from being a scam or surveillance plot, age assurance done well is a necessary evolution of longstanding norms. And just as few today question the logic of putting adult magazines out of children’s reach, so too should we come to see highly effective digital age assurance as common sense in the online era.

Want to know more?

Luciditi technology can help organisations implement digital identity and age proofing technology for online and in-person use cases. If you would like to know how, contact us for a chat today.

Get in touch


Adult content sites given summer deadline on age checks

Online platforms that show adult content to UK users must introduce age checks by July 2025, under tough new guidelines from Ofcom. The measures explain in practice how websites and apps will have to comply with the Online Safety Act (2023) to stop children finding online pornography.

Children are exposed to adult content from an early age. Research suggests that, among those who have seen it, the average age they first encounter it is 13 – although more than a quarter have seen it by age 11 (27%), and one in 10 as young as nine (10%).

Creating a safer online environment

According to Ofcom, the vast majority of adults (80%) are broadly supportive of age assurance measures to stop children seeing pornography. Age checks will be enforced under the Online Safety Act (OSA), which aims to make the internet a safer environment for UK users, particularly children.

Providers of adult content who publish or produce pornography are covered under Part 5 of the OSA. Under Ofcom’s new guidelines, they must implement highly effective age assurance and restrict children’s access to their sites as soon as possible, and by July 2025 at the latest.

Services covered by the Part 5 guidelines will principally include platforms that publish pornographic content under their own control, such as studios and pay sites (as opposed to user-generated content). Services covered by Part 3 of the OSA – social media platforms, user-to-user sites and search engines – must comply with a separate set of guidelines from Ofcom, as explained here.

Identifying relevant providers

The guidelines for Part 5 services apply to providers who meet three conditions:

1. pornographic content is published or displayed on their service.
2. the service is not exempt from the OSA.
3. the service has links to the UK.

Condition 1: Under the OSA, a provider of adult content means the entity or individual who controls what is seen on an internet service. Content is pornographic if it is “reasonable to assume that it was produced solely or principally for the purpose of sexual arousal.” This can include still and moving images, audio and audio-visual content, and artificial images whether animated or created by AI.

Condition 2: Pornographic content is not covered by Part 5 if it:

• is user-generated (in which case it is subject to Part 3).
• consists only of text, or text accompanied by a GIF or emoji (which are not themselves pornographic).
• is an on-demand programme service, for example a pornographic subscription channel.
• is an internal business service (ie intranet services) that meets specific requirements.

Condition 3: Part 5 only applies if a provider “has links with the United Kingdom.” A link with the UK exists if either:

• the service has a significant number of UK users, or
• UK users form one of the target markets for the service (or the only target market).

The Act does not define what is meant by a “significant number” of UK users. But Ofcom is likely to reject exemption claims made on the basis that a provider only has a relatively small user base. Under the guidelines, Ofcom suggests providers should “err on the side of caution” when assessing whether they have a significant number of UK users.

While “target market” is not defined by the Act, the guidelines confirm that a service would be seen to be targeting the UK if any of the following apply:

• it is marketed toward UK users.
• it generates revenue from UK users.
• it includes content that is tailored for UK users.
• it has a UK domain or provides a UK contact address and/or phone number.

An online service may include adult material that falls under Part 3 (eg user-generated content) along with other material (perhaps shot in a studio) that is covered by Part 5. The separate guidelines covering Part 3 services require platforms to carry out a children’s access assessment by 16 April 2025 to establish whether their service is likely to be accessed by children.

Ofcom has confirmed that “services such as tube, cam and fan sites will be covered by both Part 3 and Part 5 guidance. These services must carry out children’s access assessments by 16 April.” Providers unsure whether Part 5 of the OSA applies to them can use Ofcom’s online tool.

Enforcement programme

Part 5 providers must use age assurance (verification, estimation, or both) to ensure that children are not able to see pornographic content. The technology used must be “highly effective” at determining whether a user is a child.

All Part 5 services must have highly effective age assurance processes in place by July 2025. The same deadline also applies to Part 3 providers that allow user-generated adult content.

For Part 5 services, specifically, the requirement to adopt age assurance came into effect on 17 January 2025. Ofcom immediately opened an “enforcement programme”, examining progress towards age assurance across the adult content sector. It is also writing to all Part 5 providers, asking for an update on the age assurance measures they’re considering.

Ofcom said, “We will contact a wide range of adult services – large and small – to advise them of their new obligations and monitor their compliance. We will not hesitate to launch investigations and take enforcement action against services that do not comply.”

Determining highly effective age assurance

Providers can either develop their own in-house age assurance or buy third-party tech. Partnering with an age assurance specialist, such as Luciditi, would allow them to easily embed AI-powered age assurance into their websites. Providers must ensure that no pornographic content can be seen before users verify their age.
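As a rough illustration of that last requirement – serving nothing explicit until an age check has succeeded – the sketch below gates all content routes behind a token check in an Express app. The cookie name, routes and validation helper are hypothetical placeholders, not a prescribed integration.

```typescript
// Minimal sketch: no gated content is served until an age-check token exists.
// Route names, the cookie name and the validator are illustrative assumptions.
import express from "express";

const app = express();

// Deny every content route unless a previously issued age-check token is present.
app.use("/content", (req, res, next) => {
  const token = req.headers.cookie?.match(/age_token=([^;]+)/)?.[1];
  if (!token || !isValidAgeToken(token)) {
    return res.redirect("/age-check"); // nothing explicit is served beforehand
  }
  next();
});

// Landing page that would hand off to the age assurance provider.
app.get("/age-check", (_req, res) => {
  res.send("Age assurance flow would be launched here.");
});

// Placeholder: in practice this would verify a token signed by the provider.
function isValidAgeToken(token: string): boolean {
  return token.length > 0; // illustrative only, not a real check
}

app.listen(3000);
```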

An age assurance method is highly effective only if it meets each of four criteria:

• It is technically accurate – evaluated against appropriate metrics.
• It is robust – can correctly determine the age of a user in a range of real-world settings.
• It is reliable – is shown to be consistent, particularly when involving AI or machine learning.
• It is fair – minimises bias and discriminatory outcomes.

Providers will need to adopt an age assurance method that Ofcom considers to be reliable, such as:

• Open banking
• Photo-identification matching
• Facial age estimation
• Mobile-network operator age checks
• Credit card checks
• Email-based age estimation
• Digital identity services

These options are explained in more detail in our accompanying article on the Part 3 guidelines, along with methods that Ofcom regards as unreliable. Whichever method is adopted, Ofcom says it should be easy to use without unduly preventing adult users from accessing legal content.

Keeping accurate records

Any reliable age assurance process will likely include both estimation and verification, and as such will be subject to the UK’s data protection laws. For this reason, Part 5 providers must keep accurate records, detailing:

• the kinds of age verification or estimation used, and how they are used.
• how UK users will be protected from a breach of privacy (for example relating to personal data).

Providers must summarise their records in a publicly available statement, which must explain their chosen method of age assurance. Ofcom’s guidelines on privacy are further explained in our article on Part 3 providers.
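By way of illustration only, those two record-keeping duties and the public statement could be driven from a single data structure, keeping the records and the published summary in sync. The field names below are our assumptions, not a format specified by Ofcom.

```typescript
// Hypothetical shape for the records a Part 5 provider must keep.
interface AgeAssuranceRecord {
  methodsUsed: Array<
    "open_banking" | "photo_id_matching" | "facial_age_estimation" |
    "mobile_network_check" | "credit_card_check" | "email_age_estimation" |
    "digital_identity_service">;
  howUsed: string;               // e.g. "estimation first, verification as fallback"
  privacyProtections: string[];  // e.g. "selfies deleted immediately after use"
}

// The publicly available statement is generated from the same record.
function toPublicStatement(record: AgeAssuranceRecord): string {
  return `This service checks age using: ${record.methodsUsed.join(", ")} ` +
         `(${record.howUsed}). Privacy safeguards: ${record.privacyProtections.join("; ")}.`;
}
```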

Ofcom aims to ensure that the internet in the UK will soon be safer for children, which means providers need to be thinking now about the changes required by law. The new guidelines can be seen as fleshing out the OSA, under which Ofcom has the power to fine repeat offenders up to £18 million or 10% of qualifying worldwide revenue, whichever is greater.

Melanie Dawes, Ofcom’s Chief Executive, said: “For too long, many online services which allow porn and other harmful material have ignored the fact that children are accessing their services. Either they don’t ask or, when they do, the checks are minimal and easy to avoid. That means companies have effectively been treating all users as if they’re adults, leaving children potentially exposed to porn and other types of harmful content. Today, this starts to change.”

Want to know more?

Luciditi technology can help meet some of the challenges presented by the OSA. If you would like to know more, Contact us for a chat today.

Get in touch


New age assurance guidelines for user-to-user and search platforms


Ofcom’s second consultation offers early insight into new rules

New guidelines protecting children from harmful content bring search engines and user-to-user platforms a step closer to mandatory age assurance. The draft regulations from Ofcom, the UK’s online safety regulator, are open to consultation, but they provide an early glimpse of the tough new rules that will restrict access to content from 2025.

The proposed guidelines are Ofcom’s latest response to the Online Safety Act. Passed last year, the Act will give Britain one of the toughest online regulatory systems in the world. Social media apps, search engines and other online services will need to adopt robust age checks and stop their algorithms recommending harmful content to children.

What is harmful content?

This is the second of Ofcom’s four consultation exercises on finalising the regulations that will flesh out the Act’s skeleton framework. The first, which closed in February, focused on protecting people from illegal content. The current discussions will lead to new rules designed to stop children accessing harmful content. The Act divides harmful content into three broad categories:

Primary priority content (PPC) that is harmful to children:

Pornographic content, and content which encourages, promotes, or provides instructions for suicide, self-harm, and eating disorders.

Priority content (PC) that is harmful to children:

Content which is abusive or incites hatred, bullying content, and content which encourages, promotes, or provides instructions for violence, dangerous stunts and challenges, and self-administering harmful substances.

Non-designated content that presents a material risk of harm to children:

Any types of content that do not fall within the above two categories which presents “a material risk of significant harm to an appreciable number of UK children.”
 
Based on these definitions, Ofcom has published draft Children’s Safety Codes which aim to ensure that:

  1. Children will not normally be able to access pornography.
  2. Children will be protected from seeing, and being recommended, potentially harmful content.
  3. Children will not be added to group chats without their consent.
  4. It will be easier for children to complain when they see harmful content, and they can be more confident that their complaints will be acted on.

 

Creating a safer online environment

In a four-week period (June-July 2023), Ofcom found that 62% of children aged 13-17 encountered PPC/PC online. Research also found that children consider violent content ‘unavoidable’ online, and that nearly two-thirds of children and young adults (13-19) have seen pornographic content. The number of girls aged 13-21 who have been subject to abusive or hateful comments online has almost tripled in 10 years from 20% in 2013 to 57% in 2023.

To create a safer online environment for children, Ofcom has outlined a series of steps that search services and user-to-user platforms will be expected to take.

Online services must determine whether or not they are likely to be accessed by children. To help in this, Ofcom has posted an online tool, here. Platforms that are likely to be accessed by children must:

  1. Complete a risk assessment to identify risks posed to children, drawing on Ofcom’s ‘children’s risk profiles’.
  2. Prevent children from encountering primary priority content relating to suicide, self-harm, eating disorders, and pornography. Services must also minimise children’s exposure to other serious harms defined as ‘priority content’, including violent, hateful or abusive material, bullying content, and content promoting dangerous challenges.
  3. Implement and review safety measures to mitigate the risks to children. Ofcom’s Safety Codes include more than 40 measures such as robust age checks, safer algorithms, effective moderation, strong governance and accountability, and more information and support for children including easy-to-use reporting and complaints processes.

 

Highly effective age assurance

There is no single fix-all measure that services can take to protect children online. But the package of measures recommended by Ofcom prominently relies on age assurance. Ofcom anticipates that most digital services not using age assurance are likely to be accessed by children. Once the final draft of the new rules comes into force, age assurance will be mandatory.
 
In practice, this will mean that all services will have to ban harmful content or introduce what Ofcom describes as “highly effective age-checks” restricting access to either the whole platform or parts of it that offer adults-only content. Ofcom defines “highly effective” as age assurance capable of technical accuracy, robustness, reliability, and fairness, with further details here.
 
Regulated services will no longer be able to get away with an ineffective ‘I am 18’ button. They will need to commit to age assurance technology to ensure their services are safer by design.
 
The quickest way of doing this is to adopt a proven digital ID product, like Luciditi. Ian Moody, Luciditi co-founder and CEO, says, “Easier and more cost-effective than starting from scratch, Luciditi can be easily embedded in web sites or apps, either by using a pre-built plugin or by using our Software Development Kit.”
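Purely as a sketch of what a script-loaded embed might look like in the browser: the script URL, the global `AgeWidget` object and its option names below are invented for illustration and do not represent Luciditi’s real plugin or SDK.

```typescript
// Hypothetical widget embed; every name here is an illustrative assumption.
function mountAgeCheck(containerId: string, siteKey: string): void {
  const script = document.createElement("script");
  script.src = "https://provider.example/age-widget.js"; // placeholder URL
  script.async = true;
  script.onload = () => {
    // A vendor SDK would typically expose an init call along these lines.
    (window as any).AgeWidget.init({
      container: document.getElementById(containerId),
      siteKey, // identifies the relying party to the provider
      onVerified: (token: string) => {
        // Store the pass token and re-request the gated content.
        document.cookie = `age_token=${token}; Secure; SameSite=Strict`;
        location.reload();
      },
    });
  };
  document.head.appendChild(script);
}
```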
 
Ofcom has specifically said its measures will apply to all sites that fall within the scope of the Act, irrespective of the size of the business. ‘We’re too small to be relevant’ won’t wash as an excuse.
 
Services cannot refuse to take steps to protect children simply because the work is too expensive or inconvenient. Ofcom says, “protecting children is a priority and all services, even the smallest, will have to take action as a result of our proposals.”
 

“Don’t wait for enforcement and hefty fines” – Tech Sec

According to Ofcom, children who have encountered harmful content experience feelings of anxiety, shame or guilt, sometimes leading to a wide-ranging and severe impact on their physical and mental wellbeing.
 
The lawlessness exploited by some of the world’s leading social media platforms has contributed to the deaths of children like 14-year-old Molly Russell. The coroner’s report concluded that watching content promoting suicide and self-harm had contributed to Molly’s death by suicide.
 
“We want children to enjoy life online”, said Dame Melanie Dawes, Ofcom Chief Executive, “but for too long, their experiences have been blighted by seriously harmful content which they can’t avoid or control. Many parents share feelings of frustration and worry about how to keep their children safe. That must change.”
 
The consultation exercise closes on July 17, 2024. Ofcom says, “We will take all feedback into account, as well as engaging with children to hear what they think of our plans. We expect to finalise our proposals and publish our final statement and documents in spring 2025.”
 
Welcoming Ofcom’s proposals, Technology Secretary Michelle Donelan said, “To platforms, my message is engage with us and prepare. Do not wait for enforcement and hefty fines – step up to meet your responsibilities and act now.”
 
The Online Safety Act doesn’t pull its punches. Repeat offenders will potentially be fined up to £18 million or 10% of global revenue, whichever is greater, and company managers risk going to jail for up to two years. In the coming months, platforms will need to be proactive in committing to the age assurance products that will help them stay on the right side of the law.
 
In Britain at least, the carefree distribution of harmful content is about to change. Ofcom’s proposals go much further than current industry practice and demand a step-change from tech firms in how UK children are protected online.
 

Want to know more?

Luciditi’s Age Assurance technology can help companies meet these strict new guidelines.  If you would like to know more, Contact us for a chat today.

Get in touch


Understanding the Online Safety Act: Implications for Adult Sites


Ofcom calls for biometric age checks to stop children seeing adult content

Tough new guidance from Ofcom aims to protect children from seeing online pornography. The Online Safety Act, passed last autumn, restricts underage access to adult content. New details have been published explaining how this can be done through age assurance, giving digital identity platforms like Luciditi a frontline role in helping content providers stay on the right side of the law.

On average, children first see online pornography at age 13 – although more than a quarter discover it by age 11 (27%), and one in 10 as young as nine (10%), according to research. Before turning 18, nearly eight in 10 youngsters (79%) have encountered violent pornography depicting coercive, degrading or pain-inducing sex acts.

The Online Safety Act (OSA) aims to protect children by making the UK the safest place in the world to be online. Under the OSA, sites and apps showing adult content will have to ensure that children can’t access their platforms.

Highly effective age checks

The new law has been described as a skeleton act. The bare bones approved by parliament will be fleshed out one topic at a time by the communications watchdog Ofcom.

Ofcom’s first update, last November, focused on protecting people from online harms. Now, its second period of consultation and guidance aims to protect children from online pornography through what it describes as “highly effective age checks.” The new guidance looks in detail at the age assurance tech that providers will need to adopt.

The porn industry has long been an early adopter of innovation – from online credit card transactions to live streaming. Age assurance, tried and trusted in other sectors, is unlikely to pose any technical challenges whether providers develop it in-house or adopt an existing product.

Businesses flouting the OSA can be fined up to £18 million or 10% of global revenue, and their directors jailed for up to two years. Nevertheless, the vast majority of adult content providers will be committed to maintaining a profitable, stable, and compliant operation that avoids tangling with the law. They don’t want kids looking at inappropriate material any more than anyone else does.

The difficulties of staying in-house

To comply with the OSA, providers must introduce age assurance – through age estimation, age verification or a combination of both.

In most cases, adults will be able to access a site through age estimation tech: AI analyses a selfie and estimates whether the user is comfortably over 18 – in practice, whether they appear at least five years older than 18. Users who look 18 or thereabouts will instead be asked to verify their age through personal identity data confirming their date of birth.
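That estimation-first flow can be summed up in a few lines. The five-year buffer implies a “challenge age” of 23: anyone estimated below it falls back to verification rather than being refused outright. The thresholds follow the description above; the function itself is a simplified sketch, not any vendor’s actual decision logic.

```typescript
// Sketch of an estimation-first decision with a challenge-age buffer.
const LEGAL_AGE = 18;
const CHALLENGE_BUFFER = 5; // users estimated under 23 must verify instead

type Decision = "grant" | "fallback_to_verification";

function decideFromEstimate(estimatedAge: number): Decision {
  // Clearly older than 18 even allowing for estimation error: let them in.
  if (estimatedAge >= LEGAL_AGE + CHALLENGE_BUFFER) return "grant";
  // 18-ish or younger: require verification against a date of birth.
  return "fallback_to_verification";
}

console.log(decideFromEstimate(31)); // "grant"
console.log(decideFromEstimate(20)); // "fallback_to_verification"
```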

The big question for both providers and users is who should oversee the selfies and data: providers or third-party specialists?

If developed in-house, estimation and verification can bring challenges perhaps unique to the porn industry. Criminals target users by surreptitiously activating the camera on their device and threatening to release the footage if money isn’t handed over. Just the threat of this can lead to a payout, even without evidence that the camera was actually activated.

Mindful of a risk of blackmail or other breaches of anonymity, users may be reluctant to send a selfie to a porn site. Asking them to give up their personal data poses an even bigger challenge. Explicit website Pornhub said regulations requiring the collection of “highly sensitive personal information” could jeopardise user safety.

Social media users are already sceptical – memes have started appearing showing someone accessing a porn site and being asked for a selfie before they enter. In the US, similar worries about age checks led users to access porn sites via a virtual private network (VPN). In Utah, demand for VPNs surged by 847% the day after new age checks came into effect.

Staying in-house means having to overcome widespread concerns. Providers who are legitimate, established, and successful but owned by an international parent group may particularly struggle to persuade British users that their selfie and data will be permanently and properly safeguarded.

Expertise from Luciditi

There is an easy, trusted alternative to the in-house route. Digital ID platforms such as Luciditi create an ‘air-gapped’ solution. A specialist in age assurance, Luciditi is British, well-established, and trusted by the UK government as Britain’s first supplier of a digital PASS proof of age card. Its developers, who have a background in digitally managing sensitive NHS records, have brought Luciditi to a range of industries. Users are already sending selfies and data to Luciditi for other age-restricted products or services.

Ofcom suggests that age assurance could involve facial age estimation, photo ID matching and open banking, all of which Luciditi already performs. Luciditi securely processes all selfies and data and instantly destroys them after use. Nothing is given to a third party beyond an automated nod that a user is an adult. This meets Ofcom’s requirement for providers to take care in safeguarding privacy.

Prevention of tracking is also an important factor – not just by the site operator, but also by the data source. If a user chooses Open Banking to prove their age, their bank can’t see why they needed the check or with whom the result was shared – often called “double-blind” verification. Having certified systems handle privacy, anonymity and security is essential if the process is ever to be trusted by users.
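To show how a double-blind flow keeps the two parties apart, here is a minimal sketch built on an opaque, single-use token brokered between the data source and the site. Every name in it is an illustrative assumption; real schemes rest on certified identity infrastructure rather than an in-memory map.

```typescript
// Double-blind sketch: the bank never learns which site asked, and the site
// never learns which bank answered. All names are illustrative assumptions.
import { randomUUID } from "node:crypto";

const pendingTokens = new Map<string, { over18: boolean }>();

// Step 1: the data source (e.g. a bank) answers the broker's age query.
// It sees only the query, not the site the user is visiting.
function bankConfirmsAge(over18: boolean): string {
  const token = randomUUID();
  pendingTokens.set(token, { over18 });
  return token; // returned to the user's browser, not sent to the site directly
}

// Step 2: the site redeems the token with the broker. It receives a yes/no
// answer and never sees the bank, the account, or any personal data.
function siteRedeemsToken(token: string): boolean {
  const result = pendingTokens.get(token);
  pendingTokens.delete(token); // single use: tokens can't be replayed or tracked
  return result?.over18 ?? false;
}
```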

“We’re perfectly placed to support the adult content industry with age assurance”, said Ian Moody, Luciditi CEO, “our in-depth experience in supporting online providers of other age-restricted products means we can quickly bring sites up to the new standards set by Ofcom.”

Embedded in a provider’s site, Luciditi’s tech would operate behind the scenes, independently overseeing access. Providers could welcome new users with a message saying that access is managed by a reputable, independent third-party, safeguarding anonymity. This would assure users that they are not sending anything directly to the owners of a porn site. Additionally, providers can embed Luciditi across all their age-restricted products and services, whether relating to adult content or not.

User-generated content

As an established digital identity platform, Luciditi supports individuals as well as businesses. Users download the Luciditi app, which is free and easy to use. This lets them create their own digital identity wallet, safely storing their selfie and data and letting them breeze through an age check in a couple of taps.

This facility will benefit providers who host adult user-generated content and who need to know that performers are aged 18 or over. This issue isn’t covered by the latest guidance but will be included in Ofcom’s next update, due in spring 2024. Providers who choose to act early can future-proof their business now by addressing this issue as part of their wider approach to age assurance.

No alternatives

During the current process of consultation, which ends on March 5th, Ofcom will not be looking at softer options. For providers looking to retain their audience, age assurance is the only show in town. “Our practical guidance sets out a range of methods for highly effective age checks”, said Dame Melanie Dawes, Ofcom’s Chief Executive, “we’re clear that weaker methods – such as allowing users to self-declare their age – won’t meet this standard.”

The OSA effectively fired a starting gun. The race is now on for adult content providers to accept its provisions, take appropriate action, and adopt the tech they need before the law is enforced in or after 2025.

It’s not just about completing the work before the new measures are actively enforced. It’s about acting quickly to maintain a competitive position. Businesses that build trust early will seize the advantage in developing their market share. It’s not just the new law that providers need to be mindful of, it’s each other.

Want to know more?

Luciditi’s Age Assurance technology can help meet some of the challenges presented by the OSA. If you would like to know more, Contact us for a chat today.

Get in touch