
New age assurance guidelines for user-to-user and search platforms


Ofcom’s second consultation offers early insight into new rules

New guidelines protecting children from harmful content bring search engines and user-to-user platforms a step closer to mandatory age assurance. The draft regulations from Ofcom, the UK’s online safety regulator, are open to consultation. But they provide an early glimpse of the tough new rules that will restrict access to content from 2025.

The proposed guidelines are Ofcom’s latest response to the Online Safety Act. Passed last year, the Act will give Britain one of the toughest online regulatory systems in the world. Social media apps, search engines and other online services will need to adopt robust age checks and stop their algorithms recommending harmful content to children.

What is harmful content?

This is the second of Ofcom’s four consultation exercises on finalising the regulations that will flesh out the Act’s skeleton framework. The first, which closed in February, focused on protecting people from illegal content. The current discussions will lead to new rules designed to stop children accessing harmful content. The Act divides harmful content into three broad categories:

Primary priority content (PPC) that is harmful to children:

Pornographic content, and content which encourages, promotes, or provides instructions for suicide, self-harm, and eating disorders.

Priority content (PC) that is harmful to children:

Content which is abusive or incites hatred, bullying content, and content which encourages, promotes, or provides instructions for violence, dangerous stunts and challenges, and self-administering harmful substances.

Non-designated content that presents a material risk of harm to children:

Any type of content that does not fall within the above two categories but which presents “a material risk of significant harm to an appreciable number of UK children.”
 
Based on these definitions, Ofcom has published draft Children’s Safety Codes which aim to ensure that:

  1. Children will not normally be able to access pornography.
  2. Children will be protected from seeing, and being recommended, potentially harmful content.
  3. Children will not be added to group chats without their consent.
  4. It will be easier for children to complain when they see harmful content, and they can be more confident that their complaints will be acted on.

 

Creating a safer online environment

In a four-week period (June-July 2023), Ofcom found that 62% of children aged 13-17 encountered PPC/PC online. Research also found that children consider violent content ‘unavoidable’ online, and that nearly two-thirds of children and young adults (13-19) have seen pornographic content. The number of girls aged 13-21 who have been subject to abusive or hateful comments online has almost tripled in 10 years from 20% in 2013 to 57% in 2023.

To create a safer online environment for children, Ofcom has outlined a series of steps that search services and user-to-user platforms will be expected to take.

Online services must determine whether they are likely to be accessed by children; to help with this, Ofcom has published an online tool. Platforms that are likely to be accessed by children must:

  1. Complete a risk assessment to identify risks posed to children, drawing on Ofcom’s ‘children’s risk profiles’.
  2. Prevent children from encountering primary priority content relating to suicide, self-harm, eating disorders, and pornography. Services must also minimise children’s exposure to other serious harms defined as ‘priority content’, including violent, hateful or abusive material, bullying content, and content promoting dangerous challenges.
  3. Implement and review safety measures to mitigate the risks to children. Ofcom’s Safety Codes include more than 40 measures such as robust age checks, safer algorithms, effective moderation, strong governance and accountability, and more information and support for children including easy-to-use reporting and complaints processes.

 

Highly effective age assurance

There is no single fix-all measure that services can take to protect children online. But the package of measures recommended by Ofcom relies prominently on age assurance. Ofcom anticipates that most digital services not using age assurance are likely to be accessed by children. Once the final version of the new rules comes into force, age assurance will be mandatory.
 
In practice, this will mean that all services will have to either ban harmful content or introduce what Ofcom describes as “highly effective age-checks” restricting access to the whole platform or to the parts of it that offer adults-only content. Ofcom defines “highly effective” age assurance as being technically accurate, robust, reliable, and fair, with further detail set out in its draft guidance.
 
Regulated services will no longer be able to get away with an ineffective ‘I am 18’ button. They will need to commit to age assurance technology to ensure their services are safer by design.
 
The quickest way of doing this is to adopt a proven digital ID product, like Luciditi. Ian Moody, Luciditi co-founder and CEO, says, “Easier and more cost-effective than starting from scratch, Luciditi can be easily embedded in websites or apps, either by using a pre-built plugin or by using our Software Development Kit.”
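
As a rough sketch, the SDK route might look something like the following, where the package name, class, and options are hypothetical stand-ins rather than Luciditi’s actual API:

```ts
// Hypothetical sketch only: "age-assurance-sdk", AgeAssuranceSDK and its
// options are illustrative stand-ins, not Luciditi's actual API.
import { AgeAssuranceSDK } from "age-assurance-sdk";

const sdk = new AgeAssuranceSDK({ apiKey: "YOUR_API_KEY" });

// Mount an age gate onto the page; restricted content is revealed
// only after the widget reports a pass.
sdk.mount("#age-gate", {
  minimumAge: 18,
  onVerified: () => {
    document.querySelector("#restricted-content")?.classList.remove("hidden");
  },
  onRejected: () => {
    window.location.assign("/access-denied");
  },
});
```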
 
Ofcom has specifically said its measures will apply to all sites that fall within the scope of the Act, irrespective of the size of the business. ‘We’re too small to be relevant’ won’t wash as an excuse.
 
Services cannot refuse to take steps to protect children simply because the work is too expensive or inconvenient. Ofcom says, “protecting children is a priority and all services, even the smallest, will have to take action as a result of our proposals.”
 

“Don’t wait for enforcement and hefty fines” – Tech Sec

According to Ofcom, children who have encountered harmful content experience feelings of anxiety, shame or guilt, sometimes leading to a wide-ranging and severe impact on their physical and mental wellbeing.
 
The lawlessness exploited by some of the world’s leading social media platforms has contributed to the deaths of children like 14-year-old Molly Russell. The coroner’s report concluded that watching content promoting suicide and self-harm had contributed to Molly’s death by suicide.
 
“We want children to enjoy life online”, said Dame Melanie Dawes, Ofcom Chief Executive, “but for too long, their experiences have been blighted by seriously harmful content which they can’t avoid or control. Many parents share feelings of frustration and worry about how to keep their children safe. That must change.”
 
The consultation exercise closes on July 17, 2024. Ofcom says, “We will take all feedback into account, as well as engaging with children to hear what they think of our plans. We expect to finalise our proposals and publish our final statement and documents in spring 2025.”
 
Welcoming Ofcom’s proposals, Technology Secretary Michelle Donelan said, “To platforms, my message is engage with us and prepare. Do not wait for enforcement and hefty fines – step up to meet your responsibilities and act now.”
 
The Online Safety Act doesn’t pull its punches. Repeat offenders will potentially be fined up to £18 million or 10% of global revenue, whichever is greater, and company managers risk going to jail for up to two years. In the coming months, platforms will need to be proactive in committing to the age assurance products that will help them stay on the right side of the law.
 
In Britain at least, the carefree distribution of harmful content is about to change. Ofcom’s proposals go much further than current industry practice and demand a step-change from tech firms in how UK children are protected online.
 

Want to know more?

Luciditi’s Age Assurance technology can help companies meet these strict new guidelines. If you would like to know more, contact us for a chat today.



Understanding the Online Safety Act: Implications for Adult Sites


Ofcom calls for biometric age checks to stop children seeing adult content

Tough new guidance from Ofcom aims to protect children from seeing online pornography. The Online Safety Act, passed last autumn, restricts underage access to adult content. New details have been published explaining how this can be done through age assurance, giving digital identity platforms like Luciditi a frontline role in helping content providers stay on the right side of the law.

On average, children first see online pornography at age 13 – although more than a quarter encounter it by age 11 (27%), and one in 10 as young as nine (10%), according to research. Before turning 18, nearly eight in 10 youngsters (79%) have encountered violent pornography showing coercive, degrading or pain-inducing sex acts.

The Online Safety Act (OSA) aims to protect children by making the UK the safest place in the world to be online. Under the OSA, sites and apps showing adult content will have to ensure that children can’t access their platform.

Highly effective age checks

The new law has been described as a skeleton act. The bare bones approved by parliament will be fleshed out one topic at a time by the communications watchdog Ofcom.

Ofcom’s first update, last November, focused on protecting people from online harms. Now, its second period of consultation and guidance aims to protect children from online pornography through what it describes as “highly effective age checks.” The new guidance looks in detail at the age assurance tech that providers will need to adopt.

The porn industry has long been an early adopter of innovation – from online credit card transactions to live streaming. Age assurance, tried and trusted in other sectors, is unlikely to pose any technical challenges whether providers develop it in-house or adopt an existing product.

Businesses flouting the OSA can be fined up to £18 million or 10% of global revenue, and their directors jailed for up to two years. Nevertheless, the vast majority of adult content providers will be committed to maintaining a profitable, stable, and compliant operation that avoids tangling with the law. They don’t want kids looking at inappropriate material any more than anyone else does.

The difficulties of staying in-house

To comply with the OSA, providers must introduce age assurance – through age estimation, age verification or a combination of both.

In most cases, adults will be able to access a site through age estimation tech: AI assesses a selfie to estimate whether a user appears at least five years older than 18. Users at or near that threshold will instead be asked to verify their age through personal identity data confirming their date of birth.
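
In outline, that two-step flow reduces to a simple routing decision. A minimal sketch, with hypothetical names and the five-year buffer taken from the description above:

```ts
// Illustrative decision logic for an estimation-first age check.
// The five-year buffer mirrors the rule described above; the names
// are hypothetical, not any provider's actual code.
const THRESHOLD = 18;
const BUFFER = 5;

type Route = "allow" | "verify";

function routeUser(estimatedAge: number): Route {
  // Estimated comfortably above threshold + buffer: estimation suffices.
  if (estimatedAge >= THRESHOLD + BUFFER) {
    return "allow";
  }
  // At or near the threshold: fall back to verification against
  // identity data confirming the date of birth.
  return "verify";
}
```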

The big question for both providers and users is who should oversee the selfies and data, providers or third-party specialists?

If developed in-house, estimation and verification can bring challenges perhaps unique to the porn industry. Criminals target users by surreptitiously activating the camera on their device and threatening to release the footage if money isn’t handed over. Just the threat of this can lead to a payout, even without evidence that the camera was actually activated.

Mindful of a risk of blackmail or other breaches of anonymity, users may be reluctant to send a selfie to a porn site. Asking them to give up their personal data poses an even bigger challenge. Explicit website Pornhub said regulations requiring the collection of “highly sensitive personal information” could jeopardise user safety.

Social media users are already sceptical – memes have started appearing showing someone accessing a porn site and being asked for a selfie before they enter. In the US, similar worries about age checks led users to access porn sites via a virtual private network (VPN). In Utah, demand for VPNs surged by 847% the day after new age checks came into effect.

Staying in-house means having to overcome widespread concerns. Providers who are legitimate, established, and successful but owned by an international parent group may particularly struggle to persuade British users that their selfie and data will be permanently and properly safeguarded.

Expertise from Luciditi

There is an easy, trusted alternative to the in-house route. Digital ID platforms such as Luciditi create an ‘air-gapped’ solution. A specialist in age assurance, Luciditi is British, well-established, and trusted by the UK government as Britain’s first supplier of a digital PASS proof of age card. Its developers, who have a background in digitally managing sensitive NHS records, have brought Luciditi to a range of industries. Users are already sending selfies and data to Luciditi for other age-restricted products or services.

Ofcom suggests that age assurance could involve tech associated with facial age estimation, photo ID matching and open banking, all of which Luciditi already performs. Luciditi securely processes all selfies and data and instantly destroys them after use. Nothing is given to a third party beyond an automated nod that a user is an adult. This meets Ofcom’s requirement for providers to take care in safeguarding privacy.

Prevention of tracking is also an important factor, not just by the site operator but also by the data source. So if a user chooses open banking to prove their age, their bank can’t see why they needed it or whom they shared it with – often called “double blind” verification. Having certified systems handling privacy, anonymity and security is essential if age assurance is ever to be trusted by users.
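
Conceptually, the relying site receives nothing more than a signed assertion along these lines (a sketch; all field names are hypothetical):

```ts
// Sketch of a double-blind age assertion: the relying site learns only
// that the check passed, and the data source never learns which site
// asked. All field names are hypothetical.
interface AgeAssertion {
  overAge: boolean;   // the only claim released: over the threshold, yes or no
  checkedAt: string;  // ISO-8601 timestamp of the check
  nonce: string;      // ties the assertion to a single access request
  signature: string;  // provider's signature, proving authenticity
  // Deliberately absent: name, date of birth, selfie, bank details.
}

// The site accepts the assertion only if the signature checks out.
function acceptAssertion(
  assertion: AgeAssertion,
  verifySignature: (a: AgeAssertion) => boolean,
): boolean {
  return verifySignature(assertion) && assertion.overAge;
}
```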

“We’re perfectly placed to support the adult content industry with age assurance”, said Ian Moody, Luciditi CEO, “our in-depth experience in supporting online providers of other age-restricted products means we can quickly bring sites up to the new standards set by Ofcom.”

Embedded in a provider’s site, Luciditi’s tech would operate behind the scenes, independently overseeing access. Providers could welcome new users with a message saying that access is managed by a reputable, independent third-party, safeguarding anonymity. This would assure users that they are not sending anything directly to the owners of a porn site. Additionally, providers can embed Luciditi across all their age-restricted products and services, whether relating to adult content or not.

User-generated content

As an established digital identity platform, Luciditi supports individuals as well as businesses. Users download the Luciditi app, which is free and easy to use. This lets them create their own digital identity wallet, safely storing their selfie and data and letting them breeze through an age check in a couple of taps.

This facility will benefit providers who host adult user-generated content and who need to know that performers are aged 18 or over. This issue isn’t covered by the latest guidance but will be included in Ofcom’s next update, due in spring 2024. Providers who choose to act early can future-proof their business now by addressing this issue as part of their wider approach to age assurance.

No alternatives

During the current process of consultation, which ends on March 5th, Ofcom will not be looking at softer options. For providers looking to retain their audience, age assurance is the only show in town. “Our practical guidance sets out a range of methods for highly effective age checks”, said Dame Melanie Dawes, Ofcom’s Chief Executive, “we’re clear that weaker methods – such as allowing users to self-declare their age – won’t meet this standard.”

The OSA effectively fired a starting gun. The race is now on for adult content providers to accept its provisions, take appropriate action, and adopt the tech they need before the law is enforced in or after 2025.

It’s not just about completing the work before the new measures are actively enforced. It’s about acting quickly to maintain a competitive position. Businesses that build trust early will seize the advantage in developing their market share. It’s not just the new law that providers need to be mindful of, it’s each other.

Want to know more?

Luciditi’s Age Assurance technology can help meet some of the challenges presented by the OSA. If you would like to know more, contact us for a chat today.



The race is on to become compliant with UK online safety law


The Online Safety Act 2023 is now law and enforcement will be phased in by Ofcom. How should you and your business prepare?

Ambitious plans to make Britain the safest place to be online have recently become law. The Online Safety Act 2023 covers all large social media platforms, search engines, and age-restricted online services that are used by people in the UK, regardless of where such companies are based in the world. What does the Act mean for you and your business, and how should you prepare for it? Our complete guide to the new legislation answers five key questions.

1. What is the Online Safety Act?

The Online Safety Act (OSA) is a new set of laws aimed at creating a safe online environment for UK users, especially children. Due to be phased in over two years, the law walks a fine line between making companies remove illegal and harmful content, while simultaneously protecting users’ freedom of expression. It has been described as a ‘skeleton law’, offering the bare bones of protection which will be fleshed out in subsequent laws, regulations, and codes of practice.

The OSA has had a long and difficult journey. An early draft first appeared in 2019 when proposals were published in the Online Harms White Paper. This defined “online harms” as content or activity that harms individual users, particularly children, or “threatens our way of life in the UK, either by undermining national security, or by reducing trust and undermining our shared rights, responsibilities and opportunities to foster integration.”

The Act covers any service that hosts user-generated images, videos, or comments, available to users in the UK. It includes messaging applications and chat forums, and therefore potentially applies to the major social media platforms such as X (Twitter), TikTok, Facebook, Instagram, BeReal, Snapchat, WhatsApp, YouTube, Google, and Bing.

Fears of a censor’s charter

Early drafts of the “Online Safety Bill” were described by critics as a censor’s charter, and parts of it have been rewritten over time. The current version might have had its claws clipped but it still has teeth. Repeat offenders will potentially be fined up to £18 million or 10% of global revenue, whichever is greater, and company managers risk going to jail for up to two years.

The Act provides a ‘triple shield’ of protection. Providers must:

  • remove illegal content
  • remove content that breaches their own terms of service
  • provide adults with tools to regulate the content they see

Children will be automatically prevented from seeing dangerous content without having to change any settings.

2. What are the key provisions of the OSA?

The Act targets what it describes as “regulated services”, specifically large social media platforms, search engines, or platforms hosting other user-to-user services (for example, of an adult nature), along with companies providing a combination of these. Precisely which providers the Act will affect won’t be known until the government publishes further details about thresholds.

Providers will have to comply with measures in four key areas:

  • removing illegal content
  • protecting children
  • restricting fraudulent advertising
  • tackling communication offences, such as the spreading of fake but harmful information

2.1 Illegal content

Providers will be expected to prevent adults and children from accessing illegal content. Previously, this was largely restricted to material associated with an act of terrorism or child sexual exploitation. Under the new law, illegal content also includes anything that glorifies suicide or promotes self-harm. Content that is illegal, and will need to be removed, includes:

  • child sexual abuse
  • controlling or coercive behaviour
  • extreme sexual violence
  • fraud
  • hate crime
  • inciting violence
  • illegal immigration and people smuggling
  • promoting or facilitating suicide
  • promoting self-harm
  • revenge porn
  • selling illegal drugs or weapons
  • sexual exploitation
  • terrorism

Guidance published by the government explains that “platforms will need to think about how they design their sites to reduce the likelihood of them being used for criminal activity in the first place.”

The largest social media platforms will have to give adults better tools to control what they see online. These will allow users to avoid seeing material that is potentially harmful but which isn’t criminal (providers will have to ensure that children are unable to access such content).

The new tools must be effective and easy to access and could include human moderation, blocking content flagged by other internet users, or sensitivity and warning screens. They will also allow adults to filter out contact from unverified users, which will help stop anonymous trolls from reaching them.

2.2 Protecting children

The OSA affects material assessed as being likely to be seen by children. Providers will have to prevent children from accessing content regarded either as illegal or harmful. The government’s guidance suggests that the OSA will protect children by making providers (in particular social media platforms):

  • prevent illegal or harmful content from appearing online, quickly removing it when it does.
  • prevent children from accessing harmful and age-inappropriate content.
  • enforce age limits and age checking measures.
  • ensure the risks and dangers posed to children on the largest social media platforms are more transparent, for example by publishing risk assessments.
  • provide parents and children with clear and accessible ways to report problems online when they do arise.

“Harmful” is a grey area. The Act gives the government minister responsible for enforcing the new law (the secretary of state) the power to define “harmful”. The OSA suggests the minister will do so where there is “a material risk of significant harm to an appreciable number of children.” According to the government guidance, harmful content includes:

  • pornographic material
  • content that does not meet a criminal level but which promotes or glorifies suicide, self-harm or eating disorders
  • content that depicts or encourages serious violence
  • online abuse, cyberbullying or online harassment

Social media companies set age limits on their platforms, usually excluding children younger than 13. However, many younger children have accounts. The OSA aims to clamp down on this practice.

2.3 Fraudulent advertising

Under the OSA, providers will have to prevent users from seeing fraudulent advertising such as ‘get rich quick’ scams. An advert will be regarded as fraudulent if it falls under a wide range of offences listed in the Act, from criminal fraud to misleading statements in the financial services area. For large social media platforms and search engines, advertising material is fraudulent if it:

    (a) is a paid-for advert
    (b) amounts to an offence (the OSA lists possible fraud offences), and
    (c) (in the case of social media) is not an email, SMS message, or other form of messaging as listed
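
Read as code, the statutory test is a conjunction of those three conditions. An illustrative predicate, offered as one reading of the Act rather than an official interpretation:

```ts
// Illustrative reading of the OSA's three-part test for a fraudulent
// advert; a sketch of the statutory logic, not an official implementation.
interface Advert {
  paidFor: boolean;                // (a) a paid-for advert
  amountsToListedOffence: boolean; // (b) one of the fraud offences the OSA lists
  isPrivateMessage: boolean;       // (c) an email, SMS or similar message
}

function isFraudulentAd(ad: Advert, isSocialMedia: boolean): boolean {
  return (
    ad.paidFor &&
    ad.amountsToListedOffence &&
    // the messaging carve-out in (c) applies to social media platforms
    (!isSocialMedia || !ad.isPrivateMessage)
  );
}
```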

Social media platforms and search engines must:

  • Prevent individuals encountering fraudulent adverts
  • Minimise the length of time such content is available
  • Remove material (or stop access to it) as soon as they are made aware of it

Providers must also include clear language in their terms of service regarding the technology they are using to comply with the law.

2.4 Communication offences

Under the OSA, an offence of “false communication” is committed if a message is sent by someone who intended it “to cause non-trivial psychological or physical harm to a likely audience.” The law applies to individual users and “corporate officers” (who can be guilty of neglect), but excludes news publishers and other recognised media outlets.

An offence of “threatening communication” would be committed if someone sends a message that threatens death or serious harm (assault, rape, or serious financial loss), with the intent of making the recipient fear that the threat would be carried out.

The Act also makes it illegal to encourage or assist an act of self-harm. A crime occurs if an offending message is sent, or if someone is shown an offending message (whoever originally wrote it).

Sending flashing images can also be regarded as an offence. Possible prison sentences under this part of the OSA vary depending on the offence but can be up to five years. A company need not be a provider of regulated services to be caught by this part of the law.

Amendments will be made to the Sexual Offences Act 2003, making it illegal to share or threaten to share intimate pictures if the offender was seeking to cause distress.

3. What are the requirements for age assurance tech?

In recent years, the UK has been edging ever closer to adopting an online age verification system. After passing the Digital Economy Act (2017), Britain became the first country to allow such a system to be implemented. Websites selling pornography would have had to adopt “robust” measures that stopped children accessing their content. However, enforcing this was easier said than done.

The possibility of a wide variety of porn outlets around the world collecting the personal identity data of UK users led to concerns about breaches of the General Data Protection Regulation (GDPR). The scheme was abandoned in 2019, and at that point the baton was passed to the OSA.

Whether adults accessing pornography will encounter mandatory age assurance under the OSA is still the subject of legislative debate. However, adult content providers will need to ensure that children are not able to see such material. The Act says:

“A provider is only entitled to conclude that it is not possible for children to access a service, or a part of it, if age verification or age estimation is used on the service with the result that children are not normally able to access the service.”

This may then lead to providers committing to age assurance by default to ensure compliance. In its final version, the Act tightens up definitions of ‘assurance’, clarifying how and when this may be provided – whether by estimation tech, verification measures, or both.

Digital ID providers, such as our own platform Luciditi, use age estimation AI to give quick and easy access to the majority of users. Those close to the age threshold will need to arrange access via age verification, which relies on personal data. Luciditi only sends a simple ‘yes’ or ‘no’ reply to online age-restricted access requests. The data itself is securely managed and can’t be seen by third parties. Keeping business operations compliant, Luciditi can be embedded in a client’s website by developers (ours or yours), or even simply via a plug-in.
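
On the integration side, the yes/no model means a relying site only ever handles a boolean. A minimal sketch, assuming a hypothetical endpoint rather than Luciditi’s documented API:

```ts
// Minimal sketch of gating access on a provider's yes/no reply.
// The endpoint URL, request shape and response field are assumptions
// for illustration; the real integration will differ.
async function isOverEighteen(sessionToken: string): Promise<boolean> {
  const response = await fetch("https://age-provider.example.com/v1/check", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ token: sessionToken, minimumAge: 18 }),
  });
  const { allowed } = (await response.json()) as { allowed: boolean };
  return allowed; // a bare yes/no: no date of birth, no identity data
}
```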

Under the terms of the Act, providers will have to say what technology they are using, and show they are enforcing their age limits. More detail is expected to be given by Ofcom (see 4, below) later this year. In June 2023, Ofcom said:

“Online pornography services and other interested stakeholders will be able to read and respond to our draft guidance on age assurance from autumn 2023. This will be relevant to all services in scope of Part 5 of the Online Safety Act.”
[Part 5 relates to online platforms showing pornographic content].

Lawyer Nick Elwell-Sutton notes that “whether age verification for children will be a mandatory requirement is still the subject of ongoing consultation, but many service providers may voluntarily seek to implement verification at age 18 to avoid the more stringent child safety requirements.”

Age assurance technology will likely need to conform to existing government standards, including the UK Digital Identity and Attributes Trust Framework (DIATF). Introduced in 2021, DIATF sets the rules, standards, and governance for its digital identity service providers, like Arissian who developed Luciditi. One of the key principles behind DIATF is the need for mutual trust between users and online services, a principle it shares with the OSA.

Iain Corby, executive director for the Age Verification Providers Association, a politically neutral trade body representing all areas of the age assurance ecosystem, commented: “For too long, regulators have neglected enforcement of age restrictions online. We are now seeing their attention shift towards the internet, and those firms which offer goods and services where a minimum age applies, should urgently implement a robust age verification solution to avoid very heavy fines.”

4. How will the OSA be enforced?

Not without difficulty. The OSA will be enforced by Ofcom, the UK’s communications regulator. Ofcom will prepare and monitor a register of providers covered by the law, which may include up to 100,000 companies.

The government funded Ofcom in advance to ensure an immediate start. However, providers will soon have to finance the new measures themselves through regular fees to Ofcom.

Ofcom will not be pursuing individual breaches of the law. It will instead focus on assessing how a provider is meeting the new requirements overall, with those falling short risking the fines detailed above. Ofcom will have powers of entry and inspection at a provider’s offices.

In the most extreme cases, with the agreement of the courts, Ofcom will be able to require payment providers, advertisers and internet service providers to stop working with a site, preventing it from generating money or being accessed from the UK.

Criminal action will be taken against those who fail to follow information requests from Ofcom. Senior managers can be jailed for up to two years for destroying or altering information requested by Ofcom, or where a senior manager has “consented or connived in ignoring enforceable requirements, risking serious harm to children.”

The new law will come into effect in a phased approach:

Phase 1: illegal harms duties. Codes of practice are expected to be published soon after the Act becomes law.

Phase 2: child safety duties and pornography. Draft guidance on age assurance is due to be published from autumn 2023.

Phase 3: transparency and user empowerment. This is likely to lead to further laws covering additional companies.

5. How should businesses be preparing for the OSA?

While the OSA mainly targets social media platforms and search engines, its measures are of general application. In other words, any business could face enforcement if its actions fall within the scope of the new law.

Businesses concerned about the OSA are advised to carry out a risk assessment covering products and services, complaints procedures, terms of service, and internal processes and policies. Companies should also assess how likely their platforms/products are to be accessed by children.

In particular, businesses will need to identify potential development work, as many obligations imposed by the OSA will require technical solutions and backend changes. It is also worth seeking further advice from a legal perspective.

Conclusions

The Wild West nature of the internet is notoriously difficult for any one country to tame. Things will be easier for the UK now that the EU’s Digital Services Act has come into effect, forcing more than 40 online giants including Facebook, X, Google, and TikTok to better regulate content delivered within the EU.

Nevertheless, the UK faces a lonely battle with leading providers, especially those concerned about a part of the Act aimed at identifying terrorism or child sexual exploitation and abuse (CSEA). Until very recently, it had been expected that Ofcom would be able to insist that a provider uses “accredited technology” to identify terrorism or CSEA content. In other words, a service like WhatsApp – which allows users to send encrypted messages – must develop the ability to breach the encryption and scan the messages for illegal content.

No surprise then that WhatsApp isn’t happy at what has been described as a ‘backdoor’ into end-to-end encryption measures. In April, the platform threatened to leave the UK altogether. Signal and five other messaging services expressed similar concerns. In response, the government has assured tech firms they won’t be forced to scan encrypted texts indiscriminately. Ofcom will only be able to intervene if and when scanning content for illegal material becomes “technically feasible.”

Ofcom will also be able to compel providers to reveal the algorithms used in selecting and displaying content so that it can assess how platforms prevent users from seeing harmful material.

These challenges notwithstanding, campaigners from all sides agree that something is needed even if some remain sceptical about the OSA. Modifications were made to the Act in June, in part guided by the inquest into the death of Molly Russell. In 2017, Molly died at the age of 14 from an act of self-harm after seeing online images that, according to the coroner, “shouldn’t have been available for a child to see.” The OSA may not be perfect. But for the sake of children across the country, it’s at least a step in the right direction.

Want to know more?

Luciditi’s Age Assurance technology can help meet some of the challenges presented by the OSA. If you would like to know more, contact us for a chat today.



UK’s first PASS approved Digital Proof of Age Card set to reduce fraud and retailer prosecution


The threat of retailers facing prosecution for accepting fake ID cards at the point of purchase is a step closer to being eradicated this autumn.

Luciditi Age Proof is an accredited Proof of Age Standards Scheme (PASS) digital card that is entering its live testing phase and has been designed by digital identity platform Luciditi to safeguard young people needing to verify their age.

Providing added peace of mind to retailers, the digital Age Proof cards offer much greater security from fraud, unlike physical ID, which is routinely forged.

The move towards digital is set to be well received by young people, with 94% of respondents in favour of a digital form of ID that they can use on their phone, according to an auditor community survey by Serve Legal, the UK market leader in age-verification auditing. A further 90% felt it would be more desirable than carrying a physical ID card such as a driving licence or passport.

Ian Moody, co-founder and CEO for Luciditi, commented: “We’re very excited at the prospect of launching the UK’s first PASS digital proof of age card as it will completely transform the way age is verified at the point of purchase. Age Proof provides enhanced data security and convenience for young people, whilst eliminating the current threat of prosecution caused by retailers accepting fraudulent physical cards.”

Age Proof cards will be able to be accessed via a smartphone app and harness QR technology to provide real-time verification – offering speed, convenience and greater data protection to young people aged 16+ and 18+.

The 16+ digital cards provide a host of benefits, allowing this age group to legally purchase energy drinks, age-restricted computer games or music and over-the-counter medication such as paracetamol, through to buying a pet or getting body piercings without parental consent.

The 18+ card includes features such as legal entry into a pub or gambling venue, the purchase of cigarettes or vapes, and getting a tattoo. The only current restriction is that the Licensing Act only recognises physical cards bearing the PASS hologram and logo as acceptable proof of age when purchasing alcohol. Changes to the law anticipated in 2024 will add digital PASS, making it permissible to buy alcohol using Age Proof.

Ian Moody from Luciditi added: “Fake ID cards, whether they be forged physical identity cards, driving licences or even passports, have become more and more sophisticated in recent years and have proven a major headache for retailers. We’ve developed a standards-based digital solution which brings the UK a giant step closer to eradicating the problem completely.”

Tony Allen is executive director for Age Check Certification Scheme, the world’s first dedicated conformity assessment body on age assurance and the PASS-appointed independent auditors. He commented: “The Luciditi Age Proof system has been subject to a rigorous audit to the published PASS standards, including ensuring security, privacy, accuracy and the inclusion of anti-fraud measures.

“It’s also a significant boost to the retail sector and is a precursor to enhanced automation, with work already underway to establish a universal transaction method across all retailers that avoids consumers needing multiple apps to verify their age. The implementation of digital verification technology across all self-checkouts and electronic point of sale (EPOS) systems will be key to this and is expected to be rolled out as soon as the Home Office complete their forthcoming consultation on digitally enhanced transactions for alcohol.”

Prior to the official launch of Age Proof, Luciditi is inviting one thousand 16-25 year olds to receive a free digital card in return for their feedback. The company is also working with Serve Legal to roll-out a testing programme on products such as vapes, energy drinks and gambling scratch cards across retailers ranging from supermarkets to convenience stores.

Want to know more?

Visit the dedicated Age Proof page, or to learn more about how Digital Credential technology can help your business, contact us for a chat today.



Online retailers warned of the impact of upcoming Online Safety Bill

Online retailers selling age-restricted products are exploiting a current lack of regulation and face being unprepared for the Online Safety Bill due this autumn.

The warning to firms comes as a study conducted by digital identity platform Luciditi reveals that weapons, alcohol and vapes are just some of the items that children as young as seven years old can purchase online without having to securely verify their age.

Although the government’s forthcoming Online Safety Bill, anticipated later this year, will tighten regulations, there is currently no robust law enforcement in place to prevent companies selling to under-age people online.

Iain Corby, executive director for the Age Verification Providers Association, a politically neutral trade body representing all areas of the age-assurance ecosystem, commented: “For too long, regulators have neglected enforcement of age restrictions online. We are now seeing their attention shift towards the internet, and those firms which offer goods and services where a minimum age applies, should urgently implement a robust age verification solution to avoid very heavy fines.”

Ian Moody, co-founder and CEO for Luciditi, commented: “The law is very clear in that it is an offence to sell items such as weapons, alcohol and vapes to under-age children online and yet our study reveals that many online retailers still don’t have either the technological capabilities or the appetite to adhere to it.”

The research coincides with the launch of Luciditi’s new online age check solution, Age Assurance. It can be deployed across an online retailer’s website or mobile app and simply requires shoppers to take a selfie prior to accessing the site. Within seconds it confirms whether they are over a certain age without revealing their identity – the digital equivalent of being age-estimated at the checkout of a supermarket or off-licence by a member of staff.

Ian added: “Under-age online purchases are going largely unchallenged and so we feel we’ve developed a solution that enables online companies to protect young people, whilst simplifying the transaction process.

“We will naturally welcome the Online Safety Bill when it is unveiled but it will take at least 18 months for the new regulations to be enforced. We’d urge businesses across online retailing to take steps now to safeguard young people online rather than wait until the eleventh hour to take decisive action.”

Want to know more?

If you would like to learn more or try out the Luciditi Age Assurance Plugin for WordPress or understand how Luciditi Age Assurance technology can help your website or business, contact us for a chat today.



Age Assurance for Online Vaping Sites


Easier age assurance comes ahead of anticipated government crackdown

A smart, new plugin from Luciditi can help suppliers of online vape products stay on the right side of the law. It comes amid growing pressure to stop children accessing e-cigarettes, amid concerns about the long-term impact on young people’s lungs, hearts, and brains.

Data from campaigners Action on Smoking and Health shows that experimental use of e-cigarettes among 11-to-17-year-olds is up 50% on last year. The Royal College of Paediatrics and Child Health (RCPCH) has repeatedly expressed concerns about the potential damage to young people’s health caused by vaping.

On June 6, the RCPCH called for an outright ban on single-use e-cigarettes. Disposable vapes can be bought for just £1.99 and are especially popular among young people. Dr Mike McKean, the RCPCH vice-president and a paediatric respiratory consultant, said concerns stem from the “epidemic” of child vaping and the small but growing numbers of children with respiratory problems.

It’s illegal for anyone under 18 to buy or use a vape in the UK. But restrictions are easily sidestepped, both on the high street and online. In April, health minister Neil O’Brien launched a £3 million ‘illicit vapes enforcement squad’, targeting high street retail. Clamping down on illicit online sales is an inevitable next step.

Ground-breaking age assurance plugin

Responsible online suppliers face a dilemma. They depend on an easy sales process where adults can quickly find and buy what they want. Simultaneously, they must also identify and exclude youngsters.

Many websites have little choice but to use a self-asserting button, asking potential customers whether they’re aged 18 or over. Kids click yes, and they’re in. It’s like a door in a nightclub where instead of doormen checking ID there’s a small sign asking ‘are you 18?’

Now, a ground-breaking plugin from Luciditi offers a proven and cost-effective alternative. The first of its kind, the plugin gives websites a reliable, quick and streamlined route to age assurance.

The Luciditi Age Assurance plugin uses AI that is currently trusted by clients in a range of age-restricted industries. It uses military-grade security and was built by Birmingham-based developers Arissian, a team whose roots can be traced to software that protects the clinical documents of two-thirds of the UK population.

Until now, businesses wanting to embed age assurance tech in their website have had to go the long way round, involving additional coding. Attempting this work in-house carries huge costs in time and money. By downloading the Luciditi plugin for WordPress and setting up a Luciditi business account, companies looking for a simpler solution can now upgrade their age assurance without embarking on an IT project.

Simple as a selfie

“Setup is a piece of cake”, says Ian Moody, Arissian’s MD and co-founder, “once you’ve set the minimum age, and added the API key and company logo, your entire site is age-restricted.”

The plugin generates a landing page, with a greeting (written by the client) and a request for a selfie. Luciditi’s digital identity technology examines the image and estimates whether the person appears older than 18 (or whatever age you choose), in a process that takes just a few seconds. Adults are then allowed unrestricted access to the site.

Luciditi’s cutting-edge ‘liveness’ technology prevents attempts to spoof a selfie. So-called presentation attacks are weeded out by counter-measures that spot printed photos, cutout masks, digital and video replay attacks, and 3D latex masks.

“The plugin operates with minimum friction”, explains Ian Moody, “only challenging visitors for proof when necessary. Age assurance is bypassed for future visits using the same browser.”
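
One common way to implement that kind of same-browser bypass is an expiring marker stored locally after a successful check. A sketch under that assumption, not a description of the plugin’s internals:

```ts
// Sketch of a same-browser bypass: after one successful check, store an
// expiring marker locally and skip the challenge on later visits.
// The key name and lifetime are illustrative, not the plugin's internals.
const STORAGE_KEY = "ageCheckPassedUntil";
const THIRTY_DAYS_MS = 30 * 24 * 60 * 60 * 1000;

function recordPass(): void {
  localStorage.setItem(STORAGE_KEY, String(Date.now() + THIRTY_DAYS_MS));
}

function hasValidPass(): boolean {
  const until = Number(localStorage.getItem(STORAGE_KEY) ?? "0");
  return Date.now() < until;
}

// On page load: only challenge visitors who lack a valid marker.
if (!hasValidPass()) {
  // show the age-assurance landing page, then call recordPass()
  // once the check succeeds
}
```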

Only people who look around the specified age are likely to be asked for proof of age.  They are guided through an extra step which confirms their date of birth by supplying an image of government-approved ID. A match against the ID document photo confirms that they are the document holder.   However, this data goes no further than Luciditi.

Anonymity guaranteed 

Having verified a person’s age, the app then simply sends a nod of approval to the plugin without revealing any further information; it doesn’t even release the date of birth. The plugin doesn’t store selfies, biometric data that enables age estimation, names, addresses, dates of birth, or any ID document details. Despite supplying a selfie or ID data, anyone buying vape products from a protected site will remain anonymous.

Users who have already verified their identity in the Luciditi app can access a protected website with a few taps, without needing a selfie or ID documents. Having given their data to the app, users can access any age-restricted website that relies on Luciditi’s tech, knowing that their personal details are securely protected and will never be sent to a third party.

For a modest annual subscription, paid monthly, vendors can find peace of mind knowing that they’ve put themselves and their business on the right side of the law, amid mounting pressure for action against sites that put children at risk.

A catalyst for change

The RCPCH’s call for a ban on disposable vapes would bring the UK more closely in line with comparable countries. New Zealand has recently banned most disposable vapes, Australia has made vaping prescription-only, and tight restrictions have been implemented in Scotland, France, Germany and Ireland.

The RCPCH’s call came on the final day of a two-month trawl for ideas on restricting under-age vaping. The consultation, led by the Department of Health, should be “the catalyst for change that is so urgently needed”, said the children’s commissioner for England, Rachel de Souza. Ministers are now considering this new evidence ahead of potential further steps in tightening the law.

Evidence submitted to the government includes data from compliance auditing firm Serve Legal. Having conducted over 22,000 e-cigarette purchases since January 2021, in-store and online, Serve Legal found that “one in four auditors are being sold e-cigarettes without being asked for age verification or identification.”

Three days after the RCPCH’s call for a ban on disposable vapes, Rachel de Souza echoed their demand and urged ministers to crack down on the marketing of vapes to young people. De Souza said: “We urgently need stricter regulation of this ‘wild west’ market. It is insidious that these products are intentionally marketed and promoted to children, both online and offline.”

In the face of mounting pressure, it’s a matter of when rather than if the government will take tougher action against websites that continue to put children’s health at risk. Luciditi’s plugin is an easy solution for vendors who share their concern.

Want to know more?

If you would like to learn more or try out the Luciditi Age Assurance Plugin for WordPress or understand how Luciditi Age Assurance technology can help your website or business, contact us for a chat today.



Cut online sales of vape products to children


Digital ID platform Luciditi is supporting moves to restrict vaping among children and teenagers – which is at ‘epidemic’ levels according to a senior health official. Doctors fear that without urgent action a generation could develop long-term addictions and lung damage. Health ministers are calling for ideas on how to clamp down on the illegal sale of e-cigarettes to under 18s. In response, Luciditi is ready to suggest solutions based on its pioneering age assurance tech.

Vaping is becoming an epidemic among teenagers, according to Dr Mike McKean, vice-president for policy at the Royal College of Paediatrics and Child Health. In 2021, a survey from campaign group Action on Smoking and Health (ASH) of 2,109 children and teens found that 11.2% of 11-17 year olds had tried vaping. By 2022, this had risen to 15.8%. Among 18-year-olds, 40.8% said they have tried an e-cigarette.

The ASH survey stresses that the vast majority of children (83.8%) have never tried or are unaware of e-cigarettes, nevertheless the figures suggest that more than 800,000 children in the UK have illegally accessed a product that can contain nicotine. According to the US government’s Centers for Disease Control, ‘most e-cigarettes contain nicotine. Nicotine is highly addictive and can harm adolescent brain development, which continues into the early to mid-20s.’

In recent years, under-age use of tobacco cigarettes in the UK has been falling, in line with long-term downward trends. Yet little is being done to effectively curb the marketing and selling of vape products to teenagers. Regulations restricting the age-group’s online access to e-cigarettes are not easily enforced.

Illicit vapes enforcement squad

While the sale of tobacco and e-cigarettes to people under the age of 18 is an offence, ASH found that children continue to be sold both. Among children buying both products online, ASH found that purchases of e-cigarettes (10%) are much more common than tobacco cigarettes (4%).

Children would be better protected by tougher enforcement of online age restrictions, which could be supported by Luciditi’s developments in artificial intelligence. Luciditi would allow easy access to vape sites for anyone over 18 while helping to safeguard children’s health.

In April 2023, the government announced that £3m will be invested in an ‘illicit vapes enforcement squad’, led by Trading Standards officers. This will initially conduct test purchases and remove banned products from shops and at borders, but the government is also looking for other ways of restricting sales.

The Health Department is calling for evidence to “identify opportunities to reduce the number of children accessing and using vapes, while ensuring they remain available as a quit aid for adult smokers.” In response, Ian Moody, Luciditi CEO, said: “The need for a workable twin-track approach, assisting adults while helping children, is something that Luciditi is well equipped to support.”

The government’s new measures are backed by the UK Vaping Industry Association (UKVIA). John Dunne, Director General of the UKVIA, said: “The law is absolutely clear – it is illegal to sell vapes to U18 year olds and therefore it is a criminal offence to do so…There is no doubt that action directed at those illegally selling vape products to children is the way forward.”

Using age assurance to restrict under-age sales

Online sales to children could be reduced by requiring vape websites to restrict access via a secure, digital gateway. This would be better than the revolving door that many currently favour, in the form of a flimsy ‘security question’ along the lines of ‘are you of legal smoking age?’ When it comes to online age restrictions, kids are effectively being asked to mark their own homework.

A digital gateway such as Luciditi would give vendors the assurance they need that users were aged 18 or older. Currently supporting businesses in a range of sectors, Luciditi’s configurable user interface tech can be readily dropped into any website or app.

Luciditi offers two options in managing age assurance. The quickest way of automatically recognising adults and allowing them access is via age estimation. The user simply takes a selfie through their webcam or mobile and the Luciditi Age Estimation feature takes care of the rest.

Supported by an ethical and continually trained machine-learning model, the system is highly capable of telling the difference between children and adults. By verifying ‘liveness’ of the individual, Luciditi is able to recognise and reject manipulated images, latex masks and other spoofing attempts.

Once a vendor has adopted Luciditi’s tech, users can quickly and freely access the website. Some businesses might be able to qualify more than two-thirds of users through estimation, depending on the average age of their customers.

A rounded approach to restrictions

Estimation technology decides whether a user fits into the appropriate age-range. Working with a tolerance of plus or minus six years, Luciditi grants swift access to the vast majority of adults. For those at the younger end of the range, Luciditi’s second option in age assurance is initiated automatically. This relies on verification, confirmed via proof of age.

Using a suitable form of ID, the app scans the document, verifies that it’s genuine and that it belongs to the holder in a quick and free process proving their age. Protected by Luciditi’s military-grade security, the data is never forwarded to anyone or anywhere else. Vendors would never see customers’ data. They simply receive instant assurance from Luciditi that the customer is older than 18.

Luciditi Age Assurance

Users who already hold a free Luciditi account, can securely access websites requiring age verification simply by giving approval to a notification sent to their device – without having to scan ID documents or release personal data. Given the often Wild West ways of the internet, the secure protection of data is essential. Thin regulations are hard to enforce and users can’t always be sure who they’re buying from. By restricting access to their personal data, users cut the risk of identity fraud by making it harder for unauthorised third parties to get hold of personal information, whether accidentally or deliberately.

Luciditi’s capabilities could help sites that are struggling to reduce vape sales to children. For health minister Neil O’Brien, the call for action is designed to “clamp down” on those businesses that are “getting children hooked on nicotine. Our call for evidence will also allow us to get a firm understanding of the steps we can take to reduce the number of children accessing and using vapes.”

However, health officials say more still needs to be done. Dr Mike McKean said: “We’re relieved that the UK government has started to focus on the rising levels of children and young people picking up e-cigarettes, but an enforcement squad is just the tip of the iceberg.”

While the government’s announcement initially targets shops and borders, this may prove a hollow victory if online sales are allowed to continue untouched. Luciditi’s Ian Moody believes that “age assurance tech can support an online component of the government’s new measures.” A rounded approach is necessary, working both in retail and online. Without it, kids will continue to experiment, sparking further concerns about their long-term health.

Want to know more?

If you would like to find out how Luciditi Age Assurance technology can help restrict e-cigarettes to adults, contact us for a chat today.



Safe and Reliable Age Assurance


Digital ID Platform Luciditi is setting a new standard in age assurance, in support of tighter internet regulation. The Online Safety Bill, currently going through parliament, aims to restrict anonymous access to adult content. The planned new law has led to concerns about an era of ‘Big Brother’ oversight, along with new risks of ID fraud. Luciditi offers a solution to both these fears, paving the way in age assurance and trust.

The Online Safety Bill promises to tackle a range of potentially harmful content, including trolling, and underage access to pornography. The measures would create a new duty of care for online platforms, which if breached could lead to fines of up to £18 million or 10% of their annual turnover, whichever is higher.

Digital Secretary Oliver Dowden said the “ground-breaking laws” would “usher in a new age of accountability for tech and bring fairness and accountability to the online world”. Dowden said the measures “will protect children on the internet, crack down on racist abuse on social media and, through new measures to safeguard our liberties, create a truly democratic digital age.”

However, campaigners have argued that the bill flies in the face of the government’s attempts to strengthen free speech elsewhere in Britain. Mark Johnson, legal and policy officer at civil liberties and online privacy group Big Brother Watch, said: “The Online Safety Bill introduces state-backed censorship and monitoring on a scale never seen before in a liberal democracy.”

Yet, until the bill is passed, the need for accountability across the internet in the UK remains unmet. Current levels of protection for children and vulnerable people are inadequate, as the tragic case of 14-year-old Molly Russell demonstrated.

After Molly took her own life in 2017, an inquest found that she “subscribed to a number of online sites…some of these sites were not safe as they allowed access to adult content…The way that the platforms operated meant that Molly had access to images, video clips and text concerning or concerned with self-harm, suicide or that were otherwise negative or depressing in nature. The platform operated in such a way using algorithms as to result, in some circumstances, of binge periods of images, video clips and text some of which were selected and provided without Molly requesting them.”

The new laws would aim to stop children seeing such content, by requiring platforms to restrict access to underage users. Melanie Dawes, chief executive of Ofcom, the agency that will enforce the new measures, believes the regulations will take “us a step closer to a world where the benefits of being online, for children and adults, are no longer undermined by harmful content.”

At the moment, the half-hearted ‘I’m 18’ checkboxes used at the entry points of adult websites offer only a thin veneer of responsibility. In truth, they are ineffective at stopping children and vulnerable individuals from accessing inappropriate content.

Once the bill becomes law, platforms offering adult content, or age-restricted goods or services, will need stronger assurance that users meet minimum age requirements. Verifying age often involves an individual sending images of personal documents, so the increased demand for personal data the bill will create is a concern for campaigners. It also raises the risk of online identity fraud.

Mishandled sensitive data is open to abuse. Individuals can only hope that their personal information will be safeguarded once it is in the hands of the online company they sent it to. However, not all online businesses are transparent about their ownership, processes or values, raising questions of trust.

Once the new law is passed, platforms and their users will need a quick, painless sign-up process that provides assurance on age yet is as risk-free as possible. In facilitating this level of assurance, while still preserving trust, Luciditi has a gold-standard pedigree.

Developed in-house over four years by UK tech company Arissian, Luciditi builds on the team’s previous venture in the UK healthcare market: Docman, which handles clinical documents for two-thirds of the UK population, gave Arissian proven experience in bringing iron-clad security to sensitive personal data at volume.

Already supporting clients across a range of sectors, Luciditi is set to lay down new standards in age assurance. The Age Verification Providers Association (AVPA) breaks age assurance down into two categories: estimation and verification. Offering a smart approach to both, Luciditi delivers the holy grail: strong assurance (including verification when necessary) without requiring individuals to give their personal data to an unknown online business.

Luciditi addresses the first of AVPA’s two categories, age estimation, through AI analysis of a selfie. This cuts out the need for sensitive personal data: the user simply takes a selfie using their webcam or mobile, and the Luciditi Age Estimation feature, embedded within the app, takes care of the rest.

Using a restricted, ethical and continually trained machine-learning model, Luciditi first checks for a good-quality image. Highly proficient in differentiating between children and adults, the app also looks for ‘liveness’ in the individual, guarding against manipulated images, latex masks and other spoofing attempts. Once these image checks are complete, the app estimates the individual’s age, with the whole process normally taking just a few seconds.

With AI, there’s no need for someone aged 52 to give away their passport details simply to prove they’re not 15. Allowing a tolerance of plus or minus seven years, Luciditi can quickly and reliably show that the majority of users fall into the appropriate age range. Some businesses might be able to qualify more than two-thirds of users through estimation, depending on the average age of their customer base.
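
A minimal sketch of that ordering of checks, with placeholder functions standing in for the real model, might look like this; everything below is an assumption for illustration, not Luciditi’s implementation.

```typescript
// Illustrative ordering of the checks described above; each declared
// function is a placeholder standing in for Luciditi's real model.
declare function isGoodQuality(selfie: ImageData): boolean;
declare function passesLivenessChecks(selfie: ImageData): boolean;
declare function estimateAge(selfie: ImageData): number;

type EstimationOutcome =
  | { ok: true; estimatedAge: number }
  | { ok: false; reason: string };

function runEstimation(selfie: ImageData): EstimationOutcome {
  if (!isGoodQuality(selfie)) {
    return { ok: false, reason: "retake selfie" }; // blur, poor lighting, etc.
  }
  if (!passesLivenessChecks(selfie)) {
    return { ok: false, reason: "possible spoof" }; // manipulated image or mask
  }
  // The plus-or-minus-seven-year tolerance is applied to this estimate
  // before a user is waved through or referred to verification.
  return { ok: true, estimatedAge: estimateAge(selfie) };
}
```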

Ian Moody, Luciditi CEO, says: “Age estimation is useful in situations where you need to introduce a ‘low friction age check’ without needing an exchange of personal data.”

When Luciditi can’t be sure that someone meets minimum age requirements, verification is needed. Here, Luciditi meets AVPA’s second – more stringent – level of age assurance. Users in this category will need to send personal data, but they reduce risk by sending it only to Luciditi.

Serving as a trusted middleman, Luciditi receives personal data from an individual and then assures a client business that age has been verified, giving them no more information than that. Luciditi simply gives the business a true/false decision on proof of age, the same degree of confirmation that comes with age estimation.
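
The design point here is data minimisation: the relying business receives a bare yes/no, never the underlying document. An assurance response might be shaped like the following invented example; the field names are ours, not Luciditi’s.

```typescript
// An invented shape for the assurance a business might receive; note the
// absence of any name, date of birth or document details.
interface AgeAssuranceResponse {
  requestId: string; // correlates the check with the vendor's own session
  over18: boolean;   // the only substantive fact that is disclosed
  method: "estimation" | "verification"; // how the assurance was reached
}
```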

Luciditi is already supporting clients such as IDGO (pronounced I’dgo), which will soon be delivering proof-of-age identity cards aimed at people aged 18 to 38 who need proof of age for 18+ activities and events. At the moment, people attending ‘no ID, no entry’ events will often take a passport or driving licence to a crowded venue, documents that are full of information useful to anyone looking to steal an identity.

Such events are often followed by countless #lostpassport posts. Lost documents are a goldmine for fraudsters seeking to take out a loan or open a bank account with an overdraft facility, both easily done online. The UK government reports that just under 400,000 passports are lost or stolen each year, while a million driving licences were lost in 2017 (the latest available figure).

IDGO’s card is a safer alternative. New customers are asked to verify their ID, supplying documents securely to Luciditi, which in turn provides assurance to IDGO. The UK’s first proof of age and identity card, IDGO doubles as a contactless pre-paid payment card, allowing users to leave their regular bank card at home, along with their passport or driving licence.

The Online Safety Bill has not had an easy passage through parliament. Campaigners have a range of concerns, and tech platforms are a powerful force to contend with. But action is necessary, as the death of Molly Russell demonstrates. Innovations such as Luciditi offer a credible, safe and trusted way forward. Without them, the Wild West side of the internet will continue to threaten children and expose adults to fraudsters. The internet’s wilder corners will not disappear, but users can now find comfort in new levels of protection.

Want to know more?

If you would like to find out how Luciditi can get you ready to meet the age assurance responsibilities of the new Online Safety Bill, contact us for a chat today.

Get in touch