
Digital ID to be accepted as proof of age for alcohol sales in the UK

Digital ID to be accepted as proof of age in alcohol sales. Apps like Luciditi offer a safer alternative to physical ID documents.

It will soon be possible to buy alcohol using a digital identity to confirm your age, the government has announced. The decision means that consumers will be able to use trusted ID apps like Luciditi rather than take their passport or driving licence to pubs, clubs, and supermarkets.

The move comes in response to a consultation that ran from January to March 2024, which asked stakeholders whether young people should be allowed to use a digital identity service to prove they’re old enough to buy alcohol.

A clear majority of respondents (72%) said that existing legislation should be updated to allow consumers to use digital ID in retail settings such as supermarkets, off-licences, restaurants, pubs, and clubs. Digital identity apps like Luciditi can facilitate a quick and easy process at the point of sale, similar to contactless payments or scanning a QR code.

Luciditi relies on evidence scanned from personal identity documents uploaded by the user. These are stored in a digital wallet and protected by high-grade security. Typically they’re not shown to a third party; the app simply assures a retailer that the user has been verified as 18 or over.
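The privacy model described above can be sketched in a few lines: the wallet holds verified data privately and answers only a yes/no age question. This is a minimal illustrative sketch in Python, not Luciditi’s actual implementation; all class and field names are invented.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class WalletRecord:
    # Populated once from a scanned identity document (illustrative field).
    date_of_birth: date

class AgeAssuranceWallet:
    def __init__(self, record: WalletRecord):
        self._record = record  # kept private: never exposed to a retailer

    def is_over(self, years: int, today: date) -> bool:
        """Return only a boolean assurance, never the date of birth itself."""
        dob = self._record.date_of_birth
        birthday_passed = (today.month, today.day) >= (dob.month, dob.day)
        age = today.year - dob.year - (0 if birthday_passed else 1)
        return age >= years

wallet = AgeAssuranceWallet(WalletRecord(date_of_birth=date(2006, 5, 1)))
print(wallet.is_over(18, today=date(2024, 6, 1)))  # True: turned 18 in May 2024
```

The retailer learns a single bit of information, which is the design choice that makes this safer than handing over a passport.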

Digital identity trust framework

Age assurance depends on trust. A digital identity service will only be accepted in the sale of alcohol if it has been certified against government standards.

At the moment, trusted digital ID providers, including Luciditi, are voluntarily certified against the UK’s digital identity trust framework, a set of rules defining a good digital ID service.

The framework will soon be put on a statutory footing by the Data (Use and Access) Bill (DUA), currently passing through Parliament. After the DUA Bill receives royal assent, expected later this year, the framework will underpin a new Digital Verification Service, which will oversee a public register of certified digital ID providers.

The government has also said that DUA will be amended once it comes into law, expanding the scope of digital identities so they can be used as proof of age in alcohol sales.

Verification, rather than estimation

The consultation process, initiated by the previous government, focused on whether the Licensing Act 2003 should be updated to allow the use of digital ID. A total of 251 complete responses were received from licensing authorities, the alcohol and hospitality industries, policing, trading standards, technology companies, delivery partners, civil society organisations, and members of the public.

Most respondents thought the proposed changes would have a positive impact, though some raised concerns about data protection and the potential for digital identities to be hacked or faked.

Similar concerns raised in the past prompted the decision to create the Digital Verification Service, putting the trust framework on a more formal footing. Only verification will be acceptable in the sale of alcohol. Other technologies, such as age estimation, currently fall outside the framework and won’t be accepted.

Next steps

The government is currently looking at the minimum level of service required from a digital ID provider. At the moment, anyone checking an ID document when selling alcohol needs to confirm three things:

1. Does it show that the person is over 18?
2. Does it belong to the person presenting it?
3. Is it a genuine document?

In practice, this means that a retailer must look at the date of birth to work out the person’s age, compare a photo to the person standing in front of them, and look for security features like holograms or ultraviolet marks to make sure the documents are real.

Digital identity tech needs to do the same, quickly, reliably, and securely. The Department for Science, Innovation and Technology (DSIT) is working with the Home Office on three main requirements.

Firstly, to be certified and included on the new register, a provider must be able to securely use ID data to verify that someone is 18 or over. Secondly, the tech must confirm that the digital identity belongs to the person presenting it. On this, the Office for Digital Identities and Attributes (part of DSIT) said that:

“Digitally, this can be done using biometric authentication. For example, a user can scan their face with their smartphone to access their digital identity. The scan of their face is bound to the photo on the original document. This allows them to securely prove that the identity belongs to them. By logging into the app in this way, the person can prove that the identity belongs to them.”

Thirdly, the identity must be verified as genuine. This means that a digital ID has to be scanned by a device rather than simply assessed by a person, similar to an e-ticket being scanned at a venue. Age assurance may involve scanning a QR code or using NFC technology similar to contactless payments.
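The three digital requirements above — a verified age attribute, binding to the presenter, and machine-checkable genuineness — can be illustrated with a toy signed token of the kind a QR code might carry. This sketch uses a shared HMAC secret purely for brevity; a real certified scheme would use asymmetric cryptography and standardised token formats, and every name and field here is hypothetical.

```python
import hashlib
import hmac
import json
import time

# Invented demo key; a real provider would use certified key infrastructure.
PROVIDER_KEY = b"demo-shared-secret"

def issue_token(over_18: bool, now: float, ttl: int = 120) -> str:
    """Provider side: sign a short-lived 'over 18' claim."""
    payload = json.dumps({"over_18": over_18, "exp": now + ttl}, sort_keys=True)
    sig = hmac.new(PROVIDER_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return payload + "." + sig  # in practice, encoded into a QR code

def verify_token(token: str, now: float) -> bool:
    """Point-of-sale side: check genuineness, freshness, and the age claim."""
    payload, _, sig = token.rpartition(".")
    expected = hmac.new(PROVIDER_KEY, payload.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return False          # not a genuine token (forged or tampered)
    claims = json.loads(payload)
    if claims["exp"] < now:
        return False          # token has expired
    return claims["over_18"]  # the only attribute the retailer learns

now = time.time()
print(verify_token(issue_token(True, now), now))        # genuine token passes
print(verify_token(issue_token(True, now) + "x", now))  # tampered token fails
```

The short expiry stands in for the binding step: the token is only issued after the user has unlocked the app (for example, via a biometric check), so a stale or copied code is rejected.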

Remote sales of alcohol

The Licensing Act was passed in 2003; since then, however, the way people buy alcohol has changed. The consultation also looked at using digital IDs in remote sales, when alcohol is bought in a setting that’s not face-to-face.

Drink can be bought online, or in other ways that don’t involve face-to-face contact, for example at supermarket self-checkout tills or in a restaurant that accepts orders via an app. Age checks are currently required at the point of sale, but not at the point of delivery.

The consultation asked whether age checks at the point of delivery should be introduced. This is complicated by the fact that it’s an offence to sell alcohol to a person who’s drunk, and so the consultation also asked whether there should be mandatory checks at the point of delivery to determine whether someone is already intoxicated.

These questions raise practical difficulties, for example in the ability to leave goods in a safe place for the customer to collect. While a majority of respondents (58%) agreed that the Licensing Act should be updated to make it an offence to deliver to someone who’s drunk, there were concerns about how this would work in practice.

On these complicated issues, the government said “this is an area that requires further consideration. We will undertake further work in this area in due course.”

Safer checks for young people

A key priority is to safeguard the aims of the Licensing Act 2003 including the need to protect children from harm. This led to a requirement for age checks, which in practice means that young people have to carry valuable personal documents into pubs, clubs, and restaurants.

When they hand over their document to be checked by whoever’s at the door, people are potentially exposing their name, sex, current address and date of birth. Digital ID promises to maintain the integrity of the Licensing Act while allowing faster, safer and more secure verification.

James Hawkins, from the British Beer and Pub Association, said “this welcome change brings the Licensing Act in line with current technology and will make a visit to the pub easier for both customers and staff.”

Digital ID services generated £2.05 billion in 2023/2024, and employed over 10,000 people – half of them outside London. ID tech, from trusted providers, helps to boost the economy, protect young people, and perhaps even cut queues at the bar.


New Data Bill set to radically expand use of digital ID in the UK

Potential boost to UK economy by £10 billion over 10 years

Digital identity providers are to be regulated in a sweeping overhaul of the law. The Data (Use and Access) Bill will introduce oversight of digital ID, replacing the current voluntary system with a government register that could bring a £10 billion boost to the economy over 10 years.

Digital identities are an increasingly common way of accessing age-restricted online services or products. Providers like Luciditi offer security at the level demanded by banks, allowing individuals to securely verify their identity and/or age and helping to minimise the risk of identity fraud.

Trust is a key feature of an industry that serves as a ‘middle man’ between online consumers and suppliers. At the moment, digital ID providers – including Luciditi – may be voluntarily certified against the UK’s digital identity trust framework, a set of rules defining a good digital ID service.

However, in some cases where identity needs to be verified, for example employers looking to check someone’s criminal record or right to work, trust is needed at a deeper level. Similarly public agencies would be able to operate far more efficiently if identity verification were better regulated.

Setting standards, regulating providers

The Data (Use and Access) Bill – abbreviated to DUA – will transform certification, putting it on a more formal footing. DUA proposes wide-ranging reforms that will expand trust in four main areas:


1. UK digital identity and attributes trust framework
DUA will bring legal standing to the trust framework, expanding it and transforming it into the Digital Verification Service, giving it a broader and more structured regulatory foundation.

2. Register of digital identity services
DUA will establish a publicly available register of digital ID providers, listing organisations that have been independently assessed and certified against the trust framework. Under DUA, ministers will assess applications to join the register and potentially refuse applications or de-list providers.

3. Trust mark
Under DUA, registered providers will be able to display a designated ‘trust mark’ to distinguish their services in the market.

4. Information sharing
DUA will lay the foundations of a new information gateway, which in time will allow public authority data to be shared with registered services to enable identity and eligibility to be checked.  

Reusable ‘smart data’

Key parts of DUA are inherited from a Bill that began life under the previous government but failed to get through Parliament before the general election. In the King’s Speech in July, the new Labour government introduced the Digital Information and Smart Data Bill (DISD).

DISD was updated over the summer, and the Bill was renamed as DUA to reflect its broader scope. Nevertheless, ‘smart data’ remains at its heart – a reference to standard identity data managed in a smart way.

For example, Luciditi’s app allows individuals to upload their personal data to a digital identity wallet. After they’ve done this once, they can then reuse the data as many times as they want whenever they sign up for online services or products that require identity or age checks.

While online suppliers may ask for identity verification or proof of age, Luciditi’s app typically does not release personal data to a third party. It simply gives them basic assurance that the individual is who they say they are and the age they claim to be. The app gives users a choice. If they choose to, they can release the data to a third party that needs to see it, for example when opening a bank account. The key point here is that Luciditi allows users to choose whether to share information, every time.
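That consent-driven choice can be sketched as a simple disclosure policy: without explicit consent, only the minimal age claim is released. This is an illustrative model, not Luciditi’s actual API; the attribute names and helper function are invented.

```python
# Hypothetical wallet contents, populated once from verified documents.
VERIFIED_ATTRIBUTES = {
    "name": "A. Example",
    "date_of_birth": "2000-01-31",
    "address": "1 Example Street",
    "over_18": True,
}

# Claims the wallet will release without explicit user consent.
MINIMAL_CLAIMS = {"over_18"}

def respond_to_request(requested: set, user_consents: bool) -> dict:
    """Release full attributes only with explicit consent; otherwise only
    the minimal yes/no claims needed for age assurance."""
    allowed = requested if user_consents else (requested & MINIMAL_CLAIMS)
    return {k: VERIFIED_ATTRIBUTES[k] for k in allowed if k in VERIFIED_ATTRIBUTES}

# A supplier asking for everything gets only the age flag without consent:
print(respond_to_request({"name", "address", "over_18"}, user_consents=False))
# {'over_18': True}
```

The same data is reusable across any number of sign-ups, but each release beyond the minimum is an explicit user decision, for example when opening a bank account.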

Standardised reliance on digital identity throughout a particular industry is known as a smart data scheme, though at the moment only online banking comes close to this.

Outside banking, suppliers need to believe that assurance is reliable. For them, digital identity providers who are voluntarily certified against the trust framework can be expected to securely support online access to restricted services and products. Extending regulation into broader areas of the economy will speed up processes such as renting a home or starting a new job.

Embedding ID tech into public agencies

Under DUA, identity assurance will only be available to an authorised third-party supplier upon an individual’s request. But it paves the way for something more.

The government has said it wants to “harness the power of data for economic growth, to support a modern digital government, and to improve people’s lives.”

Government agencies, managing the data of millions of individuals, are currently bogged down in paperwork. By embedding smart data and identity assurance in public services, DUA will make it easier and quicker to manage people’s personal data, cutting down on laborious bureaucratic procedures.

The government believes that DUA “will free up 1.5 million hours of police time and 140,000 NHS staff hours every year speeding up care and improving patients’ health outcomes.” The Bill would allow for healthcare information – like a patient’s pre-existing conditions, appointments and tests – to be easily accessed in real time across all NHS trusts, GP surgeries and ambulance services, no matter which IT system they were using.

The new Bill will also allow births and deaths to be recorded online instead of via the current paper-based system. Registrations could also be carried out over the phone, not just in person.

For some people, making it easier to release their personal health data will be a cause for concern. In Luciditi’s case, the developers’ previous product already safeguards health data for millions of people across the UK. The Luciditi app benefits from this same level of trusted security.

The government has confirmed that the Bill does not include a mandatory national digital ID card, or any requirement to possess a digital identity. By simply creating a legislative structure of standards, governance and oversight for providers, DUA will demand that all registered digital ID providers meet a similar standard.

Managing oversight of DUA


The trust framework and register of providers will be overseen by a newly created team, the Office for Digital Identities and Attributes (OfDIA), which sits within the Department for Science, Innovation and Technology. DUA’s provisions will be carried out under the authority of the DSIT Secretary of State.

OfDIA staff created the trust framework, in collaboration with industry, academia, and civil society groups, and intend to review and refresh it every year. OfDIA believe that, under the framework, hundreds of thousands of digital identity checks are already taking place each month.

Beyond digital ID, the new Bill also supports a national chart of the UK’s underground infrastructure. The National Underground Asset Register is a new digital map that will shape the way pipes and cables are installed, operated and repaired. It will give planners and excavators standardised, secure, instant access to the information they need, reducing excavation accidents that can quickly disrupt business.

Next steps


DUA was introduced in the House of Lords on 23 October and will take a year or so to work its way through Parliament. Welcoming its introduction, Technology Secretary Peter Kyle said: “This Bill will help us boost the UK’s economy, free up vital time for our front-line workers, and relieve people from unnecessary admin so that they can get on with their lives.”

The new legislation won’t force anyone to use a digital identity. But a legally-protected structure of standards will give consumers and online suppliers new confidence in the broader use of smart data schemes.

Data is critical for UK business: 77% of UK companies handle some form of digital data, increasing to 99% for businesses employing more than 10 people. DUA has the potential to unlock much-needed national growth and ensure that, in the push towards digitally-based business, Britain isn’t left behind.


New age assurance guidelines for user-to-user and search platforms

Ofcom’s second consultation offers early insight into new rules

New guidelines protecting children from harmful content bring search engines and user-to-user platforms a step closer to mandatory age assurance. The draft regulations from Ofcom, the UK’s online safety regulator, are open to consultation. But they provide an early glimpse of the tough new rules that will restrict access to content from 2025.

The proposed guidelines are Ofcom’s latest response to the Online Safety Act. Passed last year, the Act will give Britain one of the toughest online regulatory systems in the world. Social media apps, search engines and other online services will need to adopt robust age checks and stop their algorithms recommending harmful content to children.

What is harmful content?

This is the second of Ofcom’s four consultation exercises on finalising the regulations that will flesh out the Act’s skeleton framework. The first, which closed in February, focused on protecting people from illegal content. The current discussions will lead to new rules designed to stop children accessing harmful content. The Act divides harmful content into three broad categories:

Primary priority content (PPC) that is harmful to children:

Pornographic content, and content which encourages, promotes, or provides instructions for suicide, self-harm, and eating disorders.

Priority content (PC) that is harmful to children:

Content which is abusive or incites hatred, bullying content, and content which encourages, promotes, or provides instructions for violence, dangerous stunts and challenges, and self-administering harmful substances.

Non-designated content that presents a material risk of harm to children:

Any type of content that does not fall within the above two categories but presents “a material risk of significant harm to an appreciable number of UK children.”
 
Based on these definitions, Ofcom has published draft Children’s Safety Codes which aim to ensure that:

  1. Children will not normally be able to access pornography.
  2. Children will be protected from seeing, and being recommended, potentially harmful content.
  3. Children will not be added to group chats without their consent.
  4. It will be easier for children to complain when they see harmful content, and they can be more confident that their complaints will be acted on.

 

Creating a safer online environment

In a four-week period (June-July 2023), Ofcom found that 62% of children aged 13-17 encountered PPC/PC online. Research also found that children consider violent content ‘unavoidable’ online, and that nearly two-thirds of children and young adults (13-19) have seen pornographic content. The number of girls aged 13-21 who have been subject to abusive or hateful comments online has almost tripled in 10 years from 20% in 2013 to 57% in 2023.

To create a safer online environment for children, Ofcom has outlined a series of steps that search services and user-to-user platforms will be expected to take.

Online services must determine whether or not they are likely to be accessed by children, and Ofcom has published an online tool to help them do so. Platforms that are likely to be accessed by children must:

  1. Complete a risk assessment to identify risks posed to children, drawing on Ofcom’s ‘children’s risk profiles’.
  2. Prevent children from encountering primary priority content relating to suicide, self-harm, eating disorders, and pornography. Services must also minimise children’s exposure to other serious harms defined as ‘priority content’, including violent, hateful or abusive material, bullying content, and content promoting dangerous challenges.
  3. Implement and review safety measures to mitigate the risks to children. Ofcom’s Safety Codes include more than 40 measures such as robust age checks, safer algorithms, effective moderation, strong governance and accountability, and more information and support for children including easy-to-use reporting and complaints processes.

 

Highly effective age assurance

There is no single fix-all measure that services can take to protect children online. But the package of measures recommended by Ofcom prominently relies on age assurance. Ofcom anticipates that most digital services not using age assurance are likely to be accessed by children. Once the final draft of the new rules comes into force, age assurance will be mandatory.
 
In practice, this will mean that all services will have to ban harmful content or introduce what Ofcom describes as “highly effective age-checks” restricting access to either the whole platform or parts of it that offer adults-only content. Ofcom defines “highly effective” as age assurance that is technically accurate, robust, reliable, and fair.
 
Regulated services will no longer be able to get away with an ineffective ‘I am 18’ button. They will need to commit to age assurance technology to ensure their services are safer by design.
 
The quickest way of doing this is to adopt a proven digital ID product, like Luciditi. Ian Moody, Luciditi co-founder and CEO, says, “Easier and more cost-effective than starting from scratch, Luciditi can be easily embedded in web sites or apps, either by using a pre-built plugin or by using our Software Development Kit.”
 
Ofcom have specifically said their measures will apply to all sites that fall within the scope of the Act, irrespective of the size of the business. ‘We’re too small to be relevant’ won’t wash as an excuse.
 
Services cannot refuse to take steps to protect children simply because the work is too expensive or inconvenient. Ofcom says, “protecting children is a priority and all services, even the smallest, will have to take action as a result of our proposals.”
 

“Don’t wait for enforcement and hefty fines” – Tech Sec

According to Ofcom, children who have encountered harmful content experience feelings of anxiety, shame or guilt, sometimes leading to a wide-ranging and severe impact on their physical and mental wellbeing.
 
The lawlessness exploited by some of the world’s leading social media platforms has contributed to the deaths of children like 14-year-old Molly Russell. The coroner’s report concluded that watching content promoting suicide and self-harm had contributed to Molly’s death by suicide.
 
“We want children to enjoy life online”, said Dame Melanie Dawes, Ofcom Chief Executive, “but for too long, their experiences have been blighted by seriously harmful content which they can’t avoid or control. Many parents share feelings of frustration and worry about how to keep their children safe. That must change.”
 
The consultation exercise closes on July 17, 2024. Ofcom says, “We will take all feedback into account, as well as engaging with children to hear what they think of our plans. We expect to finalise our proposals and publish our final statement and documents in spring 2025.”
 
Welcoming Ofcom’s proposals, Technology Secretary Michelle Donelan said, “To platforms, my message is engage with us and prepare. Do not wait for enforcement and hefty fines – step up to meet your responsibilities and act now.”
 
The Online Safety Act doesn’t pull its punches. Repeat offenders will potentially be fined up to £18 million or 10% of global revenue, whichever is greater, and company managers risk going to jail for up to two years. In the coming months, platforms will need to be proactive in committing to the age assurance products that will help them stay on the right side of the law.
 
In Britain at least, the carefree distribution of harmful content is about to change. Ofcom’s proposals go much further than current industry practice and demand a step-change from tech firms in how UK children are protected online.
 

Want to know more?

Luciditi’s Age Assurance technology can help companies meet these strict new guidelines. If you would like to know more, contact us for a chat today.

Get in touch