
New Data Bill set to radically expand use of digital ID in the UK

A potential £10 billion boost to the UK economy over 10 years

Digital identity providers are to be regulated in a sweeping overhaul of the law. The Data (Use and Access) Bill will introduce oversight of digital ID, replacing the current voluntary system with a government register that could bring a £10 billion boost to the economy over 10 years.

Digital identities are an increasingly common way of accessing age-restricted online services or products. Providers like Luciditi offer security at the level demanded by banks, allowing individuals to securely verify their identity and/or age and helping to minimise the risk of identity fraud.

Trust is a key feature of an industry that serves as a ‘middle man’ between online consumers and suppliers. At the moment, digital ID providers – including Luciditi – may be voluntarily certified against the UK’s digital identity trust framework, a set of rules defining a good digital ID service.

However, in some cases where identity needs to be verified, for example employers looking to check someone’s criminal record or right to work, trust is needed at a deeper level. Similarly, public agencies would be able to operate far more efficiently if identity verification were better regulated.

Setting standards, regulating providers


The Data (Use and Access) Bill – abbreviated to DUA – will transform certification, putting it on a more formal footing. DUA proposes wide-ranging reforms that will expand trust in four main areas:


1. UK digital identity and attributes trust framework
DUA will give the trust framework legal standing, expanding it into the Digital Verification Service and placing it on a broader, more structured regulatory footing.

2. Register of digital identity services
DUA will establish a publicly available register of digital ID providers, listing organisations that have been independently assessed and certified against the trust framework. Under DUA, ministers will assess applications to join the register and potentially refuse applications or de-list providers.

3. Trust mark
Under DUA, registered providers will be able to display a designated ‘trust mark’ to distinguish their services in the market.

4. Information sharing
DUA will lay the foundations of a new information gateway, which in time will allow public authority data to be shared with registered services to enable identity and eligibility to be checked.  

Reusable ‘smart data’


Key parts of DUA are inherited from a Bill that began life under the previous government but failed to get through parliament before the general election. In the King’s Speech in July, the new Labour government introduced the Digital Information and Smart Data Bill (DISD).

DISD was updated over the summer, and the Bill was renamed as DUA to reflect its broader scope. Nevertheless, ‘smart data’ remains at its heart – a reference to standard identity data managed in a smart way.

For example, Luciditi’s app allows individuals to upload their personal data to a digital identity wallet. After they’ve done this once, they can then reuse the data as many times as they want whenever they sign up for online services or products that require identity or age checks.

While online suppliers may ask for identity verification or proof of age, Luciditi’s app typically does not release personal data to a third party. It simply gives them basic assurance that the individual is who they say they are and the age they claim to be. The app gives users a choice. If they choose to, they can release the data to a third party that needs to see it, for example when opening a bank account. The key point here is that Luciditi allows users to choose whether to share information, every time.
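The flow described above can be sketched as a toy model. This is illustrative only, not Luciditi’s implementation: the class and method names are invented, and a real wallet would exchange signed, verifiable credentials rather than plain values.

```python
from datetime import date

class IdentityWallet:
    """Toy model of a reusable digital identity wallet (illustrative only)."""

    def __init__(self):
        self._attributes = {}  # personal data, uploaded once and reused

    def store(self, name, value):
        self._attributes[name] = value

    def assert_over(self, min_age, today):
        # Release only a yes/no assurance; the date of birth never leaves the wallet.
        dob = date.fromisoformat(self._attributes["date_of_birth"])
        age = today.year - dob.year - ((today.month, today.day) < (dob.month, dob.day))
        return age >= min_age

    def disclose(self, name, user_consents):
        # Raw data is released only with the user's explicit, per-request consent.
        return self._attributes[name] if user_consents else None

wallet = IdentityWallet()
wallet.store("date_of_birth", "2000-05-01")
wallet.assert_over(18, date(2024, 6, 1))               # assurance without the DOB
wallet.disclose("date_of_birth", user_consents=False)  # nothing shared without consent
```

The point the sketch captures is the separation of *assurance* from *disclosure*: a supplier asking “is this person over 18?” gets a boolean, while the underlying attribute moves only when the user opts in.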

Standardised reliance on digital identity throughout a particular industry is known as a smart data scheme, though at the moment only online banking comes close to this.

Outside banking, suppliers need to trust that assurance is reliable. For them, digital identity providers who are voluntarily certified against the trust framework can be expected to securely support online access to restricted services and products. Extending regulation of this system into broader areas of the economy will speed up processes such as renting a home or starting a new job.

Embedding ID tech into public agencies


Under DUA, identity assurance will only be available to an authorised third-party supplier upon an individual’s request. But it paves the way for something more.

The government has said it wants to “harness the power of data for economic growth, to support a modern digital government, and to improve people’s lives.”

Government agencies, managing the data of millions of individuals, are currently bogged down in paperwork. By embedding smart data and identity assurance in public services, DUA will make it easier and quicker to manage people’s personal data, cutting down on laborious bureaucratic procedures.

The government believes that DUA “will free up 1.5 million hours of police time and 140,000 NHS staff hours every year speeding up care and improving patients’ health outcomes.” The Bill would allow for healthcare information – like a patient’s pre-existing conditions, appointments and tests – to be easily accessed in real time across all NHS trusts, GP surgeries and ambulance services, no matter which IT system they were using.

The new Bill will also allow births and deaths to be recorded online instead of via the current paper-based system. Registrations could also be carried out over the phone, not just in person.

For some people, making it easier to release their personal health data will be a cause for concern. In Luciditi’s case, the developers’ previous product already safeguards health data for millions of people across the UK. The Luciditi app benefits from this same level of trusted security.

The government has confirmed that the Bill does not include a mandatory national digital ID card, or any requirement to possess a digital identity. Instead, by creating a legislative structure of standards, governance and oversight for providers, DUA will require all registered digital ID providers to meet a common standard.

Managing oversight of DUA


The trust framework and register of providers will be overseen by a newly created team, the Office for Digital Identities and Attributes (OfDIA), which sits within the Department for Science, Innovation and Technology (DSIT). DUA’s provisions will be carried out under the authority of the DSIT Secretary of State.

OfDIA staff created the trust framework, in collaboration with industry, academia, and civil society groups, and intend to review and refresh it every year. OfDIA believe that, under the framework, hundreds of thousands of digital identity checks are already taking place each month.

Beyond digital ID, the new Bill also supports a national chart of the UK’s underground infrastructure. The National Underground Asset Register is a new digital map that will shape the way pipes and cables are installed, operated and repaired. It will give planners and excavators standardised, secure, instant access to the information they need, reducing excavation accidents that can quickly disrupt business.

Next steps


DUA was introduced in the House of Lords on 23 October and will take a year or so to work its way through Parliament. Welcoming its introduction, Technology Secretary Peter Kyle said: “This Bill will help us boost the UK’s economy, free up vital time for our front-line workers, and relieve people from unnecessary admin so that they can get on with their lives.”

The new legislation won’t force anyone to use a digital identity. But a legally protected structure of standards will give consumers and online suppliers new confidence in the broader use of smart data schemes.

Data is critical for UK business: 77% of UK companies handle some form of digital data, increasing to 99% for businesses employing more than 10 people. DUA has the potential to unlock much needed national growth and ensure that, in the push towards digitally-based business, Britain isn’t left behind.


New age assurance guidelines for user-to-user and search platforms


Ofcom’s second consultation offers early insight into new rules

New guidelines protecting children from harmful content bring search engines and user-to-user platforms a step closer to mandatory age assurance. The draft regulations from Ofcom, the UK’s online safety regulator, are open to consultation. But they provide an early glimpse of the tough new rules that will restrict access to content from 2025.

The proposed guidelines are Ofcom’s latest response to the Online Safety Act. Passed last year, the Act will give Britain one of the toughest online regulatory systems in the world. Social media apps, search engines and other online services will need to adopt robust age checks and stop their algorithms recommending harmful content to children.

What is harmful content?

This is the second of Ofcom’s four consultation exercises on finalising the regulations that will flesh out the Act’s skeleton framework. The first, which closed in February, focused on protecting people from illegal content. The current discussions will lead to new rules designed to stop children accessing harmful content. The Act divides harmful content into three broad categories:

Primary priority content (PPC) that is harmful to children:

Pornographic content, and content which encourages, promotes, or provides instructions for suicide, self-harm, and eating disorders.

Priority content (PC) that is harmful to children:

Content which is abusive or incites hatred, bullying content, and content which encourages, promotes, or provides instructions for violence, dangerous stunts and challenges, and self-administering harmful substances.

Non-designated content that presents a material risk of harm to children:

Any types of content that do not fall within the above two categories which presents “a material risk of significant harm to an appreciable number of UK children.”
 
Based on these definitions, Ofcom has published draft Children’s Safety Codes which aim to ensure that:

  1. Children will not normally be able to access pornography.
  2. Children will be protected from seeing, and being recommended, potentially harmful content.
  3. Children will not be added to group chats without their consent.
  4. It will be easier for children to complain when they see harmful content, and they can be more confident that their complaints will be acted on.

 

Creating a safer online environment

In a four-week period (June-July 2023), Ofcom found that 62% of children aged 13-17 encountered PPC/PC online. Research also found that children consider violent content ‘unavoidable’ online, and that nearly two-thirds of children and young adults (13-19) have seen pornographic content. The number of girls aged 13-21 who have been subject to abusive or hateful comments online has almost tripled in 10 years from 20% in 2013 to 57% in 2023.

To create a safer online environment for children, Ofcom has outlined a series of steps that search services and user-to-user platforms will be expected to take.

Online services must determine whether or not they are likely to be accessed by children. To help with this, Ofcom has published an online tool. Platforms that are likely to be accessed by children must:

  1. Complete a risk assessment to identify risks posed to children, drawing on Ofcom’s ‘children’s risk profiles’.
  2. Prevent children from encountering primary priority content relating to suicide, self-harm, eating disorders, and pornography. Services must also minimise children’s exposure to other serious harms defined as ‘priority content’, including violent, hateful or abusive material, bullying content, and content promoting dangerous challenges.
  3. Implement and review safety measures to mitigate the risks to children. Ofcom’s Safety Codes include more than 40 measures such as robust age checks, safer algorithms, effective moderation, strong governance and accountability, and more information and support for children including easy-to-use reporting and complaints processes.

 

Highly effective age assurance

There is no single fix-all measure that services can take to protect children online. But the package of measures recommended by Ofcom prominently relies on age assurance. Ofcom anticipates that most digital services not using age assurance are likely to be accessed by children. Once the final draft of the new rules comes into force, age assurance will be mandatory.
 
In practice, this will mean that all services will have to ban harmful content or introduce what Ofcom describes as “highly effective age-checks” restricting access either to the whole platform or to the parts of it that offer adults-only content. Ofcom defines “highly effective” age assurance as being technically accurate, robust, reliable and fair.
 
Regulated services will no longer be able to get away with an ineffective ‘I am 18’ button. They will need to commit to age assurance technology to ensure their services are safer by design.
 
The quickest way of doing this is to adopt a proven digital ID product, like Luciditi. Ian Moody, Luciditi co-founder and CEO, says, “Easier and more cost-effective than starting from scratch, Luciditi can be easily embedded in web sites or apps, either by using a pre-built plugin or by using our Software Development Kit.”
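As a loose sketch of the server-side pattern involved (not Luciditi’s actual API; the response shape and field names here are invented for illustration), a site might gate restricted content on the assurance returned by a certified provider:

```python
def allow_access(assurance, min_age=18):
    """Grant access only on a positive, age-qualifying assurance.

    `assurance` stands in for the response an age-assurance provider
    would return after a check; field names are illustrative. A
    production integration would also verify the response's signature.
    """
    return bool(assurance.get("verified")) and assurance.get("age_over", 0) >= min_age

allow_access({"verified": True, "age_over": 18})   # access granted
allow_access({"verified": False, "age_over": 18})  # denied: check failed
allow_access({"verified": True, "age_over": 16})   # denied: under threshold
```

Unlike an ‘I am 18’ button, the decision here rests on a check performed by the provider, not on the visitor’s self-declaration.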
 
Ofcom have specifically said their measures will apply to all sites that fall within the scope of the Act, irrespective of the size of the business. ‘We’re too small to be relevant’ won’t wash as an excuse.
 
Services cannot refuse to take steps to protect children simply because the work is too expensive or inconvenient. Ofcom says, “protecting children is a priority and all services, even the smallest, will have to take action as a result of our proposals.”
 

“Don’t wait for enforcement and hefty fines” – Tech Sec

According to Ofcom, children who have encountered harmful content experience feelings of anxiety, shame or guilt, sometimes leading to a wide-ranging and severe impact on their physical and mental wellbeing.
 
The lawlessness exploited by some of the world’s leading social media platforms has contributed to the deaths of children like 14-year-old Molly Russell. The coroner’s report concluded that watching content promoting suicide and self-harm had contributed to Molly’s death by suicide.
 
“We want children to enjoy life online”, said Dame Melanie Dawes, Ofcom Chief Executive, “but for too long, their experiences have been blighted by seriously harmful content which they can’t avoid or control. Many parents share feelings of frustration and worry about how to keep their children safe. That must change.”
 
The consultation exercise closes on July 17, 2024. Ofcom says, “We will take all feedback into account, as well as engaging with children to hear what they think of our plans. We expect to finalise our proposals and publish our final statement and documents in spring 2025.”
 
Welcoming Ofcom’s proposals, Technology Secretary Michelle Donelan said, “To platforms, my message is engage with us and prepare. Do not wait for enforcement and hefty fines – step up to meet your responsibilities and act now.”
 
The Online Safety Act doesn’t pull its punches. Repeat offenders will potentially be fined up to £18 million or 10% of global revenue, whichever is greater, and company managers risk going to jail for up to two years. In the coming months, platforms will need to be proactive in committing to the age assurance products that will help them stay on the right side of the law.
 
In Britain at least, the carefree distribution of harmful content is about to change. Ofcom’s proposals go much further than current industry practice and demand a step-change from tech firms in how UK children are protected online.
 

Want to know more?

Luciditi’s Age Assurance technology can help companies meet these strict new guidelines. If you would like to know more, Contact us for a chat today.

Get in touch