In a significant move to safeguard children, the UK government has proposed using artificial intelligence (AI) for age verification to prevent underage access to online pornography.
Implementing the Online Safety Act
This initiative is part of the recently passed Online Safety Act, which requires websites and apps displaying pornographic material to take stringent measures to block access by children. Under UK law, pornography may only be viewed by those aged 18 or over. The Act represents a significant step in the government's ongoing efforts to create a safer online environment for young people.
Proposed verification methods
The media regulator Ofcom has put forward several methods for age verification. The primary suggestion is AI-driven facial age estimation, in which users upload selfies for automated age analysis. Ofcom also proposes photo ID matching, where users upload government-issued identification, as well as credit card checks. Another option is open banking, where users consent to their banks sharing age-verification information with pornographic websites.
Addressing privacy concerns
While these measures aim to protect children, they have also raised concerns about privacy and data security. The Institute of Economic Affairs has highlighted the risks associated with collecting and storing sensitive personal data. In response, Ofcom acknowledges the need for a balanced approach that ensures robust age verification while protecting user privacy. The regulator has also made clear that weaker methods, such as self-declared age or online payment methods that do not require the user to be 18, will no longer suffice under the new guidelines. The final version of the guidelines, expected in early 2025, will likely address these concerns while upholding the Act's objectives.