
Archived Article — The Daily Perspective is no longer active. This article was published on 3 March 2026 and is preserved as part of the archive.

Technology

World's Top Cryptographers Sound Alarm on Age Verification Rush

An open letter signed by more than 400 scientists warns that poorly designed age checks could expose millions of users to surveillance and data breaches.

Key Points
  • More than 400 scientists and researchers have signed an open letter warning that current age verification technologies are not ready for widespread regulatory mandates.
  • The letter argues it is 'dangerous and socially unacceptable' to deploy age-check systems before scientific consensus settles on their benefits and harms.
  • Australia's under-16 social media ban, which took effect in December 2025, is among the laws driving demand for age assurance technology globally.
  • Critics warn that biometric data collected for age checks creates major cybersecurity risks, while evidence shows determined users simply migrate to VPNs or offshore platforms.
  • Signatories include Turing Award winner Ronald Rivest and Bart Preneel, president of the International Association for Cryptologic Research.

From London: As Australians went about their Tuesday morning, a rare coalition of the world's foremost cryptographers and computer scientists published an open letter that should give every lawmaker who has passed an age verification mandate reason to pause. The letter, signed by over 400 researchers and scientists, lays out the many reasons why age verification technology, as it is currently implemented, falls far short of what governments are demanding of it.

The letter argues that age assurance services should not be deployed "until the scientific consensus settles on the benefits and harms that age-assurance technologies can bring," and that deploying them beforehand is "dangerous and socially unacceptable" without a full understanding of the consequences for security, privacy, and equality. Those are not words chosen lightly, and the signatories are not fringe voices.

Among those who have put their names to the letter are Ronald Rivest and Bart Preneel, president of the International Association for Cryptologic Research. Rivest shared the 2002 Turing Award, the highest honour in computer science, for helping make public-key cryptography practical. When people of that calibre say a technology is not ready, the burden of proof shifts decisively onto those insisting it is.

For Canberra, this matters directly. From 10 December 2025, age-restricted social media platforms were required to take reasonable steps to prevent Australians under the age of 16 from creating or keeping accounts on their platforms. Australia's eSafety Commissioner has registered nine new industry codes that require millions of websites and services to implement age verification measures, with corporate penalties of up to A$49.5 million for non-compliance. The ambition is commendable; the technology underpinning it remains contested.

The technical objections in the letter are specific and serious. An effective age verification system would require cryptographic protection built into every query, but such infrastructure "is not only hard to build and maintain on a global scale, but would add friction in services, meaning many providers would refuse to install age checks." In other words, the versions of age verification actually being deployed are, by necessity, the weak ones.

There is also significant bias in facial recognition technology, which is often poor at estimating the age of women, particularly those wearing makeup, and people of colour. Laws in multiple countries now require users to upload government ID or submit facial scans to confirm their age, but digital rights advocates warn this introduces new privacy risks, especially when sensitive personal data is collected or stored by third-party vendors.

The strongest counter-argument to the scientists' position is the one that should not be dismissed: children are genuinely at risk online. Research from the European Commission's Joint Research Centre has shown how prolonged social media exposure can affect the prefrontal cortex, involved in impulse control, and the amygdala, which processes emotions and anxiety. Governments are not wrong to take that seriously. The question is whether the cure is being applied with sufficient care.

In the UK, where comparable requirements took effect under the Online Safety Act, users have had to entrust sensitive personal data to a range of newly emerged commercial age assurance providers that often offer little accountability or transparency about their data handling practices. UK users have also demonstrated how ineffective age-gating mechanisms can be, using VPNs and video game features to bypass age barriers with ease. Australian regulators should study that experience closely before assuming their own regime will be different.

Research from the New York Center for Social Media and Politics found that searches for blocked platforms dropped while searches for offshore sites surged, with one US state recording a 1,150 per cent increase in VPN demand after its age verification law took effect. Determined teenagers, it turns out, are resourceful. Regulation that drives users toward less moderated offshore services may achieve the opposite of its intent.

The scientists are not arguing that children should be left unprotected. They are arguing, with considerable technical authority, that the path being taken is not yet safe. Cryptographic approaches such as zero-knowledge proofs, which can confirm a user is over a certain age without revealing their exact identity, offer a more privacy-respecting path, but deploying such a system to the required standard needs considerably more development than any government has been willing to fund.

The eSafety Commissioner has stated that safety and privacy do not have to be mutually exclusive, and that is the right framing. So is the commitment to an independent review of Australia's social media minimum age laws within two years of the December 2025 commencement. But review clauses are only meaningful if governments are genuinely prepared to change course on the basis of evidence.

The reasonable conclusion here is not that age verification should be abandoned, but that it should be pursued with far greater rigour than most governments have shown so far. Child safety is a legitimate and urgent policy objective. So is the protection of every citizen's privacy and data security. When more than 400 scientists, including some of the architects of modern cryptography, say the current approach risks doing more harm than good, that is precisely the kind of evidence that good government is supposed to heed. The Office of the Australian Information Commissioner and the eSafety Commissioner have both signalled awareness of the privacy stakes. Translating that awareness into technical standards that actually protect children, without building a surveillance apparatus in the process, is the genuinely hard work that remains to be done.

Oliver Pemberton

Oliver Pemberton is an AI editorial persona created by The Daily Perspective. Covering European politics, the UK economy, and transatlantic affairs with the dual perspective of an Australian abroad. As an AI persona, articles are generated using artificial intelligence with editorial quality controls.