From Singapore: Two separate but deeply connected developments this week have thrown the global push for digital age verification into sharp relief. In California, a sweeping new law will compel every operating system provider — from Microsoft and Apple down to volunteer-run Linux distributions and Valve's gaming-focused SteamOS — to collect user age data at account setup. Simultaneously, Discord has been forced into a public retreat over its own age verification rollout, delaying a globally controversial system after a cascade of privacy concerns and a damaging prior security breach. For Australian users, regulators, and platform operators, the implications of both stories are direct and immediate.
California Casts a Wide Net
On 13 October 2025, California Governor Gavin Newsom signed Assembly Bill 1043, known as the Digital Age Assurance Act, into law. Effective 1 January 2027, it introduces a device-based age verification system designed to create safer digital environments for children under 18. The mechanism is relatively straightforward on paper: operating system providers must collect the birth date or age of the primary user through the account setup process, though they need not collect government IDs to verify that age. Based on this information, they must then send digital signals via a real-time API to app developers, transmitting the user's age bracket — under 13, at least 13 and under 16, at least 16 and under 18, or at least 18.
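The Act specifies the brackets but not the wire format of the signal, so the mechanics are easiest to see in a sketch. The bracket boundaries below come from the law itself; everything else, including the function and enum names, is hypothetical illustration rather than any vendor's actual API:

```python
from datetime import date
from enum import Enum

class AgeBracket(Enum):
    # The four brackets defined by the Digital Age Assurance Act
    UNDER_13 = "under_13"
    AT_LEAST_13_UNDER_16 = "13_to_15"
    AT_LEAST_16_UNDER_18 = "16_to_17"
    AT_LEAST_18 = "18_plus"

def bracket_for(birth_date: date, today: date) -> AgeBracket:
    """Map a self-reported birth date to the statutory age bracket.

    This is the coarse signal an OS provider would pass to app
    developers; note the exact birth date itself is never forwarded.
    """
    # Subtract one year if this year's birthday has not yet occurred
    age = today.year - birth_date.year - (
        (today.month, today.day) < (birth_date.month, birth_date.day)
    )
    if age < 13:
        return AgeBracket.UNDER_13
    if age < 16:
        return AgeBracket.AT_LEAST_13_UNDER_16
    if age < 18:
        return AgeBracket.AT_LEAST_16_UNDER_18
    return AgeBracket.AT_LEAST_18
```

The design point worth noticing is that only the bracket crosses the API boundary, which is how the law aims to give developers an age signal without handing them a precise date of birth.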
What has caught the technology industry off guard is the breadth of the law's definition of "operating system provider". The bill requires age assurance to be built into the setup process of any operating system, which would cover Windows, macOS, Linux, iOS, Android, and arguably even platforms such as SteamOS. For mainstream commercial operating systems, compliance poses little practical burden. Windows, for instance, already requires users to enter a date of birth during Microsoft Account setup. But the open-source world is a different matter. The idea that all operating system providers must comply has drawn considerable pushback from Linux communities. Critics point out, not without force, that the distributed, community-driven nature of most Linux distributions makes enforcement by a single US state government exceedingly difficult to imagine in practice.
Penalties for non-compliance can reach $2,500 per affected child for negligent violations and $7,500 for intentional ones, with enforcement handled by the California Attorney General. The Act does not include a private right of action. Whether California's regulators will realistically pursue open-source projects maintained by volunteers in Germany or Finland is a separate question the law does not answer.
Discord's Bruising Stumble
The California law provides the regulatory backdrop against which Discord's own troubles look far more consequential. In a blog post published this week, Discord co-founder and Chief Technology Officer Stanislav Vishnevskiy admitted the company "missed the mark" and confirmed that the global expansion of its age verification system has been delayed to the second half of 2026. The original plan, announced earlier this month, had called for a March rollout in which all new and existing users worldwide would receive a "teen-appropriate experience" by default — meaning much of the platform's content would be locked unless users could prove their age.
The reaction was swift and largely hostile. Many users pointed to a recent security breach of a third-party provider Discord had worked with, which exposed government ID photos of up to 70,000 Discord users. Concerns mounted further when it emerged that Discord had quietly tested an age verification system in the UK using a vendor called Persona, a company backed by Founders Fund, a venture capital firm led by Peter Thiel, co-founder of Palantir Technologies. Palantir's work with US federal immigration agencies added an uncomfortable dimension to an already tense debate about what platforms do with sensitive identity data.
Discord confirmed it had conducted a limited test in the United Kingdom with Persona, but said the vendor did not meet Discord's internal standard requiring facial age estimation to be performed entirely on-device, so that biometric data never leaves the user's phone. Persona's CEO disputed that characterisation, saying his company does offer on-device age verification and criticising Discord for making what he described as inaccurate claims. Discord says all data collected during the trial was deleted after verification was complete, and that it will no longer work with Persona.
What Discord Is Actually Proposing
Vishnevskiy's blog post sought to reframe what the system actually involves. Over 90 per cent of users, he says, will never need to verify their age to continue using Discord as they do today, with the company's internal safety systems able to make an age determination for many adult users without any action required from them. Those internal signals include account age, payment methods on file, server participation patterns, and general activity trends. Vishnevskiy emphasised that Discord does not read messages, analyse conversations, or inspect account content to estimate age.
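Discord has not published how those signals are combined, so any concrete version is guesswork. Still, the general shape of such a system is a simple score over coarse account attributes; the thresholds and weights below are invented purely for illustration:

```python
def likely_adult(account_age_days: int,
                 has_payment_method: bool,
                 years_in_adult_servers: int) -> bool:
    """Toy heuristic combining coarse account signals into an adult/not-adult
    determination. Discord's real system is unpublished; every threshold and
    weight here is a made-up stand-in to show the approach, not the product.
    """
    score = 0
    if account_age_days > 365 * 5:      # long-standing account
        score += 2
    if has_payment_method:              # a card on file usually implies 18+
        score += 2
    if years_in_adult_servers > 0:      # sustained participation patterns
        score += 1
    return score >= 3
```

The appeal of this style of system, whatever its actual internals, is that it works from metadata a platform already holds, which is consistent with Vishnevskiy's claim that no message content is inspected.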
For the remaining minority of users, Discord's position is that verification options will be designed to confirm age without confirming identity. The company is pushing back its global rollout until the second half of 2026 and adding more verification options, including the ability to use a credit card. Users who choose not to verify their age will retain access to their accounts, including servers, friend lists, direct messages, and voice chat, but will not be able to access age-restricted content or modify certain teen safety defaults.
Australia Is Already in the Frame
For Australian users, this is not an abstract policy debate. The specific verification methods and compliance requirements Discord is designing are already being shaped by legislation in effect in the UK and Australia, with Brazil quick to follow, and Europe and multiple US states close behind. In countries where laws require the use of age verification platforms — including Australia — Vishnevskiy states that any adult who tries to access age-restricted content will need to verify their age through an approved vendor. That means Australian Discord users are likely to face stricter requirements than the 90 per cent of the global user base who will never see a verification prompt at all.
Australia's own Online Safety Act mandates age assurance frameworks for platforms carrying adult content, and the eSafety Commissioner has been pushing platforms toward higher standards on child protection. Whether Discord's revised approach will satisfy those requirements remains unclear. Eurogamer noted that it is currently uncertain how the changes announced by Vishnevskiy will affect countries like Australia that explicitly mandate facial age estimation or ID checks.
The Harder Questions
There is a legitimate case for age assurance on platforms where teenagers are increasingly present. Discord itself acknowledges that the number of teenagers on its platform has significantly increased since the pandemic, and that they deserve an experience appropriate to their age. The impulse to protect children online is not only understandable — it is sound policy. Parents and child safety advocates have for years argued that self-reported birth dates at sign-up offer essentially no protection against determined young users.
Yet privacy advocates raise concerns that deserve genuine engagement rather than dismissal. The California law's broad definition of "operating system provider" risks pulling in community-led projects that lack the resources to build compliant systems, potentially consolidating the OS market further around large commercial players. On the platform side, the fact that Discord's first instinct was to partner with a biometrically capable vendor linked to a surveillance technology firm is exactly the kind of corporate behaviour that erodes public trust in self-regulation. Discord says it will publish a detailed explanation of how its automatic age estimation systems work and will document all verification vendors and their data handling practices before proceeding with a broader rollout. That is progress, but it is reactive progress.
The Australian Communications and Media Authority and similar regulators in the UK and Europe are watching closely to see whether industry can devise age verification mechanisms that are both technically robust and genuinely privacy-preserving. The Australian Parliament has signalled it expects platforms to take child safety obligations seriously, but the mechanisms for enforcement remain a work in progress. What is clear from both the California law and Discord's stumble is that the industry is being pushed toward age verification whether it is ready or not. The real question is whether that push produces genuine protection for young people, or simply a compliance exercise that trades one set of privacy risks for another.