If you've got a teenager in Australia who streams or watches Twitch, they've probably noticed something this week: the platform stopped letting them log in. That's not a bug. As of March 9, Australia's expanded online safety regulations came into effect, and Twitch is now classified as an age-restricted social media service. No new accounts for under-16s. Existing accounts? Getting deactivated.
Let's be real: this is part of something bigger. Australia's eSafety Commissioner has rolled out Phase 2 of new industry codes that impose obligations on online platforms going well beyond the original social media ban. The regulations now touch gaming platforms, messaging services, email providers, and app stores. Age verification has become the new normal for accessing anything with an under-18 audience.
For platforms like Roblox, the requirement is less a total ban and more a barrier: age-verification tech to access chat features. Games such as Minecraft and Fortnite haven't been banned outright, but platforms now face a legal obligation to verify age before showing 18-plus content. Companies that don't comply face fines of up to A$49.5 million per breach.
Here's where it gets complicated. Critics from the University of Sydney point out that online games are crucial social spaces for young people, not just entertainment. Isolating them from these spaces could harm vulnerable teenagers who rely on gaming communities for connection and support. Others worry that age verification itself creates new privacy risks: collecting biometric data, government ID scans, or credit card information exposes young people to data breaches and normalises surveillance.
The government's case is straightforward: young people need protection from exploitation, harmful content, and algorithm-driven addiction. Fair enough. But the implementation raises real questions about whether blanket age bans are the right answer, especially when the regulation sweeps up platforms with entirely different purposes under one umbrella. Twitch, which runs esports tournaments watched by millions of Australians, gets treated the same as Instagram. Games designed for family play get tangled in age-verification requirements meant for adult content.
By March 9, six new standards had been added to the eSafety Commissioner's register of online safety codes. The practical impact is still rolling out. Some platforms are scrambling to comply, while others are pushing back quietly. For Australian gamers, the landscape just got more complicated, not clearer. And whether that's a net win for young people's safety depends entirely on whether protecting them from social media actually means isolating them from community.