Australian secondary schools are adopting artificial intelligence at a pace most technology rollouts can only envy. By mid-2025, 78 percent of secondary schools had already adopted AI tools. Yet here's the paradox: not one of those schools reports feeling fully prepared to use them safely or effectively.
The gap between adoption and readiness reveals a familiar pattern in Australian education technology. Schools rush to implement because their competitors are implementing, because vendors are selling, and because policy frameworks say they should. What they don't have is the funding, training, and oversight to do it well.
The numbers paint a sobering picture. Sixty-one percent of Australian schools currently have no AI policy at all. Three-quarters of school boards haven't been properly briefed on AI governance. And whilst Australian teachers use AI at nearly double the international average, most lack formal training in privacy protection, algorithmic bias, or data governance.
The Australian Framework for Generative AI in Schools was endorsed in June 2025 by all education ministers. It is comprehensive on paper, emphasising ethical use, student safety, and responsible implementation. But it arrived with no dedicated funding, no implementation support, and no enforcement mechanism: aspirational, but toothless.
The training that does exist is fragmented. A partnership between Microsoft and Education Services Australia offers a free 180-minute training module, and NSW has developed NSWEduChat, a secure AI tool for staff and students. These initiatives matter, but they're basics, not comprehensive capability building. Teachers need ongoing professional development in how AI systems work, where they fail, and what happens when they're deployed in high-stakes educational settings.
The equity risk is real. Rural and remote schools face infrastructure constraints that make AI adoption uneven. Data privacy becomes a minefield when schools use commercial AI tools with unclear data practices. And algorithmic bias in educational AI — from automated essay scoring to predictive student tracking — has documented consequences for marginalised students.
2026 is a pivot point. Schools can either stumble forward with piecemeal adoption, hoping nothing goes wrong, or they can pause long enough to build genuine capability. That requires funding. It requires board-level governance. It requires teachers who understand both what AI can do and what it can't.
The good news: the framework is in place and governments recognise the stakes. The hard part: turning that framework into real change in schools that are already stretched thin managing everything from teacher shortages to mental health crises. Adopting technology is easy. Implementing it responsibly is the work ahead.