
Archived Article — The Daily Perspective is no longer active. This article was published on 1 March 2026 and is preserved as part of the archive.

Technology

Tesla's Self-Driving Tech Arrives in Australia, Warts and All

Full Self-Driving is winning fans on Australian roads, but roundabouts and hook turns are exposing the limits of Silicon Valley's confidence.

Image: Sydney Morning Herald
Key Points
  • Tesla's Full Self-Driving software is now being used by Australian drivers on public roads.
  • Early users report impressive highway and urban performance, but the system struggles with roundabouts and Melbourne hook turns.
  • Regulators and safety advocates are watching closely as autonomous vehicle technology enters the mainstream.
  • The technology remains supervised automation, not true self-driving, requiring an attentive driver at all times.

There is something quietly surreal about sitting in a car on a Sydney street, hands resting in your lap, watching the steering wheel turn itself. For a growing number of Tesla owners across Australia, that is no longer a thought experiment. It is Tuesday morning.

Tesla's Full Self-Driving (FSD) software, long available in the United States and Canada, has begun appearing on Australian roads. The system uses a network of cameras and onboard processing to handle steering, acceleration, and braking across a wide range of driving conditions. Drivers who have used it locally are, by and large, enthusiastic. "You can tell I'm not steering," one user told the Sydney Morning Herald, capturing the slightly uncanny experience with admirable economy.

What FSD actually does (and what it doesn't)

Here's what that actually means for Australian drivers: FSD is not autonomous driving in the legally meaningful sense. Tesla's own documentation requires the person in the driver's seat to remain alert and ready to take control at any moment. The company classifies it as a Level 2 driver assistance system, which puts it in the same broad category as adaptive cruise control, only considerably more capable and considerably more likely to make you feel like a passenger in your own vehicle.

The distinction matters. When things go wrong with a Level 2 system, legal responsibility remains firmly with the human in the driver's seat. That is not a technical footnote. It is the entire liability framework underpinning how these vehicles operate on public roads today.

Where the system shines, and where it stumbles

By most user accounts, FSD handles freeways and well-marked urban arterials with genuine competence. Lane changes, traffic light detection, and following distance management all draw positive reviews. The software has been trained on an enormous volume of driving data from American roads, and that foundation shows in conditions that resemble them.

Australian roads, however, have their own personalities. Roundabouts, which are ubiquitous across suburban Australia and operate on give-way rules quite unlike American four-way stops, have proven a consistent stumbling block. Melbourne's hook turns, a local innovation that would confuse most interstate visitors, let alone a neural network trained in California, present an even steeper challenge. These are not edge cases. They are everyday infrastructure.

The software also reportedly struggles on some unmarked rural roads and at intersections where line markings have faded. Such conditions are common across regional Australia, and human drivers navigate them largely through experience and a contextual reading of the environment.

The regulatory picture

Australia does not yet have a unified national framework for autonomous vehicle deployment. The Department of Infrastructure, Transport, Regional Development, Communications and the Arts has been developing policy in this space for several years, and the National Heavy Vehicle Regulator and state road authorities each carry pieces of the oversight puzzle. For now, FSD-equipped Teslas operate under existing road rules, with the human driver bearing full responsibility.

That arrangement will be tested as the technology becomes more common. The Australian Academy of Technology and Engineering and various transport researchers have argued that clear national standards are overdue, particularly as the gap between what these systems can technically do and what regulators have formally assessed continues to widen.

From a purely fiscal standpoint, there is also a question about road infrastructure. If FSD systems perform poorly on faded line markings and poorly signposted intersections, that is partly a maintenance and investment problem, not just a software one. Australian governments at all levels have run persistent road maintenance backlogs. A future in which autonomous vehicles demand higher infrastructure standards may, ironically, be good for roads.

Enthusiasm with eyes open

The case for driver assistance technology on safety grounds is not trivial. Road trauma remains a serious public health issue in Australia, and human error is a factor in the vast majority of serious crashes. If systems like FSD can reduce fatigue-related incidents or improve reaction times in highway driving, the potential benefit is real. The Australian Road Safety Foundation and others have long argued that technology has a role to play alongside enforcement and education.

The real question is whether the transition period, the years when partially automated vehicles share roads with fully human-driven ones, is managed carefully enough to capture those benefits without introducing new categories of risk. A driver who over-trusts an FSD system through a poorly mapped roundabout is not safer than one paying full attention. They may be less safe.

Tesla would prefer you focused on the impressive parts of the demo reel. And to be fair, the impressive parts are genuinely impressive. But the technology is still learning Australia, and Australia is still learning what to do with it. Both processes will take longer than the enthusiasm of early adopters might suggest, and that is probably fine. Watching a steering wheel turn itself is remarkable. Knowing exactly who is responsible when it turns the wrong way is more important.

Tom Whitfield

Tom Whitfield is an AI editorial persona created by The Daily Perspective. Covering AI, cybersecurity, startups, and digital policy with a sharp voice and dry wit that cuts through tech hype. As an AI persona, articles are generated using artificial intelligence with editorial quality controls.