
Archived Article — The Daily Perspective is no longer active. This article was published on 27 February 2026 and is preserved as part of the archive.

Opinion

When a Photo Proves Nothing But Destroys Everything

The Isaiah Iongi case exposes a crisis every Australian athlete now faces: AI-generated imagery that can end careers before the truth catches up.

Image: Instagram
Key Points
  • Unverified photographs circulating on social media appear to show Eels fullback Isaiah Iongi handling a rolled cigarette, triggering an NRL integrity investigation.
  • In 2026, photorealistic AI tools are freely available, meaning any image of any athlete can be fabricated by almost anyone with a laptop and a motive.
  • Australian defamation law offers limited protection when the creator of a deepfake is anonymous and the viral spread of an image outpaces any legal remedy.
  • The NRL Integrity Unit is doing its job, but sporting codes globally must update their evidentiary standards to account for the collapse of photographic proof.
  • Every professional athlete now faces a new kind of vulnerability: reputational destruction from manufactured content they had no part in creating.

From Sydney: The images arrived, as these things always do now, through the social media ecosystem, without warning and without context. Photographs apparently showing Parramatta Eels fullback Isaiah Iongi relaxing in a bathtub with a rolled cigarette, and seated at a kitchen table in a similar pose, spread across platforms within hours. By the time the NRL Integrity Unit had confirmed an investigation, millions of people had already formed a view.

But here is the question that should have preceded every word of commentary: how does anyone actually know those photographs are real?

Iongi is 22 years old, trying to establish himself in the most competitive rugby league competition on earth. The images were released without his knowledge or consent. That alone should prompt caution. What compounds the situation is something that sports administrators appear dangerously slow to reckon with: in 2026, a photograph is no longer reliable evidence of anything.

Artificial intelligence tools capable of generating photorealistic images of any person, in any setting, doing anything at all, are not restricted to film studios or intelligence agencies. They are available to anyone with a laptop and a grievance. The technical sophistication required to fabricate a convincing image of a professional athlete has dropped to near zero. The barrier to causing catastrophic reputational harm has never been lower.

The law and its limits

Australian defamation law exists, in principle, to protect individuals from false material that damages their reputation. A publication is defamatory if it would cause ordinary, reasonable members of the community to think less of the subject. But the law carries a critical weakness in the age of deepfakes: it generally requires an identifiable defendant. When the creator of a fabricated image is anonymous, the legal remedy is largely theoretical. The retraction, if it ever comes, never travels as far as the original lie.

There is a developing body of law around image-based abuse in Australia, constructed primarily with intimate imagery in mind, but grounded in a principle that applies equally here: a person has a legitimate interest in controlling how their image is used and how they are represented to the public. Non-consensual distribution of images, real or manufactured, causes documented psychological harm and measurable professional damage. Even if the photographs of Iongi prove entirely authentic, their release without his consent raises serious questions about the conduct of whoever chose to distribute them.

The situation is further complicated by strict-liability frameworks such as the World Anti-Doping Code, which in practice shift much of the burden of proof onto the athlete rather than the accuser. An athlete facing an allegation based on fabricated imagery could find themselves provisionally suspended while an investigation is conducted. The institutional machinery moves before the evidence is verified.

The other side of the argument

To be fair to the NRL and its Integrity Unit, there is a legitimate counterargument here. Sporting codes have an obligation to protect the integrity of competition and the welfare of the sport's community. Choosing not to investigate a complaint because the evidence might be fabricated sets a different and equally dangerous precedent. If administrators dismiss every complaint involving photographic material on the grounds that deepfakes exist, bad actors learn quickly to exploit that hesitation.

Progressive voices in sports law and athlete welfare have also argued, with some force, that the answer is not to discard investigations but to upgrade the evidentiary standards that govern them. An investigation that proceeds on the implicit assumption that photographs are authentic, without commissioning forensic analysis, is not a robust process. It is an outdated one. The presumption of innocence that every Australian citizen holds is not a courtesy to be extended selectively; it is a foundational principle of a just system, and it applies with particular force when the authenticity of the evidence itself is unresolved.

Those who advocate for stronger digital protections for athletes point to comparable cases abroad. The reputational damage suffered by figures including Dane Swan and Erin Andrews, both victims of the non-consensual release of real images, was severe and lasting. When the images are not even genuine, the harm is potentially worse, because there is no underlying truth that can eventually be established to restore a reputation.

An industry-wide reckoning

The Iongi case is not an isolated incident. It is an early and visible example of a challenge that every major sporting code in Australia and worldwide will face with increasing frequency. As eSafety Commissioner Julie Inman Grant has warned repeatedly, the tools for creating and distributing harmful content are accelerating faster than the legal and institutional frameworks designed to contain them.

Any disgruntled acquaintance, anonymous bad actor, or rival with basic technical skills can construct what amounts to a reputational time-bomb and detonate it at the moment of maximum inconvenience. For professional athletes whose livelihoods depend on sponsor relationships, club contracts, and public standing, the window between an image going viral and irreversible damage being done can be measured in hours.

The honest answer to the question of what happened with Isaiah Iongi is that we do not know. And that uncertainty is precisely the point. Sports administrators, media organisations, and governing bodies that treat unverified imagery as established fact are not doing their jobs; they are outsourcing judgment to whoever uploaded the file. That is not integrity. It is the opposite of it.

Reasonable people can disagree about where to draw the line between institutional vigilance and institutional recklessness. But the starting point must be this: in 2026, a photograph is not evidence. It is a hypothesis. Treating it as anything more, before forensic examination, before the presumption of innocence has been properly honoured, is a failure that no sporting code, and no journalist, should be comfortable with. The Parliament of Australia and sporting bodies alike have serious catching up to do.

James Callahan

James Callahan is an AI editorial persona created by The Daily Perspective, reporting from conflict zones and diplomatic capitals with vivid, immersive storytelling that puts the reader on the ground. As an AI persona, his articles are generated using artificial intelligence with editorial quality controls.