
Archived Article — The Daily Perspective is no longer active. This article was published on 24 March 2026 and is preserved as part of the archive.

Property

NSW moves to expose the AI deception in rental listings

New laws will force agents to disclose digitally altered photos, addressing years of frustration from renters seeking honest property information.

Image: 9News
Key Points
  • NSW legislation will require agents to disclose when rental photos are digitally altered to hide faults or mislead tenants.
  • Penalties for non-disclosure will be $5,500 for individuals and $22,000 for businesses if the bill passes.
  • Current laws exist but are poorly enforced, with tenants reluctant to report breaches due to fear of losing rental opportunities.
  • Industry figures argue the law permitting disclosure may not go far enough; some call for stronger prohibition of misleading images.

For Sydney renters like Emma, the experience of viewing a property has become a game of spot the difference. Photos online show bright, spacious rooms with fresh finishes. The reality at inspection is often starkly different: dated kitchens, poor lighting, and damage that had been concealed or edited beyond recognition. "It's crazy the number of inspections I've gone to that look nothing like the pictures," she told media. "It's hard to tell which ones have been AI-edited or which ones have been Photoshopped until you actually get to the unit and realise you've wasted your time."

The rise of artificial intelligence has turbocharged an old problem. While photo editing in real estate has long been standard practice, AI tools now make it cheaper and faster to radically transform listings. A dated kitchen becomes gleaming in seconds. Mouldy ceilings vanish. Furniture appears in empty rooms. Virtual staging that once cost hundreds can now be outsourced for as little as $45 per room, sometimes less with AI.

NSW is moving to address this. The Residential Tenancies (Protection of Personal Information) Amendment Bill 2025 will tackle photoshopped and AI-doctored images by introducing mandatory disclosure requirements on misleading rental ads. Should the legislation pass parliament, the penalties for non-disclosure will be $5,500 for individuals and $22,000 for businesses.

The requirement is disclosure, not an outright prohibition. Disclosure would be required for alterations such as artificially generated furniture that shows a double bed in a bedroom only large enough to fit a single, or digital modifications that obscure property damage. Basic editing, such as cropping and brightening, is unlikely to be covered by the disclosure requirements.

On the subreddit r/shitrentals, Australian tenants reported an NSW property listing with what appeared to be an AI-edited sheen over mouldy ceilings. Another Queensland listing appeared to reimagine a dimly lit kitchen using an image generator, while photos for a rental in Victoria had clear signs of warping and were seemingly styled with cartoonish, AI-generated furniture.

The problem is not new. Laws already exist. Under Australian Consumer Law, misleading or deceptive conduct in real estate is prohibited, with penalties reaching $220,000 for individuals and $1.1 million for companies. Yet enforcement has been negligible, partly by design. The system relies on tenants to report breaches. Emma has seen potentially AI-altered photos but has not filed a complaint. She worries about the consequences. "If I apply for another place with the same agent or real estate company it could screw me," she said, capturing a dynamic that undermines the existing framework: when you are competing for scarce rentals, challenging an agent's marketing practices feels like a luxury you cannot afford.

Tenants' Union of NSW chief executive Leo Patterson Ross sees the bill as a necessary intervention. "We're trying to make agents think twice about how they are digitally altering photos so you might see less of the really dodgy stuff," he said. The union emphasises that enforcement will require more active government involvement. Fair Trading can take action only when multiple complaints accumulate, but the burden is on renters to lodge them in the first place.

Not everyone believes disclosure is sufficient. Critics argue the bill frames misleading advertising as acceptable provided it is acknowledged. One housing influencer commented that the laws amount to the government saying "misleading advertising is fine as long as you admit you're doing it," and suggested that the existing federal prohibition on misleading conduct should be enforced more vigorously rather than accommodating disclosed violations.

The industry perspective is more permissive. The Real Estate Institute of NSW notes that most agents use AI and photo-editing tools responsibly. Professional home staging can add around 5-10% to a final sale price, and agents argue that demonstrating a property's potential through modest enhancement is standard marketing practice. The test, they contend, is whether the image still represents the property fairly, not whether editing occurred.

What the proposed law genuinely addresses is not whether agents should use AI tools, but whether they must own the choice. Tenants in Sydney navigate a vacancy rate of just 1.8%, which leaves them with little bargaining power, and whether they even attend an inspection often hinges on the listing photos. Transparency about photo manipulation at least allows them to weigh the risk, and may discourage the worst excesses. Whether that translates to real change depends on what comes next: whether Fair Trading and other enforcement agencies receive the resources and mandate to act, and whether the rental crisis itself loosens enough that tenants feel safe complaining.

Yuki Tamura

Yuki Tamura is an AI editorial persona created by The Daily Perspective, covering the cultural, political, and technological currents shaping the Asia-Pacific region, from Japanese innovation to Pacific Island climate concerns. As an AI persona, articles under this byline are generated using artificial intelligence with editorial quality controls.