Instagram will be removing end-to-end encryption support for direct messages as of May 8, 2026, marking Meta's latest retreat from its former commitment to privacy-first messaging. The company's reasoning is straightforward but telling: very few people actually used the feature.
Unlike on WhatsApp, Meta never made encryption available to all Instagram users, and it was never a default setting. Instead, users in some regions could opt in to encryption on a per-chat basis. For an optional feature with minimal adoption, the calculation became simple. According to Meta, "Very few people were opting in to end-to-end encrypted messaging in DMs, so we're removing this option from Instagram in the coming months".
What makes this move significant is not the low usage numbers but what they reveal about the tension between privacy and surveillance. Meta introduced the feature only in December 2023, as part of CEO Mark Zuckerberg's long-stated vision for encrypted communications. Meta began encrypting WhatsApp chats in 2016, and in 2019 Zuckerberg outlined a "privacy-focused" revamp of the company's apps, saying at the time that implementing end-to-end encryption for all private communications was the right thing to do.
The reversal comes amid intense pressure from regulators and law enforcement worldwide. Meta's use of encryption has been repeatedly criticised by law enforcement and some child safety organisations that say the feature makes it harder to catch predators who target children on social media. Recently, the topic has been raised numerous times during a trial in New Mexico over child safety, with internal documents surfacing that show Meta executives and researchers debating the trade-offs between safety and privacy as it relates to encryption.
Court documents filed in that New Mexico case reveal the depth of Meta's internal conflict. Unsealed documents showed employees mentioning some 7.5 million child sexual abuse material reports that would no longer be disclosed after the move to encryption on Messenger. "There goes our CSER [Community Standards Enforcement Report] numbers next year," an employee wrote in a message dated December 14, 2023, according to the filing. That same month, Meta said in a public blog post that it would begin rolling out default end-to-end encryption for personal messages and calls on Messenger and Facebook. The employee added that it was as if the company "put a big rug down to cover the rocks" and said it was sending fewer child exploitation reports.
For users caught between these competing demands, the outcome is messy. After May 8, Meta will be able to access the content of Instagram messages, and users with affected chats are being prompted to download them before the cutoff. Globally, governments are accelerating their push. The European Parliament and Council are expected to adopt the controversial Child Sexual Abuse Regulation in spring 2026. In its current form, it proposes that messaging platforms voluntarily scan private communications for offending content, combined with age-verification requirements for users.
Zuckerberg himself acknowledged the tension. In testimony broadcast during the New Mexico trial, Zuckerberg said that safety issues were "a large part of the reason why it took so long" to bring encryption to Messenger, and noted that "the majority of folks, from people who use our products to people who are involved in security overall, believe that strong encryption is positive".
What emerges from this pattern is a company navigating genuinely competing values. Encryption advocates are right that strong privacy protections matter. Child safety advocates are right that encrypted systems can harbour abuse. Yet Meta's handling reveals something troubling: the feature was rolled out quietly to limited regions, adoption was never fostered, and now it is being withdrawn just as quickly, leaving users confused about what privacy they actually have.
Meta's statement doesn't mention the status of encryption on Messenger. The company began turning on end-to-end encryption as a default setting in 2023 after years of work on the feature, creating inconsistency across its platforms. Anyone who wants to keep messaging with end-to-end encryption can easily do that on WhatsApp, Meta says, essentially directing users to a different app.
For Australian users and policymakers watching this unfold, the lesson is clear. Tech platforms do respond to government pressure, but the outcome depends entirely on how coherently that pressure is applied. Meta's encryption saga shows a company without an internal consensus being buffeted by external forces. The removal of Instagram encryption may satisfy child safety advocates in the short term, but without addressing the underlying question of whether platforms can safely hold vast stores of unencrypted intimate messages, it solves nothing. It merely shifts the problem.