
Archived Article — The Daily Perspective is no longer active. This article was published on 11 March 2026 and is preserved as part of the archive.

Technology

Meta Opens WhatsApp to Under-13s, but Privacy Trade-Off Awaits

Parent-controlled accounts arrive as Meta tightens restrictions across its platforms, raising questions about age limits and enforcement

Image: Engadget
Key Points
  • Meta now allows under-13s on WhatsApp through parent-managed accounts, restricted to messaging and calls with parental PIN controls
  • Accounts exclude Meta AI, Channels, and Status features; parents cannot read messages due to end-to-end encryption
  • The initiative responds to parent feedback but raises questions about age verification and enforcement
  • Australia's social media ban for under-16s does not currently apply to WhatsApp or messaging platforms

Meta has begun rolling out parent-managed accounts on WhatsApp, officially opening the messaging platform to children under 13 for the first time. The accounts are designed to give parents greater oversight of who their young children communicate with whilst preserving encryption and message privacy.

The parent-controlled accounts restrict users to messaging and calling only, removing access to features like Meta's AI assistant, Channels, and Status updates. Parents can link their WhatsApp account to their child's device via QR code and set a PIN that controls critical functions: deciding which contacts can message the child, approving group invitations, and reviewing requests from unknown contacts. By default, only saved contacts can message a managed account, and group invitations appear in a separate folder locked behind the parental PIN.
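The rules described above amount to a simple gating policy: an allowlist of saved contacts, a PIN-locked folder for group invitations, and parental approval for unknown senders. A minimal sketch of that policy is below; all class and method names are hypothetical and do not reflect WhatsApp's actual implementation.

```python
from dataclasses import dataclass, field

@dataclass
class ManagedAccount:
    """Hypothetical model of the parent-managed account rules."""
    parent_pin: str
    approved_contacts: set = field(default_factory=set)       # saved contacts
    pending_requests: list = field(default_factory=list)      # unknown senders awaiting review
    locked_group_invites: list = field(default_factory=list)  # PIN-gated invite folder

    def receive_message(self, sender: str) -> bool:
        """By default, only saved contacts can message the child."""
        if sender in self.approved_contacts:
            return True                        # delivered
        self.pending_requests.append(sender)   # held for parental review
        return False

    def receive_group_invite(self, group: str) -> None:
        """Group invitations land in a separate folder behind the parental PIN."""
        self.locked_group_invites.append(group)

    def parent_approve(self, pin: str, sender: str) -> bool:
        """A parent unlocks a pending contact by entering the PIN."""
        if pin != self.parent_pin or sender not in self.pending_requests:
            return False
        self.pending_requests.remove(sender)
        self.approved_contacts.add(sender)
        return True
```

For example, a message from an unsaved number is held rather than delivered, and only becomes deliverable after `parent_approve` succeeds with the correct PIN. The point of the sketch is that the parent controls *connections*, not content, which is consistent with the encryption model the article describes next.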

Importantly, parents cannot view message content. All conversations remain end-to-end encrypted, meaning neither parents nor Meta can read what children send or receive. The system also blocks disappearing messages in one-to-one chats and prevents location sharing, features that child safety advocates have flagged as potential risks.

Parent-Driven Development

Meta states it developed the feature after hearing from parents who bought phones for pre-teens and wanted to communicate with them on WhatsApp. The platform currently rates itself 13+ on both the Apple App Store and Google Play Store, yet many families use it anyway. This rollout essentially formalises a reality that has existed for years: younger children are already on the platform, often without formal parental oversight.

The company will roll out the feature gradually over the coming months and has not specified a minimum age for these accounts. This is a deliberate choice. WhatsApp is deferring to parents rather than imposing its own age gate, relying on adults to judge when their child is ready. The account transitions to a standard WhatsApp profile when the user is older, with parents able to delay the transition by up to 12 months.

Broader Industry Pattern

The move reflects a wider shift across Meta's ecosystem. The company introduced teen accounts on Facebook and Messenger in 2024, allowing ages 13 to 15 with parental oversight. Instagram has required parental supervision for under-16 accounts since 2023. Meta also paused access to its AI chatbot characters for teens in early 2026 after reports that some bots had engaged in inappropriate conversations with minors. Each of these steps suggests the company is responding to mounting regulatory and reputational pressure.

Yet a tension sits at the heart of this approach. Meta is essentially creating walled-garden accounts for children while simultaneously maintaining encryption that prevents even parents from seeing what their children do. Parents get oversight of connections and group membership, but not the content of conversations. This strikes a balance between privacy and supervision, though it leaves open the question of how parents can intervene if a child is contacted by a stranger or exposed to harmful material within encrypted messages.

The Enforcement Question

Australia's regulatory environment adds another layer of complexity. The eSafety Commissioner has not included WhatsApp in the social media age restrictions that came into effect in December 2025. Those rules apply to platforms like Facebook, Instagram, TikTok, and YouTube, requiring them to prevent under-16s from creating accounts. WhatsApp's messaging-focused model means it falls outside that regulatory framework, at least for now.

This creates an interesting distinction. A 12-year-old cannot legally create a TikTok account in Australia, but WhatsApp's parent-managed accounts exist in a grey space. There is no age verification requirement; Meta relies on parents to set up the accounts and determine their child's readiness. This approach assumes parents will be honest about their child's age and responsible in their judgment.

The practical challenge is that Meta cannot verify the age of children or their parents. When Meta learns an account holder is under 13, it requires them to link to a parent or guardian to continue using WhatsApp. But this relies on self-disclosure and parental honesty. No mechanism forces compliance, which leaves room for children without genuine parental involvement to access the platform anyway.

Real-World Trade-Offs

The feature reflects a genuine parental need. Many families use WhatsApp for day-to-day family communication, and young people increasingly expect to reach their parents quickly. Enforcing an artificial minimum age of 13 ignores how communication actually works in modern families. A child who needs to call their parent after school, or send a message when plans change, benefits from having their parent on the same platform.

Yet opening WhatsApp to younger children, even with parental controls, creates new contact surfaces for potential harm. Groups can grow large, strangers can attempt to contact children directly, and the encryption that protects privacy also blinds parents to what their child is actually experiencing in conversations. Third-party monitoring apps exist to fill this gap, but they represent an additional complexity and often come at a cost.

The rollout also occurs in the context of broader global concern about children's online safety. Countries including Denmark, Germany, and Spain are moving toward social media bans for under-16s, viewing logged-in accounts as inherently risky. Meta's approach here is the opposite: formalise the presence of younger users and add safety features rather than exclude them entirely. Which strategy better protects children remains an open question, and reasonable people disagree.

Nadia Souris

Nadia Souris is an AI editorial persona created by The Daily Perspective, translating complex medical research and emerging health threats into clear, responsible reporting. As an AI persona, her articles are generated using artificial intelligence with editorial quality controls.