
Archived Article — The Daily Perspective is no longer active. This article was published on 24 March 2026 and is preserved as part of the archive.

Crime

Two Juries Will Decide Meta's Fate in Landmark Child Safety Cases

Trials in New Mexico and California could open the floodgates for thousands of lawsuits against social media giants

Key Points
  • New Mexico jury began deliberations after a six-week trial alleging Meta failed to disclose child sexual exploitation risks and addictive features
  • Prosecutors seek over $2 billion in penalties based on state consumer protection laws, not content liability
  • California jury is separately weighing whether Meta and YouTube deliberately designed addictive features that harmed a young woman's mental health
  • Both trials could set precedent for thousands of pending lawsuits and determine whether Section 230 shields social media companies from state laws

A New Mexico jury has begun deliberations while a California jury is separately considering whether Meta and YouTube created addictive features that harmed a young woman. These two cases represent a potential turning point in how courts hold social media companies accountable for harm to children. The outcomes could determine whether Meta faces financial consequences it has largely avoided, or whether existing legal protections continue to shield the industry from state-level enforcement.

The New Mexico trial followed six weeks of testimony from Meta executives, former employees turned whistleblowers, teachers, psychiatric experts, and state investigators. New Mexico's Attorney General Raúl Torrez filed suit in 2023, accusing Meta of creating a marketplace and "breeding ground" for predators who target children for sexual exploitation. Unlike typical liability cases, prosecutors say they are not seeking to hold Meta accountable for the content on its platforms, but rather for its role in pushing that content out through complex algorithms that proliferate material that can be addictive and harmful to children.

The financial stakes are substantial. Prosecutors urged jurors to impose a civil penalty of more than $2 billion against Meta, based on the maximum $5,000 penalty per violation on two counts of consumer protection violations, and an estimated 208,700 monthly users of Meta platforms under the age of 18 in New Mexico. The claim targets what prosecutors characterise as deceptive trade practices, not content moderation.
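As a rough check of the arithmetic behind that demand, the figures reported at trial multiply out to just over $2 billion. A minimal sketch, using only the values cited above (variable names are illustrative, not from court filings):

```python
# Back-of-envelope check of the penalty figure sought by prosecutors,
# using the numbers reported in the article.
PENALTY_PER_VIOLATION = 5_000   # maximum penalty per violation, in USD
COUNTS = 2                      # two counts of consumer protection violations
MINOR_USERS = 208_700           # estimated monthly under-18 users in New Mexico

total = PENALTY_PER_VIOLATION * COUNTS * MINOR_USERS
print(f"${total:,}")  # just over $2 billion
```

The product comes to $2,087,000,000, which matches the "more than $2 billion" figure prosecutors put to the jury.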

Central to the prosecution's case is internal research the company conducted but did not publicly disclose. Evidence presented at trial showed that Meta's own internal research found one in three teens experienced problematic use, a finding the company never disclosed. The prosecution said that public assurances about safety from Meta executives, including founder Mark Zuckerberg and Instagram head Adam Mosseri, often didn't square with internal studies and communications at the company.

Meta's defence centres on its substantial safety investments and the inherent limitations of policing billions of daily interactions. Meta's attorney argued that "Meta has built innovative, automated tools to protect people" and that "Meta has 40,000 people working to make its apps as safe as possible". The company acknowledged that its systems aren't perfect, telling jurors that with billions of pieces of content posted every day, even the best system cannot catch all of it. Midway through the trial, Meta said it would stop supporting end-to-end encrypted messaging on Instagram later this year, a change that addresses at least one of the prosecution's concerns.

The California case follows a different trajectory but raises similar questions about platform design. It alleges that Meta and YouTube made deliberate design choices intended to make their platforms more addictive to children in order to boost profits. A jury is already sequestered in deliberations on whether Meta and YouTube should be liable for harms caused to children using their platforms, in what is described as a bellwether case that could shape how thousands of similar lawsuits against social media companies play out.

One complication affects both cases: Section 230 of the Communications Decency Act. Tech companies have long been protected from liability for material posted on their platforms by Section 230, a 30-year-old provision of the U.S. Communications Decency Act, as well as by the First Amendment. Meta's lawyers have argued these protections should apply to platform design decisions as well. Prosecutors counter that design choices and algorithmic amplification differ fundamentally from third-party content.

The outcomes matter beyond Meta. The cases are part of a wave of legal pressure Meta and other social media platforms are facing over the safety of young users, with social media giants facing hundreds of other cases from individuals, school districts and state attorneys general. A ruling against Meta could expose the company and competitors to years of litigation and substantial liability. A verdict in Meta's favour would likely slow momentum for state-level enforcement and reinforce the company's position that platform design falls within existing legal protections.

What both juries ultimately decide rests on a fundamental question: should companies that profit from engagement and growth bear responsibility when the algorithms driving those profits harm children, even if the company did not create the harmful content itself? The answers emerging from Santa Fe and Los Angeles courtrooms will shape that reckoning.

Sarah Cheng

Sarah Cheng is an AI editorial persona created by The Daily Perspective, covering corporate Australia with investigative rigour, following the money and exposing misconduct. Articles under this persona are generated using artificial intelligence with editorial quality controls.