
Archived Article — The Daily Perspective is no longer active. This article was published on 24 March 2026 and is preserved as part of the archive.

Technology

Meta faces $375 million penalty in landmark New Mexico child safety verdict

Jury finds social media giant misled users about safety while enabling predators on Facebook and Instagram

Key Points
  • A Santa Fe jury ordered Meta to pay $375 million in civil penalties after finding the company violated New Mexico's consumer protection laws
  • The verdict represents the first jury win against a major social media company over child safety claims in the United States
  • Jurors found Meta engaged in 'unconscionable' trade practices by exploiting children's vulnerabilities while concealing known dangers
  • The case is part of a wave of litigation against Meta and other platforms over their impact on young users' safety and mental health

$375 million. That's what a New Mexico jury decided Meta should pay after finding the company willfully violated state law by misleading consumers about the safety of Facebook, Instagram and WhatsApp. The verdict, delivered just one day after closing arguments concluded, marks a watershed moment in the growing legal battle between states and Silicon Valley over how tech companies protect young users.

What makes this case significant is not just the size of the penalty, substantial as it is, but the finding itself. The jury agreed that Meta made false or misleading statements and that it engaged in "unconscionable" trade practices that unfairly exploited the vulnerabilities and inexperience of children. Jurors awarded the maximum civil penalty of $5,000 per violation on each of two counts, one for misrepresenting the safety of its platforms and one for unconscionable practices, applied across 37,500 New Mexico users, for a total of $375 million.

The case emerged from an investigation that tells a troubling story. Investigators created accounts on Facebook and Instagram posing as users younger than 14. The accounts received sexually explicit material and were contacted by adults seeking similar content, leading to criminal charges against multiple individuals, according to the office of New Mexico Attorney General Raúl Torrez. This was not a theoretical exercise; it demonstrated a concrete vulnerability in Meta's platforms that went systematically unaddressed despite what the company claimed publicly.

Internal evidence presented during the trial proved damaging. New Mexico prosecutors introduced legal filings detailing internal messages in which Meta employees discussed how CEO Mark Zuckerberg's 2019 decision to make Facebook Messenger end-to-end encrypted by default would affect the company's ability to disclose to law enforcement some 7.5 million child sexual abuse material reports. This is the kind of corporate decision that reveals priorities in their starkest form: encryption that shields communication from monitoring also shields communication from law enforcement investigating child exploitation.

Testimony from former insiders cut to the heart of institutional accountability. Former Meta Vice President of Partnerships Brian Boland testified that he "absolutely did not believe that safety was a priority" to CEO Mark Zuckerberg and then-COO Sheryl Sandberg when he left the company in 2020. This is not someone alleging harm in hindsight; this is a company officer testifying about what he witnessed during his tenure.

Meta's own position contains contradictions worth examining. The company claims it invests heavily in safety measures and employs tens of thousands of people dedicated to the task, yet it also argues that some bad material will inevitably slip through. Jurors, hearing these arguments directly, sided with state prosecutors, who argued that Meta prioritized profits over safety and knowingly designed its systems in ways that made child exploitation more likely, not less. The jury determined Meta violated parts of the state's Unfair Practices Act by hiding what it knew about the dangers of child sexual exploitation on its platforms and their impact on children's mental health.

The penalty itself warrants scrutiny. New Mexico's lawyers urged jury members during closing statements to impose a civil penalty against Meta that could top $2 billion. The jury awarded less than one-fifth of that amount. Whether $375 million represents meaningful accountability or merely a rounding error in Meta's revenue depends on one's view of corporate deterrence. For a company generating tens of billions in annual revenue, the question is whether the fine changes behaviour or simply becomes a cost of doing business.

The case is far from concluded. In May, Judge Bryan Biedscheid, the judge who oversaw the trial, is slated to hold a bench trial on the state's claims that Meta created a public nuisance that harmed state residents' health and safety. The state will ask Biedscheid to direct Meta to make changes to its platforms to bring them in line with state law. This second phase could force concrete operational changes: age verification, predator removal protocols, or restrictions on encrypted communications that enable exploitation.

The broader context cannot be ignored. Jurors in Los Angeles are considering a separate case against Meta and YouTube accusing them of intentionally creating addictive features that harmed a young woman's mental health. Social media giants are also facing hundreds of other cases from individuals, school districts and state attorneys general, some of which are set to go to trial later this year. Legal experts have compared this wave of litigation to the Big Tobacco suits of the 1990s. The Meta verdict suggests a jury is willing to hold social media companies accountable when presented with evidence of misleading claims about safety.

Sarah Cheng

Sarah Cheng is an AI editorial persona created by The Daily Perspective, covering corporate Australia with investigative rigour, following the money and exposing misconduct. Articles published under this persona are generated using artificial intelligence with editorial quality controls.