
Archived Article — The Daily Perspective is no longer active. This article was published on 26 March 2026 and is preserved as part of the archive.

Technology

Tech Giants Held Liable: What the Social Media Verdict Means

A Los Angeles jury found Meta and Google negligent in designing addictive platforms, setting a precedent that could reshape how social media operates globally.

Image: SBS News
Key Points
  • Meta and Google found liable for $6 million in damages in first-ever social media addiction trial.
  • Jury concluded platform design features were negligent and failed to warn users of mental health dangers.
  • Verdict could influence 2,000 pending lawsuits and potentially reshape global tech regulation.
  • Companies plan appeals; debate centres on whether design features or content should trigger liability.
  • Australian age restrictions on social media may face new scrutiny given US court findings on platform harms.

A Los Angeles jury has found Meta and Alphabet's Google liable for damages in a landmark social media addiction case that could reshape how technology companies design platforms for young users worldwide.

The case involved a 20-year-old California woman who began using YouTube at age six and Instagram at age 11, and who alleged she developed anxiety, body dysmorphia and suicidal thoughts as a result. The jury found that design features implemented by Meta and YouTube, such as recommendation algorithms and auto-play, contributed to her crippling mental distress; the plaintiff said constant notifications made it difficult for her to stop using the apps.

Compensatory damages were assessed at $3 million in total, with Meta liable for 70 per cent and YouTube for 30 per cent. Punitive damages added a further $3 million: $2.1 million from Meta and $900,000 from YouTube. Though modest by the standards of these tech giants, the verdict carries enormous weight.

The trial is a bellwether case tied to approximately 2,000 other pending lawsuits brought by parents and school districts, who argue that social media platforms should be treated as manufacturers of defective products for hooking young people on their feeds. This single verdict could influence the direction of thousands of cases now winding through courts across the United States.

The case hinges on a sharp legal distinction. Plaintiffs framed their claim as having nothing to do with content; they argued it was about design and functionality like infinite scroll, claiming a court could adjudicate liability for the design without running afoul of Section 230 protections or First Amendment questions about content regulation.

During the trial, lawyers for the plaintiff argued that Instagram and YouTube were deliberately designed to be addictive and that the companies knew the platforms were harming young people, while the tech companies countered that their services cannot be blamed for complex mental health issues. This debate reflects a genuine tension in modern technology policy: distinguishing between a company's responsibility for what users encounter (content), versus how the product itself is structured to maximise engagement (design).

Both companies plan to appeal. A Google spokesperson said the company disagrees with the verdict and plans to appeal, stating that the case misunderstands YouTube, which is "a responsibly built streaming platform, not a social media site". Meta has similarly indicated it will pursue legal options.

The context for this verdict extends beyond California. It came a day after a jury in New Mexico, in a separate trial, ordered Meta to pay $375 million after finding the company misled users about the safety of its platforms and allegedly enabled child sexual exploitation. In Australia, the government has legislated to restrict people aged under 16 from accessing social media from 2025; concern about social media's impact on children's mental health has reached the highest levels of government, despite limited causal evidence.

For Australia, the US court findings raise important questions. One view holds that social media harms mental health through social comparisons, cybervictimisation and fears of missing out, while an alternative view attributes rising youth mental illness since the COVID-19 pandemic to broader societal factors and increased mental health awareness. The jury has now sided explicitly with the first interpretation, at least regarding design culpability.

The decision could set a precedent for hundreds of similar cases and lead to major changes in how social media platforms operate, especially for young users, as well as millions, potentially billions, in losses for the tech companies. However, the small dollar amount relative to Meta and Google's financial resources suggests the real impact may lie not in the damages themselves but in the precedent courts have now established: that platform design features can trigger negligence liability regardless of Section 230's traditional protections for user-generated content.

Reasonable people can disagree on whether technology companies bear primary responsibility for teen mental health trends, or whether parents, schools and governments share that burden. But this verdict signals that courts are now willing to hold companies accountable for how they architect their products, not merely for what content flows through them. That distinction could reshape the relationship between technology regulation and litigation in years ahead.

Meg Hadley

Meg Hadley is an AI editorial persona created by The Daily Perspective. Covering health, climate, and community issues across South Australia with an embedded regional perspective. As an AI persona, articles are generated using artificial intelligence with editorial quality controls.