
Archived Article — The Daily Perspective is no longer active. This article was published on 25 March 2026 and is preserved as part of the archive.

Technology

Meta, YouTube found liable for teen addiction and mental health harm

Los Angeles jury orders tech giants to pay $3 million in first major trial establishing platforms knew of design dangers

Key Points
  • Meta and YouTube found liable on all counts after seven-week trial; jury awarded $3 million in compensatory damages
  • Meta ordered to pay 70% of damages, YouTube 30%; jury found negligent design and failure to warn users of risks
  • Case is bellwether trial; outcome may influence more than 1,500 pending lawsuits and potentially billions in future damages

A California jury has found Meta and YouTube liable on all counts in a landmark case accusing the tech giants of intentionally addicting a young woman and injuring her mental health. The verdict, announced on 25 March 2026 in Los Angeles Superior Court, concludes the first significant civil trial in which a jury has held major social media platforms accountable for specific design choices that encourage compulsive use.

The clinical dimensions of this case merit careful examination. The plaintiff, a 20-year-old woman identified as K.G.M. in court documents, or Kaley as her lawyers called her during the trial, began using YouTube at age 6 and Instagram at age 9. She testified in court that her nearly nonstop use of social media caused or contributed to depression, anxiety and body dysmorphia. During her early teenage years, she developed what her legal team characterised as compulsive engagement with these platforms, and psychological assessment identified patterns consistent with behavioural addiction.

The jury's findings were unambiguous about causation. Meta and YouTube were negligent in the design of their platforms, knew their design was dangerous, failed to warn of those risks and caused substantial harm to the plaintiff, the jury found. This is clinically significant because it moves beyond mere correlation; the jury determined that specific platform features contributed substantially to the plaintiff's mental health deterioration. During the six-week trial in Los Angeles, jurors were tasked with determining whether Meta and YouTube implemented certain design features in their apps, such as recommendation algorithms and auto-play, that contributed to K.G.M.'s crippling mental distress. Evidence presented included Meta documents showing the company had decided to allow filters that manipulate a user's appearance despite internal concerns they could be harmful.

However, the defence arguments deserve fair consideration. Meta argued that it was Kaley's difficult childhood, not social media, that caused her mental health challenges, and the companies presented evidence of parental neglect and family trauma in her background. The jury ultimately accepted the opposing view: rather than excusing the platforms, such vulnerability heightened their ethical and legal obligation to protect her. Jurors deliberated in a Los Angeles courtroom for nine days, more than 40 hours in total, at one point telling the judge they were struggling to reach a consensus on one of the defendants. Although the jurors were not unanimous, a majority voted to hold both companies liable.

Meta bears 70% of the responsibility for Kaley's harms and YouTube 30%, jurors found. The companies have already signalled that they will contest the verdict. "We respectfully disagree with the verdict and are evaluating our legal options," a Meta spokesperson said.

The scope of this case extends far beyond one plaintiff. Kaley's was the first of more than 1,500 similar cases against the social media companies to go to trial; Wednesday's outcome won't determine but could help guide how those other cases are resolved. The decision could set a precedent for hundreds of similar cases and lead to major changes in how social media platforms operate, especially for young users, as well as millions, even billions, in losses for the tech companies. The verdict also comes one day after jurors in a separate trial in New Mexico held Meta liable for failing to protect children from online predators and sexual exploitation on Facebook and Instagram. The New Mexico jury found on Tuesday that Meta violated the state's consumer protection laws and ordered the company to pay $375 million in civil penalties.

What clinicians need to understand is that this case does not settle the question of whether social media inherently causes mental illness. The evidence establishes causal contribution in this individual's case, but mental health outcomes result from multiple intersecting factors. The jury's role was not to pronounce on general epidemiology, but to determine whether these two companies, having designed specific features they understood carried risks, failed in their duty to warn and protect. On that narrower but crucial question, the jury found them liable.

A second phase of these proceedings will now determine punitive damages. The jurors will soon deliberate on whether punitive damages should additionally be awarded, and how much, based on the net worth of each company. For the mental health community, the significance extends beyond monetary damages. This verdict establishes that juries are willing to hold platforms liable for design choices made in pursuit of engagement metrics at the cost of known risks to young users' psychological wellbeing, and it may influence the outcome of the more than 1,500 other pending lawsuits.

The case also marks a watershed moment for social media, following years of concerns from parents, advocates and lawmakers about online harms to children ranging from mental health concerns to sexual exploitation. For health professionals advising families, the clinical takeaway is straightforward: platforms have been judicially found to prioritise engagement over user wellbeing. Evidence-based approaches to managing young people's social media use remain essential. Visit the Australian Medical Association for clinical guidance on adolescent mental health assessment and the Royal Australian and New Zealand College of Psychiatrists for specialised resources on technology-related mental health concerns.

Helen Cartwright

Helen Cartwright is an AI editorial persona created by The Daily Perspective, translating complex medical research for general readers with clinical precision and an evidence-first approach. As an AI persona, her articles are generated using artificial intelligence with editorial quality controls.