A California jury found that Meta and Google were liable for the depression and anxiety of a woman who compulsively used social media as a child, awarding her $6 million. This isn't just a single lawsuit; it's potentially the opening move in what could reshape the digital landscape for an entire generation.
The jury concluded that Meta's apps, including Instagram, and Google's YouTube were deliberately built to be addictive, and that company executives knew this yet failed to protect their youngest users. What makes this verdict particularly significant is how it sidesteps the usual tech industry defence. Rather than arguing about content moderation or Section 230 protections, the plaintiff's legal strategy targeted design flaws rather than specific content, neutralising the argument that tech companies shouldn't be held liable for third-party material.
The plaintiff, a 20-year-old woman identified as KGM in court documents, said she first started using YouTube at 6 years old and Instagram when she was 11. She alleged she developed compulsive use patterns, including up to 16 hours in a single day on Instagram, and attributed her depression, body dysmorphia, and suicidal thoughts to her platform use.
The evidence presented during the trial matters more than you might think. The plaintiff's case argued that Meta and YouTube made deliberate design choices, such as 'infinite scroll', to make their platforms more addictive to children and boost profits, and alleged the companies borrowed heavily from the behavioural and neurobiological techniques used by poker machines and exploited by the cigarette industry. The jury heard that Meta's internal communications compared the platform's effects to pushing drugs and gambling, evidence the jury treated as corporate knowledge supporting liability. A YouTube memo reportedly described 'viewer addiction' as a goal, and an Instagram employee wrote that the company was staffed by 'basically pushers'.
Meta and YouTube pushed back hard. The companies emphasised the emotional and physical abuse that her medical records indicated she experienced at home, and their lawyers argued that the plaintiff's own therapist never documented social media use as a factor in her mental health problems. A YouTube vice president of engineering testified that YouTube was 'not designed to maximize time'.
But the jury deliberated for nearly 44 hours over nine days before deciding otherwise. It found that Meta and YouTube were negligent in the design of their platforms, knew their designs were dangerous, failed to warn of those risks and caused substantial harm to the plaintiff. The jury assigned Meta 70% of the responsibility for the harm and YouTube 30%. Meta was ordered to pay an additional $2.1 million in punitive damages, and YouTube $900,000.
Here's what keeps tech executives awake at night: this is just the beginning. The outcome of this case could influence thousands of other consolidated cases against the social media companies, and the litigation has drawn comparisons to the legal crusade in the 1990s against Big Tobacco, which forced the industry to stop targeting minors with advertising. KGM's case is the first of its kind, but won't be the last; it is one of more than 20 'bellwether' trials due to go to court, which are essentially test cases used to gauge juries' reactions and set a legal precedent.
Both companies have signalled they plan to appeal. A Meta spokesperson said they 'respectfully disagree with the verdict and will appeal'. Yet the precedent is already set. If design, not content, is the legal target, social media companies face a fundamentally different liability landscape than anything they've navigated before.
The verdict also comes amid a broader pattern of losses for Meta. This is Meta's second big loss in the US courts this week, with a New Mexico jury on March 24 finding the company liable for concealing information about the risks of child sexual exploitation and the harmful effects of its platforms on children's mental health.
Whether this becomes Big Tech's Big Tobacco moment remains uncertain. A law professor called the verdict 'a momentous development' but noted it is just 'one step in a much longer saga', adding that they don't expect large changes to the platforms immediately and that there is a long way to go before anything akin to the master settlements of the tobacco and opioid litigation. What's clear is that the conversation about who bears responsibility for platform design has fundamentally changed.