Outside the Los Angeles courtroom in February, parents who had travelled from across the country stood gripping paper lottery tickets, anxious about whether they would get seats inside. They said their children had been harmed, or had died, because of social media, describing a company they believed preyed on and exploited their children. Among them was Julianna Arnold, whose daughter died at 17 after being lured by a predator she met on social media.
The tickets determined who would witness Meta CEO Mark Zuckerberg defend his company against allegations that Instagram was deliberately designed to be addictive to children and teens. It was the first time Zuckerberg would answer such questions in front of a jury.
The trial raises a straightforward question with sweeping implications: did tech companies knowingly engineer products to be addictive to young users, understanding the mental health consequences? For the first time, an American jury is being asked to decide whether the platforms themselves can give rise to product liability, not for what users post on them, but for how they were built.
The lawsuit was brought by a now 20-year-old woman identified as "Kaley" and her mother, who allege she was exposed to addictive design features as a child. Kaley began using social media young: YouTube at age 6, then Instagram at 9; she was also a frequent user of TikTok and Snapchat. She claims the apps' addictive features led her to develop anxiety, body dysmorphia and suicidal thoughts, and that she experienced bullying and sextortion on Instagram.
The questioning was adversarial, and Zuckerberg appeared testy, often responding with lines like, "I think you're mischaracterizing me," or "That's not what I said at all," or "I think you're taking this document out of context." When asked whether Meta wants people to be addicted to its social media platforms, Zuckerberg said he was "focused on building a community that is sustainable."
The plaintiff's legal team presented evidence suggesting Zuckerberg's fingerprints were all over decisions to keep young users engaged. They introduced a 2022 internal document attributed to Instagram head Adam Mosseri, which said the evidence showed Instagram's "primary goal" was to keep people engaged with the app, especially teens. The attorneys also submitted a 2016 email in which Zuckerberg set a goal of increasing time spent on the apps by 12% over three years, and a 2020 internal Meta document showing that 11-year-olds were four times as likely as older users to keep returning to the company's apps.
Zuckerberg's legal strategy focused on casting doubt on the causal link between Instagram use and mental health problems. His lawyers pointed to his 2024 congressional testimony that "the existing body of scientific work" has not shown a link between social media and worse mental health outcomes for young people. Meta's lawyer argued that it was Kaley's difficult family life, rather than social media, that caused her mental health challenges.
This argument contains a grain of truth. The Diagnostic and Statistical Manual of Mental Disorders does not classify social media use as an addictive disorder. Researchers like Amy Orben have found that large-scale studies show small average associations between social media use and reduced well-being, though Orben herself has cautioned that these averages might mask severe harms experienced by a subset of vulnerable young users, particularly girls ages 12 to 15. The scientific picture is genuinely complex.
Yet the plaintiffs have found a legal route around Section 230, the decades-old federal shield that protects platforms from liability for content their users post. By treating social media apps as unsafe products under product liability law, they argue that the companies deliberately designed their services in ways they knew were harmful, and dismissed internal warnings that the services could be dangerous for teenagers. The jury must reach three-fourths agreement, meaning nine of its 12 jurors, to find for either the plaintiff or the tech companies.
The stakes extend far beyond this single case. The landmark trial is the first of a consolidated group of cases brought by more than 1,600 plaintiffs, including over 350 families and over 250 school districts. A win for the family could lead to substantial monetary damages and platform-wide changes to social media apps, and the outcome is expected to open the door to settlement talks for hundreds of other suits.
Speaking to reporters outside the courtroom during a break, Julianna Arnold said it was "surreal" to see Zuckerberg testify, after years of calling on the company to make greater changes. Though she is not a plaintiff in the Los Angeles case, she told reporters that hearing Zuckerberg's statements did not give her "any satisfaction." "I can't say there's anything new I heard," she said. "I just know that more children have died since my daughter died in 2021, so things are not any better."
The reasonable disagreement here runs deep. Technology firms argue that mental health issues among teenagers have multiple causes, that their platforms include safety tools, and that holding them liable for design choices amounts to punishing free expression. Plaintiffs counter that internal documents show Meta knew the risks and pursued profit anyway, making this a question of corporate negligence rather than speech.
For Australian observers, this trial carries particular weight. Social media companies have revoked access to about 4.7 million accounts identified as belonging to children since Australia banned use of the platforms by those under 16. That law provoked fraught debates about technology use, privacy, child safety and mental health, and has prompted other countries to consider similar measures. Whether Meta and YouTube will face meaningful financial consequences for their design choices in California may influence whether governments elsewhere pursue similar restrictions or litigation.