After years of allegations of harm from youth advocacy groups and corresponding corporate denials, Meta and YouTube have been found liable for causing emotional distress to an adolescent girl—identified only as “Kaley” or “K.G.M.”—who brought her action in 2023 and has since turned 20. The jury in the civil suit, brought in Los Angeles Superior Court, deliberated for eight days before reaching its verdict.
The LA jury awarded Kaley $3 million in punitive damages—veritable pocket change for companies like Meta and YouTube parent Alphabet, which generated over $200 billion and $403 billion in revenue last year, respectively.
The larger issue here is precedent.
Now that this case has established that two leading social media platforms bear responsibility for a young user's emotional harm, the likelihood of success for 1,600 other California plaintiffs who've brought similar suits has, in theory, increased.
Today’s verdict may also affect the outcome for 235 additional plaintiffs who are suing TikTok, Snapchat, Meta, and Google in federal court, an effort supported by a bipartisan group of 32 state attorneys general.
In a separate development earlier this week, a New Mexico jury found Meta responsible for “unconscionable” behavior in designing platforms that harmed children, ordering it to pay $375 million in damages. A key difference between the two suits is that the LA trial looked at the issue of platform features such as infinite scroll, whereas the New Mexico case focused on the online content itself.
Either way, social media companies long thought to be shielded by Section 230 of the 1996 Communications Decency Act—which has historically held internet companies harmless for the content users post on their sites—now find themselves facing a new era of potential vulnerability.
“We respectfully disagree with the verdict and are evaluating our legal options,” said a Meta spokesperson after the verdict.
In her testimony, Kaley recounted that shortly after she began watching YouTube at the age of six, it became a “gateway” to other platforms like Instagram. Attorney Mark Lanier said his client swiftly developed an addiction that led her to eschew friendships in favor of a 16-hour-a-day smartphone habit that left her feeling “ugly” and “insecure” when she didn’t rack up enough likes. Instagram’s “beauty filters”—which can digitally slim down the bodies of those who post—made her feel “fat,” Lanier said.
Social media sites have consistently denied that their products can addict and mentally harm users. During a contentious appearance before the U.S. Senate Judiciary Committee in 2024, Meta chief executive Mark Zuckerberg testified that his company “work[s] hard to provide support and controls to reduce potential harms.”

