Commentary

The lawyer who beat Meta and Google — and why the social media addiction verdict might be existential for the industry

Mar 30, 2026

Key Points

  • A California jury found Meta and YouTube liable for social media addiction, awarding $6 million in damages to the plaintiff and establishing that addictive design features—infinite scroll, algorithmic feeds, autoplay, notifications—are actionable independent of the content they deliver.
  • Attorney Mark Lanier, who won a $4.69 billion talcum powder verdict, built the case on platform architecture rather than user-generated content, circumventing Section 230 protections and creating a template for thousands of pending lawsuits.
  • OpenAI's failed Sora platform, which deployed identical addictive mechanics but collapsed due to poor content quality, suggests features alone may not drive addiction, potentially undercutting the verdict's core causal logic on appeal.

Summary

A California jury found Meta and YouTube liable for social media addiction, ordering each company to pay $3 million in compensatory and punitive damages to Kaylee, now 20, who alleged the platforms caused her anxiety, depression, and body dysmorphia. The verdict is the first win among thousands of consolidated lawsuits filed by teenagers, school districts, and state attorneys general against Meta, YouTube, TikTok, and Snap. It could reshape how platforms are held accountable for product design.

Mark Lanier and the case

Texas-based plaintiff's attorney Mark Lanier led the charge. His track record includes a $4.69 billion verdict in 2018 over talcum powder asbestos claims and one of the first major cases against Merck over the painkiller Vioxx. In the courtroom, Lanier uses hand-drawn roadmaps, props like needles dropped into hay bales to visualize asbestos fibers, and jars of M&Ms to demonstrate how multibillion-dollar fines are fractional relative to tech companies' market value. He wears the same two suits during trials and burns them when finished.

Lanier's legal strategy exploited a critical distinction. Social media platforms are largely shielded from liability for user-generated content under Section 230 of the Communications Decency Act. Lanier therefore focused on the platform features themselves rather than the content they carry. The jury found specific design patterns to be addictive: infinite scroll, algorithmic recommendation feeds, autoplay, notifications, beauty filters, and the like button. Each exploits a distinct psychological vulnerability—infinite scroll removes natural stopping points, algorithmic feeds and autoplay maximize engagement, notifications promise validation, beauty filters invite physical comparison, and likes trade on social approval.

Liability at scale

Eric Goldman, of Santa Clara University School of Law, flagged the broader stakes: the question, he said, is "whether we will even have social media in the future." Lanier has secured multibillion-dollar verdicts before and could win $50 billion or more here. Thousands of similar cases are in the pipeline, some potentially consolidating into class actions. Individual verdicts or a single massive action could generate immense liability exposure.

The verdict creates a template. Any platform using infinite scroll, algorithmic feeds, autoplay, notifications, or gamification is vulnerable to identical claims. Removing content will not solve the problem. Platforms would need to redesign their core product architecture.

The Sora problem

OpenAI's defunct Sora social network used an identical feature set: infinite scroll, algorithmic recommendations, notifications, like buttons, and generative avatar filters. Yet it failed. Engagement declined over time because AI-generated videos lacked the creator voice and intentionality that make user-generated content sticky. When the content itself lost human curation and intent, the addictive mechanics could not compensate.

This suggests that features alone do not drive addiction; content quality and creator authenticity matter more. The jury verdict assumes features are the primary driver. Sora's collapse suggests otherwise. The legal and strategic distinction cuts deep: if content is the real driver, features are secondary, and Section 230 may provide the actual shield. If features drive addiction independent of content, platform liability becomes nearly inescapable.

The tension remains unresolved. Lanier's verdict is a legal fact; Sora's failure is a market fact. Whether the verdict survives appeal, and whether future cases replicate its feature-based causal logic, remain open questions. What is certain: Meta and YouTube now face liability exposure that extends far beyond content moderation.