While the question of verdicts seemed open-ended for years, the situation has now shifted dramatically. On March 24-25, 2026, a New Mexico jury delivered the **first jury verdict** in social media harm litigation, finding Meta liable and awarding $375 million in damages for knowingly harming children’s mental health. However, the largest and most closely watched case—a Los Angeles civil trial involving Meta’s Instagram and Google’s YouTube—remains ongoing with no verdict yet, and thousands of other plaintiffs are still waiting for their day in court.
Table of Contents
- What Is the New Mexico Verdict and Why Does It Matter?
- The Los Angeles Civil Trial—The Larger Test Case That’s Still Pending
- The Broader Litigation Landscape—2,407 Pending Cases
- Bellwether Trials—Test Cases That Will Shape Thousands of Other Cases
- The Legal Defense—Why Platforms Argue They Can’t Be Held Responsible
- What Plaintiffs Are Claiming—The Design Features at Issue
- What Comes Next—The Road to Resolution
What Is the New Mexico Verdict and Why Does It Matter?
On March 24-25, 2026, a jury in New Mexico made legal history by finding Meta liable for knowingly harming children’s mental health and concealing knowledge of child sexual exploitation. The jury awarded $375 million in damages based on thousands of violations—the first jury verdict in any social media harm case. This is significant because it breaks new legal ground: defendants like Meta have long relied on Communications Decency Act Section 230 immunity to shield themselves from liability.
The New Mexico verdict suggests that courts are increasingly willing to hold platforms accountable for design decisions, not just the content users post. The verdict focused on Meta’s practices of knowingly concealing harmful effects while continuing to use addictive design features. This differs from arguing that platforms merely hosted third-party content—instead, the jury found that Meta’s own conduct and design choices constituted the harm. The $375 million penalty is substantial, but it may represent just the beginning of what affected users could recover, given that thousands of similar claims remain pending across the country.

The Los Angeles Civil Trial—The Larger Test Case That’s Still Pending
While New Mexico produced the first verdict, the most closely watched case is still unfolding in Los Angeles. The trial involves Meta and Google’s YouTube as defendants (TikTok and Snapchat settled before trial began), and the lead plaintiff is Kaley G.M., a 20-year-old California woman. Her case—and the 2,406 other cases consolidated in the same Multidistrict Litigation (MDL)—alleges that Instagram and YouTube deliberately designed features like infinite scroll, auto-play functionality, algorithmic recommendations, and push notifications to addict users from childhood onward.
The Los Angeles trial has not yet concluded, meaning no verdict has been reached in this larger case. However, the stakes are enormous: a verdict here could set the tone for how the remaining cases proceed. The fact that platforms are being sued for their design choices—not for the content users encounter—represents a new legal frontier. Plaintiffs argue that Meta knew these features were addictive and deliberately deployed them to maximize engagement and revenue, despite evidence of mental health harms to children.
The Broader Litigation Landscape—2,407 Pending Cases
As of March 2, 2026, the MDL (Multidistrict Litigation) consolidated 2,407 pending cases related to social media harm. These cases represent a wave of litigation that started emerging as parents and mental health advocates documented rising rates of anxiety, depression, and self-harm among young social media users. The pending cases include claims from teens and young adults across multiple states, all alleging similar harm from Instagram, YouTube, TikTok, and Snapchat.
It’s important to understand that most of these cases are still at the pleading or early discovery stage—meaning evidence is still being gathered, and no trials have been scheduled for the vast majority. The New Mexico verdict and the ongoing Los Angeles trial serve as the “leading edge” cases that will inform how all these other lawsuits progress. If plaintiffs win in Los Angeles, it could accelerate settlements or encourage other platforms to negotiate. If defendants win, many of the pending cases could face dismissal or become much harder to prove.

Bellwether Trials—Test Cases That Will Shape Thousands of Other Cases
To manage 2,407 cases efficiently, courts typically use “bellwether” trials—a small group of representative cases that go to trial first, with results informing the broader litigation. Bellwether trials in the MDL are scheduled for **June 15, 2026** and **August 6, 2026**. These will serve as additional test cases beyond the Los Angeles trial currently underway.
The outcomes of these bellwether trials matter enormously. If plaintiffs prevail, it signals to remaining defendants and plaintiffs that settlements or favorable verdicts are likely, potentially accelerating resolution of the remaining cases. If defendants win, it could signal that the claims face significant legal hurdles, which might lead to dismissals or lower settlement offers. Many pending cases will be “stayed” (temporarily paused) pending the outcome of these bellwethers—meaning thousands of plaintiffs are essentially waiting to see how the test cases turn out before their own lawsuits move forward.
The Legal Defense—Why Platforms Argue They Can’t Be Held Responsible
Meta, YouTube, and other platforms have a powerful legal shield: **Communications Decency Act Section 230**. This federal law has historically protected online platforms from liability for third-party content posted by users. However, the New Mexico verdict suggests a crucial limitation: Section 230 may not protect platforms from liability for their own design choices and conduct. Plaintiffs argue they’re not suing over content; they’re suing over deliberately addictive features that platforms knowingly deployed.
This distinction is critical. If a platform is sued for hosting harmful user-generated content, Section 230 applies. But if a platform is sued for designing features specifically to addict children—in other words, for the platform’s own conduct rather than user conduct—the legal shield may not hold. The New Mexico verdict and the Los Angeles trial are testing exactly this boundary. Both defendants and plaintiffs are watching closely to see whether courts will allow cases to proceed based on design-liability theories, or whether Section 230 will continue to provide broad protection.

What Plaintiffs Are Claiming—The Design Features at Issue
The central claim in these lawsuits is that Instagram, YouTube, TikTok, and Snapchat used specific design features to deliberately addict young users. According to the Los Angeles case, these features include infinite scroll (which removes the visual cue of reaching the end of content, encouraging endless scrolling), auto-play (which automatically loads the next video without user action), algorithmic recommendations (which amplify engaging content regardless of whether it’s healthy), and push notifications (which interrupt the user throughout the day). Plaintiffs argue that these features were not accidental design choices but deliberate strategies informed by internal research showing that children are particularly vulnerable to addiction.
Plaintiffs allege that internal research conducted by Meta and other platforms revealed the mental health harms caused by these features, yet the platforms continued deploying them anyway. For example, a young user might open Instagram intending to spend 5 minutes browsing but instead spend an hour because infinite scroll eliminates the natural stopping point. The lawsuits allege this is not a flaw in the product—it’s the product working exactly as designed.
What Comes Next—The Road to Resolution
The next critical junctures are the bellwether trials scheduled for June 15 and August 6, 2026. Outcomes from these trials could trigger settlement discussions or further litigation strategy adjustments. Beyond that, the 2,407 pending cases face a long timeline, likely years, before most are resolved. Some may be dismissed if courts rule in favor of the defendants. Others may settle.
A few may go to trial. The New Mexico verdict has changed the calculus significantly. It showed that juries are willing to find platforms liable for harm to children, and it signaled that design-liability claims may survive legal challenges. For the thousands of plaintiffs still waiting, the New Mexico verdict is encouraging—it demonstrates that at least one jury believed the evidence and was willing to hold a major platform accountable. However, each case is different, and outcomes in California may differ from those in New Mexico, so uncertainty remains for most pending cases.
