A Los Angeles jury is in day 8 of deliberations in a landmark trial against Meta Platforms and Alphabet/Google, in which a 20-year-old woman from Chico, California, claims that intentional design features on Facebook, Instagram, and YouTube caused her severe mental health harm, depression, and suicidal ideation. The jury began discussions on March 12-13, 2026, and as of late March, jurors have signaled difficulty reaching consensus on at least one defendant, with the judge warning that a failure to reach a verdict could necessitate a partial retrial.
This case comes amid a broader wave of legal pressure on tech giants: just days earlier, on March 24, 2026, a New Mexico jury found Meta liable on all counts in a separate child safety trial, ordering $375 million in damages—the first major jury verdict specifically holding Meta accountable for child exploitation on its platforms. For individuals affected by social media harm or those considering legal action, understanding these trials is essential.
Table of Contents
- What Is the Los Angeles Social Media Addiction Trial?
- Evidence and Testimony in the Jury Deliberation
- Why This Case Matters as a Bellwether Test
- The New Mexico Child Safety Verdict and Its Implications
- Jury Deadlock Risk and What Happens Next
- What Evidence Courts Are Examining Across These Cases
- The Broader Legal Landscape and Future Implications
What Is the Los Angeles Social Media Addiction Trial?
The Los Angeles case represents one of the first jury trials to directly challenge major social media platforms on claims of addictive design. The plaintiff, identified in court documents as “K.G.M.” or “Kaley,” alleges that Meta’s Facebook and Instagram, along with Google’s YouTube, deliberately engineered addictive features—including infinite scrolling, dopamine-driven “like” buttons, beauty filters, and push notifications—specifically to maximize user engagement and advertising revenue, regardless of psychological harm to young users. The defendants are Meta Platforms Inc. and Alphabet Inc. (Google’s parent company).
Unlike settlement agreements or regulatory fines, this jury trial requires jurors to decide liability based on competing evidence from both plaintiffs and defendants. The plaintiff must prove that the platforms’ design choices were intentional, that they caused harm, and that the companies knew of the risks. The defendants argue that while their platforms have engagement features, they also provide tools for parental controls, screen time limits, and mental health resources. For plaintiffs considering similar claims, this trial demonstrates that juries are willing to hear detailed expert testimony about algorithm design and psychological impact—but also that proving causation and intent remains challenging.

Evidence and Testimony in the Jury Deliberation
Over the course of the trial, jurors heard testimony from the plaintiff describing her personal struggle with social media use, from mental health experts explaining how infinite scrolling and algorithmic feeds can contribute to anxiety and depression, and possibly from former employees or technical experts discussing how platforms are engineered for engagement. The defendant companies presented their own experts and evidence about the voluntary nature of social media use, the availability of privacy controls, and research suggesting that correlation between social media use and mental health issues does not prove causation.
A critical limitation in social media harm cases is isolating the platform’s responsibility from other contributing factors—family circumstances, peer relationships, pre-existing mental health conditions, and school stress all influence a young person’s wellbeing. Jurors must weigh whether Meta’s and Google’s design choices were the primary cause of the plaintiff’s harm or merely one factor among many. This burden of proof explains why the jury is taking considerable time: determining causation in complex psychological cases requires careful deliberation over expert testimony, competing studies, and the specific facts of this plaintiff’s situation.
Why This Case Matters as a Bellwether Test
This Los Angeles trial is widely recognized as a bellwether case—a test case that could set legal and financial precedent for thousands of similar lawsuits pending across the country. If the jury finds Meta and Google liable, plaintiffs’ attorneys will cite the verdict to support hundreds or thousands of other cases. If the jury rules in favor of the defendants, it sends a message that proving intentional addiction design is extremely difficult, which could stall other litigation.
The financial stakes are enormous: while the LA case itself may award damages specific to this plaintiff (potentially in the millions), a favorable verdict for plaintiffs could unlock settlements or judgments affecting millions of other users who claim harm from social media. The timing is significant because this verdict will arrive amid heightened regulatory scrutiny. State attorneys general, federal lawmakers, and the FTC have all increased pressure on Meta and Google over algorithm transparency and youth safety. A jury verdict holding these companies liable for addictive design could influence whether Congress passes legislation restricting how platforms target younger users or how they use engagement metrics in their algorithms.

The New Mexico Child Safety Verdict and Its Implications
While the LA jury deliberates, a separate legal battle already concluded: on March 24, 2026, a New Mexico jury returned a verdict finding Meta liable on all counts in a child safety case brought by the state’s attorney general. The jury determined that Meta engaged in unfair, deceptive, and unconscionable trade practices by downplaying its safety measures while knowing that bad actors could use its platforms to contact children. The verdict resulted in a $375 million judgment against Meta—a significant financial blow and the first major jury verdict specifically targeting Meta for child safety failures rather than a settlement.
This verdict strengthens the legal foundation for future youth-focused cases. The New Mexico jury saw evidence that Meta executives understood children’s vulnerability on their platforms, heard testimony from whistleblower employees, and reviewed findings from undercover investigations showing how easily predators could contact minors. The fact that jurors found liability “on all counts” suggests the evidence was compelling and that juries are willing to hold social media companies accountable when presented with clear proof of misconduct. For the LA addiction case, the New Mexico verdict may influence how jurors view Meta’s credibility and intentions—if the company was found liable for knowingly exposing children to harm, were its design features truly as innocent as it claims?
Jury Deadlock Risk and What Happens Next
The judge’s recent warning that the jury may fail to reach a verdict signals a potential hung jury, meaning jurors are so divided that reaching unanimity is impossible. In that scenario, the judge could declare a mistrial. If a partial retrial is ordered, some counts might proceed to a new jury while others are dismissed or settled. A mistrial would be a partial defeat for both sides—plaintiffs would not secure a verdict, but the judge’s acknowledgment of jury difficulty also suggests the case has sufficient merit to warrant another trial rather than immediate dismissal.
The jury’s struggle likely reflects the genuine complexity of these cases. Even if jurors believe the platforms were negligent or acted carelessly, proving intentional misconduct designed specifically to addict users is a higher legal bar. Some jurors may believe Meta and Google bear responsibility, but not at the level required by the law. Others may think the evidence simply doesn’t meet the burden of proof. Observers should note that a hung jury or mistrial does not mean plaintiffs have lost—it means the system is functioning exactly as designed: requiring clear consensus before imposing liability on major companies.

What Evidence Courts Are Examining Across These Cases
Both the LA addiction trial and the New Mexico child safety case rely on documentary evidence, expert testimony, and internal company communications. Jurors in the LA case have likely reviewed algorithm design documents, testimony about how Meta’s feed prioritizes engagement, and internal emails discussing the platform’s impact on user behavior. The New Mexico case similarly presented internal communications showing what Meta knew about child safety risks.
These trials represent a shift toward transparency: social media companies can no longer argue that their algorithms are inscrutable “black boxes” if jurors and judges can see the underlying intent behind design choices. One example of compelling evidence is the “like” button and engagement metrics. Expert witnesses have testified that Meta deliberately designed the “like” feature to create a reward loop similar to gambling, with unpredictable intervals of engagement that psychologically reinforce continued use. When jurors can see internal documents where engineers or product managers discuss maximizing engagement specifically to increase advertising impressions, that evidence becomes particularly damaging to the defendant’s position.
The Broader Legal Landscape and Future Implications
These jury verdicts and ongoing deliberations are part of a larger reckoning with social media’s business model. Regulators, lawmakers, and courts are increasingly challenging the assumption that platforms should optimize purely for engagement and advertising revenue without regard to user wellbeing. Several states have proposed legislation requiring social media companies to offer safer design defaults for minors—such as limiting algorithmic feeds, disabling infinite scrolling by default, or restricting targeted advertising to young users.
The outcome of the LA jury’s deliberation will likely influence settlement negotiations in hundreds of pending cases and possibly accelerate legislative action. If Meta and Google lose, expect rapid increases in settlement offers and demands to change platform design. If they win, plaintiffs’ attorneys will likely refine their legal theories and focus more heavily on child-specific claims rather than general addiction. Either way, the era of social media platforms operating without legal accountability for design choices appears to be ending.
