A jury in New Mexico has delivered a landmark verdict against Meta, ordering the company to pay $375 million in damages after finding it liable for failing to protect children and knowingly harming their mental health. Simultaneously, a federal jury in California has been deliberating for more than a week on whether Meta and YouTube should face liability for using addictive design practices that targeted young users. These jury decisions represent a watershed moment in Big Tech litigation—the first time major social media companies face actual jury verdicts rather than negotiated settlements, and the implications could reshape how courts treat tech company accountability for decades.
The New Mexico verdict, handed down March 24-25, 2026, is among the first Big Tech social media cases to reach a full jury trial and produce a verdict favorable to consumers. The jury found that Meta violated New Mexico's unfair and deceptive practices act and failed to warn consumers about dangers to children, including risks of child sexual exploitation. The California case adds another dimension: it centers on whether platforms deliberately designed addictive features targeting minors, a question that could affect how 40+ state attorneys general pursue similar lawsuits against Meta and other tech giants.
Table of Contents
- What the New Mexico Jury Found Against Meta
- The California Addiction Case—Jury Discord and Ongoing Deliberations
- Why These Verdicts and Ongoing Cases Matter for Future Tech Litigation
- The Domino Effect—What New Mexico’s Verdict Means for Other State Cases
- The Hidden Complexity—What Makes Proving Tech Harm So Difficult
- Meta and YouTube Under Scrutiny—The Tech Industry Watches Closely
- The Broader Tech Regulation Landscape—What These Cases Signal for the Future
What the New Mexico Jury Found Against Meta
The New Mexico jury's decision to find Meta liable on all counts sends a powerful signal: juries are willing to hold tech companies accountable for harm to minors. The jury determined that Meta engaged in unfair and deceptive practices by failing to adequately protect children while knowing that its platform posed documented risks. Specifically, the verdict found that Meta did not warn consumers about dangers including child sexual exploitation, a particularly grave concern given the platform's documented issues with predators targeting minors. The $375 million judgment reflects the jury's assessment that Meta's conduct was willful and reckless enough to warrant substantial damages. This is notably different from many tech settlements, where companies pay agreed-upon amounts without admitting wrongdoing.
Here, a jury explicitly determined fault after hearing evidence and deliberating. The verdict also carries weight because it applies New Mexico consumer protection law, opening a pathway for similar claims in other states with comparable statutes. What makes this verdict historically significant is that it succeeds where many previous attempts at tech regulation have faltered. Unlike regulatory fines, which companies often treat as a cost of doing business, jury verdicts establish legal precedent and create liability exposure that can trigger settlements in related cases. The New Mexico case shows that when regulators bring evidence of specific harms and platform negligence before a jury, jurors are capable of awarding meaningful damages.

The California Addiction Case—Jury Discord and Ongoing Deliberations
While New Mexico's verdict has been finalized, the California federal court case involving K.G.M., a 20-year-old from Chico, remains in active deliberation. The jury has been deliberating for more than a week, a sign of the complexity and contentious nature of the questions before it. As of March 24, the jury signaled to the judge that it was having difficulty reaching consensus on one of the defendants, indicating that jurors may be split on whether Meta, YouTube, or both companies should face liability for using addictive design practices. K.G.M.'s case alleges that she was deliberately targeted by Facebook and YouTube's "addictive practices" during her youth.
Unlike the New Mexico case, which focused on child safety and exploitation, this trial directly confronts the business model question: should platforms that engineer their algorithms and interfaces to maximize user engagement, particularly among minors, face legal liability? The jury's hesitation on one defendant could suggest that some jurors believe one company bears more responsibility than the other, or that the evidence against one platform was stronger. However, the week-long deliberation should not be read as a sign of weakness in the plaintiffs' case. Complex cases often require extended deliberation, especially when jurors must weigh competing expert testimony about neuroscience, addiction, and platform design. The mere fact that the jury has deliberated this long on a tech addiction case signals that jurors take the claims seriously. A quick verdict would have suggested overwhelming evidence one way or the other; the length of these deliberations reflects the genuine difficulty of proving causation and liability in tech design cases.
Why These Verdicts and Ongoing Cases Matter for Future Tech Litigation
The New Mexico and California cases represent a fundamental shift in how tech companies face legal accountability. For decades, Meta and other social media giants have largely avoided jury trials by settling cases before trial or by having regulators impose fines. Jury trials are inherently unpredictable—jurors may sympathize with harmed consumers in ways that negotiators at settlement tables do not. The New Mexico verdict proves that jurors, when presented with clear evidence of harm and negligence, are willing to award substantial damages. The California case is particularly significant because it targets the business model itself—algorithmic engagement optimization. If the jury returns a verdict holding Meta or YouTube liable for addictive design, it would establish that platforms cannot escape liability merely by claiming they are neutral technology companies.
Instead, they could be held responsible for specific design choices made to maximize engagement, even when those choices harm minors. Such a verdict would likely inspire a wave of similar lawsuits in other jurisdictions. Importantly, the New Mexico verdict already has implications for the 40+ state attorneys general lawsuits pending against Meta. Those cases are likely still in discovery or pre-trial stages, but they can now cite the New Mexico jury's findings as evidence that Meta's practices violate consumer protection law. Meta's counsel will argue that one jury's decision doesn't determine outcomes in other states, but the New Mexico verdict will nonetheless loom over subsequent cases. This creates a momentum effect that often accelerates settlement discussions.

The Domino Effect—What New Mexico’s Verdict Means for Other State Cases
The $375 million New Mexico verdict already appears to be catalyzing discussions among other state attorneys general. When one state achieves a jury verdict against a major tech company, other states recognize both the vulnerability and the opportunity. Several attorneys general have cases against Meta in various stages, and they can now argue to their juries that another state jury has already found Meta liable under similar consumer protection theories. However, the domino effect works both ways.
Meta and other tech companies will likely appeal the New Mexico verdict aggressively, arguing that the jury applied consumer protection law too broadly or that damages were excessive. A successful appeal would undermine the precedent that other states are attempting to use. Conversely, if the New Mexico verdict survives appellate review, each subsequent state case becomes easier to win because the legal question of Meta’s liability will have been partially settled by higher courts. The states with the strongest cases against Meta are likely those that can point to specific harms to their residents, comprehensive documentation of the platform’s failures to protect minors, and clear evidence that Meta knew about these risks. New Mexico’s success may also embolden private attorneys pursuing similar claims on behalf of individual users or groups of minors.
The Hidden Complexity—What Makes Proving Tech Harm So Difficult
One reason the California jury has deliberated for over a week is that proving tech harm is genuinely complex. In the New Mexico case, the jury had to evaluate whether Meta's failure to protect children from exploitation constituted a deceptive practice under state consumer protection law. That requires establishing what Meta knew, when it knew it, and whether a reasonable consumer would have accepted those risks if Meta had disclosed them. In the California case, the jury must decide whether the plaintiff has proven causation between specific design features and addiction-like behavior, a far more technical question. Addiction is particularly difficult to prove in a jury trial because it involves neuroscience, behavioral psychology, and expert disagreement.
Meta's expert witnesses argued that social media engagement is not the same as addiction, that millions of people use Facebook and Instagram without developing problematic patterns, and that parental supervision and user agency play important roles. Plaintiff experts countered with brain imaging studies and testimony about how engagement algorithms are designed to trigger dopamine responses. A jury must navigate this technical disagreement while trying to assess credibility. Moreover, even if a jury believes that a platform's design features are addictive, proving that those features specifically caused harm to the plaintiff requires isolating the platform's impact from other factors in the user's life: family circumstances, genetics, peer relationships, and other media exposure. This is why addiction cases are so challenging: they demand proof of a direct causal link that can be difficult to establish even under the civil preponderance-of-the-evidence standard.

Meta and YouTube Under Scrutiny—The Tech Industry Watches Closely
These cases put Meta and YouTube in a particularly vulnerable position because both companies have faced years of public criticism about their handling of minors and their engagement-maximization algorithms. Internal documents that have surfaced through litigation show that Meta was aware of mental health risks to teens, a fact that strengthens arguments that the company acted with knowledge of potential harm. YouTube faces similar scrutiny about its recommendation algorithm, which has been documented to promote increasingly extreme content to young users.
Other tech companies are watching these cases closely because the legal theories being tested could potentially apply to them as well. TikTok, Snapchat, and other social platforms with significant youth user bases could face similar lawsuits if they cannot demonstrate that they have taken reasonable steps to protect minors. The question of what constitutes “reasonable” protection is still being litigated, but the New Mexico verdict suggests that juries expect more than terms-of-service disclaimers and age gates.
The Broader Tech Regulation Landscape—What These Cases Signal for the Future
The combination of the New Mexico verdict and the ongoing California deliberations signals that jury trials may become a more common path for holding tech companies accountable. Regulatory agencies have been struggling for years to impose meaningful restrictions on social media platforms, but jury verdicts create a different kind of pressure. They establish liability in ways that settlements do not, and they create legal precedents that can be applied in future cases.
Looking forward, we can expect to see more cases proceed to jury trial rather than settling. Tech companies may find that the cost of defending jury trials exceeds the cost of settling, especially as verdicts like New Mexico's establish that damages for harm to minors can be substantial. Simultaneously, this creates an opening for meaningful changes to platform design and practices, because companies may decide that algorithmic changes to reduce engagement among minors are preferable to facing continued jury liability. The cases currently in deliberation and those pending in 40+ states may reshape the business models of the most profitable social media platforms in ways that regulation alone has failed to achieve.
