On March 24, 2026, a jury in Santa Fe, New Mexico delivered a historic verdict against Meta: $375 million in civil penalties for violating the state’s consumer protection laws through false statements about platform safety and for enabling child sexual exploitation on Facebook, Instagram, and WhatsApp. This marks the first jury verdict against Meta on child safety charges—and the jury ruled against the company on every single count. It’s a watershed moment in the wave of state attorney general lawsuits targeting tech platforms over their handling of child predators and the mental health harms children face online.
This verdict doesn’t just affect Meta. More than 40 state attorneys general now have similar lawsuits pending against the tech giant, and this New Mexico win signals that juries are willing to hold Meta accountable where previous settlements and regulatory action have failed. The verdict is based on a 2023 undercover investigation where New Mexico investigators created fake social media accounts posing as children under 14 and documented how easily adults could find them, send them sexually explicit material, and solicit inappropriate responses.
Table of Contents
- What Did the New Mexico Jury Find Against Meta?
- How Did Investigators Prove Meta Enabled Child Exploitation?
- What Happens Next, and What Is Meta’s Path Forward?
- Why This Verdict Is a Turning Point for Other State Lawsuits
- What Are the Limits of This Verdict for Other Cases?
- What Should Parents and Users Know?
- What Does This Mean for Tech Regulation Long-Term?
What Did the New Mexico Jury Find Against Meta?
The jury didn’t just find Meta liable—it found the company violated New Mexico’s consumer protection law on multiple fronts. Specifically, the jury determined that Meta made false or misleading statements about the safety of its platforms. More damaging still, the jury found that Meta engaged in “unfair and deceptive” trade practices and “unconscionable” trade practices that exploited children’s particular vulnerabilities. Beyond the safety claims, the jury also found that Meta’s platforms enabled child sexual exploitation and caused demonstrable harm to children’s mental health and safety.
What makes this verdict significant is that it wasn’t a narrow loss on a single claim—it was a complete loss on every count. Meta had argued that its platforms have safety features and that it works to remove predators, but the jury rejected those arguments entirely. The verdict represents a finding that Meta’s own statements about how it protects children were fundamentally false, and that the company knowingly or recklessly harmed minors through its practices. This is different from prior settlements where Meta paid money but didn’t necessarily admit wrongdoing or face a jury determination of liability.

How Did Investigators Prove Meta Enabled Child Exploitation?
The evidence came from a creative but troubling investigation. In 2023, New Mexico Attorney General Raúl Torrez’s office created undercover accounts posing as users under age 14 on Facebook and Instagram. The investigators documented what happened next: within a short period, adults found these accounts and sent sexually explicit material, attempted to engage the accounts in sexual conversations, and solicited explicit content from what the accounts presented as minors. The investigation exposed a systemic problem—Meta’s age verification, content filters, and predator detection tools either didn’t work or weren’t deployed effectively where minors congregated.
However, this undercover investigation doesn’t prove that Meta intentionally created these pathways for predators. Instead, it proved negligence and failure to implement safeguards despite claiming to have them. The jury’s finding of “unconscionable” practices suggests they believed Meta knew (or should have known) about these risks and failed to act proportionate to the danger. This is a key distinction that will likely feature prominently in Meta’s appeal—the company will argue that billions of users and unprecedented scale make perfect safety impossible, even as the jury concluded Meta’s conduct crossed the line into unconscionable behavior.
What Happens Next, and What Is Meta’s Path Forward?
Meta’s immediate response was to announce it “respectfully disagrees with the verdict and will appeal.” That appeal is almost certain to be lengthy and complex, likely centering on whether a jury can impose such a large penalty on a platform based on general failures to prevent all bad conduct, and whether Meta’s statements about safety were truly false or merely optimistic. The appeal will probably cite Meta’s investments in safety teams, removal of content, and law enforcement cooperation. But before the appeal even begins, there’s another critical phase: a hearing scheduled for May 4, 2026, to determine whether Meta created a public nuisance and, if so, what remedies should follow.
Public nuisance findings can lead to injunctive relief requiring specific operational changes—not just financial penalties. This means Meta could face court orders to implement specific child safety measures, conduct audits, hire monitors, or restructure how it moderates content. For Meta, a public nuisance finding might be more costly in the long run than the $375 million verdict, because it would create ongoing obligations and potential liability for non-compliance.

Why This Verdict Is a Turning Point for Other State Lawsuits
Before New Mexico’s jury verdict, no state attorney general had successfully taken a major tech company to trial and won on child safety and exploitation claims. Settlement agreements exist, sure—Meta has paid settlements to the FTC and others—but settlements don’t establish liability or create legal precedent the way jury verdicts do. This New Mexico verdict now exists in the public record and, more importantly, it signals to juries in other states that Meta’s defenses aren’t airtight. More than 40 state attorneys general have filed lawsuits against Meta using similar legal theories.
Some of these cases will now reference New Mexico’s jury verdict as evidence that courts and juries take these allegations seriously. States like California, Texas, and others are watching closely. The verdict also emboldens state AGs because it shows that despite Meta’s enormous legal resources and lobbying power, a jury of ordinary citizens can still hold the company accountable. This creates momentum for other cases moving through discovery or approaching trial, and it may pressure Meta to settle other state cases rather than risk more jury trials.
What Are the Limits of This Verdict for Other Cases?
One important caveat: this verdict is specific to New Mexico’s consumer protection law and the facts of this case. Other state laws differ, and some courts may view child safety liability differently based on Section 230 of the Communications Decency Act, which generally shields platforms from liability for user-generated content. New Mexico’s lawsuit focused on Meta’s own statements and practices, which may sidestep some Section 230 defenses, but courts in other states might reach different conclusions on similar facts.
Additionally, the $375 million penalty, while large, represents less than one week of Meta’s operating revenue—the company’s annual revenue has recently been roughly $150 billion. This means the financial penalty, standing alone, may not significantly change Meta’s business model or incentives unless similar verdicts accumulate. However, the reputational impact, the precedent, and the potential for injunctive relief in future cases (like the public nuisance hearing) carry greater long-term weight. If Meta loses the May 4 public nuisance hearing and is ordered to implement specific child safety measures, the operational burden could be far more consequential than the monetary fine.

What Should Parents and Users Know?
For parents and users, this verdict is a validation of concerns that have been building for years. The undercover investigation and jury finding confirm what many parents suspected: that predators can find minors on Meta’s platforms more easily than the company’s marketing materials suggest. The verdict doesn’t create new legal rights for individual users to sue Meta directly, but it does reinforce that platform safety claims should be viewed skeptically and that parents should use age restrictions, privacy controls, and monitoring rather than relying on Meta’s stated safeguards.
The verdict also shows that state attorneys general are willing to use consumer protection law to challenge tech companies. If you have concerns about Meta’s practices in your state, contacting your state attorney general’s office or documenting harmful experiences may contribute to ongoing investigations. While New Mexico won this particular verdict, its legal theories and evidence may inspire action in other states facing the same problems.
What Does This Mean for Tech Regulation Long-Term?
This verdict is part of a broader shift in how lawmakers and courts are treating tech platforms. Rather than relying solely on the FTC’s settlement power or federal regulation (which Congress has not yet passed comprehensively), state attorneys general are using existing consumer protection statutes to hold platforms accountable. This approach sidesteps some of the Section 230 defenses that shield platforms from liability and focuses instead on the company’s own statements and conduct. The verdict also reinforces that there’s no liability shield for making false statements about safety, even for platforms.
Meta claimed its platforms were safe for children; the jury found that claim false, and false advertising of that kind can trigger liability. Looking forward, this creates a precedent suggesting that other tech companies (TikTok, YouTube, Snapchat, and others) should examine their own safety claims closely. If those claims are overstated, similar verdicts could follow. The Meta verdict may also accelerate efforts to pass federal legislation on child safety online, as states and platforms grapple with the costs and uncertainty of ongoing litigation.
