As of March 25, 2026, no verdict has been reached in the Los Angeles trial where a jury is weighing evidence in a social media addiction case against Meta and YouTube. However, the landscape shifted dramatically on March 24, 2026, when a New Mexico jury awarded $375 million in civil penalties against Meta for knowingly harming children’s mental health and concealing knowledge of child sexual exploitation on its platforms—the first state courtroom victory against a major tech company over child safety claims. The Los Angeles jury, which began deliberations on March 13, 2026, is still struggling to reach a unanimous verdict on at least one defendant after a month-long trial; jurors have moved beyond the liability phase and are now considering financial damages.
The stakes are enormous. The Los Angeles verdict alone could influence thousands of similar lawsuits pending in federal court across the nation. Young people and their families who believe social media platforms deliberately engineered addiction-like dependencies that harmed their mental health are watching closely. Meanwhile, the platforms continue to argue that their services are not clinically addictive and that parents and users have control over their own usage through available safety tools.
Table of Contents
- What Is the Status of the Los Angeles Social Media Addiction Trial?
- New Mexico’s $375 Million Verdict: What Changed on March 24, 2026
- The Broader National Litigation Landscape: 2,407 Pending Cases
- What Do These Lawsuits Allege About Addictive Social Media Features?
- How Are Meta, Google, and Other Platforms Defending Themselves?
- The Timeline Ahead: What Happens Next in These Cases
- Why These Cases Matter Beyond the Courtroom
What Is the Status of the Los Angeles Social Media Addiction Trial?
The Los Angeles trial began in early February 2026 and centered on the case of a young woman identified as Kaley (using the initials KGM), who alleges that Meta’s and YouTube’s platforms caused her severe depression and suicidal ideation through addictive design features. The jury began deliberations on the morning of March 13, 2026, and after signaling possible disagreement, resumed deliberations on Tuesday, March 25, 2026. As of late March 2026, jurors had moved beyond deciding whether the platforms were liable and were actively considering what financial damages to award—a significant milestone indicating they believed harm had occurred.
However, the jury has struggled to reach unanimity, with reports suggesting disagreement over at least one defendant. This is not unusual in complex civil litigation involving major corporations and novel legal theories, but it raises the question of whether a mistrial could be declared if consensus proves impossible. The length of the jury’s deliberations and the scope of the evidence reviewed over the month-long trial suggest jurors took the case seriously, examining testimony from addiction experts, platform engineers, and the plaintiff herself.

New Mexico’s $375 Million Verdict: What Changed on March 24, 2026
On Tuesday, March 24, 2026, a New Mexico jury reached a verdict that sent shockwaves through the tech industry: Meta was ordered to pay $375 million in civil penalties under New Mexico’s consumer protection laws. The award reflects the statutory maximum of $5,000 per violation; at that rate, $375 million corresponds to 75,000 violations. The state of New Mexico had sought approximately $2.1 billion, so while the $375 million award was substantial, it was far less than what the state requested.
The jury found that Meta violated consumer protection laws by knowingly harming children’s mental health, concealing knowledge of child sexual exploitation occurring on its platforms, and misleading users about the safety of its platforms for children. The verdict is historic: it marks the first time a state has prevailed at trial against a major technology company over child harm claims. Meta has stated it disagrees with the verdict and plans to appeal. The case does not end with the financial penalty; a non-jury phase will begin on May 4, 2026, in which a judge will determine public nuisance liability and whether funding should be directed to public programs addressing the alleged harms.
The Broader National Litigation Landscape: 2,407 Pending Cases
The Los Angeles and New Mexico cases are not isolated incidents but represent the visible tip of a much larger litigation wave. The Northern District of California is overseeing a multidistrict litigation (MDL) comprising 2,407 pending claims against defendants including Meta (Instagram and Facebook), Google (YouTube), TikTok, and Snapchat. Judge Yvonne Gonzalez Rogers is managing this complex litigation, which has grown by approximately 100 new cases in the last month alone, suggesting continued momentum in filings. Settlement dynamics vary by defendant.
TikTok and Snapchat have already reached settlements to resolve their exposure, but Meta and Google continue to fight the allegations rather than settle. This means the outcomes in the Los Angeles and New Mexico trials could significantly influence whether Meta and Google eventually choose to settle the broader MDL or continue defending themselves in court. A three-judge panel of the 9th U.S. Circuit Court of Appeals recently appeared skeptical of the defendants’ Section 230 arguments for dismissal, leaning toward allowing the MDL to proceed—a development that increases pressure on defendants to consider settlement or prepare for extended litigation.

What Do These Lawsuits Allege About Addictive Social Media Features?
The lawsuits center on specific platform design features that plaintiffs argue create addiction-like dependencies in vulnerable users, particularly minors. The alleged addictive features include infinite scrolling (continuous feeds that never end, encouraging endless browsing), push notifications (alerts that pull users back into apps), and algorithm-driven recommendations (systems that serve content designed to maximize engagement rather than user wellbeing). These mechanisms are not accidental design choices but deliberate engineering decisions, plaintiffs argue, based on internal platform knowledge that such features drive usage and engagement metrics.
The harm alleged goes beyond mere “screen time.” Plaintiffs point to documented mental health impacts including depression, anxiety, and suicidal ideation, particularly among teenagers and young adults whose brains are still developing and may be more vulnerable to dependency. The lawsuits also allege that Meta and Google knew about these harms through internal research but failed to disclose them to users or parents, and in some cases actively concealed what they knew. For example, the New Mexico verdict specifically found Meta liable for concealing knowledge of child sexual exploitation occurring on its platforms—suggesting that the harms go beyond addiction to include safety risks enabled by the platform’s design and policies.
How Are Meta, Google, and Other Platforms Defending Themselves?
The tech companies’ defense strategy centers on several key arguments. First, they contend that their services are not clinically addictive in the way that substances like alcohol or drugs are, and that users can simply choose to stop using the platforms or reduce their usage at will. Second, they point to the availability of parental controls, usage monitoring tools, and safety features that allow users and parents to manage time spent on platforms. Third, they argue that users benefit from the same recommendation algorithms that plaintiffs claim are addictive—that these systems help users discover content relevant to their interests.
However, a critical limitation of the “parental controls” defense is that it places responsibility on parents and users to manage exposure to platforms that companies have engineered to maximize engagement. For young teenagers, many of whom lack fully developed impulse control and decision-making capacity, this puts an unrealistic burden on individual choice. Meta’s response to the New Mexico verdict—stating disagreement and planning to appeal—suggests the company intends to continue this defense strategy rather than acknowledge the validity of the claims. The outcomes in Los Angeles and the appeal of the New Mexico verdict will test whether courts find the platforms’ arguments persuasive.

The Timeline Ahead: What Happens Next in These Cases
The Los Angeles jury’s ongoing deliberations remain the immediate focus. A verdict could come within days or weeks, or a mistrial could be declared if jurors cannot reach unanimity. Once the Los Angeles verdict is reached, it will likely influence settlement discussions in the broader MDL and could affect whether Meta and Google choose to continue fighting or move toward resolution. In New Mexico, the non-jury phase beginning May 4, 2026, will address public nuisance liability—a legal theory that could broaden the scope of damages beyond consumer protection violations.
If the judge finds public nuisance liability, it could open the door to court-ordered funding for programs serving affected consumers. The appeals process is also critical: Meta has signaled it will appeal the $375 million New Mexico verdict, which means years of further litigation could lie ahead. However, the speed with which the New Mexico jury reached its verdict after a full trial, and the magnitude of the penalty, suggest that jurors, at least in this case, found the evidence compelling. This could embolden other plaintiffs bringing similar claims and may encourage other states to pursue similar litigation.
Why These Cases Matter Beyond the Courtroom
The social media addiction lawsuits represent a fundamental question about corporate responsibility in the digital age: Can companies be held liable for harm caused by intentional design choices that maximize engagement, even if those choices cause documented mental health damage to users? The New Mexico verdict suggests at least one jury answered yes. If the Los Angeles jury reaches a similar conclusion, and if Meta and Google lose appeals, it could force significant changes to how social media platforms are designed and operated in the United States. Beyond legal liability, these cases have already influenced public perception and regulatory scrutiny.
Legislators in multiple states have proposed bills to regulate social media design, particularly regarding features targeting minors. The verdicts and pending litigation provide evidence that such regulation may be justified and that consumers themselves believe platforms have acted wrongfully. For families affected by the harms alleged—young people who experienced depression, anxiety, or worse—these lawsuits represent a potential avenue for accountability and compensation.
