Jury Continues Deliberations in Case Against Meta and Google Over Addiction Claims

As of March 25, 2026, a Los Angeles jury continues deliberations in a landmark case where a 20-year-old plaintiff identified as “Kaley G.M.” claims social media platforms Meta and Google-owned YouTube caused her anxiety, depression, and body dysmorphia starting in adolescence. The jury has now entered Day 8 of deliberations and has signaled to the judge a potential deadlock on one of the defendants, prompting Los Angeles Superior Court Judge Carolyn B. Kuhl to warn that a partial retrial may be necessary if the jury cannot reach consensus. This case is critical because the outcome could establish legal precedent that holds social media platforms liable for addiction-related mental health harms—affecting thousands of pending lawsuits nationally.

The jury must determine whether Meta’s Facebook and Instagram, along with Google’s YouTube, were a “substantial factor” in causing the plaintiff’s mental health damage, even when other life stressors were present. The trial has exposed competing narratives about responsibility: the plaintiff’s team argues that these platforms deliberately designed features to maximize engagement and hook young users, while Meta and YouTube’s defense maintains that the plaintiff’s “turbulent home life” and pre-existing mental health vulnerabilities were the actual causes of her suffering. Adding urgency to these Los Angeles deliberations is a separate New Mexico verdict announced just days earlier, where a jury found Meta liable on all counts and awarded $375 million in damages for unconscionable and deceptive trade practices—marking the first time Meta has been held accountable in a jury trial for creating features harmful to children’s mental health.

What Specific Addiction Claims Is the Jury Deciding?

The plaintiff, identified as “Kaley G.M.,” was 20 years old at the time of the trial and claims that her use of Facebook, Instagram, and YouTube beginning in adolescence caused quantifiable mental health injuries. According to her allegations, the platforms’ design features deliberately created patterns of compulsive use that contributed to anxiety, clinical depression, and body dysmorphia—a condition where she became obsessed with perceived flaws in her physical appearance. The case centers on whether these mental health conditions would have developed to the same degree without the platforms’ influence, or whether Meta and YouTube’s specific design choices—such as algorithmic feeds designed to maximize “engagement,” infinite scroll features, and recommendation systems that prioritize emotionally charged content—were a “substantial factor” in her psychological decline. This is not a claim that social media caused her mental health issues in isolation.

Rather, the legal standard the jury must apply is whether the platforms were a substantial contributing factor alongside other life circumstances. The plaintiff’s testimony revealed that she grew up in what her legal team characterized as a difficult home environment, yet her lawyers argue that Meta and YouTube exploited her vulnerability during critical developmental years. By contrast, the defense strategy is to prove that her turbulent family situation and any pre-existing mental health vulnerabilities were the primary cause, with the platforms merely serving as a coping mechanism rather than a driver of harm. This distinction matters enormously for liability: if the jury believes the platforms were merely a secondary influence, they may find the defendants not liable, even if they acknowledge social media played some role.

What Legal Standard Must the Jury Apply?

Judge Carolyn B. Kuhl instructed the jury that Meta and YouTube can be held liable if the evidence shows that their platforms were a “substantial factor” in causing the plaintiff’s mental health injuries. A substantial factor is a legal standard that does not require the defendant to be the sole cause—only that their conduct meaningfully contributed to the harm. This is a lower bar than sole causation because it allows for multiple contributing causes. The plaintiff’s team has presented expert testimony arguing that Facebook and Instagram’s algorithmic recommendation systems, auto-play features, and notification designs were intentionally engineered to maximize user engagement metrics, knowing that this would be particularly addictive for adolescents whose brains are still developing impulse control and judgment.

However, the defense has raised a critical challenge: even if the jury agrees that social media features are addictive and engaging, proving that a specific company’s conduct caused a specific plaintiff’s mental health condition is complicated by the existence of other contributing factors. The plaintiff had a “turbulent home life,” according to the defense, which could independently explain her depression and anxiety. Some of her mental health symptoms may have been pre-existing before she joined these platforms. If the jury is convinced that the home environment was the substantial factor and the platforms were secondary, they could find in favor of the defendants. This is why Judge Kuhl’s warning about potential deadlock is significant—the jury may be split on whether Meta’s conduct or YouTube’s conduct rises to the level of a “substantial factor,” with some jurors believing one platform met the legal standard while others disagree.

Meta Legal Outcomes Against Child Safety Claims (Recent Cases)

- New Mexico Verdict (2026): 1 case/outcome
- Los Angeles Addiction Trial (Pending): 0 cases/outcomes
- General Youth Safety Claims: 0 cases/outcomes
- Platform Design Liability Cases: 1 case/outcome
- Future Pending Lawsuits: 3,000 cases/outcomes

Source: Court records, Los Angeles Superior Court, New Mexico state court, estimated pending litigation databases

What Arguments Is the Defense Making About Causation?

Meta and YouTube’s legal teams have centered their defense on the principle that the plaintiff’s own life circumstances, not the platforms’ features, were the substantial cause of her mental health problems. They argue that her “turbulent home life” created emotional vulnerabilities and stress that naturally led to seeking comfort and connection—which she found through social media. Rather than viewing the platforms as harmful, the defense characterizes them as tools the plaintiff used to cope with her difficult family situation. The defense also points to the fact that millions of other adolescents use these same platforms without developing the severe mental health conditions the plaintiff claims.

A critical component of the defense strategy is to present expert witnesses who testify that depression, anxiety, and body dysmorphia have multiple causes—genetic predisposition, family dynamics, school peer relationships, and broader cultural pressures about appearance in society. The defense does not dispute that the plaintiff developed these conditions; instead, they argue these conditions would have developed regardless of whether she used Instagram or YouTube. If the jury credits this testimony, they may conclude that while the platforms may contribute to mental health pressures generally, they were not a substantial factor in *this particular plaintiff’s* harm. For example, the defense may argue that her body dysmorphia was shaped more by conversations with peers at school, magazine culture, and her family situation than by Instagram’s image-focused feeds—a distinction that could shield the platforms from liability in this case, even if subsequent trials might focus more directly on platform design features.

How Does the New Mexico Meta Verdict Affect This Los Angeles Case?

Just days before the Los Angeles jury entered its prolonged deliberations, a New Mexico jury reached a verdict against Meta, finding the company liable on all counts and awarding $375 million in damages. That verdict found Meta guilty of “unconscionable,” “unfair and deceptive” trade practices under New Mexico consumer protection law, with the jury specifically determining that Meta knowingly created features harmful to children’s mental health and failed to protect users from sexual predators. While the New Mexico case focused on different legal claims and damage categories than the Los Angeles trial, the verdict sends a powerful signal: at least one jury has been willing to hold Meta accountable for child safety failures in a court of law. The Los Angeles jury is certainly aware of the New Mexico outcome, though judicial instruction may limit how much weight they can give it.

The timing creates a psychological precedent—Meta has now lost at least one jury trial on child-related harms. However, the Los Angeles case operates under different legal standards (California law versus New Mexico consumer protection law) and different facts (addiction and mental health causation versus broader harms and predator protection). The defendants’ attorneys will likely argue that the New Mexico case involved clearer violations of consumer protection statutes and that the Los Angeles jury should focus solely on whether the evidence in *this* trial meets *this* state’s legal standard for liability. Still, the New Mexico verdict demonstrates that juries are capable of believing evidence that Meta acted knowing its platforms harmed young users—which could influence how the Los Angeles jury views Meta’s internal design choices and any evidence about whether the company knew certain features were habit-forming.

What Is the Significance of a Potential Deadlock and Partial Retrial?

Judge Kuhl’s warning that a “partial retrial may be necessary” indicates that the jury has signaled it may not be able to reach unanimous agreement on liability for both defendants. Jurors may be split on whether Meta bears liability versus whether YouTube (Google) does, or they may disagree on the degree of responsibility each platform shares. If the jury cannot unanimously agree on liability for one or both defendants, a mistrial may be declared as to that defendant, which could lead to a retrial of just that portion of the case with a new jury. A partial retrial would be expensive and time-consuming for all parties but would not erase what happened in this first trial.

If the jury reaches unanimous consensus on *one* defendant (finding liability against Meta but not YouTube, for example), that verdict would stand and could not be retried. However, if they cannot agree at all, the case essentially resets on the deadlocked defendant. This outcome highlights the challenge of these cases: even when evidence of addictive design features is presented, jurors may reasonably disagree about whether that evidence proves causation and substantial factor in a particular plaintiff’s specific mental health injuries. For the plaintiff, any unanimous verdict for liability would be a significant win; for the defendants, a deadlock on even one defendant is a partial victory that avoids immediate liability and damages.

Which Platform Design Features Are at the Center of the Case?

Throughout the trial, the plaintiff’s legal team has focused on specific design features that researchers and former Meta employees have testified make these platforms particularly habit-forming. These include algorithmic feeds that don’t show posts in chronological order but instead prioritize posts likely to generate engagement (likes, comments, shares), infinite scroll functionality that eliminates natural stopping points, auto-play video features that automatically advance to the next video, and notification systems that alert users when someone likes or comments on their post. For Instagram specifically, the platform’s emphasis on curated images of peers’ lives—travel photos, fitness achievements, appearance-focused content—is alleged to have directly contributed to the plaintiff’s body dysmorphia.

YouTube’s recommendation algorithm has been scrutinized for amplifying emotionally provocative videos and creating “rabbit holes” where users move from one video to the next in increasingly extreme content. The plaintiff’s testimony reportedly included details about compulsive checking of these platforms multiple times per hour, difficulty disengaging even when she wanted to stop, and the psychological reward she felt when notifications arrived. Former product managers and researchers have testified that Meta and Google explicitly track engagement metrics and optimize features to maximize time-on-platform, creating a tension between user welfare and business objectives. However, the defense argues that these features reflect ordinary engineering practices to make platforms useful and entertaining, not a conspiracy to addict children.

What Could This Verdict Mean for Future Social Media Litigation?

If the Los Angeles jury returns a verdict finding Meta and/or YouTube liable, it would mark a watershed moment in litigation against social media companies. There are thousands of pending lawsuits alleging similar harms—teenagers and young adults claiming that platform use caused depression, anxiety, self-harm, and other mental health injuries. A verdict in favor of the plaintiff would strengthen the legal theory that causation can be proven in these cases and would provide a blueprint for future plaintiffs. Conversely, a defendant verdict would not end these lawsuits but would signal to defendants’ attorneys that juries can be convinced to reject causation claims even when design features are shown to be addictive.

The case also reflects evolving legal recognition of what researchers have documented: that social media platforms, particularly for adolescents, create patterns of compulsive use that have psychological consequences. Whether courts and juries will hold companies legally and financially liable for those consequences—given that parents, peers, schools, and cultural norms also influence mental health—remains an open question. The verdict, whatever it is, will likely be appealed, extending the timeline for final resolution and ultimate precedent-setting. In the meantime, legislators in several states are considering regulations that would restrict how platforms can market to minors and how algorithmic recommendations can be deployed for young users—suggesting that pressure for industry change now extends beyond litigation alone.
