As of mid-March 2026, a jury is actively deliberating in a landmark case alleging that Meta’s Instagram and Facebook, along with Google’s YouTube, deliberately designed addictive features that harmed a young woman’s mental health. The jury began their deliberations on March 13, 2026, and has continued asking detailed questions about damages calculations—a sign that they may be considering holding the companies liable. The case involves a 20-year-old plaintiff from Chico, California, who alleges that her use of these platforms starting at ages 9 (Instagram) and 6 (YouTube) led to addiction, depression, body dysmorphia, anxiety, and suicidal ideation. This trial matters far beyond this one case: it is the first “bellwether” trial among approximately 1,600 similar lawsuits filed against Meta, Google, TikTok, and Snap, meaning the jury’s verdict is expected to set a precedent that could shape settlements and liability findings for thousands of other plaintiffs nationwide.
This article explains where the trial stands, what the jury is deciding, what it means for similar cases, and what consumers should know about social media harm litigation. The trial itself concluded with closing arguments on March 12, 2026, after approximately six weeks of testimony from addiction experts, mental health professionals, platform engineers, and corporate executives. Judge Carolyn Kuhl presided over the proceedings, which began with jury selection on January 27, 2026, and trial testimony starting February 10, 2026. During the first week of deliberations, the jury sent multiple questions to the judge, including inquiries about the plaintiff’s family circumstances and her patterns of platform usage. By March 20, 2026—the sixth day of deliberation—the jury’s questions had shifted to focus on calculating damages, suggesting they are working through the legal mechanics of potentially holding the defendants responsible for financial compensation.
Table of Contents
- What Is the Jury Being Asked to Decide in This Social Media Addiction Case?
- How Does This Case Fit Into the Larger Wave of Social Media Litigation?
- What Evidence Did the Trial Reveal About Platform Design and Youth Harm?
- What Types of Damages Might the Jury Award If They Find the Companies Liable?
- What Are the Legal Hurdles the Jury Had to Clear to Find Meta and Google Liable?
- Why Did Meta and Google Choose to Fight This Case Instead of Settling?
- What Happens Next and What This Case Could Mean for Other Plaintiffs?
What Is the Jury Being Asked to Decide in This Social Media Addiction Case?
The jury must determine whether Meta and Google are legally liable for knowingly designing their platforms to be addictive in ways that caused this plaintiff specific, measurable harm. The plaintiff’s legal team presented evidence that both companies knew their features—infinite scroll, algorithmic recommendations, notification systems, and engagement metrics—were designed to maximize user time and create habit-forming behaviors. Experts testified that the companies’ own internal documents and research showed awareness of these harms, particularly to young users. The defendants argued that while their platforms do engage users, they provide tools for parental controls, and the plaintiff’s harms stemmed from her own choices and her family environment, not from the companies’ intentional design.
The jury must weigh whether the evidence shows Meta and Google acted with the level of intent and negligence required by California law to hold them liable. If the jury finds liability, their next task is to calculate damages—money compensation the companies would owe to the plaintiff. This is why their recent questions focused on damage calculation methods. The plaintiff likely pursued compensation for medical expenses, mental health treatment, lost wages or educational opportunities, and “pain and suffering” damages related to her depression, anxiety, and suicidal thoughts. The jury must decide not only if harm occurred, but whether that harm can be quantified in dollars and what amount is proportional to the injuries described.

How Does This Case Fit Into the Larger Wave of Social Media Litigation?
This trial is not an isolated event but rather the opening shot in a much larger legal battle. Approximately 1,600 lawsuits making similar claims have been filed against Meta, Google, TikTok, and Snap—all alleging that the platforms’ designs harm youth mental health. The federal court system designated this case as a “bellwether trial,” meaning it was selected to go to trial first specifically because its outcome is expected to guide settlement negotiations and jury decisions in the other cases still pending. A verdict in favor of the plaintiff could accelerate settlements, as the companies may prefer to resolve cases quickly rather than face similar judgments across the country.
Conversely, a verdict in favor of the defendants might suggest that plaintiffs face an uphill legal battle in proving liability, potentially weakening their leverage in settlement discussions. The timeline is telling: jury selection began January 27, 2026, trial started February 10, 2026, closing arguments happened March 12, 2026, and the verdict is expected sometime in spring or summer 2026. Meanwhile, a second bellwether trial is already scheduled for July 2026, giving attorneys and companies less than four months to digest the outcome of this case and adjust their strategies. Notably, some defendants have already moved to settle rather than face trial: TikTok and Snapchat have both settled similar claims before going to trial, suggesting they viewed the legal risk as too high or the cost-benefit analysis unfavorable. Meta and Google, by contrast, chose to fight this case in court—a high-stakes gamble that signals they believe their legal defenses can succeed or that the costs of settlement would be even higher.
What Evidence Did the Trial Reveal About Platform Design and Youth Harm?
Over the six-week trial, both the plaintiff’s and defendants’ experts presented competing evidence about whether the platforms’ features were deliberately designed to hook young users. The plaintiff’s expert witnesses likely included researchers who have studied social media addiction, software engineers who explained how recommendation algorithms work, and mental health professionals who evaluated the plaintiff’s condition. They presumably presented internal company documents, emails from product managers, and research findings from the companies’ own scientists—evidence suggesting that Meta and Google understood the addictive potential of their designs and proceeded anyway. For instance, experts may have pointed to features like infinite scroll (which removes natural stopping points), algorithmic feeds that surface emotionally engaging content (which can fuel comparison and anxiety), and notification systems designed to pull users back into the app as examples of intentional addictive design.
The defendants presented counter-evidence arguing that platform features serve legitimate purposes: infinite scroll allows users to browse without clicking “next page,” algorithmic recommendations help users discover relevant content, and notifications inform users of interactions from friends. They presented evidence of parental control tools and features allowing users to set time limits, arguing that if the plaintiff experienced harm, responsibility lies with her family’s failure to use available protections or with her own choices. The jury heard from Meta and Google’s own employees and executives who testified about the companies’ policies, safety efforts, and stated commitments to protecting young users. This competing testimony means the jury had to make credibility judgments about whether the companies’ stated intentions aligned with the design choices they actually made.

What Types of Damages Might the Jury Award If They Find the Companies Liable?
Damages in this case could fall into several categories, each of which the jury would need to consider separately. Economic damages include quantifiable costs the plaintiff incurred: therapy bills, psychiatric medication, hospitalization if applicable, lost wages from time she could not work, and any educational costs or lost opportunities (such as failing a semester or having to delay college). Medical records and expert testimony would support these calculations. Non-economic damages—often called “pain and suffering”—compensate for emotional harm that doesn’t have a direct dollar cost: the suffering caused by depression, the fear related to suicidal ideation, the distress of body dysmorphia, and the loss of normal teenage experiences while struggling with mental health. These are harder to quantify but are a standard part of civil liability cases.
In some cases, juries may also consider punitive damages, which punish wrongful conduct rather than simply compensating the plaintiff—though these require a higher legal bar and depend on whether the jury finds the defendants’ conduct was particularly egregious. The jury’s recent questions about damage calculations suggest they are taking seriously the possibility of awarding significant compensation. However, there is a constraint: the amount must be justified by the evidence presented. A jury cannot award $1 billion in damages if the evidence only supports a much smaller award tied to specific harms and costs. Additionally, appeals courts can reduce jury awards if they find them excessive, and California law caps certain types of damages in specific contexts. The defendants’ lawyers will likely have argued to the jury that any award should be modest and tied directly to proven harm, while the plaintiff’s lawyers presumably asked for more substantial damages reflecting the severity of mental health injuries that persisted from childhood into young adulthood.
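The arithmetic behind a damages award can be made concrete with a small, purely hypothetical sketch. All figures below are invented for illustration and come from no filing or testimony in this case; the category names simply mirror the types of damages described above.

```python
def total_damages(economic: float, non_economic: float, punitive: float = 0.0) -> float:
    """Sum the damage categories a jury would consider separately."""
    return economic + non_economic + punitive

# Invented example figures (not from the case):
economic = sum([
    42_000,   # therapy and psychiatric treatment
    8_500,    # medication and other medical costs
    15_000,   # lost wages / delayed educational opportunities
])

non_economic = 250_000  # "pain and suffering" — a jury judgment call, not a receipt

print(total_damages(economic, non_economic))  # → 315500
```

The point of separating the categories is that each must be justified independently: economic figures trace back to records and receipts, while non-economic and punitive amounts rest on the jury's assessment of severity and, for punitive damages, egregiousness.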
What Are the Legal Hurdles the Jury Had to Clear to Find Meta and Google Liable?
To hold Meta and Google liable, the jury had to find more than just that the platforms are addictive or that the plaintiff experienced mental health problems. California law requires the plaintiff to prove that the defendants owed her a legal duty of care, that they breached that duty through their actions or inactions, that the breach caused her harm, and that the harm resulted in damages. The most disputed element is likely causation: did Instagram and YouTube *cause* the plaintiff’s depression and anxiety, or did other factors (genetics, family dynamics, the general stress of adolescence) cause her harm, with social media being merely a contributing factor or symptom rather than a cause? The defendants almost certainly argued that many teenagers use Instagram and YouTube without developing serious mental health problems, suggesting that individual vulnerability rather than platform design is the determining factor. Another legal hurdle involves the question of whether Meta and Google owed a legal duty specifically to this plaintiff.
Companies generally have a duty not to knowingly sell dangerous products or provide dangerous services, but the law in this area is still developing. The defendants likely argued that social media is not “dangerous” in a legal sense, that users voluntarily choose to use the platforms, and that any duty the companies owe is limited to providing some parental controls and transparency—which they did. The plaintiff’s team presumably argued that the platforms’ designs specifically target young users whose brains are still developing and are more susceptible to addictive design, creating a heightened duty of care toward minors. The jury had to weigh these competing legal theories and decide which one California law supports.

Why Did Meta and Google Choose to Fight This Case Instead of Settling?
Unlike TikTok and Snapchat, which settled their similar lawsuits before trial, Meta and Google decided to defend themselves in court. This choice reflects several possible calculations. First, a trial verdict in their favor could create legal precedent making it harder for other plaintiffs to win similar cases, protecting them from 1,600 other lawsuits. Second, they may have believed their legal defenses are strong—that causation is difficult for plaintiffs to prove, that user choice and parental responsibility are weighty factors, and that the companies’ existing safety tools demonstrate reasonable care. Third, settlement might have established a financial liability precedent that would be expensive to extend across all pending cases, whereas a favorable verdict could shield them from large-scale settlements.
Fourth, the companies may have viewed the PR costs of settling as worse than the costs of publicly defending their business model in court, preferring to argue that their products are beneficial rather than implicitly admitting harm by paying settlements. However, this strategy carries enormous risk. A jury verdict against them not only affects the thousands of pending cases but also shapes public perception, potential regulatory action, and the political climate around tech regulation. If the jury finds liability here, it becomes much harder for Meta and Google to claim their platforms are safe, and legislators may feel emboldened to pass laws restricting social media design practices. The companies may have decided that fighting on principle was worth the risk, or they may have underestimated the strength of the plaintiff’s case and the jury’s sympathy for a young woman alleging depression and suicidal thoughts caused by platforms she has used since childhood.
What Happens Next and What This Case Could Mean for Other Plaintiffs?
The immediate next step is a jury verdict, expected sometime in spring or summer 2026. If the jury rules in favor of the plaintiff and awards damages, Meta and Google will almost certainly appeal, potentially stretching the case through years of additional litigation. An appeals court might affirm the jury’s decision, reverse it on legal grounds, or remand the case for a new trial. Even if appeals take several years, a jury verdict in favor of the plaintiff sends a powerful signal to the 1,600 other plaintiffs and their attorneys that the case can be won.
Settlement negotiations will almost certainly accelerate, as the companies will face stronger pressure to resolve cases before additional juries reach similar conclusions. If the jury rules in favor of Meta and Google, the impact flows in the opposite direction: other plaintiffs’ attorneys may struggle to convince juries to accept their theories of liability, and the companies’ position in settlement negotiations strengthens considerably. Either way, the verdict will mark a watershed moment in tech accountability. For the first time, a jury has sat through six weeks of testimony about social media’s design, heard from the companies’ own experts, reviewed internal documents, and will now decide whether corporations are legally responsible when their products cause documented harm to young people. That decision will shape not only these 1,600 lawsuits but also the future development of social media platforms, the regulatory landscape, and whether design practices that maximize engagement at the expense of user wellbeing can legally continue.
