New Lawsuits Claim Tech Companies Ignored Warnings About Harm to Teens

Yes. Multiple lawsuits have now established in court that tech companies including Meta, TikTok, and Snapchat ignored internal warnings about harm to teens. Most dramatically, a New Mexico jury in March 2026 found Meta violated the state’s consumer protection laws by disregarding safety recommendations from its own engineers and concealing known risks of depression, anxiety, self-harm, and eating disorders. The $375 million verdict represents the first major judgment against a social media company for teen harm, though it’s just one case in a much larger wave of litigation.

Internal documents and employee testimony have revealed a consistent pattern: engineers and researchers at Meta warned leadership about the dangers of their products, but those warnings were dismissed or ignored. The same pattern appears across the industry—TikTok and Snapchat both settled cases just days before trials began in January 2026, suggesting their internal evidence was similarly damaging. Meanwhile, over 2,400 lawsuits are consolidated in federal court, and state attorneys general in 40 states have filed separate actions. This represents perhaps the most significant litigation wave against Big Tech since the tobacco wars.

What Internal Warnings Did Tech Companies Ignore?

Meta’s internal research clearly identified mental health harms, yet the company continued designing features specifically to maximize engagement and addiction. PBS News and other sources covering the trials documented that Meta’s own safety teams raised concerns about infinite scroll, autoplay videos, algorithmic recommendation loops, and the psychology of likes and streaks—all mechanisms designed to keep users, especially young users, scrolling indefinitely. When researchers flagged these concerns to leadership, the warnings were deprioritized in favor of engagement metrics and advertising revenue. The New Mexico verdict established that this constituted deceptive business practice, since Meta marketed its platforms as safe while internally knowing they were not.

The ignored warnings weren’t vague or speculative. Meta engineers specifically documented that their features could cause depression, anxiety, disordered eating (particularly on Instagram, which heavily emphasizes appearance comparison), sleep disruption, and self-harm. Some of the most damaging evidence involved research showing that the platforms’ algorithms deliberately amplified harmful content when it drove engagement—effectively weaponizing the features to make them more addictive. The company’s own researchers called this out; executives downplayed it. This distinction matters legally, because it transforms the case from “our product caused harm” to “we knew it caused harm and lied about it,” which is fraud.

How Many Lawsuits Are Actually Pending?

As of March 2026, 2,407 individual claims are consolidated in the Adolescent Social Media Addiction multidistrict litigation (MDL) in federal court. This is a single MDL—there are also separate class actions, state lawsuits, and individual suits. Forty state attorneys general have filed additional lawsuits against Meta and other platforms, meaning the litigation is proceeding at the federal, state, and individual levels simultaneously. This scale rivals the tobacco and opioid litigation waves, which took decades to resolve and resulted in tens of billions in settlements. However, unlike those cases, where the defendant industries were narrower, here the defendants are companies that have become essential to modern communication—which complicates both litigation and remedies.

The three bellwether trials scheduled for January 27, March 9, and May 11, 2026, in California state court will be particularly important. These are test cases designed to establish the liability and damages principles that will govern how the remaining 2,400+ claims are handled. If plaintiffs win decisively in these early trials, many more defendants will settle rather than risk similar verdicts. Meta CEO Mark Zuckerberg has been subpoenaed and is expected to testify, meaning these trials will be high-profile. Each trial is projected to last six to eight weeks, making them extraordinarily expensive and public. This is a deliberate strategy by plaintiffs' attorneys—major verdicts and media attention create settlement pressure.

Scale of Social Media Harm Litigation (2026): 2,407 pending MDL claims; 40 state attorney general lawsuits; 3 bellwether trials scheduled; $375 million New Mexico verdict against Meta. Source: PBS News, NBC News, CNN Business, Lawsuit Information Center.

What Specific Features Did Tech Companies Design for Addiction?

The lawsuits detail a sophisticated arsenal of addictive mechanisms, each chosen because research shows it hooks users’ brains. Infinite scroll—the ability to keep swiping without ever reaching an “end”—eliminates natural stopping points. Autoplay videos move you seamlessly to the next piece of content, hijacking your intention to spend 5 minutes and turning it into 45. Likes and streaks create a social reward loop: you check back constantly to see if someone liked your post, and you maintain streaks (Snapchat’s count of consecutive days of messaging between two users) to avoid losing social status. Algorithmic recommendation systems don’t just show you what you follow; they actively learn what content makes you stay longest and prioritize that, even if it’s content that makes you feel inadequate or anxious.

What makes this legally significant is the intent. These features weren’t accidents or side effects. Leaked emails and internal documents show that engineers explicitly discussed the addictive properties and that algorithm teams deliberately optimized for engagement, knowing this meant keeping young users in a constant state of stimulation and social anxiety. YouTube’s autoplay and recommendation algorithms, for example, were tuned to prioritize videos that keep people watching longest—which often means increasingly extreme content. TikTok’s algorithm does the same, but with a more powerful mechanism: the algorithm learns your preferences so quickly that new users report feeling hypnotized within hours. The lawsuits argue this is predatory design aimed specifically at developing brains that are more vulnerable to behavioral addiction than adults.
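To make the design pattern concrete, here is a purely illustrative toy sketch—not any company's actual code—of the engagement-only objective the lawsuits describe: a feed ranked solely by predicted watch time, with no well-being signal in the objective. All names and numbers are invented for illustration.

```python
# Toy sketch of engagement-only ranking (illustrative; not any platform's real code).
# The complaint, in code form: the objective function contains only engagement.

def rank_feed(items, predicted_watch_seconds):
    """Order items by predicted watch time alone."""
    return sorted(items, key=lambda item: predicted_watch_seconds[item], reverse=True)

# Hypothetical predictions for three pieces of content.
predictions = {
    "calm_tutorial": 20.0,
    "comparison_post": 95.0,   # appearance-comparison content
    "extreme_clip": 140.0,     # extreme content often retains attention longest
}

feed = rank_feed(list(predictions), predictions)
print(feed)  # ['extreme_clip', 'comparison_post', 'calm_tutorial']
```

The point of the sketch is the absence: nothing in the ranking rule penalizes content that harms the viewer, which is precisely the design choice the plaintiffs characterize as intentional.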

What Legal Theories Are the Lawsuits Built On?

Plaintiffs argue that tech companies violated consumer protection statutes by engaging in “unfair and deceptive” practices. The specific deception is this: platforms marketed themselves as safe for young users while internally knowing they were not, and they concealed the known risks (depression, anxiety, eating disorders, sleep disruption, self-harm). Under consumer protection law, it’s illegal to market a product as safe when you know it’s harmful. The New Mexico verdict, which explicitly found Meta violated the state’s consumer protection law, established this principle in court. The next question is how broadly this logic applies to the other pending lawsuits.

A second legal theory argues that the platforms created a “public nuisance”—a condition that harms broad segments of the public. This is the legal theory that worked in some tobacco cases and environmental cases. A third theory is straightforward negligence: the companies had a duty to warn about known dangers and breached that duty. Some cases also invoke strict liability for defective products, though that’s harder to apply to services. The damages sought include both compensatory damages (for actual harm to individual plaintiffs) and, in some cases, punitive damages meant to punish companies for knowing misconduct. The $375 million New Mexico verdict included punitive damages, which is why it’s significant—it signals juries are willing to punish, not just compensate.

What Settlements and Verdicts Have Already Happened?

The only major jury verdict so far is the March 2026 New Mexico case: $375 million against Meta for violations of the state consumer protection law and for enabling child sexual exploitation on Facebook, Instagram, and WhatsApp. This is a state court case, not federal, which is significant because state consumer protection statutes are often broader and more plaintiff-friendly than federal law. The verdict establishes that a jury will find Meta liable and will award substantial damages—exactly the kind of precedent that drives settlement negotiations. Before that verdict, TikTok and Snap both settled their cases in January 2026, just days before trial. The terms of both settlements are confidential, so we don’t know how much either company paid.

However, the timing is telling: both companies apparently decided that the risk of a jury verdict was worse than settling, even on confidential terms. This is standard strategy in litigation—once you see a co-defendant (Meta in the TikTok/Snap cases) facing trial, your settlement calculus changes. You have to assume the jury might find against you too. YouTube/Google, despite being named in the bellwether trials, has not yet settled. This suggests either that Google believes it has a stronger defense or that it’s willing to fight the case.

How Are Regulators Responding to Teen Harm Claims?

California has already acted. AB 56, a law that took effect recently, requires onscreen warning labels for users under 18. The warning explicitly states: “Social media is associated with significant mental health harms and has not been proven safe for young users.” This is a direct regulatory acknowledgment that what the lawsuits are claiming is true—these platforms are risky for teen brains. Other states are considering similar requirements.

The warning label approach is important because it represents a regulatory pivot: instead of banning social media or forcing fundamental design changes, California is requiring disclosure. This is the “warn people and let them decide” approach that worked for cigarettes (though many argue it’s less effective for addictive products). However, for young users who don’t legally control their own accounts and whose parents may not understand the risks, a warning label has limits. A parent reading “social media can harm your teen” already probably suspects that; what they need is either alternatives or enforcement of age restrictions, neither of which the label provides.

What’s Next in These Lawsuits?

The next major milestones are the three bellwether trials in 2026. If plaintiffs win decisively, settlement negotiations will likely accelerate. The combined liability for Meta, YouTube, TikTok, and Snapchat across 2,400+ cases could easily reach billions, which would dwarf any single company’s annual litigation budget. However, if defendants win the early trials, it doesn’t end the litigation—it just shifts it. The companies would likely argue they have a defense to most claims, and the volume would decline as cases get dismissed.

The outcome of these trials will essentially determine whether this becomes the next tobacco settlement (tens of billions over years) or something smaller and more contained. Looking forward, there are broader questions beyond any single lawsuit. Will social media redesign to reduce addictive features? The lawsuits push in that direction, but so far no company has fundamentally changed its business model in response to litigation. Regulation may force that change faster than courts will. Age restrictions, design requirements, and transparency mandates are all being proposed or debated at state and federal levels. The litigation and the regulatory pressure are moving in parallel, and both could reshape how these platforms operate—especially for young users.
