Amazon is facing two major class action lawsuits that were certified in 2025 over allegations that Alexa devices recorded private conversations without adequate user consent and collected voiceprint biometric data in violation of state privacy laws. One case, certified in Illinois in November 2025, includes approximately 1.2 million Amazon Voice ID users who had their voiceprints created since June 2014. Another case was certified in Washington in July 2025 under that state’s Consumer Protection Act. These lawsuits center on claims that Amazon failed to adequately disclose how it collected and used voice biometric data, and that devices often recorded conversations during unintended activations.
For example, if your Alexa device was activated by background noise and recorded a private conversation without your knowledge, you may be part of the class of users affected by these practices. The two lawsuits pursue different legal theories but both target the same core issue: whether Amazon violated consumers’ privacy rights by collecting sensitive biometric voiceprint data without explicit, informed consent. The Illinois case specifically invokes the Illinois Biometric Information Privacy Act (BIPA), which provides statutory damages of $1,000 per negligent violation or $5,000 per reckless or intentional violation. These cases remain in active discovery as of April 2026, with no settlement agreement reached. While Amazon previously settled similar allegations with the Federal Trade Commission for $30 million in 2023-2024, that settlement went to the government, not to consumers, making these current class actions potentially significant for affected users seeking compensation.
Table of Contents
- What Are the Certified Amazon Alexa Class Actions and Who Qualifies?
- How Did Amazon’s Alexa Privacy Practices Come Under Legal Scrutiny?
- What Are the Potential Damages and Compensation in These Cases?
- What Is the Current Status of Both Class Actions?
- How Does Amazon’s FTC Settlement Differ From These Class Actions?
- How Can You Determine Your Eligibility and File a Claim?
- What’s Next for These Lawsuits and What Should You Watch For?
- Conclusion
What Are the Certified Amazon Alexa Class Actions and Who Qualifies?
Two distinct class actions have been certified by federal judges. The Illinois case, certified on November 19, 2025, by Judge Franklin Valderrama, covers approximately 1.2 million Amazon Voice ID users in Illinois whose voiceprints were created on or after June 27, 2014. The Washington case was certified in July 2025 under Washington’s Consumer Protection Act, though recent rulings in April 2026 have significantly narrowed its scope. Both cases share common allegations: that Amazon recorded conversations without adequate consent, collected voiceprint biometric information without proper disclosure, and failed to inform users how their voice data would be used and stored. To qualify for the Illinois BIPA class, your voiceprint must have been created through Amazon Voice ID services while you were in Illinois, on or after June 27, 2014.
The allegations extend to both registered Amazon users and unregistered individuals whose voices were captured during device recordings. This is a key distinction—you don’t need to have actively agreed to Amazon Voice ID for your voiceprint to have been created and collected. For instance, if you visited someone’s home where an Alexa device recorded your voice without your knowledge or consent, and that voice data was used to create a voiceprint profile, you could potentially be affected. Eligibility for the Washington class became more restrictive after the April 2026 ruling, which found that Amazon had disclosed the possibility of accidental device activations in its terms of service. The judge determined that only some unregistered users had properly asserted individual wiretap claims under Washington law. This narrowing makes the Illinois BIPA case currently more promising for class members, as it doesn’t require the same level of proof that Amazon’s disclosures were inadequate.

How Did Amazon’s Alexa Privacy Practices Come Under Legal Scrutiny?
Amazon’s Alexa privacy issues began receiving serious regulatory and legal attention after investigations revealed that the company was recording conversations beyond what users expected or authorized. The core problem centers on Alexa’s wake-word detection feature, which is designed to activate only when the device hears “Alexa.” However, audio research and user complaints documented numerous instances where the device activated accidentally in response to similar sounds, background noise, or fragments of conversation, resulting in recordings of private discussions that had nothing to do with device usage. The biometric component of these lawsuits is equally significant. Amazon Voice ID is a feature that converts a user’s voice into a unique biometric voiceprint for authentication and personalized services. According to plaintiffs’ allegations, Amazon collected and stored these voiceprints without obtaining explicit, informed consent from users—particularly from people who never agreed to Voice ID services.
Many users were unaware that Amazon was creating permanent biometric profiles of their voices. The critical limitation here is that once a voiceprint is created, it’s nearly impossible for consumers to truly delete it or control its future use, even if they stop using Alexa. This creates a permanent privacy risk tied to something as personal as a person’s voice. The lawsuits also challenge Amazon’s disclosure practices regarding what happens to voice data once it’s collected. Users often believed recordings were deleted after being used to process voice commands, but evidence suggested Amazon retained audio files and associated biometric data for far longer and used them for purposes beyond what was disclosed in user agreements.
What Are the Potential Damages and Compensation in These Cases?
The statutory damages framework in the Illinois BIPA case is the most concrete aspect of these lawsuits. Under Illinois law, plaintiffs can seek $1,000 per negligent violation or $5,000 per reckless or intentional violation of BIPA. Since the certified class includes 1.2 million Voice ID users in Illinois, the potential total damages could be enormous—ranging from $1.2 billion if all claims succeed as negligent violations to $6 billion if they’re deemed reckless or intentional. However, real-world settlements typically result in per-person payments far below the statutory maximum, after accounting for attorneys’ fees, administration costs, and the compromises inherent in settlement negotiations.
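The statutory arithmetic above can be sketched as a simple back-of-envelope calculation. This is a minimal illustration only, assuming one violation per class member and full recovery for everyone—neither of which actual litigation would guarantee:

```python
# Back-of-envelope BIPA exposure estimate for the certified Illinois class.
# Class size and per-violation amounts are taken from the article; the
# one-violation-per-member assumption is a simplification for illustration.
CLASS_SIZE = 1_200_000
NEGLIGENT_PER_VIOLATION = 1_000    # $1,000 per negligent violation
RECKLESS_PER_VIOLATION = 5_000     # $5,000 per reckless/intentional violation

def total_exposure(class_size: int, per_violation: int) -> int:
    """Upper-bound exposure if every class member recovers exactly once."""
    return class_size * per_violation

low = total_exposure(CLASS_SIZE, NEGLIGENT_PER_VIOLATION)   # $1.2 billion
high = total_exposure(CLASS_SIZE, RECKLESS_PER_VIOLATION)   # $6 billion
print(f"Negligent: ${low:,} | Reckless/intentional: ${high:,}")
```

Real per-member payouts would be a fraction of these ceilings once fees, administration costs, and negotiated discounts are applied.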
In comparison, similar biometric privacy cases have resulted in widely varying outcomes. For example, settlements in Facebook’s facial recognition case and other BIPA matters have ranged from a few dollars per class member to several hundred dollars, depending on the strength of the evidence and the defendant’s degree of liability. The wildcard in the Amazon cases is that the company has significant resources and will mount a strong defense, which typically means extended litigation before any settlement is reached. The warning here is that class members should not expect rapid compensation—these cases are in the discovery stage and could take several more years to resolve, even if they ultimately succeed. The Washington case offers less clarity on potential damages because Washington’s Consumer Protection Act lacks BIPA’s statutory damages framework, making any settlement in that case harder to quantify or predict.

What Is the Current Status of Both Class Actions?
As of April 2026, both class actions remain in active discovery with no settlement agreements reached. The Illinois BIPA case is proceeding with the full certified class of 1.2 million users, while the Washington case has been significantly narrowed by recent court rulings. The Washington judge found that Amazon had adequately disclosed the possibility of accidental device activations in its terms of service, which weakened claims for many unregistered users in that state. This narrowing demonstrates an important limitation: even when users feel their privacy was violated, courts may interpret existing disclosures as sufficient to shield companies from liability. The timeline for these cases remains uncertain. Class action litigation of this complexity typically requires 2-4 years or more to reach settlement, and sometimes extends to trial.
During the discovery phase, both sides are exchanging documents, data, and testimony to build their cases. For plaintiffs, this is an opportunity to obtain evidence of Amazon’s knowledge of the problems and its internal communications about privacy practices. Amazon, conversely, is likely arguing that users consented to data collection through its terms of service and that the alleged harms are minimal or speculative. One important note: the separate $30 million FTC settlement from 2023-2024 did not resolve these private class actions. That settlement was between Amazon, the Federal Trade Commission, and the Department of Justice, and the funds went to the government, not to consumers. The class actions are distinct legal proceedings with separate plaintiffs, defenses, and potential outcomes.
How Does Amazon’s FTC Settlement Differ From These Class Actions?
Amazon settled with the Federal Trade Commission and Department of Justice in 2023 and 2024 by agreeing to pay $30 million in penalties and implementing new privacy safeguards for Alexa and Ring products. This settlement was based on allegations that Amazon violated the FTC Act by misrepresenting its privacy practices and the capabilities of parental controls. However, this settlement was entirely separate from the class actions and provided no direct compensation to affected consumers. Instead, the money went to government agencies. The key difference is legal authority and recovery.
The FTC settlement focused on business practices and consumer protection violations at a systemic level, while the class actions target specific harms to individual consumers—unauthorized recording and biometric data collection. The warning here is significant: a government settlement does not bar private class actions, and it does not compensate individuals. Many consumers mistakenly believed the FTC settlement resolved their privacy concerns, but those who were recorded or had voiceprints collected without consent still have potential claims in the pending class actions. The FTC settlement required Amazon to implement stronger safeguards, but it did not resolve the question of liability for past conduct regarding voiceprint collection, which is the core issue in the Illinois BIPA case. This distinction is critical for class members deciding whether to participate in the pending litigation or any future settlement.

How Can You Determine Your Eligibility and File a Claim?
For the Illinois BIPA class, the primary requirement is that your voiceprint was created through Amazon Voice ID services while you were in Illinois, on or after June 27, 2014. You can determine whether Voice ID is enabled by logging into your Amazon account, navigating to Alexa account settings, and checking for Voice ID or voice recognition profiles. If you find a voiceprint associated with your account, or if you’re an Illinois resident who has used Alexa devices since June 2014, you likely have a claim. For unregistered users—people whose voices were recorded by Alexa devices owned by others in Illinois—establishing a claim can be more challenging, because you’ll need evidence that you were present when recordings occurred and that your voice was captured.
Keep documentation of any Alexa devices you were exposed to and approximate dates of use. When a settlement is eventually approved, the claims process will provide more detailed instructions on submitting proof of class membership. Currently, there is no active settlement to claim against, as these cases remain in pre-settlement litigation. However, you should monitor the class action docket or visit official settlement websites (which will be announced if a settlement is reached) to learn when claims can be filed. Do not rely on unsolicited emails or settlement claim websites that appear before an official settlement is announced, as these are often scams targeting class action participants.
What’s Next for These Lawsuits and What Should You Watch For?
The immediate future of these cases depends on discovery findings and settlement negotiations, which typically run parallel to litigation. If Amazon and the plaintiffs’ attorneys can reach a settlement framework that the judge finds fair, both cases could resolve relatively quickly. If not, the Illinois case could potentially proceed toward trial, though most large class actions settle before that point. The precedent these cases set will likely influence how other companies approach biometric data collection and voice recording.
One important development to watch is whether the Washington case’s narrowing by the federal judge signals that courts will increasingly rely on existing terms-of-service disclosures to shield companies from liability for unintended recordings. This could affect the broader landscape of biometric privacy litigation and what consumers can realistically claim. Additionally, as state legislators become aware of gaps in privacy protections (like the fact that voiceprints can be created without explicit consent), there may be new state laws requiring clearer disclosure and stronger user controls over biometric data. These legal and regulatory changes could make future class actions stronger, even if current cases face headwinds.
Conclusion
The Amazon Alexa privacy recording class actions represent a significant challenge to how major technology companies collect and use voice biometric data. Two certified classes—one in Illinois with 1.2 million members and one in Washington—are pursuing claims that Amazon violated privacy laws by recording conversations without adequate consent and collecting voiceprints without explicit authorization. While the potential damages under Illinois BIPA law are substantial ($1,000 to $5,000 per violation), actual per-member payouts will depend on litigation outcomes and settlement negotiations that are still in early stages.
If you believe you were affected by unauthorized Alexa recordings or had a voiceprint created without your knowledge, particularly if you are in Illinois, monitor the official class action docket and watch for settlement announcements. Do not take action on unsolicited claims websites, and be cautious about providing personal information to unverified sources. As these cases evolve, they may set important precedents for how voice and biometric data are treated under privacy law—and whether consumers have the right to control what happens to their most intimate and identifying information.
