In December 2022, MIT Technology Review revealed that iRobot's Roomba robot vacuums, equipped with front-facing cameras, had captured images of intimate moments in customers' homes during a testing phase. The images showed private scenes, including a person on a toilet, and at least 15 photos were shared on social media by contractors working for Scale AI, a data-labeling company iRobot had hired for image analysis and AI training. The disclosure shocked consumers who believed their camera data would remain private. It is crucial to understand, however, that the class actions currently active against iRobot are primarily securities fraud suits targeting investor losses, not consumer privacy claims.
This article explains what happened, who was involved, what lawsuits exist, and what options remain for affected consumers. The 2020 privacy incident exposed a gap in how companies disclose data handling to beta testers and customers. While the incident occurred years ago, it has contributed to broader regulatory scrutiny and financial consequences for iRobot, including the company’s filing for Chapter 11 bankruptcy in December 2025. Understanding the difference between investor-focused class actions and potential consumer remedies is essential for anyone affected by the privacy breach or considering involvement in litigation.
Table of Contents
- How Did Roomba Cameras End Up Photographing Private Moments in Homes?
- Why Did Contractors Have Access to Private Images, and What Controls Were Missing?
- What Class Actions Are Actually Active Against iRobot Right Now?
- How Did iRobot’s Bankruptcy Affect Legal Liability and Consumer Options?
- What About Regulatory Actions and FTC Enforcement?
- What Privacy Protections Should Consumers Expect with Smart Home Devices?
- What Has Changed in Roomba’s Data Practices Since the Incident?
How Did Roomba Cameras End Up Photographing Private Moments in Homes?
iRobot equipped certain Roomba models with front-facing cameras as part of its mapping and navigation technology. During a testing phase, the company contracted with Scale AI, a firm providing data-labeling and AI-training services, to analyze and label images captured by these devices. Scale AI employed gig workers in Venezuela, India, and other countries to review and annotate photos during the training process. These workers were not directly employed by iRobot and operated under third-party contractor agreements. The incident that triggered media attention involved intimate scenes that Roomba cameras recorded during normal operation in testers' homes.
At least 15 of those images were downloaded, shared among contractors, and posted to social media without consent. They included images of a minor and of a woman in a bathroom, raising serious questions about data access controls, third-party oversight, and disclosure to beta testers. Snopes confirmed these reports in 2023, documenting that photos taken by Roomba robot vacuums had indeed made their way online. The fundamental issue: beta testers and customers were not adequately informed that human beings, contractors outside iRobot's direct employment, would view raw images from their homes. Most consent forms and beta agreements did not make this third-party review explicit.

Why Did Contractors Have Access to Private Images, and What Controls Were Missing?
iRobot used contractor networks to train its visual navigation algorithms. The logic is straightforward: AI models require massive amounts of labeled training data, and human contractors are cheaper than in-house teams. This, however, created a critical vulnerability: images containing personally identifiable information and intimate moments from customers' homes were accessible to a dispersed workforce spread across countries with varying privacy laws and enforcement mechanisms. Scale AI's contractor network had inconsistent employment relationships and data security protocols. Workers in different regions operated under different legal standards, and no consistent mechanism ensured that they understood their privacy obligations or could not retain or share images.
The breach happened because technical controls were insufficient (no immediate image deletion after analysis, encryption during transfer, or restricted access windows) and contractual controls failed to prevent sharing. A contractor or group of contractors simply downloaded images and posted them to social media, and no audit trail or access log caught the behavior in real time. A critical limitation exists here: even with strong stated policies, global contractor networks are difficult to monitor continuously. Large-scale AI training operations frequently depend on cost-effective labor, which usually means distributed teams across time zones and jurisdictions, making real-time oversight challenging. This trade-off between scale and security was at the heart of iRobot's problem.
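The missing controls described above can be sketched in code. The following is a minimal, hypothetical illustration of an access gate that combines a restricted access window, an append-only audit log, and deletion after analysis; all names are illustrative and do not reflect any actual iRobot or Scale AI system:

```python
import hashlib
import time
from dataclasses import dataclass, field


@dataclass
class ImageAccessGate:
    """Hypothetical gate for training images: every read is logged,
    access is limited to a short window, and images are deleted
    once labeling is finalized."""
    window_seconds: int = 900                      # restricted access window
    audit_log: list = field(default_factory=list)  # append-only trail
    _store: dict = field(default_factory=dict)

    def ingest(self, image_id: str, payload: bytes) -> None:
        # Hold the image only for the duration of the access window.
        self._store[image_id] = (payload, time.time() + self.window_seconds)

    def read(self, image_id: str, worker_id: str) -> bytes:
        payload, expires = self._store[image_id]
        if time.time() > expires:
            raise PermissionError(f"access window expired for {image_id}")
        # Record who touched which image, and when.
        self.audit_log.append({
            "worker": worker_id,
            "image": image_id,
            "sha256": hashlib.sha256(payload).hexdigest(),
            "at": time.time(),
        })
        return payload

    def finalize(self, image_id: str) -> None:
        # Immediate deletion once analysis is complete.
        self._store.pop(image_id, None)
```

Even a simple scheme like this would have produced a record of which worker accessed which image, which is precisely the audit trail that was missing in the incident.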
What Class Actions Are Actually Active Against iRobot Right Now?
As of 2025–2026, multiple class actions have been filed against iRobot, but they are securities fraud class actions, not consumer privacy suits. Law firms including Pomerantz LLP and the Rosen Law Firm have filed suits alleging that iRobot misled shareholders about the company's financial health and stability. These actions target investors who lost money on iRobot stock between January 29, 2024 and March 11, 2025, a period during which the company allegedly concealed weakening financial conditions and operational challenges that led to bankruptcy. The critical distinction: these securities class actions are for people who owned iRobot stock, not for consumers whose privacy was violated.
If you purchased a Roomba or were a beta tester, the currently active class actions do not directly compensate you for the privacy breach. Instead, they seek damages for shareholders who bought or held the stock in reliance on allegedly misleading financial statements. Pomerantz, Rosen, and other firms are actively seeking class members, but membership is limited to shareholders during the specified class period. As of this writing, no settlement addressing the camera privacy incident has been publicly announced or filed. The images were captured in 2020 and the incident became public in late 2022, but it has not yet resulted in a finalized consumer privacy class action settlement the way other data breaches have.

How Did iRobot’s Bankruptcy Affect Legal Liability and Consumer Options?
iRobot filed for Chapter 11 bankruptcy in December 2025, with Shenzhen Picea Robotics, a Chinese robotics manufacturer, positioned to acquire 100% of iRobot's equity. This restructuring has major implications for both consumer claims and creditor recovery. In Chapter 11, the company typically continues operating as a debtor in possession under court supervision, and creditors (including potential consumer claimants) must file proofs of claim by specific deadlines. The acquisition by a foreign entity complicates the legal landscape further: liability for past conduct may not automatically transfer, or may be subject to new terms. Bankruptcy proceedings typically delay or limit consumer compensation.
Class action lawsuits against companies in Chapter 11 may be consolidated or stayed (paused) while the bankruptcy case is resolved. A privacy class action brought now would likely have to proceed through the bankruptcy court or be addressed as part of the reorganization plan. The company emerging from bankruptcy under new ownership is a different legal entity, and whether it inherits liability for iRobot's pre-bankruptcy conduct is a question the courts will have to address. If you were harmed by the privacy breach and want to pursue compensation, acting now is critical. Filing a claim in the bankruptcy proceeding is different from joining a class action, and deadlines for submitting claims are strict. Consult a lawyer about your options before the bankruptcy deadlines pass; a missed deadline can bar your claim permanently.
What About Regulatory Actions and FTC Enforcement?
Beyond private class actions, the Federal Trade Commission and state attorneys general have regulatory authority over companies’ privacy practices. The FTC has previously taken action against technology companies for inadequate data security or misleading privacy disclosures. In iRobot’s case, the privacy incident could trigger regulatory scrutiny or enforcement actions, which are separate from civil class actions. FTC enforcement actions can result in consent orders, monetary penalties, and mandatory security improvements going forward.
State attorneys general may also pursue claims on behalf of their residents. These regulatory actions don’t directly compensate individual consumers, but they establish accountability and can require structural changes to data handling practices. The regulatory environment has shifted significantly since 2020; devices with cameras collecting data in private homes are now subject to heightened scrutiny by regulators and legislators. A limitation worth noting: regulatory enforcement can take years from investigation to resolution, and consumers typically do not receive direct compensation from regulatory fines the way they might from a class action settlement. However, regulatory actions establish legal precedent and pressure companies to improve practices to avoid future penalties.

What Privacy Protections Should Consumers Expect with Smart Home Devices?
The iRobot incident illustrates a broader gap in privacy expectations around smart home devices. Many consumers purchase cameras and robots without fully understanding where their data goes, who accesses it, and for how long. Industry standards and best practices suggest that devices collecting images in homes should have explicit controls: local processing (images analyzed on-device, not uploaded), encryption during transmission, immediate deletion policies, and transparent disclosure about any human review.
Leading practices include opt-in cloud storage (rather than automatic uploading), clear labeling of which images are used for training, and restrictions that prevent third-party contractors from accessing raw data. Some manufacturers now offer local-only processing, keeping the camera feed entirely on the device and never transmitting it. For comparison, some smart home companies now explicitly state that no contractors outside the company will review camera footage, or they limit review to quality-assurance staff with specific clearances and training.
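The opt-in pattern described above can be summarized in a short sketch. This is a hypothetical device configuration for illustration only; none of these names correspond to any real vendor's API, and the defaults reflect the "off unless the owner opts in" principle:

```python
from dataclasses import dataclass


@dataclass
class CameraPrivacyConfig:
    """Hypothetical smart-camera privacy settings: cloud upload,
    training use, and human review are all disabled unless the
    owner explicitly opts in."""
    cloud_upload: bool = False        # opt-in, never default-on
    allow_training_use: bool = False  # images may be used for AI training
    allow_human_review: bool = False  # third-party reviewers may see images

    def may_transmit(self, frame_tagged_for_training: bool) -> bool:
        # Local-only processing unless every required consent is set.
        if not self.cloud_upload:
            return False
        if frame_tagged_for_training and not (
            self.allow_training_use and self.allow_human_review
        ):
            return False
        return True
```

The design choice worth noting is that each consent is separate: agreeing to cloud storage does not imply agreeing to training use or human review, which is exactly the distinction the iRobot beta agreements failed to make explicit.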
What Has Changed in Roomba’s Data Practices Since the Incident?
iRobot made public statements about improving its data practices following the disclosure. The company revised its privacy documentation, added more explicit disclosures about how images are used, and promised stricter oversight of contractor access. Given iRobot's bankruptcy and acquisition by a Chinese company, however, the continuity of these commitments is uncertain. Shenzhen Picea Robotics will inherit Roomba operations, and its data practices and privacy philosophy may differ significantly from iRobot's.
Looking forward, the regulatory landscape is tightening. State privacy laws (including the California Consumer Privacy Act and similar statutes in other states) and proposed federal privacy legislation impose stricter requirements on how companies collect, store, and share data. Any Roomba successor will operate under heightened regulatory scrutiny. Consumer awareness of privacy risks has also increased, which is likely to change purchasing decisions and competitive dynamics in the robot vacuum market.
