While no widely publicized class action lawsuit matching this exact claim currently exists, Lyra Health’s data sharing practices with employers have been the subject of significant scrutiny based on investigative reporting. In 2021-2022, BuzzFeed News uncovered that Lyra Health—a mental health platform used by major employers like Google, Facebook, and Starbucks—shared employee therapy data with their employers in ways that conflicted with users’ expectations of confidentiality. The most documented case involved Chelsey Glasson, a Google employee, whose therapy notes were obtained by her employer after she sought them for her own pregnancy discrimination lawsuit, leading her therapist to terminate their therapeutic relationship.
Table of Contents
- What Are the Specific Data Sharing Practices at Lyra Health?
- How Does Data Sharing Violate Patient Privacy Expectations?
- The Chelsey Glasson Case—How Did Employer Access Become Possible?
- Transparency and Disclosure Failures
- Therapist Awareness and Ethical Violations
- What Remedies Might Apply to Affected Employees?
- Looking Forward—Are Employer Mental Health Platforms Safe?
What Are the Specific Data Sharing Practices at Lyra Health?
Lyra Health marketed itself as a confidential mental health platform, yet it operated with a dual disclosure system that created confusion about whether employee data would be shared with employers. According to BuzzFeed’s investigation, some versions of Lyra’s intake survey displayed data-sharing disclosures on page 3 of a 5-page form, buried among other terms. However, other versions of the same survey that employees accessed through their employer portals explicitly stated: “Your responses are confidential and are not shared with the employer.” This contradiction meant employees on the same platform received different information about their privacy protections depending on how they accessed the service.
The practical effect was striking: Of seven current and former employees at Google, Facebook, and Starbucks who participated in BuzzFeed’s interviews, all but one reported they did not know that survey data could be shared with employers in any form. This wasn’t a matter of employees missing fine print—they received contradictory assurances about their privacy. When workers sought mental health support believing it was confidential, they were often unaware that aggregated or individual data could flow back to their employers’ HR departments.

How Does Data Sharing Violate Patient Privacy Expectations?
One of the most troubling aspects of Lyra’s practices is the fundamental breach of therapeutic confidentiality. Mental health treatment relies on trust: patients must believe their disclosures will remain private to engage honestly with their therapists. When an employee’s mental health data can reach their employer—the very entity that controls their paycheck, promotions, and job security—the power dynamic fundamentally undermines that trust. An employee disclosing anxiety, depression, substance use concerns, or trauma to a therapist may self-censor or avoid treatment altogether if they fear that disclosure will influence employment decisions.
However, not all data sharing necessarily violates privacy laws or patient rights, depending on consent and how data is aggregated. If an employer receives only anonymized, aggregate statistics (“30% of employees used therapy services this year”), that’s different from individual therapy notes. Lyra’s documented problem was the conflicting messages about what level of sharing would occur. Employees who saw “confidential and not shared with employer” had a reasonable expectation of privacy that the company later undermined by sharing or allowing access to their information upon employer request. This expectation mismatch is where the legal and ethical vulnerability lies.
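To make that distinction concrete, the sketch below shows one way a platform could report only thresholded, aggregate utilization to an employer while keeping record-level data behind a hard boundary. This is purely illustrative: the record fields, the reporting function, and the minimum group size are assumptions for the example, not a description of Lyra’s actual system.

```python
from dataclasses import dataclass

# Hypothetical record type for illustration; not Lyra's actual data model.
@dataclass
class TherapyRecord:
    employee_id: str
    employer: str
    sessions_attended: int

# Suppress any aggregate computed over fewer than this many people,
# a common safeguard against re-identifying individuals in small groups.
MIN_GROUP_SIZE = 20

def employer_utilization_report(records: list[TherapyRecord],
                                employer: str,
                                headcount: int) -> dict:
    """Return only an aggregate utilization figure for one employer.

    Individual identifiers and session content never leave this function;
    the employer sees a single percentage, or nothing at all if the
    group is too small to be meaningfully anonymous.
    """
    users = {r.employee_id for r in records if r.employer == employer}
    if len(users) < MIN_GROUP_SIZE:
        return {"employer": employer,
                "utilization": "suppressed (group too small)"}
    return {
        "employer": employer,
        "utilization": f"{100 * len(users) / headcount:.0f}% of employees used therapy services",
    }
```

The contrast matters legally as well as technically: a report like this contains a single percentage, while anything record-level, including therapy notes, stays behind the function boundary and would require a separate, explicit consent pathway to release.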
The Chelsey Glasson Case—How Did Employer Access Become Possible?
Chelsey Glasson’s experience illustrates how Lyra’s dual role—contracted with both employees and employers—created a pathway for employer access to therapy records. Glasson was seeking her own therapy notes from Lyra to use as evidence in her pregnancy discrimination lawsuit against Google. However, Google (her employer and Lyra’s client) learned of this request and demanded the notes as well. Lyra provided them.
Glasson’s therapist, discovering that the employer had accessed and received her patient’s therapy records, terminated the therapeutic relationship with Glasson. This outcome is particularly damaging: not only did Glasson lose access to her therapist in the middle of handling a legal matter, but the termination suggests even Lyra’s own therapists were unaware of or uncomfortable with the company’s data-sharing practices with employers. What makes Glasson’s case remarkable is that it involved an explicit conflict of interest: the same employer she was suing for pregnancy discrimination was able to access her mental health records through Lyra. There’s no evidence Glasson consented to Lyra sharing her therapy notes with Google in this context. The case demonstrates that Lyra’s consent frameworks and safeguards were insufficient to protect patients from employer surveillance in adversarial situations.

Transparency and Disclosure Failures
Lyra Health’s contradictory disclosures weren’t accidental; they suggest systemic problems with how the company communicated its practices. The survey versions accessed through employer portals that said “not shared with the employer” appear designed to reassure employees, yet they directly contradicted the company’s business model of providing employers with data insights. The company appears to have hoped employees would take the reassuring language at face value, without asking whether aggregated data, usage metrics, or other insights still flowed to employers in other forms.
This disclosure failure creates a tradeoff for employees: they receive access to a mental health platform at a subsidized or free rate through their employer, but the price is that their employer gains some window into their mental health engagement. Lyra didn’t clearly articulate this tradeoff upfront. Instead, the company left employees in some versions of its platform believing no data would be shared at all. For employers, this creates a different liability: if an employer contracted with Lyra believing it would receive individual employee therapy data, but the underlying consents and disclosures don’t support that level of access, the employer may face its own privacy liability for using data obtained under false pretenses.
Therapist Awareness and Ethical Violations
One of the most overlooked dimensions of Lyra’s practices is that the company’s own therapists—the clinicians providing care—were largely unaware of the data-sharing arrangements. BuzzFeed interviewed eight current and former Lyra therapists, and six of them did not know about the company’s practice of sharing therapy data with employers. This creates an ethical crisis: therapists have a duty to inform patients about the limits of confidentiality before treatment begins.
If Lyra therapists weren’t informed about data sharing, they couldn’t have properly disclosed it to their patients, violating therapeutic ethics standards. Additionally, therapists operating under the assumption that therapy was confidential might have documented information in patients’ records that they would never have included if they’d known employers could access it. Some therapists might have declined to work for Lyra had they understood the privacy model. The fact that Chelsey Glasson’s therapist ended the relationship upon learning that her employer had accessed records suggests therapists felt their ethical obligations had been compromised by Lyra’s practices.

What Remedies Might Apply to Affected Employees?
Employees who used Lyra through their employers during the period when these data-sharing practices were occurring may have claims under several legal theories. Privacy tort claims might apply if employees can show they were deceived about the level of confidentiality. Breach of contract claims could arise if Lyra’s terms of service or privacy policies promised confidentiality that wasn’t delivered.
Depending on the jurisdiction, state consumer protection laws or healthcare privacy laws might also apply. Federal law offers limited help: HIPAA generally doesn’t cover employer-sponsored mental health platforms unless the platform is acting as a business associate of a covered entity, in which case stricter data-sharing restrictions would apply. The challenge for affected employees is that the documented disclosures are conflicting and complex: proving what Lyra promised, and when, requires detailed evidence of which version of the disclosures each employee saw and when they saw it. This is exactly the type of fact pattern where class action litigation becomes practical: a large group of employees experienced similar practices and privacy breaches, but proving individual damages one by one would be expensive and time-consuming.
Looking Forward—Are Employer Mental Health Platforms Safe?
The Lyra Health case raises questions about the entire model of employer-sponsored mental health platforms. Many large employers now contract with companies like Lyra, Spring Health, Modern Health, and others to provide mental health benefits. These platforms occupy an inherently conflicted position: they work for both the employee (who seeks confidentiality) and the employer (who wants metrics and insights about their workforce’s mental health).
The future likely involves stricter requirements for informed consent, clearer firewalls between employee data and employer access, and potentially regulatory oversight of how mental health platforms handle data. Some employees are now more cautious about using employer-provided mental health platforms, instead seeking care outside these systems or paying out-of-pocket to maintain confidentiality. Others may push their employers to adopt platforms with stronger privacy protections and clearer consent frameworks.
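What such a firewall could look like in code, rather than in policy language, is sketched below. The roles, policy table, and API are hypothetical assumptions for illustration, not any vendor’s real interface: the point is that an employer-role request for an individual record fails loudly and leaves an audit trail instead of being silently fulfilled.

```python
from enum import Enum

class Role(Enum):
    PATIENT = "patient"
    THERAPIST = "therapist"
    EMPLOYER = "employer"

class AccessDenied(Exception):
    pass

# Hypothetical policy table: which roles may read individual therapy
# records versus aggregate metrics. The employer role is firewalled
# off from record-level data entirely.
POLICY = {
    "individual_record": {Role.PATIENT, Role.THERAPIST},
    "aggregate_metrics": {Role.PATIENT, Role.THERAPIST, Role.EMPLOYER},
}

def authorize(role: Role, resource: str) -> None:
    """Raise AccessDenied unless `role` may read `resource`, logging
    every decision so that denied requests remain auditable."""
    allowed = role in POLICY.get(resource, set())
    print(f"audit: role={role.value} resource={resource} allowed={allowed}")
    if not allowed:
        raise AccessDenied(f"{role.value} may not read {resource}")

# An employer can request aggregates but not a patient's record:
authorize(Role.EMPLOYER, "aggregate_metrics")    # permitted
# authorize(Role.EMPLOYER, "individual_record")  # raises AccessDenied
```

A design along these lines would give the “clearer firewalls” described above a concrete, auditable form rather than leaving them to marketing copy and contract language.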
