When Consent Isn't Trust: What the Flo & Meta Lawsuit Reveals About Building Wellness Tech Right


When the menstrual tracking app Flo went to trial for allegedly sharing intimate health data with Meta and Google, the company's defense raised eyebrows:

"Is Flo a provider of health care? No. Are Flo's users its patients? No. Did Plaintiffs file their claims on time? No. Did Plaintiffs agree to Flo's privacy policy? Yes."

That defense may have been legally sound, but it sidesteps a deeper issue: in wellness tech—especially where women's health is concerned—users aren't just customers. They're people entrusting you with their bodies, routines, and vulnerabilities. Saying "they agreed to the privacy policy" doesn't absolve companies of the obligation to design with care.

The courtroom soon delivered a different answer. Just days after a federal judge approved Flo's settlement and dismissed the remaining class action claims, a San Francisco jury found Meta Platforms liable under the California Invasion of Privacy Act, concluding that it had intentionally intercepted sensitive menstrual health data from millions of Flo users via embedded SDKs. With statutory penalties of up to $5,000 per violation, Meta now faces potential liabilities running into the billions of dollars.

This wasn't an isolated case. The litigation swept in multiple defendants: Flurry Analytics (now defunct) settled for $3.5 million and Google settled for an undisclosed sum, while Flo itself had already settled with the FTC in 2021, agreeing to clearer disclosures and independent privacy oversight (FTC Press Release). The pattern reveals how embedded technologies that developers don't fully control can create massive liability exposure.

Meta's loss marks a turning point: even when companies believe they're legally protected, juries may hold them to a higher standard when intimate health data is involved.

The Illusion of Consent

The Meta verdict highlights a critical flaw in how the tech industry interprets user consent. The fact that someone clicked "agree" doesn't mean they understood, especially when the average privacy policy requires a graduate-level education to comprehend. Most privacy policies are legal labyrinths designed to protect the company, not inform the user. Consent in this context isn't trust; it's compliance. The Meta case underscores that a policy on paper, no matter how carefully worded, won't shield a company from accountability if its design choices ignore the spirit of meaningful consent and data minimization.

In femtech, that gap matters. Because even when data sharing is technically allowed, it may still feel like a betrayal.

How This Happens: The Invisible Leak

Many apps use third-party software development kits (SDKs)—bundles of prewritten code from companies like Google and Meta. These SDKs are often added for analytics, crash reporting, or marketing—but they can also transmit sensitive user data back to external platforms.

Flo's app included such SDKs, and data about menstrual cycles, sexual activity, and fertility intentions may have been exposed as a result (Courthouse News). Even if that data was never misused, its transmission is a trust issue, not just a legal one. And often the leak isn't even deliberate; the sharing is simply built into the tools apps rely on by default. Defaults shape outcomes.
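To make the mechanics concrete, here is a minimal sketch (in Kotlin, with invented names rather than any vendor's real API) of what an embedded analytics call looks like from the developer's side. The point is that a single line added for usage counting also queues an event name, properties, and timing metadata for upload to someone else's servers.

```kotlin
// Illustrative stand-in for a third-party analytics SDK. The object and
// function names here are hypothetical, not a real vendor API.
object ThirdPartyAnalytics {
    fun logEvent(name: String, properties: Map<String, String> = emptyMap()) {
        // A real SDK would batch this and send it to the vendor's servers,
        // typically alongside a device or advertising identifier and a timestamp.
        println("queued for upload: $name $properties")
    }
}

fun onSymptomSaved() {
    // The developer may only want a usage count...
    ThirdPartyAnalytics.logEvent(
        "symptom_logged",
        mapOf("screen" to "cycle_tracker")  // ...but the event name itself is sensitive.
    )
}

fun main() = onSymptomSaved()
```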

What many users don't realize is that even when an app doesn't share the content of what you input—like a note or a symptom—you can still be identified through metadata. Metadata is the data about your data: when you opened the app, how often you used certain features, which buttons you clicked, what time of day you logged something. When combined, these behavioral patterns can closely mimic or even predict the real information you entered. This is why even 'anonymized' health data isn't truly anonymous—behavioral fingerprints are often more revealing than the raw data itself.

A third party might never see your exact entry—"cramps on day 24." But they could easily infer your cycle, sleep, stress, or activity patterns based on how and when you use the app. For example, if a user logs in daily and opens the symptom tracker around the same time each month, that behavior alone can signal where they are in their cycle. And if that metadata is collected through a third-party SDK, it often leaves your device whether you know it or not.
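To show how little it takes, the sketch below (illustrative Kotlin, with an assumed 20-day gap threshold and invented names) estimates a cycle length from nothing more than the dates on which a symptom tracker was opened. No symptom content is ever read.

```kotlin
import java.time.LocalDate
import java.time.temporal.ChronoUnit

// Hypothetical illustration: infer cycle length purely from when the
// symptom tracker was opened. Thresholds and names are assumptions.
fun estimateCycleLengthDays(trackerOpens: List<LocalDate>): Double? {
    val days = trackerOpens.sorted().distinct()
    // A gap of 20+ days is treated as the boundary between logging bursts;
    // the first day of each burst serves as a cycle marker.
    val burstStarts = days.filterIndexed { i, day ->
        i == 0 || ChronoUnit.DAYS.between(days[i - 1], day) >= 20
    }
    if (burstStarts.size < 2) return null
    return burstStarts.zipWithNext { a, b -> ChronoUnit.DAYS.between(a, b) }.average()
}

fun main() {
    // A user who logs symptoms for a day or two roughly every four weeks.
    val opens = listOf(
        LocalDate.of(2024, 1, 3), LocalDate.of(2024, 1, 4),
        LocalDate.of(2024, 1, 31), LocalDate.of(2024, 2, 1),
        LocalDate.of(2024, 2, 28)
    )
    println(estimateCycleLengthDays(opens))  // 28.0, inferred from timing alone
}
```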

This is how systems built for analytics can quietly become systems of exposure—revealing more than users ever intended to share.

Building the Alternative

A growing number of companies are rejecting the surveillance model entirely—designing systems where intimate body data never touches centralized servers. Companies like Cirdia process data locally, on the user's device, with users maintaining complete control over what they choose to share.

This approach requires rejecting the standard toolkit. No Meta or Google SDKs embedded in products. Privacy-first analytics that never track individuals. Clear boundaries between advertising channels and product experiences. These choices require tradeoffs: more infrastructure, fewer shortcuts, and a fundamentally different product roadmap. But they also build real trust—not just legal defensibility.
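As a rough sketch of the idea (hypothetical names, not Cirdia's actual architecture or any real analytics library): raw events never leave the device, only coarse daily counts are even eligible for upload, and nothing is sent unless the user has explicitly opted in.

```kotlin
// Sketch of opt-in, aggregate-only analytics. All names are illustrative.
data class DailyUsageSummary(val date: String, val featureCounts: Map<String, Int>)

class LocalAnalytics(private val userHasOptedIn: () -> Boolean) {
    private val counts = mutableMapOf<String, Int>()

    // Called freely throughout the app; nothing leaves the device here.
    fun record(feature: String) {
        counts[feature] = (counts[feature] ?: 0) + 1
    }

    // Builds a summary with no user ID, no timestamps, no symptom content.
    // Returns null (upload nothing) unless the user has explicitly opted in.
    fun summaryForUpload(date: String): DailyUsageSummary? =
        if (userHasOptedIn()) DailyUsageSummary(date, counts.toMap()) else null
}

fun main() {
    val analytics = LocalAnalytics(userHasOptedIn = { false })  // default: opted out
    analytics.record("cycle_tracker_opened")
    println(analytics.summaryForUpload("2024-02-28"))  // null: nothing to send
}
```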

Why It Matters More Than Ever

This isn't just about one app. It's about the entire ecosystem of wellness tech—especially tools that collect deeply personal data related to women's health, fertility, and sleep. In a post-Roe legal landscape, even inadvertent data exposure can have real-world consequences. The Meta case is a clear signal to femtech and wellness product leaders: meeting the letter of the law isn't enough. We must build systems that make bodily autonomy and digital trust non-negotiable.

Public awareness is also growing that most wellness apps aren't covered by HIPAA, so the health data they collect doesn't carry the protections many users assume. When users feel exposed, they don't just stop using your product; they stop engaging with tools that could help them.

The Strategic Imperative

For founders, this represents both risk and opportunity. While users readily trade social media posts or shopping habits for free apps, intimate body data triggers a fundamentally different response. Companies that fail to recognize this distinction face mounting legal exposure and user backlash. Meanwhile, those who understand that biometric privacy isn't just another data category can build defensible differentiation in a market where trust has become the scarcest commodity.

The playbook is emerging: audit your SDKs, minimize data collection, build opt-in pathways, and default to protection. But more fundamentally, it's about recognizing that in wellness tech, trust isn't just a nice-to-have—it's the entire foundation of sustainable growth.
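One way to make "default to protection" concrete is to encode the policy as configuration that ships with the most private settings and can only be loosened by an explicit user choice. The sketch below uses invented names and is an illustration of the principle, not a prescribed schema.

```kotlin
// Illustrative "default to protection" configuration: every toggle starts
// at its most private setting; anything looser requires explicit opt-in.
enum class AnalyticsLevel { NONE, LOCAL_ONLY, AGGREGATE_OPT_IN }

data class DataPolicy(
    val embeddedThirdPartySdks: List<String> = emptyList(),   // SDK audit result: none embedded
    val analyticsLevel: AnalyticsLevel = AnalyticsLevel.LOCAL_ONLY,
    val shareWithResearchStudies: Boolean = false,            // opt-in pathway, off by default
    val rawLogRetentionDays: Int = 0                          // minimize: keep nothing by default
)

fun main() {
    val defaults = DataPolicy()
    println(defaults)  // every field at its most protective value unless the user opts in
}
```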

The Agreement That Matters Most

When we build tech that touches people's bodies and private lives, we're not just writing code—we're making a promise. Not simply to follow the law, but to build systems that respect the boundaries of the people who use them.

Trust begins not with a privacy policy, but with the decision to collect only what's necessary—and to build with care when the data is deeply personal.

Because at the end of the day, a body isn't a business model. And once that trust is broken, no court ruling can restore what was lost.