State Health Exchanges Sent Citizenship and Race Data to TikTok and Meta

[Illustration: a medical clipboard with data tags flowing through a network arc into two app-tile silhouettes]

A Bloomberg investigation found that nearly all 20 U.S. state-run health insurance exchanges plus D.C. were sending applicant data — citizenship, race, ZIP codes, prescriptions, even disclosures about incarcerated family members — to TikTok, Meta, Google, Snap, and LinkedIn via misconfigured tracking pixels. More than 7 million Americans bought insurance through these sites for 2026.

Bloomberg News published an investigation on May 4, 2026, reviewing thousands of enrollment and informational webpages across 20 U.S. state-run health insurance marketplaces and the Washington, D.C. exchange. Almost all of them were transmitting applicants' personal information to advertising and tech companies — Google, LinkedIn, Meta, Snap, and TikTok — through tracking pixels embedded for marketing purposes. The categories of data leaked are not the kind that are easily explained away. Washington, D.C.'s exchange sent applicants' sex and citizenship responses to TikTok, plus race data the tracker failed to filter out. Virginia's premium-estimate tool sent ZIP codes to Meta. New York's marketplace shared with TikTok, Meta, Snap, and LinkedIn the pages applicants visited during enrollment — including pages where they disclosed having incarcerated family members.

The single most important fact: this is not Healthcare.gov. The federal exchange used by the other 30 states does not embed these specific trackers. The problem is concentrated in the state-run marketplaces, where tracker configuration was a state-level decision and the data sharing went further than state officials realized. Bloomberg's reporter, working with the privacy firm Feroot, confirmed that visits to ten of these exchange sites were tied to a journalist's Facebook account — meaning Meta could and did retarget that specific user with ads based on the visits. The retargeting test is, for defenders, the cleanest illustration of how pixel data flows in practice.

State Health Exchange Tracker Investigation
Investigator: Bloomberg News, in collaboration with privacy firm Feroot Security
Scope reviewed: Thousands of pages across all 20 state-run health insurance exchanges + D.C.
Tech-giant recipients: Google, Meta, Snap, Microsoft (LinkedIn), TikTok
D.C.: Sex, citizenship, race (filtering failed), email, phone, country identifiers sent to TikTok
Virginia: ZIP codes sent to Meta via premium-estimate tool
New York: Page-visit data shared with TikTok, Meta, Snap, LinkedIn — including pages disclosing incarcerated family members
Other states named (per CalMatters/The Markup follow-on): Nevada (prescription drug names like Fluoxetine to LinkedIn/Snapchat); Maine (CoverME.gov sent prescriptions and dosages to Google Analytics); Maryland (Spanish-language pages on noncitizen pregnancy and DACA); Rhode Island (Medicaid pages); Massachusetts
2026 enrollees affected: More than 7 million Americans bought insurance through state exchanges
Healthcare.gov status: The federal exchange (used by the other 30 states) does not embed these specific trackers
Removals after Bloomberg's outreach: D.C. paused TikTok tracker rollout; Virginia removed Meta tracker; California had already removed trackers before review

Why "Race Data" and "Citizenship" Ended Up in TikTok

The Washington, D.C. exchange asked applicants questions about sex, race, and citizenship as part of its enrollment flow. The TikTok pixel installed on those pages was designed to collect activity data for ad targeting. According to a D.C. spokesperson, the tracker also captured email addresses, phone numbers, and country identifiers. The exchange's expectation, presumably, was that the pixel would respect content boundaries — that it would collect "page visited" but not "answer the user gave on this page." That expectation was wrong.

The mechanics, per a cybersecurity expert quoted by Bloomberg, are that some trackers attempt to filter sensitive data via keyword matching, but "the keyword filters don't always catch everything — the TikTok tracker on Washington's health exchange stripped out broader racial categories but left specific ethnicity details in." Bloomberg characterized the filtering as a "flawed and brittle process for filtering unwanted information." That is the polite framing. The blunter framing: pixel-based "advanced matching" features are designed to match users across properties for advertising effectiveness, and pretending those features will reliably filter out sensitive data on health-related pages is a category error in tracker selection, not a configuration mistake.
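The brittleness of keyword-based filtering is easy to demonstrate. The toy filter below (an illustration, not TikTok's actual implementation; all field names are hypothetical) drops form fields whose names match a blocklist — and, exactly as the expert described, catches the broad "race" category while letting a specific ethnicity field and a ZIP code pass straight through:

```python
# Toy blocklist filter of the kind Bloomberg describes. Anything the
# list's author did not anticipate passes through untouched.
SENSITIVE_KEYWORDS = {"race", "citizenship", "ssn"}

def filter_payload(payload: dict) -> dict:
    """Drop fields whose names contain a blocklisted keyword."""
    return {
        key: value for key, value in payload.items()
        if not any(word in key.lower() for word in SENSITIVE_KEYWORDS)
    }

# Hypothetical enrollment-form answers
form_answers = {
    "race_category": "declined",       # caught: name contains "race"
    "ethnicity_detail": "Salvadoran",  # missed: "ethnicity" not on the list
    "zip": "20002",                    # missed: never treated as sensitive
}

print(filter_payload(form_answers))
# the specific ethnicity detail and the ZIP code survive the filter
```

A blocklist can only enumerate what its author thought of in advance, which is why the safer control is to keep the pixel off sensitive pages entirely rather than trust it to subtract the right fields.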

Virginia's response is worth quoting directly because it illustrates how state agencies have rationalized this. A Virginia spokesperson told Bloomberg the state did not consider ZIP codes to be personally identifiable information. That position is contested in privacy law — five-digit ZIP codes are reidentifiable when combined with other data, and HIPAA's safe-harbor de-identification standard explicitly requires removing the first three digits of ZIP codes for populations under 20,000 people. Virginia removed the Meta tracker after Bloomberg's review.

Tech Giants Say Their Terms Prohibit This — Which Is the Old Argument

Spokespeople for Meta, TikTok, LinkedIn, Snap, and Google told Bloomberg their terms prohibit advertisers — like the state exchanges — from sharing sensitive or health-related data, and place responsibility on advertisers to ensure compliance. TikTok and Snap acknowledged that sensitive data can include information contained in page URLs. A Meta spokesperson told Gizmodo: "We do not permit or want advertisers to share sensitive information with us through our business tools, and our systems are designed to detect and filter out information that appears potentially sensitive."

This is the same response the platforms gave during the 2022–2024 wave of healthcare provider pixel investigations. The Markup documented in 2022 that 33 of the top 100 U.S. hospital websites had Meta Pixel sending data to Facebook every time a patient clicked an appointment-scheduling button. STAT's investigative team showed in 2023 that nearly every hospital website in the country was leaking visitor data to ad-tech vendors. Federal regulators followed: the Office for Civil Rights and the FTC jointly warned roughly 130 hospitals and telehealth providers in 2023 that pixel-based tracking on patient-facing pages risked HIPAA and consumer-protection violations.

Then came the legal blow. In June 2024, a federal judge in Texas sided with hospital associations, ruling that HHS had exceeded its authority by trying to extend HIPAA to "unauthenticated webpage tracking." That ruling chilled OCR's enforcement appetite. Hospital tracker prevalence dropped from 98 percent in 2021 to 30 percent in 2025, per Bloomberg's reporting — but the drop came from litigation threat, not regulatory action. State health exchanges, operating outside HIPAA's clearest applicability and without the same litigation exposure as hospitals, were not paying attention. Our data breach coverage tracks the legal-and-regulatory cycle these incidents now run through.

What Defenders Should Do This Week

  • Audit your own organization's web trackers on any property that collects health, financial, or other sensitive data. This is not a state-government-only problem. Healthcare providers, telehealth startups, employee benefits portals, financial-services account-opening flows, and any web property that asks sensitive questions has the same risk surface. Run a tag audit using browser dev tools or your tag-management platform; identify every third-party tracker firing on pages with PII or health data.
  • Establish a "no third-party trackers on sensitive forms" policy if you don't have one. Even well-intentioned analytics deployments leak data when pixel configurations include "advanced matching" or similar features that opt into broader data collection by default. The default settings are not safe defaults for sensitive content.
  • Brief privacy and legal teams on the regulatory landscape. OCR has previously taken enforcement action against healthcare entities for pixel-based PHI sharing; FTC has additional jurisdiction over consumer protection; state attorneys general — particularly in California (CCPA), Virginia, and Washington — have additional levers. The June 2024 Texas ruling has narrowed OCR's reach but not eliminated it. Get a written legal read on your organization's exposure.
  • For sectoral organizations beyond healthcare — GLBA-covered financial services, FERPA-covered education, employer benefits portals — the audit logic is identical even when the regulatory framework differs. Pixel data sharing is sector-agnostic in mechanics; legal exposure varies, technical fix does not.
  • If your organization runs a public-facing intake or eligibility flow that asks demographic or health questions, treat the URL itself as sensitive data. TikTok and Snap acknowledged that sensitive data can include URL content; if your URL paths contain category identifiers (e.g., /medicaid/, /pregnancy/, /daca/), the page visit alone leaks information regardless of what the pixel is "supposed" to collect.
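As a starting point for the tag-audit step above, a static scan of a page's HTML can flag script and image tags loading from known tracker hosts. This is a minimal sketch — the host list is illustrative, not exhaustive, and a real audit must also use browser dev tools or a crawler, because many pixels are injected at runtime by a tag manager and never appear in the raw HTML:

```python
# Static tag-audit sketch: flag tags whose src points at a known tracker
# host. TRACKER_HOSTS is an illustrative sample, not a complete list.
from html.parser import HTMLParser
from urllib.parse import urlparse

TRACKER_HOSTS = {
    "connect.facebook.net",      # Meta Pixel
    "analytics.tiktok.com",      # TikTok Pixel
    "www.googletagmanager.com",  # Google Tag Manager
}

class TrackerScanner(HTMLParser):
    def __init__(self):
        super().__init__()
        self.findings = []

    def handle_starttag(self, tag, attrs):
        src = dict(attrs).get("src", "")
        host = urlparse(src).netloc
        if host in TRACKER_HOSTS:
            self.findings.append((tag, host))

# Hypothetical enrollment-page HTML
page = """
<html><head>
  <script src="https://connect.facebook.net/en_US/fbevents.js"></script>
  <script src="https://analytics.tiktok.com/i18n/pixel/events.js"></script>
  <script src="/static/app.js"></script>
</head></html>
"""

scanner = TrackerScanner()
scanner.feed(page)
for tag, host in scanner.findings:
    print(f"third-party tracker: <{tag}> loading from {host}")
```

Running this against every page that collects PII or health answers, and treating any finding on a sensitive form as a defect, is the cheapest first pass before a fuller runtime audit.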

The CyberSignal Analysis

Signal 01 — The category of "configurable pixel sharing" is the wrong technical foundation for sensitive forms

The pattern across hospital websites in 2022, telehealth startups in 2023, and state exchanges in 2026 is the same: organizations install pixels intended for marketing analytics, configure them with default-or-near-default settings, and discover later that those settings leak more than the organization understood. The framing of "misconfiguration" undersells the structural problem. Pixel-based ad-tech is built to maximize matching across properties; the platforms have monetary incentive to collect more rather than less. Filtering "sensitive" data is a feature retrofitted onto an architecture designed for the opposite. Organizations that handle sensitive intake flows should not rely on pixel filtering as a control. The right control is architectural separation: sensitive forms on subdomains or page paths that explicitly do not embed third-party trackers, with marketing analytics restricted to non-sensitive surfaces.
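The architectural separation described above can be enforced mechanically rather than by policy memo: serve sensitive paths with a Content-Security-Policy that allows no third-party scripts at all, and permit analytics hosts only on marketing surfaces. The sketch below shows the idea; the route prefixes and allowed hosts are assumptions for illustration, not a recommended allowlist:

```python
# Path-based CSP separation: sensitive routes get a strict policy that
# blocks all third-party scripts; marketing pages may allow analytics.
# Route prefixes and hosts here are illustrative examples.
SENSITIVE_PREFIXES = ("/enroll", "/eligibility", "/medicaid")

STRICT_CSP = "default-src 'self'; script-src 'self'; connect-src 'self'"
MARKETING_CSP = (
    "default-src 'self'; "
    "script-src 'self' https://www.googletagmanager.com; "
    "connect-src 'self' https://www.google-analytics.com"
)

def csp_for_path(path: str) -> str:
    """Return the Content-Security-Policy header value for a request path."""
    if path.startswith(SENSITIVE_PREFIXES):
        return STRICT_CSP
    return MARKETING_CSP

print(csp_for_path("/enroll/step-2"))  # strict: no third-party scripts load
print(csp_for_path("/plans/compare"))  # marketing surface: analytics allowed
```

With this shape, a marketing team cannot accidentally fire a pixel on an enrollment page — the browser refuses to load the script — which converts "we have a no-tracker policy" into a control enforced by code.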

Signal 02 — The Texas ruling has changed the regulatory math

The June 2024 federal court ruling that HHS exceeded its authority on web-tracking guidance has been the most consequential single event in this space, and it explains a lot about why state exchanges were not careful. With OCR's enforcement appetite chilled, the deterrent against pixel deployment on sensitive pages dropped from "regulatory action" to "potential class-action damages and reputational risk." Hospital websites responded by pulling trackers; state exchanges, with less litigation exposure and less media attention, did not. That asymmetry is the policy gap Bloomberg's investigation now closes through reputational pressure. Whether the ruling holds on appeal, and whether HHS issues new guidance with clearer statutory grounding, are the two regulatory variables to watch.

Signal 03 — State exchanges are now public-sector cases for the same lessons private healthcare learned

Healthcare.gov — operated by the federal Centers for Medicare & Medicaid Services — does not embed these specific trackers. State exchanges, run by state agencies with smaller technical teams and less centralized governance, do. The takeaway for any organization managing public-facing intake flows on behalf of a government program is that the technical-governance gap between "we know about pixel tracking" and "we have a verified policy enforced by code review and audits" is wider than CIOs typically estimate. The organizations that get caught in the next investigation will be the ones whose marketing teams deployed analytics without a privacy review, then assumed the platform's default sensitive-data filters would handle the rest. That assumption has now failed in front of seven million Americans.


Sources

  • Primary: Bloomberg, "State Health Sites Send Race, Location, Immigration Data to Meta, TikTok"
  • Reporting: TechCrunch, "U.S. Healthcare Marketplaces Shared Citizenship and Race Data with Ad-Tech Giants"
  • Reporting: Gizmodo, "Meta and TikTok Are Getting Your Data From State Healthcare Sites"
  • Prior reporting: CalMatters / The Markup, "We Caught 4 More States Sharing Health Data to Big Tech Trackers"
  • Analysis: The Next Web, "Hospital Websites Are Still Leaking Patient Data to Advertisers"
