Facebook’s parent company Meta is facing two proposed class-action lawsuits for using the Meta Pixel tracking tool on health system websites to target ads.
This is not the first time that Meta-Facebook has been dragged through the courts and sued for a breach of privacy. In this case, the problem stems from the company’s wholesale vacuuming up of all kinds of metadata whenever a user visits a web page containing its Pixel tracking code.
Pixel consists of a few lines of JavaScript code and is widely embedded in web applications. It appears unlikely that the providers using these web applications were aware of the code contained in their portal pages, or that highly confidential, HIPAA-protected information was being sent to and used by Meta-Facebook without patients' express written permission. This is especially troubling because Meta is not a duly authorized HIPAA Business Associate, a requirement before HIPAA Covered Entities (CEs) can share protected health information with a third party, nor is Meta a HIPAA CE in its own right. Based upon recent research, it is probable that hundreds of healthcare portals contain the Meta Pixel code unbeknownst to most providers, and that millions of patients could be affected.
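For context, a typical Pixel deployment is nothing more than a short script pasted into a site's page templates, often by a web team or an analytics plugin rather than by anyone handling patient data. The following is a simplified, illustrative sketch of what that embed does rather than Meta's verbatim base code; the pixel ID is a placeholder, and the standard "PageView" event shown fires on every page load, including pages of a patient portal whose URLs may reveal appointment or condition details.

```html
<!-- Simplified sketch of a typical Meta Pixel embed; not Meta's verbatim snippet.
     The pixel ID below is a placeholder, not a real advertiser account. -->
<script>
  // Minimal stub: queue any fbq() calls made before the real library arrives.
  window.fbq = window.fbq || function () {
    (window.fbq.queue = window.fbq.queue || []).push(arguments);
  };

  // Asynchronously load Meta's fbevents.js tracking library.
  var tag = document.createElement('script');
  tag.async = true;
  tag.src = 'https://connect.facebook.net/en_US/fbevents.js';
  document.head.appendChild(tag);

  fbq('init', '000000000000000'); // placeholder pixel ID tying the page to an ad account
  fbq('track', 'PageView');       // reports the visited URL, referrer, and browser metadata to Meta
</script>
```

Because the script runs on every page that includes the template, whatever appears in the page address or form fields at that moment travels with the event, which is how data from scheduling and condition-search pages can end up in an advertising pipeline.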
The big question is whether Meta Corporation failed to realize that it was illegally being sent PHI from Pixel, even as it continued to monetize that data to sell targeted advertising to unsuspecting patients. This point may become a pivotal argument in pending lawsuits and any regulatory enforcement actions. Following previous privacy violations, Meta-Facebook is supposed to have implemented business tools to identify sensitive health data and filter it out of its advertising revenue-generating systems.
In what will likely be a double blow, the collected data was not just innocuous de-identified medical information. “The data Meta received reportedly contained medical symptoms & conditions, prescription information, doctors’ names, IP addresses, and other data defined as HIPAA identifiers. It would therefore be relatively easy to reverse engineer this PHI data to determine the patient identity. It all comes down to the number of data points held in the Meta advertising database,” claimed Richard Staynings, Chief Security Strategist with Cylera. “This could end up being labeled as a massive breach of highly sensitive and confidential regulated HIPAA data.”
In addition to the recently announced class actions, it seems likely that the Office for Civil Rights (OCR), the enforcement division of the Department of Health and Human Services (HHS), is spinning up a task force to investigate this breach and will be assigning a large team to examine potential violations of HIPAA, the 1996 federal Health Insurance Portability and Accountability Act.
Not only does Meta Corporation likely face HIPAA regulatory concerns, but it also seems likely that various state attorneys general (AGs) will be looking very carefully at whether the Pixel code is present in their jurisdictions on web pages where there is an expectation of privacy. This is especially so on healthcare portals. Finally, it also seems likely that OCR and the AGs will be looking carefully at healthcare providers themselves, examining their policies, standards, procedures, and guidelines around due diligence for the acceptance of web application technologies and the functionality they enable.
“This is an extreme example of exactly how far the tentacles of Big Tech reach into what we think of as a protected data space,” said Nicholson Price, a University of Michigan law professor who studies big data and health care. “I think this is creepy, problematic, and potentially illegal” from the hospitals’ point of view.
In 2019, the Federal Trade Commission (FTC) imposed a $5 billion penalty on Facebook and required it to submit to new restrictions and requirements to hold the company accountable for its data privacy decisions. This included the promised use of a sensitivity filtering mechanism.
Systemic Problem
Many of these privacy issues stem from a fundamental imbalance between the rights of individuals in the United States to remain anonymous and to have their data kept private, and the rights of large corporations to collect and mine data for profit. This is a balance that has been addressed in Europe through GDPR, the 2016 General Data Protection Regulation, which has quickly become a global standard for privacy outside the United States.
The federal nature of the US, however, has resulted in 50 very different and separate state privacy regimes, which makes it hard to enforce privacy standards for individuals given so much cross-state commerce. Attempts by the federal government to catch up with other OECD nations through a revised national privacy act have met with opposition from some states concerned that a federal law would water down their existing provisions, while other state representatives oppose the imposition of anything similar to GDPR, which they regard as an undue constraint on businesses.
The latest in a long line of attempts to update US privacy law is currently working its way through Congress. It remains to be seen whether it results in national privacy reform or whether, given the highly fractured nature of US lawmaking, it goes the way of prior attempts.