Redefining biometrics & liability implications

As biometric technologies expand in health care and consumer surveillance, regulators are cracking down on companies that misuse sensitive data — with HIPAA, FTC orders and billion-dollar state settlements setting the tone.

© Andy Dean - stock.adobe.com

© Andy Dean - stock.adobe.com

According to the U.S. Department of Homeland Security, “biometrics is the automated recognition of individuals based on their biological and behavioral characteristics from which distinguishing, repeatable biometric features can be extracted for the purpose of biometric recognition.” Biometrics are not new. In fact, for over 60 years, the National Institute of Standards and Technology (NIST) has been conducting research in the area of biometrics, “with work on fingerprint technologies for the FBI to support law enforcement.”

From a health care standpoint, biometrics intersects with the Health Insurance Portability and Accountability Act of 1996 (HIPAA). The following is a sample from 45 CFR § 164.514(b)(2)(i), which lists the individual identifiers that must be removed in order to “de-identify” health information: names; vehicle identifiers and serial numbers, including license plate numbers; device identifiers and serial numbers; and biometric identifiers, including finger and voice prints and full-face photographic images and any comparable images (emphasis added). So, the first type of liability relates to HIPAA: biometric data is a component of individually identifiable health information (IIHI) and, relatedly, protected health information (PHI); the obligation to de-identify this type of information attaches to it; and the Breach Notification Rule requirements apply when it is compromised. Essentially, covered entities and business associates are required to protect IIHI and PHI from unauthorized access, and biometric data is part of this.
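As a rough illustration of the de-identification step described above, the sketch below strips the sampled identifier fields from a patient record. The field names are hypothetical, and a real safe-harbor pipeline must address all 18 identifier categories in 45 CFR § 164.514(b)(2)(i), not just the sample quoted here.

```python
# Illustrative sketch only: field names are hypothetical, and this covers
# just the sample of 45 CFR § 164.514(b)(2)(i) identifiers quoted above,
# not the full set of 18 safe-harbor identifier categories.

SAMPLE_IDENTIFIER_FIELDS = {
    "name",
    "vehicle_identifier",    # incl. license plate numbers
    "device_serial_number",
    "fingerprint_template",  # biometric identifiers
    "voice_print",
    "full_face_photo",
}

def strip_sample_identifiers(record: dict) -> dict:
    """Return a copy of `record` with the sampled identifier fields removed."""
    return {k: v for k, v in record.items() if k not in SAMPLE_IDENTIFIER_FIELDS}

record = {
    "name": "Jane Doe",
    "voice_print": b"\x00\x01",
    "diagnosis_code": "E11.9",
}
print(strip_sample_identifiers(record))  # only the non-identifier field remains
```

The point of the sketch is that de-identification is a removal obligation on specific data elements, biometrics among them, rather than a general anonymization promise.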

The Federal Trade Commission (FTC) also enforces technology violations as they relate to consumers under the Federal Trade Commission Act. In December 2023, the FTC banned Rite Aid from using artificial intelligence “facial recognition technology for surveillance purposes for five years to settle [FTC] charges that the retailer failed to implement reasonable procedures and prevent harm to consumers in its use of facial recognition technology in hundreds of stores.” Notably,

The proposed order will require Rite Aid to implement comprehensive safeguards to prevent these types of harm to consumers when deploying automated systems that use biometric information to track them or flag them as security risks. It also will require Rite Aid to discontinue using any such technology if it cannot control potential risks to consumers. To settle charges it violated a 2010 Commission data security order by failing to adequately oversee its service providers, Rite Aid will also be required to implement a robust information security program, which must be overseen by the company’s top executives.

States are also enforcing privacy violations related to biometric data. For example, in July 2024, Texas Attorney General Ken Paxton, relying on the Texas Deceptive Trade Practices Act coupled with violations of Texas’ biometric laws, settled with Meta (f/k/a Facebook) for $1.4 billion “to stop the company’s practice of capturing and using the personal biometric data of millions of Texans without the authorization required by law.”

Companies can face a myriad of enforcement actions, as well as class action lawsuits. The best ways to stave off litigation are to maintain an effective compliance program, exercise board governance, and comply with HIPAA and applicable state laws.

Rachel V. Rose, JD, MBA, advises clients on compliance, transactions, government administrative actions, and litigation involving healthcare, cybersecurity, corporate and securities law, as well as False Claims Act and Dodd-Frank whistleblower cases. She also teaches bioethics at Baylor College of Medicine in Houston. Rachel can be reached through her website, www.rvrose.com.

© 2025 MJH Life Sciences

All rights reserved.