Editorial: Biometric privacy laws must evolve with the times
Who should be held liable when a person’s biometric data — their facial features, voice or fingerprints — is misused?
Illinois is home to some of the strongest consumer privacy regulations in the country, including its rules governing the use of biometric data.
Perhaps you were one of the folks who got a check from Facebook after a class-action lawsuit alleging violations of Illinois’ 2008 Biometric Information Privacy Act. Facebook’s “Tag Suggestions” feature used facial recognition to scan users’ uploaded photos, creating “face templates” or biometric identifiers.
Plaintiffs said this happened without the required written consent or a publicly available retention and destruction policy. In February 2021, the case was resolved with a court-approved $650 million class-action settlement, which the presiding judge called a landmark in consumer privacy law.
Facebook isn’t the only company dealing with these lawsuits. A Chicago man sued Home Depot this month, alleging the retailer used facial recognition at self-checkouts without consent or required policies — a potential violation of BIPA.
One could argue that our biometric privacy rules are an essential tool for protecting personal privacy. Indeed, we are grateful for barriers to widespread misuse of such personal information. On the other hand, one could argue that Illinois’ biometric data laws are trying to regulate a marketplace that changes by the second, and that they aren’t keeping up. There’s truth to that, too.
Take this question, for example:
Should digital infrastructure providers like data centers or cloud platforms be held liable when someone’s biometric data is misused?
Legal minds are actively debating this question. And many in the industry, including those who lobby on its behalf, worry that their networks could unfairly land in the legal crosshairs if the rules aren’t updated.
Illinois’ biometric rules apply to any “private entity” that collects, captures, purchases, receives through trade or otherwise obtains a person’s biometric identifiers or information. That means a company in possession of, or making use of, biometric data has duties such as obtaining informed consent before collection and maintaining written, publicly available retention and destruction policies.
So far, most high-profile cases have focused on end-user firms such as employers, retailers and social-media platforms. But there’s no explicit carve-out in BIPA for data centers or cloud providers. If they merely store encrypted information, they can argue they’re not “collecting” or “using” biometrics. But if a provider offers biometric processing services — or fails to safeguard the data in its possession — plaintiffs could test the boundaries of liability.
There’s a lot to gripe about when it comes to data centers: their reliance on our natural resources, for one, or their debatable claims of being long-term job creators. This won’t be the last thing we write about the industry. But in this case, we think there’s room for improvement in the law, and for clarification.
It’s important to keep laws up to date with fast-moving technology. It’s also important that our state remains competitive as a tech hub.
The state has done this before. Just last year, lawmakers approved, and Gov. JB Pritzker signed, changes to BIPA that limited potential damages and broadened the definition of “written release” to cover electronic signatures.
Legislating is a slow process, but when you’re regulating something this big and fast-changing, you have to be nimble. If data centers played, or could play, an active role in policing the use of biometrics, they would be fair game as potential defendants. But they neither do nor realistically could.
The data-center industry has a solid argument here, we believe, that its members shouldn’t be liable under BIPA for the misuse of people’s likenesses or identities.
_____
©2025 Chicago Tribune. Visit chicagotribune.com. Distributed by Tribune Content Agency, LLC.