
Column: Millions of faces scanned without approval. We need rules for facial recognition

Students at UCLA forced the university to back down from plans to install facial recognition systems on campus buildings. But the technology is actively used by private companies, sometimes without people knowing. (AP)

The powers that be at UCLA thought it was a good idea at the time — using state-of-the-art technology to scan students’ faces to control access to campus buildings. Students thought otherwise.

“The implementation of facial recognition technology would present a major breach of students’ privacy and make students feel unsafe on a campus they are supposed to call home,” the Daily Bruin said in an editorial last year.

UCLA dropped the facial recognition plan a few weeks later. “We have determined that the potential benefits are limited and are vastly outweighed by the concerns of our campus community,” officials declared.


I recalled that fracas after the Federal Trade Commission announced the other day that it had reached a settlement with a San Francisco company called Everalbum, which offered online storage of photos and videos.

The company, via its Ever app, scanned millions of facial images without customers’ knowledge and used the data to develop facial recognition software for corporate clients, the FTC said.

Everalbum also promised users it would delete their photos and videos from its cloud servers if they closed their account. However, the company “retained them indefinitely,” the agency said.

“Using facial recognition, companies can turn photos of your loved ones into sensitive biometric data,” said Andrew Smith, director of the FTC’s Bureau of Consumer Protection.

“Ensuring that companies keep their promises to customers about how they use and handle biometric data will continue to be a high priority for the FTC,” he said.

Be that as it may, there’s a lot of money to be made with such cutting-edge technology. Experts tell me consumers need to be vigilant about privacy violations as some of the biggest names in the tech world — including Google, Amazon, Facebook and Apple — pursue advances in the field.


“Since there aren’t federal laws on facial recognition, it seems pretty likely that there are other companies using this invasive technology without users’ knowledge or consent,” said Caitlin Seeley George, campaign director for the digital rights group Fight for the Future.

She called Everalbum’s alleged practices “yet another example of how corporations are abusing facial recognition, posing as much harm to people’s privacy as government and law enforcement use.”

Facial recognition technology took center stage after the Jan. 6 riot at the Capitol. Law enforcement agencies nationwide have been using facial recognition systems to identify participants from photos and videos posted by the rioters.

That’s creepy, to be sure, but it strikes me as a legitimate use of such technology. Every rioter in the building was breaking the law — and many were foolishly bragging about it on social media. These people deserve their comeuppance.

In the absence of clear rules, however, some of the big dogs in the tech world have adopted go-slow approaches to facial recognition, at least as far as law enforcement is concerned.

Microsoft said last year that it wouldn’t sell its facial recognition software to police departments until the federal government regulates such systems. Amazon announced a one-year moratorium on allowing police forces to use its facial recognition technology.


But law enforcement is just one part of the equation. There’s also the growing trend of businesses using facial recognition to identify consumers.

“Consumers need to know that while facial recognition technology seems benign, it is slowly normalizing surveillance and eroding our privacy,” said Shobita Parthasarathy, a professor of public policy at the University of Michigan.

Not least among the potential issues, researchers at MIT and the University of Toronto found that Amazon’s facial recognition tends to misidentify women with darker skin, illustrating a troubling racial and gender bias.

Then there’s the matter of whether people are being identified and sorted by businesses without their permission.

Facebook agreed to pay $550 million last year to settle a class-action lawsuit alleging the company violated an Illinois privacy law with its facial recognition activities.

The Everalbum case illustrates how facial recognition is spreading like poison ivy in the business world, with at least some companies quietly exploiting the technology for questionable purposes.


“Between September 2017 and August 2019, Everalbum combined millions of facial images that it extracted from Ever users’ photos with facial images that Everalbum obtained from publicly available datasets,” the FTC said in its complaint.

The company then used this vast store of images to develop sweeping facial recognition capabilities that could be sold to other companies, the agency said.

Everalbum shut down its Ever app last August and rebranded the company as Paravision AI. The company’s website says it continues to sell “a wide range of face recognition applications.”

Paravision “has no plans to run a consumer business moving forward,” a company spokesman told me, asking that his name be withheld even though he’s, you know, a spokesman.

He said Paravision’s current facial recognition technology “does not use any Ever users’ data.”

Emily Hand, a professor of computer science and engineering at the University of Nevada, Reno, said facial recognition data “is a highly sought-after resource” for many businesses. It’s one more way of knowing who you are and how you behave.


Hand said that “for every company that gets in trouble, there’s 10 or more that didn’t get caught.”

Seeley George at Fight for the Future said, “Congress needs to act now to ban facial recognition, and should absolutely stay away from industry-friendly regulations that could speed up adoption of the technology and make it even more pervasive.”

She’s not alone in that sentiment. Amnesty International similarly called this week for a global ban on facial recognition systems.

I doubt that will happen. With the biggest names in Silicon Valley heavily invested in this technology, it’s not going away. What’s needed are clear rules for how such data can be collected and used, especially by the private sector.

Any company employing facial recognition technology needs to prominently disclose its practices and give consumers the ability to easily opt out. Better still, companies should have to ask our permission before scanning and storing our faces.

“Today’s facial recognition technology is fundamentally flawed and reinforces harmful biases,” Rohit Chopra, then an FTC commissioner, said after the Everalbum settlement was announced.

“With the tsunami of data being collected on individuals, we need all hands on deck to keep these companies in check,” he said.


Chopra has since been appointed by President Biden to serve as director of the Consumer Financial Protection Bureau.

We can all recognize that as a positive step.
