Column: Facial ID recognition can help on your phone, but not so much in law enforcement hands
After Hong Kong criminalized the wearing of masks at public gatherings, protesters who were the clear targets of the law promptly protested again – this time wearing masks of faces that included cartoon characters and Hong Kong’s chief executive. The demonstrators had already destroyed some street security cameras they believed were equipped with facial recognition software, and put on the masks to keep police from using the same technology against them.

People may like the facial recognition ID in their smartphones, but in the hands of law enforcement, it may be another matter. California Gov. Gavin Newsom signed into law a three-year moratorium on putting facial ID technology into police body cameras – over the objections of the California Police Chiefs Assn., which had cautioned legislators that such a ban “severely hinders law enforcement’s ability to identify and detain suspects of criminal activity,” and to the relief of civil liberties and privacy groups.

How reliable is this fledgling software? The ACLU in Massachusetts found that technology available from Amazon falsely matched official headshots of New England pro athletes with pictures from a criminal mugshot database about one out of six times. When the ACLU of Northern California ran the same test with state legislators, it got comparable numbers. Matt Cagle is the group’s technology and civil liberties attorney.
What is the objection to facial recognition technology being used by police through body cams?
In California and in communities across the United States, body cameras were promised as a way to make officer conduct more transparent and more accountable. It would be a complete breaking of that promise, and the purpose of body cameras, to suddenly turn them against the public as surveillance devices roaming our streets, tracking our faces and cataloging our every movement.
But people in public are already de facto granting the right to other people to take their photographs.
Just because we go about our public lives, our daily lives in public spaces doesn’t mean we lose an expectation of privacy in our lives.
Just last year, the U.S. Supreme Court said that the government needs to get a warrant when they go and try to get your location from your cellphone company. And keep in mind, our cellphones are at our homes sometimes, but they’re often with us in public spaces.
So even the Supreme Court has rejected this idea that we lose our right to privacy because we engage in modern society and go into the world.
With body cameras, though, you have a system that can pervasively track people and that is sometimes inaccurate. It allows for an unprecedented amount of tracking of people’s locations and whereabouts, and potentially even of the emotional expressions on their faces.
Is your concern when facial recognition technology doesn’t work, or when it does work correctly?
It’s actually both. Prominent researchers have found that these products often fail to accurately identify people. You can imagine that if facial recognition misidentifies somebody, a law enforcement officer may make a misinformed decision about whether to pull somebody over, whether to arrest somebody, and potentially even whether to use lethal force. That can be a life-and-death consequence of a failed, inaccurate technology.
And we know facial recognition, when it’s inaccurate, is often more inaccurate for people of color, women of color and children. Many of these communities have been over-policed historically. And so the harms and consequences can be really drastic.
Even if this technology is perfectly accurate, it would then give the government an unprecedented power to track your day-to-day life and where you’ve gone, and with that, your private interactions, your private habits, whether you go to the doctor or the therapist, where you live, where you work, and whether you attend a certain church or not.
Even if perfectly accurate facial recognition were available to governments, that would be a huge civil rights and civil liberties problem, and these aren’t problems that the companies are going to fix simply by tweaking an algorithm. People should be able to walk down the street without having their identities tracked and logged by a company or government without their consent.
The ACLU tried an experiment on how accurate facial recognition technology was. What happened?
We used Amazon’s facial recognition product to scan the faces of all California legislators, and we scanned their faces against a mugshot database – 120 legislators scanned against 25,000 mugshot photos. And that system produced 26 false matches.
About 20% of all the legislators were misidentified?
That’s right. They were falsely matched to an image of somebody who had been arrested.
And that caused a stir in Sacramento. Legislators realized that one of the prominent software products for facial recognition got it wrong and could get it wrong at an astounding rate.
But then they also realized that even if this technology improves, and it has improved in some respects, you would be attaching a tracking system to every officer’s lapel or in their hand. It would be basically the same as requiring all of us to show our I.D. if we walk past law enforcement going about our daily lives down the street. And that’s totally incompatible with the way that our society operates.
I read that researchers at Georgetown had found that ICE – Immigration and Customs Enforcement – was mining millions of driver’s license photos for possible use for facial recognition?
That’s right. The researchers at Georgetown’s Center on Privacy and Technology found that ICE, in its effort to target immigrant communities and families, had sent demands to, I think, three different states, maybe four, asking those states to run facial recognition searches in their DMV databases.
This really, I think, shows a key concern with facial recognition systems, which is once you build a system that contains the sensitive face data and the personal information of people, that system becomes vulnerable to demands from other government actors, government actors who might have policy goals and sort of missions that are contrary to California values.
People would be familiar with seeing protesters in Hong Kong wearing masks, which is banned there. They wear these masks in defiance of the law, presumably because the police in Hong Kong are using or plan to use some form of facial recognition technology to identify the protesters.
I’m not sure the extent to which facial recognition is being deployed there in Hong Kong, but elsewhere in China -- including in the province where the largely Muslim ethnic minority, the Uighurs, live -- we’ve seen massive use of facial recognition to track, to target and to control populations that are disfavored by the government.
It’s making the Hong Kong protesters concerned that by exercising their right to their political speech, they’re going to be subject to retribution by the government.
Here in the United States, we have a right to free speech against the government, and facial recognition could be used as a tool or as a system to intimidate people who want to attend a protest on an issue that the local government or the federal government disfavors.
It’s a great point to think about the impact that facial recognition has on sort of rights that relate to free speech and not just privacy.
You’ve mentioned community safety, and a lot of Americans make that a priority, and they are willing -- on some different sliding scales, depending on who you’re talking to -- to give up certain privacy concerns, privacy protections in order to get safety.
Often in the United States, the conversation about surveillance proposals is one where we place privacy and civil rights on one side of the scale and security and safety on the other. But in fact, the reality is much more nuanced than that.
Being smart about new surveillance proposals, deciding to draw a line against dangerous surveillance proposals, inaccurate technologies and biased technologies -- that does make us safer.
We have the instance in a couple of communities where the police have said, ‘We’ll give you a free Ring system for your door if you give us access to your video,’ which would give police a way of watching the street that they don’t now have.
We’re seeing the growth of these agreements between law enforcement and Ring to provide discounted surveillance cameras for people’s homes in exchange for law enforcement having access to a system where they can request people to hand over their footage.
And these raise some really complicated issues. This company, Ring, has different interests at heart than many consumers do, we think – namely, an interest in making money. I think it’s really important for communities to be part of decisions when their law enforcement agencies want to get into these close relationships with a private surveillance company.
Private companies like airlines are adopting this software, as are some gatherings like concerts, although some music venues have pledged not to use it. Aren’t there advantages to this technology?
There are companies that want to use it for entry into stadiums. There’s companies who want to deploy it against schoolchildren. There’s companies who want to use it against tenants in rented buildings.
But it’s really important to have a conversation about whether that technology is even necessary, and whether there’s another way to keep people safe and keep people secure.
You use the word “against,” as if it were an adversarial system. Do you think it is?
We think that facial recognition poses an unprecedented threat to privacy and people’s civil rights, and that all too often, surveillance technologies like facial recognition are deployed against community members, against black and brown community members, against immigrants and against activists in a way that is unfair and unjust and harms civil rights.
We need not look only at facial recognition to see the history of surveillance technology use in the United States, and the history of these technologies being turned against activists and other communities who are disfavored by the government.
I wonder if the day’s going to come when, if we want to protect our privacy, we end up having to carry a famous-person mask with us and put it on before we go to a concert or a restaurant or go shopping.
Well, right now, the ACLU is fighting, and really winning, to make sure that you don’t have to – that no one has to hide their identity simply because they want to walk down the street or engage in public life. And that’s really the way it should be in our free society.