Starting on Jan. 1, Californians can expect to find a new feature on retail and social media websites — a message and button or link through which they can instruct site owners not to sell their personal information.
That may be the first hint many users receive of the state’s landmark Consumer Privacy Act, which goes into effect on New Year’s Day. It’s a vital law that sets a standard for the protection of consumer privacy that other states and the federal government would be wise to follow.
But it’s only a start. New methods of incursions into personal privacy are being cooked up constantly by businesses seeking profits from snarfing up information about individuals and exploiting it themselves or selling it, piecemeal or in bulk, to others.
The field is “so complicated,” says Alastair Mactaggart, the Northern California real estate developer who became the driving force behind the new law. He’s already followed up by drafting a ballot initiative for November to close loopholes and strengthen enforcement. But even the new proposal may fail to address some of the privacy violations coming at ordinary citizens from all directions.
“We had a constant battle to stop efforts by industry to gut the bill,” says Justin Brookman, a privacy expert at Consumers Union in Washington.
Among its major provisions, the law gives California consumers the right to know what personal information retailers, social media platforms and other service providers are collecting, selling or sharing with others. Consumers can demand that their information be deleted, and can opt out of allowing it to be collected in the first place.
That’s the point of the notice and link that must appear on websites that collect any data. The law applies to businesses with annual sales of at least $25 million or that buy or sell the personal information of at least 50,000 customers, households or devices. That includes a wide swath of the internet, from web giants like Google to news outlets including the Los Angeles Times to e-commerce sites and more.
Businesses can charge customers who opt out of data collection more for their services, but the price difference has to be related to the value of the information. That should place a tight limit on the price differences because estimates of the cash value of individuals’ data are low.
For example, according to a study this summer by Berkeley Economic Advising and Research for Atty. Gen. Xavier Becerra, “general information about a person such as their age and gender were found to be worth $0.0005 per person.” Information that a woman was pregnant was pegged at about 11 cents per person.
The Berkeley figures were based on estimates compiled by the Financial Times, which concluded that the total value of 61 basic information nuggets often sought by data buyers was about $4.83 on average. Most individuals, of course, don’t evaluate their personal information strictly in dollars and cents. Data-collecting companies, however, value the information in the aggregate, in which it’s worth billions.
The law directs businesses to safeguard the personal information in their possession, and allows individual consumers to sue for up to $750 per incident, or for actual damages, in the case of a breach. The attorney general can also sue for up to $2,500 per incident or $7,500 per incident if the breach is intentional.
As advanced as the law may be, it became evident well in advance of its effective date that it left shortcomings businesses could exploit, Mactaggart told me; some were the result of the haste with which the law was written, in a legislative effort to fend off a ballot initiative Mactaggart had proposed. For example, a provision allows businesses to “cure” a data breach by implementing security provisions within 30 days after a breach has been discovered.
If the new ballot initiative Mactaggart drafted passes next year, retrospective cures of security holes won’t protect a business from liability. It would triple the maximum penalty, to $7,500, for breaches involving data about children.
The initiative also would establish a California Privacy Protection Agency, headed by a five-member commission and given an initial $10-million budget for enforcement (indexed to inflation after the first year).
Perhaps its most important innovation is the addition of special protections for so-called sensitive personal information, such as Social Security numbers and information about a person’s religious beliefs, race, sexual orientation, account logins and passwords, genetic data, health or geolocation. “We don’t have a concept of sensitive personal information in the law right now,” Mactaggart says, “but it’s information so sensitive that businesses shouldn’t use it unless it’s absolutely essential.”
Take geolocation. In a 2017 Massachusetts case, an advertising agency was caught identifying women who went to abortion clinics by tracking their smartphones and selling the information to an antiabortion group that inundated them with antiabortion messages. The ad agency’s owner bragged that he could “tag all the smartphones entering and leaving the nearly 700 Planned Parenthood clinics in the U.S.,” according to a complaint filed by Massachusetts authorities. The agency agreed to cease the practice.
Mactaggart’s initiative would prohibit tracking a device’s location more precisely than to within an area of about 250 acres. (A football field covers about 1.3 acres.) “So you won’t be able to track how long I’ve been at the gym, or when I arrived at work, or whether I went to a rehab clinic,” Mactaggart says.
The initiative was cleared by the attorney general on Tuesday to move into the signature-gathering stage. About 632,000 signatures will be needed to qualify the measure for November’s ballot.
Even if it passes, however, gaps and loopholes will remain. In part, that’s because consumers are casual about their personal data, too willing to give it up to merchants or social media platforms without asking how it will be used and objecting when it’s exploited against their interests.
Breaches involving Social Security and bank account numbers, birth dates and other information that could be exploited by identity thieves are so common these days that few consumers seem stirred by them. The companies assembling these data and failing to adequately safeguard them are so big and rich that no penalties seem to matter.
In July, the Federal Trade Commission hit Facebook with a record fine of $5 billion for breaching an earlier order over its privacy loopholes — in other words, Facebook was judged to be a privacy recidivist — but the penalty was equivalent only to the revenue the company collects in an average month and less than a fourth of its annual profit. In other words, it could be shrugged off as the cost of doing business.
The problem is the domination of our public spaces by what privacy expert Shoshana Zuboff calls “surveillance capitalism.” Companies collect data not only from our online habits and retail activities but also from an increasing variety of devices that we invite into our homes or use to secure our property — internet-enabled thermostats, video-equipped doorbells, smartphones with geolocating technologies, and more.
Amazon employees have reported that commands issued to voice-activated devices such as the company’s Echo have been recorded and stored, and even casual conversations held within earshot of the devices can be swept up.
The company’s Ring doorbells can view and record not only visitors to a home’s front porch but passersby on the street as well. Homeowners have the option to share these recordings with local law enforcement agencies — though strangers who unwittingly come within range don’t have the chance to opt out. Amazon even encourages neighbors to mutually share their Ring videos in a sort of neighborhood watch system through a downloadable app.
Facial recognition can be especially insidious, because it’s invisible to the targets. In May, San Francisco banned local law enforcement agencies’ use of facial recognition, along with other passive surveillance technologies such as automated license plate readers and camera-equipped drones.
And on Dec. 18, five Democratic senators including Kamala D. Harris of California questioned the Department of Housing and Urban Development about its installation of facial recognition security systems, which they said “could be used to enable invasive, unnecessary and harmful government surveillance of their residents.”
“Unilateral incursion” into personal privacy by companies seeking surveillance profits has spread, Zuboff wrote in her 2019 book, “The Age of Surveillance Capitalism,” via the collection of personal data “through [Google’s] Street View’s Wi-Fi and camera capabilities” as well as the capture of voice communications, the tracking of smartphone location data and wearable technologies and facial recognition capabilities.
In other words, ordinary citizens are swimming in a sea of technological eyes and ears, generally without knowing. Rights to privacy “have been usurped by a bold market venture powered by unilateral claims to others’ experience and the knowledge that flows from it,” Zuboff wrote. Surveillance capitalism “claims human experience as free raw material.”
In that context, laws such as California’s, even if augmented with Mactaggart’s new initiative, struggle to hold back the tide. Americans are only just beginning to sense the reach of surveillance capitalism — some users of Fitbit, the exercise tracking device, said they’d scrap theirs after Google bought the company, presumably to gain access to users’ personal data. But it’s unclear how many would really follow through. (The acquisition is pending regulatory approval.)
The battle for consumer privacy may be a never-ending cat-and-mouse game between surveillance capitalists and lawmakers. If experience is any guide, the former will always be steps ahead of the latter.
“Policy,” Brookman says, “always tends to be slower than technology.”