Federal regulators served notice last week: They’re watching how businesses use — and possibly abuse — consumers’ personal information.
And that’s great. It’s about time more official attention was paid to ways that companies invade our privacy. But federal authorities can do more, much more, to level the playing field.
In a report titled “Big Data: A Tool for Inclusion or Exclusion?”, the Federal Trade Commission stopped short of laying down rules for how companies collect and profit from customer information. But it made clear that officials are wary of how such practices can cause harm.
Big Data refers to both the ways businesses exploit customer information and the shadowy industry that’s emerged to buy and sell people’s data like any other commodity. Highly detailed profiles of your likes and dislikes are now available to marketers, employers and others — and you have little if any say over how they’re used.
“Big Data’s role is growing in nearly every area of business, affecting millions of consumers in concrete ways,” FTC Chairwoman Edith Ramirez said. “The potential benefits to consumers are significant, but businesses must ensure that their Big Data use does not lead to harmful exclusion or discrimination.”
Among the benefits, the agency says, are “more efficiently matching products and services to consumers” and creating economic opportunities for lower-income people who might otherwise not receive credit or an affordable interest rate.
But these advantages come at a price.
“Our every move, online and increasingly offline, is being stealthily watched, analyzed and used to make decisions about us,” said Jeff Chester, executive director of the Center for Digital Democracy, an advocacy group.
“Every consumer should be alarmed about the host of little publicly known practices that can harm our credit, employment and privacy,” he said.
Maneesha Mithal, associate director of the FTC’s Division of Privacy and Identity Protection, told me that the agency didn’t want to set a combative tone by seeking new rules for businesses’ use of customer data. Instead, it hopes to raise awareness about issues companies should consider.
“Many companies are trying to do the right thing,” Mithal said. “But I’m not sure all companies know the right questions to ask.”
Among those questions, the FTC report says, is whether the data a company collects fairly reflects a consumer’s financial profile or crosses any ethical lines.
It cites the example of a company that crunched available Big Data numbers and determined that people who live closer to work are less prone to seek other jobs. However, establishing a geographic preference for hires could result in racial discrimination because people from some neighborhoods would be viewed as less desirable.
I take a more macro view of Big Data. I’m troubled by the systemic violation of personal privacy by companies and organizations that treat your data as their asset, to be exploited however they please.
European nations recently unveiled new privacy rules aimed at giving consumers greater say over how their information is used.
The rules include a “right to be forgotten,” allowing people to request that companies delete information about them that’s no longer relevant, such as phone records held by a former wireless provider or past Internet searches. They also require that businesses inform regulators within three days of any data breach.
There’s no right to be forgotten in this country and no federal notification law covering all data breaches. In California, businesses are required to report a data breach only if it’s “reasonably believed” that unencrypted data has fallen into the hands of hackers.
Since 2005, according to the Privacy Rights Clearinghouse in San Diego, nearly 896 million consumer records have been put at risk by more than 4,700 known data breaches.
The actual number of breaches “is almost certainly much higher,” said Beth Givens, the advocacy group’s executive director, because many breaches are never reported.
The FTC has asked Congress for more authority to regulate privacy matters. So far, Congress has ignored the agency’s requests.
But a close look at Section 5 of the Federal Trade Commission Act shows that the agency already has considerable muscle in this regard. It grants the FTC authority to crack down on “unfair or deceptive acts or practices in or affecting commerce.”
I’d argue that failing to encrypt customer information, at a time when data breaches and identity theft are rampant, constitutes an unfair business practice. It unfairly places consumers at risk.
I put that argument to the FTC’s Mithal, who sympathized with consumers’ plight. But she said Section 5 is clear about what can be considered an unfair practice.
First, she said, the practice has to cause or be likely to cause “substantial injury to consumers.” It has to be something that consumers can’t “reasonably avoid” by, say, taking their business elsewhere. A practice also won’t be considered unfair if it’s “outweighed by countervailing benefits to consumers or to competition.”
The vast majority of U.S. businesses fail to encrypt customer data, but they could argue that the practice isn’t unfair: Few people suffer “substantial injury” from breaches, and customers benefit from lower prices and speedier service, since encryption can be expensive and can slow down computer systems.
“Our tools are limited,” Mithal said. “We’re using them as much as we can. Beyond that, we’ve asked for more tools.”
I don’t buy it. Businesses are constantly seeking innovative ways to exploit and make money from our information. The least that can be expected in return is that they’re using state-of-the-art resources to keep our info under wraps.
Otherwise, they’re getting more from the equation than we are.
And that’s not fair.
David Lazarus’ column runs Tuesdays and Fridays. He also can be seen daily on KTLA-TV Channel 5 and followed on Twitter @Davidlaz. Send your tips or feedback to firstname.lastname@example.org.