If you were surprised to learn that Facebook simply handed out the personal data of more than 50 million users — data that ended up in the hands of Donald Trump’s presidential campaign — you have every reason to feel bushwhacked.
Facebook gives its service away worldwide in return for people posting details of their lives. We all know that going in.
But the Menlo Park company also emphasizes in its terms of service that “you own all of the content and information you post on Facebook, and you can control how it is shared.”
The implication is clear (or should be): You’re in the driver’s seat. If you don’t want your info shared with others, it won’t be.
The reality is that your data belong to Facebook, and the company will enrich itself by doing with them whatever it pleases.
“All Facebook users have to understand that the reason that the firm is so profitable is because our data is gold, and we’re giving it away for free,” said Scott J. Shackelford, an associate business professor at Indiana University focusing on cybersecurity law and policy.
“Reasonable users of the service need to understand this fact, and not be surprised when their data are gleaned and repackaged for an array of purposes,” he told me.
Facebook has drawn the attention of lawmakers in the United States and Europe after reports surfaced that the company provided access to the personal data of tens of millions of users to an academic researcher, who in turn made the information available to others, including a company called Cambridge Analytica.
That company, backed by former Trump aide and right-wing provocateur Steve Bannon, specializes in data mining for conservative political purposes.
Facebook said it was suspending access to Cambridge Analytica and its parent, Strategic Communication Laboratories. It also stressed that the release of user data wasn’t a security breach.
“People knowingly provided their information, no systems were infiltrated, and no passwords or sensitive pieces of information were stolen or hacked,” the company said.
Well, yeah. But that's hardly the whole story.
A University of Cambridge professor named Aleksandr Kogan wanted to build a database of personality profiles. Kogan persuaded around 270,000 people to fill out a survey in an app connected to their Facebook accounts.
That app, in turn, gave Kogan access to the survey takers' Facebook friends, which rapidly expanded the universe of available data to over 50 million people. The information included people's likes and dislikes, where they live, what they do for a living and how much education they have.
All of that apparently was kosher in Facebook’s eyes. Where Kogan went astray was in subsequently making the info available for political purposes to Cambridge Analytica.
“Facebook’s terms of service are quite obviously of no value to consumers,” said Sam Lester, consumer privacy fellow at the Electronic Privacy Information Center in Washington, D.C. “Consumers had no knowledge that a controversial data mining firm was accessing their personal data.”
Although the company's terms of service are emphatic that Facebook users "own" the info they post, Facebook's separate data policy takes a broader view.
It says information will be shared with business partners and with entities “conducting academic research and surveys.” That’s the back door that Kogan climbed through.
“These partners must adhere to strict confidentiality obligations in a way that is consistent with this data policy and the agreements we enter into with them,” Facebook says. But the Cambridge Analytica case shows that once information gets into the wild, there are few effective means of limiting where it goes.
Facebook says that when it found out in 2015 about Cambridge Analytica using the data, it demanded the company delete all the files. Cambridge Analytica says it duly deleted the data two years ago, once it learned that Facebook wasn’t pleased.
But it's not like anyone from Facebook went to Cambridge Analytica's offices and made sure the data had been erased. According to the New York Times and the Observer of London, the data played a role in Trump's digital election endeavors.
Privacy experts say social-media users need to be clear-eyed about what these companies are doing. They’re in the business of making money, and they do this by treating users’ self-posted information as a commodity.
“Users should understand that if they want to protect their personal data they should not share it with a company that makes money off their users’ information and attention,” said Susan Freiwald, a law professor at the University of San Francisco.
“Even if it were not Facebook’s intent to put users’ personal information at risk, merely collecting and storing it turns it into a honeypot that is attractive to bad actors,” she observed.
Always go into the privacy settings of an online service or app and limit, as best as you can, how much your data will be shared. Keep in mind that you can’t completely keep your information under wraps — these companies aren’t charities, after all.
Beyond that, privacy experts say current law lags behind technological capabilities, and it’s up to lawmakers to pass new safeguards addressing data sharing.
“There are simply no constraints,” said Kathryn Montgomery, a communications professor at American University. “These corporations can do with our data whatever they please, without telling us.”
She called Facebook's provision allowing data sharing for academic research and surveys "a huge loophole."
I can think of 50 million reasons why she’s right.