
Facebook needed third-party apps to grow. Now it’s left with a privacy crisis

Facebook Chief Executive Mark Zuckerberg is shown in silhouette in 2013. His company wouldn't be the behemoth it is today if it didn't allow third-party apps such as the one that harvested data from 50 million unwitting users for Cambridge Analytica.
(Marcio Jose Sanchez / Associated Press)

Facebook had only 20 million users when it opened up its budding platform to outside app developers in 2007, giving them much-needed access to the social network’s growing web of friends and family.

The developers built online games, quizzes and dating apps that gave people even more reasons to join Facebook.

“Until now, social networks have been closed platforms. Today, we’re going to end that,” Facebook founder Mark Zuckerberg told a gathering of hundreds of developers at a company conference at the time.


It proved a turning point for the company, sparking runaway growth that saw Facebook add an average of 200 million users a year en route to becoming the world’s biggest and most powerful social network. It also entrusted outside developers with Facebook’s treasure trove of personal data, showing where users lived, where they went to school and what, if any, political affiliations they had.

The consequences of that shift are now coming into sharper view amid a growing scandal over Cambridge Analytica, a data analytics firm tied to the Donald Trump presidential campaign that accessed details from 50 million Facebook users without their knowledge in an attempt to influence voters.

The scandal, first reported over the weekend by the New York Times and the British newspaper the Observer, led Tuesday to the suspension of Cambridge Analytica's chief executive, Alexander Nix. Facebook, meanwhile, faces a new probe by the U.S. Federal Trade Commission into whether it mishandled private user data, as well as a joint investigation by Massachusetts Atty. Gen. Maura Healey and New York Atty. Gen. Eric Schneiderman.

Authorities will likely want to know how much information Facebook provides to outside app developers and what role, if any, the social network has in enabling unauthorized third parties to gain access to that data, experts say.

“App integration allowed people to do things like play Scrabble online with their old high school friends on the other side of the country and it allowed user growth to increase a lot,” Heather Antoine, a Beverly Hills attorney who specializes in internet and privacy law, said of the company’s new tack in 2007. “It didn’t start with a malicious intent, and I still don’t know if Facebook has any malicious intent, but other people did and they found loopholes to get data.”

Cambridge Analytica, a company owned by conservative billionaire Robert Mercer, is accused of receiving the data from University of Cambridge psychology professor Aleksandr Kogan. He had developed a personality quiz app for Facebook called “thisisyourdigitallife,” which was downloaded 270,000 times by Facebook users in 2013. At the time, Kogan could glean information from those users’ contacts, leading to additional information from millions more accounts.


Kogan had permission to obtain the data, but is accused of violating Facebook rules when he passed the information to a third party, Cambridge Analytica, for money.

Facebook learned of the improper transfer in 2015 and demanded that Cambridge Analytica destroy the data, which the firm says it did. However, former employees of Cambridge Analytica say the company still has some of the data and that Facebook never verified that it had been deleted.

Their claims, if proved correct, suggest there are few consequences for violating Facebook's rules on how user data may be shared. Cambridge Analytica was not suspended from Facebook until Friday, two years after the social media giant learned of the violation.

The controversy has raised suspicions that more Facebook data have been passed to third parties than the company is willing to acknowledge — a potentially vast market that has spread to the so-called dark web, where stolen information and identities are exchanged.

Facebook accounts were selling for $5.20 apiece on the dark web last month, more than three times the price for Twitter accounts, according to Top10VPN, a site that tracks online security tools.

Sandy Parakilas, a former Facebook employee whose job entailed policing data breaches by third-party developers, said the spread of ill-gotten user information was rampant.


“Once the data left Facebook servers, there was not any control, and there was no insight into what was going on,” Parakilas, who held the position for two years starting in 2011, told the Guardian.

“It has been painful watching,” he added, “because I know that they could have prevented it.”

Parakilas alleges that Facebook turned a blind eye because the company believed willful ignorance of the problem would diminish its legal liability. Even so, he said, it was becoming increasingly apparent that a black market for Facebook user data existed.

In November, the company’s vice president for global operations, Justin Osofsky, acknowledged that Facebook had been lax about defending user data in the past. But he said the company has since introduced more stringent rules requiring developers to explain what data they need and how they’re going to use it.

“We also do a variety of manual and automated checks to ensure compliance with our policies,” a Facebook spokesperson said in an e-mailed statement Tuesday. “These include steps such as random audits of existing apps along with the regular and proactive monitoring of apps.”

Had Kogan introduced his app a little more than a year later, he wouldn’t have been able to access users’ contact lists. That’s because Facebook reduced how much data it shared with developers in 2015, including details about work histories and relationship statuses.


Now that Facebook has amassed more than 2 billion users, it has less incentive to share its most valuable user data. By keeping that information close, the company can bolster its own ad business and reduce the risk of security breaches.

The shift was necessary because Facebook had been under fire for sharing data with third parties long before the Cambridge Analytica scandal.

In 2011, Facebook settled with the FTC and entered a consent decree after the regulator ruled that the company had deceived its users about privacy claims. “Facebook had a ‘Verified Apps’ program and claimed it certified the security of participating apps. It didn’t,” the FTC said at the time.

The action came not long after the Wall Street Journal reported widespread misuse of Facebook user information by app developers and third-party companies. In one case, an online tracking firm called RapLeaf was found to be collecting user data and selling it to advertisers and political consultants. Facebook later banned the company.

While the breadth of data now available to app developers has diminished, experts say it has only increased for Facebook. That includes tracking users’ locations, their payments and “activities on and off Facebook from third-party partners,” according to the company’s data policy.

“They’re still collecting tons of information from us,” said Betsy Sigman, a professor at Georgetown’s McDonough School of Business. “And they’re sharing it all over the place and making money. It’s the greatest registry the world has ever seen.”


david.pierson@latimes.com

Twitter: @dhpierson
