Many Facebook users rely on the social network to figure out what's going on in the world. But what if the world Facebook shows them is wildly distorted?
That's the question raised after a former employee of a data mining firm that worked for Donald Trump's presidential campaign alleged the company used Facebook to bombard specific individuals with misinformation in hopes of swaying their political views.
The accusations raised alarm across the Atlantic on Monday, sparking an investigation into the firm, Cambridge Analytica, by the United Kingdom's Information Commissioner's Office. In the U.S., Sen. Ron Wyden (D-Ore.) sent a letter asking Facebook Chief Executive Mark Zuckerberg whether the social media giant was aware of other data violations on its platform, and why it failed to take action sooner.
The controversy drove Facebook's stock price down nearly 7% on Monday, suggesting that investors are feeling skittish about the regulatory liabilities of a company that has spent the last year dogged by questions of fake news and Russian propaganda.
The scope of Facebook's problems ballooned after Christopher Wylie, a political strategist who used to work for Cambridge Analytica, alleged on NBC's "Today" show Monday that the firm believed that if it could "capture every channel of information around a person and then inject content around them, you can change their perception of what's actually happening."
By mining Facebook user data, Wylie said, the company could tailor the ads and articles individual users would see — a practice he calls "informational dominance."
In a video secretly recorded by Britain's Channel 4, Mark Turnbull, managing director of Cambridge Analytica's political division, suggests users targeted by the firm wouldn't know their online experience was being manipulated.
"We just put information into the bloodstream of the internet … and then watch it grow, give it a little push every now and again … like a remote control," he said. "It has to happen without anyone thinking, 'that's propaganda,' because the moment you think 'that's propaganda,' the next question is, 'who's put that out?'"
Turnbull, according to Channel 4, also bragged about the firm's practice of recording politicians in compromising situations with bribes and sex workers.
In a statement sent to The Times, Cambridge Analytica accused Channel 4 of entrapment and rejected the allegations made in the report. In a separate statement, also issued Monday, the firm said it did not carry out "personality targeted advertising" for President Trump's campaign.
The company obtained the Facebook data linked to 50 million accounts through a Cambridge University psychology professor who had permission to gather information on users of the social media platform, but who violated Facebook guidelines by passing that information to a third party for commercial purposes. Although Cambridge Analytica said in a news release over the weekend that it deleted the data as soon as it learned it had broken Facebook's rules, Wylie alleged that the firm continued to use the information.
What's worrisome about Cambridge's alleged practice, say social media and psychology experts, is that it works on even the most rational of people.
"Attribution theory teaches us that if you hear the same thing from multiple sources, then you start believing that it might be true even if you originally questioned it," said Karen North, a social media professor at USC who has also studied psychology.
In Cambridge Analytica's case, Wylie on Monday accused the firm of going beyond simply serving targeted ads to people on Facebook. He alleged that the firm "works on creating a web of disinformation" so that unwitting consumers are confronted with the same lies and false stories both on and off Facebook.
"Even if you thought it was just one biased person or one paid ad, when you start to see it everywhere, you start thinking there's a critical mass of people or experts that buy into the same position," North said. "You start to believe there must be a groundswell of support for it."
The ability to target ads at individuals isn't unique to Facebook. But what makes the social media giant's role profound is the breadth and depth of information it collects and the sheer number of people who use the service. Last year 67% of Americans told Pew Research that they get at least some of their news on social media. In 2016, 64% of those who got their news from social media got it from only one source — most commonly Facebook.
Since the 2012 presidential campaign, Facebook has been the "number one destination" for digital media strategists looking to influence politics, according to Laura Olin, a digital strategist who ran social media strategy for former President Obama's reelection campaign.
Prior to that election, campaigns spread their focus among Facebook, Twitter and traditional media outlets, she said. But in 2012, three things became clear:
People were spending more of their online time on Facebook than anywhere else.
It reached a broader demographic than its competitors.
Ads could be targeted more effectively on Facebook than on other platforms.
The Obama campaign that year was able to aim advertisements and messages at voters based on gender, location and existing political beliefs.
"We showed people what it could look like," said Olin, who ran Obama's Facebook pages during the campaign. "From there, people realized they could use paid advertising to reach voters in a targeted way. I feel some guilt over any potential part I might have played in that."
Digital media experts such as Olin worry that the spread of misinformation on Facebook is likely to get worse before it gets better.
In 2013, 47% of Americans used Facebook as a source for news, according to research from Pew. By 2016, that number had grown to 63%. Nearly 2.2 billion people visit Facebook's website and app every month, and its subsidiaries continue to grow, with Instagram commanding nearly a billion monthly active users, WhatsApp recording more than a billion users, and Messenger at more than 900 million users.
The social network has pledged to more than double its current team of 10,000 content moderators by the end of 2018 to keep false and misleading information in check. But with hundreds of millions of photos, videos and articles uploaded to Facebook every day, safety and security experts question whether this will be enough.
Despite the rampant misinformation on the platform, users flock to it, North said. Policing its platform will be especially hard for Facebook, she said, because the tools used for propaganda — the wealth of information it collects and its micro-targeted advertisements — are the same ones Facebook uses to generate revenue.
Gathering and selling access to that kind of granular data helped increase Facebook's advertising revenue last year by 49%. Advertising accounted for more than 98% of Facebook's total revenue in 2017, according to company filings.
And so despite Facebook's share price dropping $12.53 on Monday to $172.56 after the Cambridge Analytica allegations, multiple analysts maintained a "buy" rating on the company's stock.
"What's important to understand is that all social media platforms can be 'weaponized,' so this is not limited to Facebook by any means," analysts at Monness, Crespi, Hardt & Co. said in a note to investors.
Or, as Olin put it: "No one thinks of themselves as a fake news consumer. We all assume we're smarter than that."