In the two years since the fake-news problem on Facebook and other major social media networks burst into the spotlight, the companies have taken one dramatic action after another to try to rid themselves of disinformation. At Facebook, for example, the company deleted more than 2.8 billion bogus accounts from Oct. 1, 2017, to Sept. 30, 2018; those accounts are the frequent launching pads for spam, scams and fake news. Twitter periodically announced similar crackdowns, such as its takedown of more than 10,000 accounts in late 2018 that spread false information to try to deter Democrats from voting in the midterm elections.
The task seems Sisyphean, however. As Facebook dryly noted in its latest community standards enforcement report, although it removed more fake accounts in the first half of 2018 than in the previous six months, “the increase did not have any effect on prevalence of fake accounts on Facebook overall.” Those continue to represent 1 out of every 25 to 33 sign-ups. The number is so high, the company explained, because “bad actors try to create fake accounts in large volumes automatically using scripts or bots.”
In other words, these platforms continue to be gamed to spread disinformation and manipulate their users.
That’s why it was encouraging to see Pinterest, an online social scrapbooking site, take a dramatic step to combat another damaging form of misinformation online: the spread of debunked or outright false health claims. Instead of trying to stop people from expressing potentially harmful views, it’s trying to stop itself from spreading them.
Pinterest’s action focuses on the “pins” — that is, pictures or graphics copied from other Pinterest users’ pages or other sites, often accompanied by comments — that discourage childhood vaccinations or promote fake cures for terminal or chronic diseases. The company actually barred users from posting that sort of content in 2017, but vaccine myths and fake cures kept making their way onto the site.
Last week, the company revealed that it has taken an extra step, disabling searches related to these topics. Now, searching on Pinterest for “vaccine harms” will return a blank page with the explanation, “Pins about this topic often violate our community guidelines, so we're currently unable to show search results.” The same happens on a search for “diabetes cures,” for example.
The change acknowledges how hard it is to keep a site free of potentially harmful material when the site relies on users to supply the content. Given that reality, it makes sense to try to limit what gets found and shared.
There’s a trade-off, though: Simply disabling searches to cut off misleading information can also hide factual and useful content. When Pinterest has gone this route before, on searches for pins related to suicide, eating disorders and self-harm, it redirected users to pages offering support for those who need help. The company is moving in that direction on the newly filtered searches, but that’s a work in progress.
Meanwhile, blocking search results tees up a never-ending game of whack-a-mole, as the barred search terms (“vaccine harms,” for example) get replaced by alternatives that are not barred (“vaccine risks”). That’s a problem not just for Pinterest, but for any platform trying to weed out content that violates its rules.
Pinterest is still looking for better ways to stop its platform from being used to distribute fake health news. Part of the answer lies in developing algorithms that don’t unwittingly promote that sort of thing just because it’s in high demand among a portion of the site’s users — again, a challenge common to social networks in general. Another part is to enlist other sites that are rife with health misinformation in the effort to weed it out too.