The data scientist Jeff Hammerbacher, one of Facebook's early employees, lamented years ago that "the best minds of my generation are thinking about how to make people click ads." Social media companies succeeded in part by gathering as much information as possible about their users, so that ads could be micro-targeted as never before. It was suddenly much easier to reach and influence almost any kind of person — vegan weightlifters, or estate lawyers with corgis.
Americans are now learning that "almost any kind of person" also includes self-avowed racists and unapologetic anti-Semites. Facebook has attracted sudden, intense scrutiny with news that it enabled advertisers to target users who expressed interest in topics including "Jew hater" and "how to burn Jews." (Those categories were generated automatically, not conceived by Facebook employees.)
As the public reacted in dismay, the tech giant pledged that it would work to block such advertising in the future, showing deference to the norm that bigots deserve to be shunned. Its response reinforced the stigma associated with prejudice. Going forward, it will likely be marginally more difficult for malign political actors to reach coalitions of the hateful. And Facebook Chief Executive Mark Zuckerberg has repeatedly said that "there is no place for hate in our community."
But if it is now easier than ever to identify and target bigots, isn't there an opportunity to do more than merely stymie the hatemongers as they try to reach one another? Maybe this is an opportunity for the good guys to find the bad guys, invade their timelines, and convert them to anti-racism.
There is precedent for drawing people out of hate groups. The black musician Daryl Davis began to interview Ku Klux Klan members in the early 1980s, formed relationships with many, and ultimately persuaded several to give up their hoods and robes. Derek Black was an heir apparent in the white supremacist movement when his classmates at New College of Florida found out about his hateful ideology. Rather than immediately cutting ties, they engaged their classmate in a series of social encounters. Their perseverance paid off: He renounced white supremacy and began working against it.
Facebook and other tech companies aren't going to reach white supremacists as friends or acquaintances might, face to face, but there may be ways to nudge them away from extremist bigotry by exposing them to new information or different social circles. Surely the expert data scientists and product creators of Silicon Valley can investigate what works to change their online behavior.
Sure, I'm a bit wary of urging quasi-monopolistic corporations to manipulate members of the public, even in the service of A/B testing bigot conversion. Maybe deprogramming neo-Nazis is a step on a slippery slope that ends in all manner of mind control.
For those who take that view, here's an alternative that would allow tech companies to stand at a greater remove: Outside organizations could come up with messages to engage hatemongers; Facebook and others would merely let them direct micro-targeted ads at those who "like" phrases such as "how to burn Jews," perhaps free of charge. What I have in mind is a persuasion campaign akin to the old "The More You Know" public service announcements. Our era's version could be "The Less You Hate."
Both of these suggestions admittedly transgress our bygone ideal of the web as a content-neutral platform for communication, where the architects step back and let a spontaneous order emerge. Urging platform owners to interfere, directly or even at a remove, feels like a failure.
But Facebook and its competitors already marshal the most private data about our lives and intimate relationships in an effort to keep us on their sites, clicking their links. They are already heavy-handed groundskeepers and they will never fully embrace content neutrality, as their decision to block future ads targeting "Jew haters" illustrates. So long as they're interceding in some cases, it seems to me they might as well do so against the KKK, the most murderous terrorist organization in U.S. history, and other white supremacist groups.
Conor Friedersdorf is a contributing writer to Opinion and a staff writer at the Atlantic.