Column: Hate on the Web: Does banning neo-Nazi websites raise free-speech issues for the rest of us?


Many Internet users cheered when the Daily Stormer, a website openly devoted to white supremacy and neo-Nazism, was sent packing by its Web domain host, GoDaddy, following last weekend’s racist violence in Charlottesville, Va.

GoDaddy’s action, which turned the Daily Stormer into a site without a host, seemed like a beacon of effective response to an era of rising hate speech online — years of vicious attacks that had driven many women, blacks, LGBTQ individuals and others off such popular platforms as Twitter and Facebook, and seemed only to have intensified with the rise of Donald Trump.

But a counter-narrative already has emerged: Is this response really a good thing?

The Daily Stormer's forced march in search of an Internet home began Sunday, when GoDaddy gave the site 24 hours to find another domain host service. GoDaddy provided the link between the site's Internet protocol address, which is a series of numbers, and its URL, which is what users typed into their browsers to reach it.
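The name-to-number mapping that made GoDaddy a choke point can be sketched in a few lines of Python. This is a generic illustration of how domain lookups work, not anything specific to GoDaddy's systems; "localhost" is used so the lookup succeeds without network access.

```python
import socket

# DNS ties a human-readable domain name to the numeric Internet
# protocol address that computers use to route traffic. A registrar
# or domain host publishes that mapping; when it withdraws service,
# the name simply stops resolving, even if the server still exists.
def resolve(hostname: str) -> str:
    """Return the IPv4 address a domain name currently points to."""
    return socket.gethostbyname(hostname)

# "localhost" resolves locally, with no network required.
print(resolve("localhost"))  # 127.0.0.1
```

That failure to resolve is why a site dropped by its host becomes unreachable by name: users typing the URL get an error, because nothing answers the lookup.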



The Daily Stormer fetched up the next day at Google’s hosting service, which promptly sent it packing. Later the site appeared to be using a Russian hosting service, but by late in the week it seemed to be inaccessible anywhere on the Web.

Meanwhile, other online services said they would look askance at any potential clients associated with white supremacist or neo-Nazi activities. After Charlottesville, PayPal issued a statement emphasizing that its Acceptable Use Policy bars accepting “payments or donations for activities that promote hate, violence or racial intolerance,” including “organizations that advocate racist views, such as the KKK, white supremacist groups or Nazi groups.” As my colleague Tracey Lien reported, Apple shut off Apple Pay services to several websites selling Nazi or white supremacist products.

Cloudflare, which speeds up and protects websites from hackers, terminated the Daily Stormer’s account after initially resisting calls to do so. As its co-founder and Chief Executive Matthew Prince explained in a blog post, Cloudflare took that action not because of the content per se, but because he was irritated that the Daily Stormer was bragging “that we were secretly supporters of their ideology.”

These actions turn an uncomfortable spotlight on the power that Internet gatekeepers have to deny services to websites they dislike.


There’s no question that GoDaddy, Google, and the other services have the legal right to refuse to do business with anyone they wish. Their actions don’t implicate the 1st Amendment’s guarantee of freedom of speech, since the amendment applies only to government agencies.

“This part of the Charlottesville story makes people think about who controls speech on the Internet,” says Daphne Keller of Stanford Law School’s Center for Internet and Society. “We don’t have 1st Amendment rights to stop private companies from shutting down our speech, and most of the Internet is run by private companies. Most of us want some intermediaries to play that role — when we go on Twitter, we don’t want to be barraged with obscenities and on Facebook we don’t want to see racism. But it’s kind of scary that all these other companies can also be shutting down speech willy-nilly, and that’s certainly their right under the law.”

By almost any standard, the Daily Stormer is an easy target for total eradication from the Web. Brimming with unapologetically bigoted and anti-Semitic content, the site is named after “Der Stürmer,” a Nazi propaganda newspaper that promoted violence against Jews during the Third Reich. Its proprietor, Julius Streicher, was convicted of crimes against humanity at Nuremberg and hanged. Wherever one chooses to draw the line separating appropriate discourse from hate speech, the Daily Stormer lies outside the boundaries of civilization. The Southern Poverty Law Center, a leading hate-group tracker, has labeled it “the top hate site in America.”

Nor is it hard to argue that the website crosses the line from mere speech to incitement. That’s the rationale cited by GoDaddy, which says it generally does “not take action on complaints that would constitute censorship of content,” except “where a site … crosses over to promoting, encouraging, or otherwise engaging in violence against any person.” The Daily Stormer, the firm says, “crossed the line and encouraged and promoted violence.”

Other Internet services came to the same conclusion. “Charlottesville is a flashpoint,” says Brittan Heller, the Anti-Defamation League’s director of technology and society and its liaison to Silicon Valley. “The reason that companies feel they can take action now, where they were uncertain earlier, is that with this event the connection between hate speech and real-world violence is quite obvious.”

Companies may have been reluctant to play Internet police in the past in part because vetting every website or utterance online could be a superhuman task. Distinguishing “hate speech” from political commentary can be daunting, Heller says, because much of it consists of dog whistles audible to a site’s followers but not outsiders. She mentions memes such as Pepe the Frog, an originally innocent cartoon character that was adopted by neo-Nazi groups despite the objections of its creator, and the triple parentheses that white supremacists and neo-Nazis placed around the Twitter handles of users to mark them as ostensibly Jewish. The parentheses eventually were co-opted as a badge of solidarity among Jewish and progressive Twitter users.


As hate speech has proliferated on the Internet, especially over the last year, companies have been seeking out more tools to fight it. “We have now crossed the Rubicon,” Heller told me. “They feel they must do something because their users and the public are demanding it.”

But what if Internet gatekeepers begin to shun any potentially controversial speech to avoid disturbing some users or groups? “The standard 1st Amendment mantra is that we don’t need to worry about popular content,” says Eric Goldman, a cyberlaw expert at Santa Clara University law school. “It’s the unpopular content we need to fight for.” Almost anything in public discourse will be controversial to somebody: “If we decide we can suppress content because of its unpopularity, then no content is safe,” he says.

Some experts argue that the risk that any but the most noxious sources will entirely lose access to the Internet is vanishingly small, thanks to the sheer multiplicity of service providers. “What makes this not terribly troubling,” says Eugene Volokh of UCLA law school and a prominent blogger on 1st Amendment issues, “is that there are a lot of domain registrars and hosting services out there, and it’s pretty easy to switch.”

That’s a virtue, but it may eventually prove cold comfort. Consolidation among Internet services is proceeding apace, with little interference from regulators. Today’s multiplicity may morph into a small number of dominant providers and a few inefficient little ones.

In his post defending his shutdown of the Daily Stormer, Cloudflare’s Prince listed 14 categories of Internet service providers that could be choke points limiting someone’s access to the Internet or closing it off entirely, for reasons of their own. They include publishing platforms such as Facebook and WordPress, infrastructure providers such as Amazon Web Services, domain registrars such as GoDaddy, and search engines such as Google.

“Any of the above could regulate content online,” Prince observed. “The question is: which of them should?”


No one has come up with a surefire way to distinguish all hate speech from more innocuous expression online and act against it without being too heavy-handed. In 2014, Heller’s department at the ADL worked with the tech community to develop a roster of best practices for companies responding to cyberhate on their platforms. They included terms of service with clear definitions of hateful content, “user-friendly mechanisms and procedures” for reporting it, and consistent enforcement and sanctions.

But she acknowledges that many have fallen short in execution. “You need both strong and transparent terms of service and effective and transparent mechanisms for enforcement,” she says. “Sometimes there’s a gap between having the ambition for a responsible response and having the bandwidth to enforce it.” Up to now, moreover, companies have relied on their users to report hate online rather than proactively looking for it.

The response by GoDaddy, Google, Cloudflare and other companies suggests that Charlottesville may have changed that, at least in the near term and for gross violations. “When people are using their platforms to plan violence, incite violence, and celebrate violence, that’s different,” Heller says. “That’s the Rubicon.”

Keep up to date with Michael Hiltzik. Follow @hiltzikm on Twitter, see his Facebook page, or email
