Slate's Anne Applebaum stirred up a hornet's nest recently by arguing that websites should stop allowing people to post comments anonymously. Although it's not a new argument -- in fact, it's one that we at The Times have struggled with for years -- Applebaum gives it a powerful new rationale: Readers' views can be influenced more by trolls than by the piece they just read.
It's an interesting argument, but I don't think banning anonymous comments would solve the problem Applebaum is trying to solve. As I'll explain below, The Times' dalliance with Facebook-based comments is a case in point.
According to Applebaum's piece, "Multiple experiments have shown that perceptions of an article, its writer, or its subject can be profoundly shaped by anonymous online commentary, especially if it is harsh." In other words, the work that a writer does to bring facts to light can be negated by the invective spewed (OK, typed) by the peanut gallery. Facts, logic, even reasoning don't matter, just vehemence.
What's worse, Applebaum reports, special interests are deploying trolls to neutralize reporting they don't like. The comment mercenaries include Russian teens hired to praise the Kremlin while blasting its opponents (and the United States) -- a model of disinformation that politicians around the world may be putting into service. And how many times have you read comments on a hotel or restaurant review that read as if they were written by an employee of the company?
Hence the interest in banning anonymous comments, which is easier and less expensive than hiring moderators to review comments before they're published and display only the ones that add the most to the conversation.
The problem is, requiring people to comment under their real names is no guarantee that they'll behave less like trolls.
For the record, I'm not talking about readers who react strongly to a news story or opinion piece. I define a troll as someone who posts a comment that's off-topic and inflammatory, or who responds to other people's comments in a demeaning way to provoke a response. The underlying subject is immaterial to trolls, other than that it helps define the audience of people to offend.
Trolls have been most active here on emotionally charged pieces, such as those involving immigration, race and birth control. So back in 2011, the paper plugged Facebook's commenting platform into its blogs in the hope that Facebook's real names policy would weed out the trolls and raise the level of discussion.
The main effect was to reduce the volume of comments, as some readers balked at joining Facebook just for the privilege of commenting on our site. Other than that, it was hard to detect much change, if any, in the comments posted.
That's my unquantified and unscientific analysis, but that opinion is shared by Martin Beck, who was The Times' director of social media and reader engagement. One of Beck's jobs was to help moderate comments, which gave him the best seat in the house when it came to watching trolls.
According to Beck, who now writes about social media for Marketing Land, the switch to Facebook "really didn't do too much" to deter trolls, especially on the subjects that typically drew the most vitriolic comments. "The level of discourse wasn't remarkably improved," he said in an interview Wednesday.
Using Facebook for comments also made it harder to moderate them, in part because complaints about abuse went to Facebook, not The Times. So when the newspaper moved its blogs onto a different publishing platform, it replaced Facebook with a system from Viafoura. It also hired an outside company to help moderate comments.
Those changes have resulted in more comments being blocked, which has caused its own blowback. Readers have accused The Times of being thin-skinned, enforcing political correctness and censoring remarks that disagree with the paper's reporting. We've been guilty of all three, but not consistently and not as a matter of policy. Some moderators seem to equate sharp criticism with abuse, and they occasionally err on the side of not publishing remarks that might offend, rather than waiting for someone to raise a valid complaint.
In the face of such shortcomings, the easiest thing to do would be to stop allowing people to comment on stories, period. That wouldn't be a meaningful change for many readers, who already take their comments about our stories to other venues (e.g., Twitter, Facebook and their favorite blogs). And it would solve the troll problem once and for all.
Personally, I think the right solution to bad speech is more speech. I cling to the view that readers have the critical-thinking skills necessary to separate signal from noise, and they're so inured to trolls that they're not fooled or driven off by them.
Granted, that runs counter to the research cited in Applebaum's piece, as well as the oft-repeated observation that people are quick to accept information if it reinforces their preconceived notions of reality. But other research suggests that we're more open to new sources of data than the conventional wisdom holds.
What do you think? Should news sites such as this one block anonymous comments, drop them entirely, or stick with the status quo? Or do you have a genius idea for how to elevate the quality of the debate online?
Since its publication Wednesday afternoon, this post has drawn a bevy of thoughtful comments from readers, the vast majority of whom appeared to be using names other than the ones their parents bestowed upon them at birth. There was a strong consensus in the group in favor of publishing comments and permitting anonymity, even if readers have to deal with trolls.
Wrote "Ed Wood," "Ignorant and/or baiting comments may just be the price of a free society, in the way that I find it almost impossible to get away from panhandlers of one sort or another, even in the suburbs, apart from withdrawing into secured and gated areas for shopping and living. I wish it might be different, but in America people have the constitutional right to annoy other people."
"RAmeeti" voiced a common sentiment with this remark: "Without allowing the readers to comment, the paper will appear to be taking the attitude that [it] can do no wrong. Comments are needed to allow an alternative point of view, a correction to a published 'fact', etc."
Several readers specifically objected to using Facebook's commenting platform, as The Times used to do. For those readers, let me quote the great Aaron Rodgers: "R-E-L-A-X." We're not going back there, as far as I know.
Readers also suggested a number of ways to improve The Times' approach. A common wish was that fewer comments be blocked. But some also sought a limit on the number of comments people can make on posts, as well as a way to "unlike" a comment. And here's a suggestion I particularly liked, from reader "Ceptfur":
"I would like it if those [who] reference and or quote sources should have to link to their source. Since the LAT does not check for facts at least we should have the opportunity to scrutinize sources for ourselves."
(Note to Ceptfur: Asking the moderators to be fact-checkers too would turn a Herculean task into an impossible one.)
We're still grappling with how to start discussions that are as civil as they are vigorous, so please share your thoughts in the comment section, with or without your real name.