Facebook scans the photos and links you send on Messenger, and it reads flagged chats

[Photo: Icons for Facebook and its Messenger and Messenger Kids apps are shown on an iPhone. (Jenny Kane / Associated Press)]

Facebook Inc. scans the links and images that people send each other on Facebook Messenger, and reads chats when they’re flagged to moderators, making sure it all abides by the company’s rules governing content. If it doesn’t pass muster, it gets blocked or taken down.

The company confirmed the practice after an interview with Chief Executive Mark Zuckerberg, published this week, raised questions about Messenger’s practices and privacy. Zuckerberg told Vox’s Ezra Klein a story about receiving a phone call related to ethnic cleansing in Myanmar. Facebook had detected people trying to send sensational messages through the Messenger app, he said.

“In that case, our systems detect what’s going on,” Zuckerberg said. “We stop those messages from going through.”


Some people reacted with concern on Twitter: Was Facebook reading messages more generally? Facebook has been under scrutiny in recent weeks over how it handles users’ private data, and the revelation struck a nerve. Messenger doesn’t use the data from the scanned messages for advertising, the company said, but the policy may extend beyond what Messenger users expect.

The Menlo Park, Calif., company told Bloomberg that although Messenger conversations are private, Facebook scans them and uses the same tools to prevent abuse there that it does on the social network more generally. All content must abide by the same “community standards.” People can report posts or messages for violating those standards, which would prompt a review by the company’s “community operations” team. Automated tools can also do the work.

“For example, on Messenger, when you send a photo, our automated systems scan it using photo matching technology to detect known child exploitation imagery, or when you send a link, we scan it for malware or viruses,” a Facebook Messenger spokeswoman said in a statement. “Facebook designed these automated tools so we can rapidly stop abusive behavior on our platform.”
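Photo-matching systems of the kind described here generally work by computing a compact fingerprint (hash) of each uploaded image and checking it against a database of fingerprints of known prohibited images. Facebook has not published the details of its Messenger implementation; production systems such as Microsoft's PhotoDNA use perceptual hashes that tolerate resizing and re-compression rather than exact cryptographic hashes. As a purely illustrative sketch of the lookup step, with an exact hash standing in for a perceptual one and a made-up database:

```python
import hashlib

# Hypothetical database of fingerprints of known prohibited images
# (illustrative only; real systems use perceptual hashes, not MD5).
KNOWN_BAD_FINGERPRINTS = {
    "5d41402abc4b2a76b9719d911017c592",
}

def matches_known_image(image_bytes: bytes) -> bool:
    """Return True if this image's fingerprint is in the known-bad set."""
    fingerprint = hashlib.md5(image_bytes).hexdigest()
    return fingerprint in KNOWN_BAD_FINGERPRINTS
```

A match would flag the message for blocking or review; anything that doesn't match passes through without a human ever reading it.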

Messenger used to be part of Facebook; it was spun off into a separate application in 2014. Facebook’s other major chat app, WhatsApp, encrypts both ends of its users’ communications, so not even WhatsApp can see them — a fact that has made WhatsApp more secure for users and more difficult for investigators. Messenger also has an encryption option, but users have to turn it on.

The company updated its data policy and proposed new terms of service Wednesday to clarify that Messenger and Instagram use the same rules as Facebook does. “We better explain how we combat abuse and investigate suspicious activity, including by analyzing the content people share,” Facebook said in a blog post.

Facebook is on the defensive after revelations that private information from as many as 87 million users wound up in the hands of political ad-data firm Cambridge Analytica without their consent. Zuckerberg has agreed to testify before the House next week and is holding a conference call Wednesday afternoon to discuss changes to Facebook privacy policies.


The company is working to make its privacy policies clearer, but still ends up with gaps between what it says users agreed to and what users think they agreed to.

The Messenger scanning systems “are very similar to those that other internet companies use today,” the company said.



3:05 p.m.: This article was updated to clarify that Facebook scans images and links sent in its Messenger app and that it reads chats that are flagged for moderators’ attention. The article was also updated to say that data from as many as 87 million users wound up in the hands of Cambridge Analytica without their consent.

10:30 a.m.: This article was updated with Facebook updating its data policy and proposing new terms of service.


This article was originally published at 9:10 a.m.