Google, privacy and a sex tape

A French court ruled that Google must filter certain images of Max Mosley out of its search engine. Above, Mosley is seen in 2011 in France.
(Patrick Hertzog / AFP/Getty Images)

Max Mosley won fame in the motorsports world as the longtime chief of Formula One’s governing body. Five years ago, however, Mosley’s notoriety spread far beyond the race circuit, and not in a good way. A British tabloid released a prison-themed sadomasochistic sex tape featuring Mosley and five prostitutes, and alleged that Mosley had paid for an orgy set in an ersatz Nazi concentration camp — an accusation made all the more sensational by the fact that Mosley’s father had led Britain’s fascist party in the 1930s. A British court later ruled that the Nazi allegation was baseless and that the tabloid had violated Mosley’s privacy, but images from the sex tape continued to flourish online. Rather than attempting the daunting task of retrieving and destroying the photos, Mosley sought to render them invisible on Google. And last week, a French court decided that the law was on his side.

The Mosley case is just the latest clash between those whose interests have been threatened on a grand scale by Internet users who post copyrighted or private material unlawfully and the technology companies like Google that act as middlemen, enabling the public to find and share it. The law gives people such as Mosley tools to force websites to take down material that invades their privacy, defames them, infringes their copyrights or violates their trademarks. But those efforts often become a futile game of whack-a-mole, because the material replicates quickly and moves easily from one site to the next — or to sites in countries that don’t respond to takedown requests. So rather than endlessly battling the people who improperly put content online, aggrieved parties have asked lawmakers and the courts to shift responsibility to the companies that allow it to spread.

Mosley argued to the Tribunal de Grande Instance in Paris (the French equivalent of a U.S. District Court) that Google should alter its search engine so that users couldn’t find nine images from the sex tape that had circulated online. The company has the technology to keep copyrighted videos off YouTube and eliminate links to child pornography, but its lawyers insisted that identifying and blocking links to the nine Mosley images would require an “unprecedented new Internet censorship tool.” The court was unmoved, ordering Google to filter out links to the Mosley images in its search results for five years, starting in January. The company has said it will appeal; meanwhile, Mosley has asked a German court to order Google to block links to the full sex tape.

The exasperation that Mosley feels with Google is widely shared by people whose valuable or private content has spread online like a stack of papers caught by the wind. Not only is Google by far the most popular gateway to information online, but it also makes money off searches for illegal uploads and often supplies the advertisements that sustain the sites hosting the material.

It’s not easy to bury something after it’s been published, however. Despite Google’s prominence, it’s just one of several tools available for tracking things down on the Net, and more of them will emerge over time. Applying a restriction just to Google won’t stop people from finding a banned photo or file through other means. And filters are hardly foolproof; instead, there’s an ongoing cat-and-mouse game between companies that make content barriers and uploaders trying to circumvent them. No matter how definitive a court order may sound, the results in practice are far less certain.

Nor is trying to hide the material necessarily the best policy. Mosley had a strong case because courts in Britain and France had ruled that his privacy had been invaded. In other words, he's battling to suppress Web content that shouldn't have been created in the first place. That's not true of many other types of content that some might be eager to remove — for example, offensive tweets or misguided blog posts that later prove to be a political liability. The Mosley case shouldn't provide a template for forcing search engines to hide such things. There is a difference between enforcing privacy rights and helping people conceal an inconvenient historical record.

And even in cases such as Mosley's, it's risky to have courts mandating changes in technology. Granted, neither Google nor any of the other major search engines are neutral information finders. They all tweak their results to promote some sources and types of content over others. Still, they're able to search through the vast reaches of the Net and deliver results at eye-blink speed because no humans weigh in on the propriety of each search or the results it produces. Yet that sort of human judgment is what people such as Mosley want. In essence, they're arguing that Google needs to pay more attention to where it's leading people online.

Such intervention doesn’t work on a scale as large as the Internet. That’s not to say there’s no role for online middlemen such as Google to play in preventing their users from violating the law — witness its work to stop pirated material from being uploaded to YouTube, which it does in collaboration with copyright holders. Rather, it’s to remind the courts of the trade-offs involved when they try to make something disappear from the Net.
