The incorrect election results were shown Monday in a 2-day-old story posted on the pro-Trump website 70 News. A link to the site appeared at or near the top of Google's influential rankings of relevant news stories for searches on the final election results.
Google acknowledged that its search engine misfired. "In this case we clearly didn't get it right, but we are continually working to improve our algorithms," the company said in a statement.
As of Monday afternoon, the link to the 70 News story — which had the misinformation in both its headline and the body of the text — remained prominent in Google's search results.
Bad information in an online headline or at the top of a story can be particularly damaging. Roughly 53% of the people who land on a Web page stay for 15 seconds or less, according to online analytics firm Chartbeat.
Although Google rarely removes content from its search results, the Alphabet Inc.-owned company is taking steps to punish sites that manufacture falsehoods. Late Monday, it said it will prevent its lucrative digital ads from appearing on sites that "misrepresent, misstate, or conceal information." The action could give sites a bigger incentive to get things right or risk losing a valuable source of revenue.
Google's dominant search engine is the leading source of traffic to media sites, according to Chartbeat. Meanwhile, a study by the Pew Research Center found about 60% of Americans get at least some of their news from social media sites such as Facebook.
False information is nothing new on the Internet. But the problem has gained more attention in the post-mortem of a bitterly contested presidential election.
Facebook Inc. has been accused of possibly swaying the election's outcome by promoting fake news stories on its social network. Last summer, the company fired a handful of journalists who oversaw its "trending" news list and replaced them with an algorithm; fake news stories quickly began to trend.
CEO Mark Zuckerberg brushed off that criticism, calling it "crazy" last week. He elaborated in a Saturday post on Facebook in which he asserted that "more than 99% of what people see" on Facebook is authentic. He said that more must be done to block bogus information, but that determining what's blatantly wrong isn't always an easy task.
"Identifying the 'truth' is complicated," Zuckerberg wrote. "While some hoaxes can be completely debunked, a greater amount of content, including from mainstream sources, often gets the basic idea right but some details wrong or omitted."
The stories featured in the feeds of Facebook users are primarily selected by automated formulas called algorithms. Google's search results are also powered by algorithms, which the company regularly revises to thwart sites that try to artificially boost their prominence.
Incorrect information is bound to ripple across the Internet as more people rely on their phones, computers and other digital devices to read news that is picked out for them by automated programs, media analyst Ken Doctor said.
"These algorithms bring a lot of things into our lives that humans cannot do," Doctor said. "But when algorithms fail, it highlights the fact that they are not just some kind of neutral technology. They are programmed by human beings, and they have all the failings of human beings."