
Search engine doesn’t just find, it also computes


How long does it take to get to Saturn at, say, the speed of light?

With Wolfram Alpha, the online “computational knowledge engine” that launched Monday, the answer -- 75 minutes -- can be found in a fraction of a second.
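That figure is easy to sanity-check. Assuming an average Earth-to-Saturn distance of roughly 1.4 billion kilometers (the true distance shifts as both planets move along their orbits), a few lines of arithmetic land in the same neighborhood:

    # Back-of-the-envelope check of the light-travel time to Saturn.
    # The 1.4 billion km figure is an assumed round average; the real
    # Earth-Saturn distance varies with where both planets are in orbit.
    distance_km = 1.4e9             # assumed average Earth-Saturn distance
    light_speed_km_s = 299_792.458  # speed of light

    minutes = distance_km / light_speed_km_s / 60
    print(f"about {minutes:.0f} minutes")  # roughly 78, in line with the 75 above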

Web users can submit customized questions to the service, and Wolfram Alpha will try to work out the answer on the fly. The chance that a healthy 35-year-old woman will contract heart disease in the next 10 years? One in 167. The temperature in Washington, D.C., during the July 1976 bicentennial? An average of 74 degrees.

For questions like these, Google and Wikipedia, perhaps the two best known online reference tools, would search through vast databases of existing Web pages hoping for a match.


Not so with Wolfram Alpha. “We’re not using the things people have written down on the Web,” said Stephen Wolfram, the project’s creator and the founder of Wolfram Research Inc., which is based in Champaign, Ill. “We’re trying to use the actual corpus of human knowledge to compute specific answers.”

To do that, Wolfram and his team of human curators have equipped their system with a wide array of mathematical equations, as well as 10 terabytes of data from thousands of sources: scientific journals, encyclopedias, government repositories and any other source the company feels is credible. That generally doesn’t include user-created websites.
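A toy sketch of what that pairing of curated facts and formulas might look like, using the July 1976 temperature cited above and a standard unit conversion. This illustrates the idea only; it is not Wolfram's actual design.

    # Toy illustration of "curated data plus equations": the engine stores
    # structured values from vetted sources and computes new answers from
    # them, rather than matching the question against existing web pages.
    curated_facts = {
        ("washington dc", "avg temp, july 1976", "fahrenheit"): 74,  # cited above
    }

    def to_celsius(fahrenheit):
        # A formula applied to a curated value yields an answer
        # that was never stored anywhere.
        return (fahrenheit - 32) * 5.0 / 9.0

    temp_f = curated_facts[("washington dc", "avg temp, july 1976", "fahrenheit")]
    print(f"{temp_f} F is about {to_celsius(temp_f):.0f} C")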

How much data is 10 terabytes? Ask Wolfram Alpha: It’ll tell you that’s about half the text-based content held by the Library of Congress.

And there’s more to come.

Adding more data and computational capability is an endless process, Wolfram said. “The main thing we have to do is to work with experts in every possible domain.”

Whether all that specific knowledge will translate into advertising dollars remains to be seen. Some analysts are skeptical about the site’s potential to become a Google-like thoroughfare for online consumption.

Most search revenue comes from people doing commerce-related searches, said Douglas Clinton, an analyst at investment firm Piper Jaffray Cos. “You’re not going to want an answer from Wolfram Alpha’s computer about what the best digital camera is, because there’s not really an algorithmic answer to a question like that.”


As a much-hyped entrant into the knowledge search market, Wolfram Alpha has not escaped comparisons to Google and speculation about whether it could steal some of the search giant’s massive market share.

But their mission statements make it clear that the two services are not identical.

Google famously hopes to “organize the world’s information and make it universally accessible and useful.”

The focus of Wolfram Alpha, on the other hand, is to “make it possible to compute whatever can be computed about anything.”

Lofty hopes, but neither is there yet.

Wolfram Alpha can display the molecular structure of the solvent acetone. It can list recent earthquakes near most U.S. cities. And it can tell you the rate of inflation in Tanzania.

Yet it gets tripped up on a question as simple as “What time is it?”

As Wolfram himself points out, making the engine smarter is not just a matter of shoveling in more data. Even when the answer already exists in the database, the software may simply be unable to understand the question.

Ask instead, “What time is it in California,” and the engine returns the correct result: the data was there all along, but the original phrasing of the question tripped it up.

Half the battle, then, is teaching the program to parse human language so it knows what it’s being asked to do.
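A crude sketch of that parsing problem: a program that knows the world's time zones can still fail when the question never names a place. This is a keyword matcher for illustration only, not Wolfram's parser, and it assumes Python 3.9 or later for the zoneinfo module.

    # Illustration of the parsing gap: the data (time zones) is all there,
    # but "What time is it?" names no place, so a naive parser gives up.
    from datetime import datetime
    from zoneinfo import ZoneInfo  # standard library in Python 3.9+

    known_places = {"california": "America/Los_Angeles", "new york": "America/New_York"}

    def what_time(question):
        cleaned = question.lower().rstrip("?")
        for place, zone in known_places.items():
            if place in cleaned:
                return datetime.now(ZoneInfo(zone)).strftime("%I:%M %p")
        return "could not understand the question"

    print(what_time("What time is it in California?"))  # finds the place, answers
    print(what_time("What time is it?"))                # no place named, gives up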


But as rough as it may seem now, Wolfram Alpha looks to be the leading edge of a newer, smarter crop of search engines.

It’s the use of so-called semantic technologies, where computers grapple with concepts and simple learning, that may define the next generation of Web services.
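One rough example of what “semantic” means in practice: facts stored as small subject-verb-object statements that software can chain together, rather than as pages of prose. The sketch below is a simplification for illustration, not any particular product's design.

    # Minimal sketch of semantic-style facts: statements a program can
    # combine to reach a conclusion that no single statement contains.
    triples = [
        ("Saturn", "is a", "planet"),
        ("planet", "orbits", "the Sun"),
    ]

    def infer(subject):
        for s, verb, obj in triples:
            if s == subject and verb == "is a":
                # Follow the category to whatever is said about it.
                for s2, verb2, obj2 in triples:
                    if s2 == obj:
                        return f"{subject} {verb2} {obj2}"
        return None

    print(infer("Saturn"))  # -> "Saturn orbits the Sun"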

Does that mean artificial intelligence? Not quite yet, said James Hendler, a professor of computer science at New York’s Rensselaer Polytechnic Institute.

“Computers are getting very good at the sort of powerful learning that comes from recognizing patterns in very large sets of data,” he said. “But they still haven’t gotten at all good at figuring out the very general, intuitive, complex things that make us human.”

--

david.sarno@latimes.com
