IBM has created a supercomputer that will compete toe-to-toe in a game of Jeopardy! against Ken Jennings and Brad Rutter, two other (more organic) trivia machines. Aptly named “Watson” (though it is actually named after IBM’s founder, not the English Doctor), the supercomputer learns by studying past questions and parses natural-language clues, gauging its own strengths and weaknesses along the way. While Watson is much less ominous than Skynet, HAL 9000 or Windows 95, it still has the same potential to make our lives utterly insignificant.
Essentially, Watson can compute an answer in just three seconds by running the clue through thousands of algorithms, each specifically designed for a certain type of question (sports, geography, or literature, for instance), all of which draw on a massive onboard database of information. Each candidate answer, or “intermediate hypothesis,” is assigned a confidence level indicating how likely it is to be correct. A separate series of algorithms handles risk assessment, deciding whether or not to buzz in. What’s even more impressive is that Watson is completely self-contained and does not require an Internet connection.
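To make that flow concrete, here is a heavily simplified sketch of the idea: independent scorers propose candidate answers with confidences, the evidence is merged per candidate, and a final check decides whether the confidence is high enough to buzz in. The scorer names, combination rule, and threshold are all invented for illustration; this is not DeepQA’s actual code.

```python
from collections import defaultdict

# Toy version of the pipeline described above: each specialized scorer
# returns (candidate, confidence) guesses, evidence is combined per
# candidate, and a simple risk check decides whether to buzz in.

def sports_scorer(clue):
    return [("Babe Ruth", 0.15)]

def geography_scorer(clue):
    return [("Lake Victoria", 0.70), ("Lake Tanganyika", 0.25)]

def literature_scorer(clue):
    return [("Ernest Hemingway", 0.10)]

SCORERS = [sports_scorer, geography_scorer, literature_scorer]

def answer(clue, buzz_threshold=0.5):
    merged = defaultdict(float)
    for scorer in SCORERS:
        for candidate, confidence in scorer(clue):
            # Naive "noisy-OR" combination keeps confidence in [0, 1].
            merged[candidate] = 1 - (1 - merged[candidate]) * (1 - confidence)

    best, confidence = max(merged.items(), key=lambda kv: kv[1])
    # Risk assessment: only buzz in when confidence clears the threshold.
    return best, confidence, confidence >= buzz_threshold

best, conf, buzz = answer("This African lake is considered the source of the Nile")
print(best, round(conf, 2), "buzz!" if buzz else "stay quiet")
```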
It is impractical — and likely impossible — to create a structured database that correlates every possible question with an answer, so Watson depends heavily on natural language processing to determine the context of a word or phrase, which is precisely why Jeopardy! was chosen as a benchmark. Future technology (read: Starships) will require a much more human-like interaction with machines. Currently, questions have to be worded in a very specific way in order for them to be recognized by a computer. Watson is able to interpret questions as a human would, and can even recover from mis-formulated queries.
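To see why that matters, compare a toy exact-match lookup, which breaks as soon as a question is reworded, with a looser keyword-overlap search. The tiny knowledge base and questions below are invented for illustration and have nothing to do with Watson’s real data:

```python
# Toy contrast between rigid lookup and loose natural-language matching.

KNOWLEDGE = {
    "capital of france": "Paris",
    "author of moby-dick": "Herman Melville",
}

def rigid_lookup(question):
    # Fails unless the question matches a stored key word-for-word.
    return KNOWLEDGE.get(question.lower().strip("?"))

def loose_lookup(question):
    # Scores each stored key by word overlap with the question,
    # so a rephrased or slightly garbled clue can still match.
    words = set(question.lower().strip("?").split())
    best_key = max(KNOWLEDGE, key=lambda k: len(words & set(k.split())))
    return KNOWLEDGE[best_key] if words & set(best_key.split()) else None

question = "Which city is the capital of France"
print(rigid_lookup(question))  # None: the wording doesn't match exactly
print(loose_lookup(question))  # Paris: keyword overlap still finds it
```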
Text has been mined from the web and encyclopedias to give Watson an understanding of what humans would consider “common knowledge.” David Gondek, one of Watson’s engineers, mentions how difficult puns are to compute, since they rely heavily on word association.
“So, what does ‘bat’ mean, if I say ‘I got hit with a bat’? You don’t know what it means. And so what we do with Watson is we learn how to disambiguate language and learn that, in some cases, we can tell it’s a flying mammal and in other cases it’s a wooden object, and in other cases it’s a verb.”
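A crude sketch of the kind of disambiguation Gondek describes, relying on nothing more than surrounding context words, might look like the snippet below. The sense labels and clue words are made up, and Watson’s statistical machinery is of course far more sophisticated:

```python
# Minimal word-sense disambiguation sketch for the "bat" example above.
# Each sense lists clue words; the sense whose clues overlap most with
# the sentence wins.

SENSES = {
    "flying mammal": {"cave", "wings", "nocturnal", "flew", "vampire"},
    "wooden object": {"hit", "baseball", "swing", "wooden", "home", "run"},
    "verb (to bat)": {"eyelashes", "batted", "batting", "innings"},
}

def disambiguate(sentence):
    words = set(sentence.lower().replace(".", "").split())
    # Score each sense by how many of its clue words appear in the sentence.
    scores = {sense: len(words & clues) for sense, clues in SENSES.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "unknown"

print(disambiguate("I got hit with a bat"))        # wooden object
print(disambiguate("A bat flew out of the cave"))  # flying mammal
```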
Computers are very good at solving complex math equations and answering precisely defined questions. But because natural language is inherently ambiguous, such questions have always been extremely difficult for computers to answer, so the research behind Watson (dubbed “DeepQA”) will undoubtedly prove indispensable in the future.
For you hardware nuts, Watson has close to 3,000 processor cores, 15 terabytes of RAM, and roughly 80 teraFLOPS of performance. For comparison, the fastest supercomputer in the world, China’s Tianhe-1A, benchmarks at an impressive 2.566 petaFLOPS.
The IBM Jeopardy! Grand Challenge will air on February 14th. Teaser video below: