It's a hard problem, but it's one Allen is eager to solve. After years of pondering these ideas abstractly, he's throwing his fortune into a new venture targeted entirely at solving the problems of machine intelligence, dubbed the Allen Institute for Artificial Intelligence, or AI2 for short. It's ambitious, like Allen's earlier projects on space flight and brain-mapping, but the initial goal is deceptively simple. Led by University of Washington professor Oren Etzioni, AI2 wants to build a computer that can pass a high school biology course. The team feeds in a textbook and gives the computer a test. So far, it's failing those tests… but it's getting a little better each time.
The key problem is knowledge representation: how to represent all the knowledge in the textbook in a way that allows the program to reason and apply that knowledge in other areas. Having the computer study biology is a way of laying the groundwork for new kinds of learning and reasoning. "How do you build a representation of knowledge that does this?" Etzioni asks. "How do you understand more and more sophisticated language that describes more and more sophisticated things? Can we generalize from biology to chemistry to mathematics?"
That also means getting a grip on the complexity of language itself. Most language doesn't offer discrete pieces of information for computers to piece through; it's full of ambiguity and implied logic. Instead of simple text commands, Etzioni envisions a world where you can ask Siri something like, "Can I carry that TV home, or should I call a cab?" That means a weight calculation, sure — but it also means calculating distance and using spatial reasoning to approximate bulkiness. Siri would have to proactively ask whether the television can fit in the trunk of a cab. Siri would have to know "that TV" refers to the television you were just looking at online, and that "carry that TV home" means a walking trip from the affiliated store to your home. Even worse, Siri would have to know that "can I" refers to a question of advisability, and not whether the trip is illegal or physically impossible.
- More Here