In development: the next generation of artificial intelligence – An A.I. that doesn’t just perceive but imagines and reasons, too? A startup called Vicarious is working on just that.

Last year, the film Her imagined a world where a sophisticated piece of software, voiced by actress Scarlett Johansson, thought and learned a lot like humans. But a San Francisco startup called Vicarious aspires to bring a Her-like level of so-called “artificial intelligence” to the real world.

Today’s A.I. can pull off tasks like offering a Web site’s customers written advice on which shampoo to buy, but co-founders D. Scott Phoenix and Dileep George envision a day when robots powered by their software do much more. “Jobs like cleaning up the Fukushima reactor, treating Ebola patients, or building infrastructure on Mars are very dangerous for humans, and having A.I. to support us would be incredibly valuable,” Phoenix says.

Sure, those scenarios are far from becoming reality, if they ever do. But Vicarious, which is backed by $70 million from Silicon Valley luminaries like Mark Zuckerberg and Elon Musk, has made progress by taking an unusually visual approach.

In 2013, Vicarious generated buzz when it announced its A.I. could solve CAPTCHAs, the ubiquitous Web site puzzles that require visitors to transcribe a string of distorted letters—a basic test used to verify that a site’s visitor is actually human—before proceeding. Now its A.I. can not only recognize objects and shapes, it can “imagine” them, too, like clouds in the shape of an animal. Sentient, Ebola-treating robots? Not quite. But it’s a start.

JP Mangalindan
“In development: the next generation of artificial intelligence”
January 2, 2015