With learning robots and emotional computers, artificial intelligence becomes real
The robotic cockroach was called Zeus, and it came into the world knowing only two things.
The first was that it hated light. The second was that it could move its body – though it didn't know how, or what parts it had.
Within five minutes, Zeus had learned to walk. Within 15 minutes, it could walk backward. The little robot, searching for darkness, had figured out that backing up is sometimes more efficient than making a forward turn.
Zeus’ tiny steps backward were an enormous step forward for its creator, James Crowder, one of Raytheon’s experts in the field of artificial intelligence. The company’s work in creating things that think, learn and reason includes mechanical versions of insects and octopuses, simulated emotions, cultural coaches and computerized versions of schoolteachers. And it all comes as the United States military looks for innovation in artificial intelligence as part of its plan to find new ways of pairing humans and machines.
Thinking 'at the speed of light'
Artificial intelligence is part of the Department of Defense’s “third offset” strategy, a plan to give the United States military strong advantages that would deter enemies from attacking.
Smart machines that "operate at the speed of light" could help troops make better battlefield decisions, Deputy Defense Secretary Bob Work said during a discussion of the third offset strategy at the Reagan National Defense Forum.
“So when you’re operating against a cyber-attack or an electronic warfare attack or attacks against your space architecture or missiles that are coming, screaming in at you at Mach 6,” Work said, “you’re going to have to have a learning machine that helps you solve that problem right away.”
The advent of AI could change the way vehicles are designed, how pilots fly and how battlefield information is delivered, said Paul Scharre, a former U.S. Army Ranger and now a senior fellow at the Center for a New American Security.
"You're not going to eliminate people from warfare, but there are advantages to machine intelligence to augment the capacity of warfighters – the same way Google augments our ability to process information, or a smartphone," Scharre said.
Artificial intelligence takes many forms. There’s the classic chess-playing robot. The smartphone personal assistant. Seemingly sentient characters in videogames. Self-driving cars.
And in Crowder’s case, there’s the cockroach.
He knew if he was ever going to build a machine of true artificial intelligence – “a fully thinking, reasoning, intelligent, autonomous system” – he would have to start simple.
“If I can’t do it at that level, I’m not going to do it at a C-3PO level,” he said.
Zeus came first, running on a 9-volt battery and equipped with a basic brain: three neurons in each half, and a communications hub called an artificial prefrontal cortex.
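Learning in the way Zeus did can be sketched as simple trial and error: try motions, keep the ones that make the world darker. The sketch below is hypothetical, assuming a toy one-dimensional world with a lamp at one end (the action names, world model and reward rule are all invented for illustration, not Crowder's actual design).

```python
# A toy sketch of trial-and-error learning like Zeus': the robot
# knows only its possible motor actions and that less light is better.
ACTIONS = ["left_forward", "right_forward", "left_back", "right_back"]
DELTAS = {"left_forward": 1, "right_forward": 1, "left_back": -1, "right_back": -1}

def light_level(position):
    # Invented world model: brightness grows with distance into the lit
    # region; positions at or below zero are dark.
    return max(position, 0)

def learn_to_walk(start=10, max_steps=50):
    """Reinforce whichever motions reduce the light reading."""
    prefs = {action: 0.0 for action in ACTIONS}
    position = start
    for step in range(max_steps):
        if light_level(position) == 0:
            break  # found darkness: the only goal Zeus was born with
        # First cycle through every action once, then exploit the best one.
        action = ACTIONS[step] if step < len(ACTIONS) else max(prefs, key=prefs.get)
        before = light_level(position)
        position += DELTAS[action]
        prefs[action] += before - light_level(position)  # reward = darkening
    return position, prefs
```

Starting in a bright spot, the backward actions accumulate the only positive scores, so the robot ends up preferring to back its way into the dark, much as Zeus did.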
Next came Hercules and Athena. And that’s where things really got interesting.
Crowder programmed them to avoid light, just like Zeus. But he also designed them to run on solar power. When their batteries ran low, they felt an urge.
“They have to find light and charge up, only light still hurts them,” Crowder said. “So now they have to balance the instincts of ‘light still hurts, but if I don’t find light, I die.’”
Emotions in motion
That conflict creates emotion. And emotion is essential to artificial intelligence, Crowder said, because how we feel influences everything we do.
“Our entire learning system in our human brain is tied to our emotions,” he said.
Emotions help the brain decide how to use its resources – something called cognitive economy, he said. If you’re feeling happy, you’ll respond to an event accordingly. If you’re anxious, or sad, or fearful, your brain might tell you to respond differently.
“We need the same thing in AI because no matter how robust you build the system, it has limited memory, limited processors, limited power,” Crowder said. “So I have to understand how to allocate the resources. We do the same thing in our brain. If I have 25 things to worry about and one of them is going to kill me, obviously that one gets the resources.”
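The "cognitive economy" Crowder describes can be sketched as proportional budgeting: a fixed pool of processing is divided by urgency, so a lethal threat starves routine work. The concern names and urgency scores below are made up for illustration.

```python
def allocate_attention(concerns, budget=100.0):
    """Split a fixed processing budget across concerns by urgency.

    A toy model of cognitive economy: each concern is a (name, urgency)
    pair, and resources are handed out in proportion to urgency.
    """
    total = sum(urgency for _, urgency in concerns)
    return {name: budget * urgency / total for name, urgency in concerns}
```

For example, with hypothetical worries of housekeeping (urgency 1), navigation (4) and an incoming threat (95), the threat claims 95 percent of the budget, just as the one thing that might kill you claims the brain's resources.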
Emotions such as fear and anxiety can help an artificially intelligent system survive. But other emotions are useful as well.
Affection and annoyance, for example. Which brings us to the laboratory of Bill Ferguson.
How do you get a video game to like you?
If it’s the game Ferguson helped build, you bow, point to yourself and say your name.
Ferguson, an engineer at Raytheon BBN Technologies in Cambridge, Massachusetts, helped work artificial intelligence into a training tool that teaches Americans how to approach strangers in a foreign land. The videogame-style simulator, built for the Defense Advanced Research Projects Agency, encourages making polite, friendly overtures before taking on tasks such as asking for directions.
“People in other cultures want to talk with you for a while. They want to get to know you,” Ferguson said.
In one scenario, the user is lost but has a photograph of the destination. Two locals approach, speaking only Esperanto. Using a motion sensor, the system watches and interprets the user’s body language. If the user seems pushy or rude, the characters might back away. If the user shows a little more finesse, the characters might offer a piece of fruit. Once the user has established a little goodwill, the characters will take a look at the photo and point out the way.
Behind the on-screen action is some clever programming that controls the characters’ attitude and actions.
“When you smile at them, they have an urge to smile back at you,” Ferguson said. “But they won’t smile if they’re irritated.”
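The characters' push and pull between the urge to smile and mounting irritation can be sketched as a small affect model: gestures shift a mood variable, and the mood gates the response. The gesture effects and thresholds here are invented; the article describes only the behavior.

```python
class Character:
    """Toy affect model for a game character, assuming made-up numbers."""

    def __init__(self):
        self.irritation = 0.0  # 0.0 calm ... 1.0 fed up

    def react(self, gesture):
        # Polite gestures soothe; pushy ones irritate (hypothetical effects).
        effects = {"smile": -0.2, "bow": -0.3, "interrupt": 0.4, "crowd": 0.5}
        self.irritation = min(1.0, max(0.0, self.irritation + effects.get(gesture, 0.0)))
        if self.irritation > 0.7:
            return "back_away"    # too pushy: the character withdraws
        if gesture == "smile" and self.irritation < 0.3:
            return "smile_back"   # the urge to smile back, unless irritated
        return "wait"
```

A fresh character returns a smile; one that has been crowded twice backs away from the same gesture.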
Other researchers are incorporating similar technology into Raytheon's Learning Platform. The electronic tutoring system detects when students are having trouble and adjusts its teaching style. Raytheon built the system for the military but is donating it to high schools nationwide for teaching physics and other subjects.
Brains throughout the body
Crowder has graduated from the neural system of the cockroach to that of the octopus. Octopuses have a main brain that issues broad commands like “eat” or “move,” then a separate packet of nerves in each arm that knows how to carry out the order.
So he built a machine with an octopus-like neural system.
Crowder’s goal is a network of machines that can work together through coordination by a central command unit, one most likely with a person at the helm. Think of a squadron of deep-sea minehunters that can scour the ocean floor on their own and report back to a human controller only when they find something of interest.
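The octopus-style split, in which a central brain issues only broad commands while each limb works out the motions, can be sketched as a two-level controller. The class names, commands and reflex sequences below are invented for illustration.

```python
class Arm:
    """Local controller: knows *how* to carry out a broad command."""

    def __init__(self, name):
        self.name = name

    def execute(self, command):
        # Hypothetical local reflex sequences for each broad command.
        reflexes = {"move": ["plant", "pull", "release"],
                    "eat": ["grasp", "bring_to_mouth"]}
        return [f"{self.name}:{step}" for step in reflexes[command]]

class CentralBrain:
    """Issues only broad commands, octopus-style; the arms fill in details."""

    def __init__(self, arms):
        self.arms = arms

    def command(self, order):
        return [motion for arm in self.arms for motion in arm.execute(order)]
```

The same pattern scales up to Crowder's minehunter squadron: a human-supervised command unit issues the mission, and each vehicle's local controller handles its own motion and self-preservation.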
“They have a sense of their own defense and security, but the operator is the one who’s giving the mission parameters,” Crowder said. “They don’t require a lot of care and feeding.”
To Ferguson, who has been working on artificial intelligence systems for more than 30 years, the use of machines to replicate human thought is a clear next step in doing things faster, better and smarter.
“A guy with a calculator would have run circles around a guy without a calculator 40 years ago,” he said. “It’s just a new kind of tool that’s helping human intelligence get farther.”