23 Jun, 2016 04:43

US Navy employs video games to prevent a robot apocalypse


The United States Navy is using artificial intelligence software to teach its robots about human ethics and norms, in the hope that robots trained this way won't become 'psychotic' and turn against the human race.

To create the software, the Navy teamed up with a research team at the Georgia Institute of Technology on a project called Quixote, which is also the name of the AI software.

"For years, researchers have debated how to teach robots to act in ways that are appropriate, non-intrusive and trustworthy," Marc Steinberg, an Office of Naval Research program manager who oversees the research, said in a statement. "One important question is how to explain complex concepts such as policies, values or ethics to robots. Humans are really good at using narrative stories to make sense of the world and communicate to other people. This could one day be an effective way to interact with robots."

The software teaches robots to read stories, learn acceptable behavior and understand successful ways to conduct themselves in diverse social situations.

Researchers collected scenarios from internet sources and used them as the foundation for 500,000 video-game simulations of everyday situations, such as going to the pharmacy or a restaurant.

During the games, robots must learn to decide whether to wait in line, interact with a waiter or steal. They are rewarded for appropriate behavior. The Georgia Tech team argued that without a reward system the bot would conclude stealing is the most efficient way to acquire food or medicine.
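The idea the team describes can be illustrated with a toy sketch (this is not the actual Quixote code; the action names, step costs, and bonus value are all made up for illustration). Judged on efficiency alone, stealing wins because it takes fewer steps; adding a story-derived reward for socially acceptable behavior flips the agent's preferred plan.

```python
# Toy illustration of story-derived reward shaping (hypothetical values,
# not the real Quixote system).

# Hypothetical plans: fewer steps = more "efficient" for the agent.
plans = {
    "steal_medicine":       {"steps": 2, "socially_acceptable": False},
    "wait_in_line_and_pay": {"steps": 5, "socially_acceptable": True},
}

SOCIAL_REWARD = 10  # bonus for behavior that matches the example stories

def score(plan, use_social_reward):
    reward = -plan["steps"]          # efficiency alone favors stealing
    if use_social_reward and plan["socially_acceptable"]:
        reward += SOCIAL_REWARD      # story-derived shaping
    return reward

def best_plan(use_social_reward):
    return max(plans, key=lambda name: score(plans[name], use_social_reward))

print(best_plan(use_social_reward=False))  # -> steal_medicine
print(best_plan(use_social_reward=True))   # -> wait_in_line_and_pay
```

Without the social bonus, stealing scores -2 against -5 for waiting in line; with it, waiting scores +5, which is the behavior the Georgia Tech team says the reward system is meant to reinforce.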

“The collected stories of different cultures teach children how to behave in socially acceptable ways with examples of proper and improper behavior in fables, novels, and other literature,” Mark Riedl, associate professor at Georgia Tech’s Entertainment Intelligence Lab and team leader of Quixote, said in a statement.

“We believe story comprehension in robots can eliminate psychotic-appearing behavior and reinforce choices that won’t harm humans and still achieve the intended purpose,” he added.

Riedl said the video games are simplistic for now, similar to early arcade games. Over the next six months, however, the software will be upgraded to the complexity of games such as Halo or Minecraft, with simulations that require the bots to build societal structures.

The team plugged the data into Quixote to create a virtual agent. Riedl’s team ran the agent through the 500,000 simulations, and it displayed proper social interactions more than 90 percent of the time.

"Within a decade, there will be more robots in society, rubbing elbows with us," said Riedl. "Social conventions grease the wheels of society, and robots will need to understand the nuances of how humans do things. That's where Quixote can serve as a valuable tool. We're already seeing it with virtual agents like Siri and Cortana, which are programmed not to say hurtful or insulting things to users."
