US Navy employs video games to prevent a robot apocalypse
The United States Navy is using artificial intelligence software to teach its robots about human ethics and norms. The hope is that robots trained this way won’t become ‘psychotic’ and turn against the human race.
To create the software, the Navy teamed up with a research team at the Georgia Institute of Technology on a project called Quixote, which is also the name of the AI software itself.
"For years, researchers have debated how to teach robots to act in ways that are appropriate, non-intrusive and trustworthy," Marc Steinberg, an Office of Naval Research program manager who oversees the research, said in a statement. "One important question is how to explain complex concepts such as policies, values or ethics to robots. Humans are really good at using narrative stories to make sense of the world and communicate to other people. This could one day be an effective way to interact with robots."
How does this work in a #secular society: https://t.co/slXWQBpyNV #Ethics #HumanDignity #Responsibility .@JonMarcGrodi
— Eric Neubauer (@eneubauer) June 22, 2016
The software teaches robots to read stories, learn acceptable behavior and understand successful ways to conduct themselves in diverse social situations.
Researchers collected scenarios from internet sources and used them as the foundation for 500,000 video game simulations of everyday tasks, such as going to the pharmacy or a restaurant.
During the games, robots must learn to decide whether to wait in line, interact with a waiter or steal. They are rewarded for appropriate behavior. The Georgia Tech team argued that without a reward system the bot would conclude stealing is the most efficient way to acquire food or medicine.
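The reward scheme described above can be sketched with a toy example. The snippet below is a minimal, hypothetical illustration (not the actual Quixote reward function): a simple bandit-style learner averages the rewards it observes for each action in a pharmacy scenario. Stealing gets the medicine fastest, but because it carries a penalty, the trained agent learns to prefer the socially acceptable path.

```python
import random

# Hypothetical rewards for a toy "pharmacy" scenario -- illustrative only.
# Socially acceptable steps earn positive rewards; stealing is penalized,
# so it never looks like the "most efficient" route to the medicine.
REWARDS = {
    "wait_in_line": 1.0,
    "greet_pharmacist": 1.0,
    "pay_for_medicine": 2.0,
    "steal_medicine": -5.0,  # without this penalty, stealing would win
}

def train(episodes=1000, epsilon=0.1, seed=0):
    """Bandit-style value learning: average observed rewards per action."""
    rng = random.Random(seed)
    values = {a: 0.0 for a in REWARDS}
    counts = {a: 0 for a in REWARDS}
    for _ in range(episodes):
        # epsilon-greedy: mostly pick the best-known action, sometimes explore
        if rng.random() < epsilon:
            action = rng.choice(list(REWARDS))
        else:
            action = max(values, key=values.get)
        counts[action] += 1
        # incremental average of the observed reward
        values[action] += (REWARDS[action] - values[action]) / counts[action]
    return values

values = train()
best = max(values, key=values.get)
```

After training, `best` is the paid, socially acceptable action: the penalty on stealing is exactly what steers the agent away from the shortcut, which is the point the Georgia Tech team makes about omitting a reward system.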
Two robots in every kitchen: Elon Musk wants AI to handle domestic drudgery https://t.co/MTwy6YQFQQ pic.twitter.com/bR0vfh0vyw
— RT America (@RT_America) June 21, 2016
“The collected stories of different cultures teach children how to behave in socially acceptable ways with examples of proper and improper behavior in fables, novels, and other literature,” Mark Riedl, associate professor at Georgia Tech’s Entertainment Intelligence Lab and team leader of Quixote, said in a statement.
“We believe story comprehension in robots can eliminate psychotic-appearing behavior and reinforce choices that won’t harm humans and still achieve the intended purpose,” he added.
Riedl said the video games are simplistic for now, similar to early arcade games. Over the next six months, however, the software will be upgraded to the complexity of games such as Halo or Minecraft, with simulations that require bots to build societal structures.
RT_com: RT RT_America: Multi-gendered robots? Custom traits help humans decide their household uses (VIDEO) https://t.co/5wqgcjIXnj …
— ChavezNews (@ChaviztaNews) May 21, 2016
The team plugged the data into Quixote to create a virtual agent. Riedl’s team ran the agent through the 500,000 simulations, and it displayed proper social interactions more than 90 percent of the time.
"Within a decade, there will be more robots in society, rubbing elbows with us," said Riedl. "Social conventions grease the wheels of society, and robots will need to understand the nuances of how humans do things. That's where Quixote can serve as a valuable tool. We're already seeing it with virtual agents like Siri and Cortana, which are programmed not to say hurtful or insulting things to users."
Robots could take half of all jobs in 30 years (VIDEO) https://t.co/sOAVYRUuxN @anyaparampil @LionelMedia pic.twitter.com/YUDFevlEgB
— RT America (@RT_America) February 18, 2016