Google awards $1.5mn to research into robots that learn by doing (VIDEO)
US researchers teaching robots to develop their abilities and usefulness through tactile learning rather than specific programming have received a $1.5 million funding windfall from Google.
While the field of robotics has made huge strides, many machines still struggle with basic human functions such as grasping handles, picking up objects and walking up and down stairs.
Scientists from Carnegie Mellon University’s Robotics Institute are working to improve robots’ basic motor functions through ‘kinesthetic learning’, that is, learning by doing.
The project, led by Abhinav Gupta, centers on robots building large datasets of physical experience from which they learn to recognize and perform tasks. Video from the team shows one robot spending 700 hours and making 50,000 attempts before learning how to grasp objects reliably.
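To illustrate the trial-and-error idea, here is a minimal, heavily simplified sketch in Python. It is not the team’s actual method: all names (GraspModel, try_grasp) are hypothetical, and a toy success-rate table stands in for the camera images and neural networks a real system would use. The point is only the loop itself: attempt a grasp, record whether it worked, and let thousands of self-labeled attempts gradually reveal what works.

```python
# Toy sketch of trial-and-error grasp learning. All names are hypothetical;
# a real robot would learn from camera images, not a success-rate table.
import random

class GraspModel:
    """Tracks per-angle success rates across sampled grasp attempts."""
    def __init__(self, n_angles=18):
        self.successes = [0] * n_angles
        self.attempts = [1] * n_angles  # start at 1 to avoid division by zero

    def update(self, angle, succeeded):
        self.attempts[angle] += 1
        self.successes[angle] += int(succeeded)

    def best_angle(self):
        rates = [s / a for s, a in zip(self.successes, self.attempts)]
        return rates.index(max(rates))

def try_grasp(angle, true_best=7):
    """Stand-in for the robot: grasps nearer the 'right' angle succeed more often."""
    return random.random() < max(0.05, 1.0 - 0.2 * abs(angle - true_best))

model = GraspModel()
for trial in range(50_000):               # mirrors the ~50,000 attempts cited above
    angle = random.randrange(18)          # explore by sampling a grasp angle
    model.update(angle, try_grasp(angle)) # each attempt labels itself: success or failure

print("learned best grasp angle:", model.best_angle())
```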
Gupta and his team are now set to benefit from a $1.5 million Google “Focused Research Award,” according to Carnegie Mellon University. The money is to be used to advance their goal of teaching “visual representations via physical interactions.”
Gupta said physical manipulation abilities in robots still “pale in comparison to those of a two-year-old.”
Falling costs in robotics will allow researchers to use more machines and collect huge amounts of data, he explained. The team will use “dozens of different robots, including one- and two-armed robots and even drones, to learn about the world and actions that can be performed,” Gupta said.
Lerrel Pinto, a PhD student working on the project, suggested that the focus on learning by doing is rooted in how humans themselves develop visual understanding.
“Psychological studies have shown that if people can’t affect what they see, their visual understanding of that scene is limited,” Pinto said. “Interaction with the real world exposes a lot of visual dynamics.”
Much of the university’s research to date has been conducted using Baxter, a two-armed robot made by Rethink Robotics that has featured in similar research before.
Computer scientist Stefanie Tellex of Brown University has used the somewhat freaky-looking bot for tasks such as setting a dinner table and organizing small objects. Describing the challenge of turning a raw camera image into action, she told NPR: “It somehow has to look at that matrix of numbers, and then run a program to figure out where the object is and what it is and where it should put its gripper in order to not be so pathetic and actually pick it up.”