If robots with grasping arms are ever going to be widely used for applications such as assisting seniors in their homes, then they’ll have to be easy to program. Unfortunately, programming most robots to grasp and retrieve an object is still a rather complex process. That’s why scientists at the Georgia Institute of Technology have developed a new system that simply requires users to click twice with a computer mouse.
Currently, in one of the most commonly used computer-based programming systems, users are shown a locked-off camera view of the robot and its surroundings, along with a 3D map of that scene. Using a series of onscreen rings and arrows, they then manually adjust each of the six degrees of freedom of the robot’s arm, lining it up so that it is hopefully able to grasp and lift the desired object.
It’s a setup clearly designed for experts, and one that involves a certain amount of trial and error.
In the new system – designed by a team led by Ph.D. student David Kent – users start by simply clicking on the object that they want retrieved, within the overhead camera view. Using the information from the 3D map, the system automatically figures out how best to position the arm in order to reach that item. It then presents the user with a choice of grasping styles, one of which they select with a second click.
From there, the robot arm goes to work, moving its gripper to the item, grasping it, then delivering it to the user.
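For readers curious how a click can become a grasp, here is a minimal Python sketch of the general idea: a 2D pixel click is back-projected into a 3D point using the camera’s depth map and a standard pinhole camera model, and a few candidate grasp approaches are then proposed around that point. All function names, the camera parameters, and the fixed set of approach directions are illustrative assumptions, not the Georgia Tech team’s actual implementation, which reasons about object geometry rather than using fixed offsets.

```python
import numpy as np

def click_to_point(depth_map, fx, fy, cx, cy, u, v):
    """Back-project a 2D pixel click (u, v) into a 3D camera-frame point,
    using the pinhole camera model and the depth measured at that pixel."""
    z = depth_map[v, u]
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.array([x, y, z])

def candidate_grasps(target, standoff=0.10):
    """Propose a few pre-grasp poses around the target point. Each entry is
    (approach_vector, pre-grasp position); a real planner would score these
    against the object's estimated geometry and the arm's reachability."""
    offsets = {
        "top":   np.array([0.0, 0.0, -standoff]),
        "left":  np.array([-standoff, 0.0, 0.0]),
        "right": np.array([standoff, 0.0, 0.0]),
    }
    return {
        name: (-off / np.linalg.norm(off), target + off)
        for name, off in offsets.items()
    }

# Demo with a synthetic flat depth map, 1 m from the camera.
depth_map = np.full((480, 640), 1.0)
point = click_to_point(depth_map, fx=500.0, fy=500.0,
                       cx=320.0, cy=240.0, u=320, v=240)
grasps = candidate_grasps(point)
```

In this toy setup a click on the image center yields the 3D point `[0, 0, 1]`, and the “top” grasp approaches it straight down the camera’s optical axis from 10 cm away. The user’s second click would simply select one of the keys in `grasps`.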
“The robot can analyze the geometry of shapes, including making assumptions about small regions where the camera can’t see, such as the back of a bottle,” says assistant professor Sonia Chernova, who advised on the project. “Our brains do this on their own – we correctly predict that the back of a bottle cap is as round as what we can see in the front. In this work, we are leveraging the robot’s ability to do the same thing to make it possible to simply tell the robot which object you want to be picked up.”
The system can be seen in use in the video below.
>> Read more by Ben Coxworth, New Atlas, April 26, 2017