Robots Grip Better When They Grip Smarter
Even simple robotic grippers can perform complex tasks—so long as they are smart about using their environment as a handy aide. This, at least, is the finding of new research from Carnegie Mellon University’s Robotics Institute.
In robotics, simple grippers are typically assigned straightforward tasks, such as picking up objects and placing them somewhere. However, by making use of their surroundings—pushing an item against a table or wall, for instance—simple grippers can perform skillful maneuvers usually thought achievable only by more complex, fragile, and expensive multi-fingered artificial hands.
However, previous research on this strategy, known as “extrinsic dexterity,” often made assumptions about the way in which grippers would grasp items. This in turn required specific gripper designs or robot motions.
“Simple grippers are underrated.”
—Wenxuan Zhou, Carnegie Mellon University
In the new study, scientists used AI to overcome these limitations, applying extrinsic dexterity in more general settings to successfully grasp items of varying size, weight, shape, and surface texture.
“This research may open up new possibilities in manipulation with a simple gripper,” says study lead author Wenxuan Zhou at Carnegie Mellon University. “Potential applications include warehouse robots or housekeeping robots that help people to organize their home.”
The researchers employed reinforcement learning to train a neural network. They had the AI system attempt random actions to grasp an object, rewarding the sequences of actions that led to success. The system ultimately adopted the most successful patterns of behavior—it learned, in so many words. After first training the system in a physics simulator, they tested it on a simple robot with a pincer-like gripper.
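The training loop described above—try actions, reward success, reinforce what worked—can be illustrated with a toy example. The sketch below is not the authors' code: it uses simple tabular Q-learning on a made-up two-step "push, then grasp" task, with hypothetical states and actions, just to show how a reward for a successful grasp propagates back to the pushing action that set it up.

```python
import random

# Hypothetical toy task (not the CMU system):
#   state 0 = object flat in the bin (ungraspable from above)
#   state 1 = object pushed up against the bin wall
#   state 2 = object grasped (terminal success)
ACTIONS = ["push_to_wall", "grasp_from_side", "grasp_from_top"]

def step(state, action):
    """Toy environment dynamics: reward is given only on a successful grasp."""
    if state == 0 and action == "push_to_wall":
        return 1, 0.0, False          # object is now against the wall
    if state == 1 and action == "grasp_from_side":
        return 2, 1.0, True           # success: side grasp works
    return state, 0.0, False          # any other action achieves nothing

def train(episodes=2000, alpha=0.5, gamma=0.9, epsilon=0.2, seed=0):
    rng = random.Random(seed)
    q = {(s, a): 0.0 for s in (0, 1) for a in ACTIONS}
    for _ in range(episodes):
        state, done, steps = 0, False, 0
        while not done and steps < 10:
            # Epsilon-greedy exploration: mostly exploit, sometimes try
            # a random action, mirroring "attempt random actions".
            if rng.random() < epsilon:
                action = rng.choice(ACTIONS)
            else:
                action = max(ACTIONS, key=lambda a: q[(state, a)])
            nxt, reward, done = step(state, action)
            target = reward if done else reward + gamma * max(
                q[(nxt, a)] for a in ACTIONS)
            q[(state, action)] += alpha * (target - q[(state, action)])
            state, steps = nxt, steps + 1
    return q

q = train()
# The greedy policy learned from the Q-table: push first, then side-grasp.
policy = {s: max(ACTIONS, key=lambda a: q[(s, a)]) for s in (0, 1)}
print(policy)
```

The actual study trains a neural network on continuous robot states and actions rather than a three-entry table, but the underlying idea is the same: reward only arrives at the end, and learning assigns credit back through the multi-step push-then-grasp sequence.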
The scientists had the robot attempt to grab items in an open bin, each initially oriented so that the robot could not pick it up directly. For example, the robot might be given an object that was too wide for its gripper to grasp. The AI needed to figure out a way to push the item against the wall of the bin so the robot could then grab it from its side.
“Initially, we thought the robot might try to do something like scooping underneath the object, as humans do,” Zhou says. “However, the algorithm gave us an unexpected answer.” After nudging an item against the wall, the robot pushed its top finger against the side of the object to lever it up, “and then let the object drop on the bottom finger to grasp it.”
In experiments, Zhou and her colleagues tested their system on items such as cardboard boxes, plastic bottles, a toy purse, and a container of Cool Whip, which varied in weight, shape, and slipperiness. The simple gripper grasped these items with a 78 percent success rate.
“Simple grippers are underrated,” Zhou says. “Robots should exploit extrinsic dexterity for more skillful manipulation.”
In the future, the group hopes to generalize its findings to “a wider range of objects and scenarios,” Zhou says. “We are also interested in exploring more complex tasks with a simple gripper with extrinsic dexterity.”
The scientists detailed their findings 18 December at the Conference on Robot Learning in Auckland, New Zealand.