New R2-D2: U-M researchers working to increase robots’ intelligence
Rosie the Robot Maid in “The Jetsons” and R2-D2 in “Star Wars” are highly advanced robots that can clean, prepare meals and even send secret messages. While the robots of today have not yet reached this level of intelligence, Dmitry Berenson and Jason Corso, associate professors of electrical engineering and computer science, are working with teams of graduate students in the hopes that they one day will.
Their current research, supported by the Toyota Research Institute, involves developing algorithms that let a robot search for and find a specific object within a cluttered pile of items. They refer to the project as “Manipulating Piles of stuff.”
Berenson’s group focuses on the motion-planning and manipulation components — figuring out how to make the robot actually move the objects. Corso’s group focuses on the robot’s perception of the objects and the surrounding environment.
Rackham student Abhishek Venkataraman, who works with Corso’s team, emphasized the cooperation between the two groups.
“This is more of a collaborative effort,” he said.
The two groups run tests on two robotic arms manufactured by KUKA, a supplier of robotic hardware, in the Autonomous Robotic Lab. In these tests, they place a pile of bean bags or laundry in front of a robot and give it the information it needs to find a desired object.
This task is a significant challenge for robots, according to Berenson, who explained it is difficult for robots to reason about cluttered assortments of several objects.
“It turns out it’s much more complicated,” he said. “The reason is that when the object is by itself, you just kind of identify it. But when it’s in a pile or a stack, you have to actually move other objects out of the way first. You have to basically be able to reason about what you do when you can’t really see everything in the environment.”
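Berenson’s point — that a robot must move occluding objects aside and reason about what it cannot yet see — can be pictured as a loop that alternates between looking and manipulating. The sketch below is purely illustrative; the robot interface (`detect_objects`, `grasp`, `place_aside`) is hypothetical and not the lab’s actual software.

```python
# Hypothetical sketch of searching a cluttered pile: if the target isn't
# visible, move a visible occluder aside and look again. The `robot`
# interface here is assumed for illustration, not the researchers' code.

def find_in_pile(robot, target):
    """Search a pile for `target`, relocating occluding items as needed."""
    while True:
        visible = robot.detect_objects()      # perception step
        if target in visible:
            return robot.grasp(target)        # target found: pick it up
        if not visible:
            return None                       # pile exhausted, target absent
        # Target hidden: move one visible item out of the way, then
        # look again with the pile partially cleared.
        occluder = visible[0]
        robot.grasp(occluder)
        robot.place_aside(occluder)
```

The key idea the quote highlights is that perception and manipulation cannot be separated: each move changes what the robot can see next.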
Venkataraman contrasted how challenging these actions are for robots with how effortless they are for humans.
“What we feel as humans is so intuitive,” he said. “It’s like this is not even a task, it’s so easy. But looking at it from the perspective of a robot, this is a very complicated task. When (a robot) looks at an image, you need to isolate that this is there, then you need to make a plan. All of these are parameters you need to change.”
Dale McConachie, an Engineering Ph.D. student on Berenson’s team, noted the difficulty is compounded by the fact that objects like bean bags or pieces of laundry are deformable: they’re harder for a robot to manipulate than something hard and rigid.
“Math is really good at describing where something hard and rigid is, and if I move my head, where does it go?” he said. “We can do that to some extent with deformable objects, but it gets very computationally messy very quickly. There’s an infinite number of dimensions for something soft and squishy, so how do you describe that efficiently? Do you even need to?”
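McConachie’s contrast between rigid and soft objects comes down to how many numbers it takes to describe where an object is. A minimal sketch, with assumed figures chosen only for illustration:

```python
# Illustrative comparison (numbers assumed, not from the lab's models):
# a rigid object's state fits in six numbers, while even a coarse model
# of a soft object needs a position for every sample point.
import numpy as np

# Rigid object: 3 position coordinates + 3 orientation angles.
rigid_state = np.zeros(6)

# Deformable object: approximate a piece of cloth with a 50x50 grid of
# points, each with x, y, z coordinates. A finer grid (or the true
# continuum) only makes the state larger still.
cloth_state = np.zeros((50 * 50, 3))

print(rigid_state.size)   # 6 numbers describe the rigid object
print(cloth_state.size)   # 7500 numbers for even a coarse cloth model
```

This explosion in state size is why, as McConachie says, the math gets “computationally messy very quickly” for soft objects.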
To succeed in human environments, however, robots must overcome these challenges. Human homes, hospitals and a variety of other places are uncertain, sometimes hectic environments. Brent Griffin, an assistant research scientist in the Department of Electrical Engineering and Computer Science, also works on the team and discussed the unpredictability of human environments and the importance of developing a robot that can solve many kinds of problems.
“Robotics has a lot of success in industry because there’s a lot of predictability and certainty,” he said. “We can spend a lot of time tuning or preparing a robot for this specific problem. The thing that we’re trying to get towards is getting to the point where you can have a general applications robot and be fairly robust to a lot of uncertainty in the environment, because that’s really what is more difficult about operating in human environments.”
Griffin noted the robotics field has developed the appropriate hardware for achieving this goal. He has recently been working with a mobile robot named Fetch. According to Griffin, Fetch is just as physically capable as the robots in movies, but the software used to make it function isn’t yet up to par.
Ultimately, the researchers said, their goal is to make robots commonplace in people’s lives, particularly for the elderly and people with disabilities. Berenson believes their research can be applied to almost any real-world scenario.
“Just anywhere you encounter a complicated arrangement where you have to find the object in that arrangement is where you can apply this kind of work,” he said.