A matter of common sense
A robotics researcher wants to develop robots that understand the world the way humans do.
For many people, imagining a robot that can set the table for you seems like a huge technological leap. For Monica Nicolescu, it's just a start.
Nicolescu, who is an associate professor in the Department of Computer Science and Engineering, works in the area of social robotics, which seeks to develop robots that can work alongside people and help with everyday tasks. Nicolescu envisions a robot that may be able to deliver medicine in a hospital, perform office tasks such as mail or message delivery, or help around the home. And while that may not seem that far away from a Roomba, developing a robot that can interact with humans requires solving a number of complex problems.
"We are probably going to see more and more robots used in daily life, but all of these robots that you could one day buy from the stores are probably going to be restricted to simple applications," Nicolescu said. "It is a significant challenge to have a robot that works in real environments with people."
In order to develop robots that are capable of helping humans efficiently, Nicolescu is tackling several challenging problems whose solutions could help robots navigate the human social world more effectively.
"The biggest project that we have is on intent recognition, having robots understand a person's intention, because that would make robots much better collaborators alongside people," she said.
Nicolescu's work on intent recognition attempts to help robots build models of the world that they can use to determine the probability that a person intends to take a certain action.
For example, a person reaching for a pen probably intends to pick it up, but Nicolescu hopes to develop models that robots can also use to understand why the person might be reaching for the pen: do they want to write with it, or are they going to put it away?
"There's a lot of contextual information that needs to come in to help you detect this higher-level intention, and that is the challenge that we are working on right now," she said. "How do you represent and encode this information so that the robots can make the appropriate inference?"
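The article does not describe Nicolescu's actual models, but the idea of combining contextual information with an observed action to infer a higher-level intention can be illustrated with a toy Bayesian sketch. Here, the intents, the observation label, and all the probabilities are invented purely for demonstration: context (say, an open notepad on the desk) sets the prior over intentions, and the observed motion updates it.

```python
# Illustrative sketch only: a toy Bayesian model of intent recognition.
# The intents, observation labels, and probabilities are invented for
# demonstration; they do not come from Nicolescu's system.

def infer_intent(priors, likelihoods, observation):
    """Return P(intent | observation) for each intent via Bayes' rule."""
    unnormalized = {
        intent: priors[intent] * likelihoods[intent].get(observation, 0.0)
        for intent in priors
    }
    total = sum(unnormalized.values())
    return {intent: p / total for intent, p in unnormalized.items()}

# Contextual prior: an open notepad makes "write" more plausible a priori.
priors = {"write": 0.7, "put_away": 0.3}

# How likely each intention is to produce the observed motion cue.
likelihoods = {
    "write": {"reach_for_pen": 0.9},
    "put_away": {"reach_for_pen": 0.8},
}

posterior = infer_intent(priors, likelihoods, "reach_for_pen")
```

Both intentions explain the same reach equally well at the motion level; it is the contextual prior that tips the posterior toward "write", which is exactly the role Nicolescu describes context playing.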
Another challenge Nicolescu is tackling is that of developing robots that can learn by demonstration, without requiring a complicated interface or specific programming for each new task.
"Ideally if you have a robot helper around the house, you wouldn't want to sit down and write the code for it because not everybody can program, so you'd like the robot to be able to learn that this is how you set the table in the morning," she said. "You have to show the robot how to do it once, and then the robot will know how to do it from that point forward."
However, this also requires developing a robot that is able to incorporate contextual cues or changes in the environment to adapt a specific demonstration to general rules that would still apply even if the environment changes.
"It's not just reproducing exactly what the teacher had shown you, but also generalizing from what you've seen, so you know that plates go on the table, and if the table has moved a little bit, they still go on the table."
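One common way to get this kind of generalization, sketched below under invented names and coordinates (the article does not say how Nicolescu's system represents tasks), is to encode a demonstrated placement relative to a landmark rather than in absolute coordinates. The learned rule then survives when the landmark moves.

```python
# Illustrative sketch only: storing a demonstration as an offset from a
# landmark (the table) instead of an absolute position, so "plates go on
# the table" still holds after the table moves. All values are invented.

def learn_offset(object_pos, landmark_pos):
    """Record where the object went *relative to* the landmark."""
    return (object_pos[0] - landmark_pos[0], object_pos[1] - landmark_pos[1])

def apply_rule(offset, landmark_pos):
    """Reproduce the placement relative to the landmark's current position."""
    return (landmark_pos[0] + offset[0], landmark_pos[1] + offset[1])

# Demonstration: table at (2.0, 1.0), plate placed at (2.5, 1.2).
offset = learn_offset((2.5, 1.2), (2.0, 1.0))

# Later the table has moved to (4.0, 3.0); the plate still goes on the table.
new_plate_pos = apply_rule(offset, (4.0, 3.0))
```

A robot that instead memorized the absolute position (2.5, 1.2) would set the plate on the floor once the table moved, which is the failure of rote reproduction that Nicolescu's quote warns against.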
Despite these challenges, Nicolescu is optimistic about the advances the field is making. In particular, advances in sensing technology and perception have the potential to open the door for advanced reasoning for robots.
"There are a lot of algorithms that have been developed that allow you to reason and to solve complicated problems, but there was no connection between the world the robot is actually in and those algorithms," she said. "They were so disconnected because of the perception problem. For a long time, we could not make that transition. Now we can."