Model-based Embedded and Robotic Systems Group



The overarching goal of MERS is to develop “cognitive robots”: robots that are able to think and act much like humans do. Toward this goal, we have three main thrusts in our research: goal-driven interaction with robots, natural human/robot teaming, and robotic reasoning about the environment. When combined, these research topics allow us to create cognitive robots that can be talked to like another human, can work with a team member to finish a task, can recover from many failures without assistance, and can collaborate with a human to recover from a failure that the robot cannot solve alone.

We enable these cognitive robot abilities by using model-based techniques. At the heart of these techniques are engineering models of how the robot works and models of how the robot’s environment behaves. On top of these models, we have developed algorithms that enable the robot to reason over how it believes the world works, much like humans do.

Goal-driven planning

Currently, robots are typically controlled by writing scripts that specify exact locations for the robot to move to, exact positions in which to put its arm, and where to look with a camera. Now imagine you are asking your friend to do a task for you. It’s doubtful you would tell her how to move her legs, where her arm should go, and exactly where she should look at all times. Instead, you would say something like “Please take the food off the stove,” and she will figure out how to accomplish that task. This is exactly the same behavior we are enabling in cognitive robots with our research on goal-directed programming.
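To make the contrast concrete, here is a minimal sketch of goal-directed planning in the spirit described above: instead of scripting each motion, the user states a goal and a search procedure finds an action sequence that achieves it. The domain, action names, and state facts are hypothetical and purely illustrative, not from any MERS system.

```python
from collections import deque

# Toy "take the food off the stove" domain. Each action is
# (preconditions, facts added, facts deleted) over symbolic states.
ACTIONS = {
    "go_to_stove":   ({"at_counter"}, {"at_stove"}, {"at_counter"}),
    "grasp_pot":     ({"at_stove", "hand_free"}, {"holding_pot"}, {"hand_free"}),
    "go_to_counter": ({"at_stove"}, {"at_counter"}, {"at_stove"}),
    "place_pot":     ({"at_counter", "holding_pot"},
                      {"pot_on_counter", "hand_free"}, {"holding_pot"}),
}

def plan(start, goal):
    """Breadth-first search for an action sequence achieving the goal."""
    frontier = deque([(frozenset(start), [])])
    seen = {frozenset(start)}
    while frontier:
        state, steps = frontier.popleft()
        if goal <= state:          # all goal facts hold
            return steps
        for name, (pre, add, delete) in ACTIONS.items():
            if pre <= state:       # action is applicable
                nxt = frozenset((state - delete) | add)
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append((nxt, steps + [name]))
    return None

# The user states only the goal; the planner works out the steps.
print(plan({"at_counter", "hand_free"}, {"pot_on_counter"}))
```

Here the "script" is gone entirely: changing the goal set, or the action models, changes the robot's behavior without rewriting any step-by-step instructions.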

Prime examples of our research in this area are our work on Chance Constrained Planning and the human interaction parts of the Personal Transportation System.

Natural human/robot teaming

Goal-directed programming is complemented by the ability to team with a human to accomplish a task. By coupling these goal-directed and collaboration techniques, cognitive robots can operate as equal partners on a team with humans.

We have active research in areas such as learning by demonstration, which enables a robot to learn how to perform an activity from demonstrations provided by a human, and collaborative diagnosis for the Personal Transportation System, in which the robot works with a human to repair a fault rather than relying on the human to fix the problem alone. The Robotic Manufacturing project aims for fluid interaction between people and robots in a factory setting.
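As a toy illustration of the learning-by-demonstration idea, the sketch below imitates a human teacher with a nearest-neighbor policy over recorded (state, action) pairs. The one-dimensional gripper states and action names are hypothetical, chosen only to make the idea runnable.

```python
# Hypothetical demonstrations recorded from a human teacher:
# (gripper_position, action the human took in that state).
demonstrations = [
    (0.0, "move_right"),
    (0.4, "move_right"),
    (0.8, "close_gripper"),
    (1.0, "lift"),
]

def learned_policy(state):
    """Imitate the demonstrated action whose state is closest."""
    nearest_state, action = min(demonstrations,
                                key=lambda pair: abs(pair[0] - state))
    return action

print(learned_policy(0.35))  # nearest demo is 0.4 -> "move_right"
print(learned_policy(0.95))  # nearest demo is 1.0 -> "lift"
```

Real learning-by-demonstration systems generalize far beyond lookup, but the core loop is the same: the robot acquires its behavior from human examples rather than from a hand-written program.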

Reasoning about the robot and the environment

Our last primary research thrust is giving cognitive robots the ability to reason about themselves and the environment. This research is aimed at allowing a robot to detect and compensate for unexpected behavior. For example, a component internal to the robot, such as a valve, may fail, or the robot’s environment may suddenly change when a box it is reaching for is moved. By giving robots the cognitive ability to reason about themselves and the environment, instead of relying on scripts, the robots can recover without the aid of a human operator by taking actions such as switching to a backup fluid system, moving forward to pick up the box, or some other novel, non-preprogrammed action. The Robotic Manufacturing project uses these ideas to make robust robots that can recover from failure.
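The detect-and-compensate loop above can be sketched in miniature: the robot compares its model's predictions of healthy behavior against observations, implicates the component whose reading disagrees, and selects a recovery action. The component names, readings, and recovery actions below are illustrative assumptions, not details of any real MERS system.

```python
# Healthy-state model of the robot: for each component, the reading
# we expect when it works, and a recovery action to take if it fails.
MODEL = {
    "primary_valve": ("flow_ok", "switch_to_backup_fluid_system"),
    "arm_camera":    ("box_in_view", "re_scan_workspace"),
}

def diagnose_and_recover(observations):
    """Return recovery actions for every component whose observed
    reading contradicts the healthy-state model."""
    actions = []
    for component, (expected, recovery) in MODEL.items():
        if observations.get(component) != expected:
            actions.append(recovery)
    return actions

# The primary valve reads no flow: the model implicates the valve,
# and the robot switches to the backup fluid system on its own.
obs = {"primary_valve": "no_flow", "arm_camera": "box_in_view"}
print(diagnose_and_recover(obs))
```

The key point is that the recovery comes from reasoning over a model of the robot, not from a pre-scripted "if valve fails, do X" branch enumerated by a programmer for every possible fault.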