A new generation of robots can sense what a resident might do or need and then provide assistance.

Investigators at Cornell University’s Personal Robotics Lab are developing robots that can accurately predict and respond to human actions. Such machines could offer a new level of comfort to people who need help with their daily activities.

One robot under development can perform tasks such as filling a cup or holding a door open. What sets it apart from its predecessors, however, is that it is not merely programmed to carry out specific tasks. Rather, it can respond to events as they unfold, in a process that looks eerily similar to the way nursing aides behave.

But even the most optimistic researchers admit that developing helper robots can be maddeningly slow. Actions such as balancing a plate, opening a door or pouring a drink may seem basic, but turn out to be remarkably complex.

“Many of the tasks that look simple to us are hard for robots,” said Ashutosh Saxena, Ph.D., an associate professor of computer science at Cornell. “Not being clumsy, which most robots are, is a hard problem to solve and a hard thing to teach.”

To address this, researchers are building models of human activity from data, capturing dozens of three-dimensional videos of people performing common activities. Using technology similar to that seen in computer-animated films, the people in these videos are represented as skeletons built of lines and joints.
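The skeleton idea above can be sketched in code: each video frame reduces to a set of named joints with 3D coordinates, from which features such as joint angles can be computed over time. This is a minimal illustration only; the joint names, coordinates, and the angle feature are assumptions for demonstration, not the lab’s actual data format or method.

```python
import math

# One skeleton frame: each joint name maps to an (x, y, z) position in meters.
# These names and values are hypothetical, chosen only to illustrate the idea.
frame = {
    "shoulder": (0.0, 1.4, 0.0),
    "elbow":    (0.3, 1.1, 0.0),
    "wrist":    (0.6, 1.4, 0.0),
}

def joint_angle(frame, a, b, c):
    """Angle in degrees at joint b, formed by the segments b->a and b->c."""
    ax, ay, az = frame[a]
    bx, by, bz = frame[b]
    cx, cy, cz = frame[c]
    u = (ax - bx, ay - by, az - bz)          # vector from elbow to shoulder
    v = (cx - bx, cy - by, cz - bz)          # vector from elbow to wrist
    dot = sum(ui * vi for ui, vi in zip(u, v))
    nu = math.sqrt(sum(ui * ui for ui in u))
    nv = math.sqrt(sum(vi * vi for vi in v))
    return math.degrees(math.acos(dot / (nu * nv)))

# Tracking a feature like the elbow angle across frames is one way an
# activity model could distinguish, say, pouring from reaching.
print(joint_angle(frame, "shoulder", "elbow", "wrist"))
```

A real system would extract many such features per frame from depth-camera video and feed the sequences to a learned model of activity.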

Ultimately, this could lead to robots with the ability to assist unstable residents, or perhaps help them up after a fall.

Investigators said the goal is to make it possible for robots to operate intelligently in common human environments such as long-term care settings. 

Investigators will give a progress report at the Neural Information Processing Systems conference this month. The U.S. Army Research Office, Microsoft and the National Science Foundation have provided funding.