Science Daily recently featured a piece on robotics research at Brown. The robotics group has demonstrated a robot that can follow nonverbal commands from a person in a variety of environments — indoors as well as outside — regardless of lighting conditions. According to the team’s leader, Chad Jenkins, “We have created a novel system where the robot will follow you at a precise distance, where you don’t need to wear special clothing, you don’t need to be in a special environment, and you don’t need to look backward to track it.”
This achievement was presented at the 4th ACM/IEEE International Conference on Human-Robot Interaction (HRI 2009), held March 11-13, 2009, in San Diego. An accompanying paper was also presented at the conference, with Ph.D. student Matt Loper as lead author. Contributors include former Brown graduate student Nathan Koenig, now at the University of Southern California; Carnegie Mellon graduate student Sonia Chernova; and Chris Jones, a researcher with the Massachusetts-based robotics maker iRobot Corporation.
A video that shows the robot following gestures and verbal commands is available online.