Seeing What Robots "See"

The eighth-grade Physics by Design class partnered with Tufts University researcher Mark Cheli, a Ph.D. student in computer science, to test a new augmented-reality app for their LEGO MINDSTORMS robots. Cheli's app lets students “see” what their robot perceives by overlaying data from the robot’s two sensors (color and distance) onto the iPad’s camera view of the robot.
The students programmed their robots to run a maze, with the goal of finding a red square and stopping on it. They tried two versions of the maze, one filled with different colored shapes and the other containing both colored shapes and three-dimensional objects. For both mazes, Cheli’s augmented-reality app gave students useful insight into what their sensors were actually reporting. For example, they could see that their color sensor was reading orange as red and purple as black. Once students understood the maze from the robot’s point of view, they could reprogram their robot and succeed more quickly.
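The stop-on-red behavior the students programmed takes only a few lines. Here is a minimal, hypothetical sketch assuming the Pybricks EV3 MicroPython API; the port assignments, speed, and wheel measurements are illustrative assumptions, not the students' actual program. The final comment shows why the sensor misread mattered: an orange tile could trigger the same stop.

    #!/usr/bin/env pybricks-micropython
    from pybricks.ev3devices import Motor, ColorSensor
    from pybricks.parameters import Port, Color
    from pybricks.robotics import DriveBase

    # Hypothetical ports and dimensions for a typical two-motor build
    left = Motor(Port.B)
    right = Motor(Port.C)
    sensor = ColorSensor(Port.S3)
    robot = DriveBase(left, right, wheel_diameter=56, axle_track=114)

    robot.drive(100, 0)                  # drive forward at 100 mm/s
    while sensor.color() != Color.RED:   # poll the color sensor
        pass                             # note: an orange tile may also read as RED
    robot.stop()                         # stop on what the sensor calls red

Seeing the sensor's live readings in the AR overlay is what let students discover that the loop above would also halt on orange.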
 
Cheli was delighted with the quality of the conversations our students had as they used the app and with the feedback he got from them afterward. The information he gained will help him improve the next version of the app. Our students gained a better sense of the ways in which computer perception differs from human perception—and some insights into software design and research.