Robot Intent

The Robot Intent project developed and tested three interfaces for communicating a robot’s intended movements to people operating near it: a mobile Android device that displayed the next intended movement, an earpiece that announced the next intended movement with audio messages, and an LED display that showed directional arrows for the next intended direction using both visible and infrared LEDs. As a control condition for the study, participants used a Logitech gamepad controller to operate the robot manually, as is typical of robots working in close proximity to people. The project involved designing and building the LED display, designing and programming the Android interface, and setting up the audio messaging system. It was led by undergraduate researchers and resulted in a conference publication.
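The common task across all three interfaces is translating the robot’s next intended movement into a display- or audio-friendly form. A minimal sketch of that mapping is below; all names, symbols, and message strings are illustrative assumptions, not taken from the project’s actual code.

```python
# Hypothetical sketch: map a robot's next intended movement to the arrow
# shown on the LED display and the message spoken through the earpiece.
# The intent names, arrow glyphs, and phrasing are illustrative only.

INTENT_ARROWS = {
    "forward": "↑",
    "backward": "↓",
    "left": "←",
    "right": "→",
    "stop": "■",
}

def render_intent(intent: str) -> tuple[str, str]:
    """Return (LED arrow, audio message) for the next intended movement."""
    arrow = INTENT_ARROWS.get(intent, "?")
    if intent == "stop":
        message = "Robot will stop"
    elif intent in INTENT_ARROWS:
        message = f"Robot will move {intent}"
    else:
        message = "Unknown intent"
    return arrow, message
```

In practice each interface would consume the same intent message and render only its half of the pair: the LED display drives its arrow pattern from the first element, and the earpiece speaks the second.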

Intent Devices

Robot Control

The other aspect of this project was the development of control interfaces for overriding the robot’s autonomy, based on its intent messages, through supervisory control. Undergraduate and graduate researchers created three interfaces for overriding a semi-autonomous robot’s behaviors: a touchscreen Android mobile device for directing the robot’s movements, voice commands spoken into a microphone attached to the Android device, and a Kinect that captured arm gestures indicating the direction the robot should move. This work resulted in a publication for the undergraduate and graduate researchers.
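The supervisory-control pattern described above can be sketched as a simple arbiter: the robot executes its autonomous plan unless any of the operator interfaces (touchscreen, voice, or gesture) has issued an override. This is a minimal illustration under that assumption; the class and method names are hypothetical, not the project’s actual implementation.

```python
# Hypothetical sketch of supervisory override: operator commands from any
# interface (touchscreen, voice, Kinect gesture) take priority over the
# robot's autonomous plan until the operator clears the override.

class SupervisoryController:
    def __init__(self) -> None:
        self.override: str | None = None  # latest operator command, if any

    def operator_command(self, command: str) -> None:
        """Record an override from the touchscreen, voice, or gesture interface."""
        self.override = command

    def clear_override(self) -> None:
        """Return control to the robot's autonomy."""
        self.override = None

    def next_action(self, autonomous_plan: str) -> str:
        """Operator input, when present, wins over the autonomous plan."""
        return self.override if self.override is not None else autonomous_plan
```

For example, a controller with no override returns the autonomous plan unchanged; after `operator_command("stop")` it returns `"stop"` regardless of the plan, until `clear_override()` is called.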

Both of these projects support improving operations between a robot and a team of humans, such as tactical or search and rescue teams. A study of both the intent and control interfaces was conducted in a maze that participants navigated with a TurtleBot robot.

Vocal Prosody in Robotic Speech

Most HRI research concerning speech investigates speech directed from human to robot; there has been relatively little research on the reverse communication channel, speech directed from robot to human. We are conducting experiments on the Survivor Buddy robot platform to determine whether conveying emotion in robotic speech improves the quality of human-robot interactions.