This project is funded by NSF Award #1249488 and includes REU supplements. The Therabot™ team is developing a robotic therapy support system. The goal is to provide a small robotic dog to individuals undergoing therapy, especially those with post-traumatic stress disorder (PTSD). The robot will be used by an individual to perform therapy exercises at home and to provide support during these practices, as well as during the less frequent supervised therapy sessions. Therabot is an autonomous, responsive robotic therapy dog equipped with touch sensors and the ability to record therapy sessions and a therapist's instructions for home therapy practice. It uses sound localization to orient toward the user and responsive sounds to provide support. The project entails extensive mechanical design, electronics and sensors, and programming, including artificial intelligence and machine learning, and has provided many opportunities for undergraduate research projects. For more information please visit https://mytherabot.com/.
This project involves working directly with the Starkville Police Department SWAT Team to investigate the use of robots in coordination with SWAT and related tactical teams. In addition to developing tactics for a robot, we are examining which robot form factors and features are most useful to a SWAT team. The project also includes the development of interfaces for supervised autonomy and the deployment of distraction effects such as high-intensity LED lighting and sound effects, along with the programming of supervised autonomy, the use and programming of sensor systems, and interface design and evaluation. The project's developments are tested on a regular basis with the SWAT team to evaluate usability and the officers' user experience. This research has been funded by the Army Research Laboratory's Human Research and Engineering Directorate (ARL-HRED). This research project was the focus of a professional video (shown below) through the Computing Community Consortium and the Computing Research Association.
This project is funded through NSF Award #1408672 under the Cyber-Human Systems Division of Computer and Information Science and Engineering (CISE). The project has three primary aspects.
Hybrid Cognitive Architecture – Interactive Social Engagement (ISEA) Toolkit
This aspect of the project involves the design, development, and implementation of a hybrid cognitive architecture and toolkit. This includes a review of the research literature that is expected to be published as a journal article. The architecture combines expert knowledge from forensic interviews with sensory processing of the environment and a decision-making process. Undergraduate and graduate students work on this project writing software that includes artificial intelligence, machine learning, algorithms, and other computational systems.
Eyewitness Memory with Children
In the Eyewitness Memory aspect of this project, either a Nao robot or a human interviews children about an incident they observed. The goal of the project is to determine whether children are more likely to accurately answer questions about their experiences with a robot interviewer or with a human interviewer. The children are between the ages of 8 and 12. The project gives undergraduates the opportunity to learn about collecting and analyzing data. There have been projects that included interface design for operating the robot through a Wizard-of-Oz approach, and students have also programmed the robot's behaviors to enhance the human-robot interactions.
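As a rough illustration of the Wizard-of-Oz approach mentioned above, the operator side of such a setup can be sketched as a mapping from hidden-operator key presses to pre-programmed robot behaviors, so the interview appears autonomous to the child. This is a minimal sketch only; the class, key bindings, and behavior names are hypothetical and are not the project's actual software.

```python
# Minimal Wizard-of-Oz control sketch (illustrative only; the names
# and behaviors here are hypothetical, not the project's software).
# A hidden operator presses keys that trigger canned robot behaviors,
# so the child perceives the robot as acting autonomously.

class WizardOfOzController:
    """Maps operator key presses to pre-programmed robot behaviors."""

    def __init__(self):
        # Hypothetical behavior repertoire for an interview session.
        self.key_bindings = {
            "g": "greet",          # wave and say hello
            "n": "nod",            # affirming nod while the child speaks
            "q": "next_question",  # ask the next scripted question
            "r": "repeat",         # repeat the last question
            "t": "thank",          # thank the child and end the session
        }
        self.log = []  # record of triggered behaviors for later analysis

    def press(self, key):
        """Trigger the behavior bound to `key`; return its name or None."""
        behavior = self.key_bindings.get(key)
        if behavior is None:
            return None  # ignore unbound keys
        self.log.append(behavior)
        # A real system would send `behavior` to the robot here,
        # e.g. over a network connection to the robot's middleware.
        return behavior


if __name__ == "__main__":
    woz = WizardOfOzController()
    for key in ["g", "q", "n", "t"]:
        woz.press(key)
    print(woz.log)  # ['greet', 'next_question', 'nod', 'thank']
```

Keeping a timestamped log of every triggered behavior is also what makes the interaction data analyzable afterward, which matters for studies like these.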
Gathering Information about the Experiences of Children with Bullying
This project is similar to the Eyewitness Memory aspect of the project, but in this set of studies, either the Nao robot or a human interviewer asks students about their experiences with bullying in school. The children are between the ages of 8 and 12. The undergraduate students have served as interviewers and researchers, programmed the robot's behaviors and responses, and developed the user interfaces.
Read more about Hydra here.
View the Hydra GitHub repository here.