Task Progress:
Human-robot interaction (HRI) is a discipline that investigates the factors affecting interactions between humans and robots. It is important to evaluate how the design of interfaces and command modalities affects the human’s ability to perform tasks accurately, efficiently, and effectively when working with a robot. By understanding the effects of interface design on human performance, workload, and situation awareness, interfaces can be developed to appropriately support the human in performing tasks with minimal errors and with appropriate interaction time and effort. Thus, the results of research on human-robot interfaces have direct implications for the design of robotic systems. This research project concentrates on three areas associated with interfaces and command modalities in HRI that are applicable to NASA robot systems: 1) Video Overlays, 2) Camera Views, and 3) Command Modalities.
The first study focused on video overlays and investigated how Augmented Reality (AR) symbology can be added to the human-robot interface to improve teleoperation performance. Three types of AR symbology were explored in this study: command guidance (CG), situation guidance (SG), and a combination of the two (SCG). CG symbology gives operators explicit instructions on what commands to input, whereas SG symbology gives operators implicit cues from which they can infer the input commands. The SCG combination provided operators with both explicit and implicit cues, allowing them to choose which symbology to utilize. The objective of the study was to understand how AR symbology affects the human operator’s ability to align a robot arm to a target using a joystick and to allocate attention between the symbology and external views of the world. The study evaluated the effect of type of symbology (CG and SG) on operator task performance and attention allocation during teleoperation of a robot arm.
Planned studies for the near future:
The second study will expand on the first by evaluating the effect of type of navigational guidance (CG and SG) on operator task performance and attention allocation during teleoperation of a robot arm through uplinked, manually entered commands. Although this study complements the first study on navigational guidance with hand controllers, it is a separate investigation due to the distinction in intended operators (i.e., crewmembers versus ground operators).
A third study will examine superimposed and integrated overlays for teleoperation of a mobile robot using a hand controller. When AR is superimposed on the external world, it appears fixed to the display and internal to the operator’s workstation. Unlike superimposed overlays, integrated overlays often appear as three-dimensional objects and move as if they were part of the external world. Studies conducted in the aviation domain show that integrated overlays can improve situation awareness and reduce deviation from the optimal path. The purpose of this study is to investigate whether these results apply to HRI tasks, such as navigation with a mobile robot.
HRP Gaps:
This HRI research contributes to closure of HRP gaps by providing information on how display and control characteristics (those related to guidance, feedback, and command modalities) affect robot operator performance. The overarching goals are to improve interface usability, reduce operator error, and develop candidate guidelines for the design of effective human-robot interfaces.
Abstracts for Journals and Proceedings
Rochlis J, Sandor A, Chang ML, Pace J. "Human-Robot Interaction Directed Research Project." 2013 NASA Human Research Program Investigators’ Workshop, Galveston, TX, February 12-14, 2013.