The NASA Task Book

Project Title:  Assessment, Evaluation, and Development of Methodologies, Metrics and Tools Available for Use in Multi-agent (Human and Robotic) Teaming
Fiscal Year: FY 2015 
Division: Human Research 
Research Discipline/Element:
HRP SHFH:Space Human Factors & Habitability (archival in 2017)
Start Date: 09/07/2012  
End Date: 12/31/2014  
Task Last Updated: 06/16/2015 
Principal Investigator/Affiliation:   Sandor, Aniko  Ph.D. / Lockheed-Martin/NASA Johnson Space Center 
Address: 2101 NASA Parkway 
Mail Code: C46 
Houston, TX 77058 
Email: Aniko.Sandor-1@nasa.gov 
Phone: 281.483.9726  
Congressional District: 22 
Web:  
Organization Type: NASA CENTER 
Organization Name: Lockheed-Martin/NASA Johnson Space Center 
Joint Agency:  
Comments:  
Co-Investigator(s)
Affiliation: 
Cross, Ernest  Ph.D. Lockheed Martin/NASA Johnson Space Center 
Chang, Mai Lee  NASA Johnson Space Center 
Project Information: Grant/Contract No. Directed Research 
Responsible Center: NASA JSC 
Grant Monitor: Whitmore, Mihriban  
Center Contact: 281-244-1004 
mihriban.whitmore-1@nasa.gov 
Unique ID: 9450 
Solicitation / Funding Source: Directed Research 
Grant/Contract No.: Directed Research 
Project Type: GROUND 
Flight Program:  
TechPort: No 
No. of Post Docs:
No. of PhD Candidates:
No. of Master's Candidates:
No. of Bachelor's Candidates:
No. of PhD Degrees:
No. of Master's Degrees:
No. of Bachelor's Degrees:
Human Research Program Elements: (1) SHFH:Space Human Factors & Habitability (archival in 2017)
Human Research Program Risks: (1) HSIA:Risk of Adverse Outcomes Due to Inadequate Human Systems Integration Architecture
Human Research Program Gaps: (1) HSIA-701:We need to determine how human-automation-robotic systems can be optimized for effective enhancement and monitoring of crew capabilities, health, and performance, during increasingly earth-independent, future exploration missions (including in-mission and at landing).
Flight Assignment/Project Notes: NOTE: Project ended 12/31/2014 per E. Connell/JSC HRP (Ed., 6/15/15)

Task Description: The study of human-robot interaction (HRI) involves understanding and shaping the interactions between humans and robots (Goodrich & Schultz, 2007). It is important to evaluate how the design of interfaces and command modalities affects the human's ability to perform tasks accurately, efficiently, and effectively (Crandall, Goodrich, Olsen Jr., & Nielsen, 2005). Many NASA robot systems are teleoperated. Developing safe, reliable, and effective human-robot interfaces for teleoperation involves providing the information necessary to support operator task performance. For robot navigation tasks, in which the operator moves a robot through space or commands individual robot segments, the operator needs to understand the current and desired state of the robot and to have a command modality that is compatible with the task.

In Fiscal Year 2011 (FY11), preparatory work was completed in the form of literature reviews; observations of NASA robot systems; interviews with NASA robotic operators and trainers; and a space HRI workshop. These activities resulted in the selection of three research areas that are the focus of the proposed work. The three research areas are: Video Overlays, Camera Views, and Command Modalities.

Studies proposed in this Directed Research Project in the area of Video Overlays consider two factors in the implementation of augmented reality (AR) for operator displays during teleoperation. The first factor is the type of navigational guidance provided by AR symbology. Participants' performance during teleoperation of a robot arm will be compared when they are provided with command-guidance symbology (i.e., symbology that tells the operator which commands to make) or situation-guidance symbology (i.e., natural cues from which the operator can infer which commands to make). The second factor is the effect of overlays that are either superimposed on or integrated into the external view of the world. A study is proposed that compares the effects of superimposed and integrated overlays on operator task performance during teleoperated driving tasks.
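
To make the command-guidance/situation-guidance distinction concrete, the sketch below contrasts the two cue types for a simplified arm-alignment task. This is an illustrative sketch only; the poses, function names, and cue formats are hypothetical and are not drawn from the project's actual displays.

```python
# Hypothetical sketch: command guidance tells the operator what input to
# make; situation guidance shows the situation and lets the operator infer it.
from dataclasses import dataclass
import math

@dataclass
class Pose2D:
    x: float   # meters
    y: float   # meters

def command_guidance(effector: Pose2D, target: Pose2D) -> str:
    """Explicit cue: tell the operator which joystick deflection to make."""
    dx, dy = target.x - effector.x, target.y - effector.y
    if abs(dx) >= abs(dy):
        return "DEFLECT RIGHT" if dx > 0 else "DEFLECT LEFT"
    return "DEFLECT UP" if dy > 0 else "DEFLECT DOWN"

def situation_guidance(effector: Pose2D, target: Pose2D) -> dict:
    """Implicit cue: draw the target relative to the effector so the
    operator can infer the command."""
    dx, dy = target.x - effector.x, target.y - effector.y
    return {"marker_offset_m": (dx, dy), "range_m": math.hypot(dx, dy)}

if __name__ == "__main__":
    eff, tgt = Pose2D(0.0, 0.0), Pose2D(0.4, 0.1)
    print(command_guidance(eff, tgt))    # DEFLECT RIGHT
    print(situation_guidance(eff, tgt))  # marker 0.4 m right, 0.1 m up
```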

Studies proposed in the area of Camera Views investigate the inclusion or exclusion of the robot within the video feed and the camera frame of reference. One study will investigate how including or excluding the robot's chassis in the video feed presented to operators affects path-following and maze-traversal task performance. Another study will investigate the effects of adding an exocentric camera frame of reference to egocentric frames of reference on operator task performance for these same tasks.
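
The egocentric/exocentric distinction reduces to which frame the camera view is expressed in. The sketch below, with hypothetical geometry and function names, expresses the same world point in a robot-mounted (egocentric) camera frame and in a fixed external (exocentric) camera frame.

```python
# Hypothetical frame-transform sketch for the two camera views compared in
# the Camera Views studies.
import numpy as np

def rot_z(theta: float) -> np.ndarray:
    """Rotation about the vertical axis by theta radians."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def world_to_egocentric(p_world, robot_pos, robot_yaw):
    """Express a world point in the robot-mounted (egocentric) camera frame."""
    return rot_z(robot_yaw).T @ (np.asarray(p_world) - np.asarray(robot_pos))

def world_to_exocentric(p_world, cam_pos, cam_yaw):
    """Express the same point in a fixed external (exocentric) camera frame."""
    return rot_z(cam_yaw).T @ (np.asarray(p_world) - np.asarray(cam_pos))

obstacle = [2.0, 1.0, 0.0]
# Seen from the robot itself, which sits at (1, 1) facing +y:
print(world_to_egocentric(obstacle, robot_pos=[1.0, 1.0, 0.0], robot_yaw=np.pi / 2))
# Seen from a fixed overhead camera at (0, 5, 3):
print(world_to_exocentric(obstacle, cam_pos=[0.0, 5.0, 3.0], cam_yaw=0.0))
```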

Lastly, studies in the area of Command Modalities will systematically build and evaluate gesture and voice vocabularies for commanding a ground-based mobile robot. The first in this series of studies will have participants produce robot commands for a set of critical control functions. The characteristics of the commands will be analyzed. In a second phase of this study, the strength of association between command and voice/gesture inputs will be evaluated. The next two studies will test the learnability and memorability of the developed vocabularies in the context of a representative task.
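
The report does not specify how the strength of association will be quantified; one measure commonly used in elicitation studies of this kind is the agreement score of Wobbrock et al. (2005), sketched below for a hypothetical set of gesture proposals.

```python
# Agreement score for one command (referent): group identical proposals and
# sum the squared group proportions. 1.0 means every participant proposed
# the same input; lower values mean more disagreement. The data are invented.
from collections import Counter

def agreement_score(proposals: list[str]) -> float:
    n = len(proposals)
    return sum((count / n) ** 2 for count in Counter(proposals).values())

# e.g., eight participants propose a gesture for a "stop" command:
stop_gestures = ["raised palm"] * 5 + ["crossed arms"] * 2 + ["wave"]
print(f"agreement for 'stop': {agreement_score(stop_gestures):.2f}")  # 0.47
```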

Research Impact/Earth Benefits: The video overlays developed by this research project can be applied to any type of robotic teleoperation situation.

The results of the camera view studies will be applicable to mobile robots such as rovers and search and rescue robots.

Finally, the method applied and tested for developing a gesture and voice command vocabulary can be used to develop similar command vocabularies for other systems.

Task Progress & Bibliography Information FY2015 
Task Progress: Teleoperation is usually accomplished with the use of one or more camera views. The amount and type of information provided by the limited view of cameras can lead to reduced situation awareness, increased task times, and errors. The use of augmented reality, in the form of overlays, is one approach that may compensate for the issues associated with teleoperation via video feed. A series of studies was conducted to investigate the effect of overlays on teleoperator performance. The first study investigated the effects of situation-guidance and command-guidance overlays on operator performance when teleoperating a robot arm and found that command guidance led to better performance than situation guidance. The combination of command and situation guidance appeared to present too much information and was more difficult to interpret than the other guidance conditions. The second experiment was a pilot study that considered the use of integrated and superimposed overlays for a teleoperated navigation task. The results were consistent with the literature: participants had a higher number of collisions with the integrated overlays than with the head-down display, though in this condition participants were also slower. Because this study was conducted with only five participants, statistical inferences cannot be drawn from these results. The third study used redesigned versions of the superimposed and integrated overlays in normal and degraded visual conditions. The results of this study showed no effect of overlay or visual condition on driving performance as measured by the number of collisions, speed, task completion time, or situation awareness.

Follow-on studies need to investigate these types of overlays with improved designs and in more realistic environments.

Bibliography Description: (Last Updated: 03/03/2016) 

None in FY 2015
Project Title:  Assessment, Evaluation, and Development of Methodologies, Metrics and Tools Available for Use in Multi-agent (Human and Robotic) Teaming
Fiscal Year: FY 2014 
Division: Human Research 
Research Discipline/Element:
HRP SHFH:Space Human Factors & Habitability (archival in 2017)
Start Date: 09/07/2012  
End Date: 12/31/2014  
Task Last Updated: 06/10/2014 
Principal Investigator/Affiliation:   Sandor, Aniko  Ph.D. / Lockheed-Martin/NASA Johnson Space Center 
Address: 2101 NASA Parkway 
Mail Code: C46 
Houston, TX 77058 
Email: Aniko.Sandor-1@nasa.gov 
Phone: 281.483.9726  
Congressional District: 22 
Web:  
Organization Type: NASA CENTER 
Organization Name: Lockheed-Martin/NASA Johnson Space Center 
Joint Agency:  
Comments:  
Co-Investigator(s)
Affiliation: 
Cross, Ernest  Ph.D. Lockheed Martin/NASA Johnson Space Center 
Chang, Mai Lee  NASA Johnson Space Center 
Project Information: Grant/Contract No. Directed Research 
Responsible Center: NASA JSC 
Grant Monitor: Whitmore, Mihriban  
Center Contact: 281-244-1004 
mihriban.whitmore-1@nasa.gov 
Unique ID: 9450 
Solicitation / Funding Source: Directed Research 
Grant/Contract No.: Directed Research 
Project Type: GROUND 
Flight Program:  
TechPort: No 
No. of Post Docs:  
No. of PhD Candidates:  
No. of Master's Candidates:  
No. of Bachelor's Candidates:  
No. of PhD Degrees:  
No. of Master's Degrees:  
No. of Bachelor's Degrees:  
Human Research Program Elements: (1) SHFH:Space Human Factors & Habitability (archival in 2017)
Human Research Program Risks: (1) HSIA:Risk of Adverse Outcomes Due to Inadequate Human Systems Integration Architecture
Human Research Program Gaps: (1) HSIA-701:We need to determine how human-automation-robotic systems can be optimized for effective enhancement and monitoring of crew capabilities, health, and performance, during increasingly earth-independent, future exploration missions (including in-mission and at landing).
Flight Assignment/Project Notes: NOTE: Project ended 12/31/2014 per E. Connell/JSC HRP (Ed., 6/15/15)

Task Description: The study of human-robot interaction (HRI) involves understanding and shaping the interactions between humans and robots (Goodrich & Schultz, 2007). It is important to evaluate how the design of interfaces and command modalities affects the human's ability to perform tasks accurately, efficiently, and effectively (Crandall, Goodrich, Olsen Jr., & Nielsen, 2005). Many NASA robot systems are teleoperated. Developing safe, reliable, and effective human-robot interfaces for teleoperation involves providing the information necessary to support operator task performance. For robot navigation tasks, in which the operator moves a robot through space or commands individual robot segments, the operator needs to understand the current and desired state of the robot and to have a command modality that is compatible with the task.

In Fiscal Year 2011 (FY11), preparatory work was completed in the form of literature reviews; observations of NASA robot systems; interviews with NASA robotic operators and trainers; and a space HRI workshop. These activities resulted in the selection of three research areas that are the focus of the proposed work. The three research areas are: Video Overlays, Camera Views, and Command Modalities.

Studies proposed in this Directed Research Project in the area of Video Overlays consider two factors in the implementation of augmented reality (AR) for operator displays during teleoperation. The first factor is the type of navigational guidance provided by AR symbology. Participants' performance during teleoperation of a robot arm will be compared when they are provided with command-guidance symbology (i.e., symbology that tells the operator which commands to make) or situation-guidance symbology (i.e., natural cues from which the operator can infer which commands to make). The second factor is the effect of overlays that are either superimposed on or integrated into the external view of the world. A study is proposed that compares the effects of superimposed and integrated overlays on operator task performance during teleoperated driving tasks.

Studies proposed in the area of Camera Views investigate the inclusion or exclusion of the robot within the video feed and the camera frame of reference. One study will investigate how including or excluding the robot's chassis in the video feed presented to operators affects path-following and maze-traversal task performance. Another study will investigate the effects of adding an exocentric camera frame of reference to egocentric frames of reference on operator task performance for these same tasks.

Lastly, studies in the area of Command Modalities will systematically build and evaluate gesture and voice vocabularies for commanding a ground-based mobile robot. The first in this series of studies will have participants produce robot commands for a set of critical control functions. The characteristics of the commands will be analyzed. In a second phase of this study, the strength of association between command and voice/gesture inputs will be evaluated. The next two studies will test the learnability and memorability of the developed vocabularies in the context of a representative task.

Research Impact/Earth Benefits: The video overlays developed by this research project can be applied to any type of robotic teleoperation situation.

The results of the camera view studies will be applicable to mobile robots such as rovers and search and rescue robots.

Finally, the method applied and tested for developing a gesture and voice command vocabulary can be used to develop similar command vocabularies for other systems.

Task Progress & Bibliography Information FY2014 
Task Progress: Human-Robot Interaction (HRI) is a discipline investigating the factors affecting interactions between humans and robots. It is important to evaluate how the design of interfaces and command modalities affects the human's ability to perform tasks accurately, efficiently, and effectively when working with a robot. By understanding the effects of interface design on human performance, workload, and situation awareness, interfaces can be developed to appropriately support the human in performing tasks with minimal errors and with appropriate interaction time and effort. Thus, the results of research on human-robot interfaces have direct implications for the design of robotic systems. This research project concentrates on areas associated with human-robot interfaces applicable to NASA robot systems. One area of research focused on video overlays. The first study investigated how Augmented Reality (AR) symbology can be added to the human-robot interface to improve teleoperation performance. Three types of AR symbology were explored in this study: command guidance (CG), situation guidance (SG), and their combination (SCG). CG symbology gives operators explicit instructions on what commands to input, whereas SG symbology gives operators implicit cues from which they can infer the input commands. The combination of CG and SG provided operators with both explicit and implicit cues, allowing them to choose which symbology to use. The objective of the study was to understand how AR symbology affects the human operator's ability to align a robot arm to a target using a joystick and the ability to allocate attention between the symbology and external views of the world. The study evaluated the effect of type of symbology (CG and SG) on operator task performance and attention allocation during teleoperation of a robot arm.

A second study is looking at superimposed and integrated overlays for teleoperation of a mobile robot using a hand controller. When AR is superimposed on the external world, it appears to be fixed onto the display and internal to the operators’ workstation. Unlike superimposed overlays, integrated overlays often appear as three-dimensional objects and move as if part of the external world. Studies conducted in the aviation domain show that integrated overlays can improve situation awareness and reduce the amount of deviation from the optimal path. The purpose of this ongoing study is to investigate whether these results apply to navigation with a mobile robot.
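
The screen-fixed versus world-fixed behavior can be illustrated with a simple projection: a world-fixed (integrated) cue is re-projected through the camera every frame and therefore moves with the scene, while a superimposed cue keeps constant screen coordinates. The camera intrinsics and numbers below are assumed for illustration and do not come from the study hardware.

```python
# Hypothetical pinhole-camera sketch of why integrated overlays "move as if
# part of the external world" while superimposed overlays stay display-fixed.
import numpy as np

FOCAL_PX, CX, CY = 800.0, 640.0, 360.0   # assumed camera intrinsics

def project(p_cam: np.ndarray) -> tuple[float, float]:
    """Pinhole projection of a point already in camera coordinates (z forward)."""
    x, y, z = p_cam
    return (FOCAL_PX * x / z + CX, FOCAL_PX * y / z + CY)

superimposed_cue = (CX, CY - 100)        # fixed pixels: never moves on screen
waypoint_world = np.array([0.5, 0.0, 4.0])

for cam_shift in (0.0, 0.25, 0.5):       # robot (and camera) translates right
    p_cam = waypoint_world - np.array([cam_shift, 0.0, 0.0])
    print(f"shift={cam_shift:.2f} m  integrated cue at {project(p_cam)}  "
          f"superimposed cue at {superimposed_cue}")
```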

Bibliography Description: (Last Updated: 03/03/2016) 

Abstracts for Journals and Proceedings: Sandor A, Cross EV 2nd, Chang ML. "Human-robot interaction." Presented at the 2014 Human Research Program Investigators' Meeting, Galveston, TX, February 12-13, 2014. http://www.hou.usra.edu/meetings/hrp2014/pdf/3049.pdf , Feb-2014

Abstracts for Journals and Proceedings: Sandor A, Cross EV 2nd, Chang ML. "Human-robot interaction: overlays for teleoperation." Presented at the Southwest Regional Human Factors and Ergonomics Society Symposium, College Station, TX, June 6, 2014. Jan-2014

Project Title:  Assessment, Evaluation, and Development of Methodologies, Metrics and Tools Available for Use in Multi-agent (Human and Robotic) Teaming
Fiscal Year: FY 2013 
Division: Human Research 
Research Discipline/Element:
HRP SHFH:Space Human Factors & Habitability (archival in 2017)
Start Date: 09/07/2012  
End Date: 09/30/2015  
Task Last Updated: 10/29/2013 
Principal Investigator/Affiliation:   Sandor, Aniko  Ph.D. / Lockheed-Martin/NASA Johnson Space Center 
Address: 2101 NASA Parkway 
Mail Code: C46 
Houston, TX 77058 
Email: Aniko.Sandor-1@nasa.gov 
Phone: 281.483.9726  
Congressional District: 22 
Web:  
Organization Type: NASA CENTER 
Organization Name: Lockheed-Martin/NASA Johnson Space Center 
Joint Agency:  
Comments:  
Co-Investigator(s)
Affiliation: 
Cross, Ernest  Ph.D. Lockheed Martin/NASA Johnson Space Center 
Chang, Mai Lee  NASA Johnson Space Center 
Project Information: Grant/Contract No. Directed Research 
Responsible Center: NASA JSC 
Grant Monitor: Whitmore, Mihriban  
Center Contact: 281-244-1004 
mihriban.whitmore-1@nasa.gov 
Unique ID: 9450 
Solicitation / Funding Source: Directed Research 
Grant/Contract No.: Directed Research 
Project Type: GROUND 
Flight Program:  
TechPort: No 
No. of Post Docs:  
No. of PhD Candidates:  
No. of Master's Candidates:  
No. of Bachelor's Candidates:  
No. of PhD Degrees:  
No. of Master's Degrees:  
No. of Bachelor's Degrees:  
Human Research Program Elements: (1) SHFH:Space Human Factors & Habitability (archival in 2017)
Human Research Program Risks: (1) HSIA:Risk of Adverse Outcomes Due to Inadequate Human Systems Integration Architecture
Human Research Program Gaps: (1) HSIA-701:We need to determine how human-automation-robotic systems can be optimized for effective enhancement and monitoring of crew capabilities, health, and performance, during increasingly earth-independent, future exploration missions (including in-mission and at landing).
Task Description: The study of human-robot interaction (HRI) involves understanding and shaping the interactions between humans and robots (Goodrich & Schultz, 2007). It is important to evaluate how the design of interfaces and command modalities affects the human's ability to perform tasks accurately, efficiently, and effectively (Crandall, Goodrich, Olsen Jr., & Nielsen, 2005). Many NASA robot systems are teleoperated. Developing safe, reliable, and effective human-robot interfaces for teleoperation involves providing the information necessary to support operator task performance. For robot navigation tasks, in which the operator moves a robot through space or commands individual robot segments, the operator needs to understand the current and desired state of the robot and to have a command modality that is compatible with the task.

In Fiscal Year 2011 (FY11), preparatory work was completed in the form of literature reviews; observations of NASA robot systems; interviews with NASA robotic operators and trainers; and a space HRI workshop. These activities resulted in the selection of three research areas that are the focus of the proposed work. The three research areas are: Video Overlays, Camera Views, and Command Modalities.

Studies proposed in this Directed Research Project in the area of Video Overlays consider two factors in the implementation of augmented reality (AR) for operator displays during teleoperation. The first factor is the type of navigational guidance provided by AR symbology. Participants' performance during teleoperation of a robot arm will be compared when they are provided with command-guidance symbology (i.e., symbology that tells the operator which commands to make) or situation-guidance symbology (i.e., natural cues from which the operator can infer which commands to make). The second factor is the effect of overlays that are either superimposed on or integrated into the external view of the world. A study is proposed that compares the effects of superimposed and integrated overlays on operator task performance during teleoperated driving tasks.

Studies proposed in the area of Camera Views investigate the inclusion or exclusion of the robot within the video feed and the camera frame of reference. One study will investigate how including or excluding the robot's chassis in the video feed presented to operators affects path-following and maze-traversal task performance. Another study will investigate the effects of adding an exocentric camera frame of reference to egocentric frames of reference on operator task performance for these same tasks.

Lastly, studies in the area of Command Modalities will systematically build and evaluate gesture and voice vocabularies for commanding a ground-based mobile robot. The first in this series of studies will have participants produce robot commands for a set of critical control functions. The characteristics of the commands will be analyzed. In a second phase of this study, the strength of association between command and voice/gesture inputs will be evaluated. The next two studies will test the learnability and memorability of the developed vocabularies in the context of a representative task.

Research Impact/Earth Benefits: The video overlays developed by this research project can be applied to any type of robotic teleoperation situation. The results of the camera view studies will be applicable to mobile robots such as rovers and search and rescue robots. Finally, the method applied and tested for developing a gesture and voice command vocabulary can be used to develop similar command vocabularies for other systems.

Task Progress & Bibliography Information FY2013 
Task Progress: Human-robot interaction (HRI) is a discipline investigating the factors affecting the interactions between humans and robots. It is important to evaluate how the design of interfaces and command modalities affects the human's ability to perform tasks accurately, efficiently, and effectively when working with a robot. By understanding the effects of interface design on human performance, workload, and situation awareness, interfaces can be developed to appropriately support the human in performing tasks with minimal errors and with appropriate interaction time and effort. Thus, the results of research on human-robot interfaces have direct implications for the design of robotic systems. This research project concentrates on three areas associated with interfaces and command modalities in HRI which are applicable to NASA robot systems: 1) Video Overlays, 2) Camera Views, and 3) Command Modalities.

The first study focused on video overlays and investigated how Augmented Reality (AR) symbology can be added to the human-robot interface to improve teleoperation performance. Three types of AR symbology were explored in this study: command guidance (CG), situation guidance (SG), and their combination (SCG). CG symbology gives operators explicit instructions on what commands to input, whereas SG symbology gives operators implicit cues from which they can infer the input commands. The combination of CG and SG provided operators with both explicit and implicit cues, allowing them to choose which symbology to use. The objective of the study was to understand how AR symbology affects the human operator's ability to align a robot arm to a target using a joystick and the ability to allocate attention between the symbology and external views of the world. The study evaluated the effect of type of symbology (CG and SG) on operator task performance and attention allocation during teleoperation of a robot arm.

Planned studies for the near future:

The second study will expand on the first study by evaluating the effect of type of navigational guidance (CG and SG) on operator task performance and attention allocation during teleoperation of a robot arm through uplinked, manually entered commands. Although this study complements the first study on navigational guidance with hand controllers, it is a separate investigation due to the distinction in intended operators (i.e., crewmembers versus ground operators).
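
As an illustration of that distinction, the sketch below contrasts real-time hand-controller input with a batch of discretely entered commands serialized for uplink. The command format and joint names are invented for the example and do not reflect any actual NASA command syntax.

```python
# Hypothetical sketch: a ground operator composes and uplinks a discrete
# command sequence instead of flying the arm continuously with a joystick.
from dataclasses import dataclass

@dataclass
class ArmCommand:
    joint: str
    delta_deg: float

def build_uplink(commands: list[ArmCommand]) -> str:
    """Serialize a command sequence as it might be typed and uplinked."""
    return "\n".join(f"MOVE {c.joint} {c.delta_deg:+.1f} DEG" for c in commands)

plan = [ArmCommand("SHOULDER_PITCH", +5.0), ArmCommand("WRIST_YAW", -2.5)]
print(build_uplink(plan))
# MOVE SHOULDER_PITCH +5.0 DEG
# MOVE WRIST_YAW -2.5 DEG
```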

A third study will look at superimposed and integrated overlays for teleoperation of a mobile robot using a hand controller. When AR is superimposed on the external world, it appears to be fixed onto the display and internal to the operators’ workstation. Unlike superimposed overlays, integrated overlays often appear as three-dimensional objects and move as if part of the external world. Studies conducted in the aviation domain show that integrated overlays can improve situation awareness and reduce the amount of deviation from the optimal path. The purpose of the study is to investigate whether these results apply to HRI tasks, such as navigation with a mobile robot.

HRP Gaps:

This HRI research contributes to closure of HRP gaps by providing information on how display and control characteristics (those related to guidance, feedback, and command modalities) affect robot operator performance. The overarching goals are to improve interface usability, reduce operator error, and develop candidate guidelines for the design of effective human-robot interfaces.

Bibliography Description: (Last Updated: 03/03/2016) 

Abstracts for Journals and Proceedings: Rochlis J, Sandor A, Chang ML, Pace J. "Human-Robot Interaction Directed Research Project." 2013 NASA Human Research Program Investigators' Workshop, Galveston, TX, February 12-14, 2013. Feb-2013

Project Title:  Assessment, Evaluation, and Development of Methodologies, Metrics and Tools Available for Use in Multi-agent (Human and Robotic) Teaming
Fiscal Year: FY 2012 
Division: Human Research 
Research Discipline/Element:
HRP SHFH:Space Human Factors & Habitability (archival in 2017)
Start Date: 09/07/2012  
End Date: 09/30/2015  
Task Last Updated: 09/17/2013 
Principal Investigator/Affiliation:   Sandor, Aniko  Ph.D. / Lockheed-Martin/NASA Johnson Space Center 
Address: 2101 NASA Parkway 
Mail Code: C46 
Houston, TX 77058 
Email: Aniko.Sandor-1@nasa.gov 
Phone: 281.483.9726  
Congressional District: 22 
Web:  
Organization Type: NASA CENTER 
Organization Name: Lockheed-Martin/NASA Johnson Space Center 
Joint Agency:  
Comments:  
Co-Investigator(s)
Affiliation: 
Cross, Ernest  Ph.D. Lockheed Martin/NASA Johnson Space Center 
Chang, Mai Lee  NASA Johnson Space Center 
Project Information: Grant/Contract No. Directed Research 
Responsible Center: NASA JSC 
Grant Monitor: Sullivan, Thomas  
Center Contact:  
thomas.a.sullivan@nasa.gov 
Unique ID: 9450 
Solicitation / Funding Source: Directed Research 
Grant/Contract No.: Directed Research 
Project Type: GROUND 
Flight Program:  
TechPort: No 
No. of Post Docs:  
No. of PhD Candidates:  
No. of Master's Candidates:  
No. of Bachelor's Candidates:  
No. of PhD Degrees:  
No. of Master's Degrees:  
No. of Bachelor's Degrees:  
Human Research Program Elements: (1) SHFH:Space Human Factors & Habitability (archival in 2017)
Human Research Program Risks: (1) HSIA:Risk of Adverse Outcomes Due to Inadequate Human Systems Integration Architecture
Human Research Program Gaps: (1) HSIA-701:We need to determine how human-automation-robotic systems can be optimized for effective enhancement and monitoring of crew capabilities, health, and performance, during increasingly earth-independent, future exploration missions (including in-mission and at landing).
Task Description: The study of human-robot interaction (HRI) involves understanding and shaping the interactions between humans and robots (Goodrich & Schultz, 2007). It is important to evaluate how the design of interfaces and command modalities affects the human's ability to perform tasks accurately, efficiently, and effectively (Crandall, Goodrich, Olsen Jr., & Nielsen, 2005). Many NASA robot systems are teleoperated. Developing safe, reliable, and effective human-robot interfaces for teleoperation involves providing the information necessary to support operator task performance. For robot navigation tasks, in which the operator moves a robot through space or commands individual robot segments, the operator needs to understand the current and desired state of the robot and to have a command modality that is compatible with the task.

In Fiscal Year 2011 (FY11), preparatory work was completed in the form of literature reviews; observations of NASA robot systems; interviews with NASA robotic operators and trainers; and a space HRI workshop. These activities resulted in the selection of three research areas that are the focus of the proposed work. The three research areas are: Video Overlays, Camera Views, and Command Modalities.

Studies proposed in this Directed Research Project in the area of Video Overlays consider two factors in the implementation of augmented reality (AR) for operator displays during teleoperation. The first factor is the type of navigational guidance provided by AR symbology. Participants' performance during teleoperation of a robot arm will be compared when they are provided with command-guidance symbology (i.e., symbology that tells the operator which commands to make) or situation-guidance symbology (i.e., natural cues from which the operator can infer which commands to make). The second factor is the effect of overlays that are either superimposed on or integrated into the external view of the world. A study is proposed that compares the effects of superimposed and integrated overlays on operator task performance during teleoperated driving tasks.

Studies proposed in the area of Camera Views investigate the inclusion or exclusion of the robot within the video feed and the camera frame of reference. One study will investigate how including or excluding the robot's chassis in the video feed presented to operators affects path-following and maze-traversal task performance. Another study will investigate the effects of adding an exocentric camera frame of reference to egocentric frames of reference on operator task performance for these same tasks.

Lastly, studies in the area of Command Modalities will systematically build and evaluate gesture and voice vocabularies for commanding a ground-based mobile robot. The first in this series of studies will have participants produce robot commands for a set of critical control functions. The characteristics of the commands will be analyzed. In a second phase of this study, the strength of association between command and voice/gesture inputs will be evaluated. The next two studies will test the learnability and memorability of the developed vocabularies in the context of a representative task.

Research Impact/Earth Benefits:

Task Progress & Bibliography Information FY2012 
Task Progress: New project for FY2012.

[Ed. note 9/17/13: added to Task Book when received information from HRP]

Bibliography Description: (Last Updated: 03/03/2016) 

None in FY 2012