The NASA Task Book

Project Title:  HCAAM VNSCOR: Responsive Multimodal Human-Automation Communication for Augmenting Human Situation Awareness in Nominal and Off-Nominal Scenarios
Fiscal Year: FY 2020 
Division: Human Research 
Research Discipline/Element:
HRP HFBP:Human Factors & Behavioral Performance (IRP Rev H)
Start Date: 04/01/2019  
End Date: 12/31/2019  
Task Last Updated: 01/14/2020 
Principal Investigator/Affiliation:   Stirling, Leia  Ph.D. / University of Michigan 
Address:  Industrial and Operations Engineering 
1205 Beal Avenue, G634 IOE Building 
Ann Arbor , MI 48109 
Email: leias@umich.edu 
Phone: 617-324-7410  
Congressional District: 12 
Web:  
Organization Type: UNIVERSITY 
Organization Name: University of Michigan 
Joint Agency:  
Comments: NOTE: PI moved to University of Michigan in fall 2019; previous affiliation was Massachusetts Institute of Technology 
Co-Investigator(s)
Affiliation: 
Blossom, Jonathon   NASA Jet Propulsion Laboratory 
Liu, Andrew  Ph.D. Massachusetts Institute of Technology 
Atkins, Ella  Ph.D. University of Michigan 
Key Personnel Changes / Previous PI: Personnel changes (January 2020 report): Principal Investigator (PI) Prof. Stirling moved from MIT to the University of Michigan in fall 2019; the grant will be transferred there in early 2020. Prof. Dave Miller (MIT, Co-Investigator) left the project, and Prof. Ella Atkins (University of Michigan, Co-I) was added. Mr. Jonathon Blossom took over as lead at JPL after the previous JPL PI, Mr. Victor Luo, and Co-I, Mr. Alex Menzies, left JPL.
Project Information: Grant/Contract No. 80NSSC19K0703 
Responsible Center: NASA JSC 
Grant Monitor: Williams, Thomas  
Center Contact: 281-483-8773 
thomas.j.will1@nasa.gov 
Unique ID: 12300 
Solicitation / Funding Source: 2017-2018 HERO 80JSC017N0001-BPBA Topics in Biological, Physiological, and Behavioral Adaptations to Spaceflight. Appendix C 
Grant/Contract No.: 80NSSC19K0703 
Project Type: GROUND 
Flight Program:  
TechPort: No 
No. of Post Docs:
No. of PhD Candidates:
No. of Master's Candidates:
No. of Bachelor's Candidates:
No. of PhD Degrees:
No. of Master's Degrees:
No. of Bachelor's Degrees:
Human Research Program Elements: (1) HFBP:Human Factors & Behavioral Performance (IRP Rev H)
Human Research Program Risks: (1) HSIA:Risk of Adverse Outcomes Due to Inadequate Human Systems Integration Architecture
Human Research Program Gaps: (1) HSIA-101:We need to identify the Human Systems Integration (HSI) – relevant crew health and performance outcomes, measures, and metrics, needed to characterize and mitigate risk, for future exploration missions.
(2) HSIA-201:We need to evaluate the demands of future exploration habitat/vehicle systems and mission scenarios (e.g. increased automation, multi-modal communication) on individuals and teams, and determine the risks these demands pose to crew health and performance.
(3) HSIA-301:We need to determine the on-board, intelligent systems that will support crew health and performance, and we need to establish the thresholds that will define how these systems should be implemented (including in-mission and at landing).
(4) HSIA-401:We need to determine how HSI can be applied in the vehicle/habitat and computer interface Design Phase to mitigate potential decrements in operationally-relevant performance (e.g. problem-solving, execution procedures), during increasingly earth-independent, future exploration missions (including in-mission and at landing).
(5) HSIA-501:We need to determine how HSI will be used in the development of dynamic and adaptive mission procedures and processes, to mitigate individual and team performance decrements during increasingly earth-independent, future exploration missions (including in-mission and at landing).
(6) HSIA-701:We need to determine how human-automation-robotic systems can be optimized for effective enhancement and monitoring of crew capabilities, health, and performance, during increasingly earth-independent, future exploration missions (including in-mission and at landing).
Task Description: This task is part of the Human Capabilities Assessments for Autonomous Missions (HCAAM) Virtual NASA Specialized Center of Research (VNSCOR).

Crew extravehicular activity (EVA) is limited on spaceflight missions. Multiple, small robotic spacecraft with varying levels of autonomy are needed to perform tasks that might have been completed by an astronaut (e.g., an exterior surface inspection or repair). Crews on long duration exploration missions (LDEM) will have less access to ground support during task operations. As a result, they will need to process more information and communicate with autonomous robots effectively to ensure tasks are progressing safely and on schedule.

The objective of these studies is to investigate the use of augmented reality (AR) multimodal interface displays and communication pathways for improving human-robot communication, situation awareness (SA), trust, and task performance. This will lead to developing guidelines for designing human-robot system interactions that enable operational performance for crews on spaceflight missions.

The specific aims are to:

1) Develop a simulation testbed for examining communication between human-robot teams.

2) Develop a hardware testbed for examining communication between human-robot teams.

3) Evaluate human SA, trust, and task performance within a short duration and long-duration ground-based study (simulation and/or hardware) through testing various interface communication modalities and information displays.

4) (Option) Perform additional studies for alternate parameters of interest that could be tested using the study testbeds. Additional parameters include the timing and persistence of information, gesture-to-command mapping, varying levels of robot automation, and the precision enabled by each command mode.

Research Impact/Earth Benefits: Augmented reality (AR) offers the opportunity to support decision making across a variety of use-case scenarios, including but not limited to manufacturing, automated vehicles, military training, and entertainment. This research compares AR to other visual modalities for telerobotics applications, specifically robotic control and anomaly inspection. Results from this study can inform how AR is integrated for task-specific applications, as some tasks may benefit more from AR, while others may raise additional considerations.

Task Progress & Bibliography Information FY2020 
Task Progress: In this reporting period, an initial simulation environment was developed in line with Aim 1. While the larger proposed study includes a research question examining the effect of display type (standard 2D camera images on a computer screen, a 2D projection of a 3D reconstruction on a computer screen, and 3D AR) on task performance, an initial pilot study was performed to assess the simulation environment. The Microsoft HoloLens, a gesture-controlled AR headset, was used to display a simulated space station and satellite environment. Twelve subjects inspected the exterior of the space station using a simulated free-flying robot to detect surface anomalies. The subjects used gestural commands to control the satellite in three operation modes: satellite body (local) reference frame control, station (global) reference frame control, and waypoint control (markers are placed for the inspector to follow). Anomalies occurred in areas with high or low risk of satellite contact with the station structure. Subjects were instructed to prioritize their performance as follows: (1) avoid station collisions, (2) locate anomalies, and (3) inspect the full exterior as quickly as possible.
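The distinction between the local (satellite body) and global (station) command frames amounts to a standard rotation of the velocity command. The sketch below is illustrative only; the Euler-angle convention and function name are assumptions, not the study's actual implementation.

```python
import numpy as np

def body_to_station_frame(v_body, yaw, pitch, roll):
    """Rotate a velocity command from the satellite body (local) frame
    into the station (global) frame using a Z-Y-X Euler rotation."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])   # yaw about z
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])   # pitch about y
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])   # roll about x
    return Rz @ Ry @ Rx @ np.asarray(v_body, dtype=float)

# A "forward" command in the body frame maps to a different station-frame
# direction once the satellite has yawed 90 degrees:
v_station = body_to_station_frame([1.0, 0.0, 0.0], np.pi / 2, 0.0, 0.0)
```

In global mode the operator's gesture is interpreted directly in the station frame; in local mode it must pass through a transform like this, which is one reason the two modes can feel quite different at close proximity.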

Subjects performed the inspection in both fixed command mode (in each of the three operation modes, order randomized) and in free mode (with free choice of when and how often they utilized different operation modes). Performance measures included percentage of station inspected, number of collisions, and accuracy in anomaly detection. Workload was evaluated using the NASA Task Load Index (TLX) method. Interactions with the simulated environment were logged to characterize subject strategies for utilizing the interface (i.e., moving the viewpoint, changing scale and position of virtual models).
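The NASA TLX overall workload score is conventionally computed as a weighted average of six subscale ratings, with each weight taken from 15 pairwise comparisons between dimensions. The sketch below shows that standard calculation; the example ratings and weights are hypothetical, not study data.

```python
def tlx_weighted_workload(ratings, weights):
    """NASA-TLX overall workload: a weighted average of the six subscale
    ratings (0-100), where each weight is the number of times that
    dimension was chosen across the 15 pairwise comparisons."""
    assert set(ratings) == set(weights), "same six dimensions required"
    assert sum(weights.values()) == 15, "weights must sum to 15"
    return sum(ratings[d] * weights[d] for d in ratings) / 15.0

# Hypothetical example values (not data from this study):
ratings = {"mental": 70, "physical": 20, "temporal": 55,
           "performance": 40, "effort": 65, "frustration": 35}
weights = {"mental": 5, "physical": 0, "temporal": 3,
           "performance": 2, "effort": 4, "frustration": 1}
overall = tlx_weighted_workload(ratings, weights)  # ~59.3
```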

Manually controlling the satellite in the global and local frames maximized the percentage of the station that was inspected, while using waypoints led to fewer collisions between the satellite and the station. There was no significant difference in anomaly detection across the various command modes for the anomaly frequency and type selected. When operating with free choice of command mode, subjects preferred to remain in a single mode, typically either the global or local control. Subject feedback and NASA TLX scores suggest the global and local modes required less workload and were more usable than waypoint control as currently implemented. One subject was observed to select strategies that did not comply with our instructed task priorities as they appeared to prioritize speed over anomaly detection. The AR interface enabled users to manipulate the virtual models to alter their viewpoint and move around the virtual model in the real-world space throughout the inspection task. The design of waypoint navigation enabled better collision avoidance as it encouraged planning, but was more suited to single-point inspection tasks as currently implemented. The global and local modes were more suited to the patrol task in this study, enabling easier navigation at close proximity.

Based on the initial pilot evaluation, we are in the process of updating the simulation environment. While the initial pilot study focused on robotic system control and inspection, the follow-on study simulation incorporates two different tasks across the visual modalities: an inspection-only (IO) task, in which the images are pre-recorded, and a robotic control and inspection (RCI) task, in which the operator must both obtain the images and inspect them within the same trial. We have also enabled an automated mode in which the robot performs an edge-following task. In the manual modes, the operator can still control the robot in local or global mode. While waypoint navigation provides a mid-level form of automation, it was determined that this control mode was not appropriate for the perimeter inspection task. When the operator manually stops the edge following, the system switches into manual control mode. During the next year of the program, the simulation study comparing the visual modalities will be performed.
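The mode-switching behavior described above (automated edge following with a manual override into local or global frame control) can be sketched as a small state machine. The mode names and function below are hypothetical illustrations, not the study's code.

```python
from enum import Enum, auto

class ControlMode(Enum):
    AUTO_EDGE_FOLLOW = auto()   # robot autonomously follows the station edge
    MANUAL_LOCAL = auto()       # satellite body (local) frame control
    MANUAL_GLOBAL = auto()      # station (global) frame control

def next_mode(current, operator_stop=False, frame="local"):
    """If the operator stops automated edge following, drop into the
    requested manual control frame; otherwise stay in the current mode."""
    if current is ControlMode.AUTO_EDGE_FOLLOW and operator_stop:
        return (ControlMode.MANUAL_LOCAL if frame == "local"
                else ControlMode.MANUAL_GLOBAL)
    return current
```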

[EDITOR'S NOTE 4/15/2020: Due to Principal Investigator Stirling's move to University of Michigan, the project was transferred there with a new grant number 80NSSC20K0409 and a new period of performance, 12/4/2019-12/3/2023. See the continuation project for subsequent reporting.]

Bibliography: Description: (Last Updated: 11/09/2023) 

Abstracts for Journals and Proceedings Todd JE, Liu A, Stirling L. "Investigation of augmented reality in enabling telerobotic on-orbit inspection of spacecraft." Abstracts, 2020 NASA Human Research Program Investigators’ Workshop, Galveston, TX, January 27-30, 2020. Jan-2020

Abstracts for Journals and Proceedings Marquez J, Stirling L, Fanchiang C, Selva D, Lee J, Schreckenghost D, Robinson S. "Overview of the virtual NASA specialized center of research (VNSCOR)-'Human capabilities assessments for autonomous missions' (HCAAM)." Abstracts, 2020 NASA Human Research Program Investigators’ Workshop, Galveston, TX, January 27-30, 2020. Jan-2020

Abstracts for Journals and Proceedings Stirling L. "Emergent behaviors when defining a gesture interface for controlling a robotic arm." Presented at the ErgoX Symposium 2019, Seattle, WA, October 28, 2019. Oct-2019

Project Title:  HCAAM VNSCOR: Responsive Multimodal Human-Automation Communication for Augmenting Human Situation Awareness in Nominal and Off-Nominal Scenarios
Fiscal Year: FY 2019 
Division: Human Research 
Research Discipline/Element:
HRP HFBP:Human Factors & Behavioral Performance (IRP Rev H)
Start Date: 04/01/2019  
End Date: 12/31/2019  
Task Last Updated: 04/22/2019 
Principal Investigator/Affiliation:   Stirling, Leia  Ph.D. / University of Michigan 
Address:  Industrial and Operations Engineering 
1205 Beal Avenue, G634 IOE Building 
Ann Arbor , MI 48109 
Email: leias@umich.edu 
Phone: 617-324-7410  
Congressional District: 12 
Web:  
Organization Type: UNIVERSITY 
Organization Name: University of Michigan 
Joint Agency:  
Comments: NOTE: PI moved to University of Michigan in fall 2019; previous affiliation was Massachusetts Institute of Technology 
Co-Investigator(s)
Affiliation: 
Kim, So Young  Ph.D. NASA Jet Propulsion Laboratory 
Luo, Victor  M.S. NASA Jet Propulsion Laboratory 
Miller, David  Ph.D. Massachusetts Institute of Technology 
Project Information: Grant/Contract No. 80NSSC19K0703 
Responsible Center: NASA JSC 
Grant Monitor: Williams, Thomas  
Center Contact: 281-483-8773 
thomas.j.will1@nasa.gov 
Unique ID: 12300 
Solicitation / Funding Source: 2017-2018 HERO 80JSC017N0001-BPBA Topics in Biological, Physiological, and Behavioral Adaptations to Spaceflight. Appendix C 
Grant/Contract No.: 80NSSC19K0703 
Project Type: GROUND 
Flight Program:  
TechPort: No 
No. of Post Docs:  
No. of PhD Candidates:  
No. of Master's Candidates:  
No. of Bachelor's Candidates:  
No. of PhD Degrees:  
No. of Master's Degrees:  
No. of Bachelor's Degrees:  
Human Research Program Elements: (1) HFBP:Human Factors & Behavioral Performance (IRP Rev H)
Human Research Program Risks: (1) HSIA:Risk of Adverse Outcomes Due to Inadequate Human Systems Integration Architecture
Human Research Program Gaps: (1) HSIA-101:We need to identify the Human Systems Integration (HSI) – relevant crew health and performance outcomes, measures, and metrics, needed to characterize and mitigate risk, for future exploration missions.
(2) HSIA-201:We need to evaluate the demands of future exploration habitat/vehicle systems and mission scenarios (e.g. increased automation, multi-modal communication) on individuals and teams, and determine the risks these demands pose to crew health and performance.
(3) HSIA-301:We need to determine the on-board, intelligent systems that will support crew health and performance, and we need to establish the thresholds that will define how these systems should be implemented (including in-mission and at landing).
(4) HSIA-401:We need to determine how HSI can be applied in the vehicle/habitat and computer interface Design Phase to mitigate potential decrements in operationally-relevant performance (e.g. problem-solving, execution procedures), during increasingly earth-independent, future exploration missions (including in-mission and at landing).
(5) HSIA-501:We need to determine how HSI will be used in the development of dynamic and adaptive mission procedures and processes, to mitigate individual and team performance decrements during increasingly earth-independent, future exploration missions (including in-mission and at landing).
(6) HSIA-701:We need to determine how human-automation-robotic systems can be optimized for effective enhancement and monitoring of crew capabilities, health, and performance, during increasingly earth-independent, future exploration missions (including in-mission and at landing).
Task Description: This task is part of the Human Capabilities Assessments for Autonomous Missions (HCAAM) Virtual NASA Specialized Center of Research (VNSCOR).

The objective of this research is to provide recommendations for augmenting human situation awareness (SA) and task performance through multimodal displays and communication pathways based on empirical evidence. Specifically, we will evaluate the effectiveness of several multimodal Virtual Reality (VR) techniques in providing spatial and temporal SA to a human operator controlling multiple semi-autonomous agents. Our testbed will simulate a Long-Duration Exploration Mission (LDEM) inspection task using the ground-based Massachusetts Institute of Technology (MIT) Synchronized Position Hold, Engage, Reorient, Experimental Satellites (SPHERES) platform enhanced with NASA Jet Propulsion Laboratory (NASA-JPL) automatic scene reconstruction capability. A human study will be conducted with the human supervisor providing commands to the SPHERES using images rendered in a virtual environment. The results of this project will provide empirical evidence for revising portions of NASA-STD-3001 and the NASA Human Integration Design Handbook (HIDH) that guide interface design for effective SA and task performance. There is a need to expand current guidance on responsive displays, especially when integrated with VR technologies, to enable SA for relevant operational tasks.

The proposed project will integrate current NASA-JPL technology within a small robotic satellite testbed to examine bi-directional communication within the human-robot team to enable improved SA. We propose the following specific aims: (1) integrate and extend existing capabilities at JPL and MIT into a testbed for examining information communication between human-autonomy teams, and (2) evaluate SA, trust, and task performance within a ground-based study with selected communication modalities and information displays.

Research Impact/Earth Benefits:

Task Progress & Bibliography Information FY2019 
Task Progress: New project for FY2019.

Bibliography: Description: (Last Updated: 11/09/2023) 

None in FY 2019