The NASA Task Book
Project Title: HCAAM VNSCOR: Responsive Multimodal Human-Automation Communication for Augmenting Human Situation Awareness in Nominal and Off-Nominal Scenarios (80NSSC20K0409)
Fiscal Year: FY 2026 
Division: Human Research 
Research Discipline/Element:
HRP HFBP:Human Factors & Behavioral Performance (IRP Rev H)
Start Date: 12/04/2019  
End Date: 12/03/2024  
Task Last Updated: 02/27/2025 
Principal Investigator/Affiliation:   Stirling, Leia  Ph.D. / University of Michigan 
Address:  Industrial and Operations Engineering 
1205 Beal Avenue, G634 IOE Building 
Ann Arbor, MI 48109 
Email: leias@umich.edu 
Phone: 617-324-7410  
Congressional District: 12 
Web:  
Organization Type: UNIVERSITY 
Organization Name: University of Michigan 
Joint Agency:  
Comments: NOTE: PI moved to University of Michigan in fall 2019; previous affiliation was Massachusetts Institute of Technology 
Co-Investigator(s)
Affiliation: 
Blossom, Jonathon  NASA Jet Propulsion Laboratory  
Atkins, Ella  Ph.D. Virginia Tech 
Liu, Andrew  Ph.D. Massachusetts Institute of Technology 
Key Personnel Changes / Previous PI: N/A
Project Information: Grant/Contract No. 80NSSC20K0409 
Responsible Center: NASA JSC 
Grant Monitor: Whitmire, Alexandra  
Center Contact:  
alexandra.m.whitmire@nasa.gov 
Unique ID: 12797 
Solicitation / Funding Source: 2017-2018 HERO 80JSC017N0001-BPBA Topics in Biological, Physiological, and Behavioral Adaptations to Spaceflight. Appendix C 
Grant/Contract No.: 80NSSC20K0409 
Project Type: Ground 
Flight Program:  
No. of Post Docs:
No. of PhD Candidates:
No. of Master's Candidates:
No. of Bachelor's Candidates:
No. of PhD Degrees:
No. of Master's Degrees:
No. of Bachelor's Degrees:
Human Research Program Elements: (1) HFBP:Human Factors & Behavioral Performance (IRP Rev H)
Human Research Program Risks: None
Human Research Program Gaps: None
Flight Assignment/Project Notes: NOTE: End date changed to 12/03/2024 per NSSC information (Ed., 11/10/23).

Task Description: [Ed. note April 2020: Continuation of "HCAAM VNSCOR: Responsive Multimodal Human-Automation Communication for Augmenting Human Situation Awareness in Nominal and Off-Nominal Scenarios," grant 80NSSC19K0703, with the same Principal Investigator (PI) Leia Stirling, Ph.D., due to PI move to University of Michigan from Massachusetts Institute of Technology in fall 2019]

This task is part of the Human Capabilities Assessments for Autonomous Missions (HCAAM) Virtual NASA Specialized Center of Research (VNSCOR).

Crew extravehicular activity (EVA) is limited on spaceflight missions. Multiple small robotic spacecraft with varying levels of autonomy are needed to perform tasks that might otherwise have been completed by an astronaut (e.g., an exterior surface inspection or repair). Crews on long-duration exploration missions (LDEM) will have less access to ground support during task operations. As a result, they will need to process more information and communicate effectively with autonomous robots to ensure tasks are progressing safely and on schedule.

The objective of these studies is to investigate the use of augmented reality (AR) multimodal interface displays and communication pathways for improving human-robot communication, situation awareness (SA), trust, and task performance. This will lead to developing guidelines for designing human-robot system interactions that enable operational performance for crews on spaceflight missions.

The specific aims are to:

1) Develop a simulation testbed for examining communication between human-robot teams.

2) Develop a hardware testbed for examining communication between human-robot teams.

3) Evaluate human SA, trust, and task performance within short-duration and long-duration ground-based studies (simulation and/or hardware) by testing various interface communication modalities and information displays.

4) (Option) Perform additional studies for alternate parameters of interest that could be tested using the study testbeds. Additional parameters include the timing and persistence of information, gesture command mapping, varying the levels of robot automation, and evaluating the precision enabled by each command mode.

Research Impact/Earth Benefits: Augmented reality (AR) can support decision making across a variety of use cases, including but not limited to manufacturing, automated vehicles, military training, and entertainment. This research compares AR to other visual modalities for telerobotics applications (considering robotic control, anomaly inspection, and docking) and for astronaut assessment (sensorimotor, dynamic balance, and operational tasks). Results from these studies inform how AR is integrated for task-specific applications, as some tasks may benefit substantially from AR while others may raise additional design considerations.

Task Progress & Bibliography Information FY2026 
Task Progress: The objective of these studies was to investigate the use of augmented reality (AR) multimodal interface displays and communication pathways for improving human-robot communication, situation awareness (SA), trust, and task performance. The collection of studies performed supports the development of guidelines for designing human-robot system interactions that enable operational performance for crews on spaceflight missions. Several recommendations can be made based on these studies.

Our first two studies examined an inspection task in a simulation environment, where a small robotic agent was external to the space station being inspected. Both studies showed decreased detection accuracy when using AR for this task and increased detection accuracy during synchronous interactions. Factors contributing to the decreased AR performance include the reduced contrast and resolution of the presented image, as well as the reliability of hand interactions. For tasks that require higher resolution and contrast, like inspection, computer screens are recommended over AR implementations for current technology. For tasks that require short periods of vigilance, synchronous actions between human-robot teams are recommended; however, the optimal timing range for synchronous actions should be further studied.

We then adapted this simulation environment to examine agent movement as a communication modality. Autonomous agent motion plans can convey system intent to the human monitor, supporting the decision-making process. In this study, participants in a supervisory role viewed an egocentric view from the agent to determine the accuracy and appropriateness of the docking maneuver. Participants appropriately took over control, but the takeover location depended on path characteristics. Path characteristics that kept the dock in the primary egocentric field of view led to earlier correct takeovers. Larger deviations from a central point-to-point connection were less preferred. Based on these findings, when supervising a robotic agent with an egocentric view, path directionality and heading should support a central view of the target location, and paths should minimize deviations from the shortest-line path between waypoints. Path planning methods should incorporate human-aware constraints to support path selection from the available solutions.
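
A minimal sketch of how the two human-aware criteria above (keep the dock in a central egocentric view; minimize deviation from the point-to-point chord) could be encoded as a path-selection score. The weights, the 30-degree "central view" cone, and all function names are illustrative assumptions, not the study's implementation.

```python
import numpy as np

def fraction_dock_in_central_view(path, dock, half_angle_deg=30.0):
    """Fraction of path segments whose heading keeps the dock within
    a central cone of the egocentric field of view (assumed 30 deg)."""
    headings = np.diff(path, axis=0)          # segment direction vectors
    to_dock = dock - path[:-1]                # vectors toward the dock
    cos = np.einsum('ij,ij->i', headings, to_dock) / (
        np.linalg.norm(headings, axis=1) * np.linalg.norm(to_dock, axis=1))
    angles = np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))
    return np.mean(angles <= half_angle_deg)

def deviation_from_straight_line(path):
    """Mean perpendicular distance of path points from the start-goal chord."""
    start, goal = path[0], path[-1]
    chord = (goal - start) / np.linalg.norm(goal - start)
    rel = path - start
    return np.mean(np.linalg.norm(rel - np.outer(rel @ chord, chord), axis=1))

def human_aware_score(path, dock, w_view=1.0, w_dev=0.5):
    # Higher is better: reward central dock views, penalize deviation.
    return (w_view * fraction_dock_in_central_view(path, dock)
            - w_dev * deviation_from_straight_line(path))

# Usage: pick the candidate plan a human supervisor is most likely to
# judge correctly from the egocentric viewpoint.
# best = max(candidate_paths, key=lambda p: human_aware_score(p, dock_position))
```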

In our final set of studies, we further examined how AR impacts sensorimotor and balance-related tasks. In these studies, participants performed tasks using AR cues versus physical cues. Participants were able to complete all assessment tasks successfully in AR, and meaningful measures of performance were obtained from the embedded sensors. These data support that AR has the potential to be used to track performance, but the baseline thresholds of performance can differ from other assessment modalities. Based on these studies, we recommend that AR interfaces should: (1) feature wider targets (minimum 0.025 m) than required in physical interfaces, (2) use visual cues to support depth perception (e.g., proximity lighting), and (3) take care in selecting the interaction region for a given object. Improvements are needed in AR to make hand-gesture detection more permissive, and performance baselines should be determined within AR before it is used as an assessment platform.
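
As a small illustration of applying recommendation (1), the sketch below checks a set of AR button specifications against the 0.025 m minimum width reported above. The data structure and field names are hypothetical; only the 0.025 m floor comes from the findings.

```python
AR_MIN_TARGET_WIDTH_M = 0.025  # recommended minimum AR target width (from these studies)

def check_ar_targets(targets):
    """Flag AR interface targets narrower than the recommended minimum."""
    return [f"{name}: {w:.3f} m < {AR_MIN_TARGET_WIDTH_M} m"
            for name, w in targets.items() if w < AR_MIN_TARGET_WIDTH_M]

# Hypothetical interface spec: one compliant and one undersized target.
print(check_ar_targets({"confirm_button": 0.020, "abort_button": 0.030}))
# -> ['confirm_button: 0.020 m < 0.025 m']
```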

Bibliography: Description: (Last Updated: 05/15/2025) 

Articles in Peer-reviewed Journals Weiss H, Stirling L. "Augmented reality assessments to support human spaceflight performance evaluation." Aerosp Med Hum Perform. 2024 Nov;95(11):831-40. https://doi.org/10.3357/AMHP.6393.2024 ; PMID: 39711343 , Nov-2024
Articles in Peer-reviewed Journals Apodaca B, Stirling L, Atkins E. "Orbital dynamic effects on fuel use in sampling-based plans for proximity operations." J Spacecr Rockets. 2025 Mar 16;1-12. https://doi.org/10.2514/1.A35934 , Mar-2025
Project Title: HCAAM VNSCOR: Responsive Multimodal Human-Automation Communication for Augmenting Human Situation Awareness in Nominal and Off-Nominal Scenarios (80NSSC20K0409)
Fiscal Year: FY 2025 
Division: Human Research 
Research Discipline/Element:
HRP HFBP:Human Factors & Behavioral Performance (IRP Rev H)
Start Date: 12/04/2019  
End Date: 12/03/2024  
Task Last Updated: 09/16/2024 
Principal Investigator/Affiliation:   Stirling, Leia  Ph.D. / University of Michigan 
Address:  Industrial and Operations Engineering 
1205 Beal Avenue, G634 IOE Building 
Ann Arbor, MI 48109 
Email: leias@umich.edu 
Phone: 617-324-7410  
Congressional District: 12 
Web:  
Organization Type: UNIVERSITY 
Organization Name: University of Michigan 
Joint Agency:  
Comments: NOTE: PI moved to University of Michigan in fall 2019; previous affiliation was Massachusetts Institute of Technology 
Co-Investigator(s)
Affiliation: 
Blossom, Jonathon  NASA Jet Propulsion Laboratory  
Atkins, Ella  Ph.D. Virginia Tech 
Liu, Andrew  Ph.D. Massachusetts Institute of Technology 
Key Personnel Changes / Previous PI: N/A
Project Information: Grant/Contract No. 80NSSC20K0409 
Responsible Center: NASA JSC 
Grant Monitor: Whitmire, Alexandra  
Center Contact:  
alexandra.m.whitmire@nasa.gov 
Unique ID: 12797 
Solicitation / Funding Source: 2017-2018 HERO 80JSC017N0001-BPBA Topics in Biological, Physiological, and Behavioral Adaptations to Spaceflight. Appendix C 
Grant/Contract No.: 80NSSC20K0409 
Project Type: Ground 
Flight Program:  
TechPort: No 
No. of Post Docs:
No. of PhD Candidates:
No. of Master's Candidates:
No. of Bachelor's Candidates:
No. of PhD Degrees:
No. of Master's Degrees:
No. of Bachelor's Degrees:
Human Research Program Elements: (1) HFBP:Human Factors & Behavioral Performance (IRP Rev H)
Human Research Program Risks: None
Human Research Program Gaps: None
Flight Assignment/Project Notes: NOTE: End date changed to 12/03/2024 per NSSC information (Ed., 11/10/23).

Task Description: [Ed. note April 2020: Continuation of "HCAAM VNSCOR: Responsive Multimodal Human-Automation Communication for Augmenting Human Situation Awareness in Nominal and Off-Nominal Scenarios," grant 80NSSC19K0703, with the same Principal Investigator (PI) Leia Stirling, Ph.D., due to PI move to University of Michigan from Massachusetts Institute of Technology in fall 2019]

This task is part of the Human Capabilities Assessments for Autonomous Missions (HCAAM) Virtual NASA Specialized Center of Research (VNSCOR).

Crew extravehicular activity (EVA) is limited on spaceflight missions. Multiple small robotic spacecraft with varying levels of autonomy are needed to perform tasks that might otherwise have been completed by an astronaut (e.g., an exterior surface inspection or repair). Crews on long-duration exploration missions (LDEM) will have less access to ground support during task operations. As a result, they will need to process more information and communicate effectively with autonomous robots to ensure tasks are progressing safely and on schedule.

The objective of these studies is to investigate the use of augmented reality (AR) multimodal interface displays and communication pathways for improving human-robot communication, situation awareness (SA), trust, and task performance. This will lead to developing guidelines for designing human-robot system interactions that enable operational performance for crews on spaceflight missions.

The specific aims are to:

1) Develop a simulation testbed for examining communication between human-robot teams.

2) Develop a hardware testbed for examining communication between human-robot teams.

3) Evaluate human SA, trust, and task performance within short-duration and long-duration ground-based studies (simulation and/or hardware) by testing various interface communication modalities and information displays.

4) (Option) Perform additional studies for alternate parameters of interest that could be tested using the study testbeds. Additional parameters include the timing and persistence of information, gesture command mapping, varying the levels of robot automation, and evaluating the precision enabled by each command mode.

Research Impact/Earth Benefits: Augmented reality (AR) can support decision making across a variety of use cases, including but not limited to manufacturing, automated vehicles, military training, and entertainment. This research compares AR to other visual modalities for telerobotics applications, specifically considering robotic control and anomaly inspection. Results from this study can inform how AR is integrated for task-specific applications, as some tasks may benefit substantially from AR while others may raise additional design considerations.

Task Progress & Bibliography Information FY2025 
Task Progress: The objective of these studies is to investigate the use of augmented reality (AR) multimodal interface displays and communication pathways for improving human-robot communication, situation awareness (SA), trust, and task performance. This will lead to developing guidelines for designing human-robot system interactions that enable operational performance for crews on spaceflight missions.

During this reporting period, our team performed synergistic follow-on studies using the simulated space station environment developed in a previous reporting period. The first study examined how the motion plan trajectory shape influenced the decision to take over control during a docking maneuver from an egocentric viewpoint. Here, takeovers occurred when the human supervisor deemed that the agent was heading to an incorrect docking location or that a collision could occur. The second study examined how viewpoint (egocentric vs. exocentric) influenced the decision to take over control during a docking maneuver. We observed that both path trajectory and viewpoint influenced the decisions to take over control. We also observed effects on the decision to hand over control (i.e., when the human supervisor determined the agent was performing correctly and no longer needed to be monitored). Findings from these studies will be summarized for the NASA Human Integration Design Handbook (HIDH).
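
To make the takeover measure concrete, here is a minimal, hypothetical sketch of extracting a takeover event from a supervisor input log: the takeover time is the first instant the supervisor commands control, and the takeover location is the agent's position at that instant. The log fields are assumptions for illustration, not the study's data format.

```python
def takeover_event(log):
    """Return (time, position) of the first supervisor takeover, or None."""
    for entry in log:                      # entries assumed ordered by time
        if entry["supervisor_input"]:      # human commanded control
            return entry["t"], entry["agent_position"]
    return None                            # trial completed under autonomy

# Hypothetical two-sample log: supervisor takes over at t = 1.5 s.
log = [
    {"t": 0.0, "supervisor_input": False, "agent_position": (0.0, 0.0)},
    {"t": 1.5, "supervisor_input": True,  "agent_position": (2.3, 0.7)},
]
print(takeover_event(log))   # -> (1.5, (2.3, 0.7))
```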

During the reporting period, our team also completed analysis from studies to support gaps in the NASA HIDH for AR interfaces. We compared performance on sensorimotor and neurovestibular assessment tasks using physical objects and within AR. The sensorimotor task was a multi-directional Fitts’ Law target acquisition task, while the neurovestibular tasks included three clinical balance assessments (the Four Square Step Test (FSST), the Star Excursion Balance Test, and tandem walking (TW)) and three operational tasks (capsule ingress and egress, geology sampling, and obstacle avoidance). For the sensorimotor task, the touchscreen modality yielded improved performance over AR as measured by accuracy, precision, error rates, throughput, and movement time. AR designers can improve performance by selecting larger buttons when accuracy and efficiency are required and by embedding perception cues, such as depth and proximity cues, on button target surfaces. For the neurovestibular tasks, while participants were able to perform all assessment tasks successfully, their strategies differed when comparing AR performance to physical object use. Task completion times were longer when tasks were administered in AR. In the ingress/egress task and the obstacle avoidance task, head pitch increased while overall head motion and torso motion decreased. These adaptations align with maintaining head stability for viewing the holographic content. Participants were able to complete all tasks within AR, and meaningful measures of task performance and postural control were obtained, indicating that AR with embedded sensors may be a useful instrumentation solution for evaluating task performance and postural control. However, care should be taken when comparing performance within AR to other assessment modalities. Results from these studies address research gaps identified in NASA’s Human Research Roadmap, provide design guidance for AR in the HIDH, and support determining whether AR is a viable tool for evaluating astronauts’ vestibular performance throughout mission timelines.
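
For reference, the throughput metric named above is conventionally computed from the Fitts’ Law index of difficulty (Shannon formulation, as in ISO 9241-9 style evaluations). The sketch below shows the calculation; the example amplitudes, widths, and movement times are illustrative values, not study data.

```python
import math

def index_of_difficulty(amplitude_m, width_m):
    """ID = log2(A / W + 1), in bits (Shannon formulation)."""
    return math.log2(amplitude_m / width_m + 1)

def throughput(amplitude_m, width_m, movement_time_s):
    """Throughput in bits/s: ID divided by mean movement time."""
    return index_of_difficulty(amplitude_m, width_m) / movement_time_s

# E.g., 0.20 m reaches to 0.025 m-wide targets: a slower movement time
# (as observed in AR) yields lower throughput than a touchscreen.
print(throughput(0.20, 0.025, movement_time_s=0.9))  # faster, touchscreen-like
print(throughput(0.20, 0.025, movement_time_s=1.4))  # slower, AR-like
```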

Bibliography: Description: (Last Updated: 05/15/2025) 

Abstracts for Journals and Proceedings Apodaca B, Atkins E, Stirling L. "Effect of orbital dynamics on fuel use in sampling-based plans for proximity operations." 2023 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Detroit, MI, October 1-5, 2023. , Oct-2023

Articles in Peer-reviewed Journals Weiss H, Tang J, Williams C, Stirling L. "Performance on a target acquisition task differs between augmented reality and touch screen displays." Applied Ergonomics. 2024 Apr;116:104185. https://doi.org/10.1016/j.apergo.2023.104185 ; PubMed PMID: 38043456 , Apr-2024
Articles in Peer-reviewed Journals Allred AR, Weiss H, Clark TK, Stirling L. "An augmented reality hand-eye sensorimotor impairment assessment for spaceflight operations." Aerosp Med Hum Perform. 2024 Feb 1;95(2):69-78. https://doi.org/10.3357/AMHP.6313.2024 ; PubMed PMID: 38263106 , Feb-2024
Papers from Meeting Proceedings Larson H, Stirling L. "Autonomous Spacecraft Motion Plan Characteristics Influence Perceived Path Appropriateness." Human Factors and Ergonomics Society Annual Meeting, Phoenix, AZ, September 9-13, 2024. , Sep-2024

Papers from Meeting Proceedings Apodaca B, Atkins E, Stirling L. "RRTZ: A Path Planner for Zero Gravity." IEEE Aerospace Conference, Big Sky, MT, March 2-9, 2024. , Mar-2024

Project Title: HCAAM VNSCOR: Responsive Multimodal Human-Automation Communication for Augmenting Human Situation Awareness in Nominal and Off-Nominal Scenarios (80NSSC20K0409)
Fiscal Year: FY 2024 
Division: Human Research 
Research Discipline/Element:
HRP HFBP:Human Factors & Behavioral Performance (IRP Rev H)
Start Date: 12/04/2019  
End Date: 12/03/2024  
Task Last Updated: 09/26/2023 
Principal Investigator/Affiliation:   Stirling, Leia  Ph.D. / University of Michigan 
Address:  Industrial and Operations Engineering 
1205 Beal Avenue, G634 IOE Building 
Ann Arbor, MI 48109 
Email: leias@umich.edu 
Phone: 617-324-7410  
Congressional District: 12 
Web:  
Organization Type: UNIVERSITY 
Organization Name: University of Michigan 
Joint Agency:  
Comments: NOTE: PI moved to University of Michigan in fall 2019; previous affiliation was Massachusetts Institute of Technology 
Co-Investigator(s)
Affiliation: 
Blossom, Jonathon  NASA Jet Propulsion Laboratory  
Atkins, Ella  Ph.D. Virginia Tech 
Liu, Andrew  Ph.D. Massachusetts Institute of Technology 
Key Personnel Changes / Previous PI: N/A
Project Information: Grant/Contract No. 80NSSC20K0409 
Responsible Center: NASA JSC 
Grant Monitor: Whitmire, Alexandra  
Center Contact:  
alexandra.m.whitmire@nasa.gov 
Unique ID: 12797 
Solicitation / Funding Source: 2017-2018 HERO 80JSC017N0001-BPBA Topics in Biological, Physiological, and Behavioral Adaptations to Spaceflight. Appendix C 
Grant/Contract No.: 80NSSC20K0409 
Project Type: Ground 
Flight Program:  
TechPort: No 
No. of Post Docs:
No. of PhD Candidates:
No. of Master's Candidates:
No. of Bachelor's Candidates:
No. of PhD Degrees:
No. of Master's Degrees:
No. of Bachelor's Degrees:
Human Research Program Elements: (1) HFBP:Human Factors & Behavioral Performance (IRP Rev H)
Human Research Program Risks: None
Human Research Program Gaps: None
Flight Assignment/Project Notes: NOTE: End date changed to 12/03/2024 per NSSC information (Ed., 11/10/23).

Task Description: [Ed. note April 2020: Continuation of "HCAAM VNSCOR: Responsive Multimodal Human-Automation Communication for Augmenting Human Situation Awareness in Nominal and Off-Nominal Scenarios," grant 80NSSC19K0703, with the same Principal Investigator (PI) Leia Stirling, Ph.D., due to PI move to University of Michigan from Massachusetts Institute of Technology in fall 2019]

This task is part of the Human Capabilities Assessments for Autonomous Missions (HCAAM) Virtual NASA Specialized Center of Research (VNSCOR).

Crew extravehicular activity (EVA) is limited on spaceflight missions. Multiple small robotic spacecraft with varying levels of autonomy are needed to perform tasks that might otherwise have been completed by an astronaut (e.g., an exterior surface inspection or repair). Crews on long-duration exploration missions (LDEM) will have less access to ground support during task operations. As a result, they will need to process more information and communicate effectively with autonomous robots to ensure tasks are progressing safely and on schedule.

The objective of these studies is to investigate the use of augmented reality (AR) multimodal interface displays and communication pathways for improving human-robot communication, situation awareness (SA), trust, and task performance. This will lead to developing guidelines for designing human-robot system interactions that enable operational performance for crews on spaceflight missions.

The specific aims are to:

1) Develop a simulation testbed for examining communication between human-robot teams.

2) Develop a hardware testbed for examining communication between human-robot teams.

3) Evaluate human SA, trust, and task performance within short-duration and long-duration ground-based studies (simulation and/or hardware) by testing various interface communication modalities and information displays.

4) (Option) Perform additional studies for alternate parameters of interest that could be tested using the study testbeds. Additional parameters include the timing and persistence of information, gesture command mapping, varying the levels of robot automation, and evaluating the precision enabled by each command mode.

Research Impact/Earth Benefits: Augmented reality (AR) can support decision making across a variety of use cases, including but not limited to manufacturing, automated vehicles, military training, and entertainment. This research compares AR to other visual modalities for telerobotics applications, specifically considering robotic control and anomaly inspection. Results from this study can inform how AR is integrated for task-specific applications, as some tasks may benefit substantially from AR while others may raise additional design considerations.

Task Progress & Bibliography Information FY2024 
Task Progress: The objective of these studies is to investigate the use of augmented reality (AR) multimodal interface displays and communication pathways for improving human-robot communication, situation awareness (SA), trust, and task performance. This will lead to developing guidelines for designing human-robot system interactions that enable operational performance for crews on spaceflight missions.

During this reporting period, our team supported the NASA Human Exploration Research Analog (HERA) Campaign 6 missions in data collection and performed data analysis for the completed missions. This study used the Unity-based simulation testbed developed in an earlier year of the program to assess visual presentation modality across two tasks. The visual modalities considered were: (1) 2D camera images from fixed cameras placed externally on a simulated space station, (2) a 3D reconstruction shown on a 2D projection, and (3) a 3D reconstruction shown in an augmented reality environment using the HoloLens 2 platform. The reconstruction simulates using 2D camera images from the inspector to create the 3D object. The tasks evaluated were Synchronous and Asynchronous inspection tasks. In the Synchronous task, participants flew an inspector robot around the spacecraft to identify any surface anomalies that required closer inspection. The inspector could be flown in automatic mode along a predetermined path or in manual mode to move off the assigned path. The robot was controlled with a 3-DOF joystick. In the Asynchronous task, participants analyzed the imagery from a previous inspection flight to identify potential anomalies on the spacecraft exterior. In both tasks, detected anomalies were captured by taking a picture of the anomaly within the viewpoint. Initial HERA results indicate that detection accuracy was highest with the 2D display for the Synchronous inspection task. Additional interactive 3D viewpoints decreased detection accuracy and increased task completion time. Augmented reality provided no significant improvement to local navigation (i.e., minimum distance to the station or portion of time within two meters), suggesting that the technology did not enhance the perception level of situation awareness. Based on these findings, mission planning operations, when applicable, should include synchronous human-in-the-loop presence for telerobotic inspection of spacecraft. Additional details on the single-session study are available in Weiss et al. (2021) and on the initial HERA findings in Liu et al. (2022).
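
An illustrative sketch of the two control modes described above: automatic flight along a predetermined inspection path, with joystick input switching the inspector to manual mode, and anomaly capture by photographing the current viewpoint. Class and method names are assumptions for illustration, not the testbed's API.

```python
from enum import Enum, auto

class Mode(Enum):
    AUTOMATIC = auto()   # follow the predetermined inspection path
    MANUAL = auto()      # 3-DOF joystick input moves off the assigned path

class InspectorRobot:
    def __init__(self, path):
        self.path = path            # list of waypoints for automatic mode
        self.mode = Mode.AUTOMATIC
        self.captures = []          # pictures taken of detected anomalies

    def step(self, joystick_input=None):
        """One control tick: joystick input overrides the automatic path."""
        if joystick_input is not None:
            self.mode = Mode.MANUAL
            return joystick_input                    # velocity command
        self.mode = Mode.AUTOMATIC
        return self.path.pop(0) if self.path else None  # next waypoint

    def capture_anomaly(self, viewpoint_image):
        # Detected anomalies are recorded by photographing the viewpoint.
        self.captures.append(viewpoint_image)
```

At this toy level, switching to manual mode mirrors how participants moved off the assigned path for a closer look before capturing an anomaly.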

During the reporting period, our team also performed studies to support gaps in the NASA Human Integration Design Handbook (HIDH) for augmented reality (AR) interfaces. We compared performance on sensorimotor and neurovestibular assessment tasks using physical objects and within AR. The sensorimotor task was a multi-directional Fitts’ Law target acquisition task, while the neurovestibular tasks included three clinical balance assessments (the Four Square Step Test (FSST), the Star Excursion Balance Test, and tandem walking (TW)) and three operational tasks (capsule ingress and egress, geology sampling, and obstacle avoidance). For the sensorimotor task, the touchscreen modality yielded improved performance over AR as measured by accuracy, precision, error rates, throughput, and movement time. AR designers can improve performance by selecting larger buttons when accuracy and efficiency are required and by embedding perception cues, such as depth and proximity cues, on button target surfaces. For the neurovestibular tasks, while participants were able to perform all assessment tasks successfully, their strategies differed when comparing AR performance to physical object use. Task completion times were longer when tasks were administered in AR. Preliminary results yielded higher step heights in the TW, FSST, and capsule egress tasks, as well as higher foot placement variability in the FSST. To maintain head stability for viewing the holographic content within the TW task, participants restricted their torso movement. While performing the capsule egress task, the AR and physical conditions differed significantly in the downward pitch of the head, but not the torso. Participants were able to complete all tasks within AR, and meaningful measures of task performance and postural control were obtained, indicating that AR with embedded sensors may be a useful instrumentation solution for evaluating task performance and postural control. However, care should be taken when comparing performance within AR to other assessment modalities. Results from these studies address research gaps identified in NASA’s Human Research Roadmap, provide design guidance for AR in the HIDH, and support determining whether AR is a viable tool for evaluating astronauts’ vestibular performance throughout mission timelines.

Bibliography: Description: (Last Updated: 05/15/2025) 

Articles in Peer-reviewed Journals Weiss H, Liu A, Byon A, Blossom J, Stirling L. "Comparison of display modality and human-in-the-loop presence for on-orbit inspection of spacecraft." Hum Factors. 2023 Sep;65(6):1059-73. https://doi.org/10.1177/00187208211042782 ; PubMed PMID: 34558994 , Sep-2023
Articles in Peer-reviewed Journals Larson H, Stirling L. "Examination of human spatial reasoning capability and simulated autonomous rendezvous and docking monitoring performance." Proc Hum Factors Ergon Soc Annu Meet. 2023 Oct 25;21695067231192262. https://doi.org/10.1177/21695067231192262 , Oct-2023
Papers from Meeting Proceedings Weiss H, Stirling L. "Usability Evaluation of an Augmented Reality Sensorimotor Assessment Tool for Astronauts." IEEE Aerospace Conference, Big Sky, MT, March 4-11, 2023. , Mar-2023

Papers from Meeting Proceedings Tang J, Weiss H, Stirling L. "A Comparison of Sensorimotor Assessment Accuracy between Augmented Reality and Touchscreen Environments." 67th International Annual Meeting of the Human Factors and Ergonomics Society, Washington, DC, October 23-27, 2023. , Oct-2023

Papers from Meeting Proceedings Larson H, Stirling L. "Examination of Human Spatial Reasoning Capability and Simulated Autonomous Rendezvous and Docking Monitoring Performance." 67th International Annual Meeting of the Human Factors and Ergonomics Society, Washington, DC, October 23-27, 2023. , Oct-2023

Project Title: HCAAM VNSCOR: Responsive Multimodal Human-Automation Communication for Augmenting Human Situation Awareness in Nominal and Off-Nominal Scenarios (80NSSC20K0409)
Fiscal Year: FY 2023 
Division: Human Research 
Research Discipline/Element:
HRP HFBP:Human Factors & Behavioral Performance (IRP Rev H)
Start Date: 12/04/2019  
End Date: 12/03/2023  
Task Last Updated: 09/29/2022 
Principal Investigator/Affiliation:   Stirling, Leia  Ph.D. / University of Michigan 
Address:  Industrial and Operations Engineering 
1205 Beal Avenue, G634 IOE Building 
Ann Arbor, MI 48109 
Email: leias@umich.edu 
Phone: 617-324-7410  
Congressional District: 12 
Web:  
Organization Type: UNIVERSITY 
Organization Name: University of Michigan 
Joint Agency:  
Comments: NOTE: PI moved to University of Michigan in fall 2019; previous affiliation was Massachusetts Institute of Technology 
Co-Investigator(s)
Affiliation: 
Blossom, Jonathon  NASA Jet Propulsion Laboratory  
Atkins, Ella  Ph.D. University of Michigan (transitioned to Virginia Tech) 
Liu, Andrew  Ph.D. Massachusetts Institute of Technology 
Key Personnel Changes / Previous PI: Co-I Ella Atkins moved from University of Michigan to Virginia Tech in August 2022. This change in location will not affect the proposed effort, and collaboration continues with Prof. Atkins co-advising a graduate student with Prof. Stirling.
Project Information: Grant/Contract No. 80NSSC20K0409 
Responsible Center: NASA JSC 
Grant Monitor: Whitmire, Alexandra  
Center Contact:  
alexandra.m.whitmire@nasa.gov 
Unique ID: 12797 
Solicitation / Funding Source: 2017-2018 HERO 80JSC017N0001-BPBA Topics in Biological, Physiological, and Behavioral Adaptations to Spaceflight. Appendix C 
Grant/Contract No.: 80NSSC20K0409 
Project Type: Ground 
Flight Program:  
TechPort: No 
No. of Post Docs:
No. of PhD Candidates:
No. of Master's Candidates:
No. of Bachelor's Candidates:
No. of PhD Degrees:
No. of Master's Degrees:
No. of Bachelor's Degrees:
Human Research Program Elements: (1) HFBP:Human Factors & Behavioral Performance (IRP Rev H)
Human Research Program Risks: None
Human Research Program Gaps: None
Task Description: [Ed. note April 2020: Continuation of "HCAAM VNSCOR: Responsive Multimodal Human-Automation Communication for Augmenting Human Situation Awareness in Nominal and Off-Nominal Scenarios," grant 80NSSC19K0703, with the same Principal Investigator (PI) Leia Stirling, Ph.D., due to PI move to University of Michigan from Massachusetts Institute of Technology in fall 2019]

This task is part of the Human Capabilities Assessments for Autonomous Missions (HCAAM) Virtual NASA Specialized Center of Research (VNSCOR).

Crew extravehicular activity (EVA) is limited on spaceflight missions. Multiple small robotic spacecraft with varying levels of autonomy are needed to perform tasks that might otherwise have been completed by an astronaut (e.g., an exterior surface inspection or repair). Crews on long-duration exploration missions (LDEM) will have less access to ground support during task operations. As a result, they will need to process more information and communicate effectively with autonomous robots to ensure tasks are progressing safely and on schedule.

The objective of these studies is to investigate the use of augmented reality (AR) multimodal interface displays and communication pathways for improving human-robot communication, situation awareness (SA), trust, and task performance. This will lead to developing guidelines for designing human-robot system interactions that enable operational performance for crews on spaceflight missions.

The specific aims are to:

1) Develop a simulation testbed for examining communication between human-robot teams.

2) Develop a hardware testbed for examining communication between human-robot teams.

3) Evaluate human SA, trust, and task performance within short-duration and long-duration ground-based studies (simulation and/or hardware) by testing various interface communication modalities and information displays.

4) (Option) Perform additional studies for alternate parameters of interest that could be tested using the study testbeds. Additional parameters include the timing and persistence of information, gesture command mapping, varying the levels of robot automation, and evaluating the precision enabled by each command mode.

Research Impact/Earth Benefits: Augmented reality (AR) can support decision making across a variety of use cases, including but not limited to manufacturing, automated vehicles, military training, and entertainment. This research compares AR to other visual modalities for telerobotics applications, specifically considering robotic control and anomaly inspection. Results from this study can inform how AR is integrated for task-specific applications, as some tasks may benefit substantially from AR while others may raise additional design considerations.

Task Progress & Bibliography Information FY2023 
Task Progress: The objective of these studies is to investigate the use of augmented reality (AR) multimodal interface displays and communication pathways for improving human-robot communication, situation awareness (SA), trust, and task performance. This will lead to developing guidelines for designing human-robot system interactions that enable operational performance for crews on spaceflight missions.

During this reporting period, our team supported the NASA Human Exploration Research Analog (HERA) Campaign 6 missions in data collection and performed data analysis for the completed missions. This study used the Unity-based simulation testbed to assess visual presentation modality across two tasks. The visual modalities considered were: (1) 2D camera images from fixed cameras placed externally on a simulated space station, (2) a 3D reconstruction shown on a 2D projection, and (3) a 3D reconstruction shown in an augmented reality environment using the HoloLens 2 platform. The reconstruction simulates using 2D camera images from the inspector to create the 3D object. The tasks evaluated are Synchronous and Asynchronous inspection tasks. In the Synchronous task, participants fly an inspector robot around the spacecraft to identify any surface anomalies that require closer inspection. The inspector can be flown in automatic mode along a predetermined path or in manual mode to move off the assigned path. The robot is controlled with a 3-DOF joystick. In the Asynchronous task, participants analyze the imagery from a previous inspection flight to identify potential anomalies on the spacecraft exterior. In both tasks, detected anomalies are captured by taking a picture of the anomaly within the viewpoint. Initial HERA results indicate that detection accuracy was highest with the 2D display for the Synchronous inspection task. Additional interactive 3D viewpoints decreased detection accuracy and increased task completion time. Augmented reality provided no significant improvement to local navigation (i.e., minimum distance to the station or portion of time within two meters), suggesting that the technology did not enhance the perception level of situation awareness. Based on these findings, mission planning operations, when applicable, should include synchronous human-in-the-loop presence for telerobotic inspection of spacecraft. Additional details on the single-session study are available in Weiss et al. (2021) and on the initial HERA findings in Liu et al. (2022).

During the reporting period, our team also continued to develop a hardware-based implementation of the anomaly-detection task. We previously built a scaled physical mock-up of a space station and configured a quadcopter for a hardware-based ground evaluation. During this reporting period, pilot testing was performed to examine the hardware testbed and communication between human-robot teams. Additional details on these tests are reported in Weiss et al. (2022). Localization of the quadcopter in the automated mode is performed with motion capture, while the manual mode is flown using a handheld controller. Effort continues on the automation algorithms to support coverage mapping and real-time object reconstruction using this testbed.
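
A hedged sketch of the automated-mode control loop implied above: motion capture supplies the quadcopter's externally tracked pose, and a simple proportional-derivative law drives it toward the next inspection waypoint. The gains and function names are illustrative assumptions, not the testbed's flight software.

```python
import numpy as np

KP, KD = 1.2, 0.6   # illustrative PD gains

def pd_velocity_command(pose_mocap, vel_mocap, waypoint):
    """Velocity setpoint toward a waypoint from mocap-tracked pose.

    pose_mocap, vel_mocap: position and velocity from the motion capture
    system; waypoint: next inspection point in the same frame.
    """
    error = waypoint - pose_mocap
    return KP * error - KD * vel_mocap   # damped approach to the waypoint

# One illustrative step: mocap reports the pose, the controller outputs a
# velocity setpoint for the flight controller to track.
pose = np.array([0.5, 0.0, 1.0])
vel = np.array([0.0, 0.1, 0.0])
cmd = pd_velocity_command(pose, vel, waypoint=np.array([1.0, 0.0, 1.2]))
```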

Finally, our team began comparing augmented reality task performance with physical task performance to provide new knowledge for the NASA Human Integration Design Handbook (HIDH). The selected tasks align with sensorimotor and neurovestibular assessment and include a Fitts’ Law task examining precision and accuracy, as well as dynamic and operational balance assessments that make use of the virtual environment. This reporting period included software development and initial user studies. Human studies testing has begun and will continue into the next reporting period.

Bibliography: Description: (Last Updated: 05/15/2025) 

Abstracts for Journals and Proceedings Liu AM, Weiss H, Stirling L. "Effects of visual display modality on simulated on-orbit inspection performance: Initial results from Human Exploration Research Analog Campaign 6." 66th International Annual Meeting of the Human Factors and Ergonomics Society, Atlanta, GA, October 10-14, 2022. , Oct-2022

Abstracts for Journals and Proceedings Weiss H, Stirling L. "Methods for evaluating neurovestibular and sensorimotor performance using augmented reality and wearable sensors." 13th International Conference on Applied Human Factors and Ergonomics (AHFE 2022), New York, NY, July 24-28, 2022. , Jul-2022

Abstracts for Journals and Proceedings Weiss H, Patel A, Liu A, Stirling L. "Evaluation of human-in-the-loop presence on an anomaly inspection task using a quadcopter." 2022 NASA Human Research Program Investigators’ Workshop, Virtual, February 7-10, 2022. , Feb-2022

Papers from Meeting Proceedings Weiss H, Patel A, Romano M, Apodaca B, Kuevor P, Atkins E, Stirling L. "Methods for evaluation of human-in-the-loop inspection of a Space Station mockup using a quadcopter." 2022 IEEE Aerospace Conference (AERO), Big Sky, MT, March 5-12, 2022. https://doi.org/10.1109/AERO53065.2022.9843466 , Mar-2022

Project Title: HCAAM VNSCOR: Responsive Multimodal Human-Automation Communication for Augmenting Human Situation Awareness in Nominal and Off-Nominal Scenarios (80NSSC20K0409)
Fiscal Year: FY 2022 
Division: Human Research 
Research Discipline/Element:
HRP HFBP:Human Factors & Behavioral Performance (IRP Rev H)
Start Date: 12/04/2019  
End Date: 12/03/2023  
Task Last Updated: 09/17/2021 
Principal Investigator/Affiliation:   Stirling, Leia  Ph.D. / University of Michigan 
Address:  Industrial and Operations Engineering 
1205 Beal Avenue, G634 IOE Building 
Ann Arbor, MI 48109 
Email: leias@umich.edu 
Phone: 617-324-7410  
Congressional District: 12 
Web:  
Organization Type: UNIVERSITY 
Organization Name: University of Michigan 
Joint Agency:  
Comments: NOTE: PI moved to University of Michigan in fall 2019; previous affiliation was Massachusetts Institute of Technology 
Co-Investigator(s)
Affiliation: 
Blossom, Jonathon  NASA Jet Propulsion Laboratory  
Atkins, Ella  Ph.D. University of Michigan 
Liu, Andrew  Ph.D. Massachusetts Institute of Technology 
Project Information: Grant/Contract No. 80NSSC20K0409 
Responsible Center: NASA JSC 
Grant Monitor: Whitmire, Alexandra  
Center Contact:  
alexandra.m.whitmire@nasa.gov 
Unique ID: 12797 
Solicitation / Funding Source: 2017-2018 HERO 80JSC017N0001-BPBA Topics in Biological, Physiological, and Behavioral Adaptations to Spaceflight. Appendix C 
Grant/Contract No.: 80NSSC20K0409 
Project Type: Ground 
Flight Program:  
TechPort: No 
No. of Post Docs:
No. of PhD Candidates:
No. of Master's Candidates:
No. of Bachelor's Candidates:
No. of PhD Degrees:
No. of Master's Degrees:
No. of Bachelor's Degrees:
Human Research Program Elements: (1) HFBP:Human Factors & Behavioral Performance (IRP Rev H)
Human Research Program Risks: None
Human Research Program Gaps: None
Task Description: [Ed. note April 2020: Continuation of "HCAAM VNSCOR: Responsive Multimodal Human-Automation Communication for Augmenting Human Situation Awareness in Nominal and Off-Nominal Scenarios," grant 80NSSC19K0703, with the same Principal Investigator (PI) Leia Stirling, Ph.D., due to PI move to University of Michigan from Massachusetts Institute of Technology in fall 2019]

This task is part of the Human Capabilities Assessments for Autonomous Missions (HCAAM) Virtual NASA Specialized Center of Research (VNSCOR).

Crew extravehicular activity (EVA) is limited on spaceflight missions. Multiple small robotic spacecraft with varying levels of autonomy are needed to perform tasks that might otherwise have been completed by an astronaut (e.g., an exterior surface inspection or repair). Crews on long-duration exploration missions (LDEM) will have less access to ground support during task operations. As a result, they will need to process more information and communicate effectively with autonomous robots to ensure tasks are progressing safely and on schedule.

The objective of these studies is to investigate the use of augmented reality (AR) multimodal interface displays and communication pathways for improving human-robot communication, situation awareness (SA), trust, and task performance. This will lead to developing guidelines for designing human-robot system interactions that enable operational performance for crews on spaceflight missions.

The specific aims are to:

1) Develop a simulation testbed for examining communication between human-robot teams.

2) Develop a hardware testbed for examining communication between human-robot teams.

3) Evaluate human SA, trust, and task performance within short-duration and long-duration ground-based studies (simulation and/or hardware) by testing various interface communication modalities and information displays.

4) (Option) Perform additional studies for alternate parameters of interest that could be tested using the study testbeds. Additional parameters include the timing and persistence of information, gesture command mapping, varying the levels of robot automation, and evaluating the precision enabled by each command mode.

Research Impact/Earth Benefits: Augmented reality (AR) can support decision making across a variety of use cases, including but not limited to manufacturing, automated vehicles, military training, and entertainment. This research compares AR to other visual modalities for telerobotics applications, specifically considering robotic control and anomaly inspection. Results from this study can inform how AR is integrated for task-specific applications, as some tasks may benefit substantially from AR while others may raise additional design considerations.

Task Progress & Bibliography Information FY2022 
Task Progress: The objective of these studies is to investigate the use of augmented reality (AR) multimodal interface displays and communication pathways for improving human-robot communication, situation awareness (SA), trust, and task performance. This will lead to developing guidelines for designing human-robot system interactions that enable operational performance for crews on spaceflight missions.

During this reporting period, our team used the Unity-based simulation testbed to assess visual presentation modality across two tasks. The visual modalities considered were: (1) 2D camera images from fixed cameras placed externally on a simulated space station, (2) a 3D reconstruction shown on a 2D projection, and (3) a 3D reconstruction shown in an augmented reality environment using the HoloLens 2 platform. The reconstruction simulates using 2D camera images from the inspector to create the 3D object. The tasks evaluated are Synchronous and Asynchronous inspection tasks. In the Synchronous task, participants fly an inspector robot around the spacecraft to identify any surface anomalies that require closer inspection. The inspector can be flown in automatic mode along a predetermined path or in manual mode to move off the assigned path. The robot is controlled with a 3-DOF joystick. In the Asynchronous task, participants analyze the imagery from a previous inspection flight to identify potential anomalies on the spacecraft exterior. In both tasks, detected anomalies are captured by taking a picture of the anomaly within the viewpoint. We found that detection accuracy was highest with the 2D display for the Synchronous inspection task. Additional interactive 3D viewpoints decreased detection accuracy and increased task completion time. Augmented reality provided no significant improvement to local navigation (i.e., minimum distance to the station or portion of time within two meters), suggesting that the technology did not enhance the perception level of situation awareness. Based on these findings, mission planning operations, when applicable, should include synchronous human-in-the-loop presence for telerobotic inspection of spacecraft. Additional details on this study are available in our Human Factors journal article, Weiss et al. (2021). [Ed. Note: see Bibliography].

During the reporting period, our team also designed and built a scaled physical mock-up of a space station and configured a quadcopter for a hardware-based ground evaluation. Localization of the quadcopter in the automated mode is performed with motion capture, while the manual mode is flown using a handheld controller. Initial pilot testing with the integrated hardware system is ongoing and will continue into the next reporting period.

Finally, our team prepared for NASA Human Exploration Research Analog (HERA) testing, which will run the same simulated study with the Unity-based environment but will also explore the effects of isolation and mission duration on the use of the visual modalities to support robot-assisted inspections. As the reporting period concludes, initial training of HERA study participants is beginning. Data collection will continue into the next reporting period.

Bibliography: Description: (Last Updated: 05/15/2025) 

Abstracts for Journals and Proceedings Liu A, Weiss H, Todd J, Stirling L. "Evaluation of an augmented reality interface to control teleoperated satellites." 91st Aerospace Medical Association Meeting, Denver, CO, August 29-September 2, 2021. , Aug-2021

Abstracts for Journals and Proceedings Weiss H, Liu A, Stirling L. "Opportunities for augmented reality and wearables to support humans in space." 2021 SpaceCHI: Human-Computer Interaction for Space Exploration Workshop, Association for Computing Machinery CHI Conference, Virtual, May 14, 2021. , May-2021

Abstracts for Journals and Proceedings Weiss H, Liu A, Stirling L. "Comparison of display modality for telerobotic on-orbit inspection of a spacecraft." 2021 NASA Human Research Program Investigators’ Workshop, Virtual, February 1-4, 2021. , Feb-2021

Articles in Peer-reviewed Journals Weiss H, Liu A, Byon A, Blossom J, Stirling L. "Comparison of display modality and human-in-the-loop presence for on-orbit inspection of spacecraft." Hum Factors. 2021 Sep 24;187208211042782. Online ahead of print. https://doi.org/10.1177/00187208211042782 ; PMID: 34558994 , Sep-2021
Project Title: HCAAM VNSCOR: Responsive Multimodal Human-Automation Communication for Augmenting Human Situation Awareness in Nominal and Off-Nominal Scenarios (80NSSC20K0409)
Fiscal Year: FY 2021 
Division: Human Research 
Research Discipline/Element:
HRP HFBP:Human Factors & Behavioral Performance (IRP Rev H)
Start Date: 12/04/2019  
End Date: 12/03/2023  
Task Last Updated: 10/21/2020 
Principal Investigator/Affiliation:   Stirling, Leia  Ph.D. / University of Michigan 
Address:  Industrial and Operations Engineering 
1205 Beal Avenue, G634 IOE Building 
Ann Arbor, MI 48109 
Email: leias@umich.edu 
Phone: 617-324-7410  
Congressional District: 12 
Web:  
Organization Type: UNIVERSITY 
Organization Name: University of Michigan 
Joint Agency:  
Comments: NOTE: PI moved to University of Michigan in fall 2019; previous affiliation was Massachusetts Institute of Technology 
Co-Investigator(s)
Affiliation: 
Blossom, Jonathon  NASA Jet Propulsion Laboratory  
Atkins, Ella  Ph.D. University of Michigan 
Liu, Andrew  Ph.D. Massachusetts Institute of Technology 
Project Information: Grant/Contract No. 80NSSC20K0409 
Responsible Center: NASA JSC 
Grant Monitor: Whitmire, Alexandra  
Center Contact:  
alexandra.m.whitmire@nasa.gov 
Unique ID: 12797 
Solicitation / Funding Source: 2017-2018 HERO 80JSC017N0001-BPBA Topics in Biological, Physiological, and Behavioral Adaptations to Spaceflight. Appendix C 
Grant/Contract No.: 80NSSC20K0409 
Project Type: Ground 
Flight Program:  
TechPort: No 
No. of Post Docs:
No. of PhD Candidates:
No. of Master's Candidates:
No. of Bachelor's Candidates:
No. of PhD Degrees:
No. of Master's Degrees:
No. of Bachelor's Degrees:
Human Research Program Elements: (1) HFBP:Human Factors & Behavioral Performance (IRP Rev H)
Human Research Program Risks: None
Human Research Program Gaps: None
Task Description: [Ed. note April 2020: Continuation of "HCAAM VNSCOR: Responsive Multimodal Human-Automation Communication for Augmenting Human Situation Awareness in Nominal and Off-Nominal Scenarios," grant 80NSSC19K0703, with the same Principal Investigator (PI) Leia Stirling, Ph.D., due to PI move to University of Michigan from Massachusetts Institute of Technology in fall 2019]

This task is part of the Human Capabilities Assessments for Autonomous Missions (HCAAM) Virtual NASA Specialized Center of Research (VNSCOR).

Crew extravehicular activity (EVA) is limited on spaceflight missions. Multiple small robotic spacecraft with varying levels of autonomy are needed to perform tasks that might otherwise have been completed by an astronaut (e.g., an exterior surface inspection or repair). Crews on long-duration exploration missions (LDEM) will have less access to ground support during task operations. As a result, they will need to process more information and communicate effectively with autonomous robots to ensure tasks are progressing safely and on schedule.

The objective of these studies is to investigate the use of augmented reality (AR) multimodal interface displays and communication pathways for improving human-robot communication, situation awareness (SA), trust, and task performance. This will lead to developing guidelines for designing human-robot system interactions that enable operational performance for crews on spaceflight missions.

The specific aims are to:

1) Develop a simulation testbed for examining communication between human-robot teams.

2) Develop a hardware testbed for examining communication between human-robot teams.

3) Evaluate human SA, trust, and task performance within short-duration and long-duration ground-based studies (simulation and/or hardware) by testing various interface communication modalities and information displays.

4) (Option) Perform additional studies for alternate parameters of interest that could be tested using the study testbeds. Additional parameters include the timing and persistence of information, gesture command mapping, varying the levels of robot automation, and evaluating the precision enabled by each command mode.

Research Impact/Earth Benefits: Augmented reality (AR) can support decision making across a variety of use cases, including but not limited to manufacturing, automated vehicles, military training, and entertainment. This research compares AR to other visual modalities for telerobotics applications, specifically considering robotic control and anomaly inspection. Results from this study can inform how AR is integrated for task-specific applications, as some tasks may benefit substantially from AR while others may raise additional design considerations.

Task Progress & Bibliography Information FY2021 
Task Progress: The objective of these studies is to investigate the use of augmented reality (AR) multimodal interface displays and communication pathways for improving human-robot communication, situation awareness (SA), trust, and task performance. This will lead to developing guidelines for designing human-robot system interactions that enable operational performance for crews on spaceflight missions.

During this reporting period, our team developed a Unity-based simulation testbed to assess visual presentation modality across two tasks. The visual modalities considered are: (1) 2D camera images from fixed cameras placed externally on a simulated space station, (2) a 3D reconstruction shown on a 2D projection, and (3) a 3D reconstruction shown in an augmented reality environment using the HoloLens 2 platform. The reconstruction simulates using 2D camera images from the inspector to create the 3D object. The tasks evaluated are the Telerobotic Control and Inspection Task (TCIT) and the Analysis Task (AT). In the TCIT, participants fly an inspector robot around the spacecraft to identify any surface anomalies that require closer inspection. The inspector can be flown in automatic mode along a predetermined path or in manual mode to move off the assigned path. The robot is controlled with a 3-DOF (degree-of-freedom) joystick. In the AT, participants analyze the imagery from a previous inspection flight to identify potential anomalies on the spacecraft exterior. In both tasks, detected anomalies are captured by taking a picture of the anomaly within the viewpoint. The 2D and 3D interfaces are shown on a 24" computer monitor, while the AR image is presented on the Microsoft HoloLens. A second 24" computer monitor is used to display additional task information.

For the development of the simulation environment, a mock-up of a space station and an inspector satellite were created. Anomaly sets of equivalent difficulty were designed to enable randomization of anomaly placement across trials. An automated mode was designed and implemented that enables the inspector to track the edges of the space station. The joystick was configured to control the inspector, the picture-taking tools, and mode switching.
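To make the trial-randomization step concrete, here is a minimal Python sketch of how pre-designed, difficulty-balanced anomaly sets might be shuffled and placed at random exterior locations across trials. The set contents, location names, and function shown are illustrative assumptions, not the study's actual design.

```python
import random

# Hypothetical pre-designed anomaly sets, each balanced for equivalent difficulty.
ANOMALY_SETS = {
    "set_A": ["dent-01", "scorch-02"],
    "set_B": ["crack-03", "stain-04"],
    "set_C": ["pit-05", "tear-06"],
}

# Hypothetical candidate locations on the station exterior.
LOCATIONS = ["port_truss", "starboard_truss", "radiator", "module_hull"]

def randomize_trials(rng):
    """Shuffle set order across trials and place each anomaly at a random location."""
    set_order = list(ANOMALY_SETS)
    rng.shuffle(set_order)  # counterbalance which set appears in which trial
    trials = []
    for name in set_order:
        placements = rng.sample(LOCATIONS, len(ANOMALY_SETS[name]))
        trials.append({"set": name,
                       "placements": list(zip(ANOMALY_SETS[name], placements))})
    return trials

rng = random.Random(7)  # fixed seed so a session's trial order is reproducible
for trial in randomize_trials(rng):
    print(trial)
```

Because the sets are built to be equally difficult, shuffling which set appears in which trial and where its anomalies land keeps trials comparable while preventing participants from memorizing placements.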

Due to the pandemic, user tests planned for the summer months could not be performed. However, the simulation environment was set up in the homes of three research team members to perform initial user testing within the team. A pipeline for data analysis was also developed and tested. We recently received approval to restart human studies and are preparing to begin the study evaluating visual presentation modality. We were also able to virtually support the integration of the hardware and software into the Human Exploration Research Analog (HERA) for the upcoming campaign.

Bibliography (Last Updated: 05/15/2025)

Abstracts for Journals and Proceedings: Todd JE, Liu AM, Stirling LA. "Investigation of Augmented Reality in Enabling Telerobotic On-Orbit Inspection of Spacecraft." Poster presented at the 2020 NASA Human Research Program Investigators' Workshop, Galveston, TX, January 27-30, 2020.

Papers from Meeting Proceedings: Todd JE, Liu AM, Stirling LA. "Investigation of augmented reality in enabling telerobotic on-orbit inspection of spacecraft." 50th International Conference on Environmental Systems (ICES 2020), Lisbon, Portugal, July 12-16, 2020. Paper ICES-2020-538.

Project Title:  HCAAM VNSCOR: Responsive Multimodal Human-Automation Communication for Augmenting Human Situation Awareness in Nominal and Off-Nominal Scenarios (80NSSC20K0409) Reduce
Fiscal Year: FY 2020 
Division: Human Research 
Research Discipline/Element:
HRP HFBP:Human Factors & Behavioral Performance (IRP Rev H)
Start Date: 12/04/2019  
End Date: 12/03/2023  
Task Last Updated: 04/15/2020 
Download Task Book report in PDF 
Principal Investigator/Affiliation:   Stirling, Leia  Ph.D. / University of Michigan 
Address:  Industrial and Operations Engineering 
1205 Beal Avenue, G634 IOE Building 
Ann Arbor, MI 48109 
Email: leias@umich.edu 
Phone: 617-324-7410  
Congressional District: 12 
Web:  
Organization Type: UNIVERSITY 
Organization Name: University of Michigan 
Joint Agency:  
Comments: NOTE: PI moved to University of Michigan in fall 2019; previous affiliation was Massachusetts Institute of Technology 
Co-Investigator(s)
Affiliation: 
Blossom, Jonathon  NASA Jet Propulsion Laboratory  
Atkins, Ella  Ph.D. University of Michigan 
Project Information: Grant/Contract No. 80NSSC20K0409 
Responsible Center: NASA JSC 
Grant Monitor: Williams, Thomas  
Center Contact: 281-483-8773 
thomas.j.will1@nasa.gov 
Unique ID: 12797 
Solicitation / Funding Source: 2017-2018 HERO 80JSC017N0001-BPBA Topics in Biological, Physiological, and Behavioral Adaptations to Spaceflight. Appendix C 
Grant/Contract No.: 80NSSC20K0409 
Project Type: Ground 
Flight Program:  
TechPort: No 
No. of Post Docs:  
No. of PhD Candidates:  
No. of Master's Candidates:  
No. of Bachelor's Candidates:  
No. of PhD Degrees:  
No. of Master's Degrees:  
No. of Bachelor's Degrees:  
Human Research Program Elements: (1) HFBP:Human Factors & Behavioral Performance (IRP Rev H)
Human Research Program Risks: None
Human Research Program Gaps: None
Task Description: [Ed. note April 2020: Continuation of "HCAAM VNSCOR: Responsive Multimodal Human-Automation Communication for Augmenting Human Situation Awareness in Nominal and Off-Nominal Scenarios," grant 80NSSC19K0703, with the same Principal Investigator (PI) Leia Stirling, Ph.D., due to PI move to University of Michigan from Massachusetts Institute of Technology in fall 2019]

This task is part of the Human Capabilities Assessments for Autonomous Missions (HCAAM) Virtual NASA Specialized Center of Research (VNSCOR).

Crew extravehicular activity (EVA) is limited on spaceflight missions. Multiple, small robotic spacecraft with varying levels of autonomy are needed to perform tasks that might have been completed by an astronaut (e.g., an exterior surface inspection or repair). Crews on long duration exploration missions (LDEM) will have less access to ground support during task operations. As a result, they will need to process more information and communicate with autonomous robots effectively to ensure tasks are progressing safely and on schedule.

The objective of these studies is to investigate the use of augmented reality (AR) multimodal interface displays and communication pathways for improving human-robot communication, situation awareness (SA), trust, and task performance. This will lead to developing guidelines for designing human-robot system interactions that enable operational performance for crews on spaceflight missions.

The specific aims are to:

1) Develop a simulation testbed for examining communication between human-robot teams.

2) Develop a hardware testbed for examining communication between human-robot teams.

3) Evaluate human SA, trust, and task performance within short-duration and long-duration ground-based studies (simulation and/or hardware) by testing various interface communication modalities and information displays.

4) (Option) Perform additional studies of alternate parameters of interest that could be tested using the study testbeds. Additional parameters include the timing and persistence of information, gesture command mapping, varying the levels of robot automation, and evaluating the precision enabled by each command mode.

Research Impact/Earth Benefits: Augmented reality (AR) has the opportunity to support decision-making across a variety of use-case scenarios, including but not limited to manufacturing, automated vehicles, military training, and entertainment. This research compares AR to other visual modalities for telerobotics applications, specifically considering robotic control and anomaly inspection. Results from this study can inform how AR is integrated for task-specific applications, as some tasks may benefit more from AR, while others may raise additional considerations.

Task Progress & Bibliography Information FY2020 
Task Progress: New project for FY2020.

NOTE (Ed., 4/15/2020) this is a continuation of "HCAAM VNSCOR: Responsive Multimodal Human-Automation Communication for Augmenting Human Situation Awareness in Nominal and Off-Nominal Scenarios," grant 80NSSC19K0703, due to PI Dr. Leia Stirling's move to University of Michigan from Massachusetts Institute of Technology in fall 2019. See that project for previous reporting.

Bibliography (Last Updated: 05/15/2025)

None in FY 2020