Task Progress:
|
The objective of these studies is to investigate the use of augmented reality (AR) multimodal interface displays and communication pathways for improving human-robot communication, situation awareness (SA), trust, and task performance. This work will inform guidelines for designing human-robot system interactions that support effective operational performance for crews on spaceflight missions.
During this reporting period, our team supported data collection for the NASA Human Exploration Research Analog (HERA) Campaign 6 missions and performed data analysis for the completed missions. This study used the Unity-based simulation testbed developed in an earlier year of the program to assess visual presentation modality across two tasks. The visual modalities considered were: (1) 2D camera images from fixed cameras placed externally on a simulated space station, (2) a 3D reconstruction shown on a 2D projection, and (3) a 3D reconstruction shown in an augmented reality environment using the HoloLens 2 platform. The reconstruction simulated building the 3D model from 2D camera images captured by the inspector.

The tasks evaluated were a Synchronous and an Asynchronous inspection task. In the Synchronous task, participants flew an inspector robot around the spacecraft to identify surface anomalies that required closer inspection. The inspector could be flown in automatic mode along a predetermined path or in manual mode to move off the assigned path; the robot was controlled with a 3-DOF joystick. In the Asynchronous task, participants analyzed imagery from a previous inspection flight to identify potential anomalies on the spacecraft exterior. In both tasks, participants recorded detected anomalies by capturing a picture with the anomaly in the viewpoint.

Initial HERA results indicate that detection accuracy was highest with the 2D display for the Synchronous inspection task. The additional interactive 3D viewpoints decreased detection accuracy and increased task completion time. Augmented reality provided no significant improvement to local navigation (i.e., minimum distance to the station or proportion of time within two meters), suggesting that the technology did not enhance the perception level of situation awareness.
Based on these findings, mission planning operations, when applicable, should include synchronous human-in-the-loop presence for telerobotic inspection of spacecraft. Additional details on the single-session study are available in Weiss et al. (2021) and on the initial HERA findings in Liu et al. (2022).
During the reporting period, our team also performed studies to address gaps in the NASA Human Integration Design Handbook (HIDH) for augmented reality (AR) interfaces. We compared performance on sensorimotor and neurovestibular assessment tasks using physical objects and within AR. The sensorimotor task was a multi-directional Fitts’ Law target acquisition task, while the neurovestibular tasks included three clinical balance assessments (the Four Square Step Test (FSST), the Star Excursion Balance Test, and tandem walking (TW)) and three operational tasks (capsule ingress and egress, geology sampling, and obstacle avoidance).

For the sensorimotor task, the touchscreen modality yielded better performance than AR as measured by accuracy, precision, error rates, throughput, and movement time. Designers can improve AR interface performance by selecting larger buttons when accuracy and efficiency are required and by embedding perceptual cues, such as depth and proximity cues, on button target surfaces.

For the neurovestibular tasks, while participants were able to perform all assessment tasks successfully, their strategies differed between AR and physical object use, and task completion times were longer when the tasks were administered in AR. Preliminary results showed higher step heights in the TW, FSST, and capsule egress tasks, as well as higher foot placement variability in the FSST. To maintain head stability for viewing the holographic content in the TW task, participants restricted their torso movement. In the capsule egress task, the AR and physical-object conditions differed significantly in the downward pitch of the head, but not of the torso. Participants completed all tasks within AR, and meaningful measures of task performance and postural control were obtained, indicating that AR with embedded sensors may be a useful instrumentation solution for evaluating task performance and postural control.
However, care should be taken when comparing performance within AR to performance in other assessment modalities. Results from these studies address research gaps identified in NASA’s Human Research Roadmap, provide design guidance for AR in NASA’s Human Integration Design Handbook, and support determining whether AR is a viable tool for evaluating astronauts’ vestibular performance throughout mission timelines.
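For context, the Fitts’ Law metrics reported above (throughput and movement time) are conventionally computed using the ISO 9241-9 effective-width formulation. The sketch below is illustrative only and is not the study’s analysis code; the function name and the trial data are hypothetical.

```python
import math
import statistics

def fitts_throughput(distance, movement_times, endpoints):
    """Effective throughput (bits/s) for one target-distance condition,
    using the ISO 9241-9 effective-width correction (hypothetical helper)."""
    # Effective width from the spread of selection endpoints: W_e = 4.133 * SD
    w_e = 4.133 * statistics.stdev(endpoints)
    # Effective index of difficulty (Shannon formulation), in bits
    id_e = math.log2(distance / w_e + 1)
    # Throughput = ID_e divided by mean movement time
    return id_e / statistics.mean(movement_times)

# Hypothetical trials: 200 mm target distance, selection endpoints in mm
# along the movement axis, movement times in seconds
tp = fitts_throughput(
    distance=200,
    movement_times=[0.62, 0.58, 0.65, 0.60],
    endpoints=[198.0, 203.5, 196.2, 201.1, 199.8],
)
print(round(tp, 2))
```

Because the effective width is derived from endpoint scatter rather than the nominal button size, the same throughput statistic can be compared across modalities (touchscreen vs. AR) even when users over- or undershoot targets differently.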
|
|
Articles in Peer-reviewed Journals
|
Weiss H, Liu A, Byon A, Blossom J, Stirling L. "Comparison of display modality and human-in-the-loop presence for on-orbit inspection of spacecraft." Hum Factors. 2023 Sep;65(6):1059-73. https://doi.org/10.1177/00187208211042782; PubMed PMID: 34558994, Sep-2023
|
Larson H, Stirling L. "Examination of human spatial reasoning capability and simulated autonomous rendezvous and docking monitoring performance." Proc Hum Factors Ergon Soc Annu Meet. 2023 Oct 25;21695067231192262. https://doi.org/10.1177/21695067231192262, Oct-2023
|
|
Papers from Meeting Proceedings
|
Weiss H, Stirling L. "Usability Evaluation of an Augmented Reality Sensorimotor Assessment Tool for Astronauts." IEEE Aerospace Conference, Big Sky, Montana, March 4-11, 2023, Mar-2023
|
Tang J, Weiss H, Stirling L. "A Comparison of Sensorimotor Assessment Accuracy between Augmented Reality and Touchscreen Environments." 67th International Annual Meeting of the Human Factors and Ergonomics Society, Washington, District of Columbia, October 23-27, 2023, Oct-2023
|
Larson H, Stirling L. "Examination of Human Spatial Reasoning Capability and Simulated Autonomous Rendezvous and Docking Monitoring Performance." 67th International Annual Meeting of the Human Factors and Ergonomics Society, Washington, District of Columbia, October 23-27, 2023, Oct-2023
|
|