The NASA Task Book
Project Title:  HCAAM VNSCOR: Enhancing Situation Awareness of Automated Procedures Using Adaptive Multimodal Augmented Reality Displays
Fiscal Year: FY 2022 
Division: Human Research 
Research Discipline/Element:
HRP HFBP:Human Factors & Behavioral Performance (IRP Rev H)
Start Date: 05/15/2019  
End Date: 05/14/2023  
Task Last Updated: 03/16/2022 
Principal Investigator/Affiliation:   Schreckenghost, Debra  M.E.E. / TRACLabs, Inc. 
Address:  16969 N Texas Ave 
Suite 300 
Webster, TX 77598-4085 
Email: ghost@ieee.org 
Phone: 281-461-7886  
Congressional District: 22 
Web:  
Organization Type: INDUSTRY 
Organization Name: TRACLabs, Inc. 
Joint Agency:  
Comments:  
Co-Investigator(s)
Affiliation: 
Holden, Kritina  Ph.D. NASA Johnson Space Center 
Dory, Jonathan  B.S. NASA Johnson Space Center 
Key Personnel Changes / Previous PI: March 2022 report: There are no key personnel changes.
Project Information: Grant/Contract No. 80NSSC19K0667 
Responsible Center: NASA JSC 
Grant Monitor: Whitmire, Alexandra  
Center Contact:  
alexandra.m.whitmire@nasa.gov 
Solicitation / Funding Source: 2017-2018 HERO 80JSC017N0001-BPBA Topics in Biological, Physiological, and Behavioral Adaptations to Spaceflight. Appendix C 
Grant/Contract No.: 80NSSC19K0667 
Project Type: GROUND 
Flight Program:  
TechPort: Yes 
No. of Post Docs:
No. of PhD Candidates:
No. of Master's Candidates:
No. of Bachelor's Candidates:
No. of PhD Degrees:
No. of Master's Degrees:
No. of Bachelor's Degrees:
Human Research Program Elements: (1) HFBP:Human Factors & Behavioral Performance (IRP Rev H)
Human Research Program Risks: (1) HSIA:Risk of Adverse Outcome Due to Inadequate Human Systems Integration Architecture (IRP Rev L)
Human Research Program Gaps: (1) HSIA-101:We need to identify the Human Systems Integration (HSI) – relevant crew health and performance outcomes, measures, and metrics, needed to characterize and mitigate risk, for future exploration missions (IRP Rev L)
(2) HSIA-201:We need to evaluate the demands of future exploration habitat/vehicle systems and mission scenarios (e.g. increased automation, multi-modal communication) on individuals and teams, and determine the risks these demands pose to crew health and performance (IRP Rev L)
(3) HSIA-401:We need to determine how HSI can be applied in the vehicle/habitat and computer interface Design Phase to mitigate potential decrements in operationally-relevant performance (e.g. problem-solving, execution procedures), during increasingly earth-independent, future exploration missions (including in-mission and at landing) (IRP Rev L)
(4) HSIA-701:We need to determine how human-automation-robotic systems can be optimized for effective enhancement and monitoring of crew capabilities, health, and performance, during increasingly earth-independent, future exploration missions (including in-mission and at landing) (IRP Rev L)
Flight Assignment/Project Notes: NOTE: End date changed to 5/14/2023 per S. Huppman/HRP and NSSC information (Ed., 3/3/2020)

Task Description: This task is part of the Human Capabilities Assessments for Autonomous Missions (HCAAM) Virtual NASA Specialized Center of Research (VNSCOR).

Future deep space missions will present new challenges for crew and increased risks to human performance due to the stress, fatigue, radiation exposure, and isolation that characterize these missions. In addition, crew will no longer be able to depend on timely support from Mission Control due to distance from the Earth, but will have to work autonomously while maintaining high performance. Mission controllers may not be available to answer questions, check system status, assist with procedures, monitor for errors, or troubleshoot problems. Greater crew autonomy will increase dependence on automated systems, and the design of these automated systems must be driven by sound human-system integration standards and guidelines to ensure mission success. Historically, crew have had very limited dependence on automated systems, so crew will be faced with a new way of working that may put situation awareness (SA) at risk. We must develop methods for promoting good situation awareness in the automated systems that will almost certainly be part of future deep space vehicles and habitats.

Procedure automation is a promising technology for reducing crew workload. We define procedure automation as technology that automates the selection or execution of procedural tasks. Structuring the work of automation according to human procedures should improve the transparency of automation actions. This approach provides a means for establishing common ground about ongoing tasks to improve operator understanding of automation behavior.
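As an illustration of this definition, procedural work can be modeled as a list of steps that either the crew or automation executes, with both actors writing to a shared log that serves as the common ground described above. This is a minimal sketch only; the class and field names are hypothetical, not VITA's actual data model.

```python
from dataclasses import dataclass, field
from typing import Callable, Optional

@dataclass
class Step:
    """One instruction in an electronic procedure."""
    step_id: str
    text: str                                     # instruction shown to the operator
    automated: bool = False                       # True if automation may execute it
    action: Optional[Callable[[], None]] = None   # automation hook for this step

@dataclass
class Procedure:
    name: str
    steps: list = field(default_factory=list)
    log: list = field(default_factory=list)       # shared record = common ground

    def execute(self, executor_for_manual="crew"):
        for step in self.steps:
            who = "automation" if (step.automated and step.action) else executor_for_manual
            if who == "automation":
                step.action()
            self.log.append((step.step_id, who))  # every actor logs to one place

p = Procedure("filter swap", [
    Step("1", "Power down pump"),
    Step("2", "Verify pressure < 5 psi", automated=True, action=lambda: None),
    Step("3", "Replace filter cartridge"),
])
p.execute()
# p.log now interleaves crew and automation actions, step by step
```

Because manual and automated actions land in the same step-indexed log, an operator (or another tool) can reconstruct who did what in procedure order.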

New technologies such as adaptive, multimodal, augmented reality displays can offer the benefits of information presentation tailored to meet the needs of each crewmember, taking into consideration the current state of that crewmember (e.g., sleep-deprived, high workload), as well as the current state of his/her environment and ongoing activities (e.g., emergency situation, time-critical operations).

We propose to combine technology for procedure automation with technology for augmented reality multi-modal (ARMM) user interfaces, using a Microsoft HoloLens head-mounted display, to provide a virtual task assistant that helps crew perform procedural work. This virtual task assistant will be capable of identifying which procedures should be performed, performing actions in crew procedures, and summarizing actions taken by the human-automation team to assist crew in preparing for tasks and taking over tasks from other team members.

Four studies are planned to evaluate the effects of a virtual task assistant combining procedure automation with augmented reality multi-modal (ARMM) user interfaces on human task performance. These studies will achieve the following aims:

Aim 1. Determine best methods to improve situation awareness and improve crew autonomy when using a virtual task assistant to prepare for and perform manual maintenance.

Aim 2. Determine best methods to improve situation awareness and reduce workload when a virtual task assistant is used to hand over maintenance tasks between users.

Aim 3. Determine best methods to improve situation awareness and reduce workload when using a virtual task assistant to help manage concurrent manual and automated tasks.

The proposed work addresses a number of gaps in the Human Research Program Human Factors and Behavioral Performance risks. This project will provide guidelines for designing effective human-automation systems and evaluate human-automation performance for exemplar procedure automation systems. This project also will provide guidance for the application of multi-modal and adaptive displays and control to Human-Computer Interaction (HCI) design for long duration operations.

Research Impact/Earth Benefits: Technologies for virtual task assistance are increasingly available in everyday life. Among the most common are voice-enabled assistants, such as Siri and Alexa, that aid activities of daily living. Augmented and virtual reality technologies are also becoming mainstream, with the introduction of new devices such as the Microsoft HoloLens 2 and improved standards such as WebXR ( https://www.w3.org/TR/webxr/ ) for accessing virtual and augmented reality devices.

The Virtual Intelligent Task Assistant (VITA) project is leveraging augmented reality platforms and new WebXR standards to develop a virtual task assistant that can be used to assist users with procedural task work on the job. Our technical approach is innovative in that new procedural tasks can be supported without custom software development. Our experimental research is distinguished by investigating effective task assistance for maintenance or assembly tasks where hands-free operation of task assistance is beneficial. For the first year we are investigating best techniques for using augmented reality task assistance when assembling small devices that are held in the hands during assembly.

This technology and associated research findings have potential benefit to NASA for the assembly, maintenance, and repair of aircraft, spacecraft, habitats, and robotics. This technology and associated research findings also have broader potential benefit for any organization performing assembly and maintenance procedural work. This includes assembly and maintenance of drilling equipment for the oil and gas industry, equipment used in chemical processing plants, and maintenance and repair of commercial aircraft.

Task Progress & Bibliography Information FY2022 
Task Progress: The first Virtual Intelligent Task Assistant (VITA) study was a pilot laboratory study to determine techniques for use in the second study, to be conducted in the NASA Human Exploration Research Analog (HERA) Campaign 6 (C6). This pilot study was completed in June 2021. Restrictions on going onsite at NASA Johnson Space Center (JSC) due to COVID-19 were lifted this year, which enabled us to conduct pilot sessions in the Human Factors Engineering Laboratory (HFEL) onsite at JSC using subjects from the Human Test Subject Facility (HTSF). A total of 16 participants performed a session with VITA. For these sessions, the participant performed tasks from electronic procedures on a tablet or tasks provided by VITA in a HoloLens 1 display.

The second VITA study, conducted in HERA Campaign 6, started in September 2021. The HERA study has four participants per mission and there will be four missions in Campaign 6, for a total of 16 participants. At the current time, the VITA sessions for two missions in HERA C6 are complete. We worked with our HERA Experiment Support Scientist (ESS), Michael Merta, to prepare for each mission. Prior to each mission, we train the crew on how to interact with VITA. After each mission, we debrief each crewmember about his/her experience using VITA during the mission. Data collection and analysis for the VITA study in HERA C6 also started in Year 3. We report preliminary results from the pilot study and HERA C6 Mission 1 (C6M1) below.

Findings on Gaze-activated Control

Multi-modal interaction when using the VITA augmented reality software in a HoloLens includes visual presentation of task cues, hand gestures, and gaze-activated control. To improve support for hands-free operation, we are investigating the use of gaze to interact with the VITA user interface, including advancing to the next instruction, returning to a prior instruction, and recording data.

The operator uses gaze to mark instructions done and advance to the next instruction. Gaze is also used to zoom closer to or away from the VITA display, and to rotate a 3D model of the rover.

Gaze-activated controls are being investigated as a means to reduce workload during assembly tasks by enabling interaction with the VITA intelligent agent without moving the user's hands away from the assembly task. Feedback from subjects indicates that gaze-activated controls can increase workload if not properly designed. If button response is too sensitive to user gaze, buttons can be activated accidentally, causing the user additional work to "undo" unintended actions. If button response is not sensitive enough, repeated gaze actions and extended gaze times can be required, making the controls difficult to activate and frustrating the user.

In response to feedback from the pilot study, a number of design changes were made to the gaze-activated controls in VITA. In the initial VITA design, buttons used to navigate through task instructions were located below the textual cue, near the bottom of the virtual field of view and closer to the user's line of sight during assembly. However, this location appeared to contribute to accidental button activations as the eyes moved during assembly, so the navigation buttons were moved above the task cue, further from the user's line of sight when assembling the rover. The time that a user must gaze at a control button to activate it (called dwell time) was also increased to 250 msec. Navigation buttons were modified to blink briefly when activated, to improve user awareness of button activation. Finally, accidental activation of critical controls (like marking a task done and moving to the next task) is prevented by an "arm-and-fire" design that requires two button activations to take an action.
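The dwell-time and arm-and-fire behaviors described above can be sketched as a few lines of per-frame logic. This is an illustrative sketch only: the class name and frame-update interface are assumptions, not the VITA implementation, though the 250 msec dwell and two-stage activation for critical controls follow the design described in the text.

```python
class GazeButton:
    """Dwell-activated button. Glancing away resets the dwell timer;
    critical buttons require two complete dwells (arm, then fire)."""

    def __init__(self, dwell_ms=250, critical=False):
        self.dwell_ms = dwell_ms
        self.critical = critical       # critical controls use arm-and-fire
        self._gaze_start = None        # when the current dwell began
        self._armed = False

    def update(self, gazed_at: bool, now_ms: float) -> bool:
        """Call once per frame; returns True only when the button fires."""
        if not gazed_at:
            self._gaze_start = None    # looking away cancels the dwell
            return False
        if self._gaze_start is None:
            self._gaze_start = now_ms  # dwell begins this frame
            return False
        if now_ms - self._gaze_start < self.dwell_ms:
            return False               # still dwelling
        self._gaze_start = None        # dwell satisfied: consume it
        if self.critical and not self._armed:
            self._armed = True         # first activation only arms
            return False
        self._armed = False
        return True                    # second activation (or non-critical) fires
```

A non-critical button fires after one 250 msec dwell; a critical one (e.g., "mark done") needs a second dwell to fire, which is what prevents a stray glance from advancing the procedure.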

Even with these changes, subjective feedback from the first mission of HERA Campaign 6 indicates that users felt gaze control was slow and not sensitive enough. The HoloLens 1 tracks head direction but does not track eye movement, which reduces the precision of gaze direction and can make it harder to activate buttons.

Findings on Placement of Virtual Cues

The VITA user interface arranges virtual task cues and gaze-activated controls in a planar layout. By default, this plane is placed at eye-level when the head is raised and looking forward. The user can use hand gestures to adjust the placement of this plane relative to head position and focal length.

During the pilot study, users were trained how to adjust the placement of this plane. Some subjects found it difficult to make this adjustment. Initially, many users intuitively placed the plane near their hand position when assembling the rover. Eventually, most users moved the plane above their hands and to one side, to prevent accidental activation of gaze-controlled buttons. If the plane was placed too far away, however, users had to move their heads more to see task cues.

During HERA C6M1, placement of virtual cues continued to be challenging for users. Some crew mentioned that shifting the focal plane between virtual task cues and the rover can be tiring over time.

Preliminary findings indicate that additional study of the placement of virtual cues with respect to the focal plane of the task is merited. The user interface design should make it easy to adjust placement of virtual cues relative to the location of task components. The user interface design should also try to minimize shifts in focal length between the physical task and the virtual task cue, as frequent shifts can cause visual fatigue. Designs should be investigated that simplify aligning the focal length of the virtual cues with the focal length of the task components, even when virtual cues are not placed near the task components.

Results

The VITA study is investigating workload when using only gaze-activated control, which earlier studies do not address. Subjective response to gaze-activated control has been mixed, with some users preferring it while others suggest using gesture or voice control. The reliance on gaze for all interaction with VITA makes it more likely that users may experience some visual fatigue, which is substantiated by observations during both the pilot study and HERA C6M1.

Observations from the pilot study and HERA C6M1 indicate that procedure information may need to be organized differently than in the tablet display for more effective use in virtual space. Currently, figures are associated with specific instructions. When using a tablet to view the procedure, users can easily glance at an earlier figure. When using VITA, however, accessing prior figures requires navigating back one instruction at a time. A number of users observed that the effort to go back in VITA discouraged them from looking at figures that would have helped with the current instruction.

Some participants in the pilot study and in HERA C6M1 reported discomfort when using the HoloLens 1 continuously for over 50 minutes. These reports are consistent with a study of simulator sickness. Gaze control was reported as fatiguing to some users in the pilot study. One participant in HERA C6M1 reported that having more than one session with an augmented reality headset in a day made symptoms worse, even when using different headsets (HoloLens 1 in one session, HoloLens 2 in another session). We are investigating in the HERA C6 study whether users adapt to this with repeated use.

We submitted a paper entitled “Lessons on Developing an Augmented Reality Interface to a Virtual Intelligent Task Assistant” to the Human Factors and Ergonomics Society annual meeting to be held in Atlanta, GA, on October 10 - 14, 2022. This paper reports preliminary results from the VITA pilot study and the first two missions in HERA Campaign 6.

Bibliography Type: Description: (Last Updated: 04/14/2022) 

Abstracts for Journals and Proceedings: Schreckenghost D, Holden K, Milam T, Munson B, Nguyen A. "Enhancing situation awareness of automated procedures using adaptive multimodal augmented reality displays." Abstracts, 2022 NASA Human Research Program Investigators’ Workshop, Virtual, February 7-10, 2022.

Articles in Peer-reviewed Journals: Schreckenghost D, Holden K, Greene M, Milam T, Hamblin C. "Effect of automating procedural work on situation awareness and workload." Hum Factors. Special issue on human factors and ergonomics in space exploration. 2022 Jan 28:187208211060978. https://doi.org/10.1177/00187208211060978 ; PMID: 35089111
Project Title:  HCAAM VNSCOR: Enhancing Situation Awareness of Automated Procedures Using Adaptive Multimodal Augmented Reality Displays
Fiscal Year: FY 2021 
Division: Human Research 
Task Last Updated: 03/15/2021 
Key Personnel Changes / Previous PI: March 2021 report: There are no key personnel changes.
(Investigators, funding, and program information are unchanged from the FY 2022 entry above.)

Task Progress & Bibliography Information FY2021 
Task Progress: The Virtual Intelligent Task Assistant (VITA) project investigates the effects of a virtual task assistant on human performance of procedural work. The virtual task assistant combines procedure automation with augmented reality multi-modal user interfaces. Procedure assistance will be provided in a Microsoft HoloLens headset that can present information in augmented reality overlays of the visual field. The virtual task assistant will assist users in becoming familiar with planned procedures, in performing procedure actions, and in maintaining awareness of procedure actions taken by other crew members or automation. Human performance will be compared with and without the virtual task assistant with the goal of informing best methods for delivering and using such virtual task assistants. The aims of this research are listed below.

Aim 1. Determine best methods to improve situation awareness and improve crew autonomy when using a virtual task assistant to prepare for and perform manual maintenance.

Aim 2. Determine best methods to improve situation awareness and reduce workload when a virtual task assistant is used to hand over maintenance tasks between users.

Aim 3. Determine best methods to improve situation awareness and reduce workload when using a virtual task assistant to help manage concurrent manual and automated tasks.

Research during the second year of this project addresses Aim 1. To achieve Aim 1, we defined and prepared for a study in the Human Exploration Research Analog (HERA). This study investigates the usability and effectiveness of the virtual task assistant in improving crew autonomy in HERA Campaign 6. Effectiveness in increasing crew autonomy is indicated by the number and type of interactions with MCC (Mission Control Center) or other crew members during the experiment. Usability is measured using the System Usability Scale (SUS). The VITA project is conducting two studies to determine the best methods to improve situation awareness and improve crew autonomy when using a virtual task assistant to prepare for and perform manual maintenance and assembly (Project Aim 1). These studies attempt to answer the question posed in the proposal:

“Can the virtual task assistant stand-in for MCC and help crew prepare for and perform manual tasks that are not done frequently, such as equipment maintenance and assembly?”
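For reference, the System Usability Scale mentioned above is scored from ten 1-5 Likert responses using a standard rule, which can be expressed compactly. This is a generic sketch of the published SUS scoring procedure, not project code:

```python
def sus_score(responses):
    """System Usability Scale score from ten 1-5 Likert responses.
    Odd-numbered items (positively worded) contribute (r - 1); even-numbered
    items (negatively worded) contribute (5 - r); the sum is scaled to 0-100."""
    assert len(responses) == 10 and all(1 <= r <= 5 for r in responses)
    total = sum((r - 1) if i % 2 == 0 else (5 - r)   # i is 0-based: i=0 is item 1
                for i, r in enumerate(responses))
    return total * 2.5

# Strong agreement with positive items, strong disagreement with negative ones:
sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1])  # -> 100.0
```

A score of 100 is the ceiling; neutral responses (all 3s) yield 50.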

The HERA Campaign 6 experiment and pilot study were defined in the first year of the VITA project. We made progress in these studies in the second year as described below.

During the second year the VITA project met a number of milestones for HERA Campaign 6 that define our HERA experiment. The hardware and software to be used for the VITA experiment were delivered to HERA, and functional testing of VITA in the HERA facility was completed. We worked with our HERA Experiment Support Scientist (ESS) to accomplish these activities.

We finalized and tested a full set of electronic procedures for the rover assembly and disassembly tasks. We integrated these procedures with the VITA software. Multi-modal interaction when using the augmented reality software in a HoloLens includes visual presentation of information, hand gestures, and gaze tracking. To improve support for hands-free operation, we are investigating the use of gaze to interact with the VITA user interface, including advancing to the next instruction, recording data, and manipulating the 3D model.

Conducting research safely during the COVID-19 pandemic required modification of the Institutional Review Board (IRB) protocol used for the VITA studies. Specifically, we modified the study procedures in July 2020 so that both participants and researchers wear masks and gloves for the duration of the pilot session. We also renewed the VITA IRB protocol in October 2020; this renewal included similar provisions for conducting studies during the COVID-19 pandemic.

The VITA experiment as originally planned ran the VITA software on a laptop resident inside HERA. Restrictions on going onsite at Johnson Space Center (JSC) due to COVID-19 made it difficult to validate technology running on a laptop in HERA. During the reporting period we worked with HERA to modify our experiment protocol to improve access to software and procedures by running the VITA software on a cloud server instead of a laptop resident inside HERA. This approach has been implemented and tested by HERA and by the VITA research team.

Pilot sessions were originally planned to be conducted in the Human Factors Engineering Laboratory (HFEL) onsite at JSC using subjects from the Human Test Subject Facility (HTSF) at JSC. Restrictions on going onsite at JSC due to COVID-19 prevented us from conducting pilot sessions as originally planned.

To make progress on some pilot study objectives, we conducted walkthroughs at a team member’s home with family members and work colleagues. Safety protocols from the IRB were followed. Although JSC is still under Phase 3 restrictions, the VITA project recently received a waiver to conduct pilot sessions onsite at JSC. We are currently contacting candidate participants through the HTSF to resume pilot sessions. Preliminary findings to date from pilot sessions and walkthroughs include 1) a more accurate estimate of session timing for the HERA study, 2) improved procedures for rover assembly and disassembly, and 3) techniques for more automated computation of task metrics.

The Principal Investigator met monthly with Dr. S. Robinson, Dr. B. Gore, and Principal Investigators for the Virtual NASA Specialized Center of Research (VNSCOR) for “Human Capabilities Assessments for Autonomous Missions” (HCAAM). These meetings improved communication among these projects and promoted coordination between projects.

A paper describing our approach for automatically computing task performance metrics for VITA was accepted for presentation at SpaceOps 2021 [1]. This approach combines actions logged in the electronic procedure database with tasks defined in the electronic procedure to compute performance metrics upon request.
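The metric computation described above pairs actions recorded in the procedure log with the tasks they belong to. A minimal sketch of this idea, assuming a hypothetical log format of (procedure, step, event, timestamp) tuples (the actual PRIDE database schema is not shown in this report):

```python
from datetime import datetime

# Hypothetical action log, as might be exported from an electronic
# procedure database: (procedure, step, event, ISO timestamp).
log = [
    ("ROVER-ASSY", "1.1", "start",    "2021-05-03T09:00:00"),
    ("ROVER-ASSY", "1.1", "complete", "2021-05-03T09:02:30"),
    ("ROVER-ASSY", "1.2", "start",    "2021-05-03T09:02:45"),
    ("ROVER-ASSY", "1.2", "complete", "2021-05-03T09:06:15"),
]

def step_durations(log):
    """Pair start/complete events per (procedure, step) and return
    elapsed seconds for each completed step."""
    starts, durations = {}, {}
    for proc, step, event, ts in log:
        t = datetime.fromisoformat(ts)
        if event == "start":
            starts[(proc, step)] = t
        elif event == "complete" and (proc, step) in starts:
            durations[(proc, step)] = (t - starts[(proc, step)]).total_seconds()
    return durations

print(step_durations(log))
# {('ROVER-ASSY', '1.1'): 150.0, ('ROVER-ASSY', '1.2'): 210.0}
```

Because the metrics derive entirely from the existing log, they can be recomputed on request at any point during or after a session, which is what enables near-realtime reporting.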

The last quarter of Year 2 will focus on finishing the pilot study for HERA Campaign 6. This pilot study evaluates experimental techniques for the HERA C6 study, which should start in Year 3 (expected start in September 2021).

REFERENCE

1. Schreckenghost D, Milam T, Kortenkamp D, Nguyen A. "Near Realtime Computation of Task Performance using Electronic Procedures." SpaceOps 2021, The 16th International Conference on Space Operations, Virtual, May 3-5, 2021.

Bibliography Type: Description: (Last Updated: 04/14/2022) 

Papers from Meeting Proceedings: Schreckenghost D, Milam T, Kortenkamp D, Nguyen A. "Near Realtime Computation of Task Performance using Electronic Procedures." SpaceOps 2021, The 16th International Conference on Space Operations, Virtual, May 3-5, 2021. Meeting paper.

Project Title:  HCAAM VNSCOR: Enhancing Situation Awareness of Automated Procedures Using Adaptive Multimodal Augmented Reality Displays Reduce
Fiscal Year: FY 2020 
Division: Human Research 
Research Discipline/Element:
HRP HFBP:Human Factors & Behavioral Performance (IRP Rev H)
Start Date: 05/15/2019  
End Date: 05/14/2023  
Task Last Updated: 03/13/2020 
Principal Investigator/Affiliation:   Schreckenghost, Debra  M.E.E. / TRACLabs, Inc. 
Address:  16969 N Texas Ave 
Suite 300 
Webster , TX 77598-4085 
Email: ghost@ieee.org 
Phone: 281-461-7886  
Congressional District: 22 
Web:  
Organization Type: INDUSTRY 
Organization Name: TRACLabs, Inc. 
Joint Agency:  
Comments:  
Co-Investigator(s)
Affiliation: 
Holden, Kritina  Ph.D. NASA Johnson Space Center 
Dory, Jonathan  B.S. NASA Johnson Space Center 
Key Personnel Changes / Previous PI: March 2020 report: There are no key personnel changes.
Project Information: Grant/Contract No. 80NSSC19K0667 
Responsible Center: NASA JSC 
Grant Monitor: Williams, Thomas  
Center Contact: 281-483-8773 
thomas.j.will1@nasa.gov 
Solicitation / Funding Source: 2017-2018 HERO 80JSC017N0001-BPBA Topics in Biological, Physiological, and Behavioral Adaptations to Spaceflight. Appendix C 
Grant/Contract No.: 80NSSC19K0667 
Project Type: GROUND 
Flight Program:  
TechPort: Yes 
No. of Post Docs:
No. of PhD Candidates:
No. of Master's Candidates:
No. of Bachelor's Candidates:
No. of PhD Degrees:
No. of Master's Degrees:
No. of Bachelor's Degrees:
Human Research Program Elements: (1) HFBP:Human Factors & Behavioral Performance (IRP Rev H)
Human Research Program Risks: (1) HSIA:Risk of Adverse Outcome Due to Inadequate Human Systems Integration Architecture (IRP Rev L)
Human Research Program Gaps: (1) HSIA-101:We need to identify the Human Systems Integration (HSI) – relevant crew health and performance outcomes, measures, and metrics, needed to characterize and mitigate risk, for future exploration missions (IRP Rev L)
(2) HSIA-201:We need to evaluate the demands of future exploration habitat/vehicle systems and mission scenarios (e.g. increased automation, multi-modal communication) on individuals and teams, and determine the risks these demands pose to crew health and performance (IRP Rev L)
(3) HSIA-401:We need to determine how HSI can be applied in the vehicle/habitat and computer interface Design Phase to mitigate potential decrements in operationally-relevant performance (e.g. problem-solving, execution procedures), during increasingly earth-independent, future exploration missions (including in-mission and at landing) (IRP Rev L)
(4) HSIA-701:We need to determine how human-automation-robotic systems can be optimized for effective enhancement and monitoring of crew capabilities, health, and performance, during increasingly earth-independent, future exploration missions (including in-mission and at landing) (IRP Rev L)
Flight Assignment/Project Notes: NOTE: End date changed to 5/14/2023 per S. Huppman/HRP and NSSC information (Ed., 3/3/2020)

Task Description: This task is part of the Human Capabilities Assessments for Autonomous Missions (HCAAM) Virtual NASA Specialized Center of Research (VNSCOR).

Future deep space missions will present new challenges for crew and increased risks to human performance due to the stress, fatigue, radiation exposure, and isolation that characterize these missions. In addition, crew will no longer be able to depend on timely support from Mission Control due to distance from the Earth, but will have to work autonomously while maintaining high performance. Mission Controllers may not be available to answer questions, check system status, assist with procedures, monitor for errors, or troubleshoot problems. Greater crew autonomy will increase dependence on automated systems, and design of these automated systems must be driven by sound human-system integration standards and guidelines to ensure mission success. Historically, crew have had very limited dependence on automated systems; thus crew will be faced with a new way of working that may put situation awareness (SA) at risk. We must develop methods for promoting good situation awareness in the automated systems that will most certainly be part of future deep space vehicles and habitats.

Procedure automation is a promising technology for reducing crew workload. We define procedure automation as technology that automates the selection or execution of procedural tasks. Structuring the work of automation according to human procedures should improve the transparency of automation actions. This approach provides a means for establishing common ground about ongoing tasks to improve operator understanding of automation behavior.
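The structuring idea above can be made concrete with a small sketch. This is illustrative only, assuming a procedure represented as a list of steps tagged with an executor; the step names and structure are hypothetical, not the project's actual PRIDE representation:

```python
# Hypothetical procedure: each step is tagged with who executes it.
# Steps tagged "automation" run without crew action; "crew" steps wait
# for confirmation. Keeping both in one procedure keeps automation's
# actions visible alongside the crew's (the transparency goal).
procedure = [
    {"step": "1.1", "text": "Power on rover controller", "executor": "automation"},
    {"step": "1.2", "text": "Attach gripper to mount",   "executor": "crew"},
    {"step": "1.3", "text": "Run gripper self-test",     "executor": "automation"},
]

def execute(procedure, confirm=lambda prompt: None):
    """Run automated steps and prompt the crew for manual ones.
    Returns a transcript so every action, human or automated,
    is recorded against the shared procedure structure."""
    transcript = []
    for s in procedure:
        if s["executor"] == "automation":
            transcript.append(("auto", s["step"]))
        else:
            confirm(f'{s["step"]}: {s["text"]}')  # e.g. wait for crew input
            transcript.append(("crew", s["step"]))
    return transcript
```

Because both parties act against the same procedure structure, the transcript gives the operator the common ground needed to understand what the automation did and when.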

New technologies such as adaptive, multimodal, augmented reality displays can offer the benefits of information presentation tailored to meet the needs of each crewmember, taking into consideration the current state of that crewmember (e.g., sleep-deprived, high workload), as well as the current state of his/her environment and ongoing activities (e.g., emergency situation, time-critical operations).

We propose to combine technology for procedure automation with technology for augmented reality multi-modal (ARMM) user interfaces using the Microsoft HoloLens head-mounted display to provide a virtual task assistant that helps crew perform procedural work. This virtual task assistant will be capable of identifying which procedures should be performed, performing actions in crew procedures, and summarizing actions taken by the human-automation team to assist crew in preparing for tasks and taking over tasks from other team members.

Four studies are planned to evaluate the effects on human task performance of a virtual task assistant combining procedure automation with augmented reality multi-modal (ARMM) user interfaces. These studies will achieve the following aims:

Aim 1. Determine best methods to improve situation awareness and improve crew autonomy when using a virtual task assistant to prepare for and perform manual maintenance.

Aim 2. Determine best methods to improve situation awareness and reduce workload when a virtual task assistant is used to handover maintenance tasks between users.

Aim 3. Determine best methods to improve situation awareness and reduce workload when using a virtual task assistant to help manage concurrent manual and automated tasks.

The proposed work addresses a number of gaps in the Human Research Program Human Factors and Behavioral Performance risks. This project will provide guidelines for designing effective human-automation systems (Human and Automated/Robotic Interactions (HARI)-02) and evaluate human-automation performance for exemplar procedure automation systems (HARI-03). This project also will provide guidance for the application of multi-modal and adaptive displays and control to Human-Computer Interaction (HCI) design for long duration operations (HCI-04).

Research Impact/Earth Benefits: Technologies for virtual task assistance are increasingly available in everyday life. One of the most common is voice-enabled assistance, like Siri and Alexa, that aids some activities of daily living. Augmented and virtual reality technologies are also becoming mainstream, with the introduction of new devices such as the Microsoft HoloLens 2 and improved standards such as WebXR (https://www.w3.org/TR/webxr/) for accessing virtual and augmented reality devices.

The VITA project is leveraging augmented reality platforms and new WebXR standards to develop a virtual task assistant that can be used to assist users with procedural task work on the job. Our technical approach is innovative in that new procedural tasks can be supported without custom software development. Our experimental research is distinguished by investigating effective task assistance for maintenance or assembly tasks where hands-free operation of task assistance is beneficial. For the first year we are investigating best techniques for using augmented reality task assistance when assembling small devices that are held in the hands during assembly.

This technology and associated research findings have potential benefit to NASA for the assembly, maintenance, and repair of aircraft, spacecraft, habitats, and robotics. This technology and associated research findings also have broader potential benefit for any organization performing assembly and maintenance procedural work. This includes assembly and maintenance of drilling equipment for the oil and gas industry, equipment used in chemical processing plants, and maintenance and repair of commercial aircraft.

Task Progress & Bibliography Information FY2020 
Task Progress: The Virtual Intelligent Task Assistant (VITA) project investigates the effects of a virtual task assistant on human performance of procedural work. The virtual task assistant combines procedure automation with augmented reality multi-modal user interfaces. Procedure assistance will be provided in a Microsoft HoloLens headset that can present information in augmented reality overlays of the visual field. The virtual task assistant will assist users in becoming familiar with planned procedures, in performing procedure actions, and in maintaining awareness of procedure actions taken by other crewmembers or automation. Human performance will be compared with and without the virtual task assistant with the goal of informing best methods for delivering and using such virtual task assistants. The aims of this research are listed below.

Aim 1. Determine best methods to improve situation awareness and improve crew autonomy when using a virtual task assistant to prepare for and perform manual maintenance.

Aim 2. Determine best methods to improve situation awareness and reduce workload when a virtual task assistant is used to handover maintenance tasks between users.

Aim 3. Determine best methods to improve situation awareness and reduce workload when using a virtual task assistant to help manage concurrent manual and automated tasks.

Research during the first year of this project addresses Aim 1. To achieve Aim 1, we will conduct a study in the Human Exploration Research Analog (HERA). This study will investigate the usability and effectiveness of the virtual task assistant in improving crew autonomy during HERA Campaign 6. Effectiveness in increasing crew autonomy will be indicated by the number and type of interactions with the Mission Control Center (MCC) or other crewmembers during the experiment. Usability will be measured using the System Usability Scale (SUS). The VITA project is conducting two studies to determine the best methods to improve situation awareness and improve crew autonomy when using a virtual task assistant to prepare for and perform manual maintenance and assembly (Project Aim 1). These studies attempt to answer the question posed in the proposal:

“Can the virtual task assistant stand-in for MCC and help crew prepare for and perform manual tasks that are not done frequently, such as equipment maintenance and assembly?”
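The SUS mentioned above has a standard scoring scheme: ten items rated 1-5, with odd-numbered (positively worded) items contributing rating minus 1 and even-numbered (negatively worded) items contributing 5 minus rating, the sum scaled to 0-100. A minimal sketch:

```python
def sus_score(responses):
    """Compute the System Usability Scale score from ten 1-5 ratings.
    Odd-numbered items are positively worded (contribute rating - 1);
    even-numbered items are negatively worded (contribute 5 - rating).
    The summed contributions are scaled to 0-100 by multiplying by 2.5."""
    assert len(responses) == 10, "SUS has exactly ten items"
    total = sum(r - 1 if i % 2 == 0 else 5 - r
                for i, r in enumerate(responses))  # i=0 is item 1 (odd)
    return total * 2.5

# All-neutral responses (all 3s) yield the midpoint score of 50.
print(sus_score([3] * 10))  # 50.0
```

How the study maps SUS scores to usability thresholds for VITA is a design decision of the experiment and is not shown here.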

During the definition phase of the VITA project, the research team focused on the initialization of the VITA project, including detailed coordination with the other teams in the Human Capabilities Assessments for Autonomous Missions (HCAAM) Virtual NASA Specialized Center of Research (VNSCOR), and with NASA’s Flight Analogs Program to ensure seamless integration for HERA Campaign 6. A protocol was submitted to and approved by the NASA Johnson Space Center Institutional Review Board (JSC IRB) for the VITA project. An integrated plan for using the HERA facility was developed with support from the HERA Experiment Support Scientist (ESS). The Science Requirements Document (SRD) for the VITA HERA experiment was created, reviewed, and signed. The VITA study timeline was developed with the HERA ESS.

The studies for VITA Project Aim 1 were defined, and their implementation begun. The first study is a pilot laboratory study to determine techniques for use in the second study, to be conducted in HERA Campaign 6. Both studies will investigate situation awareness and workload when using VITA to help the user prepare for and perform a manual assembly task. Specifically, VITA will assist the human in assembling and disassembling a small rover. The rover can be configured with a gripper or with multiple alternative means of locomotion. The virtual task assistant will use augmented reality and multi-modal techniques to prompt and inform the user when performing assembly tasks on the rover. Participants will complete the rover assembly procedure task under three procedure completion conditions: 1) individual participants with electronic procedures on a tablet, 2) individual participants with the VITA task assistant, and 3) a team of two participants with electronic procedures on a tablet. Participants will cycle across a unique set of rover procedures, following random assignment. SA and workload measures while using the VITA task assistant will be compared with baseline performance using assembly procedures in typical electronic procedure displays for NASA (such as Orion or International Space Station (ISS) displays) available on a portable tablet.
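The random assignment of procedure sets across the three conditions can be sketched as follows. This is an illustrative counterbalancing scheme under assumed condition and procedure-set names, not the study's actual randomization plan:

```python
import random

# Hypothetical labels for the three procedure completion conditions
# and the rover procedure sets; the real study materials may differ.
CONDITIONS = ["tablet-individual", "VITA-individual", "tablet-team"]
PROCEDURE_SETS = ["rover-A", "rover-B", "rover-C"]

def assign(participant_id, seed=0):
    """Randomly pair each condition with a unique procedure set,
    reproducibly per participant (string seed keeps it deterministic)."""
    rng = random.Random(f"{participant_id}-{seed}")
    sets = PROCEDURE_SETS[:]
    rng.shuffle(sets)
    return dict(zip(CONDITIONS, sets))
```

Each participant thus completes all three conditions with a distinct procedure set per condition, so condition effects are not confounded with a particular rover configuration.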

The technology for the VITA task assistant to be used in this experiment also was integrated during the definition phase. The HoloLens 1 was integrated with the PRIDE electronic procedure software extended for augmented reality (PRIDEAVR). This integrated technology was validated to produce contextual virtual-assistant directions to the crewmember. The VITA task assistant user interface was designed and implemented as a web user interface, which permits displaying it on either the HoloLens or a tablet. During the definition phase, equipment also was procured for both the pilot study and the HERA Campaign 6 study.

At the time this report was submitted, the first VITA study was in progress. We do not yet have experimental findings to report from this study. The second study to be conducted in HERA Campaign 6 has been designed, technology has been integrated, and preparation for hardware and software delivery to HERA is in progress.

The last quarter of Year 1 will focus on preparation for the study to be conducted in HERA Campaign 6. This includes continuing the pilot study started in the definition phase to evaluate experimental techniques for the HERA C6 study. The study to be conducted in HERA Campaign 6 should start in Year 2 (expected start in August 2020).

Bibliography Type: Description: (Last Updated: 04/14/2022) 

 None in FY 2020
Project Title:  HCAAM VNSCOR: Enhancing Situation Awareness of Automated Procedures Using Adaptive Multimodal Augmented Reality Displays Reduce
Fiscal Year: FY 2019 
Division: Human Research 
Research Discipline/Element:
HRP HFBP:Human Factors & Behavioral Performance (IRP Rev H)
Start Date: 05/15/2019  
End Date: 05/14/2023  
Task Last Updated: 06/25/2019 
Principal Investigator/Affiliation:   Schreckenghost, Debra  M.E.E. / TRACLabs, Inc. 
Address:  16969 N Texas Ave 
Suite 300 
Webster , TX 77598-4085 
Email: ghost@ieee.org 
Phone: 281-461-7886  
Congressional District: 22 
Web:  
Organization Type: INDUSTRY 
Organization Name: TRACLabs, Inc. 
Joint Agency:  
Comments:  
Co-Investigator(s)
Affiliation: 
Holden, Kritina  Ph.D. NASA Johnson Space Center 
Dory, Jonathan  B.S. NASA Johnson Space Center 
Project Information: Grant/Contract No. 80NSSC19K0667 
Responsible Center: NASA JSC 
Grant Monitor: Williams, Thomas  
Center Contact: 281-483-8773 
thomas.j.will1@nasa.gov 
Solicitation / Funding Source: 2017-2018 HERO 80JSC017N0001-BPBA Topics in Biological, Physiological, and Behavioral Adaptations to Spaceflight. Appendix C 
Grant/Contract No.: 80NSSC19K0667 
Project Type: GROUND 
Flight Program:  
TechPort: Yes 
No. of Post Docs:  
No. of PhD Candidates:  
No. of Master's Candidates:  
No. of Bachelor's Candidates:  
No. of PhD Degrees:  
No. of Master's Degrees:  
No. of Bachelor's Degrees:  
Human Research Program Elements: (1) HFBP:Human Factors & Behavioral Performance (IRP Rev H)
Human Research Program Risks: (1) HSIA:Risk of Adverse Outcome Due to Inadequate Human Systems Integration Architecture (IRP Rev L)
Human Research Program Gaps: (1) HSIA-101:We need to identify the Human Systems Integration (HSI) – relevant crew health and performance outcomes, measures, and metrics, needed to characterize and mitigate risk, for future exploration missions (IRP Rev L)
(2) HSIA-201:We need to evaluate the demands of future exploration habitat/vehicle systems and mission scenarios (e.g. increased automation, multi-modal communication) on individuals and teams, and determine the risks these demands pose to crew health and performance (IRP Rev L)
(3) HSIA-401:We need to determine how HSI can be applied in the vehicle/habitat and computer interface Design Phase to mitigate potential decrements in operationally-relevant performance (e.g. problem-solving, execution procedures), during increasingly earth-independent, future exploration missions (including in-mission and at landing) (IRP Rev L)
(4) HSIA-701:We need to determine how human-automation-robotic systems can be optimized for effective enhancement and monitoring of crew capabilities, health, and performance, during increasingly earth-independent, future exploration missions (including in-mission and at landing) (IRP Rev L)
Flight Assignment/Project Notes: NOTE: End date changed to 5/14/2023 per S. Huppman/HRP and NSSC information (Ed., 3/3/2020)

Task Description: This task is part of the Human Capabilities Assessments for Autonomous Missions (HCAAM) Virtual NASA Specialized Center of Research (VNSCOR).

Future deep space missions will present new challenges for crew and increased risks to human performance due to the stress, fatigue, radiation exposure, and isolation that characterize these missions. In addition, crew will no longer be able to depend on timely support from Mission Control due to distance from the Earth, but will have to work autonomously while maintaining high performance. Mission Controllers may not be available to answer questions, check system status, assist with procedures, monitor for errors, or troubleshoot problems. Greater crew autonomy will increase dependence on automated systems, and design of these automated systems must be driven by sound human-system integration standards and guidelines to ensure mission success. Historically, crew have had very limited dependence on automated systems; thus crew will be faced with a new way of working that may put situation awareness (SA) at risk. We must develop methods for promoting good situation awareness in the automated systems that will most certainly be part of future deep space vehicles and habitats.

Procedure automation is a promising technology for reducing crew workload. We define procedure automation as technology that automates the selection or execution of procedural tasks. Structuring the work of automation according to human procedures should improve the transparency of automation actions. This approach provides a means for establishing common ground about ongoing tasks to improve operator understanding of automation behavior.

New technologies such as adaptive, multimodal, augmented reality displays can offer the benefits of information presentation tailored to meet the needs of each crewmember, taking into consideration the current state of that crewmember (e.g., sleep-deprived, high workload), as well as the current state of his/her environment and ongoing activities (e.g., emergency situation, time-critical operations).

We propose to combine technology for procedure automation with technology for augmented reality multi-modal (ARMM) user interfaces using the Microsoft HoloLens head-mounted display to provide a virtual task assistant that helps crew perform procedural work. This virtual task assistant will be capable of identifying which procedures should be performed, performing actions in crew procedures, and summarizing actions taken by the human-automation team to assist crew in preparing for tasks and taking over tasks from other team members.

Four studies are planned to evaluate the effects on human task performance of a virtual task assistant combining procedure automation with augmented reality multi-modal (ARMM) user interfaces. These studies will achieve the following aims:

Aim 1. Determine best methods to improve situation awareness and improve crew autonomy when using a virtual task assistant to prepare for and perform manual maintenance.

Aim 2. Determine best methods to improve situation awareness and reduce workload when a virtual task assistant is used to handover maintenance tasks between users.

Aim 3. Determine best methods to improve situation awareness and reduce workload when using a virtual task assistant to help manage concurrent manual and automated tasks.

The proposed work addresses a number of gaps in the Human Research Program Human Factors and Behavioral Performance risks. This project will provide guidelines for designing effective human-automation systems (Human and Automated/Robotic Interactions (HARI)-02) and evaluate human-automation performance for exemplar procedure automation systems (HARI-03). This project also will provide guidance for the application of multi-modal and adaptive displays and control to Human-Computer Interaction (HCI) design for long duration operations (HCI-04).

Research Impact/Earth Benefits:

Task Progress & Bibliography Information FY2019 
Task Progress: New project for FY2019.

Bibliography Type: Description: (Last Updated: 04/14/2022) 

 None in FY 2019