The NASA Task Book

Project Title:  Mixed Reality (MR) Care-Delivery Guidance System to Support Medical Event Management on Long Duration Exploration Missions
Fiscal Year: FY 2022 
Division: Human Research 
Research Discipline/Element:
TRISH--TRISH 
Start Date: 04/01/2020  
End Date: 03/31/2022  
Task Last Updated: 01/19/2023 
Principal Investigator/Affiliation:   Dias, Roger Daglius M.D., Ph.D. / Brigham and Women's Hospital 
Address:  Department of Emergency Medicine 
75 Francis St 
Boston, MA 02115 
Email: rdias@bwh.harvard.edu 
Phone: 617-525-7627  
Congressional District:
Web:  
Organization Type: UNIVERSITY 
Organization Name: Brigham and Women's Hospital 
Joint Agency:  
Comments:  
Co-Investigator(s)
Affiliation: 
Gupta, Avni  M.P.H. Brigham and Women's Hospital 
Lipsitz, Stuart  Sc.D. Brigham and Women's Hospital 
Pozner, Charles  M.D. Brigham and Women's Hospital 
Robertson, Jamie  Ph.D. Brigham and Women's Hospital 
Smink, Douglas  M.D. Brigham and Women's Hospital 
Musson, David  M.D., Ph.D. McMaster University 
Doyle, Thomas  Ph.D. McMaster University 
Yule, Steven  Ph.D. Brigham and Women's Hospital 
Project Information: Grant/Contract No. NNX16AO69A-T0506 
Responsible Center: TRISH 
Grant Monitor:  
Center Contact:   
Unique ID: 13968 
Solicitation / Funding Source: 2020 TRISH BRASH1901: Translational Research Institute for Space Health (TRISH) Biomedical Research Advances for Space Health 
Grant/Contract No.: NNX16AO69A-T0506 
Project Type: GROUND 
Flight Program:  
TechPort: No 
No. of Post Docs:
No. of PhD Candidates:
No. of Master's Candidates:
No. of Bachelor's Candidates:
No. of PhD Degrees:
No. of Master's Degrees:
No. of Bachelor's Degrees:
Human Research Program Elements: None
Human Research Program Risks: None
Human Research Program Gaps: None
Task Description: Unanticipated medical events may affect crew health, impair in-flight capacity, and compromise the success of long-duration exploration missions. Like technical problem solving, medical events require crew members to coordinate rapidly in order to diagnose and manage situations that may be outside their primary technical expertise. Missions such as those to Mars will take upwards of three years and will lack real-time communication with experts on the ground. As a result, we need to provide the crew with tools and technology that can help them provide medical care autonomously.

Effective spaceflight medical training must be combined with in-flight support tools to ensure crew competence in managing medical events and caring for sick astronauts. Collectively called Augmented Clinical Tools (ACT), these include technologies and applications that assist medical decision-making and action. Mixed Reality (MR) -- the ability to place virtual and photo-realistic items into the field of view using holograms -- provides an immersive, realistic user experience that has also proven feasible for training and guidance during non-routine technical tasks.

We propose to utilize existing technology to develop MR software that provides realistic training scenarios for astronauts and combines medical education with real-time clinical support for some probable medical events in deep space. This includes a “SMART checklist” that guides astronauts through managing medical events in real time. MR allows us to create lifelike space environments in which astronauts can practice their skills. We will involve a wide range of stakeholders in software development and in testing for usability, engagement, and performance. The project will take two years to complete, and we will provide innovative products and guidance that can be incorporated into astronaut training to ensure that crews have the knowledge, skills, and support to manage the expected and unexpected challenges of deep space missions.

Research Impact/Earth Benefits: This project produced several benefits and advanced the fields of space health, medical training, and extended reality. Although we deployed existing technologies, we used them in an innovative way, bringing photorealism and interactive storytelling from industries such as entertainment and gaming to improve medical competence and clinical performance in the space context. By designing, developing, prototyping, and iteratively testing software and research methodologies, our team gained substantial knowledge and valuable insights that, through scientific dissemination (presentations and peer-reviewed manuscripts), will allow other researchers and developers to design and build Extended Reality (XR)-based space health systems more efficiently and effectively. By applying a wide range of methodologies, from qualitative research (a Delphi panel), through evidence review (a systematic literature review), to rigorous quantitative research (a randomized trial), this project produced a substantial body of novel knowledge and scientific evidence in the emerging field of XR applied to space health. Future initiatives in this arena will benefit greatly from these products, since they can build on the lessons and findings we have generated rather than starting from scratch by trial and error. Our deliverables, including fully functional, highly realistic XR interactive training scenarios, demonstrated the feasibility of such a platform; furthermore, we investigated the advantages and disadvantages of different XR modalities, informing the selection and prioritization of specific modalities based on scientific data. In addition to training applications, this project created a proof-of-concept augmented reality clinical guidance/decision support system to help astronauts during real-life medical care in space. This is the first step toward validating and evaluating the effectiveness of such augmented clinical tools, so that they can integrate seamlessly into the medical systems of space missions for progressively Earth-independent medical care.

Task Progress & Bibliography Information FY2022 
Task Progress: Unanticipated medical events may affect crew health, impair in-flight capacity, and compromise the success of long-duration exploration missions. Missions such as those to Mars will take upwards of three years and will lack real-time communication with experts on the ground. As a result, we need to provide the crew with tools and technology that can help them provide medical care autonomously. Effective spaceflight medical training must be combined with in-flight support tools to ensure crew competence in managing medical events and caring for sick astronauts.

In this study, we proposed to utilize existing technology to develop Extended Reality (XR) software that provides immersive, interactive, realistic training scenarios for astronauts and combines medical education with real-time clinical support for some probable medical events in deep space. This includes a "smart checklist" application deployed as an augmented reality (AR) coach (AR-Coach) that guides astronauts through managing medical events in real time. XR technology allowed us to create lifelike space environments in which astronauts can practice their skills. We involved a wide range of stakeholders in software development and in testing for usability, engagement, and performance. This project will provide innovative products and guidance that can be incorporated into astronaut training to ensure that crews have the knowledge, skills, and support to manage the expected and unexpected challenges of deep space missions. The specific aims of this project were:

Aim 1: Identify the necessary features and functionalities of Mixed Reality (MR) medical education for long-duration exploration missions (LDEMs).

Aim 2: Adapt an existing validated MR platform to deliver immersive medical education modules incorporating a real-time care-delivery guidance tool (SMART checklist) to support autonomous medical event management on LDEMs.

Aim 3: Implement a data-driven integrative approach to evaluate the MR medical education platform.

In the first year of this project, we convened a multidisciplinary expert panel composed of 45 panelists. During online meetings and through surveys between meetings, we applied the Delphi method to reach consensus on the functionalities that are essential for both XR medical education and clinical guidance during LDEMs. The expert panel also provided recommendations on specific medical events that would be suitable for an XR platform, in addition to relevant clinical competencies and instructional design considerations for simulation scenarios and Augmented Clinical Tools (ACT) development. Based on these findings, we listed a total of 89 distinct XR functionalities, from which 13 were removed based on the level of essentiality assessed by experts after 4 rounds of the Delphi method. Based on the expert panel rating, we also selected tension pneumothorax and smoke inhalation as the medical events to be featured in the XR scenarios. Preliminary data from the expert panel on emergency medicine XR scenarios were presented as a poster at the 2021 Society for Academic Emergency Medicine (SAEM) Annual Conference. A manuscript reporting the results of the expert panel and proposing a framework for the design and development of XR-based space health applications, entitled "Using Extended Reality (XR) for Medical Training and Real-Time Clinical Support during Deep Space Missions," is under peer review at Applied Ergonomics. We have also completed a systematic review of the literature on XR applications for space health, and a manuscript entitled "Applications of Extended Reality (XR) for Space Health: A Systematic Review" is under review at Aerospace Medicine and Human Performance.
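To make the Delphi filtering step above concrete, the short sketch below shows one plausible way to drop low-essentiality functionalities after a final rating round. It is illustrative only: the functionality names, panelist scores, and median-based cutoff are invented, and the report does not specify the panel's actual retention criterion.

# Hypothetical sketch of Delphi essentiality filtering (Python).
# Functionality names, ratings, and the cutoff are invented for illustration.
from statistics import median

# One 1-5 essentiality rating per panelist from the final Delphi round.
ratings = {
    "voice-activated checklist": [5, 4, 5, 5, 4],
    "holographic anatomy overlay": [4, 5, 4, 4, 5],
    "haptic glove feedback": [2, 1, 3, 2, 2],
}

ESSENTIAL_CUTOFF = 3.5  # assumed threshold, not the study's actual criterion

def filter_essential(ratings):
    """Keep functionalities whose median panel rating meets the cutoff."""
    kept, removed = [], []
    for name, scores in ratings.items():
        (kept if median(scores) >= ESSENTIAL_CUTOFF else removed).append(name)
    return kept, removed

kept, removed = filter_essential(ratings)
print("retained:", kept)
print("removed: ", removed)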

A prototype of the AR-Coach system was developed to provide astronauts with procedural guidance through a HoloLens mixed reality device during ultrasound procedures in space. This work was published in the peer-reviewed proceedings of the International Conference on Applied Human Factors and Ergonomics (AHFE).
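The report does not describe AR-Coach's internals, but the core pattern of step-wise procedural guidance can be sketched as a small checklist state machine that displays one step at a time and advances on user confirmation. The sketch below is a minimal, hypothetical illustration; the step names and the display call are placeholders, not the actual AR-Coach implementation.

# Minimal, hypothetical sketch of step-wise procedural guidance (Python).
from dataclasses import dataclass, field

@dataclass
class GuidedProcedure:
    steps: list[str]
    index: int = 0
    completed: list[str] = field(default_factory=list)

    def current(self) -> str | None:
        """Return the step currently shown to the user, or None when done."""
        return self.steps[self.index] if self.index < len(self.steps) else None

    def confirm(self) -> None:
        """Mark the displayed step complete and advance to the next one."""
        self.completed.append(self.steps[self.index])
        self.index += 1

# Invented ultrasound workflow steps, for illustration only.
proc = GuidedProcedure([
    "Apply gel to the probe",
    "Position the probe on the target site",
    "Acquire and freeze the image",
])
while (step := proc.current()) is not None:
    print("Display in headset:", step)  # stand-in for the AR rendering layer
    proc.confirm()                      # stand-in for a user confirmation gesture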

In the second year of this project, we used professional XR film production and volumetric video techniques to create two interactive XR scenarios, deployed in three different modalities: screen-based virtual reality (desktop VR), fully immersive VR (head-mounted display), and mobile augmented reality (AR through a tablet). After design, development, and testing, we conducted usability studies and, subsequently, a randomized trial, recruiting astronaut-like participants and randomizing them to medical training and assessment in one of the three XR modalities. Several metrics, including demographics, technology acceptance, sense of presence, learning outcomes, and digital physiological biomarkers of cognitive load, were captured and analyzed. The findings showed that the AR modality produced the highest sense of presence, involvement, and experienced realism compared with the desktop and VR modalities. There was no statistically significant difference among the three modalities in the System Usability Scale (SUS) score, although the AR modality showed a higher score with a trend toward statistical significance. Regarding cognitive load, there was no statistically significant difference in the NASA Task Load Index (TLX) overall score; however, the "physical demand" domain was rated higher in the AR modality than in the desktop and VR modalities. Among several physiological metrics, the low frequency/high frequency (LF/HF) ratio -- a biomarker of cognitive load -- was higher in subjects using the AR modality. Performance efficiency, measured as the time in minutes to complete the scenarios, did not differ between groups. Interestingly, the medical knowledge test showed that subjects in the AR modality scored lower than those in the desktop modality, with no difference compared to the VR modality.
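The LF/HF ratio reported above is conventionally derived from heart rate variability: spectral power of the RR-interval series in the low-frequency band (0.04-0.15 Hz) divided by power in the high-frequency band (0.15-0.40 Hz). The sketch below shows a minimal version of that standard computation on synthetic data; the study's actual signal-processing pipeline is not described in the report, so the resampling rate, window length, and band edges here are common defaults rather than the authors' settings.

# Minimal LF/HF computation on synthetic RR intervals (Python).
import numpy as np
from scipy.interpolate import interp1d
from scipy.signal import welch

def lf_hf_ratio(rr_ms, fs=4.0):
    """LF (0.04-0.15 Hz) over HF (0.15-0.40 Hz) power of an RR-interval series."""
    t = np.cumsum(rr_ms) / 1000.0                  # beat times in seconds
    t_even = np.arange(t[0], t[-1], 1.0 / fs)      # evenly spaced time grid
    rr_even = interp1d(t, rr_ms, kind="cubic")(t_even)
    f, pxx = welch(rr_even - rr_even.mean(), fs=fs,
                   nperseg=min(256, len(rr_even)))
    lf_band = (f >= 0.04) & (f < 0.15)
    hf_band = (f >= 0.15) & (f < 0.40)
    lf = np.trapz(pxx[lf_band], f[lf_band])
    hf = np.trapz(pxx[hf_band], f[hf_band])
    return lf / hf

# Synthetic RR intervals (ms): ~800 ms baseline, slow oscillation plus noise.
rng = np.random.default_rng(0)
rr = 800 + 50 * np.sin(2 * np.pi * 0.1 * np.arange(300)) + rng.normal(0, 20, 300)
print(f"LF/HF ratio: {lf_hf_ratio(rr):.2f}")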

Bibliography (Last Updated: 07/12/2023) 

 
Abstracts for Journals and Proceedings Ebnali M, Burian BK, Robertson JM, Musson D, Atamna B, Yule S, Dias RD. "A taxonomy for design and development of extended reality medical training and real-time clinical guidance during space missions." 2022 NASA Human Research Program Investigators’ Workshop, Virtual, February 7-10, 2022.

Abstracts. 2022 NASA Human Research Program Investigators’ Workshop, Virtual, February 7-10, 2022. Feb-2022.

Abstracts for Journals and Proceedings Ebnali M, Goldsmith AJ, Burian B, Atamna B, Duggan NM, Fischetti C, Yule S, Dias R. "AR-Coach: Using Augmented Reality (AR) for real-time clinical guidance during medical emergencies on deep space exploration missions." 13th International Conference on Applied Human Factors and Ergonomics (AHFE 2022), New York, NY, July 24-28, 2022.

Abstracts. 13th International Conference on Applied Human Factors and Ergonomics (AHFE 2022), New York, NY, July 24-28, 2022. Jul-2022.

Abstracts for Journals and Proceedings Dias RD, Robertson JM, Miccile C, Pozner CN, Mormann B, Smink DS, Lipsitz S, Doyle T, Mansouri M, Cerqueira R, Musson D, Burian BK, Yule S. "Extended Reality (XR) for emergency care training and real time clinical guidance during long-duration space missions." Society for Academic Emergency Medicine (SAEM) 2021 Meeting, Virtual, May 11-14, 2021.

Abstracts. Acad Emerg Med. 2021;28(Suppl. 1):S9–S398. May-2021.

Abstracts for Journals and Proceedings Dias RD, Ebnali M, Cerqueira R, Robertson JM, Burian BK, Miccile C, Mansouri M, Mormann B, Smink DS, Lipsitz S, Doyle T, Musson D, Pozner CN, Yule S. "Extended reality (XR) medical scenarios for in-flight emergency care training during deep space missions." 2022 NASA Human Research Program Investigators’ Workshop, Virtual, February 7-10, 2022.

Abstracts. 2022 NASA Human Research Program Investigators’ Workshop, Virtual, February 7-10, 2022. Feb-2022.

Articles in Peer-reviewed Journals Ebnali M, Goldsmith A, Burian B, Atamna B, Duggan N, Fischetti C, Yule S, Dias R. "AR-coach: Using augmented reality (AR) for real-time clinical guidance during medical emergencies on deep space exploration missions." In: Jay Kalra and Nancy Lightner (eds). Healthcare and Medical Devices. AHFE International, 2022. p. 67-75. http://doi.org/10.54941/ahfe1002100. Jul-2022.
Articles in Peer-reviewed Journals Ebnali M, Paladugu P, Miccile C, Park SH, Burian B, Yule S, Dias RD. "Extended reality applications for space health." Aerosp Med Hum Perform. 2023 Mar 1;94(3):122-30. https://doi.org/10.3357/AMHP.6131.2023; PMID: 36829279. Mar-2023.
Articles in Peer-reviewed Journals Burian BK, Ebnali M, Robertson JM, Musson D, Pozner CN, Doyle T, Smink DS, Miccile C, Paladugu P, Atamna B, Lipsitz S, Yule S, Dias RD. "Using extended reality (XR) for medical training and real-time clinical support during deep space missions." Appl Ergon. 2023 Jan;106:103902. https://doi.org/10.1016/j.apergo.2022.103902; PMID: 36162274. Jan-2023.
Project Title:  Mixed Reality (MR) Care-Delivery Guidance System to Support Medical Event Management on Long Duration Exploration Missions
Fiscal Year: FY 2021 
Division: Human Research 
Research Discipline/Element:
TRISH--TRISH 
Start Date: 04/01/2020  
End Date: 03/31/2022  
Task Last Updated: 07/22/2021 
Principal Investigator/Affiliation:   Dias, Roger Daglius M.D., Ph.D. / Brigham and Women's Hospital 
Address:  Department of Emergency Medicine 
75 Francis St 
Boston, MA 02115 
Email: rdias@bwh.harvard.edu 
Phone: 617-525-7627  
Congressional District:
Web:  
Organization Type: UNIVERSITY 
Organization Name: Brigham and Women's Hospital 
Joint Agency:  
Comments:  
Co-Investigator(s)
Affiliation: 
Gupta, Avni  M.P.H. Brigham and Women's Hospital 
Lipsitz, Stuart  Sc.D. Brigham and Women's Hospital 
Pozner, Charles  M.D. Brigham and Women's Hospital 
Robertson, Jamie  Ph.D. Brigham and Women's Hospital 
Smink, Douglas  M.D. Brigham and Women's Hospital 
Musson, David  M.D., Ph.D. McMaster University 
Doyle, Thomas  Ph.D. McMaster University 
Yule, Steven  Ph.D. Brigham and Women's Hospital 
Project Information: Grant/Contract No. NNX16AO69A-T0506 
Responsible Center: TRISH 
Grant Monitor:  
Center Contact:   
Unique ID: 13968 
Solicitation / Funding Source: 2020 TRISH BRASH1901: Translational Research Institute for Space Health (TRISH) Biomedical Research Advances for Space Health 
Grant/Contract No.: NNX16AO69A-T0506 
Project Type: GROUND 
Flight Program:  
TechPort: No 
No. of Post Docs:
No. of PhD Candidates:
No. of Master's Candidates:
No. of Bachelor's Candidates:
No. of PhD Degrees:
No. of Master's Degrees:
No. of Bachelor's Degrees:
Human Research Program Elements: None
Human Research Program Risks: None
Human Research Program Gaps: None
Task Description: Unanticipated medical events may affect crew health, impair in-flight capacity, and compromise the success of long-duration exploration missions. Like technical problem solving, medical events require crew members to coordinate rapidly in order to diagnose and manage situations that may be outside their primary technical expertise. Missions such as those to Mars will take upwards of three years and will lack real-time communication with experts on the ground. As a result, we need to provide the crew with tools and technology that can help them provide medical care autonomously.

Effective spaceflight medical training must be combined with in-flight support tools to ensure crew competence in managing medical events and caring for sick astronauts. Collectively called Augmented Clinical Tools (ACT), these include technologies and applications that assist medical decision-making and action. Mixed Reality (MR) -- the ability to place virtual and photo-realistic items into the field of view using holograms -- provides an immersive, realistic user experience that has also proven feasible for training and guidance during non-routine technical tasks.

We propose to utilize existing technology to develop MR software that provides realistic training scenarios for astronauts and combines medical education with real-time clinical support for some probable medical events in deep space. This includes a “SMART checklist” that guides astronauts through managing medical events in real time. MR allows us to create lifelike space environments in which astronauts can practice their skills. We will involve a wide range of stakeholders in software development and in testing for usability, engagement, and performance. The project will take two years to complete, and we will provide innovative products and guidance that can be incorporated into astronaut training to ensure that crews have the knowledge, skills, and support to manage the expected and unexpected challenges of deep space missions.

Research Impact/Earth Benefits: Impact for Space

This project will foster effective in-flight medical training for both on-demand and continuing medical education.

This project will provide clinical decision support tools for astronauts managing in-flight medical emergencies under varied levels of autonomy.

The deliverables from this project have the potential to mitigate the risks related to Adverse Health Outcomes & Decrements in Performance due to In-flight Medical Conditions during long-duration exploration missions (LDEMs).

Impact for Earth

Similar to LDEMs, many terrestrial settings require medical education and clinical guidance tools for autonomous training and clinical care. The deliverables from this project can be used to support healthcare professionals who work in austere environments. Even in resource-rich environments such as hospitals, medical trainees practice medicine with varying levels of autonomy throughout their clinical training. The project's deliverables can also be used in these settings to support trainees (e.g., during medical residency) when they do not have immediate access to their supervisors (e.g., during shifts when the attending is on call).

Task Progress & Bibliography Information FY2021 
Task Progress: Unanticipated medical events may affect crew health, impair in-flight capacity, and compromise the success of long-duration exploration missions (LDEMs). Like technical problem solving, medical events require crew members to coordinate rapidly in order to diagnose and manage situations that may be outside their primary technical expertise. Missions such as those to Mars will take upwards of three years and will lack real-time communication with experts on the ground. As a result, we need to provide the crew with tools and technology that can help them provide medical care autonomously.

Effective spaceflight medical training must be combined with in-flight support tools to ensure crew competence in managing medical events and caring for sick astronauts. Collectively called Augmented Clinical Tools (ACT), these include technologies and applications that assist medical decision-making and action. Extended Reality (XR) -- the ability to place virtual and photo-realistic items into the field of view using holograms -- provides an immersive, realistic user experience that has also proven feasible for training and guidance during non-routine technical tasks.

We propose to utilize existing technology to develop XR software that provides realistic training scenarios for astronauts and combines medical education with real-time clinical support for some probable medical events in deep space. This includes a SMART checklist that guides astronauts through managing medical events in real time. XR allows us to create lifelike space environments in which astronauts can practice their skills. We will involve a wide range of stakeholders in software development and in testing for usability, engagement, and performance. The project will take two years to complete, and we will provide innovative products and guidance that can be incorporated into astronaut training to ensure that crews have the knowledge, skills, and support to manage the expected and unexpected challenges of deep space missions.

In the first year of this project, we completed a total of 5 monthly videoconferences with a multidisciplinary expert panel composed of 45 panelists. During the meetings and through surveys between meetings, we applied the Delphi method to reach consensus on the functionalities that are essential for both XR medical education and clinical guidance during LDEMs. The expert panel also provided recommendations on specific medical events that would be suitable for an XR platform, in addition to relevant clinical competencies and instructional design considerations for simulation scenarios and ACT development. Based on these findings, we listed a total of 89 distinct XR functionalities, from which 13 were removed based on the level of essentiality assessed by experts after 4 rounds of the Delphi method. Based on the expert panel rating, we also selected tension pneumothorax and smoke inhalation as the medical events to be featured in the XR scenarios. Preliminary data from the expert panel on emergency medicine XR scenarios were submitted and accepted for poster presentation at the 2021 Society for Academic Emergency Medicine (SAEM) Annual Conference.

In the second year of this project, we will develop two simulated scenarios for three different modalities: screen-based virtual reality (VR), immersive VR, and augmented reality (AR). After development, we will run an experiment, recruiting astronaut-like participants and randomizing them to one of these modalities. A series of learning outcomes and engagement metrics, including learners' physiological signals, will be captured and analyzed.
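As one illustration of how such a three-arm assignment could be kept balanced across the modalities named above, the sketch below uses simple block randomization. The trial's actual allocation scheme is not specified in the report, so this is an assumption for illustration only.

# Hedged sketch of block randomization into three XR arms (Python).
import random

MODALITIES = ["screen-based VR", "immersive VR", "AR"]

def block_randomize(n_participants, seed=42):
    """Assign arms in shuffled blocks of three so group sizes stay balanced."""
    rng = random.Random(seed)
    assignments = []
    while len(assignments) < n_participants:
        block = MODALITIES[:]
        rng.shuffle(block)
        assignments.extend(block)
    return assignments[:n_participants]

for pid, arm in enumerate(block_randomize(9), start=1):
    print(f"participant {pid:02d} -> {arm}")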

Bibliography (Last Updated: 07/12/2023) 

 
 None in FY 2021
Project Title:  Mixed Reality (MR) Care-Delivery Guidance System to Support Medical Event Management on Long Duration Exploration Missions
Fiscal Year: FY 2020 
Division: Human Research 
Research Discipline/Element:
TRISH--TRISH 
Start Date: 04/01/2020  
End Date: 03/31/2022  
Task Last Updated: 07/30/2020 
Principal Investigator/Affiliation:   Dias, Roger Daglius M.D., Ph.D. / Brigham and Women's Hospital 
Address:  Department of Emergency Medicine 
75 Francis St 
Boston, MA 02115 
Email: rdias@bwh.harvard.edu 
Phone: 617-525-7627  
Congressional District:
Web:  
Organization Type: UNIVERSITY 
Organization Name: Brigham and Women's Hospital 
Joint Agency:  
Comments:  
Co-Investigator(s)
Affiliation: 
Yule, Steven  Ph.D. Brigham and Women's Hospital, Inc. 
Doyle, Thomas  Ph.D. McMaster University, Canada 
Gupta, Avni  M.P.H. Brigham and Women's Hospital, Inc. 
Lipsitz, Stuart  Sc.D. Brigham and Women's Hospital, Inc. 
Pozner, Charles  M.D. Brigham and Women's Hospital, Inc. 
Robertson, Jamie  Ph.D. Brigham and Women's Hospital, Inc. 
Smink, Douglas  M.D. Brigham and Women's Hospital, Inc. 
Musson, David  M.D., Ph.D. McMaster University, Canada 
Key Personnel Changes / Previous PI: Roger Dias, MD, PhD, became Principal Investigator (PI) when the project started. The original PI on the proposal was Steven Yule, PhD, who relocated to the University of Edinburgh in Scotland while retaining a faculty position at Brigham and Women's Hospital/Harvard Medical School, and is now a Co-Investigator on the project.
Project Information: Grant/Contract No. NNX16AO69A-T0506 
Responsible Center: TRISH 
Grant Monitor:  
Center Contact:   
Unique ID: 13968 
Solicitation / Funding Source: 2020 TRISH BRASH1901: Translational Research Institute for Space Health (TRISH) Biomedical Research Advances for Space Health 
Grant/Contract No.: NNX16AO69A-T0506 
Project Type: GROUND 
Flight Program:  
TechPort: No 
No. of Post Docs:  
No. of PhD Candidates:  
No. of Master's Candidates:  
No. of Bachelor's Candidates:  
No. of PhD Degrees:  
No. of Master's Degrees:  
No. of Bachelor's Degrees:  
Human Research Program Elements: None
Human Research Program Risks: None
Human Research Program Gaps: None
Task Description: Unanticipated medical events may affect crew health, impair in-flight capacity, and compromise the success of long-duration exploration missions. Like technical problem solving, medical events require crew members to coordinate rapidly in order to diagnose and manage situations that may be outside their primary technical expertise. Missions such as those to Mars will take upwards of three years and will lack real-time communication with experts on the ground. As a result, we need to provide the crew with tools and technology that can help them provide medical care autonomously.

Effective spaceflight medical training must be combined with in-flight support tools to ensure crew competence in managing medical events and caring for sick astronauts. Collectively called Augmented Clinical Tools (ACT), these include technologies and applications that assist medical decision-making and action. Mixed Reality (MR) -- the ability to place virtual and photo-realistic items into the field of view using holograms -- provides an immersive, realistic user experience that has also proven feasible for training and guidance during non-routine technical tasks.

We propose to utilize existing technology to develop MR software that provides realistic training scenarios for astronauts and combines medical education with real-time clinical support for some probable medical events in deep space. This includes a “SMART checklist” that guides astronauts through managing medical events in real time. MR allows us to create lifelike space environments in which astronauts can practice their skills. We will involve a wide range of stakeholders in software development and in testing for usability, engagement, and performance. The project will take two years to complete, and we will provide innovative products and guidance that can be incorporated into astronaut training to ensure that crews have the knowledge, skills, and support to manage the expected and unexpected challenges of deep space missions.

Research Impact/Earth Benefits:

Task Progress & Bibliography Information FY2020 
Task Progress: New project for FY2020.

Bibliography (Last Updated: 07/12/2023) 

 
 None in FY 2020