Task Progress:
Unanticipated medical events may affect crew health, reduce in-flight capacity, and compromise the success of long-duration exploration missions. Missions such as those to Mars will last upwards of three years and will lack real-time communication with experts on the ground. As a result, crews need tools and technology that enable them to deliver medical care autonomously. Effective spaceflight medical training must be combined with in-flight support tools to ensure crew competence in managing medical events and caring for sick astronauts.
In this study, we proposed to use existing technology to develop Extended Reality (XR) software that provides immersive, interactive, and realistic training scenarios for astronauts, combining medical education with real-time clinical support for probable medical events in deep space. This includes a "smart checklist" application deployed as an augmented reality (AR) coach (AR-Coach) that guides astronauts through the management of medical events in real time. XR technology allowed us to create lifelike space environments in which astronauts can practice their skills. We involved a wide range of stakeholders in software development and in testing for usability, engagement, and performance. This project will provide innovative products and guidance that can be incorporated into astronaut training to ensure that crews have the knowledge, skills, and support to manage the expected and unexpected challenges of deep space missions. The specific aims of this project were:
Aim 1: Identify the necessary features and functionalities of Mixed Reality (MR) medical education for long-duration exploration missions (LDEMs).
Aim 2: Adapt an existing validated MR platform to deliver immersive medical education modules incorporating a real-time care-delivery guidance tool (SMART checklist) to support autonomous medical event management on LDEMs.
Aim 3: Implement a data-driven integrative approach to evaluate the MR medical education platform.
In the first year of this project, we convened a multidisciplinary expert panel of 45 panelists. During online meetings and through surveys between meetings, we applied the Delphi method to reach consensus on the functionalities that are essential for both XR medical education and clinical guidance during LDEMs. The expert panel also provided recommendations on specific medical events suitable for an XR platform, as well as on relevant clinical competencies and instructional design considerations for simulation scenarios and Augmented Clinical Tools (ACT) development. Based on these findings, we identified a total of 89 distinct XR functionalities, of which 13 were removed after four Delphi rounds based on the experts' essentiality ratings. Based on the expert panel ratings, we also selected tension pneumothorax and smoke inhalation as the medical events to be featured in the XR scenarios. Preliminary data from the expert panel on emergency medicine XR scenarios were presented as a poster at the 2021 Society for Academic Emergency Medicine (SAEM) Annual Conference. A final manuscript reporting the expert panel results and proposing a framework for the design and development of XR-based space health applications, entitled "Using Extended Reality (XR) for Medical Training and Real-Time Clinical Support during Deep Space Missions," is under peer review at Applied Ergonomics. We have also completed a systematic review of the literature on XR applications for space health, and a manuscript entitled "Applications of Extended Reality (XR) for Space Health: A Systematic Review" is under review at Aerospace Medicine and Human Performance.
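To illustrate the kind of retention rule used when pruning the functionality list across Delphi rounds, the sketch below shows one plausible way to combine panelists' essentiality ratings into a keep/drop decision. The cutoffs, the 1-5 rating scale, and the function names are illustrative assumptions; the panel's actual consensus criteria are described in the Applied Ergonomics manuscript.

```python
# Illustrative sketch of a Delphi-style retention rule (hypothetical thresholds;
# the actual panel criteria are reported in the cited manuscript).
from statistics import median

def retain_item(ratings, essential_cutoff=4, agreement_cutoff=0.7):
    """Keep an XR functionality if the panel's median essentiality rating
    (1 = not essential ... 5 = essential) meets the cutoff and a sufficient
    proportion of panelists rate it at or above that cutoff."""
    at_or_above = sum(r >= essential_cutoff for r in ratings) / len(ratings)
    return median(ratings) >= essential_cutoff and at_or_above >= agreement_cutoff

# Example: ratings from one Delphi round for two candidate functionalities
print(retain_item([5, 4, 4, 5, 3, 4, 5]))   # True  -> carried into the next round
print(retain_item([2, 3, 2, 4, 1, 3, 2]))   # False -> dropped from the taxonomy
```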
A prototype of the AR-Coach system was developed to provide procedural guidance to astronauts through a HoloLens mixed reality device during ultrasound procedures in space. This work was published in the peer-reviewed proceedings of the International Conference on Applied Human Factors and Ergonomics (AHFE).
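As a rough illustration of the "smart checklist" concept behind AR-Coach, the sketch below models a procedure as an ordered list of steps that advances only when the current step is confirmed (for example, by voice or gaze input in the headset). The step names and confirmation logic are placeholders, not the deployed HoloLens implementation.

```python
# Minimal sketch of a "smart checklist" step tracker of the kind AR-Coach uses
# to pace procedural guidance; step names and confirmation logic are placeholders.
from dataclasses import dataclass, field

@dataclass
class SmartChecklist:
    steps: list                          # ordered procedural steps shown to the crew member
    completed: list = field(default_factory=list)

    def current_step(self):
        """Return the next step to perform, or None when the procedure is finished."""
        return self.steps[len(self.completed)] if len(self.completed) < len(self.steps) else None

    def confirm(self, note=""):
        """Mark the current step done (e.g., via a voice or gaze confirmation in AR)."""
        step = self.current_step()
        if step is not None:
            self.completed.append((step, note))
        return self.current_step()

checklist = SmartChecklist(["Position patient", "Apply ultrasound gel", "Acquire probe view"])
print(checklist.confirm("done"))  # -> "Apply ultrasound gel"
```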
In the second year of this project, we used professional XR film production and volumetric video techniques to create two interactive XR scenarios, deployed in three modalities: screen-based virtual reality (desktop VR), fully immersive VR (head-mounted display), and mobile augmented reality (AR on a tablet). After design, development, and testing, we conducted usability studies and then a randomized trial, recruiting astronaut-like participants and randomizing them to medical training and assessment in one of the three XR modalities. Several metrics, including demographics, technology acceptance, sense of presence, learning outcomes, and digital physiological biomarkers of cognitive load, were captured and analyzed. The findings showed that the AR modality produced the highest sense of presence, involvement, and experienced realism compared with the desktop and VR modalities. There was no statistically significant difference among the three modalities in the System Usability Scale (SUS) score, although the AR modality scored higher, with a trend toward statistical significance. Regarding cognitive load, there was no statistically significant difference in the NASA Task Load Index (TLX) overall score; however, the "physical demand" domain was rated higher in the AR modality than in the desktop and VR modalities. Among several physiological metrics, the low frequency/high frequency (LF/HF) ratio, a biomarker of cognitive load, was higher in participants using the AR modality. Performance efficiency, measured as time in minutes to complete the scenarios, did not differ between groups. Interestingly, on the medical knowledge test, participants in the AR modality scored lower than those in the desktop modality, with no difference compared with the VR modality.
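For reference, the sketch below shows how two of the reported metrics are conventionally computed: the System Usability Scale (SUS) score from its ten 1-5 item responses, and an LF/HF ratio estimated from RR intervals via a resampled tachogram and a Welch power spectrum. The sampling rate, window length, and variable names are illustrative assumptions rather than the exact analysis pipeline used in the trial.

```python
# Hedged analysis sketch: standard SUS scoring and an LF/HF ratio estimate from
# RR intervals; settings are illustrative, not the trial's exact pipeline.
import numpy as np
from scipy.signal import welch

def sus_score(item_responses):
    """System Usability Scale: 10 items rated 1-5; odd items score (x - 1),
    even items score (5 - x); the sum is scaled by 2.5 to give a 0-100 score."""
    odd = sum(x - 1 for x in item_responses[0::2])
    even = sum(5 - x for x in item_responses[1::2])
    return 2.5 * (odd + even)

def lf_hf_ratio(rr_ms, fs=4.0):
    """LF/HF ratio from RR intervals (ms): resample the tachogram evenly,
    estimate the power spectrum, and integrate LF (0.04-0.15 Hz) and HF (0.15-0.4 Hz)."""
    t = np.cumsum(rr_ms) / 1000.0                       # beat times in seconds
    grid = np.arange(t[0], t[-1], 1.0 / fs)             # uniform time grid
    tachogram = np.interp(grid, t, rr_ms)
    f, pxx = welch(tachogram - tachogram.mean(), fs=fs, nperseg=min(256, len(grid)))
    lf = np.trapz(pxx[(f >= 0.04) & (f < 0.15)], f[(f >= 0.04) & (f < 0.15)])
    hf = np.trapz(pxx[(f >= 0.15) & (f < 0.40)], f[(f >= 0.15) & (f < 0.40)])
    return lf / hf

print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))        # -> 85.0
print(lf_hf_ratio(800 + 50 * np.random.randn(300)))     # synthetic RR series; value varies
```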
Abstracts for Journals and Proceedings

Ebnali M, Burian BK, Robertson JM, Musson D, Atamna B, Yule S, Dias RD. "A taxonomy for design and development of extended reality medical training and real-time clinical guidance during space missions." 2022 NASA Human Research Program Investigators' Workshop, Virtual, February 7-10, 2022.

Ebnali M, Goldsmith AJ, Burian B, Atamna B, Duggan NM, Fischetti C, Yule S, Dias R. "AR-Coach: Using Augmented Reality (AR) for Real-Time Clinical Guidance During Medical Emergencies on Deep Space Exploration Missions." 13th International Conference on Applied Human Factors and Ergonomics (AHFE 2022), New York, NY, July 24-28, 2022.

Dias RD, Robertson JM, Miccile C, Pozner CN, Mormann B, Smink DS, Lipsitz S, Doyle T, Mansouri M, Cerqueira R, Musson D, Burian BK, Yule S. "Extended Reality (XR) for Emergency Care Training and Real-Time Clinical Guidance During Long-Duration Space Missions." Society for Academic Emergency Medicine (SAEM) 2021 Meeting, Virtual, May 11-14, 2021. Acad Emerg Med. 2021;28(Suppl. 1):S9-S398.

Dias RD, Ebnali M, Cerqueira R, Robertson JM, Burian BK, Miccile C, Mansouri M, Mormann B, Smink DS, Lipsitz S, Doyle T, Musson D, Pozner CN, Yule S. "Extended reality (XR) medical scenarios for in-flight emergency care training during deep space missions." 2022 NASA Human Research Program Investigators' Workshop, Virtual, February 7-10, 2022.

Articles in Peer-reviewed Journals

Ebnali M, Goldsmith A, Burian B, Atamna B, Duggan N, Fischetti C, Yule S, Dias R. "AR-Coach: Using augmented reality (AR) for real-time clinical guidance during medical emergencies on deep space exploration missions." In: Kalra J, Lightner N (eds). Healthcare and Medical Devices. AHFE International, 2022. p. 67-75. http://doi.org/10.54941/ahfe1002100

Burian BK, Ebnali M, Robertson JM, Musson D, Pozner CN, Doyle T, Smink DS, Miccile C, Paladugu P, Atamna B, Lipsitz S, Yule S, Dias RD. "Using extended reality (XR) for medical training and real-time clinical support during deep space missions." Appl Ergon. 2023 Jan;106:103902. https://doi.org/10.1016/j.apergo.2022.103902; PMID: 36162274.