The NASA Task Book

Project Title:  Displays and Controls Interfaces
Fiscal Year: FY 2013 
Division: Human Research 
Research Discipline/Element:
HRP SHFH:Space Human Factors & Habitability (archival in 2017)
Start Date: 08/30/2010  
End Date: 09/30/2013  
Task Last Updated: 11/14/2013 
Principal Investigator/Affiliation:   Sandor, Aniko  Ph.D. / Lockheed-Martin/NASA Johnson Space Center 
Address:  2101 Nasa Parkway 
Mail Code: C46 
Houston , TX 77058 
Email: Aniko.Sandor-1@nasa.gov 
Phone: 281.483.9726  
Congressional District: 22 
Web:  
Organization Type: NASA CENTER 
Organization Name: Lockheed-Martin/NASA Johnson Space Center 
Joint Agency:  
Comments:  
Co-Investigator(s)
Affiliation: 
Archer, Ronald  Lockheed-Martin/ NASA Johnson Space Center  
Thompson, Shelby  Lockheed Martin 
Cross, Ernest Vincent Lockheed Martin 
Wenzel, Elizabeth  NASA Ames Research Center 
Godfroy, Martine  San Jose State University Foundation 
Miller, Joel  San Jose State University Foundation 
Begault, Durand  NASA Ames Research Center 
Project Information: Grant/Contract No. Directed Research 
Responsible Center: NASA JSC 
Grant Monitor: Sullivan, Thomas  
Center Contact:  
thomas.a.sullivan@nasa.gov 
Solicitation / Funding Source: Directed Research 
Grant/Contract No.: Directed Research 
Project Type: GROUND 
Flight Program:  
TechPort: No 
No. of Post Docs:
No. of PhD Candidates:
No. of Master's Candidates:
No. of Bachelor's Candidates:
No. of PhD Degrees:
No. of Master's Degrees:
No. of Bachelor's Degrees:
Human Research Program Elements: (1) SHFH:Space Human Factors & Habitability (archival in 2017)
Human Research Program Risks: (1) HCI:Risk of Inadequate Human-Computer Interaction
Human Research Program Gaps: (1) SHFE-HCI-03:We need HCI guidelines (e.g., display configuration, screen-navigation) to mitigate the performance decrements identified in SHFE-HCI-08 due to the spaceflight environment (IRP Rev D)
Flight Assignment/Project Notes: NOTE: End date changed to 9/30/2013 per HRP Master Task List information dtd 11/11/2011 (Ed., 1/5/2012)

Task Description: Future exploration missions will require much greater crew autonomy, particularly for suited operations. Crews will be extremely dependent on the information available within the spacesuit for monitoring their health and suit resources, and for performing tasks. Suit data such as battery power, oxygen remaining, crew biomedical data, procedure and task information, and navigational data are all needed by extravehicular activity (EVA) crewmembers to successfully complete their mission. If informational displays are poorly designed, or not easily accessible, crews will not have access to critical data, putting their mission and personal safety at risk. Suits pose special challenges in terms of information display and interaction, given limited display real estate, and gloves and helmets compromise vision, hearing, and touch. The methods by which information is delivered need to support, not hinder, task completion. Current EVA crewmembers depend heavily on communication with the ground for completion of their tasks. Future missions to more distant destinations will require a much different approach to ensure crew independence.

This line of research will focus on: 1) special techniques for formatting data delivered in a spacesuit, and 2) mechanisms for delivering and interacting with that data, given suit constraints. Researchers will first identify the different classes of information needed by the suited crewmember, then determine the modality and format of the data required for each class, and finally investigate the best technology solution to provide the data. Researchers will work with EVA Physiology, Systems and Performance (EPSP) researchers and developers using the metabolic data display issue as a case study. Various information designs and technology solutions will be empirically compared and requirements developed.

Methods to be used consist of the following: Task analysis, to identify and understand the suited tasks to be performed, including interviews with EVA astronauts to understand suited information needs and issues from the astronauts' perspective; literature reviews on different information display techniques for different classes of data (e.g., procedures, alarms, metabolic data) and available technologies (e.g., Head Mounted Displays (HMDs), cuff checklists, voice); and usability testing and experimental studies to assess human performance with the proposed designs using metrics such as error rates, task completion times, verbal protocol comments, and questionnaire responses, ratings, and rankings. Standard parametric and non-parametric statistical methods will be used for data analysis. Multiple methods, metrics, and information developed as part of the Information Presentation (2008-2010) Directed Research Project (DRP) will be leveraged in this project, including information on labels, alarms, cursor control devices, HMDs, and health and status displays. Products developed as part of the Usability (2008-2009) Directed Research Project will be validated as part of this new DRP, including methods and metrics for error rates, legibility, and consistency.
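The error-rate and completion-time metrics named above can be summarized in a few lines. This is an illustrative sketch only: the trial data, field names, and the choice to compute completion times over correct trials are assumptions, not the project's actual analysis code.

```python
# Hypothetical usability-metrics summary; trial records are illustrative.
from statistics import mean, median

def summarize_trials(trials):
    """Each trial is {'correct': bool, 'seconds': float}."""
    n = len(trials)
    error_rate = sum(1 for t in trials if not t['correct']) / n
    # Completion time is summarized over correct trials only (an assumption).
    times = [t['seconds'] for t in trials if t['correct']]
    return {'error_rate': error_rate,
            'mean_time': mean(times),
            'median_time': median(times)}

trials = [
    {'correct': True,  'seconds': 4.2},
    {'correct': True,  'seconds': 3.8},
    {'correct': False, 'seconds': 9.1},
    {'correct': True,  'seconds': 5.0},
]
print(summarize_trials(trials))
```

Ratings and rankings from questionnaires would typically be analyzed separately with the non-parametric methods the text mentions.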

Rationale for HRP Directed Research: This research is directed because it contains highly constrained research, which requires focused and constrained data gathering and analysis that is more appropriately obtained through a non-competitive proposal.

Research Impact/Earth Benefits: Results of the conducted research are applicable to multimodal interface design. The project developed an innovative extravehicular activity interface prototype and conducted research on multimodal interaction with it. The interface element concepts can be used in any type of display that summarizes many parameters on multiple features.

Task Progress & Bibliography Information FY2013 
Task Progress: The Displays and Controls Interfaces DRP addresses the following Human Research Program (HRP) Risk and Gap:

• HRP Risk: Risk of inadequate human-computer interaction. Given that human-computer interaction (HCI) and information architecture designs must support crew tasks, and given the greater dependence on HCI in the context of long-duration spaceflight operations, there is a risk that critical information systems will not support crew tasks effectively, resulting in flight and ground crew errors and inefficiencies, failed mission and program objectives, and an increase in crew injuries.

• HRP Gap: Space Human Factors Engineering (SHFE)-HCI-03. We need HCI guidelines (e.g., display configuration, screen-navigation) to mitigate the performance decrements identified in SHFE-HCI-08 due to the spaceflight environment.

The study results provided HCI guidelines specific to EVA displays that will lead to improved human performance and contribute to the closure of the gap. Future research should investigate multimodal interfaces and guidelines for EVA displays in realistic scenarios.

1. EVA consumables display recommendations

On current extravehicular activity (EVA) missions, crewmembers depend on ground support personnel to monitor activities and suit systems. On deep space missions without the help of ground personnel, crewmembers will be responsible for monitoring their own and their team members’ consumable information when performing an EVA. Therefore, it is necessary to investigate approaches for concise representation of EVA consumable information. Based on information gathered through interviews with subject matter experts (SMEs) and crewmembers, it was found that there are four consumables of major interest: oxygen, battery power, cooling water, and carbon dioxide (Sándor, Archer, & Boyer, 2011). A quick-look summary display for easily assessing critical information, such as time remaining on each consumable, would be desirable.

The study investigated the visual presentation of EVA consumables data with tables and multidimensional icons, Chernoff faces, and stick figures (Chernoff, 1973; Pickett & Grinstein, 1988). Multidimensional icons are recommended to be used for multivariate data representation when there is limited display real estate. For the Chernoff faces, various features of the face were used to convey consumable status. For the stick figures, each limb represented a consumable, with position of the limb showing good or bad status. The study focused on two tasks for each design approach: a) identifying the consumable that has low time remaining, and b) identifying the crewmember in the worst condition.

The results showed that all of the formats are adequate for searching for a consumable with low time remaining, since this requires identifying a single specific piece of information. In contrast, identifying the crewmember in the worst condition requires searching and comparing multiple features; displays that allow easy comparison of multiple features (tables and stick figures) therefore led to better performance.

Multidimensional icons were adequate for making simple decisions about crewmembers, such as identifying the crewmember with a limiting consumable. However, because tables label each consumable and give the exact time remaining, they have a major advantage over icons: consumables are easy to identify, and the information presented is quantitative rather than only qualitative.

Therefore, for most tasks and for displays with no real estate constraints, tables are the recommended method for displaying consumables. We recommend stick figures when there is limited display real estate and there is a need to perform multi-feature comparisons of many crewmembers in a brief amount of time.

Chernoff faces can be viable in similar conditions as stick figures such as when monitoring a single crewmember or when an overall snapshot of the team health needs to be assessed. However, stick figures have the added benefit of allowing detailed comparisons for specific features amongst crewmembers.
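The two study tasks can be illustrated with a small sketch over a table of time-remaining values. The crewmember names, consumable set, and minute values below are hypothetical, not data from the study.

```python
# Hypothetical sketch of the two study tasks over a consumables table:
# (a) find a crewmember's limiting consumable, and (b) find the
# crewmember in the worst condition. Values are minutes remaining.
CONSUMABLES = ('oxygen', 'battery', 'water', 'co2_scrub')

def limiting_consumable(row):
    """(a) The consumable with the least time remaining for one crewmember."""
    return min(CONSUMABLES, key=lambda c: row[c])

def worst_crewmember(table):
    """(b) The crewmember whose limiting consumable is lowest overall."""
    return min(table, key=lambda name: table[name][limiting_consumable(table[name])])

table = {
    'EV1': {'oxygen': 210, 'battery': 180, 'water': 240, 'co2_scrub': 200},
    'EV2': {'oxygen': 150, 'battery': 220, 'water': 230, 'co2_scrub': 190},
}
print(limiting_consumable(table['EV1']))  # battery is EV1's limiting consumable
print(worst_crewmember(table))            # EV2, whose oxygen is lowest overall
```

Task (a) reads a single cell per crewmember, while task (b) requires comparing across all crewmembers and consumables, which is why display formats that support multi-feature comparison fared better on it.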

2. EVA display prototype

This work was a continuation of prototype development started in FY2012: the development and evaluation of an EVA software interface prototype. In FY2012, a prototype was created based on EVA specialist interviews, EVA documentation, empirical experiments, and usability testing. In FY2013, this prototype was further developed and modified to fit two types of hardware: a small display used as a mock cuff-display, and a head-mounted display (HMD). The prototype focuses on the presentation of consumables, such as oxygen, water, and battery data, and the organization of other elements of the interface (e.g., navigation and consumable information). Stylistically, the interface follows the Orion Display Project Format Standards (NASA, 2009b). After developing a few versions of the prototype, an evaluation was conducted in which participants provided feedback while completing basic tasks that required use of the interface.

The purpose of the development process was to create an interface with dynamic and modular elements, so that as EVA mission specifications are developed, elements of the design can be reused with new concepts and hardware.

3. Evaluation of EVA Prototype Data Display: Spatial Auditory Display for Remote Planetary Exploration

The research addresses the organization of information on displays that may be limited in size and that must integrate information relevant to situational awareness: navigation in the task environment, as well as the health and status of the crew and mission systems during EVA. By identifying the best approaches for displaying complex information with limited resources, using the most appropriate modality (visual or auditory), access to information can be made intuitive and non-disruptive to the task at hand.

Specific Objectives: The primary goal of this research was to compare performance in localizing different targets during a simulated extravehicular exploration of a planetary surface with different types of displays for aiding navigation (NavAid): a 3D spatial auditory orientation aid (A), a 2D North-up visual map (V), and the combination of the two in a bimodal orientation aid (B). Four different environmental conditions were tested, combining high and low levels of visibility and ambiguity. In a separate experiment using a similar protocol, the impact of visual workload on performance was also investigated under high (dual-task paradigm) and low (single orientation task) workload levels.

Background: During extravehicular activities (EVA), astronauts must maintain situational awareness (SA) of a number of spatially distributed "entities" that are often outside the immediate field of view (FOV), while visual resources are needed for other task demands. Spatialized (3D) auditory cues can provide information that is complementary to, or may substitute for, cues in the visual environment. It was expected that the target localization task would benefit from a bimodal presentation of the navigation aid, in particular in degraded environmental conditions (low visibility and high ambiguity).

Method: In Study 1 (single task, ST), 48 participants performed a navigation task in a simulated visual-auditory environment. They were instructed to localize targets distributed outside their FOV with the three different NavAids and the two levels of visibility and ambiguity. In Study 2 (dual task, DT), the participants had to monitor and respond to four meters representing the levels of EVA mission consumables (carbon dioxide, oxygen, water, and battery) superimposed on the visual scene at the top left of the display, while simultaneously performing the orientation task. To date, preliminary data from 6 participants have been collected under non-degraded visual conditions. In the future, additional participants and experimental conditions will be tested under normal and degraded visual environments in the dual task paradigm.

For both studies, the quantitative dependent variables were: percent correct orientation, left/right decision time, localization time, and localization accuracy. Qualitative measures (subjective ratings) were also collected after the experiments.

Results: In Study 1, the results showed that a combined presentation of 2D visual and 3D auditory cues led to a significant improvement in performance (higher percent correct for orientation, faster reaction times (RTs)) compared to either unimodal condition, in particular when the visual information required mental spatial transformation or when the visual environmental conditions were degraded.

In Study 2, preliminary results in the high visibility condition showed that an increase in mental workload (monitoring task) differentially affects performance as a function of the modality of presentation of the NavAid. For the percentage of correct responses, overall there was no significant decrease in performance compared to the ST condition. However, comparisons between modalities showed that the percentage of correct responses was lower in the V condition than in either the A or the B conditions. Overall, mean left-right decision times were significantly increased by the increase in workload, and, similar to the percent correct data, performance in the V condition was significantly worse than in either the A or the B conditions.

Conclusion: In the particular context of EVA missions, the availability and/or reliability of most of the sensory inputs available on Earth is reduced, and the processing of visual information is typically highly dependent on 2D displays. Spatial auditory displays can aid situational awareness, navigation, and wayfinding by reducing the risk of errors and response latencies. Further, compared to a visual-only 2D display (V), NavAids utilizing spatial auditory cues (both the A and B conditions) can mitigate the negative impact on performance of the extra demands due to high visual workload.

Recommendations: The results presented here demonstrate that spatial audio displays, both alone and in combination with a visual navigation display, enhance performance and situational awareness and add to the intuitiveness of the information display:

• User acceptability for 3D audio is very high.

• 3D audio provides an intuitive, ecological, and low-workload solution for the presentation of spatial information.

• 3D audio can be used to efficiently substitute for visual information that is missing or degraded.

• Combined presentation of the A and the V spatial information leads to a significant reduction of incorrect orientation responses and a reduction in decision times.

• The use of an auditory localizer, a type of dynamic sonification display, has proved its efficacy, particularly under degraded visual conditions.

Thus, it is recommended that bimodal and/or multimodal displays be used for EVA missions. It is increasingly evident that the auditory channel will need to convey spatial information about localization and navigation in an environment where the visual channel is already saturated by the display of symbology and checklists. The ecological validity of using sound for localization, combined with the possibility of learning to use virtual auditory signals to navigate between virtual waypoints, supports their integration in advanced EVA display systems. The integration of alternative ways to present information raises additional questions, such as the best methods for switching between different modes within and/or between the different sensory channels made available to the operator. Issues that must be investigated include the following: the use of each sensory channel must be prescribed for a given type of activity; the different functions available cannot overlap; and the sensory channels must combine appropriately to contribute to a reduction of the overall workload while increasing the sense of presence and situation awareness.

For example, 3D audio could provide a higher level of immersion and improved perception of the “6 DOF (degree of freedom) operational space.” The combination of spatial and/or moving sound images with visual stimuli may increase vection and improve the sense of spatial presence as well as mitigating spatial disorientation. Two potential benefits may be of particular interest: (1) providing immediate feedback on operator location as well as the actions performed in space, and (2) providing an auditory “frame of reference” such as an artificial auditory horizon combined with “auditory security boundaries” that define the crew’s position in space in relation to the external features of the environment. Further, during training, the use of spatial audio could provide an additional countermeasure against cyber-sickness (nausea, disorientation, and oculomotor disturbances) induced by scene oscillation along the different axes of motion (pitch, roll, and yaw).

Begault, D. R., Wenzel, E. M., Godfroy, M., Miller, J. D. & Anderson, M. R. (2010). Applying spatial audio to human interfaces: 25 years of NASA experience. Proc. Audio Engineering Soc. 40th International Conference on Spatial Audio, Tokyo, Oct. 8-10 2010.

Begault, D. R., Anderson, M. R. & Bittner, R. M. (2012) Modeling Auditory-Haptic Interface Cues from an Analog Multi-line Telephone. Audio Engineering Society 133rd Convention, October 26-29, 2012, San Francisco, CA.

Wenzel, E. M., and Godfroy, M. (2011) Spatial Auditory Displays to Enhance Situational Awareness During Remote Exploration. Workshop on Space Communications: Challenges for Auditory Displays & Interactive Spoken Dialogue Systems, Fourth IEEE International Conference on Space Mission Challenges for Information Technology (SMC-IT 2011), August 2-4, Palo Alto, CA.

Wenzel, E. M., Godfroy, M. & Miller, Joel D. (2012) Prototype Spatial Auditory Display for Remote Planetary Exploration. Audio Engineering Society 133rd Convention, October 26-29, 2012, San Francisco, CA, Paper Number: 8734.

Wenzel, E. M., Godfroy, M. & Miller, Joel D. (in preparation) Spatial Auditory Displays for Space Operations: Mitigation of Degraded Visual Environments. To be submitted to Human Factors.

Bibliography (Last Updated: 03/03/2016) 

Abstracts for Journals and Proceedings: Wenzel EM, Godfroy M. "Spatial Auditory Displays to Enhance Situational Awareness During Remote Exploration." Workshop on Space Communications: Challenges for Auditory Displays & Interactive Spoken Dialogue Systems, Fourth IEEE International Conference on Space Mission Challenges for Information Technology (SMC-IT 2011), Palo Alto, CA, August 2-4, 2011.

Abstracts for Journals and Proceedings: Sandor A, Thompson SG, Pace JW, Wenzel EM, Begault DR, Godfroy M. "Displays and Controls Interfaces Directed Research Project." 2013 NASA Human Research Program Investigators' Workshop, Galveston, TX, February 12-14, 2013.

Abstracts for Journals and Proceedings: Sandor A, Thompson SG, Cross EV, Wenzel E, Godfroy M, Miller J, Begault D. "Displays and Controls Interfaces." 2014 Human Research Program Investigators' Workshop, Galveston, TX, February 12-13, 2014. http://www.hou.usra.edu/meetings/hrp2014/pdf/3050.pdf

Papers from Meeting Proceedings: Wenzel EM, Godfroy M, Miller JD. "Prototype Spatial Auditory Display for Remote Planetary Exploration." 133rd Audio Engineering Society Convention, San Francisco, CA, October 26-29, 2012. Paper Number: 8734. http://www.aes.org/e-lib/browse.cfm?elib=16476

Project Title:  Displays and Controls Interfaces
Fiscal Year: FY 2012 
Division: Human Research 
Research Discipline/Element:
HRP SHFH:Space Human Factors & Habitability (archival in 2017)
Start Date: 08/30/2010  
End Date: 09/30/2013  
Task Last Updated: 06/22/2012 
Principal Investigator/Affiliation:   Sandor, Aniko  Ph.D. / Lockheed-Martin/NASA Johnson Space Center 
Address:  2101 Nasa Parkway 
Mail Code: C46 
Houston , TX 77058 
Email: Aniko.Sandor-1@nasa.gov 
Phone: 281.483.9726  
Congressional District: 22 
Web:  
Organization Type: NASA CENTER 
Organization Name: Lockheed-Martin/NASA Johnson Space Center 
Joint Agency:  
Comments:  
Co-Investigator(s)
Affiliation: 
Archer, Ronald  Lockheed-Martin/ NASA Johnson Space Center  
Thompson, Shelby G. Lockheed Martin 
Project Information: Grant/Contract No. Directed Research 
Responsible Center: NASA JSC 
Grant Monitor: Sullivan, Thomas  
Center Contact:  
thomas.a.sullivan@nasa.gov 
Solicitation / Funding Source: Directed Research 
Grant/Contract No.: Directed Research 
Project Type: GROUND 
Flight Program:  
TechPort: No 
No. of Post Docs:
No. of PhD Candidates:
No. of Master's Candidates:
No. of Bachelor's Candidates:
No. of PhD Degrees:
No. of Master's Degrees:
No. of Bachelor's Degrees:
Human Research Program Elements: (1) SHFH:Space Human Factors & Habitability (archival in 2017)
Human Research Program Risks: (1) HCI:Risk of Inadequate Human-Computer Interaction
Human Research Program Gaps: (1) SHFE-HCI-03:We need HCI guidelines (e.g., display configuration, screen-navigation) to mitigate the performance decrements identified in SHFE-HCI-08 due to the spaceflight environment (IRP Rev D)
Flight Assignment/Project Notes: NOTE: End date changed to 9/30/2013 per HRP Master Task List information dtd 11/11/2011 (Ed., 1/5/2012)

Task Description: Future exploration missions will require much greater crew autonomy, particularly for suited operations. Crews will be extremely dependent on the information available within the spacesuit for monitoring their health and suit resources, and for performing tasks. Suit data such as battery power, oxygen remaining, crew biomedical data, procedure and task information, and navigational data are all needed by EVA crewmembers to successfully complete their mission. If informational displays are poorly designed, or not easily accessible, crews will not have access to critical data, putting their mission and personal safety at risk. Suits pose special challenges in terms of information display and interaction, given limited display real estate, and gloves and helmets compromise vision, hearing, and touch. The methods by which information is delivered need to support, not hinder, task completion. Current EVA crewmembers depend heavily on communication with the ground for completion of their tasks. Future missions to more distant destinations will require a much different approach to ensure crew independence.

This line of research will focus on: 1) special techniques for formatting data delivered in a spacesuit, and 2) mechanisms for delivering and interacting with that data, given suit constraints. Researchers will first identify the different classes of information needed by the suited crewmember, then determine the modality and format of the data required for each class, and finally investigate the best technology solution to provide the data. Researchers will work with EVA Physiology, Systems and Performance (EPSP) researchers and developers using the metabolic data display issue as a case study. Various information designs and technology solutions will be empirically compared and requirements developed.

Methods to be used consist of the following: Task analysis, to identify and understand the suited tasks to be performed, including interviews with EVA astronauts to understand suited information needs and issues from the astronauts' perspective; literature reviews on different information display techniques for different classes of data (e.g., procedures, alarms, metabolic data) and available technologies (e.g., Head Mounted Displays (HMDs), cuff checklists, voice); and usability testing and experimental studies to assess human performance with the proposed designs using metrics such as error rates, task completion times, verbal protocol comments, and questionnaire responses, ratings, and rankings. Standard parametric and non-parametric statistical methods will be used for data analysis. Multiple methods, metrics, and information developed as part of the Information Presentation (2008-2010) DRP will be leveraged in this project, including information on labels, alarms, cursor control devices, HMDs, and health and status displays. Products developed as part of the Usability (2008-2009) Directed Research Project will be validated as part of this new DRP, including methods and metrics for error rates, legibility, and consistency.

Rationale for HRP Directed Research: This research is directed because it contains highly constrained research, which requires focused and constrained data gathering and analysis that is more appropriately obtained through a non-competitive proposal.

Research Impact/Earth Benefits:  

Task Progress & Bibliography Information FY2012 
Task Progress: Data for suit health and status are displayed on specific EVA informational displays. Suits pose special challenges in terms of information display and interaction, given limited display real estate. Furthermore, EVA gloves and helmets compromise vision, hearing, and touch. The methods by which information is delivered need to support task completion. If informational displays are poorly designed, or not easily accessible, crews will not have access to critical data, putting their mission and personal safety at risk. Current EVA crewmembers also depend heavily on communication with the ground for completion of their tasks, which adds to the complexity of interfaces. Future missions to more distant destinations will require an approach that makes information presentation to EVA crews more efficient to ensure crew independence.

In order to prepare for designing an EVA display for suit health and status information, the Displays and Controls Interfaces team conducted a literature review on data visualization methods. The purpose of this report was to review data visualization options for presenting the data needed on an EVA mission. For example, the current status of the oxygen level may be presented with an icon and the rate of the consumption with a trend graph. For easy interpretation, the type of icon and graphs needs to be selected carefully. Furthermore, the details shown on these visualizations have to follow human factors guidelines to make sure that the data visualization supports crew tasks providing high accuracy and short interpretation time. The report summarized visualization methods such as tables, graphs, and multidimensional icons.
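As a sketch of the trend idea above (estimating how long a consumable will last from its recent rate of consumption), the following assumes simple linear extrapolation; the sample readings, units, and two-point rate estimate are illustrative assumptions, not the report's actual method.

```python
# Hypothetical depletion-time estimate by linear extrapolation of
# recent consumption; readings and units are illustrative.
def time_remaining(readings):
    """readings: list of (minutes_elapsed, level) pairs, oldest first.
    Returns the estimated minutes until the level reaches zero, or
    None if the level is not decreasing."""
    (t0, y0), (t1, y1) = readings[0], readings[-1]
    rate = (y0 - y1) / (t1 - t0)  # level units consumed per minute
    if rate <= 0:
        return None
    return y1 / rate

oxygen = [(0, 100.0), (10, 96.0), (20, 92.0)]  # percent full over 20 minutes
print(time_remaining(oxygen))  # about 230 minutes at the current rate
```

A trend graph would plot the same readings over time; an icon could then summarize only the derived quantity, such as the estimated time remaining.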

One of the display types that may be used for EVA information presentation is the head-mounted display. Because of the small size and short viewing distance of these displays, it is important to examine standards specific to their design. A literature review was conducted on existing standards and studies for displaying information on head-mounted displays.

Based on the information the team collected from crew and flight controllers on EVA consumables, as well as from the literature review on data visualization methods, work could start on designing an EVA display prototype and icons that could present summarized information on consumables. A study was conducted to evaluate several presentation approaches: tables, icons, and representation with sounds.

A second study was conducted using only data presentation with sounds, a method called sonification. Each consumable was represented by a tone. The tones were the same for each consumable, except for the one that was predicted to run out first. Sonification can be used for simple data presentations and it can be useful in situations when the eyes are busy.
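A minimal sketch of the sonification scheme just described: one tone per consumable, identical except for the consumable predicted to run out first, which sounds at a distinct pitch. The frequencies, sample rate, and tone duration below are illustrative assumptions, not the study's actual parameters.

```python
# Hypothetical sonification of consumables; audio parameters are
# illustrative assumptions, not the study's actual values.
import math

BASE_HZ, ALERT_HZ = 440.0, 880.0  # alert pitch marks the limiting consumable
SAMPLE_RATE = 8000

def tone(freq_hz, seconds=0.25):
    """Mono sine tone as a list of float samples in [-1, 1]."""
    n = int(SAMPLE_RATE * seconds)
    return [math.sin(2 * math.pi * freq_hz * i / SAMPLE_RATE) for i in range(n)]

def sonify(minutes_remaining):
    """One tone per consumable, in a fixed (alphabetical) order; only the
    consumable predicted to run out first gets the distinct alert pitch."""
    limiting = min(minutes_remaining, key=minutes_remaining.get)
    return [(name, tone(ALERT_HZ if name == limiting else BASE_HZ))
            for name in sorted(minutes_remaining)]

seq = sonify({'oxygen': 150, 'battery': 180, 'water': 240, 'co2_scrub': 200})
```

Playing the tones in a fixed order lets a listener identify which consumable is limiting by position in the sequence, without looking at a display.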

A third study looked at visualizing the health and status of multi-crew teams: multidimensional icons were used to show EVA consumable data for each crew member. This type of visualization is very useful for visually comparing multidimensional data sets. The user can quickly assess differences among multidimensional icons, as well as get an overall picture of the situation.

Finally, the team designed a prototype interface for an existing metabolic rate advisor system, the Legaci system, developed in 2008. This system uses EVA consumable data to calculate metabolic rates. In addition, an integrated voice commanding system allows users to access the data hands-free during an EVA. The new integrated system prototype was evaluated with crew representatives, flight controllers, and subject matter experts to get feedback on the design. The interface prototype will be further developed in FY13-FY14.
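To illustrate the idea of deriving a metabolic rate from consumable data (this is not the actual Legaci algorithm), the sketch below applies the standard energy equivalent of oxygen, roughly 20.1 kJ per litre of O2 consumed; the example figures are hypothetical.

```python
# Illustrative metabolic-rate estimate from oxygen usage. NOT the
# Legaci algorithm; it only applies the standard energy equivalent
# of oxygen (~20.1 kJ per litre of O2 consumed).
KJ_PER_LITRE_O2 = 20.1

def metabolic_rate_watts(litres_o2_consumed, minutes):
    """Average metabolic rate in watts over the measurement interval."""
    joules = litres_o2_consumed * KJ_PER_LITRE_O2 * 1000.0
    return joules / (minutes * 60.0)

# e.g., consuming O2 at 1.0 L/min corresponds to roughly 335 W
print(round(metabolic_rate_watts(10.0, 10.0)))  # -> 335
```

An advisor of this kind could then compare the derived rate against consumable levels to warn when the current workload would exhaust a consumable before the EVA timeline completes.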

Bibliography (Last Updated: 03/03/2016) 

 None in FY 2012
Project Title:  Displays and Controls Interfaces
Fiscal Year: FY 2011 
Division: Human Research 
Research Discipline/Element:
HRP SHFH:Space Human Factors & Habitability (archival in 2017)
Start Date: 08/30/2010  
End Date: 09/30/2013  
Task Last Updated: 06/14/2011 
Principal Investigator/Affiliation:   Sandor, Aniko  Ph.D. / Lockheed-Martin/NASA Johnson Space Center 
Address:  2101 Nasa Parkway 
Mail Code: C46 
Houston , TX 77058 
Email: Aniko.Sandor-1@nasa.gov 
Phone: 281.483.9726  
Congressional District: 22 
Web:  
Organization Type: NASA CENTER 
Organization Name: Lockheed-Martin/NASA Johnson Space Center 
Joint Agency:  
Comments:  
Co-Investigator(s)
Affiliation: 
Archer, Ronald  Lockheed-Martin/ NASA Johnson Space Center  
Boyer, Jennifer L. Lockheed Martin/NASA Johnson Space Center 
Project Information: Grant/Contract No. Directed Research 
Responsible Center: NASA JSC 
Grant Monitor: Woolford, Barbara  
Center Contact: 218-483-3701 
barbara.j.woolford@nasa.gov 
Solicitation / Funding Source: Directed Research 
Grant/Contract No.: Directed Research 
Project Type: GROUND 
Flight Program:  
TechPort: No 
No. of Post Docs:
No. of PhD Candidates:
No. of Master's Candidates:
No. of Bachelor's Candidates:
No. of PhD Degrees:
No. of Master's Degrees:
No. of Bachelor's Degrees:
Human Research Program Elements: (1) SHFH:Space Human Factors & Habitability (archival in 2017)
Human Research Program Risks: (1) HCI:Risk of Inadequate Human-Computer Interaction
Human Research Program Gaps: (1) SHFE-HCI-03:We need HCI guidelines (e.g., display configuration, screen-navigation) to mitigate the performance decrements identified in SHFE-HCI-08 due to the spaceflight environment (IRP Rev D)
Flight Assignment/Project Notes: NOTE: End date changed to 9/30/2013 per HRP Master Task List information dtd 11/11/2011 (Ed., 1/5/2012)

Task Description: Future exploration missions will require much greater crew autonomy, particularly for suited operations. Crews will be extremely dependent on the information available within the spacesuit for monitoring their health and suit resources, and for performing tasks. Suit data such as battery power, oxygen remaining, crew biomedical data, procedure and task information, and navigational data are all needed by EVA crewmembers to successfully complete their mission. If informational displays are poorly designed, or not easily accessible, crews will not have access to critical data, putting their mission and personal safety at risk. Suits pose special challenges in terms of information display and interaction, given limited display real estate, and gloves and helmets compromise vision, hearing, and touch. The methods by which information is delivered need to support, not hinder, task completion. Current EVA crewmembers depend heavily on communication with the ground for completion of their tasks. Future missions to more distant destinations will require a much different approach to ensure crew independence.

This line of research will focus on: 1) special techniques for formatting data delivered in a spacesuit, and 2) mechanisms for delivering and interacting with that data, given suit constraints. Researchers will first identify the different classes of information needed by the suited crewmember, then determine the modality and format of the data required for each class, and finally investigate the best technology solution to provide the data. Researchers will work with EVA Physiology, Systems and Performance (EPSP) researchers and developers using the metabolic data display issue as a case study. Various information designs and technology solutions will be empirically compared and requirements developed.

Methods to be used consist of the following: task analysis, to identify and understand the suited tasks to be performed, including interviews with EVA astronauts to understand suited information needs and issues from the astronauts' perspective; literature reviews on different information display techniques for different classes of data (e.g., procedures, alarms, metabolic data) and available technologies (e.g., Head Mounted Displays (HMDs), cuff checklists, voice); and usability testing and experimental studies to assess human performance with the proposed designs using metrics such as error rates, task completion times, verbal protocol comments, and questionnaire responses, ratings, and rankings. Standard parametric and non-parametric statistical methods will be used for data analysis. Multiple methods, metrics, and information developed as part of the Information Presentation (2008-2010) Directed Research Project (DRP) will be leveraged in this project, including information on labels, alarms, cursor control devices, HMDs, and health and status displays. Products developed as part of the Usability (2008-2009) Directed Research Project will be validated as part of this new DRP, including methods and metrics for error rates, legibility, and consistency.

Rationale for HRP Directed Research: This research is directed because it contains highly constrained research, which requires focused and constrained data gathering and analysis that is more appropriately obtained through a non-competitive proposal.

Research Impact/Earth Benefits: 0

Task Progress & Bibliography Information FY2011 
Task Progress: Suit data are displayed on specific EVA informational displays. Suits pose special challenges in terms of information display and interaction, given limited display real estate; furthermore, gloves and helmets compromise vision, hearing, and touch. The methods by which information is delivered need to support task completion. If informational displays are poorly designed, or not easily accessible, crews will not have access to critical data, putting their mission and personal safety at risk. Current EVA crewmembers also depend heavily on communication with the ground for completion of their tasks, which adds to the complexity of interfaces. Future missions to more distant destinations will require an approach that makes information presentation to EVA crews more efficient to ensure crew independence. A literature review was conducted on the EVA display and control module and on related studies of head-mounted displays, voice input, and cuff displays used with EVA. Most studies reviewed were conducted by NASA JSC and NASA GRC.

A second activity was a series of meetings with EVA stakeholders on EVA consumable data needs. The purpose of the resulting report was to summarize the data gathered on EVA consumable information. This was the first step toward deciding 1) what data should be displayed for crew during an EVA, 2) what are the most critical data that need to be accessible at a glance, and 3) in what format the data should be displayed. The results show that, generally, a procedural quick-look check of all critical consumable data would be important at the beginning of an EVA and more frequently toward the end of the EVA. The data should be presented in the same order and same format as much as possible for consistency. Color coding, icons, and graphs are good options if they are easy to interpret. Self-check reminders are good if their frequency is customizable. Caution and warning messages should be associated with critical values, and troubleshooting information should be made available along with the C&W message. The other crewmember's (buddy's) data should be available in the same format as one's own data, but is needed only in contingency situations. For teams of two or more, buddy data can be checked when crewmembers check their own data. When viewed, own and buddy data could be presented side by side for easy comparison, with a clear indication of whose data is whose. These results will be used to design a software prototype display for EVA consumables.

In a third study, we evaluated a legibility method developed under the Usability Directed Research Project in FY10. The main objective was to evaluate the use of an HMD while completing a procedure in a suited condition. The legibility portion was very similar to the software method developed in FY10: target items on the HMD screen were presented with rapid serial visual presentation for 1 s each, and the participant's task was to verbally identify each item. The results show that the method can be used in suited conditions and that accuracy can be calculated as specified in the Human Systems Integration Requirements verification.
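The accuracy scoring implied by this method reduces to the proportion of rapidly presented targets the participant names correctly. A minimal sketch, with made-up item labels (the actual target set and scoring rules are not given in the report):

```python
# Sketch of RSVP legibility scoring: each target is shown for 1 s, the
# participant identifies it verbally, and accuracy is the fraction of
# correct identifications. The item labels below are made up.

def legibility_accuracy(targets, responses):
    """Fraction of RSVP targets correctly identified (case-insensitive)."""
    correct = sum(1 for t, r in zip(targets, responses)
                  if t.strip().lower() == r.strip().lower())
    return correct / len(targets)

acc = legibility_accuracy(
    ["O2 PRESS", "BATT AMP", "H2O GAS", "SUIT P"],
    ["O2 PRESS", "BATT AMP", "H2O GAS", "SUIT T"],
)  # 3 of 4 correct -> 0.75
```

Such a proportion-correct score is what a requirements verification can compare against a legibility threshold.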

A fourth study evaluated a modified legibility methodology for conditions in which software cannot be used during a hardware evaluation. This evaluation was conducted as part of a larger suited evaluation in the Human Engineering Structural Mockup. The legibility method was a timed readability approach, with time and accuracy recorded.

A brief review of voice recognition software was also conducted, with the goal of using such software for voice commanding during EVA operations. The review focused on general background information to understand the advantages and disadvantages of voice recognition software. Furthermore, previous NASA studies using voice recognition software for EVA purposes were also reviewed.

Bibliography Type: Description: (Last Updated: 03/03/2016) 

 
 None in FY 2011
Project Title:  Displays and Controls Interfaces Reduce
Fiscal Year: FY 2010 
Division: Human Research 
Research Discipline/Element:
HRP SHFH:Space Human Factors & Habitability (archival in 2017)
Start Date: 08/30/2010  
End Date: 09/30/2012  
Task Last Updated: 03/15/2011 
Download report in PDF pdf
Principal Investigator/Affiliation:   Sandor, Aniko  Ph.D. / Lockheed-Martin/NASA Johnson Space Center 
Address:  2101 Nasa Parkway 
Mail Code: C46 
Houston, TX 77058 
Email: Aniko.Sandor-1@nasa.gov 
Phone: 281.483.9726  
Congressional District: 22 
Web:  
Organization Type: NASA CENTER 
Organization Name: Lockheed-Martin/NASA Johnson Space Center 
Joint Agency:  
Comments:  
Co-Investigator(s)
Affiliation: 
Archer, Ronald  Lockheed-Martin/NASA Johnson Space Center  
Project Information: Grant/Contract No. Directed Research 
Responsible Center: NASA JSC 
Grant Monitor: Woolford, Barbara  
Center Contact: 218-483-3701 
barbara.j.woolford@nasa.gov 
Solicitation / Funding Source: Directed Research 
Grant/Contract No.: Directed Research 
Project Type: GROUND 
Flight Program:  
TechPort: No 
No. of Post Docs:  
No. of PhD Candidates:  
No. of Master's Candidates:  
No. of Bachelor's Candidates:  
No. of PhD Degrees:  
No. of Master's Degrees:  
No. of Bachelor's Degrees:  
Human Research Program Elements: (1) SHFH:Space Human Factors & Habitability (archival in 2017)
Human Research Program Risks: (1) HCI:Risk of Inadequate Human-Computer Interaction
Human Research Program Gaps: (1) SHFE-HCI-03:We need HCI guidelines (e.g., display configuration, screen-navigation) to mitigate the performance decrements identified in SHFE-HCI-08 due to the spaceflight environment (IRP Rev D)
Task Description: Future exploration missions will require much greater crew autonomy, particularly for suited operations. Crews will be extremely dependent on the information available within the spacesuit for monitoring their health and suit resources, and for performing tasks. Suit data such as battery power, oxygen remaining, crew biomedical data, procedure and task information, and navigational data are all needed by EVA crewmembers to successfully complete their mission. If informational displays are poorly designed, or not easily accessible, crews will not have access to critical data, putting their mission and personal safety at risk. Suits pose special challenges in terms of information display and interaction, given limited display real estate, and gloves and helmets compromise vision, hearing, and touch. The methods by which information is delivered need to support, not hinder, task completion. Current EVA crewmembers depend heavily on communication with the ground for completion of their tasks. Future missions to more distant destinations will require a much different approach to ensure crew independence.

This line of research will focus on: 1) special techniques for formatting data delivered in a spacesuit, and 2) mechanisms for delivering and interacting with that data, given suit constraints. Researchers will first identify the different classes of information needed by the suited crewmember, then determine the modality and format of the data required for each class, and finally investigate the best technology solution to provide the data. Researchers will work with EVA Physiology, Systems and Performance (EPSP) researchers and developers using the metabolic data display issue as a case study. Various information designs and technology solutions will be empirically compared and requirements developed.

Methods to be used consist of the following: task analysis, to identify and understand the suited tasks to be performed, including interviews with EVA astronauts to understand suited information needs and issues from the astronauts' perspective; literature reviews on different information display techniques for different classes of data (e.g., procedures, alarms, metabolic data) and available technologies (e.g., Head Mounted Displays (HMDs), cuff checklists, voice); and usability testing and experimental studies to assess human performance with the proposed designs using metrics such as error rates, task completion times, verbal protocol comments, and questionnaire responses, ratings, and rankings. Standard parametric and non-parametric statistical methods will be used for data analysis. Multiple methods, metrics, and information developed as part of the Information Presentation (2008-2010) Directed Research Project (DRP) will be leveraged in this project, including information on labels, alarms, cursor control devices, HMDs, and health and status displays. Products developed as part of the Usability (2008-2009) Directed Research Project will be validated as part of this new DRP, including methods and metrics for error rates, legibility, and consistency.

Rationale for HRP Directed Research: This research is directed because it contains highly constrained research, which requires focused and constrained data gathering and analysis that is more appropriately obtained through a non-competitive proposal.

Research Impact/Earth Benefits: 0

Task Progress & Bibliography Information FY2010 
Task Progress: New project for FY2010.

Bibliography Type: Description: (Last Updated: 03/03/2016) 

 
 None in FY 2010