Task Progress:
Background and Introduction
The Space Human Factors and Habitability (SHFH) Element and the Behavioral Health and Performance (BHP) Element are interested in research regarding Net Habitable Volume (NHV), the internal volume within a spacecraft or habitat that is available to the crew for required activities, and the layout and accommodations within that volume. This work is funded by SHFH and is designed to address aspects of the Risk of an Incompatible Vehicle/Habitat Design. NASA needs methods to collect NHV data unobtrusively, without impacting crew time. The required data include metrics such as crew location and orientation, volume used to complete tasks, internal translation paths, flow of work, and task completion times. Methods exist in less constrained environments, yet many are obtrusive and require significant post-processing; examples include infrared (IR) retro-reflective marker motion capture, GPS tracking, inertial tracking, and multi-camera methods. Due to the constraints of space operations, many such methods are infeasible: inertial tracking systems typically rely upon a gravity vector to normalize sensor readings, and traditional IR systems are large and require extensive calibration. However, several technologies have not yet been applied to space operations for these purposes. Two of these are 3D Radio Frequency Identification Real-Time Localization Systems (3D RFID-RTLS) and depth-imaging systems that allow 3D motion capture and volumetric scanning (such as those using IR-depth cameras like the Microsoft Kinect, or Light Detection and Ranging / Light-Radar systems, referred to as LIDAR).
Objective: Develop an automated methodology for collecting space utilization and NHV data: adapt and integrate two independent technologies, 3D RFID-RTLS and Microsoft Kinect 3D volumetric and anatomical scanning tools, into a single solution; format the data so that it can be used in computational modeling; validate the resulting system and data outcomes against standard measures; and validate the system in the Human Exploration Research Analog (HERA). Data collected by the system will include: the number of crew present in each area of the vehicle at any given time; the quantity of time crew spend at each workstation in the performance of tasks; the physical orientation of crew while utilizing the provided volume; frequent or common translation paths and traffic flow patterns within the volume; the operational flow/volume required for mission tasks performed by single or multiple crew; and 3D biomechanical and postural data related to individual and team-based tasks.
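For the data to feed computational modeling, each observation can be serialized as a simple time-stamped record combining the RFID-RTLS and Kinect outputs. The sketch below is illustrative only: the class name, field names, and units are assumptions, not the project's actual data format.

```python
import json
from dataclasses import dataclass, asdict
from typing import Optional

@dataclass
class CrewSample:
    """One time-stamped observation of a single crew member (hypothetical schema)."""
    timestamp_s: float          # seconds since session start
    crew_id: str                # RFID tag identifier
    position_m: tuple           # (x, y, z) in the habitat frame, meters
    orientation_deg: tuple      # (roll, pitch, yaw), degrees
    workstation: Optional[str]  # nearest workstation, if any

# One record, emitted as a line of JSON for downstream modeling tools
sample = CrewSample(12.5, "tag-03", (1.2, 0.4, 2.1), (0.0, 15.0, 90.0), "workstation-A")
record = json.dumps(asdict(sample))
```

A stream of such records supports the metrics listed above directly: counting crew per area, accumulating time per workstation, and reconstructing translation paths.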
Overall Approach and Schedule
The plan of work for this project includes several steps spread across a three-year period: Step 1 (2014-2016): integration of the hardware technologies involved (3D RFID-RTLS and Kinect) and initial development of software interfaces. Step 2 (2016-2017): development and refinement of the system and its software interfaces, following best practices in usability and human-centered design, finalization of data formats, and conducting engineering pilot tests. Step 3 (2017-2018): validation in HERA, delivery of final deliverables, and publication.
Status: The prototype system has been set up in a laboratory R&D environment at Johnson Space Center (JSC) Building 15, with an additional lab environment at the University of Nebraska-Lincoln (UNL). A great deal of programming work has occurred on multiple fronts, including: a Graphical User Interface (GUI) and integration framework for the system; creation of a rendering framework and integration with standard Windows Forms functionality; visualization and analysis technologies, including environment loading, transparency modulation, and wire-framing; camera control modes (1st- and 3rd-person perspectives); point cloud acquisition, billboarded geometry-shader-based rendering, accumulation/buffering over time, and hard-disk-based data storage; skeletal tracking of up to 6 people simultaneously, with real-time calculation of joint angles; volume calculation from point cloud data (multiple methods/algorithms implemented); setup and calibration of the UWB RFID-RTLS system; implementation of RFID tracking data into the integration framework using the Ubisense API; and the use of multiple Kinect devices (communicating in a client-server configuration using a separate computer for each Kinect, with the prototype system running on a central computer that collates and integrates the input from each system).
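The status above mentions volume calculation from point cloud data with multiple algorithms implemented. One standard approach is to take the volume of the convex hull enclosing the cloud. The sketch below uses SciPy's Qhull wrapper as a generic illustration; it is not the project's actual implementation, and the function name is an assumption.

```python
import numpy as np
from scipy.spatial import ConvexHull

def hull_volume(points: np.ndarray) -> float:
    """Volume (m^3) of the convex hull enclosing a 3-D point cloud (N x 3 array)."""
    return ConvexHull(points).volume

# Sanity check: the 8 corners of a unit cube enclose a volume of 1.0 m^3
cube = np.array([[x, y, z] for x in (0, 1) for y in (0, 1) for z in (0, 1)], dtype=float)
vol = hull_volume(cube)
```

A convex hull overestimates the volume of concave postures or task envelopes, which is presumably why the project notes that more accurate volume determinations remain forward work.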
Continuing and Forward Work
Continue to program and mature the components of the system: Kinect data acquisition and management; more accurate convex hull volume determinations; analysis of heat-mapping of accumulated point cloud data; time and motion study analyses using 3D RFID-RTLS data; and development of additional visualization methods. Additionally, the system will be pilot tested this fall, with some design iteration occurring post-test to rectify any issues found, and final validation to follow.
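Heat-mapping of accumulated position data, mentioned above, amounts to binning tracked locations over the habitat floor plan so that heavily used areas stand out. A minimal sketch follows, assuming hypothetical function and parameter names; the project's actual analysis pipeline is not specified.

```python
import numpy as np

def occupancy_heatmap(positions: np.ndarray, extent, bins=(20, 20)) -> np.ndarray:
    """2-D occupancy histogram (floor-plane heat map) from tracked positions.

    positions: (N, 2) array of x, y locations in meters
    extent:    ((xmin, xmax), (ymin, ymax)) habitat floor bounds
    bins:      number of cells along x and y
    """
    hist, _, _ = np.histogram2d(positions[:, 0], positions[:, 1],
                                bins=bins, range=extent)
    return hist / hist.sum()  # normalize to fraction of samples per cell

# Synthetic track data over a 4 m x 2 m floor area
rng = np.random.default_rng(0)
pts = rng.uniform([0.0, 0.0], [4.0, 2.0], size=(1000, 2))
hm = occupancy_heatmap(pts, ((0.0, 4.0), (0.0, 2.0)), bins=(8, 4))
```

The same binning applied to time-stamped RFID-RTLS samples yields dwell-time maps, a common output of time and motion studies.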