1. Human Factors Metrics Development at NASA (part of the Shared Simulation and Testing Environment activities) Douglas T. Wong, NASA-JSC (281 483-6077)
James C. Maida, NASA-JSC (281 483-1113)
Anikó Sándor, LZT (281 483-9726)
Mihriban Whitmore, NASA-JSC (281 244-1004)
Habitability and Human Factors Branch (SF3)
NASA Johnson Space Center, Houston, TX
2. Outline Goals and objectives of the SSTE
NASA current needs for the SSTE
Metrics Development
Use Case Development
Metrics/Use Case Development Overall Strategy
Current plan
Current progress
Summary
3. Goals of the Shared Simulation and Testing Environment (SSTE) Facilitate human-in-the-loop (HITL) testing and simulation for NASA’s space vehicle development
Establishing a distributive testing/simulation environment
Providing the standard metrics
4. Objectives of the SSTE In time for human-rated Constellation Program (CxP) space vehicles development
Develop the standard HITL testing and simulation metrics
Develop use cases to convey the vision and demonstrate SSTE’s capabilities and potential
Develop distributive HITL testing/simulation environment
Leverage off the CxP Distributive Space Exploration Simulation (DSES) project
No new infrastructure: DSES will provide the infrastructure for distributive modeling and simulation
Minimal cost: SSTE only adds the HITL element to DSES
Leveraging: The DSES is developing the architecture for the distributive modeling and simulation of space vehicles. The SSTE will make use of the existing architecture and infrastructure developed by the DSES.
The use case will demonstrate the potential and capabilities of the SSTE. The SSTE will be flexible enough to keep pace with advancements in testing and simulation technologies.
The development time for the SSTE will be short, so the SSTE will be available for the development of the Orion and the subsequent Constellation space vehicles.
7. Standard Metrics Development – Current Needs Discrete Event Simulation:
The Orion Project is collecting Descriptive Task Parameters (DTP) for Discrete Event Simulation (DES) using Orion spacecraft’s Guidance, Navigation, and Control (GNC) models
Estimating human performance
Verifying/validating model inputs/outputs
SSTE supports both NASA and the Orion Prime Contractor
Providing standard metrics: The interaction among the DTP and the DES models will not be simple plug-in and run
DES distributive data collection/sharing: Providing the environment
Model accreditation: Standardized/distributive human data will help NASA SMEs validate and approve GNC models
8. Model Accreditation:
SMEs on GNC indicate the need for standardized modeling and simulation data format
SSTE:
Providing human models interface requirements to space vehicle models
Developing the HITL testing/simulation metrics for GNC models validation
Playing a supporting role
Coordinating with the DSES activity
10. Standardized Test Data Collection Metrics include
Test conditions
Types of human data
“Human” definition
Either active or passive participants
Real or virtual
Three categories of metrics
Environmental
Human demographics
Human performance
More?
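The three metric categories listed above could be grouped, for illustration, into a simple record structure. This is only a sketch: the class and field names are assumptions for this example, not the actual SSTE schema.

```python
from dataclasses import dataclass

# Hypothetical grouping of the slide's three metric categories.
# All field names here are illustrative assumptions.

@dataclass
class EnvironmentalMetrics:
    lighting_lux: float
    ambient_noise_db: float
    temperature_c: float

@dataclass
class HumanDemographics:
    age: int
    gender: str
    training_level: str
    is_active: bool = True    # active or passive participant
    is_virtual: bool = False  # "human" may be real or virtual

@dataclass
class HumanPerformance:
    response_time_s: float
    completion_time_s: float
    accuracy: float           # fraction of correct actions
    workload_rating: float    # subjective workload score

@dataclass
class TestRecord:
    test_conditions: str      # conditions under which the test was run
    environment: EnvironmentalMetrics
    demographics: HumanDemographics
    performance: HumanPerformance
```

Keeping the three categories as separate types mirrors the slide's distinction between session-level context (environment, demographics) and per-trial measurements (performance).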
11. Standard Metrics Development Procedure Obtaining buy-ins from NASA/contractor facilities, SMEs, and program managers
Surveying current HITL measurement needs
Developing standards
Validating metrics with use cases
Documentation
Implementation
Formally announce to testing community and programs the availability of the metrics
Refinement
Continually refining metrics to accommodate new needs
Survey: Conducting an HITL Measurement Needs Survey (HMNS) across the agency to determine what kinds of HITL measurements are generally needed.
Organizing the data from the HMNS into representative categories.
Determining the commonalities in HITL testing among NASA and the prime contractor’s facilities and establishing a standard way to conduct HITL testing.
13. Use Case Development – Current Needs Validate Standard Metrics with
Multiple vehicle models
Multi-disciplinary solutions
Subject Matter Experts of different disciplines located in various centers
Astronauts as subjects for human factors evaluation
Supporting Requirements TV&V
Constellation requirements testing and validation:
Providing the distributive HITL testing environment for resolving the TBDs and TBRs (Currently, NASA needs to resolve Human System Integration Requirements (HSIR) document TBDs and TBRs)
GNC Automation vs. Manual Control requirements confirmation
Requirements verification: NASA plays an oversight role
The CEV Task Description Sheet CEVLM-04-1030 calls for the confirmation that the CEV requirements related to AA&M control are feasible from a GN&C standpoint.
15. Use Case Development Procedure Find use cases relevant to a near-term space vehicle development
Completion in 1 year
Participating in the DSES to demonstrate SSTE
Potential for distributive HITL TV&V
Works best as part of the DSES network
Conducting the use cases tests
Establishing the standard: Once the SSTE idea is proven and lessons are learned from the test cases, we will announce to the user community that the SSTE will be the standard for HITL testing at NASA
HITL = human in the loop
TV&V = testing, validation, and verification
16. Metrics & Use Cases Development Overall Strategy Begin with a core set of metrics
Use cases relevant to current programs will be identified to validate the metrics
Once the metrics are validated, they will be formally established as the current standard for human-in-the-loop (HITL) testing
Continue to refine and enlarge the metrics set to cover the on-going new HITL testing demands
17. SSTE Current Plan Develop standard metrics (FY 07-08)
Standard HE Metrics development (6 months)
Use case development (12 months)
SSTE demo (6 months)
Implement 1st stage SSTE (FY09-10)
Incorporate metrics into DSES (6 months)
HE metrics (6 months)
HITL testing/simulation (6 months)
Training and Maintenance (FY10-11)
Develop Training Program for using SSTE (6 months)
Develop system maintenance/user manuals, Training (parallel, 6 months)
System maintenance (on-going)
Continue refinement (FY 11-)
18. Current Progress
Metrics Development
Identified the categories of the metrics: time stamp, events, and states
Developing the data definition matrix
Collecting inputs from JSC, Ames, Langley, and DoD
Use Case Development
Working with NASA and the Orion Prime Contractor to identify use cases from Orion requirements
HSIR workload requirements validation
HITL Distributive Testing/Simulation Development
Leveraging off the infrastructure developed by the DSES to build distributive HITL displays and controls
Potential 1st SSTE demo in FY 08
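The three metric categories identified under metrics development (time stamp, events, states) could be logged, for example, as rows of a flat file. This is a minimal sketch; the column names and event vocabulary are assumptions, not the actual SSTE data definition matrix.

```python
import csv
import io

# Illustrative log using the slide's three categories: time stamp,
# event, and state. All names below are invented for this example.
rows = [
    {"timestamp": 0.00, "event": "sim_start",    "state": "nominal"},
    {"timestamp": 1.25, "event": "button_press", "state": "nominal"},
    {"timestamp": 3.80, "event": "alarm",        "state": "off_nominal"},
]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["timestamp", "event", "state"])
writer.writeheader()
writer.writerows(rows)
print(buf.getvalue())
```

A shared column definition like this is what would let distributed facilities exchange and merge HITL test data.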
19. Summary SSTE will provide standardized metrics and environment for distributive human-in-the-loop TV&V
Support all human-related space programs
Major work in progress: Human engineering metrics development, use case development
Requires additional funding
21. Environmental / Demographic Metrics Describes the conditions under which the test was conducted
Unchanged for at least one session of the test
Collected prior to the test and whenever they change
Environmental Conditions
Lighting, ambient noise, temperature, etc.
Human Demographics
Age, gender, training level, anthropometry, etc.
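The collection rule above (capture before the test, re-capture only on change) can be sketched as a small helper. The field names and values are illustrative assumptions.

```python
# Sketch of the slide's collection rule: environmental/demographic
# values are snapshotted before the test and again only when they
# change. Condition keys below are invented for this example.

def record_conditions(log, timestamp, conditions):
    """Append a (timestamp, conditions) snapshot only if the
    conditions differ from the most recent snapshot."""
    if not log or log[-1][1] != conditions:
        log.append((timestamp, dict(conditions)))

session_log = []
record_conditions(session_log, 0.0,  {"lighting": "dim", "noise_db": 45})
record_conditions(session_log, 10.0, {"lighting": "dim", "noise_db": 45})     # unchanged: skipped
record_conditions(session_log, 20.0, {"lighting": "bright", "noise_db": 45})  # changed: recorded
# session_log now holds two snapshots, at t=0.0 and t=20.0
```

Logging only changes keeps session-level metrics compact while still letting analysts reconstruct the conditions in effect at any moment of the test.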
22. Human Performance Metrics Should include at least the following:
Response time
Button press, device activation, vocal response, etc.
Completion time
Time needed to complete a step
Granularity defined before the test or reconstructed after the test
Accuracy
Quality of a behavior
Coding the correct and erroneous actions and counting the number of occurrences at the end
Workload
On primary and secondary tasks
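The first three performance metrics above can be derived from a time-stamped event log along the lines of the following sketch. The event names, time values, and action coding are assumptions for illustration only.

```python
# Hypothetical event log: (event_name, timestamp_in_seconds).
events = [
    ("stimulus", 0.0), ("button_press", 0.8),  # response pair
    ("step_start", 1.0), ("step_end", 6.5),    # completion pair
]
# Actions coded as correct/erroneous, counted at the end (per the slide).
actions = ["correct", "correct", "error", "correct"]

times = dict(events)
response_time = times["button_press"] - times["stimulus"]  # 0.8 s
completion_time = times["step_end"] - times["step_start"]  # 5.5 s
accuracy = actions.count("correct") / len(actions)         # 0.75
```

Workload is the one metric here that cannot be computed from the event log alone; it typically comes from a separate subjective rating on the primary and secondary tasks.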
23. Output Data Format
25. Metrics Definition File Example
26. Metrics Output File Example
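The definition and output file contents from the original slides are not reproduced in this export, so the following is a purely hypothetical sketch of how a metrics definition entry and a matching output record might pair up. Every field name and value is invented.

```python
import json

# Hypothetical metrics definition entry: declares one metric's
# category, type, and units (all names invented for this sketch).
definition = {
    "metric": "response_time",
    "category": "human_performance",
    "type": "float",
    "units": "seconds",
}

# Hypothetical output record conforming to that definition.
output = {"metric": "response_time", "timestamp": 12.4, "value": 0.83}

print(json.dumps(definition))
print(json.dumps(output))
```

The key idea is the pairing: the definition file fixes the name, type, and units once, and every output record then validates against it, which is what makes data collected at different facilities comparable.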