AFRL: Decision Models in Collaborative Integrated Solutions System Development (PMD-0204) Principal Investigators: Heather Nachtmann, Ph.D. Terry Collins, Ph.D. Research Assistants: Justin Hunter David Rieske Brian Waters Bryan Hill AFRL POCs: Maj. Matt Goddard Northrop Grumman: John Jacobs
Project Objective • To develop a Balanced Scorecard to measure the performance of flightline maintenance activities
Balanced Scorecard • The Balanced Scorecard (BSC) is a performance measurement system created to overcome deficiencies in traditional business performance measurement systems through: • Use of both leading and lagging performance indicators • Clarifying and incorporating long-term goals into daily decision-making • Showing the effects of decisions in one area of operations on another area • Considering multiple measures at once
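To make the structure concrete, a scorecard can be thought of as a set of measures tagged by perspective and by leading/lagging indicator type. The Python sketch below is a minimal illustration with hypothetical measure names, perspectives, and values; it is not part of the original project.

```python
from dataclasses import dataclass

@dataclass
class Measure:
    """One scorecard measure, tagged by perspective and indicator type."""
    name: str
    perspective: str   # hypothetical perspectives, e.g. "Mission", "Internal Process"
    leading: bool      # True = leading indicator, False = lagging indicator
    target: float
    actual: float

# Hypothetical flightline maintenance measures, for illustration only
scorecard = [
    Measure("Mission Capable rate (%)", "Mission", leading=False, target=85.0, actual=82.4),
    Measure("Scheduled maintenance compliance (%)", "Internal Process", leading=True, target=95.0, actual=97.1),
    Measure("Technician training hours per month", "Learning & Growth", leading=True, target=12.0, actual=10.5),
]

# Viewing leading and lagging indicators side by side supports the BSC goal of
# considering multiple measures at once
for m in scorecard:
    kind = "leading" if m.leading else "lagging"
    print(f"{m.perspective:>18} | {kind:>7} | {m.name}: {m.actual} (target {m.target})")
```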
Project Methodology • Selection of a strategically aligned performance measurement system (BSC) • Research into BSC development and implementation process • Investigation of current flightline MX processes • Production of associated development guidelines • Validation of these guidelines through a case application • Investigation of software implementations.
Deliverables • BSC Development Guide • BSC Field Study • Preliminary BSC Validation • Software Package Review
BSC Development Guide • Detailed guidelines to facilitate the development of a BSC by AF flightline personnel • Benefits include: • Use of a detailed, step-by-step development process • Requires little or no previous experience with the BSC • Focus on specific AF requirements
BSC Field Study • A case study using BSC Development Guide in AF flightline environments • Site visits were conducted to understand and observe daily flightline maintenance activities • A pool of actual AF metrics was stratified into families and perspectives for future use • A preliminary BSC was constructed using observations and aforementioned guidelines
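As a rough illustration of the stratification step, the sketch below groups a few invented metric names into hypothetical families and perspectives; it does not reproduce the actual AF metric pool used in the field study.

```python
from collections import defaultdict

# Hypothetical (metric, family, perspective) records standing in for the AF metric pool
metric_pool = [
    ("Mission Capable rate", "Aircraft availability", "Mission"),
    ("Fix rate (8-hour)", "Repair responsiveness", "Internal Process"),
    ("Repeat/recur discrepancies", "Maintenance quality", "Internal Process"),
    ("5-level manning (%)", "Workforce experience", "Learning & Growth"),
]

# Stratify the pool: perspective -> family -> list of metrics
stratified = defaultdict(lambda: defaultdict(list))
for metric, family, perspective in metric_pool:
    stratified[perspective][family].append(metric)

for perspective, families in stratified.items():
    print(perspective)
    for family, metrics in families.items():
        print(f"  {family}: {', '.join(metrics)}")
```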
Preliminary BSC Validation • The preliminary BSC was presented to logistics personnel in survey form • The survey was administered by AFRL personnel to 2003 LOA conference attendees • Survey respondents ranked each perspective and each measure in terms of criticality • Results were tallied and presented in graphical form • Additional measures were provided by survey respondents
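A minimal sketch of how such criticality rankings could be tallied; the respondents, measures, and ranks below are hypothetical and do not represent the 2003 LOA survey data.

```python
import statistics

# Hypothetical survey responses: each respondent ranks measures by criticality (1 = most critical)
responses = {
    "respondent_1": {"Mission Capable rate": 1, "Fix rate": 2, "Training hours": 3},
    "respondent_2": {"Mission Capable rate": 1, "Fix rate": 3, "Training hours": 2},
    "respondent_3": {"Mission Capable rate": 2, "Fix rate": 1, "Training hours": 3},
}

# Average rank per measure; a lower mean rank means the measure was judged more critical
measures = next(iter(responses.values())).keys()
mean_rank = {m: statistics.mean(r[m] for r in responses.values()) for m in measures}

for measure, rank in sorted(mean_rank.items(), key=lambda kv: kv[1]):
    print(f"{measure}: mean criticality rank {rank:.2f}")
```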
Software Package Review • A review was conducted of existing BSC software packages • Review criteria were formed using industry standards • Features available on each package were described • Three leading packages were compared using criteria and features
AFRL: Quantifying Logistics Capabilities (PMD-0302) Principal Investigators: Heather Nachtmann, Ph.D. Justin Chimka, Ph.D. Manuel Rossetti, Ph.D. Research Assistants: Alex Andelman David Rieske AFRL POCs: Edward Boyle Northrop Grumman: John Jacobs
Project Objective • Use statistical analysis to investigate how personnel experience affects mission capability, and develop a metric and standard relating personnel experience to the Mission Capable rate
Task List • Select USAF system for study • Review similar past Air Force studies • Acquire needed data • Develop regression model through statistical analysis • Develop metric and standard to relate personnel skill level to Mission Capable rate • Develop acceptance plan • Technology transfer
Problem with Forecasting Readiness thesis • Assumed that only the MC rate was dependent and all other variables were independent • This assumes that all other variables are controllable, which is not the case • In our project, we assume that only personnel skill levels can be controlled and have a significant effect on the other variables
Methodology • Develop regression models for each dependent variable • Verify that all the regression models meet the necessary assumptions for proper multiple linear regression • Use new data to validate the regression of each variable • Develop metric and standard for each of the dependent variables
Methodology • Each dependent variable has two models • Percentage of skill levels • Total number of skill levels • Any model with an R-squared (adj.) value below 0.650 was removed from consideration • For each dependent variable, a graph was created with the number of variables in each model on the x-axis and the R-squared (adj.) value on the y-axis • This allows comparison of the models based on model efficiency (see the sketch below) • The least efficient models were removed
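A minimal sketch of this screening step, assuming an ordinary least squares fit in statsmodels; the skill-level predictors, the MC-rate response, and all data values are synthetic stand-ins for the project's actual data.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Synthetic stand-in data: skill-level mix as predictors, MC rate as the response
n = 120
data = pd.DataFrame({
    "pct_3_level": rng.uniform(10, 40, n),
    "pct_5_level": rng.uniform(30, 60, n),
    "pct_7_level": rng.uniform(10, 40, n),
})
data["mc_rate"] = 60 + 0.3 * data["pct_5_level"] + 0.5 * data["pct_7_level"] + rng.normal(0, 3, n)

# Fit candidate models with different predictor subsets and compare efficiency:
# number of predictors (x-axis of the screening graph) vs. adjusted R-squared (y-axis)
candidates = [
    ["pct_7_level"],
    ["pct_5_level", "pct_7_level"],
    ["pct_3_level", "pct_5_level", "pct_7_level"],
]
for predictors in candidates:
    X = sm.add_constant(data[predictors])
    fit = sm.OLS(data["mc_rate"], X).fit()
    keep = fit.rsquared_adj >= 0.650   # screening rule from the study
    print(f"{len(predictors)} predictor(s): adj R^2 = {fit.rsquared_adj:.3f}"
          f" -> {'retained' if keep else 'removed'}")
```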
Assumption Analysis • The following assumptions were checked for each of the remaining models • Normality of the residuals • The error terms have a zero mean • The error terms have a constant variance • The errors are uncorrelated • Models that did not satisfy all of these assumptions were eliminated
Assumption Analysis (cont.) • Normality of the residuals • Ryan-Joiner test • p-value >= 0.05 indicated normality • Zero mean of the error terms • 1-Sample t test • p-value >= 0.05 indicated a zero mean
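The Ryan-Joiner test is specific to Minitab, so the sketch below substitutes SciPy's Shapiro-Wilk test for the normality check (an assumption on my part) and uses a one-sample t test for the zero-mean check, with synthetic residuals standing in for the real model residuals.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
residuals = rng.normal(0, 1, 100)   # stand-in for residuals from a fitted model

# Normality of the residuals: Shapiro-Wilk used here in place of Minitab's
# Ryan-Joiner test, with the same decision rule (p >= 0.05 indicates normality)
w_stat, p_norm = stats.shapiro(residuals)
print(f"Shapiro-Wilk p = {p_norm:.3f} -> {'normality not rejected' if p_norm >= 0.05 else 'non-normal'}")

# Zero mean of the error terms: one-sample t test against a mean of 0
t_stat, p_mean = stats.ttest_1samp(residuals, popmean=0.0)
print(f"1-sample t test p = {p_mean:.3f} -> {'zero mean not rejected' if p_mean >= 0.05 else 'nonzero mean'}")
```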
Assumption Analysis (cont.) • Constant error term variance • 2-Sample t test • p-value >= 0.05 indicated no significant pattern in the variance • Uncorrelated error terms • Correlation analysis • Models whose successive error terms were correlated were removed
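A sketch of the remaining two checks under stated assumptions: the 2-sample t test is applied here to the absolute residuals split into low- and high-fitted-value halves, and the independence check uses the lag-1 correlation of the residuals. The residuals are synthetic, and this split is one plausible reading of how the test was applied, not a documented detail of the study.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
residuals = rng.normal(0, 2, 100)   # stand-in residuals, ordered by fitted value

# Constant error variance: compare absolute residuals from the low- and high-fitted
# halves with a 2-sample t test; p >= 0.05 suggests no significant variance pattern
half = len(residuals) // 2
abs_low, abs_high = np.abs(residuals[:half]), np.abs(residuals[half:])
_, p_var = stats.ttest_ind(abs_low, abs_high)
print(f"2-sample t test p = {p_var:.3f} -> "
      f"{'no significant variance pattern' if p_var >= 0.05 else 'non-constant variance'}")

# Uncorrelated error terms: correlation between successive residuals (lag-1)
r, p_corr = stats.pearsonr(residuals[:-1], residuals[1:])
print(f"lag-1 correlation r = {r:.3f}, p = {p_corr:.3f} -> "
      f"{'model retained' if p_corr >= 0.05 else 'model removed'}")
```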
Model Development • Following the efficiency and assumption analysis, a final model will be selected for each dependent variable from the remaining models • Preference is to use efficient models that describe events accurately
Future Work • Final models need to be chosen for each of the dependent variables • Matrices will be constructed for each variable that will display expected results based on certain skill level mixtures • Based on these results, a metric and standard relating personnel skill level to mission capability will be developed
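A rough sketch of what such a matrix might look like, using invented intercept and slope values in place of the estimates from the final models; only the layout, not the numbers, is meaningful.

```python
import numpy as np
import pandas as pd

# Hypothetical regression coefficients relating skill-level percentages to MC rate;
# the real coefficients would come from the final models selected in this project
INTERCEPT, B5, B7 = 55.0, 0.30, 0.50

# Candidate skill-level mixtures (percent 5-level vs. percent 7-level personnel)
pct_5 = np.arange(30, 61, 10)
pct_7 = np.arange(10, 41, 10)

# Matrix of expected MC rates for each mixture
matrix = pd.DataFrame(
    [[INTERCEPT + B5 * p5 + B7 * p7 for p7 in pct_7] for p5 in pct_5],
    index=[f"{p}% 5-level" for p in pct_5],
    columns=[f"{p}% 7-level" for p in pct_7],
)
print(matrix.round(1))
```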
AFRL: Maintenance Prognostics Decision Aiding Principal Investigators: Heather Nachtmann, Ph.D. Justin Chimka, Ph.D. Manuel D. Rossetti, Ph.D. Research Assistants: Vanessa Cabrera Jonathan King David Rieske
Project Objective • To explore the human interface of equipment prognostics and make system improvements through • Increased understanding of technician involvement in equipment prognostics • Improved prognostic decisions through enhanced technician empowerment
Planned Approach • Examine other prognostic environments such as medicine and meteorology for possible adaptation to equipment prognostics • Explore the current human interaction within equipment prognostics through observation of MX technicians • Possibly by observing prognostic decisions about hypothetical equipment in a controlled setting • Using information gathered from observations, study how to improve the equipment prognostics system from the viewpoint of the technician user
Prognostics Decision Aiding Diagram • [Flow diagram: state of the system → sensor → prognostics prediction engine → "Bad Event" warning with associated likelihood → consequences in the actual state of nature (event does or does not occur) → technician responses: diagnose, request different information, retrain, assistance]
Anticipated Outcomes • Determine the role of the technician in the prognostic decision-making process • Provide systems improvement analyses and recommendations to enhance technician empowerment to improve prognostic predictions