Distributed Microsystems Laboratory: Developing Microsystems that Make Sense. Sensor Validation Techniques. Sponsoring Agency: Center for Process Analytical Chemistry. PI: D. Wilson. Research Assistant: Garth Tan
Distributed Microsystems Laboratory: Developing Microsystems that Make Sense • Goals: To use various pattern recognition and preprocessing tools to validate incoming sensor data, enhancing system task accuracy and minimizing downtime • Sensor Response Validation using Hidden Markov Models: • Preliminary analysis of sensor arrays to determine experiment health • Follow-up analysis of individual sensors to pinpoint sensor health • Sensor Baseline Validation using Statistical Analysis: • Gaussian analysis: detects deviations from the Gaussian calibration distribution for each individual sensor • Flags sensors that, in the absence of a stimulus, have drifted or otherwise deviated from the calibration state enough to diminish the accuracy of their contribution to system tasks (a minimal sketch of such a check follows below) • Analysis of Data Sets / Proof-of-Concept Applications: • Composite Polymer Film Chemical Sensor Arrays: Caltech • Honeywell Sensor Arrays for Monitoring a Vacuum Drier Process
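The baseline check described above can be sketched as a simple per-sensor z-score test against the calibration distribution. This is a minimal illustration, assuming stimulus-free calibration readings are available for every sensor and using a 3-sigma threshold; the function and variable names are hypothetical, not the lab's implementation.

```python
import numpy as np

def flag_drifted_sensors(calib_baselines, current_baselines, z_thresh=3.0):
    """Flag sensors whose stimulus-free baseline deviates from the Gaussian
    calibration distribution by more than z_thresh standard deviations.
    (Hypothetical helper; names and threshold are assumptions.)

    calib_baselines: (n_calibration_runs, n_sensors) baseline readings
    current_baselines: (n_sensors,) most recent stimulus-free readings
    """
    mu = calib_baselines.mean(axis=0)      # per-sensor calibration mean
    sigma = calib_baselines.std(axis=0)    # per-sensor calibration spread
    z = np.abs(current_baselines - mu) / sigma
    return np.where(z > z_thresh)[0]       # indices of flagged sensors
```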
Distributed Microsystems Laboratory: Sensor Validation using HMMs What is a Hidden Markov Model (HMM)? • An HMM "is a doubly embedded stochastic process with an underlying stochastic process that is not observable (it is hidden), but can only be observed through another set of stochastic processes that produce the sequence of observations" (Lawrence R. Rabiner, 1989 [1]) • "In simpler terms, it is a collection of states connected by transitions and emitting an output during each transition. The model is named 'hidden' since the state of the model at a time instant t is not observable directly" (L. Satish and B.I. Gururaj, 1993 [2]) • The particular HMM used to characterize response in this application is the left-to-right model. This model is more computationally efficient: it is forced to start in state one and to transition sequentially, which results in a bi-diagonal transition matrix and eliminates the need for a probability vector describing the model's initial state (see the sketch below).
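A minimal sketch of the left-to-right topology described above: each state may only persist or advance to the next state, giving the bi-diagonal transition matrix, and the model is pinned to state one at time zero. The helper name and the fixed self-transition probability are illustrative assumptions, not values from the slides.

```python
import numpy as np

def left_to_right_hmm(n_states, p_stay=0.8):
    """Build a bi-diagonal transition matrix for a left-to-right HMM.
    (Illustrative sketch; p_stay is an assumed placeholder value.)"""
    A = np.zeros((n_states, n_states))
    for i in range(n_states - 1):
        A[i, i] = p_stay          # remain in the current state
        A[i, i + 1] = 1 - p_stay  # advance to the next state
    A[-1, -1] = 1.0               # final state is absorbing
    pi = np.zeros(n_states)
    pi[0] = 1.0                   # forced start in state one
    return A, pi
```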
Distributed Microsystems Laboratory: Sensor Validation using HMMs What is a Hidden Markov Model (HMM)? • The HMM is a probabilistic analog of a state machine • What does this mean? • The model is trained to recognize the important transitions in the input "training" data • The periods between important transitions are the states • The transitions themselves are modeled probabilistically • The rules for switching states are determined during the training phase • During the testing phase, the response travels through the model as a sequence of states, with transitions between states governed by the rules acquired during training
Distributed Microsystems Laboratory: Sensor Validation using HMMs Outputs of the HMM • From each state: • Observation sequence = the sensor response • Probability = the probability that an observation (a progression in the sensor response) occurs in that particular state of that particular model • The states through which the sensor progresses can be tracked over time • This can be used to indicate where the major transitions occur in the sensor response that identify it with a particular HMM • The final output of the model is the probability that "this model produced the current sensor response" (see the sketch below)
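Both outputs, the probability that a given model produced the response and the progression of states over time, can be illustrated with a standard forward pass and Viterbi decoding. The sketch below assumes Gaussian emissions with one mean and standard deviation per state; it is not the original implementation, and all names are illustrative.

```python
import numpy as np
from scipy.stats import norm

def score_and_track(obs, A, pi, means, stds):
    """Forward log-likelihood P(obs | model) and Viterbi state path.

    obs: 1-D normalized sensor response; A, pi: HMM transition matrix and
    initial-state vector; means/stds: per-state Gaussian emission parameters.
    (Hypothetical helper, not the original implementation.)
    """
    T, N = len(obs), len(pi)
    logB = np.array([norm.logpdf(obs, means[j], stds[j]) for j in range(N)]).T
    logA = np.log(A + 1e-300)
    logpi = np.log(pi + 1e-300)

    # Forward pass: total probability that this model produced the response.
    alpha = logpi + logB[0]
    for t in range(1, T):
        alpha = logB[t] + np.logaddexp.reduce(alpha[:, None] + logA, axis=0)
    loglik = np.logaddexp.reduce(alpha)

    # Viterbi pass: most likely progression of states over time.
    delta = logpi + logB[0]
    back = np.zeros((T, N), dtype=int)
    for t in range(1, T):
        trans = delta[:, None] + logA
        back[t] = trans.argmax(axis=0)
        delta = logB[t] + trans.max(axis=0)
    path = np.empty(T, dtype=int)
    path[-1] = delta.argmax()
    for t in range(T - 2, -1, -1):
        path[t] = back[t + 1, path[t + 1]]
    return loglik, path
```

The forward pass yields the log-likelihood used to accept or reject a response against a model, while the Viterbi path corresponds to the "progression of states" plotted on the result slides.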
Distributed Microsystems Laboratory: Sensor Validation using HMMs Recent Results: Chemically Sensitive Polymer Films • Array Composition: • 10 types of sensors • 4 sensors of each type • 40 total sensors • An HMM is trained on each set of four redundant sensors (4 inputs to each model, 10 total models) • Data sets are cross-validated against the different models (see the sketch below) • Cross-validation of the training data exposes classes of sensors that cannot be readily discriminated (for good reason) • Cross-validation of the testing data captures unlike/invalid responses for all models • Conclusion: the HMM method can detect broken or incorrect sensors at the sensor level and invalid/extreme environmental conditions at the system level
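The training and cross-validation procedure could be sketched as below, using the hmmlearn package as an assumed implementation (the original work may use different software, and the left-to-right constraint on the transition matrix is omitted here for brevity). The data layout is hypothetical: one list of four normalized (T, 1) response arrays per sensor type.

```python
import numpy as np
from hmmlearn import hmm  # assumed implementation; the original may differ

def train_type_models(train_responses, n_states=4):
    """Train one GaussianHMM per sensor type on its four redundant sensors.

    train_responses: list of 10 lists, each holding four (T, 1) normalized
    response arrays (hypothetical layout, not the original data format).
    """
    models = []
    for per_type in train_responses:
        X = np.vstack(per_type)                  # stack the redundant responses
        lengths = [len(r) for r in per_type]
        m = hmm.GaussianHMM(n_components=n_states, covariance_type="diag", n_iter=50)
        m.fit(X, lengths)                        # one model per sensor type
        models.append(m)
    return models

def cross_validate(models, test_responses):
    """Score every test response against every model; the highest-scoring
    model's index is the predicted sensor type."""
    scores = np.array([[m.score(r) for m in models] for r in test_responses])
    return scores, scores.argmax(axis=1)
```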
Distributed Microsystems Laboratory: Sensor Validation using HMMs Sample Data: Acetone • Each row represents four sensors of the same type • The output is taken as a percentage change from the baseline resistance, which is then normalized across the entire sensor response (see the preprocessing sketch below) • Each figure shows results from 35 experiments
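A minimal sketch of that preprocessing step; normalizing to unit peak magnitude is an assumption, since the slide does not specify the exact normalization, and the function name is illustrative.

```python
import numpy as np

def preprocess(response, baseline_resistance):
    """Express a raw resistance trace as a percentage change from the
    baseline resistance, then normalize across the whole response.
    (Unit-peak normalization is an assumed choice.)"""
    delta = 100.0 * (response - baseline_resistance) / baseline_resistance
    return delta / np.max(np.abs(delta))
```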
Distributed Microsystems Laboratory: Sensor Validation using HMMs Recent Results: Model 1 for Sensor Type 1 (Acetone) [Figures: Sensor Responses; Progression of States]
Distributed Microsystems Laboratory: Sensor Validation using HMMs Recent Results: Model 1 for Sensor Type 4 (Acetone) • The responses look similar to those of Sensor Type 1, but a closer look reveals differences in magnitudes, slopes, and final settling points • Sensor Type 4 was not able to move out of Model 1's first state, causing it to fail under the model and be invalidated as a potential sensor of Type 1 [Figures: Sensor Responses; Progression of States]
Distributed Microsystems Laboratory: Sensor Validation using HMMs Results: Model 1 for Sensor Type 2 (Acetone) • The responses look similar to those of Sensor Type 1 • However, the sensor response did not finish in the third state as Sensor Type 1 does, but progressed on to the fourth state, thereby invalidating it as a match for the Sensor Type 1 model [Figures: Sensor Responses; Progression of States]
Distributed Microsystems Laboratory: Sensor Validation using HMMs Recent Results: Acetone Sensor Validation Confusion Matrix • The matrix pairs each sensor model with its recognition rate for each sensor type; correct recognition lies along the diagonal. For example, the HMM for Sensor Type 1 (Model 1, first row) recognized sensors of Type 1 (first column of the first row) 100% of the time (a tallying sketch follows below).
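Given per-response model assignments (for example, from the cross-validation sketch earlier), a matrix of this form could be tallied with the illustrative helper below; this is not the original analysis code, and the names are assumptions.

```python
import numpy as np

def confusion_matrix_pct(true_types, predicted_types, n_types=10):
    """Model-versus-sensor-type confusion matrix in percent.

    Rows: the model that accepted the response; columns: the true sensor
    type; entry [m, t]: percentage of type-t responses assigned to model m.
    (Illustrative helper, not the original analysis code.)"""
    counts = np.zeros((n_types, n_types))
    for t, m in zip(true_types, predicted_types):
        counts[m, t] += 1
    col_totals = counts.sum(axis=0, keepdims=True)
    return 100.0 * counts / np.maximum(col_totals, 1)
```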
Distributed Microsystems Laboratory: Sensor Validation using HMMs Discussion and Conclusions • The HMM picks out responses that are not characteristic. "Failures" of the right sensors under the right models tend to be caused by noisy experiments rather than by failures of the model • Once sensors are invalidated, they can be removed from the decision-making process to improve overall system accuracy • Other work in the DMS lab determines the optimal number and combination of sensors in an array, so that the array can be rearranged or reconstructed when a sensor is invalidated to provide the "next best" system accuracy • Training a model with an occasional uncharacteristic response does not affect its ability to train and test correctly unless a majority of the training responses have problems; therefore, isolated poor training data does not significantly impact the accuracy of the model