
Pattern Recognition in Acoustic Signal Processing

Learn about hidden variables, tracking, regression, classification, universal approximators, structured modeling, training algorithms, error control, validation testing, regularization, and more in acoustic signal processing.




Presentation Transcript


  1. Pattern Recognition in Acoustic Signal Processing. Mark Hasegawa-Johnson, ECE Department and Beckman Institute, University of Illinois at Urbana-Champaign.

  2. Motivation. Perfect knowledge is unattainable. Solution: identify variables with unknown values (“hidden variables” or “parameters”) and devise an algorithm that will learn them in situ. Inference vs. tracking: inference learns fixed but unknown system parameters; tracking follows rapidly changing hidden variables. Regression vs. classification: in regression the hidden variables are continuous-valued; in classification they are discrete-valued.

  3. Outline. Regression examples; classification examples. Inference: universal approximators; generalization error (constraints vs. regularization); structured modeling (constraints vs. regularization). Tracking: convex programming (dynamics as constraints); Bayesian formulation (dynamics as regularization).

  4. Acoustic Imaging

  5. Passive Imaging/Tracking

  6. Articulatory Inference

  7. Acoustic Event Classification

  8. Speech Recognition

  9. Acoustic Event Transcription

  10. Inference

  11. The Known is a Function of the Unknown

  12. Universal Approximators. Show sigmoid and RBF networks, mixture Gaussian PDFs, and Parzen windows.
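
For readers following the transcript without the figures, here is a minimal numpy sketch of one of the universal approximators named above: an RBF network fit by least squares. The data, centers, and kernel width below are invented purely for illustration and are not from the deck.

```python
# Minimal sketch (not from the slides): a radial-basis-function (RBF) network
# fit by least squares to approximate a 1-D target function.
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-3.0, 3.0, size=200)               # training inputs
y = np.sin(x) + 0.05 * rng.standard_normal(200)    # noisy target values

centers = np.linspace(-3.0, 3.0, 15)               # fixed RBF centers (assumed)
width = 0.5                                        # shared kernel width (assumed)

def rbf_features(x):
    """Gaussian RBF design matrix: one column per center, plus a bias column."""
    phi = np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2 * width ** 2))
    return np.hstack([phi, np.ones((len(x), 1))])

# Output-layer weights: ordinary least squares on the RBF features.
w, *_ = np.linalg.lstsq(rbf_features(x), y, rcond=None)

x_test = np.linspace(-3.0, 3.0, 9)
print(np.round(rbf_features(x_test) @ w - np.sin(x_test), 3))  # small residuals
```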

  13. Training Corpus Error. Algorithms for minimizing training error for sigmoid and RBF networks.
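
As a sketch of the kind of training-error minimization meant here (not the deck's own algorithm listing), the following runs batch gradient descent on the squared training error of a single sigmoid unit; the data, learning rate, and iteration count are invented.

```python
# Sketch (assumed): batch gradient descent on the training-corpus squared error
# of a single sigmoid unit.
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((300, 2))                  # toy feature vectors
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(float)    # toy binary labels

w, b, lr = np.zeros(2), 0.0, 0.5
for step in range(200):
    z = X @ w + b
    yhat = 1.0 / (1.0 + np.exp(-z))                # sigmoid output
    grad_z = (yhat - y) * yhat * (1.0 - yhat)      # d(half squared error)/dz
    w -= lr * (X.T @ grad_z) / len(y)              # gradient step on the weights
    b -= lr * grad_z.mean()                        # gradient step on the bias

yhat = 1.0 / (1.0 + np.exp(-(X @ w + b)))
print("training misclassification rate:", np.mean((yhat > 0.5) != y))
```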

  14. Generalization Error

  15. Law of Large Numbers. Bounds on generalization error.
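
One standard form of these bounds, stated here for concreteness (the slide's exact statement may differ): Hoeffding's inequality for a single fixed classifier, and Vapnik's bound over a hypothesis class of VC dimension dVC.

```latex
% Hoeffding's inequality: for one fixed classifier with true error E and
% empirical error \hat{E}_N measured on N i.i.d. samples,
\[
  P\bigl(\,|\hat{E}_N - E| > \epsilon\,\bigr) \;\le\; 2\,e^{-2N\epsilon^{2}} .
\]
% Vapnik's bound over a hypothesis class with VC dimension d_{VC}:
% with probability at least 1-\delta,
\[
  E \;\le\; \hat{E}_N
    + \sqrt{\frac{d_{VC}\bigl(\ln(2N/d_{VC}) + 1\bigr) + \ln(4/\delta)}{N}} .
\]
```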

  16. Parameter Count. Show that dVC is less than or equal to the number of parameters in the classifier; suggest hyperplane classifiers with few parameters.
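
For the hyperplane case suggested above, the standard result (restated here, not quoted from the slide) is that the VC dimension equals the parameter count: d weights plus one bias.

```latex
% Hyperplane classifiers in R^d: VC dimension equals the number of parameters.
\[
  d_{VC}\bigl(\{\, x \mapsto \operatorname{sign}(w^{T}x + b) : w \in \mathbb{R}^{d},\, b \in \mathbb{R} \,\}\bigr) \;=\; d + 1 .
\]
```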

  17. Parameter Volume. Show that dVC is less than or equal to the data volume divided by the parameter volume for a hyperplane SVM.
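
A standard margin-based bound that expresses this volume argument (assumed form; R is the radius of a ball containing the data and ρ is the separating margin):

```latex
% Margin-based VC bound for a hyperplane SVM:
\[
  d_{VC} \;\le\; \min\!\left( \left\lceil \frac{R^{2}}{\rho^{2}} \right\rceil,\; d \right) + 1 ,
\]
% i.e., roughly the volume occupied by the data divided by the volume claimed
% by the margin, which can be far smaller than the raw parameter count d.
```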

  18. Validation Testing. Validation testing is a method for controlling generalization error.
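
A minimal sketch of validation testing, with invented data and a simple least-squares linear classifier standing in for whichever model is being evaluated:

```python
# Sketch (assumed setup): hold out part of the corpus as a validation set and
# use the validation error as an estimate of generalization error.
import numpy as np

rng = np.random.default_rng(2)
X = rng.standard_normal((500, 4))
y = (X @ np.array([1.0, -0.5, 0.0, 0.25]) > 0).astype(int)

split = 400                                   # 80% train / 20% validation
Xtr, ytr, Xva, yva = X[:split], y[:split], X[split:], y[split:]

# Simple stand-in classifier: least squares against +/-1 targets.
w, *_ = np.linalg.lstsq(Xtr, 2 * ytr - 1, rcond=None)
train_err = np.mean((Xtr @ w > 0) != ytr)
valid_err = np.mean((Xva @ w > 0) != yva)     # estimate of generalization error
print(f"train error {train_err:.3f}, validation error {valid_err:.3f}")
```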

  19. Error Control: Three Paths. Describe the three options: constrained learning vs. regularized learning vs. cross-validation.

  20. Validation-Based Learning. Outline of typical validation-based learning, e.g., mixture Gaussian models and neural nets.
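
As a concrete sketch of the mixture-Gaussian case (assuming scikit-learn's GaussianMixture; the data and the candidate model sizes are invented), the number of Gaussians is chosen by held-out log-likelihood:

```python
# Sketch (assumed): choose the number of Gaussians by validation log-likelihood.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(3)
data = np.vstack([rng.normal(-2.0, 1.0, (400, 2)),     # toy two-cluster data
                  rng.normal(+2.0, 0.5, (400, 2))])
rng.shuffle(data)
train, valid = data[:600], data[600:]

best_k, best_ll = None, -np.inf
for k in (1, 2, 4, 8, 16):
    gmm = GaussianMixture(n_components=k, random_state=0).fit(train)
    ll = gmm.score(valid)                    # mean held-out log-likelihood
    if ll > best_ll:
        best_k, best_ll = k, ll
print("chosen number of Gaussians:", best_k)
```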

  21. Constraint-Based Learning. Outline of typical constrained learning, e.g., a model space limited by prior knowledge such as phones or imaging classes.

  22. Regularized Learning. Bayesian/MAP learning; MDL-type regularization terms; SVM/SVR; AdaBoost. Emphasize: each of these is ultimately confirmed by validation testing!
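
All four methods listed above can be read as instances of one regularized objective (a sketch, not a quotation from the deck): training error plus a complexity penalty, with the trade-off weight λ ultimately confirmed by validation testing.

```latex
% Generic regularized learning objective:
\[
  \hat{\theta} = \arg\min_{\theta}\;
     \underbrace{\frac{1}{N}\sum_{n=1}^{N} \ell\bigl(y_n, f_{\theta}(x_n)\bigr)}_{\text{training error}}
     \;+\; \lambda\, \underbrace{\Omega(\theta)}_{\text{regularizer}} .
\]
```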

  23. Bayesian Regularization. Examples where MAP is appropriate, e.g., speaker adaptation, speaker ID, language ID.
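
In the MAP setting the prior plays the role of the regularizer; in speaker adaptation, for example, the prior can be centered on a speaker-independent model (standard formulation, restated here):

```latex
% Maximum a posteriori (MAP) estimation:
\[
  \hat{\theta}_{MAP}
    = \arg\max_{\theta}\; p(\theta \mid X)
    = \arg\max_{\theta}\; \bigl[\, \log p(X \mid \theta) + \log p(\theta) \,\bigr] .
\]
```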

  24. MDL Regularization. Model selection from limited data, e.g., determining whether prosodically tagged allophones should be modeled separately or jointly (Sarah's old paper).
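
A standard two-part MDL criterion for this kind of model-selection question, together with its BIC approximation (assumed form; k_M is the number of free parameters of model M):

```latex
% Two-part MDL: minimize the description length of the model plus the
% description length of the data given the model.
\[
  \hat{M} = \arg\min_{M}\; \bigl[\, L(M) + L(X \mid M) \,\bigr]
          \;\approx\; \arg\min_{M}\; \Bigl[\, \tfrac{k_M}{2}\log N \;-\; \log p\bigl(X \mid \hat{\theta}_M\bigr) \,\Bigr] .
\]
```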

  25. Margin Regularization: SVM. Example: landmark classification (Niyogi, Juneja, Borys).
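
The soft-margin SVM objective being regularized in this example, in its standard primal form (the feature map φ and the trade-off C are chosen by validation):

```latex
% Soft-margin SVM: maximize the margin 2/\|w\| while penalizing violations \xi_n.
\[
  \min_{w,\,b,\,\xi}\; \tfrac{1}{2}\|w\|^{2} + C \sum_{n=1}^{N} \xi_n
  \quad \text{s.t.} \quad
  y_n\bigl(w^{T}\phi(x_n) + b\bigr) \ge 1 - \xi_n, \qquad \xi_n \ge 0 .
\]
```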

  26. Margin Regularization: SVR. Example: acoustics-to-articulation mapping using SVR (Vikram's work).
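
The corresponding ε-insensitive support vector regression objective, in its standard primal form (restated here; errors smaller than ε are not penalized):

```latex
% epsilon-insensitive SVR primal:
\[
  \min_{w,\,b,\,\xi,\,\xi^{*}}\; \tfrac{1}{2}\|w\|^{2} + C \sum_{n=1}^{N} \bigl(\xi_n + \xi_n^{*}\bigr)
  \quad \text{s.t.} \quad
  \begin{cases}
    y_n - w^{T}\phi(x_n) - b \le \epsilon + \xi_n, \\
    w^{T}\phi(x_n) + b - y_n \le \epsilon + \xi_n^{*}, \\
    \xi_n \ge 0,\;\; \xi_n^{*} \ge 0 .
  \end{cases}
\]
```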

  27. Margin Regularization: AdaBoost. Example: AdaBoost feature selection for acoustic event detection (Zhuang and Zhou).
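
The standard AdaBoost round implied here (restated, not quoted from the slide); when each weak classifier h_t consults a single feature, the selected weak classifiers perform feature selection:

```latex
% AdaBoost round t: weak classifier h_t has weighted error \epsilon_t on the
% current example distribution D_t; Z_t normalizes the update.
\[
  \alpha_t = \tfrac{1}{2}\ln\frac{1-\epsilon_t}{\epsilon_t}, \qquad
  D_{t+1}(n) = \frac{D_t(n)\, e^{-\alpha_t y_n h_t(x_n)}}{Z_t}, \qquad
  H(x) = \operatorname{sign}\!\Bigl(\sum_{t} \alpha_t h_t(x)\Bigr) .
\]
```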

  28. Inference Summary. Some aspects of inference that are difficult to control using validation are controlled by other methods: feature selection by knowledge or boosting; feature design by knowledge, SVM, or SVR; class definitions by knowledge or MDL. Other aspects of inference are easier to control using validation, e.g., how many Gaussians? How many features?

  29. Tracking

  30. Discrete Tracking Example: Speech Recognition.

  31. Continuous Tracking Example: Passive Imaging. Source tracking.

  32. Bayesian Formulation. General Bayesian formulation of the tracking problem.
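
A generic statement of that Bayesian formulation (assumed to match the slide): the dynamic model p(x_t | x_{t-1}) regularizes the track, and the recursion specializes to the Kalman filter for linear-Gaussian dynamics or to HMM decoding when the state is discrete.

```latex
% Recursive Bayesian filtering of a hidden state x_t from observations y_{1:t}:
\[
  p(x_t \mid y_{1:t}) \;\propto\;
  p(y_t \mid x_t) \int p(x_t \mid x_{t-1})\, p(x_{t-1} \mid y_{1:t-1})\, dx_{t-1} ,
\]
% with the integral replaced by a sum over states in the discrete case.
```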

  33. Discrete Example: Acoustic Event Detection. Description of the Zhuang and Zhou AED system.

  34. Continuous Example: Articulatory Inference. Mixture Gaussian regression with state.

  35. Constrained Tracking. Example: passive imaging with a discrete number of dynamic options (cite the paper?).

  36. Long-Term Constraints. Introduce Roth & Yih's ILP methods for NLP.
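
A schematic version of the ILP decoding Roth & Yih advocate (a sketch under assumed notation): local classifier scores are maximized subject to linear constraints that encode long-term structural knowledge.

```latex
% Inference as an integer linear program over binary indicator variables z:
\[
  \max_{z \in \{0,1\}^{K}} \; s^{T} z
  \quad \text{s.t.} \quad A z \le b ,
\]
% where s collects local classifier scores and each row of A z <= b encodes a
% structural constraint, e.g., exactly one label active per time frame.
```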

  37. Conclusions.
