
Artificial Neural Networks 0909.560.01/0909.454.01 Fall 2004


Presentation Transcript


  1. Artificial Neural Networks 0909.560.01/0909.454.01, Fall 2004. Lecture 7, October 25, 2004. Shreekanth Mandayam, ECE Department, Rowan University. http://engineering.rowan.edu/~shreek/fall04/ann/

  2. Plan
  • RBF Design Issues
    • K-means clustering algorithm
    • Adaptive techniques
  • ANN Design Issues
    • Input data processing
    • Selection of training and test data: cross-validation
    • Pre-processing: feature extraction
  • Approximation Theory
    • Universal approximation
  • Lab Project 3

  3. RBF Network
  [Figure: three-layer architecture. Inputs x1, x2, x3 feed a hidden layer of radial basis functions φj, whose activations are combined through weights wij to produce outputs y1, y2. An inset plots a basis function φ(t) against t over [-5, 5], with values between 0 and 1.]
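As a concrete illustration of the architecture in this figure, here is a minimal Gaussian RBF forward pass in Python (NumPy). The layer sizes, center values, and width below are placeholder assumptions for the sketch, not values from the lecture.

```python
import numpy as np

def rbf_forward(x, centers, sigma, W):
    """Forward pass of a Gaussian RBF network.

    x       : (n_inputs,) input vector
    centers : (K, n_inputs) hidden-unit centers c_j
    sigma   : common width of the Gaussian basis functions
    W       : (n_outputs, K) output-layer weights w_ij
    """
    # Hidden layer: phi_j(x) = exp(-||x - c_j||^2 / (2 sigma^2))
    d2 = np.sum((centers - x) ** 2, axis=1)
    phi = np.exp(-d2 / (2.0 * sigma ** 2))
    # Output layer: linear combination of basis-function activations
    return W @ phi

# Toy example: 3 inputs, 4 hidden units, 2 outputs (placeholder values)
rng = np.random.default_rng(0)
centers = rng.normal(size=(4, 3))
W = rng.normal(size=(2, 4))
y = rbf_forward(np.array([0.5, -1.0, 2.0]), centers, sigma=1.0, W=W)
print(y)  # two network outputs y1, y2
```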

  4. RBF Center Selection
  [Figure: data points scattered in the (x1, x2) plane, with a smaller set of cluster centers marked among them.]

  5. K-means Clustering Algorithm
  • N data points xi; i = 1, 2, …, N
  • At time-index n, define K clusters with cluster centers cj(n); j = 1, 2, …, K
  • Initialization: at n = 0, let cj(0) = xj; j = 1, 2, …, K (i.e., choose the first K data points as the initial cluster centers)
  • Compute the Euclidean distance of each data point from each cluster center: dij = d(xi, cj(n))
  • Assign xi to cluster j if dij = minj {dij}; i = 1, 2, …, N; j = 1, 2, …, K
  • For each cluster j = 1, 2, …, K, update the cluster center: cj(n+1) = mean {xi assigned to cluster j}
  • Repeat until ||cj(n+1) − cj(n)|| < ε (see the code sketch after this slide)
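A minimal NumPy sketch of the algorithm above, using the slide's initialization rule (the first K data points as centers); the variable names and toy data are mine, not the lecture's.

```python
import numpy as np

def kmeans(X, K, eps=1e-6, max_iter=100):
    """K-means clustering as described on the slide.

    X : (N, d) array of N data points
    K : number of clusters
    """
    centers = X[:K].copy()  # init: first K data points as cluster centers
    for _ in range(max_iter):
        # d[i, j]: Euclidean distance of point i from center j
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = np.argmin(d, axis=1)  # assign each point to its nearest center
        # Update each center to the mean of the points assigned to it
        new_centers = np.array([
            X[labels == j].mean(axis=0) if np.any(labels == j) else centers[j]
            for j in range(K)
        ])
        if np.all(np.linalg.norm(new_centers - centers, axis=1) < eps):
            return new_centers, labels  # converged: centers moved less than eps
        centers = new_centers
    return centers, labels

# Toy usage with random 2-D data
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
centers, labels = kmeans(X, K=3)
print(centers)
```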

  6. Selection of Training and Test Data: Method of Cross-Validation
  [Figure: the data set divided into four blocks; in each of Trials 1-4 a different block is held out as the test set while the remaining three blocks are used for training.]
  • Vary network parameters until the total mean squared error is minimized over all trials
  • Select the network with the least mean squared output error (a code sketch follows below)
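A sketch of this k-fold procedure in plain NumPy. The evaluator callback is a hypothetical stand-in for training an ANN on one split and returning its test mean squared error; a trivial mean predictor keeps the example self-contained.

```python
import numpy as np

def cross_validate(X, y, train_and_eval, k=4):
    """k-fold cross-validation as in the slide's four-trial diagram.

    train_and_eval(X_tr, y_tr, X_te, y_te) -> test MSE  (hypothetical callback)
    Returns the mean test MSE over all k trials.
    """
    folds = np.array_split(np.arange(len(X)), k)
    errors = []
    for test_idx in folds:
        # Hold out one block for testing; train on the rest
        train_idx = np.setdiff1d(np.arange(len(X)), test_idx)
        errors.append(train_and_eval(X[train_idx], y[train_idx],
                                     X[test_idx], y[test_idx]))
    return float(np.mean(errors))

# Toy evaluator: "train" a constant predictor (mean of y_tr), return test MSE
def mean_predictor_mse(X_tr, y_tr, X_te, y_te):
    return float(np.mean((y_te - y_tr.mean()) ** 2))

rng = np.random.default_rng(0)
X, y = rng.normal(size=(40, 3)), rng.normal(size=40)
print(cross_validate(X, y, mean_predictor_mse, k=4))
```

In practice one would run this for each candidate network configuration and keep the configuration whose mean test error over the trials is smallest.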

  7. Feature Extraction
  Objectives:
  • Increase information content
  • Decrease vector length
  • Parametric invariance
    • Invariance by structure
    • Invariance by training
    • Invariance by transformation

  8. Approximation Theory: Distance Measures
  • Supremum norm
  • Infimum norm
  • Mean squared norm
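The formulas on this slide were graphics in the original. Standard reconstructions for functions f, g on a set K follow; reading the "infimum norm" as the inf counterpart of the sup norm is my assumption from the slide's naming (it is not a true norm):

```latex
\|f-g\|_{\sup} = \sup_{x \in K} |f(x)-g(x)|, \qquad
\|f-g\|_{\inf} = \inf_{x \in K} |f(x)-g(x)|, \qquad
\|f-g\|_{2} = \left( \int_{K} |f(x)-g(x)|^{2}\, dx \right)^{1/2}
```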

  9. Recall: Metric Space
  • Reflexivity
  • Positivity
  • Symmetry
  • Triangle Inequality
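The slide's equations were also graphics; for a metric d on a space X, these four properties are the standard axioms:

```latex
\begin{aligned}
&\text{Reflexivity:} && d(x,x) = 0\\
&\text{Positivity:} && d(x,y) > 0 \quad \text{for } x \neq y\\
&\text{Symmetry:} && d(x,y) = d(y,x)\\
&\text{Triangle inequality:} && d(x,z) \le d(x,y) + d(y,z)
\end{aligned}
```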

  10. Approximation Theory: Terminology
  • Compactness
  • Closure
  [Figure: a compact set K and the closure of a set F.]
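Standard definitions matching the slide's labels K and F (the original formulas were graphics):

```latex
\overline{F} = F \cup \{\text{limit points of } F\}
\qquad
K \subset \mathbb{R}^n \text{ is compact} \iff K \text{ is closed and bounded}
```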

  11. Approximation Theory: Terminology
  • Best approximation
  • Existence set
  [Figure: an element u0 of a set M at minimum distance from a given f in the space E.]
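Reconstructed from the slide's symbols (f, u0, M, E), in the standard form:

```latex
\text{Best approximation: } u_0 \in M \text{ with } \|f - u_0\| = \min_{u \in M} \|f - u\|.
\qquad
\text{Existence set: every } f \in E \text{ has such a } u_0 \in M.
```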

  12. Approximation Theory: Terminology
  • Denseness
  [Figure: every f lies within distance ε of some g in the dense set F.]
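In symbols (again reconstructed; the slide showed only the picture with f, g, ε, and F):

```latex
F \text{ is dense in } E \text{ if for every } f \in E \text{ and every } \varepsilon > 0
\text{ there exists } g \in F \text{ with } \|f - g\| < \varepsilon.
```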

  13. Fundamental Problem
  • Theorem 1: Every compact set is an existence set (Cheney)
  • Theorem 2: Every existence set is a closed set (Braess)
  [Figure: given a set M, does an element g of M at minimum distance from a target exist?]

  14. Stone-Weierstrass Theorem
  • Identity
  • Separability
  • Algebraic Closure
  [Figure: the constant function in F; two points x1 ≠ x2 separated by some f with f(x1) ≠ f(x2); combinations such as af + bg remaining in F.]
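The three hypotheses, reconstructed in their standard form from the slide's labels:

```latex
\begin{aligned}
&\text{Identity:} && \text{the constant function } 1 \in F\\
&\text{Separability:} && x_1 \neq x_2 \implies \exists f \in F \text{ with } f(x_1) \neq f(x_2)\\
&\text{Algebraic closure:} && f, g \in F \implies af + bg \in F \text{ and } fg \in F
\end{aligned}
```

A family F satisfying these conditions is dense in the continuous functions on a compact set, which is the sense in which such network families are universal approximators (the "universal approximation" item in the Plan).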

  15. Lab Project 3: Radial Basis Function Neural Networks http://engineering.rowan.edu/~shreek/fall04/ann/lab3.html

  16. Summary
