NEW COURSE: SKETCH RECOGNITION Analysis, implementation, and comparison of sketch recognition algorithms, including feature-based, vision-based, geometry-based, and timing-based recognition algorithms; examination of methods to combine results from various algorithms to improve recognition using AI techniques, such as graphical models. Learn how to make your drawings come alive…
Rubine Features • Make sure to convert to double before dividing in Java • For duplicate points, remove the second point, not the first • Try to get your values as close to mine as possible, then move on.
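A minimal sketch of both pitfalls, using an illustrative Point class rather than the course's own code:

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative Point class; the course's own class may differ.
class Point {
    final int x, y;
    Point(int x, int y) { this.x = x; this.y = y; }
}

class Preprocess {
    // For consecutive duplicate points, keep the first and drop the second.
    static List<Point> removeDuplicates(List<Point> pts) {
        List<Point> out = new ArrayList<>();
        for (Point p : pts) {
            if (!out.isEmpty()) {
                Point last = out.get(out.size() - 1);
                if (last.x == p.x && last.y == p.y) continue; // drop the later point
            }
            out.add(p);
        }
        return out;
    }

    // Integer division truncates in Java: cast to double BEFORE dividing.
    static double meanX(List<Point> pts) {
        int sum = 0;
        for (Point p : pts) sum += p.x;
        return (double) sum / pts.size();   // sum / pts.size() alone would truncate
    }
}
```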
Rubine Classification • Evaluate each gesture class c, 0 <= c <= C • Vc = value = goodness of fit for gesture class c • Pick the largest Vc and return gesture c
Rubine Classification • Wc0 = initial weight of gesture class c • Wci = weight for the ith feature • Fi = ith feature value • Vc = Wc0 + the sum over i of Wci * Fi (sum the weighted features)
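A minimal sketch of that classification loop, assuming the weights Wc0 and Wci have already been trained (array names are illustrative):

```java
class RubineClassifier {
    // f    = feature vector of the gesture to classify
    // w0[c]   = initial weight for class c
    // w[c][i] = weight of feature i for class c
    static int classify(double[] f, double[] w0, double[][] w) {
        int best = -1;
        double bestV = Double.NEGATIVE_INFINITY;
        for (int c = 0; c < w0.length; c++) {
            double v = w0[c];                 // Vc starts at the initial weight
            for (int i = 0; i < f.length; i++)
                v += w[c][i] * f[i];          // add each weighted feature
            if (v > bestV) { bestV = v; best = c; }
        }
        return best;                          // class with the largest Vc
    }
}
```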
Collect E examples of each gesture • (E should be 15 according to the paper) • Calculate the feature vector for each example • Fcei = the value of the ith feature for the eth example of the cth gesture
Find average feature values for each gesture • For each gesture, compute the average value of each feature across its examples • Fci is the average value of the ith feature for the cth gesture
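A sketch of the averaging step, assuming the feature values are stored as examples[c][e][i] (an illustrative layout, not the course's data format):

```java
// fBar[c][i] = mean of feature i over the E examples of class c.
static double[][] classMeans(double[][][] examples) {
    int C = examples.length;
    int F = examples[0][0].length;
    double[][] fBar = new double[C][F];
    for (int c = 0; c < C; c++) {
        int E = examples[c].length;
        for (int e = 0; e < E; e++)
            for (int i = 0; i < F; i++)
                fBar[c][i] += examples[c][e][i];
        for (int i = 0; i < F; i++)
            fBar[c][i] /= E;            // values are already double, so no truncation
    }
    return fBar;
}
```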
Compute gesture covariance matrix • How are the features of the shape related to each other? • Look at one example, look at two features – how much does each feature differ from its mean? – take the average of that product over all examples – that is one entry in the matrix • http://mathworld.wolfram.com/Covariance.html • Is there a dependency (umbrellas/raining)?
Normalize • cov(X) or cov(X,Y) normalizes by N−1, if N>1, where N is the number of observations. This makes cov(X) the best unbiased estimate of the covariance matrix if the observations are from a normal distribution. For N=1, cov normalizes by N • Rubine does not normalize the per-class matrices, for ease of the next step (so just sum, do not average)
Normalization • Dividing by N takes the average • But… we want to estimate the true variance • Our sample mean is not exactly the true mean • By construction, our data is closer to the sample mean than to the true mean • Thus the numerator is too small • So we reduce the denominator (N−1 instead of N) to compensate
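As a brief aside (standard statistics, not specific to the paper): the unbiased sample variance is s² = (1/(N−1)) · Σ (xᵢ − x̄)², while dividing by N instead systematically underestimates the true variance, because x̄ is fit to the same data it is compared against.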
Common Covariance Matrix • How are the features related across all the examples? • Numerator = sum of the un-normalized per-class covariance matrices • Denominator = normalization factor = total number of examples − total number of gesture classes = 26*15 − 26 = 26*14
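A sketch of how the per-class sums might be combined into the common covariance matrix, assuming the examples[c][e][i] layout and classMeans output from the sketches above (names are illustrative):

```java
// Sum the per-class (un-normalized) covariance sums, then divide once
// by (total examples - number of classes), e.g. 26*15 - 26 = 364.
static double[][] commonCovariance(double[][][] examples, double[][] fBar) {
    int C = examples.length;
    int F = fBar[0].length;
    double[][] cov = new double[F][F];
    int totalExamples = 0;
    for (int c = 0; c < C; c++) {
        totalExamples += examples[c].length;
        for (double[] ex : examples[c])
            for (int i = 0; i < F; i++)
                for (int j = 0; j < F; j++)
                    cov[i][j] += (ex[i] - fBar[c][i]) * (ex[j] - fBar[c][j]);
    }
    double denom = totalExamples - C;   // the normalization factor from the slide
    for (int i = 0; i < F; i++)
        for (int j = 0; j < F; j++)
            cov[i][j] /= denom;
    return cov;
}
```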
Weights • Wcj = weight for the jth feature of the cth gesture class • Wcj = sum over i of (common covariance matrix inverted)ij * Fci • i.e., for each feature i, multiply the (i,j) entry of the inverted common covariance matrix by the average value of the ith feature for the cth gesture, and sum
Initial Weight • Initial gesture weight Wc0 = −½ * the sum over the features j of (feature weight Wcj * average feature value Fcj)
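Putting the last two slides together, a sketch of both weight computations. covInv is assumed to be the already-inverted common covariance matrix (Java has no built-in matrix inverse, so any linear algebra routine will do), and the −½ factor follows Rubine's paper:

```java
// w[c][j] = sum_i covInv[i][j] * fBar[c][i]        (feature weights)
static double[][] featureWeights(double[][] covInv, double[][] fBar) {
    int C = fBar.length, F = fBar[0].length;
    double[][] w = new double[C][F];
    for (int c = 0; c < C; c++)
        for (int j = 0; j < F; j++)
            for (int i = 0; i < F; i++)
                w[c][j] += covInv[i][j] * fBar[c][i];
    return w;
}

// w0[c] = -1/2 * sum_j w[c][j] * fBar[c][j]        (initial weights)
static double[] initialWeights(double[][] w, double[][] fBar) {
    double[] w0 = new double[w.length];
    for (int c = 0; c < w.length; c++)
        for (int j = 0; j < w[c].length; j++)
            w0[c] -= 0.5 * w[c][j] * fBar[c][j];
    return w0;
}
```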
Rubine Classification (revisited) • With the trained weights in hand, classification is as before: evaluate Vc = Wc0 + the sum over i of Wci * Fi for each gesture class c, and return the class with the largest Vc
Eliminate Jiggle • Any input point within 3 pixels of the previous point is discarded
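A minimal sketch of the jiggle filter under that 3-pixel rule (the point layout pts[k] = {x, y} is illustrative):

```java
// Drop any input point within 3 pixels of the previously kept point.
static java.util.List<double[]> eliminateJiggle(double[][] pts) {
    java.util.List<double[]> out = new java.util.ArrayList<>();
    for (double[] p : pts) {
        if (!out.isEmpty()) {
            double[] last = out.get(out.size() - 1);
            if (Math.hypot(p[0] - last[0], p[1] - last[1]) < 3)
                continue;   // jiggle: too close to the previous point, discard
        }
        out.add(p);
    }
    return out;
}
```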
Rejection Technique 1 • If the top two gestures score near each other, the result is ambiguous • Let Vi > Vj for all j != i • Estimate the probability that class i is correct as 1 / sum over j of e^(Vj − Vi) • Reject if this probability is less than .95
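One way to read that threshold, using the probability estimate from Rubine's paper (a sketch; v holds the per-class values Vc and top is the index of the winning class):

```java
// Estimated probability that the top-scoring class is correct:
// P = 1 / sum_j exp(V_j - V_top). Reject the gesture if P < 0.95.
static boolean rejectAmbiguous(double[] v, int top) {
    double sum = 0;
    for (double vj : v)
        sum += Math.exp(vj - v[top]);   // the j == top term contributes exp(0) = 1
    return 1.0 / sum < 0.95;
}
```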
Rejection Technique 2 • Mahalanobis distance • The number of standard deviations the gesture g is from the mean of its chosen class i • Reject outliers whose squared distance is too large (Rubine suggests rejecting when it exceeds ½F², for F features)
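A sketch of the squared Mahalanobis distance under the trained model, assuming covInv is the inverted common covariance matrix from training; the ½F² cutoff is the one suggested in Rubine's paper:

```java
// Squared Mahalanobis distance of feature vector f from the mean of class i.
static double mahalanobisSq(double[] f, double[] fBarI, double[][] covInv) {
    int F = f.length;
    double d2 = 0;
    for (int j = 0; j < F; j++)
        for (int k = 0; k < F; k++)
            d2 += covInv[j][k] * (f[j] - fBarI[j]) * (f[k] - fBarI[k]);
    return d2;
}
// Usage: reject if mahalanobisSq(f, fBar[i], covInv) > 0.5 * F * F.
```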
Syllabus • http://www.cs.tamu.edu/faculty/hammond/courses/SR/2006
Homework • Fix covariance matrix • Implement trainer • Data: 26 gestures; 15 examples each • Eij = Compute common covariance matrix (13*13) • Wci = Compute weights for each feature (26*13) • Wc0 = Compute initial weights for each class (26) • Build classifier • Data: 2 gestures for each letter + 8 random = 60 examples • Vc = Compute value for each gesture • Classify as the gesture with the highest value • Turn in: code, plus the classification and the value of the highest-valued gesture • For Friday: Build an optional jitter reducer, rerun your data with the jitter reducer, and comment on any differences • For Friday: Implement rejection • For Friday: Read the Chris Long paper • For Monday: Come up with your own features and compare your results.