15-451 Avrim Blum 11/25/03

  1. FFT Recap (or, what am I expected to know?) - Learning Finite State Environments 15-451 Avrim Blum 11/25/03

  2. FFT Recap The basic result: Given vectors • A = (a_0, a_1, a_2, ..., a_{n-1}), and • B = (b_0, b_1, ..., b_{n-1}), the FFT allows us in O(n log n) time to compute the convolution • C = (c_0, c_1, ..., c_{2n-2}) where c_j = a_0·b_j + a_1·b_{j-1} + ... + a_j·b_0. I.e., this is polynomial multiplication, where A, B, C are vectors of coefficients.
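To make the definition concrete, here is a minimal sketch of the convolution exactly as defined above, using the naive O(n²) double loop (the whole point of the FFT is to compute this same vector in O(n log n)):

```python
def convolve(A, B):
    """Naive O(n^2) convolution: c_j = a_0*b_j + a_1*b_{j-1} + ... + a_j*b_0.
    The FFT produces the same output vector in O(n log n)."""
    n = len(A)
    C = [0] * (2 * n - 1)
    for i, a in enumerate(A):
        for j, b in enumerate(B):
            C[i + j] += a * b   # a_i * b_j contributes to coefficient c_{i+j}
    return C

# Polynomial multiplication: (1 + 2x)(3 + 4x) = 3 + 10x + 8x^2
print(convolve([1, 2], [3, 4]))  # -> [3, 10, 8]
```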

  3. How does it work? Compute F^{-1}(F(A)·F(B)), where "F" is the FFT. • F(A) is the evaluation of A at 1, ω, ω², ..., ω^{m-1} • ω is a principal m-th root of unity, m = 2n-1. E.g., ω = e^{2πi/m}. Or use modular arithmetic. • We are able to do this quickly with divide-and-conquer. • F(A)·F(B) gives C(x) at these points. • We then saw that F^{-1} = (1/m)F′, where F′ is the same as F but using ω^{-1}.
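A sketch of the scheme just described: a recursive divide-and-conquer FFT evaluating at powers of ω = e^{2πi/m}, with the inverse running the same recursion on ω^{-1} and dividing by m. One assumption not in the slides: the radix-2 recursion needs m to be a power of two, so the code pads with zeros rather than using m = 2n-1 exactly.

```python
import cmath

def fft(a, invert=False):
    """Evaluate polynomial a at the m-th roots of unity (m = len(a), a power
    of two). With invert=True it uses w^{-1}, matching F' in the slides;
    the caller supplies the final 1/m factor for F^{-1} = (1/m) F'."""
    m = len(a)
    if m == 1:
        return a[:]
    even = fft(a[0::2], invert)   # A(x) = E(x^2) + x * O(x^2)
    odd = fft(a[1::2], invert)
    sign = -1 if invert else 1
    w = [cmath.exp(sign * 2j * cmath.pi * k / m) for k in range(m // 2)]
    return ([even[k] + w[k] * odd[k] for k in range(m // 2)] +
            [even[k] - w[k] * odd[k] for k in range(m // 2)])

def poly_multiply(A, B):
    """Compute F^{-1}(F(A) . F(B)), i.e. the coefficients of A(x)*B(x)."""
    m = 1
    while m < len(A) + len(B) - 1:
        m *= 2                    # pad to a power of two for the recursion
    fa = fft(A + [0] * (m - len(A)))
    fb = fft(B + [0] * (m - len(B)))
    pointwise = [x * y for x, y in zip(fa, fb)]   # C(x) at the m points
    prod = fft(pointwise, invert=True)
    return [round(c.real / m) for c in prod[:len(A) + len(B) - 1]]

print(poly_multiply([1, 2], [3, 4]))  # -> [3, 10, 8]
```

Rounding the real parts at the end is what makes the floating-point answer exact for integer coefficients; the slides' alternative of modular arithmetic avoids floating point entirely.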

  4. Applications (not on test) • signal analysis, lots more • Pattern matching with don't cares: • Given text string X = x_1, x_2, ..., x_n, with x_i ∈ {0..25}. • Given pattern Y = y_1, y_2, ..., y_k, with y_i ∈ {0..25} ∪ {*}. • Want to find instances of Y inside X. • Idea [Adam Kalai, based on Karp-Rabin]: • Pick random R: r_1, r_2, ..., r_k, with r_i ∈ {1..N}. E.g., N = n². • Set r_i = 0 if y_{k-i+1} = *. • Let T = r_1·y_k + ... + r_k·y_1. (Can do mod p > N.) • Now do the convolution of R and X. See if any entries match T. Each entry has at most a 1/N chance of a false positive.
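A sketch of the randomized idea above, under a couple of labeled assumptions: it works on Python strings rather than symbols in {0..25}, and it computes each convolution entry directly in O(nk) for clarity (the slides' point is that the FFT gets all entries at once in O(n log n)). The function name and interface are illustrative, not from the lecture.

```python
import random

def wildcard_match(X, Y, N=None):
    """Find likely occurrences of pattern Y (with '*' wildcards) in text X.
    Per the slides: pick random r_i in {1..N}, zero out r_i where the paired
    pattern symbol is '*', form fingerprint T, and compare convolution
    entries of R and X against T. False-positive chance <= 1/N per entry."""
    n, k = len(X), len(Y)
    if N is None:
        N = n * n                       # slides suggest N = n^2
    # r_i pairs with y_{k-i+1}; set r_i = 0 at wildcard positions
    R = [0 if Y[k - 1 - i] == '*' else random.randint(1, N)
         for i in range(k)]
    # T = r_1*y_k + ... + r_k*y_1 (wildcard terms are already zero)
    T = sum(R[i] * ord(Y[k - 1 - i]) for i in range(k)
            if Y[k - 1 - i] != '*')
    matches = []
    for p in range(n - k + 1):
        # the convolution entry corresponding to aligning Y at position p
        s = sum(R[i] * ord(X[p + k - 1 - i]) for i in range(k))
        if s == T:
            matches.append(p)
    return matches

# "ab*" should match at positions 0 ("abc") and 3 ("abd"), w.h.p.
print(wildcard_match("abcabd", "ab*"))
```

Every true occurrence is always reported, since its fingerprint equals T exactly; randomness only governs the small chance that a non-match collides.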

  5. OK, on to machine learning...
