
Randomized Algorithms



Presentation Transcript


  1. Randomized Algorithms. Pasi Fränti, 1.10.2014

  2. Treasure island. A treasure worth 20,000 awaits on the island. A map showing the correct route is for sale for 3,000. Each route attempt costs 5,000, and only one of the two routes leads to the treasure. Should the DAA expedition buy the map?

  3. To buy or not to buy. Buy the map: 20,000 – 5,000 – 3,000 = 12,000. Take a chance: 20,000 – 5,000 = 15,000 (correct route first) or 20,000 – 5,000 – 5,000 = 10,000 (wrong route first).

  4. To buy or not to buy. Buy the map: 20,000 – 5,000 – 3,000 = 12,000. Take a chance: 20,000 – 5,000 = 15,000 or 20,000 – 5,000 – 5,000 = 10,000. Expected result: 0.5 ∙ 15,000 + 0.5 ∙ 10,000 = 12,500.
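
  A quick check of the arithmetic in Python (a minimal sketch; the prices and the 0.5/0.5 route probabilities are those given on the slides):

    # Expected outcome of the two strategies (values from the slides)
    treasure   = 20000
    route_cost = 5000
    map_price  = 3000

    buy_map       = treasure - route_cost - map_price      # 12000, deterministic
    lucky         = treasure - route_cost                  # 15000, correct route first
    unlucky       = treasure - 2 * route_cost              # 10000, wrong route first
    take_a_chance = 0.5 * lucky + 0.5 * unlucky            # 12500 on average

    print(buy_map, take_a_chance)                          # 12000 12500.0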

  5. Three types of randomization • Las Vegas: the output is always a correct result, but a result is not always found; probability of success p. • Monte Carlo: a result is always found, but it can be inaccurate (or even false!); probability of success p. • Sherwood: balances the worst-case behaviour.

  6. Las Vegas

  7. Dining philosophers. Who eats?

  8. Las Vegas. Input: Binary vector A[1..n]. Output: Index of any 1-bit of A.
  LV(A, n):
    REPEAT k ← RAND(1, n)
    UNTIL A[k] = 1;
    RETURN k;
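
  A minimal runnable Python version of the routine above (a sketch; it uses 0-based indexing instead of the slide's 1-based A[1..n], and it would loop forever on an all-zero vector):

    import random

    def las_vegas(A):
        """Return the index of some 1-bit of A.
        The answer is always correct, but the number of trials is random."""
        n = len(A)
        while True:
            k = random.randrange(n)     # pick a position uniformly at random
            if A[k] == 1:
                return k

    # About half of the positions are 1, so roughly 2 trials are expected.
    print(las_vegas([0, 1, 0, 0, 1, 1, 0, 1]))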

  9. 8-Queens puzzle. INPUT: Eight chess queens and an 8×8 chessboard. OUTPUT: A placement in which no two queens attack each other.

  10. 8-Queens brute force • Brute force: try all positions, mark illegal squares, backtrack at dead-ends; 114 setups in total. • Random: select positions randomly; if a dead-end is reached, start over. • Randomized: place queens on k rows randomly, fill the remaining rows by brute force.

  11. Pseudo code
  8-Queens(k):
    FOR i = 1 TO k DO                      // k queens placed randomly
      r ← Random(1, 8);
      IF Board[i, r] = TAKEN THEN RETURN Fail;
      ELSE ConquerSquare(i, r);
    FOR i = k+1 TO 8 DO                    // remaining rows by brute force
      r ← 1; found ← NO;
      WHILE (r ≤ 8) AND (NOT found) DO
        IF Board[i, r] ≠ TAKEN THEN ConquerSquare(i, r); found ← YES;
        r ← r + 1;
      IF NOT found THEN RETURN Fail;

  ConquerSquare(i, j):
    Board[i, j] ← QUEEN;
    FOR z = i+1 TO 8 DO
      Board[z, j] ← TAKEN;
      Board[z, j-(z-i)] ← TAKEN;
      Board[z, j+(z-i)] ← TAKEN;
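
  A runnable Python sketch of the same strategy (k rows placed at random, the remaining rows filled by a left-to-right scan, restart on dead-end). The list-of-columns board encoding and the function names are my own simplification, not the slide's Board matrix:

    import random

    def attacks(cols, row, col):
        """True if (row, col) is attacked by a queen already placed in rows 0..row-1."""
        return any(c == col or abs(c - col) == row - r for r, c in enumerate(cols))

    def place_queens(k):
        """Place k queens randomly, the rest greedily; return None on a dead-end."""
        cols = []
        for row in range(8):
            if row < k:
                col = random.randrange(8)                  # random placement
                if attacks(cols, row, col):
                    return None                            # square taken -> fail
            else:
                col = next((c for c in range(8)            # first free column
                            if not attacks(cols, row, c)), None)
                if col is None:
                    return None                            # dead-end -> fail
            cols.append(col)
        return cols

    def solve(k):
        """Restart until a full placement is found (use k >= 1 so restarts differ)."""
        while True:
            result = place_queens(k)
            if result is not None:
                return result

    print(solve(4))     # a list of 8 column indices, one queen per row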

  12. Probability of success. s = processing time in case of success, e = processing time in case of failure, p = probability of success, q = 1-p = probability of failure. Expected total time: t = s + (q/p)·e. Example: s = e = 1, p = 1/6, q = 5/6, so t = 1 + 5·1 = 6.
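
  The expected-time formula behind this example follows from the standard Las Vegas analysis (a sketch using the s, e, p, q defined above): one attempt either succeeds, costing s, or fails, costing e plus a fresh restart, so

    E[t] \;=\; p\,s + q\,\bigl(e + E[t]\bigr)
    \quad\Longrightarrow\quad
    E[t] \;=\; s + \frac{q}{p}\,e .

  With s = e = 1, p = 1/6, q = 5/6 this gives E[t] = 1 + 5·1 = 6, as on the slide.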

  13. Experiments with varying k. (Figure: expected processing time for different values of k; the fastest expected time is highlighted.)

  14. Swap-based clustering

  15. Clustering by Random Swap
  P. Fränti and J. Kivijärvi, "Randomised local search algorithm for the clustering problem", Pattern Analysis and Applications, 3 (4), 358-369, 2000.
  RandomSwap(X) → C, P:
    C ← SelectRandomRepresentatives(X);
    P ← OptimalPartition(X, C);
    REPEAT T times
      (Cnew, j) ← RandomSwap(X, C);            // select a random neighbour
      Pnew ← LocalRepartition(X, Cnew, P, j);
      (Cnew, Pnew) ← Kmeans(X, Cnew, Pnew);
      IF f(Cnew, Pnew) < f(C, P) THEN          // accept only if it improves
        (C, P) ← (Cnew, Pnew);
    RETURN (C, P);
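
  A compact Python/NumPy sketch of the loop above (my own simplification: squared-error objective, a full repartition inside a plain k-means step instead of the paper's LocalRepartition, and a fixed number of k-means iterations per swap):

    import numpy as np

    def kmeans_step(X, C):
        """One k-means iteration: assign each point, then move the centroids."""
        P = np.argmin(((X[:, None, :] - C[None, :, :]) ** 2).sum(axis=2), axis=1)
        for j in range(len(C)):
            if np.any(P == j):
                C[j] = X[P == j].mean(axis=0)
        return C, P

    def sse(X, C, P):
        """Sum of squared errors of partition P with centroids C."""
        return ((X - C[P]) ** 2).sum()

    def random_swap(X, M, T=200, kmeans_iters=2, seed=0):
        rng = np.random.default_rng(seed)
        C = X[rng.choice(len(X), M, replace=False)].copy()   # random representatives
        C, P = kmeans_step(X, C)
        best = sse(X, C, P)
        for _ in range(T):
            Cnew = C.copy()
            j = rng.integers(M)                    # centroid to be replaced
            Cnew[j] = X[rng.integers(len(X))]      # random new location (a data vector)
            for _ in range(kmeans_iters):          # fine-tune by k-means
                Cnew, Pnew = kmeans_step(X, Cnew)
            f = sse(X, Cnew, Pnew)
            if f < best:                           # accept only if it improves
                C, P, best = Cnew, Pnew, f
        return C, P

    # Hypothetical usage: 1000 random 2-D points, 15 clusters.
    X = np.random.default_rng(1).random((1000, 2))
    C, P = random_swap(X, M=15)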

  16. Clustering by Random Swap. 1. Random swap: replace a randomly chosen centroid with a randomly chosen data vector. 2. Re-partition vectors from the old (removed) cluster to their nearest remaining centroids. 3. Create the new cluster: attract vectors that are closer to the new centroid than to their current one.

  17. Choices for swap. O(M) clusters to be removed × O(M) clusters where to add = O(M²) different choices in total.

  18. Probability for successful swap. Select a proper centroid for removal: • M clusters in total: p_removal = 1/M. Select a proper new location: • N choices: p_add = 1/N • M of them significantly different: p_add = 1/M. In total: • M² significantly different swaps. • Probability of each is p_swap = 1/M². • Open question: how many of these are good? • Theorem: α choices are good for both removal and addition.

  19. Clustering by Random Swap. Probability of not finding a good swap in one iteration: 1 − α²/M². Iterated T times: q = (1 − α²/M²)^T. Estimated number of iterations for failure probability q: T = ln q / ln(1 − α²/M²).

  20. Bounds for the iterations. Upper limit: using ln(1 − x) ≤ −x, T = ln q / ln(1 − α²/M²) ≤ (M²/α²)·ln(1/q). Lower limit similarly; resulting in: T = Θ((M²/α²)·ln(1/q)).

  21. Total time complexity. Time complexity of a single step: t = O(αN). Number of iterations needed: T = O((M²/α²)·ln(1/q)). Total time: t·T = O((N·M²/α)·ln(1/q)).

  22. Monte Carlo

  23. Monte Carlo. Input: A bit vector A[1..n], number of iterations I. Output: An index of a 1-bit of A (may be incorrect if no 1-bit is hit within I trials).
  MC(A, n, I):
    i ← 0;
    DO
      k ← RAND(1, n);
      i ← i + 1;
    WHILE (A[k] ≠ 1 AND i ≤ I);
    RETURN k;
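
  The same routine in Python (a sketch mirroring the Las Vegas version earlier, 0-based indexing, call with I >= 1): the running time is bounded by I trials, but the returned index may point to a 0-bit if every trial misses.

    import random

    def monte_carlo(A, I):
        """Return an index that is probably a 1-bit of A (at most I trials)."""
        n = len(A)
        for _ in range(I):
            k = random.randrange(n)
            if A[k] == 1:
                return k          # correct answer found
        return k                  # budget exhausted: possibly incorrect

    print(monte_carlo([0, 1, 0, 0, 1, 1, 0, 1], I=10))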

  24. Monte Carlo • Potential problems to be considered: • Detecting prime numbers • Calculating the integral of a function. To appear in 2014… maybe…
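
  For the "integral of a function" item, a minimal Monte Carlo integration sketch (sample-mean estimator; the function and interval are illustrative choices, not from the slides):

    import math
    import random

    def mc_integral(f, a, b, samples=100_000):
        """Estimate the integral of f over [a, b] by averaging random samples.
        An answer is always produced, but it is only approximately correct."""
        total = sum(f(random.uniform(a, b)) for _ in range(samples))
        return (b - a) * total / samples

    # Example: the integral of sin(x) over [0, pi] is exactly 2.
    print(mc_integral(math.sin, 0.0, math.pi))     # ~2.0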

  25. Sherwood

  26. Selection of pivot element. Something about Quicksort and Selection: • Practical example of re-sorting • Median selection. (Figure: a bad pivot splits the array into parts of sizes 1 and N-1, then 1 and N-2, 1 and N-3, …, giving O(N²) in the worst case.) Add material for 2014.
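
  A Python sketch of Sherwood-style pivot randomization applied to selection (quickselect): choosing the pivot at random gives expected O(N) time for every input order, avoiding the O(N²) behaviour of a fixed pivot on adversarial inputs.

    import random

    def quickselect(A, k):
        """Return the k-th smallest element of A (0-based), expected O(N) time."""
        A = list(A)
        while True:
            pivot = random.choice(A)                 # Sherwood: random pivot
            smaller = [x for x in A if x < pivot]
            equal   = [x for x in A if x == pivot]
            if k < len(smaller):
                A = smaller
            elif k < len(smaller) + len(equal):
                return pivot
            else:
                k -= len(smaller) + len(equal)
                A = [x for x in A if x > pivot]

    print(quickselect([7, 1, 5, 3, 9, 2], 2))        # 3, the third smallest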

  27. Simulated dynamic linked list • Sorted array • Search efficient: O(logN) • Insert and Delete slow: O(N) • Dynamically linked list • Insert and Delete fast: O(1) • Search inefficient: O(N)

  28. Simulated dynamic linked list: Example. Linked list containing 1, 2, 4, 5, 7, 15, 21 (in sorted order), simulated by an array of (value, next) cells; the head pointer is stored as an array index (Head = 4 in the figure).

  29. Simulated dynamic linked list: Divide-and-conquer with randomization
  SEARCH(A, x):                       // x = value searched
    i := A.HEAD;
    max := A[i].VALUE;
    FOR k := 1 TO √N DO               // √N random breakpoints
      j := RANDOM(1, N);
      y := A[j].VALUE;
      IF (max < y) AND (y ≤ x) THEN   // keep the biggest breakpoint ≤ x
        i := j; max := y;
    RETURN LinearSearch(A, x, i);     // full search from breakpoint i
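
  A runnable Python sketch of this search on an array-simulated linked list (the parallel value/next arrays and the example data layout are my own illustration, consistent with the example slide):

    import math
    import random

    def search(value, nxt, head, x):
        """Find x in a sorted linked list stored in arrays value[] and next[].
        Expected O(sqrt(N)): sample sqrt(N) random breakpoints, then walk the
        list from the largest breakpoint whose value is still <= x."""
        n = len(value)
        i, best = head, value[head]
        for _ in range(math.isqrt(n)):       # sqrt(N) random breakpoints
            j = random.randrange(n)
            if best < value[j] <= x:
                i, best = j, value[j]
        while i != -1 and value[i] < x:      # linear walk from breakpoint i
            i = nxt[i]
        return i if i != -1 and value[i] == x else -1

    # The list 1 -> 2 -> 4 -> 5 -> 7 -> 15 -> 21 stored in scrambled array order.
    value = [7, 1, 2, 4, 5, 15, 21]
    nxt   = [5, 2, 3, 4, 0, 6, -1]
    head  = 1
    print(search(value, nxt, head, 15))      # prints 5, the cell holding 15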

  30. Analysis of the search. On average: • The √N random breakpoints divide the list into √N segments. • Each segment has N/√N = √N elements. • Linear search is needed within one segment only. • Expected time complexity = √N + √N = O(√N).

  31. Experiment with students. Data (N=100) consists of the numbers 1..100. Select √N = 10 breaking points.

  32. Searching for… 42

  33. Empty space for notes
