Tips for Research: A Personal Perspective Honours Survival Course Tadao Takaoka
Tips • Standing on the shoulders of a giant • Generalization • Specialization • Horizontal shift • Competition and curiosity • Fundamental change in definition • Average-case analysis and parallelization
Generalization or extension • We generalize the current solution to k solutions; when k = 1, we recover the original problem. • Problem P → Problem k-P
History variable of depth k: is there any room for improvement in the assignment statement? • To swap the values of x and y: x = y; y = x. Is this wrong? Not totally wrong. • x = y; y = x&lt;1&gt;. This is better than w = x; x = y; y = w. • Here x is a history variable of depth 1. • At the assignment x = y, the current value of x is automatically saved to the history, and the past value can be retrieved as x&lt;1&gt;. • Trace: with y = 3 and initially x = 2, x&lt;1&gt; = 1, the assignment x = y yields x = 3, x&lt;1&gt; = 2.
Fibonacci sequence: 0 1 1 2 3 5 8 13 21 … • y = 0; x = 1; • for i = 1 to n do begin • w = x; x = x + y; y = w • end • With a history variable: x = 0; x = 1; • for i = 1 to n do x = x + x&lt;1&gt;
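Mainstream languages have no history variables, but the idea can be simulated with a small wrapper class. A minimal Python sketch (the HistoryVar name and API are my own illustration, not from the slides):

```python
class HistoryVar:
    """Sketch of a depth-1 history variable: each assignment saves
    the previous value, retrievable as .past (the slides' x<1>)."""
    def __init__(self, value):
        self.value = value
        self.past = None  # x<1>: the value before the last assignment

    def assign(self, new_value):
        self.past = self.value  # old value saved automatically
        self.value = new_value


def fib(n):
    """Fibonacci via x = x + x<1>, as on the slide (x = 0; x = 1)."""
    x = HistoryVar(1)
    x.past = 0  # initialise the history: x<1> = 0
    for _ in range(n):
        x.assign(x.value + x.past)
    return x.value


print(fib(7))  # prints 21 (sequence 1 2 3 5 8 13 21 after 7 steps)
```

The assignment statement itself carries the bookkeeping that the three-statement swap does by hand, which is the slide's point.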
k-shortest paths and k-maximum subarrays • first shortest and second shortest path • first maximum and second maximum subarray • k = 2, disjoint
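As an illustration of the k = 2 disjoint case, here is a brute-force sketch in Python: Kadane's algorithm finds one maximum subarray, and trying every split point yields the best pair of disjoint subarrays. Function names are my own, and the slides' efficient k-MSP algorithms are far more sophisticated than this O(n^2) sketch.

```python
def max_subarray(a, lo, hi):
    """Best (sum, i, j) over subarrays a[i..j] within a[lo..hi],
    by Kadane's algorithm."""
    best = (a[lo], lo, lo)
    cur, start = 0, lo
    for j in range(lo, hi + 1):
        if cur <= 0:            # a negative prefix never helps
            cur, start = 0, j
        cur += a[j]
        if cur > best[0]:
            best = (cur, start, j)
    return best


def two_disjoint_max(a):
    """k = 2 disjoint version: try every split point and take the
    best pair of maximum subarrays on either side."""
    n = len(a)
    best = None
    for split in range(1, n):
        left = max_subarray(a, 0, split - 1)
        right = max_subarray(a, split, n - 1)
        total = left[0] + right[0]
        if best is None or total > best[0]:
            best = (total, left, right)
    return best


print(two_disjoint_max([3, -4, 5, -2, 6])[0])  # prints 12 (= 3 + 9)
```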
Specialization • problem P (known) • problem P’ • Problem P’ is a specialized P • P’ is specified by a few parameters
Single-source shortest path problem with m edges and n vertices • Classical time complexity: O(m + n log n) • Introducing a third parameter: • If each edge cost is bounded by c &gt; 0, the time becomes O(m + n log c). If c is a polynomial in n, this matches the classical complexity. • If the size of each strongly connected component is bounded by k, the complexity becomes O(m + n log k). Again, if k = n, we recover the classical bound.
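For reference, the classical bound comes from Dijkstra's algorithm with an efficient priority queue. A minimal sketch using Python's heapq (a binary heap, so O((m + n) log n) rather than the Fibonacci-heap O(m + n log n); the graph encoding is my own choice):

```python
import heapq


def dijkstra(adj, source):
    """Single-source shortest paths. adj: {u: [(v, cost), ...]} with
    non-negative edge costs. Returns a dict of distances from source."""
    dist = {source: 0}
    pq = [(0, source)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry; a shorter path was found
        for v, c in adj.get(u, []):
            nd = d + c
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return dist


g = {0: [(1, 2), (2, 5)], 1: [(2, 1)], 2: []}
print(dijkstra(g, 0))  # prints {0: 0, 1: 2, 2: 3}
```

The specialized O(m + n log c) bound replaces this general-purpose heap with a structure (such as a radix heap) that exploits the bounded edge costs.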
Horizontal shift • problem P → problem P’ • Example: while → nwhile
while and nwhile • In “while” the condition is tested only at the position of the Boolean expression, whereas in “nwhile” the condition is tested anywhere in the loop; as soon as it becomes false, we exit the loop. • x = 0; while x &lt; 10 do x = x + 1 • x = 0; nwhile x &lt; 10 do x = x + 1 — no difference • x = 0; y = a; { a : positive integer } • while x &lt; y do begin x = x + 1; y = y - 1 end { x = ⌈a/2⌉, y = ⌊a/2⌋ } • x = 0; y = a; • nwhile x &lt; y do begin x = x + 1; y = y - 1 end { x = y = ⌈a/2⌉ }
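nwhile is not a construct in mainstream languages, but its effect can be simulated by re-testing the condition after each statement of the loop body. A Python sketch of the a-halving example above (function names are mine); the two variants differ only for odd a:

```python
def halve_while(a):
    """Ordinary while: the condition is tested only at the top."""
    x, y = 0, a
    while x < y:
        x += 1
        y -= 1
    return x, y  # for odd a: x = ceil(a/2), y = floor(a/2)


def halve_nwhile(a):
    """nwhile simulation: the condition is re-tested after each
    statement, so the loop exits the moment x < y becomes false."""
    x, y = 0, a
    while x < y:
        x += 1
        if not x < y:
            break  # exit mid-body, before y is decremented
        y -= 1
    return x, y  # x = y = ceil(a/2)


print(halve_while(7))   # prints (4, 3)
print(halve_nwhile(7))  # prints (4, 4)
```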
Competition with others • Example: the All-Pairs Shortest Path problem on a dense graph (APSP) • Floyd and Dijkstra: O(n^3), classical • Fredman (1976): O(n^3 (log log n / log n)^(1/3)), slightly sub-cubic • Takaoka (1992): O(n^3 (log log n / log n)^(1/2)) • Han (2004): O(n^3 (log log n / log n)^(5/7)) • Takaoka (2005): O(n^3 log log n / log n) • Han (2008): O(n^3 (log log n / log n)^(5/4)) • Chan (2007): O(n^3 (log log n)^3 / (log n)^2) • Han and Takaoka (2012): O(n^3 log log n / (log n)^2)
What is computing time? An example from pattern matching • Is it measured after the input action is completed (off-line version), or after the user has started the input action (on-line version)? • Mismatch situation: text t[1..n] is aligned with pattern p[1..m]; at a mismatch t[j] = a, p[i] = b, the pattern is shifted using the failure function h[i]. • i := 1; j := 1; • while i &lt;= m and j &lt;= n do begin • while i &gt; 0 and p[i] != t[j] do i := h[i]; • i := i + 1; j := j + 1 • end • if i &lt;= m then “not found” • else “found at” j - i + 1
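The slide's code is the Knuth-Morris-Pratt scheme with failure function h. A runnable off-line sketch in Python, using 0-based indexing (so the failure table differs slightly from the slide's 1-based h):

```python
def failure(p):
    """Failure table: h[i] = length of the longest proper border
    (prefix that is also a suffix) of p[:i+1]."""
    h = [0] * len(p)
    k = 0
    for i in range(1, len(p)):
        while k > 0 and p[i] != p[k]:
            k = h[k - 1]       # fall back along previous borders
        if p[i] == p[k]:
            k += 1
        h[i] = k
    return h


def kmp_search(t, p):
    """Return the 0-based position of the first occurrence of p in t,
    or -1 if p does not occur."""
    h = failure(p)
    i = 0                       # number of pattern characters matched
    for j, c in enumerate(t):
        while i > 0 and c != p[i]:
            i = h[i - 1]        # the slide's i := h[i]
        if c == p[i]:
            i += 1
        if i == len(p):
            return j - i + 1    # the slide's "found at j - i + 1"
    return -1


print(kmp_search("ababcabcab", "abcab"))  # prints 2
```

Because the failure function lets i fall back without ever moving j backwards, the total work is O(m + n), which is what makes the on-line variant on the next slide possible.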
Stealing the user’s time • i := 1; j := 1; r := 2; h[1] := 0; • while i &lt;= m and j &lt;= n do begin • while i &gt; 0 and p[i] != t[j] do i := h[i]; • i := i + 1; j := j + 1; • if i = r then begin • read(p[r]); • compute h[r]; r := r + 1 • end • end • if i &lt;= m then “not found” • else “found at” j - i + 1 • The computer does pattern matching with the partial information p[1..r]. When more information is needed, it waits at read(p[r]). By the time the pattern has been fully input, the computer is almost at the destination.
My Ph.D. student Sung Eun Bae: a change of course • There is a mesh algorithm of 5n - 5 steps for the APSP problem. I improved this to 3.5n steps, and challenged students to establish 3n steps to beat me. • He came to my research group to attack this problem, but it was hard, so he changed course and instead designed a mesh algorithm for the maximum subarray problem (MSP) running in 2n - 1 steps, which was the world’s first linear-time mesh algorithm for the MSP. He also designed many efficient sequential algorithms for the k-MSP. • If you spend 3 – 4 years on research, you cannot expect to follow your plan exactly.
Conclusion • Stand on your teacher’s shoulders. Trust him 99%. • If you hit too hard a wall, do not hesitate to change your course to some extent. • Even in a well-established area there may be a gold mine; people do overlook things. • Aho, Hopcroft and Ullman, “The Design and Analysis of Computer Algorithms”: there are many paradigms for good algorithms, such as divide-and-conquer, greedy, dynamic programming, and so on. The best paradigm is not to be satisfied with existing algorithms.