Amdahl's Law
Example from Wikipedia • Suppose a task is split into four consecutive parts: P1 (taking 11% of the total time), P2 (18%), P3 (23%), and P4 (48%). • Suppose further that P1 cannot be sped up, while P2 is sped up by a factor of 5, P3 by a factor of 20, and P4 by a factor of 1.6.
Improvement in speed • We will work in a time unit where the original task takes 1 unit; that time is divided among the four subtasks as 0.11 + 0.18 + 0.23 + 0.48 = 1. • Now the improved version: the 11% part still takes the same amount of time, the 18% part is divided by 5 (0.036), the 23% part by 20 (0.0115), and the 48% part by 1.6 (0.3).
Speed • The original speed was one task per unit of time. • The new time is 0.11 + 0.036 + 0.0115 + 0.3 = 0.4575, so the new speed is one task per 0.4575 units, i.e. 1/0.4575 ≈ 2.19 times the original speed: a little more than twice as fast.
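A minimal Python sketch of this arithmetic (the fractions and speedup factors are the ones from the example above; the variable names are my own):

# Four parts of the task and how much faster each one becomes.
fractions = [0.11, 0.18, 0.23, 0.48]   # share of the original run time
speedups  = [1.0,  5.0,  20.0, 1.6]    # per-part speedup factor (P1 is unchanged)

# New run time: each part's share divided by its speedup factor.
new_time = sum(f / s for f, s in zip(fractions, speedups))
overall_speedup = 1.0 / new_time

print(f"new time = {new_time:.4f}")                 # 0.4575
print(f"overall speedup = {overall_speedup:.2f}x")  # about 2.19x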
Amdahl’s law • In Amdahl’s law the process is broken into two pieces: a parallelizable fraction P and a non-parallelizable fraction (1 - P). • If there are N processors (or pipelines, or other parallel units), the parallelizable part is sped up by a factor of N, so the overall speedup is 1 / ((1 - P) + P/N).
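A short sketch of that formula as a function (amdahl_speedup is my own name for it; the formula S = 1 / ((1 - P) + P/N) is the standard statement of Amdahl's law):

def amdahl_speedup(p: float, n: int) -> float:
    """Overall speedup when a fraction p of the work is parallelizable
    and that fraction runs n times faster (e.g. on n processors)."""
    return 1.0 / ((1.0 - p) + p / n)

# Example: a task that is 90% parallelizable, run on 4 processors.
print(amdahl_speedup(0.9, 4))   # about 3.08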
Limit as N → ∞ • As N → ∞, the speedup approaches 1/(1 - P). • If a process is 90% parallelizable, the maximal improvement in speed is 1/(1 - 0.9) = 10: it can be made at most 10 times faster. • “For this reason, parallel computing is only useful for either small numbers of processors, or problems with very high values of P: so-called embarrassingly parallel problems. A great part of the craft of parallel programming consists of attempting to reduce the component (1 – P) to the smallest possible value.”
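To see that limit numerically, here is a small sketch (reusing the same amdahl_speedup helper introduced above, which is my own name) showing that with P = 0.9 the speedup climbs toward, but never reaches, 10, no matter how many processors are added:

def amdahl_speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

# With P = 0.9, the limit as n grows is 1 / (1 - 0.9) = 10.
for n in (1, 10, 100, 1000, 10_000):
    print(n, round(amdahl_speedup(0.9, n), 2))
# prints: 1 -> 1.0, 10 -> 5.26, 100 -> 9.17, 1000 -> 9.91, 10000 -> 9.99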
Reference • http://en.wikipedia.org/wiki/Amdahl's_law