Presentation Transcript


  1. http://www.eng.fsu.edu/~mpf EEL 4930 (§6) & 5930 (§5), Spring 2006: Physical Limits of Computing. Slides for a course taught by Dr. Michael P. Frank in the Department of Electrical & Computer Engineering

  2. Overview of First Lecture • Course Introduction: • Moore’s Law vs. Known Physics • Mechanics of the course: • Course website • Books / readings • Topics & schedule • Assignments & grading policies • misc. other administrivia M. Frank, Physical Limits of Computing, Spr. '06

  3. Physical Limits of Computing, Introductory Lecture: Moore’s Law vs. Known Physics

  4. Moore’s Law vs. Known Physics Outline of mini-lecture: • Moore’s Law and Related Trends • Status of Known Physics in the Modern Era • Energy Efficiency and Performance Limits • New Paradigms for More Efficient Computing • Future Computing Technologies M. Frank, Physical Limits of Computing, Spr. '06

  5. Moore’s Law • Moore’s Law proper: • Trend of doubling of number of transistors per integrated circuit every 18 (later 24) months • First observed by Gordon Moore in 1965 (see readings) • “Generalized Moore’s Law” • Various trends of exponential improvement in many aspects of information processing technology (both computing & communication): • Storage capacity/cost, clock frequency, performance/cost, size/bit, cost/bit, energy/operation, bandwidth/cost … M. Frank, Physical Limits of Computing, Spr. '06
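
As a rough illustration of the exponential trend described above (added here, not part of the original slides), the following minimal Python sketch models Moore's-Law-style doubling; the 1971 Intel 4004 baseline of ~2,300 transistors and the doubling period are illustrative assumptions, not figures from the slides.

    # Illustrative sketch of Moore's-Law-style exponential growth (assumed baseline values).
    def transistors_per_chip(year, base_year=1971, base_count=2300, doubling_years=2.0):
        """Estimate transistor count assuming a fixed doubling period (hypothetical model)."""
        return base_count * 2 ** ((year - base_year) / doubling_years)

    for year in (1971, 1985, 2000, 2006):
        print(year, f"{transistors_per_chip(year):.3g}")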

  6. Moore’s Law (Devices/IC) [Figure: devices per integrated circuit over time, from early Fairchild ICs through Intel microprocessors.] M. Frank, Physical Limits of Computing, Spr. '06

  7. Microprocessor Performance Trends [Figure: raw technology performance (gate ops/sec/chip) rising ~55%/year. Source: Hennessy & Patterson, Computer Architecture: A Quantitative Approach, 3rd edition; added performance analysis based on data from the ITRS 1999 roadmap.] M. Frank, Physical Limits of Computing, Spr. '06

  8. Super-Exponential Long-Term Trend [Figure: ops/second per $1,000 over time. Source: Kurzweil ’99.] M. Frank, Physical Limits of Computing, Spr. '06

  9. Known Physics: • The history of physics has been a story of: • Ever-increasing precision, unity, & explanatory power • Modern physics is very close to perfection! • All accessible phenomena are exactly modeled, as far as we know, to the limits of experimental precision, which is ~11 decimal places today. • However, the story is not quite complete yet: • There is no experimentally verified theory unifying GR & QM (so far). String theory? M-theory? Loop quantum gravity? Other? M. Frank, Physical Limits of Computing, Spr. '06

  10. Fundamental Physical Limits of Computing [Diagram: thoroughly confirmed physical theories (Theory of Relativity, Quantum Theory) imply universal facts (speed-of-light limit, uncertainty principle, definition of energy, reversibility, 2nd law of thermodynamics, adiabatic theorem, gravity), which in turn constrain quantities in information processing (communications latency, information capacity, information bandwidth, memory access times, processing rate, energy loss per operation).] M. Frank, Physical Limits of Computing, Spr. '06

  11. Device Size Scaling Trends [Figure: based on ITRS ’97-’03 roadmaps; naïve linear extrapolations of effective gate oxide thickness, with reference length scales marked: 1 µm, virus, protein molecule, DNA/CNT radius, silicon atom, hydrogen atom.] M. Frank, Physical Limits of Computing, Spr. '06

  12. Trend of Min. Transistor Switching Energy [Figure: based on ITRS ’97-’03 roadmaps; minimum switching energy on a log scale from fJ through aJ to zJ, plotted against technology node numbers (nm DRAM half-pitch), with a naïve linear extrapolation and a marked practical limit for CMOS.] M. Frank, Physical Limits of Computing, Spr. '06

  13. Implications of Energy Limits • If the limits on energy dissipation of irreversible operations can’t possibly be circumvented, this implies: • The number of low-level digital operations we can perform per unit of energy dissipation is limited. • Digital system performance per unit of power consumption is limited. • This could have deleterious long-term effects, including: • Braking of growth in the electronics industry • Stagnation of the world’s “information economy” • Perhaps even an eventual end to all life in the universe! • Therefore, we have some very strong motivations for finding ways to circumvent these limits! • How to accomplish this is a big part of what this course is about. M. Frank, Physical Limits of Computing, Spr. '06

  14. What is entropy? • First characterized by Rudolf Clausius in 1850. • Originally defined simply as marginal heat ÷ temperature (dS = δQ/T). • Noted to never decrease in thermodynamic processes. • Its significance and physical meaning were mysterious. • In the ~1880s, Ludwig Boltzmann proposed that entropy S is the logarithm of a system’s number N of states, S = k ln N • What we would now call the information capacity of a system • Holds for systems at equilibrium, in the maximum-entropy state • The modern understanding that emerged from 20th-century physics is that entropy is indeed the amount of unknown or incompressible information in a physical system. • Important contributions to this understanding were made by von Neumann, Shannon, Jaynes, and Zurek. M. Frank, Physical Limits of Computing, Spr. '06
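
A minimal numerical illustration of S = k ln N (an addition, not part of the original slides); the microstate count N below is a made-up example value chosen only to show the unit conversion to bits.

    import math

    k_B = 1.380649e-23  # Boltzmann's constant, J/K

    def boltzmann_entropy(num_microstates):
        """Entropy S = k ln N for an equilibrium system with N equally likely microstates."""
        return k_B * math.log(num_microstates)

    N = 2 ** 40                      # hypothetical microstate count, for illustration only
    S = boltzmann_entropy(N)
    print(f"S = {S:.3e} J/K  =  {S / (k_B * math.log(2)):.1f} bits")   # 40 bits, as expected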

  15. Von Neumann / Landauer (VNL) bound for bit erasure • The von Neumann-Landauer (VNL) lower bound for energy dissipation from bit erasure: • First alluded to by John von Neumann in a 1949 lecture • Developed more explicitly by Rolf Landauer (IBM) in 1961. • “Oblivious” erasure/overwriting/forgetting of a known logical bit really just moves the information that the bit previously contained to the environment • We lose track of that information and so it becomes entropy. • Leads to fundamental limit of kT ln 2 for oblivious erasure. • This particular limit could only possibly be avoidable through reversible computing. • Reversible computing “de-computes” unwanted bits, rather than obliviously erasing them! • This can avoid entropy generation, enabling the signal energy to be preserved for later re-use, rather than being dissipated. M. Frank, Physical Limits of Computing, Spr. '06
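
For a sense of scale (added here, not from the slides), the kT ln 2 bound can be evaluated numerically; room temperature T = 300 K is an assumed value.

    import math

    k_B = 1.380649e-23   # Boltzmann's constant, J/K
    T = 300.0            # assumed room temperature, K
    E_VNL = k_B * T * math.log(2)
    print(f"kT ln 2 at {T} K = {E_VNL:.3e} J = {E_VNL / 1.602176634e-19:.3f} eV")
    # ~2.87e-21 J, roughly 0.018 eV, per obliviously erased bit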

  16. Illustration of VNL Principle • Either of 2 digital states is initially encoded by any of N possible physical microstates • Illustrated as 4 in this simple example (the real number would usually be much larger) • Initial entropy (given the digital state) S = log[#microstates] = log 4 = 2 bits. • Now, suppose some mechanism resets the digital state to 0 regardless of what it was before. • Reversibility of physics ensures this “bit erasure” operation can’t possibly merge two microstates, so it must double the number of possible microstates consistent with the digital state! • Entropy S = log[#microstates] increases by log 2 = 1 bit = (log e)(ln 2) = kB ln 2 (here kB plays the role of log e, the natural-log unit of entropy). • To prevent entropy from accumulating locally, it must be expelled into the environment. [Diagram: microstates representing logical “0” and logical “1”; entropy S = log 4 = 2 bits for each digital state before erasure, S′ = log 8 = 3 bits after, so ∆S = S′ − S = 3 bits − 2 bits = 1 bit.] M. Frank, Physical Limits of Computing, Spr. '06
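
A small numerical restatement of the counting argument above (added for illustration; the 4-microstates-per-digital-state figure follows the slide's own simple example).

    import math

    def entropy_bits(num_microstates):
        """Entropy in bits, S = log2(#microstates), for equally likely microstates."""
        return math.log2(num_microstates)

    micro_per_digital_state = 4                               # as in the slide's example
    S_before = entropy_bits(micro_per_digital_state)          # given the digital state: log2 4 = 2 bits
    # Erasing the digital bit maps both digital states onto "0" without merging microstates,
    # so the microstates consistent with "0" double from 4 to 8.
    S_after = entropy_bits(2 * micro_per_digital_state)       # log2 8 = 3 bits
    print(f"Delta S = {S_after - S_before} bit")              # 1 bit, expelled as k_B ln 2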

  17. Reversible Computing • A reversible digital logic operation is: • Any operation that performs an invertible (one-to-one) transformation of the device’s local digital state space. • Or at least, of that subset of states that are actually used in a design. • Landauer’s principle only limits the energy dissipation of ordinary irreversible (many-to-one) logic operations. • Reversible logic operations could dissipate much less energy, • Since they can be implemented in a thermodynamically reversible way. • In 1973, Charles Bennett (IBM Research) showed how any desired computation can in fact be performed using only reversible logic operations (with essentially no bit erasure). • This opened up the possibility of a vastly more energy-efficient alternative paradigm for digital computation. • After 30 years of (sporadic) research, this idea is finally approaching the realm of practical implementability… • Making it happen is the goal of the RevComp project. M. Frank, Physical Limits of Computing, Spr. '06
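
To make "invertible (one-to-one) transformation of the digital state space" concrete, here is a short added Python sketch (not from the slides) using the Toffoli (controlled-controlled-NOT) gate, a standard universal reversible gate, and checking that it is a bijection on the 3-bit state space.

    def toffoli(a, b, c):
        """Toffoli (CCNOT) gate: flips c iff a and b are both 1; a classic reversible gate."""
        return a, b, c ^ (a & b)

    states = [(a, b, c) for a in (0, 1) for b in (0, 1) for c in (0, 1)]
    images = [toffoli(*s) for s in states]
    assert sorted(images) == sorted(states)                    # one-to-one: no two states merge
    assert all(toffoli(*toffoli(*s)) == s for s in states)     # self-inverse, hence invertible
    print("Toffoli is a bijection on the 8 three-bit states")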

  18. How Reversible Logic Avoids the von Neumann-Landauer Bound • We arrange our logical manipulations to never attempt to merge two distinct digital states, • but only to reversibly transform them from one state to another! • E.g., illustrated is a reversible operation “cCLR” (controlled clear) • Non-oblivious “erasure” • It and its inverse (cSET) enable arbitrary logic! [Diagram: the four two-bit digital states ab = 00, 01, 10, 11 laid out on axes a = 0/1 and b = 0/1, with cCLR shown as a reversible transition between states.] M. Frank, Physical Limits of Computing, Spr. '06
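
An added sketch under an assumed reading of the slide's diagram: take cCLR to mean "clear b to 0 when the control a is 1", which is one-to-one as long as it is only applied to states where a = 1 implies b = 1 (i.e., b is a known copy being de-computed), and take cSET as its inverse. These exact semantics are my assumption, not stated on the slide.

    def cCLR(a, b):
        """Controlled clear: if a == 1, clear b to 0 (assumed semantics for the slide's example)."""
        return a, 0 if a else b

    def cSET(a, b):
        """Controlled set: if a == 1, set b to 1; inverse of cCLR on the states actually used."""
        return a, 1 if a else b

    # Restrict to the states a design would actually use before cCLR: a == 1 implies b == 1.
    used_states = [(0, 0), (0, 1), (1, 1)]
    images = [cCLR(*s) for s in used_states]
    assert len(set(images)) == len(used_states)                # no two used states merge
    assert all(cSET(*cCLR(*s)) == s for s in used_states)      # cSET undoes cCLR
    print("cCLR is one-to-one on the used states; cSET recovers them:", images)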

  19. Potential Cost-Efficiency Benefits • Scenario: $1,000 / 3-year, 100-Watt conventional computer, vs. reversible computers with the same capacity. [Figure: bit-operations per US dollar over time for conventional irreversible computing, worst-case reversible computing (~1,000× advantage), and best-case reversible computing (~100,000× advantage). All curves would → 0 if leakage is not reduced.] M. Frank, Physical Limits of Computing, Spr. '06
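
A rough added sanity check of the scales involved (my arithmetic and assumptions, not the slide's curves): the total energy budget of the stated 100 W / 3-year scenario, and the ceiling on bit-operations per dollar if every operation dissipated exactly kT ln 2 at an assumed 300 K.

    import math

    k_B, T = 1.380649e-23, 300.0                     # J/K, assumed room temperature
    power_W, years, cost_usd = 100.0, 3.0, 1000.0    # scenario figures from the slide

    energy_J = power_W * years * 365.25 * 24 * 3600
    ops_at_landauer = energy_J / (k_B * T * math.log(2))
    print(f"Energy over 3 years: {energy_J:.2e} J")
    print(f"Landauer-limited ceiling: {ops_at_landauer:.2e} bit-ops, "
          f"{ops_at_landauer / cost_usd:.2e} bit-ops per dollar")
    # Real (irreversible) logic dissipates far more than kT ln 2 per operation,
    # which is where the large potential reversible-computing advantage comes from.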
