Addressing the Funding Gap in Energy-Efficient Computing: Research Overview and Program Management Philosophy • By Michael P. Frank • Presented to the National Science Foundation, Directorate for Computer & Information Science & Engineering, Computer & Communication Foundations (CCF) Division • Monday, July 10, 2006 M. Frank, NSF/CISE/CCF job talk
Overview of Talk • Motivation: • The Looming Energy Efficiency Crisis in Computing • and the related Funding Gap between government & industry • The Science: • Why something called Reversible Computing is really “Our Only Hope” for solving the problem • And why we need to start major research on it now! • Why I’m Here: • Convey my vision of CCF, the EMT program and how the field of Reversible Computing fits into them • Ideas on how I would help run the EMT program M. Frank, NSF/CISE/CCF job talk
Motivation The Coming Crisis in Computer Energy Efficiency M. Frank, NSF/CISE/CCF job talk
Major Motivation of my Work:The Energy Efficiency Crisis • The bulk of past improvements in practical computer performance have been fundamentally enabled by steady improvements in the energy efficiency of computation… • Defined as the number of useful computational operations performed per unit of available energy dissipated into the form of waste heat • Unfortunately, an end to the past trend of steady energy efficiency improvements is now clearly within sight… • Designs at many levels (devices, circuits, architectures, algorithms) for conventional computing are rapidly converging towards optimal design-point asymptotes, within a few-decade time-frame • Beyond which substantial further progress will not be possible, at least not within the conventional classical, irreversible computing paradigm • To circumvent the crisis, a radical paradigm shift in our models and structures for computation is required! • I will show why reversible computing will be an essential part of this. M. Frank, NSF/CISE/CCF job talk
Computing’s Rapid Climb • The raw performance & efficiency characteristics of our information processing technologies (computing, storage, communication) have been improving at a steady, exponentially increasing rate over time, for at least the past 50 years… • Due to “Moore’s Law” (integration scale of electronics doubles every 1-2 years) and related technology trends • Performance trends also span multiple pre-IC technologies (vacuum tubes, relays, etc.) going back ~100 years or more • Each generation of performance improvements has reliably led to significant new information-processing applications becoming practicable… M. Frank, NSF/CISE/CCF job talk
Substantial Societal Impact • Economic measures of the nation’s (& world’s) economy, such as GDP, per-capita income, and standard of living have also improved exponentially (although at slower rates) over this same period… • It’s clear that a substantial portion of these gains was made possible by the introduction of new IT applications, itself made possible by raw technology improvements • Nearly every major industry today has relied on digital/ electronic technologies for a substantial portion of the productivity gains it has made over the last few decades • Effected either directly, or indirectly through its suppliers M. Frank, NSF/CISE/CCF job talk
These historical observations raise an important concern… • We can arguably expect that the future rate of growth of the entire world economy will substantially depend on future trends in information technology efficiency… • I.e., will our raw technology capabilities flatten out, continue improving steadily, or accelerate even faster than before? [Figure: log efficiency vs. time in decades, with curves diverging at "now" toward flattening, steady, or accelerating growth.] M. Frank, NSF/CISE/CCF job talk
But, a Severe Problem… • The energy efficiency (useful operations performed per unit energy dissipated) of all conventional information processing technologies will flatten out within the next few decades… • This is true for fundamental and absolutely irrefutable physical reasons! (To be discussed) • As a consequence, the cost efficiency (ops performed per unit cost) and thus practical performance (e.g., FLOPS per dollar of annual operating budget) of systems will also flatten! • This is assuming only that the economic cost of energy will not soon enter a new era of rapid exponential decay… • Which seems unlikely since, at present, energy costs are rising • If this “flattening” happens, it can be expected to have a substantial braking effect on the entire world economy! • This would be an extremely negative outcome, which we should try our best to avoid at all costs… M. Frank, NSF/CISE/CCF job talk
Why Energy Efficiency of Conventional Computing Must Flatten • The potential energy efficiency gains from all conventional sources are limited… For example: • Decrease logic signal energy by lowering logic voltages • This has already reached a practical limit on the order of ~1 V; going to much lower voltages leads to excessive FET energy leakage • Also, signal energy is subject to thermodynamic limits to be discussed • Eliminate speculative execution and other unnecessary CPU activity • Soon, energy dissipation becomes dominated by "necessary" activity • Turn off functional units when not in use, to avoid unnecessary power dissipation from leakage currents • Soon, power is dominated by active switching in the units that are in use • Replace algorithms for general-purpose CPUs with FPGA configurations or special-purpose architectures: • This is quite helpful, but typically yields at most ~100× savings • Find new high-level algorithms that require fewer total operations • This is great when possible, but as our algorithms improve, significantly better algorithms become harder and harder to find M. Frank, NSF/CISE/CCF job talk
Trend of Minimum Transistor Switching Energy • [Figure: CV²/2 gate energy in joules (zJ through fJ scale) vs. ITRS node numbers (nm DRAM half-pitch), based on data from the International Technology Roadmaps for Semiconductors; shows the historical trendline alongside more conservative industry targets.] M. Frank, NSF/CISE/CCF job talk
An Urgent Scientific Need • Given the above considerations, I would say that one of the most important basic research issues that our society needs the field of computer science & engineering to address is to find a definitive answer to the following question: • Can the introduction of new alternative, unconventional computing paradigms (such as reversible, quantum, and bio-inspired computing) realistically prevent or forestall the "flattening" of the information technology curve? • And if so, how exactly can this work? • My vision is that answering this question should be a primary scientific mission of the EMT program. • Although other applications are also important… M. Frank, NSF/CISE/CCF job talk
The Science Why Reversible Computing is Our “Last, Great Hope” for Continuing to Improve Computing Indefinitely M. Frank, NSF/CISE/CCF job talk
The von Neumann-Landauer (VNL) Bound • Physical theorem: To lose, obliviously erase, or otherwise irreversibly forget 1 bit's worth of known information requires the eventual dissipation of at least kBT ln 2 of free energy as heat into an external environment at some temperature T. • kB here is Boltzmann's constant, 1.38×10−23 J/K, in energy/temperature units • First alluded to by John von Neumann, 1949; clarified and proven by Rolf Landauer, 1961. M. Frank, NSF/CISE/CCF job talk
A simple proof of the VNL bound • Here's a simple proof, from basic thermodynamic facts known for more than 100 years! • If known information becomes unknown, this is (by definition) an increase of entropy. • Because entropy is simply unknown physical information. • And, all information that is accessible to us is physical information anyway. • Standard units of information and entropy are simply logarithmic units: • 1 bit = log 2 = λb.log_b 2 (an indefinite-logarithm object); Boltzmann's constant kB = log e • Therefore, in units of Boltzmann's constant, 1 bit = kB(log 2 / log e) = kB ln 2 • Thus, the loss (forgetting) of 1 bit is, by definition, the very same thing as an increase of entropy by the amount kB ln 2. • Once entropy is created, it can never be destroyed (2nd law of thermodynamics) • This follows from the micro-scale reversibility of the basic laws of (today, quantum) mechanics • As entropy builds up in a system, its temperature rises. • To operate sustainably without eventual meltdown, the entropy generated must be expelled to an external environment. • To add entropy S to an environment at temperature T requires adding energy E = ST to that environment; this is the very definition of thermodynamic temperature! • Thus, to forget a bit (i.e., permanently expel it into the environment) requires that we eventually, permanently commit energy kBT ln 2 to the environment (as heat). M. Frank, NSF/CISE/CCF job talk
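As a concrete check on the arithmetic above, here is a minimal Python sketch evaluating the bound at room temperature (T = 300 K is an assumed value; the constants are standard):

```python
# Evaluate the VNL bound E = k_B * T * ln 2 at room temperature.
import math

k_B = 1.380649e-23          # Boltzmann's constant, J/K
T = 300.0                   # assumed room temperature, K

E_joules = k_B * T * math.log(2)
E_meV = E_joules / 1.602176634e-19 * 1000   # convert J -> meV

print(f"VNL bound at {T:.0f} K: {E_joules:.3e} J = {E_meV:.1f} meV per bit erased")
# ~2.871e-21 J, i.e. about 17.9 meV -- the "~18 meV" figure cited later in the talk.
```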
An Essential Element of Future Paradigms: Reversible Computing • Basic idea: (R. Landauer, 1961 & C. Bennett, 1973) • Fundamental physics suggests that in principle there is no limit to the energy efficiency of computing technologies, although this is true only to the extent that we avoid performing irreversible operations that discard information during the computing process… • But, it seems that with sufficient engineering effort, we can in principle approach, as closely as we care to, the limit of a reversible computer that discards no information and dissipates no energy • Our practical aim is not zero energy, just continued steady reductions! • Present status of reversible computing: • Potential advantages/tradeoffs are reasonably well understood • Models & early prototypes exist, but no practical systems yet • Of interest to other clusters: Implementing this notion would eventually impact computer engineering & CS at all levels! • From low-level physical device requirements up through circuit design, theory, architecture, languages, & algorithms… M. Frank, NSF/CISE/CCF job talk
Irreversible vs. Reversible Digital Operations • A typical irreversible digital operation: • Regardless of the previous digital contents x of some circuit node or memory cell, destructively overwrite it with a given new value y. • A closely corresponding, but reversible operation: • Reversibly transform the old physical state representing x "in place" into a new state representing the new value y. • The semantic difference is that the 2nd op can only be done if the old value x is "known"… • This means it can be reconstructed from the new value y together with other available information. • This restricts the kinds of replacements that can be done reversibly; • e.g., we can't reversibly replace two bits a,b with the product ab and 1 other bit [Figure: the irreversible overwrite discards x into a "bit bucket"; the reversible version transforms x in place into y.] M. Frank, NSF/CISE/CCF job talk
Simple Electronic Implementations • Irreversible CLEAR (set to 0) operation: • Without knowing whether there is charge on node N, connect it to ground (the logic-0 reference level) • The stored information is lost, and the entire associated node energy E is dissipated to heat! • The node, charged with an amount E of electrostatic energy, discharges suddenly through the path resistance; all info & energy are fully lost • Reversible "CLEAR" (change from 1 to 0): • Given that N contains a 1, we connect it to a variable source that ramps from 1 to 0 over a time t > tc • Only a fraction tc/t of the node energy E is dissipated, where tc = 2RC is a time constant • R = resistance of the path; C = capacitance of the node • The charge Q = (2EC)^(1/2) flows out in a controlled way over time t, with dissipation Ediss = I²Rt = Q²R/t = E(2RC/t) (adiabatic charge transfer) M. Frank, NSF/CISE/CCF job talk
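To make the contrast concrete, here is a minimal Python sketch comparing the full-E loss of the irreversible CLEAR with the E·(2RC/t) loss of the adiabatic ramp; the R, C, and voltage values are illustrative assumptions, not figures from the talk:

```python
# Irreversible CLEAR dissipates the full node energy E = CV^2/2, regardless of speed.
# Adiabatic CLEAR over ramp time t dissipates only E_diss = Q^2*R/t = E*(2RC/t).
R = 10e3        # assumed switch/path resistance, ohms
C = 1e-15       # assumed node capacitance, farads
V = 1.0         # assumed logic-1 voltage, volts

E = 0.5 * C * V**2        # electrostatic energy stored on the node
tc = 2 * R * C            # time constant t_c = 2RC

for t in (tc, 100 * tc, 10000 * tc):
    E_adiabatic = E * (tc / t)
    print(f"t = {t:.1e} s: irreversible loses {E:.2e} J, adiabatic loses {E_adiabatic:.2e} J")
# Ramping 10,000x slower than t_c recovers 99.99% of the node energy; the
# irreversible CLEAR pays the full E no matter how slowly it is performed.
```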
Simulation Results (Cadence/Spectre) • 2LAL = Two-level adiabatic logic (invented at UF, '00) • Graph shows power dissipation vs. frequency in an 8-stage shift register. • At moderate frequencies (1 MHz), reversible uses < 1/100th the power of irreversible! • At ultra-low power (1 pW/transistor), reversible is 100× faster than irreversible! • Minimum energy dissipation per nFET is < 1 eV! • 500× lower than the best irreversible! • 500× higher computational energy efficiency! • Energy transferred is still ~10 fJ (~100 keV) • So, energy recovery efficiency is 99.999%! • Not including losses in the power supply, though [Figure: log-log plot of energy dissipated per nFET per cycle (100 yJ up to 1 nJ), comparing standard CMOS at 1.8–2 V against 2LAL at 0.25–2 V, with the kT ln 2 limit marked.]
Reversible and/or Adiabatic VLSI Chips Designed @ MIT, 1996-1999 By EECS grad students Josie Ammer, Mike Frank, Nicole Love, Scott Rixner, and Carlin Vieri under CS/AI lab members Tom Knight and Norm Margolus. M. Frank, NSF/CISE/CCF job talk
Some Important Results in Reversible Computing So Far • Landauer (IBM) 1961: • The von Neumann limit of kT ln 2 energy dissipation per bit operation only holds for irreversible operations. • Lecerf 1963, Bennett (IBM) 1973: • Computers that use only reversible operations are still Turing universal. • Fredkin & Toffoli (MIT), 1980: • Reversible computers can be implemented in an idealized classical physical model. • Feynman (Caltech), 1982: • Reversible computers can be implemented in a simple quantum physical model. • This paper eventually spawned the field of quantum computing • Younis & Knight (MIT), 1993: • Pipelined, sequential logic circuits can be implemented in fully-reversible CMOS. • This paper helped to spawn the field of adiabatic circuits • MIT Pendulum Project (Ammer, Frank, Knight, Love, Margolus, Rixner, Vieri), 1994-1999: • Designed & implemented fully reversible programmable circuits, general-purpose RISC architectures, high-level programming languages, and algorithms for a wide variety of classical CS problems • Frank (MIT) 1997-1999: • When physical constraints are accounted for, reversible computers offer asymptotically lower energy, cost, and time complexity for broad classes of problems than conventional machines. • Frank (UF) 2000-2002: • The advantages of reversible computing over conventional computing increase as small polynomials of the underlying technology characteristics… The trends show reversible winning within decades for machines at usual scales M. Frank, NSF/CISE/CCF job talk
Important Open Research Challenges in Reversible Computing • Fundamental research on practicability of reversible computing: • (Physics) Can we invent post-transistor devices with lower leakage and energy coefficients? • This research requires cross-disciplinary collaboration with physicists • (Engineering) Can we tailor physical mechanisms to precisely execute complex trajectories (computations) with high energy-recovery efficiency? • E.g. efficient resonators and power-clock distribution systems driving adiabatic logic. Collaboration with extremely skilled EEs is needed • (Structures) Can we design mostly-reversible architectures with low overhead for practical special-purpose applications, at least? • Existing general-purpose reversible architectures are highly suboptimal • (Theory) Can we reversibly emulate general irreversible algorithms with less space-time complexity overhead than presently known? • Oracle-based results suggest not, but more work is needed M. Frank, NSF/CISE/CCF job talk
The Funding Gap in Energy-Efficient Computing • As a proposal writer, I've found that reversible computing falls into a rather awkward, in-between position… • Because it aims to help a broad range of practical applications, and is well-motivated by basic physics, many scientists who evaluate RC proposals say it seems "too practical" to receive basic research funding; they expect its development to be funded by industry. • Yet, because RC is high-risk, very disruptive, and will probably take much longer than industry's traditional ~10-year lab-to-fab time lag to develop and broadly adopt, industry has largely ignored it in favor of more short-term approaches to saving energy • The major risk that society faces in allowing this funding gap to persist is that if industry steps in too late, then workable, practical implementations of RC might not be ready in time to prevent performance growth from stalling… • If there is even a brief stall, the loss of momentum could breed pessimism and choke off industry's will to continue innovating… M. Frank, NSF/CISE/CCF job talk
Why I’m Here My vision of CCF, EMT, and how I and my field fit into it M. Frank, NSF/CISE/CCF job talk
Areas Covered by CCF • Emerging Models and Technologies (EMT) • Paradigms: Nanocomputing, quantum computing, biologically inspired computing… • I would add reversible computing to this list… • Foundations of Computing Processes & Artifacts (FCPA) • Structures: Programming languages, computer architecture, VLSI design… • Theoretical Foundations (TF) • Theory: Models of computation, complexity, parallelism, algorithms, information theory… M. Frank, NSF/CISE/CCF job talk
Some Highlights of My Related Educational Background • Early exposure to nanotech/nanocomputing concepts • Nanotechnology course, K. Eric Drexler, Stanford, 1988 • Solid general background in CS theory & AI • BS in Symbolic Systems, Stanford, 1991 • MS in EECS on Decision-Theoretic techniques in AI, MIT, 1994 • Ph.D. proposal on DNA-based computing • MIT Lab for CS, ’94-‘95 • Fairly early exposure to Quantum Computing • Reviewed the field for MIT EECS Ph.D. area exam, 1995 • Ph.D. minor in conventional CMOS VLSI design • Designed & had fabbed several chips, for courses & Ph.D. work • Ph.D. work on Reversible Computing • Included development of nanocomputing models, complexity theory, architectures, programming languages, & VLSI design M. Frank, NSF/CISE/CCF job talk
What I See As Some General Research Questions Behind EMT • What are the fundamental physical limits of present & future information processing technologies? • As opposed to the more abstract, algorithmic kinds of limits addressed by traditional theoretical CS • What fundamental changes to our underlying models/paradigms of computation may we need in order to fully harness emerging technologies? • New models based on physics (or chemistry, biology?) • How can practical considerations help to guide our exploration of the emerging technology concepts? • E.g., concerns with (at least estimates of) real-world cost, performance, energy efficiency, reliability, ease of use… M. Frank, NSF/CISE/CCF job talk
Some Cross-Cutting Questions to other areas of CCF • Cross-cutting to FCPA cluster: • What would the emergence of new computing paradigms require in terms of new architectures, programming languages, & HW design tools? • Cross-cutting to TF cluster: • What impacts do emerging technologies have on theoretical CS areas such as models of computation, complexity theory, algorithm design, and parallel computing? M. Frank, NSF/CISE/CCF job talk
What are the Fundamental Physical Limits of Computing? • Fundamental laws of physics impose a variety of universal limits that hold true in all physically possible information processing technologies: • Thermodynamic von Neumann/Landauer (VNL) lower bound of kT ln 2 (~18 meV at room temperature) on energy dissipated per known bit that is discarded into a temperature-T environment. • However, this one could be avoided via reversible computing • Quantum performance limit (Margolus-Levitin bound) of at most a rate 2E/h (h=Planck’s constant) of ‘useful’ bit operations in any device with an active energy of E. • This limit applies even to reversible & quantum computers! • There are also fundamental physical limits on information density and bandwidth, but I won’t get into those here… M. Frank, NSF/CISE/CCF job talk
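For a sense of scale, here is a minimal Python sketch evaluating the Margolus-Levitin rate limit of 2E/h; the 1 eV active energy is an illustrative assumption, not a figure from the talk:

```python
# Margolus-Levitin bound: at most 2E/h 'useful' bit operations per second
# in a device with active energy E above its ground state.
h = 6.62607015e-34            # Planck's constant, J*s
eV = 1.602176634e-19          # 1 electron-volt in joules

E_active = 1.0 * eV           # assumed active energy of the device
max_rate = 2 * E_active / h
print(f"Max op rate at E = 1 eV: {max_rate:.2e} ops/s")   # ~4.8e14 ops/s (~480 THz)
```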
New Paradigms for Computing • Reversible computing aims to directly circumvent the energy efficiency problem through the use of energy-conserving physical mechanisms for information processing… • Quantum computing aims for dramatic algorithmic improvements for some types of problems, using ‘shortcuts through state space’ made possible by nonclassical operations • Bio-inspired computing broadly includes: • In vivo biological computing, e.g., bacteria genetically engineered to incorporate custom gene expression regulation networks • In vitro biochemistry-based computing such as DNA computing and related approaches • “In silico” but still biologically-inspired techniques such as digital & analog neural networks, other analog approaches, “neuromorphic” computing, etc… M. Frank, NSF/CISE/CCF job talk
New Paradigms in Relation to What I see as EMT’s Mission • Bio-inspired computing is interesting, but generally incapable of superseding the limits of conventional technology by very much… • All realistic bio-inspired approaches could be simulated by conventional parallel digital machines with (at most) modest constant-factor overheads… • The motivation for bio-inspired computing must come from other directions… • Quantum computing is nice if it can be made to work, but as far as we know, it is limited in its applicability to relatively narrow classes of problems (e.g., hidden subgroup, modest gains for search)… • Its potential economic impact is therefore only a small fraction of that for all leading-edge computing in general • Research that aims to broaden its applicability is potentially worthwhile • Reversible computing is the only unconventional paradigm that might possibly break down the roadblocks to indefinite future improvement of computer efficiency and practical performance in general applications… • Its future economic value is thus potentially unlimited… • However, it is difficult to do, and still in its infancy! Much research is needed. M. Frank, NSF/CISE/CCF job talk
Some Other Motivations for Paradigms Covered by EMT • Bio-inspired computing: • In vivo computing: Self-reproducing, self-organizing microbial systems for various clinical or industrial applications • In vitro computing: Self-assembly of nanostructures • Neural networks: Applications in machine learning • Analog electronics: Low-power signal processing • Quantum computing: • Fast factoring etc. for cryptanalysis of public-key (PK) cryptosystems • Strong information security via quantum cryptography • Fast, flexible, accurate simulation of quantum physical systems • Reversible computing: • Reversible logic is already used in quantum computing, and has a few possible applications in other areas of CS: • Security: auditable/verifiable computation, resilient systems • Transaction rollback for concurrent systems • May conceivably provide useful angles for tackling complexity-theory questions • e.g., FACTORING ∈ P iff there is a poly-time, zero-garbage reversible algorithm for multiplying primes M. Frank, NSF/CISE/CCF job talk
Some Important Research Challenges in Quantum Computing • Important experimental physics challenges: • Develop new experimental setups for prototype quantum computers that can effectively suppress decoherence to the threshold for fault-tolerance • To enable more rapid improvement of machine sizes • Develop effective physical architectures for efficient qubit transfer & execution of parallel quantum circuits • Important theory challenges: • Better characterize the limits of applicability of quantum algorithms • Find major new categories of applications beyond the scope of the standard hidden subgroup / unstructured search algorithms • Resolve major open issues in quantum complexity theory • Comparisons between BQP vs. BPP and NP, etc. M. Frank, NSF/CISE/CCF job talk
Program Administration Ideas • My personal program management philosophy: • “Hands-on” leadership, guiding & steering the work of proposers & reviewers based on my vision and understanding of the program’s mission and the scientific needs of the fields that it touches on • Clarify the vision and goals of the funding program up-front with a technical “white paper” surveying important open scientific issues • Include motivation for and summaries of important open research problems, with references to the literature • Encourage proposal writers to address the listed issues, or else to thoroughly motivate their own alternative directions • Proactively seek out researchers whose background, skills, and research interests seem to mesh well with the cluster’s mission and vision • and encourage them to submit proposals to the program • Encourage review panel members to carefully consider the quality & thoroughness of the motivation section when evaluating the scientific merit of proposals • IMHO, too much of today’s research is not sufficiently well-motivated M. Frank, NSF/CISE/CCF job talk
Educational Component • Strongly encourage proposers to include educational activities in their proposals, including: • Organizing conferences • Writing technical books & textbooks • Writing introductory books for popular audiences • Even encourage submission of proposals for activity that is primarily educational in nature • There is an "education gap" in the areas I discussed as well • Especially in reversible computing, which is still little known • Emphasize the need for educational materials that have a strong interdisciplinary perspective • E.g., integrating CS, EE, and physics issues M. Frank, NSF/CISE/CCF job talk
Conclusion • Among the various unconventional computing technologies, there are strong reasons to believe that reversible computing has the greatest potential to make an enormous, vital, broad, and timely economic impact in coming decades… • Yet, compared to areas such as DNA, quantum, nano and bacterial computing, it has received by far the least attention and funding! • One of my main motivations for working in reversible computing has been to correct the imbalance between the underlying importance of and popular attention to this field… • However, my influence as a lone researcher “in the trenches” is limited… No programs support this presently unfashionable field • I hope in my position at EMT to help to finally bring some much-needed funding and attention to this orphaned area, and help guide research in new, productive directions… • While continuing support for well-motivated projects in other areas M. Frank, NSF/CISE/CCF job talk
finis • End of Presentation – Extra Slides Follow M. Frank, NSF/CISE/CCF job talk
Everyone Has It All Wrong! • As the talk proceeds, • I’ll explain (in the proud MIT tradition) why most of the rest of the world is thinking about the future of computing in a completely wrong-headed way. • In particular, • The Low-Power Logic Circuit Designers have it all wrong! • The Semiconductor Process Engineers have it all wrong! • (Most) Device Physicists have it all wrong! M. Frank, NSF/CISE/CCF job talk
The von Neumann-Landauer (VNL) principle • John von Neumann, 1949: • Claim: The minimum energy dissipated “per elementary (binary) act of information” is kT ln 2. • No published proof exists; only a 2nd-hand account of a lecture • Rolf Landauer (IBM), 1961: • Logically irreversible (many-to-one) bit operations must dissipate at least kT ln 2 energy. • Paper anticipated but didn’t fully appreciate reversible computing • One proper (i.e. correct) statement of the principle: • The oblivious erasure of a known logical bit generates at least k ln 2 amount of new entropy. • Releasing into environment at T requires kT ln 2 heat emission. M. Frank, NSF/CISE/CCF job talk
Proof of the VNL Principle • The principle is occasionally questioned, but: • Its truth follows absolutely rigorously (and even trivially!) from rock-solid principles of fundamental physics! • (Micro-)reversibility of fundamental physics implies: • Information (at the microscale) is conserved • I.e., physical information cannot be created or destroyed • only transformed via reversible, deterministic processes • Thus, when a known bit is erased (lost, forgotten) it must really still be preserved somewhere in the microstate! • But, since its value has become unknown, it has become entropy • Entropy is just unknown/incompressible information M. Frank, NSF/CISE/CCF job talk
Types of Dynamical Processes • These animations illustrate how states transform in their configuration space, in: • A nondeterministic process: • One-to-many transformations • An irreversible process: • Many-to-one transformations • A nondeterministic and irreversible process: • Both kinds of transformation • A deterministic and reversible process: • One-to-one transformations only! ("WE ARE HERE": this is the kind of process that fundamental physics actually implements) M. Frank, NSF/CISE/CCF job talk
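The four categories can be checked mechanically. The following minimal Python sketch (a toy illustration on a small state space, not content from the slide) classifies a transition relation as deterministic and/or reversible:

```python
# Classify a transition relation, given as (old_state, new_state) pairs:
# deterministic = no one-to-many fan-out; reversible = no many-to-one merging.
def classify(transitions):
    olds = [a for a, _ in transitions]
    news = [b for _, b in transitions]
    deterministic = len(set(olds)) == len(olds)
    reversible = len(set(news)) == len(news)
    return deterministic, reversible

print(classify([(0, 1), (0, 2), (1, 3)]))  # (False, True): nondeterministic
print(classify([(0, 0), (1, 0), (2, 3)]))  # (True, False): irreversible (merges states 0 and 1)
print(classify([(0, 1), (1, 0), (2, 2)]))  # (True, True): one-to-one -- "WE ARE HERE"
```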
Physics is Reversible! • Despite all of the empirical phenomenology relating to macro-scale irreversibility, chaos, and nondeterministic quantum events, • Our most fundamental and thoroughly-tested modern models of physics (e.g. the Standard Model) are, at bottom, deterministic & reversible! • All of the observed nondeterministic and irreversible phenomena can still be explained within such models, as emergent effects. • Although classical General Relativity is argued by some researchers to have certain irreversible aspects, • The general consensus seems to be that we’ll eventually find that the “correct” theory of quantum gravity will be reversible. M. Frank, NSF/CISE/CCF job talk
Reversible/Deterministic Physics is Consistent with Observations • Apparent quantum nondeterminism can validly be understood as an emergent phenomenon, an expected practical result of permanent wavefunction splitting • As illustrated e.g. in the “many worlds” and “decoherent histories” pictures • Even if a quantum wavefunction does not split permanently, its evolution in a large system can quickly become much too complex to track within our models • Thus we resort to using “reduced” density matrices, which discard some knowledge • The above effects, plus imprecision in our knowledge of fundamental constants, result in some practical unpredictability even for microscale systems • Thus entropy, for all practical purposes, tends to increase towards its maximum • Chaos (macro-scale nondeterminism) occurs when entropy at the microscale infects our ability to forecast the long-term evolution of macroscopic variables • A necessary consequence of the computation-universality of physics? • Meanwhile, averaging of many high-entropy microscopic details results in a “smoothing” effect that leads to irreversible evolution of macro-variables. M. Frank, NSF/CISE/CCF job talk
Reversible Computing • We’d like to design mechanisms that compute while producing as little entropy as possible… • In order to minimize consumption of free energy / emission of heat to the environment • Losing known information necessarily results in a minimum k ln 2 entropy increase per bit lost, so… • Let’s consider what we can do using logically reversible (one-to-one) operations that don’t lose information. • Such operations are still computationally universal! • Lecerf (1963), Bennett (1973) M. Frank, NSF/CISE/CCF job talk
Conventional Gate Operations are Irreversible (even NOT!) • Consider a computer engineer's (i.e., real-world!) Boolean NOT gate (a.k.a. logical inverter) • Specified function: Destructively overwrite the output node's value with the logical complement of the input! [Figure: hardware diagram of the inverter gate vs. a space-time logic network diagram of the inverter operation (not the same thing!!); the old and new values of in and out reside on two different physical logic nodes, with time running along the network diagram.] M. Frank, NSF/CISE/CCF job talk
In-Place NOT (Reversible) • Computer scientist's (i.e., somewhat fictionalized!) in-place logical NOT operation • Specified operation: Replace a given logic signal with its logical complement. • People occasionally confuse the irreversible inverter operation with a reversible in-place NOT operation • The same icon is sometimes used in spacetime diagrams [Figure: spacetime diagrams contrasting the inverter's separate in/out signals with the in-place NOT's old bit transforming into the new bit.] M. Frank, NSF/CISE/CCF job talk
In-Place Controlled-NOT (cNOT) • Specified function: Perform an in-place NOT on the 2nd bit if and only if the 1st bit is a 1. • Equivalently, replace the 2nd bit with the XOR of the 1st & 2nd bits [Figure: spacetime diagram showing the control line and the old data transforming to new data, with the gate's transition table.] M. Frank, NSF/CISE/CCF job talk
Early Universal Reversible Gates • Controlled-controlled-NOT (ccNOT) • A.k.a. the Toffoli gate • Performs cNOT(b,c) iff a=1. • Equivalently, c := c XOR (a AND b) • Controlled-SWAP (cSWAP) • A.k.a. the Fredkin gate • Swaps b with c iff a=1. • Conserves 1s [Figure: gate symbols drawn on three wires labeled A, B, C.] M. Frank, NSF/CISE/CCF job talk
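These gates are easy to model directly. The following minimal Python sketch (using the standard truth-table definitions just given) implements cNOT, ccNOT, and cSWAP, verifies that each is self-inverse (hence one-to-one, i.e. reversible), and checks that the Fredkin gate conserves 1s:

```python
# In-place reversible gates acting on tuples of bits.
from itertools import product

def cnot(c, t):          # controlled-NOT: t := t XOR c
    return (c, t ^ c)

def toffoli(a, b, c):    # ccNOT (Toffoli): c := c XOR (a AND b)
    return (a, b, c ^ (a & b))

def fredkin(a, b, c):    # cSWAP (Fredkin): swap b and c iff a == 1
    return (a, c, b) if a else (a, b, c)

for bits in product((0, 1), repeat=2):
    assert cnot(*cnot(*bits)) == bits          # cNOT is its own inverse

for bits in product((0, 1), repeat=3):
    assert toffoli(*toffoli(*bits)) == bits    # Toffoli is its own inverse
    assert fredkin(*fredkin(*bits)) == bits    # Fredkin is its own inverse
    assert sum(fredkin(*bits)) == sum(bits)    # Fredkin conserves the number of 1s

print("All gates verified reversible.")
```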
The Adiabatic Principle • Applied physicists know that a wide class of physical transformations can be done adiabatically • From Greek adiabatos, “It shall not be passed through” • Used to mean, no passage of heat through an interface separating subsystems at different temperatures • Newer, more general meaning: No increase of entropy • Of course, exactly zero entropy increase isn’t practically doable • In practice, “adiabatic” is used to mean that the entropy generation scales down proportionally as the process takes place more gradually. • The general validity of this 1/t scaling relation is enshrined in the famous adiabatic theorem of quantum mechanics. M. Frank, NSF/CISE/CCF job talk
Adiabatic Charge Transfer • Consider passing a total quantity of charge Q through a resistive element of resistance R over time t via a constant current, I = Q/t. • The power dissipation (rate of energy dissipation) during such a process is P = IV, where V = IR is the voltage drop across the resistor. • The total energy dissipated over time t is therefore: E = Pt = IVt = I²Rt = (Q/t)²Rt = Q²R/t. • Note the inverse scaling with the time t. • In adiabatic logic circuits, the resistive element is a switch. • The switch state can be changed by other adiabatic charge transfers. • In simple FET-type switches, the constant factor (the "energy coefficient") Q²R appears to be subject to some fundamental quantum lower bounds. • However, these are still rather far from being reached. M. Frank, NSF/CISE/CCF job talk
The Low-Power Design community has it all wrong! • Even most of the ones who know about adiabatics, and even many who have done extensive research on adiabatic circuits, still aren't doing it right! • Watch out! 99% of the so-called "adiabatic" circuit designs published in the low-power design literature aren't truly adiabatic, for one reason or another! • As a result, most published results (and even review articles!) dramatically understate the energy efficiency gains that can actually be achieved with correct adiabatic design. • Which has resulted in (IMHO) too little serious attention being paid to adiabatic techniques. M. Frank, NSF/CISE/CCF job talk