New Evidence That Quantum Mechanics Is Hard to Simulate on Classical Computers
Scott Aaronson
Parts based on joint work with Alex Arkhipov
In 1994, something big happened in the foundations of computer science, whose meaning is still debated today…

Why exactly was Shor's algorithm important?
Boosters: Because it means we'll build QCs!
Skeptics: Because it means we won't build QCs!
Me: For reasons having nothing to do with building QCs!
Shor's algorithm was a hardness result for one of the central computational problems of modern science: quantum simulation.

[Chart: use of DoE supercomputers by area, from a talk by Alán Aspuru-Guzik]

Shor's Theorem: Quantum Simulation is not in probabilistic polynomial time, unless Factoring is also.
Today: a different kind of hardness result for simulating quantum mechanics.

Advantages of the new results:
• Based on "generic" complexity assumptions, rather than the classical hardness of Factoring
• Use only extremely weak kinds of quantum computing (e.g. nonadaptive linear optics)—testable before I'm dead?
• Give evidence that QCs have capabilities outside the entire polynomial hierarchy

Disadvantages:
• Apply to sampling problems (or to problems with many possible valid outputs), not decision problems
• Harder to convince a skeptic that your QC is solving the relevant hard problem
• Problems don't seem "useful"
First Problem

Given a random Boolean function f: {0,1}^n → {−1,1}, find subsets S_1, …, S_k ⊆ [n] of the input bits, most of whose parities are "slightly better correlated than chance" with f.

E.g., sample a subset S with probability \hat{f}(S)^2, where

\hat{f}(S) = \frac{1}{2^n} \sum_{x \in \{0,1\}^n} f(x) (-1)^{S \cdot x}.

(By Parseval, these probabilities sum to 1, so this is a valid distribution.)

[Plot: the distribution of the Fourier coefficients \hat{f}(S) for a random S, versus the distribution for the S's that you're being asked to output]
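To make the problem concrete, here is a minimal brute-force sketch in Python (all names and sizes are illustrative, not from the talk). It enumerates the full truth table of f, so it takes roughly 4^n steps; this exponential cost is exactly what the theorems below say no classical algorithm can avoid:

# Brute-force version of the problem (exponential time): compute every
# Fourier coefficient of a random f: {0,1}^n -> {-1,+1}, then draw one
# sample from the target distribution Pr[S] = f_hat(S)^2.
import numpy as np

n = 4
rng = np.random.default_rng(0)
f = rng.choice([-1, 1], size=2**n)           # truth table of a random f

def f_hat(S):
    """Fourier coefficient: (1/2^n) * sum_x f(x) * (-1)^(S.x)."""
    total = 0
    for x in range(2**n):
        parity = bin(S & x).count("1") & 1   # S.x mod 2
        total += f[x] * (-1)**parity
    return total / 2**n

coeffs = np.array([f_hat(S) for S in range(2**n)])
probs = coeffs**2                            # sums to 1 by Parseval
assert abs(probs.sum() - 1) < 1e-9
S = rng.choice(2**n, p=probs)                # one draw from the distribution
print(f"sampled subset S = {S:0{n}b}")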
This problem is trivial to solve using a quantum computer: Hadamard every qubit of |0…0⟩, query f in the phase, Hadamard again, and measure; the outcome is S with probability exactly \hat{f}(S)^2.

[Circuit: n qubits, each starting in |0⟩, with Hadamards before and after a phase query to f]

Theorem 1: Any classical probabilistic algorithm to solve it (even approximately) must make exponentially many queries to f.

Theorem 2: This is true even if we imagine that P = NP, so that the classical algorithm can ask NP-type questions about f.

Theorem 3: Even if we "instantiate" f by some explicit function (like 3SAT), any classical algorithm to solve the problem really accurately would imply P^{#P} = BPP^{NP} (meaning "the polynomial hierarchy would collapse").
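For comparison, a toy dense-matrix simulation of that Hadamard/query/Hadamard circuit (illustrative Python, assuming the standard phase-oracle convention |x⟩ → f(x)|x⟩) shows that the final amplitude of each |S⟩ is exactly the Fourier coefficient \hat{f}(S):

# Sketch of the circuit's action via dense linear algebra (illustrative
# only; any serious simulator would avoid building 2^n x 2^n matrices).
import numpy as np
from functools import reduce

n = 4
rng = np.random.default_rng(0)
f = rng.choice([-1, 1], size=2**n)

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
Hn = reduce(np.kron, [H] * n)                # Hadamard on every qubit
Oracle = np.diag(f.astype(float))            # phase oracle |x> -> f(x)|x>

state = np.zeros(2**n); state[0] = 1.0       # |0...0>
state = Hn @ (Oracle @ (Hn @ state))         # H^n, query f, H^n

# The amplitude of |S> is now f_hat(S), so measurement yields S
# with probability f_hat(S)^2 -- the distribution defined above.
print(np.abs(state)**2)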
Ideally, we want a simple, explicit quantum system Q, such that any classical algorithm that even approximately simulates Q would have dramatic consequences for classical complexity theory. We argue that this is possible, using non-interacting bosons.

There are two basic types of particle in the universe: bosons and fermions. Their transition amplitudes are given respectively by the permanent and the determinant:

\mathrm{Per}(A) = \sum_{\sigma \in S_n} \prod_{i=1}^{n} a_{i,\sigma(i)}, \qquad \mathrm{Det}(A) = \sum_{\sigma \in S_n} \mathrm{sgn}(\sigma) \prod_{i=1}^{n} a_{i,\sigma(i)}.

All I can say is, the bosons got the harder job: the determinant is computable in polynomial time, while the permanent is #P-hard.
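A quick way to see the asymmetry: the determinant is one polynomial-time library call, while the permanent, even via Ryser's formula (the best-known style of exact algorithm), costs about 2^n. A minimal Python sketch, not from the talk:

# Determinant: polynomial time. Permanent: exponential time (#P-hard).
import numpy as np

def permanent_ryser(A):
    """Ryser's formula: Per(A) = (-1)^n * sum over nonempty column
    subsets S of (-1)^|S| * prod_i (row sum of A restricted to S).
    Runs in O(2^n * n^2) time."""
    n = A.shape[0]
    total = 0.0
    for r in range(1, 2**n):
        cols = [j for j in range(n) if (r >> j) & 1]
        total += (-1)**len(cols) * np.prod(A[:, cols].sum(axis=1))
    return (-1)**n * total

A = np.random.default_rng(1).normal(size=(5, 5))
print("Det(A) =", np.linalg.det(A))          # fast, and stays fast as n grows
print("Per(A) =", permanent_ryser(A))        # ~2^n work, infeasible for large n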
Our Current Result

Take a system of n photons with m = O(n^2) modes each. Put each photon in a known mode, then apply a random m×m scattering matrix U:

[Diagram: n single photons entering an m-mode linear-optical network U]

Let D be the distribution that results from measuring the photons. Suppose there's an efficient classical algorithm that samples any distribution even 1/n^{O(1)}-close to D. Then in BPP^{NP}, one can approximate the permanent of a matrix A of independent N(0,1) Gaussians, to additive error \sqrt{n!}/\mathrm{poly}(n), with high probability over A.

Challenge: Prove that the above approximation problem is #P-complete.
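To illustrate what D looks like at toy scale (the sizes n = 3, m = 9 are our choices, not the talk's): for a collision-free outcome S, standard linear-optics theory gives Pr[S] = |Per(U_S)|^2, where U_S keeps the first n columns of U and the rows indexed by S. A brute-force Python sketch:

# Tiny boson-sampling distribution: n single photons enter the first n
# of m modes of a Haar-random interferometer U.
import itertools
import numpy as np

def permanent(A):
    """Naive permanent by summing over permutations (fine for tiny n)."""
    n = A.shape[0]
    return sum(np.prod([A[i, p[i]] for i in range(n)])
               for p in itertools.permutations(range(n)))

n, m = 3, 9
rng = np.random.default_rng(2)
G = rng.normal(size=(m, m)) + 1j * rng.normal(size=(m, m))
Q, R = np.linalg.qr(G)
U = Q * (np.diagonal(R) / np.abs(np.diagonal(R)))  # phase fix -> Haar-random

probs = {}
for S in itertools.combinations(range(m), n):      # collision-free outcomes
    probs[S] = abs(permanent(U[np.ix_(S, range(n))]))**2
# Total mass on collision-free outcomes is at most 1 and approaches 1
# when m >> n^2 (the "bosonic birthday" regime).
print("collision-free mass:", sum(probs.values()))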
Experimental Prospects

What would it take to implement this experiment with photonics?
• Reliable phase-shifters
• Reliable beamsplitters
• Reliable single-photon sources
• Reliable photodetector arrays
But crucially, no nonlinear optics or postselected measurements!

Our Proposal: Concentrate on (say) n = 30 photons, so that classical simulation is difficult but not impossible.
Summary

I've often said we have three choices: either
• The Extended Church-Turing Thesis is false,
• Textbook quantum mechanics is false, or
• QCs can be efficiently simulated classically.
For all intents and purposes?