This talk presents counterexamples to the conjecture that the minimum entropy output state of a product channel is attained by a product-state input. It explains the random-unitary construction behind the counterexamples and discusses the implications for the additivity conjecture in quantum Shannon theory.
Counterexamples to the maximal p-norm multiplicativity conjecture: ‖N(ρ)‖_p. Patrick Hayden (McGill University), C&QIC, Santa Fe 2008
A challenge to the physicists • John Pierce [1973]: "I think that I have never met a physicist who understood information theory. I wish that physicists would stop talking about reformulating information theory and would give us a general expression for the capacity of a channel with quantum effects taken into account rather than a number of special cases."
Sending classical information through noisy quantum channels. Physical model of a noisy channel: a trace-preserving, completely positive map N. Encoding (state preparation) → channel → Decoding (measurement), mapping a message m to an estimate m'. HSW noisy coding theorem: In the limit of many uses, the optimal rate at which Alice can send bits reliably to Bob through N is given by the (regularization of the) formula χ(N) = max_{p_x, ρ_x} [ H(N(Σ_x p_x ρ_x)) - Σ_x p_x H(N(ρ_x)) ], where the maximization is over ensembles {p_x, ρ_x} of input states.
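To make the χ quantity above concrete, here is a minimal numerical sketch (not from the talk): it evaluates the single-letter Holevo expression for one fixed ensemble sent through a qubit depolarizing channel. The channel, the ensemble, and all function names are illustrative choices of mine.

import numpy as np

def von_neumann_entropy(rho):
    """H(rho) = -Tr[rho log2 rho], computed from the eigenvalues."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log2(evals)))

def depolarizing(rho, q):
    """Qubit depolarizing channel: keep rho with prob 1-q, replace by I/2 with prob q."""
    return (1 - q) * rho + q * np.eye(2) / 2

def holevo_chi(channel, ensemble):
    """chi = H(N(rho_bar)) - sum_x p_x H(N(rho_x)) for a fixed ensemble {p_x, rho_x}."""
    avg_out = sum(p * channel(rho) for p, rho in ensemble)
    return von_neumann_entropy(avg_out) - sum(
        p * von_neumann_entropy(channel(rho)) for p, rho in ensemble)

# Example: equiprobable |0>, |1> inputs through a 10% depolarizing channel.
ket0 = np.array([[1, 0], [0, 0]], dtype=complex)
ket1 = np.array([[0, 0], [0, 1]], dtype=complex)
ensemble = [(0.5, ket0), (0.5, ket1)]
print(holevo_chi(lambda r: depolarizing(r, 0.1), ensemble))

The HSW capacity additionally optimizes over ensembles (and, in the regularized formula, over entangled inputs to many channel uses); this sketch only evaluates χ for one fixed ensemble.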
The additivity conjecture: These two formulas, the regularized capacity and the single-letter expression χ(N), are equal; this would follow from additivity, χ(N1 ⊗ N2) = χ(N1) + χ(N2). Sustained, heroic, and so far inconclusive efforts by: Datta, Eisert, Fukuda, Holevo, King, Ruskai, Schumacher, Shirokov, Shor, Werner... Why do they care so much?
The additivity conjecture: These two formulas are equal, where additivity means χ(N1 ⊗ N2) = χ(N1) + χ(N2). Operational interpretation: • Alice doesn't need to entangle her inputs across multiple uses of the channel. • Codewords look like σ_{x_1} ⊗ σ_{x_2} ⊗ ⋯ ⊗ σ_{x_n}
QMAC (quantum multiple access channel) solution pre-QIP 2005. Interpretation: Alice and Bob treat each other's actions as noise. Independent decoding. No-go theorem for use of quantum side information. [Yard/Devetak/H 05 v1]
QMAC solution post-QIP 2005 Interpretation: Charlie decodes Alice’s quantum data first and uses it to help him decode Bob’s. (Or vice-versa.) Go theorem for use of quantum side information. [Yard/Devetak/H 05 v2]
Lesson: Capacity formulas matter If we can’t write down a tractable formula for the solution to a capacity problem, then we don’t fully understand the structure of the optimal codes. • Fair question to throw at the speaker if you’re getting bored in any quantum Shannon theory talk: • “Can you describe an effective procedure for calculating this capacity you claim to have determined?”
An (Almost) Equivalent Form: Minimum Entropy Outputs. Notation: • H(ρ) = -Tr[ρ log ρ] (von Neumann entropy of the density operator ρ) • N, N1 and N2 are quantum channels (CPTP maps) • H_min(N) = min_ρ H(N(ρ)) is the minimum output entropy of N. Conjecture: The minimum entropy output state for the product channel N1 ⊗ N2 is attained by a product-state input ρ1 ⊗ ρ2. [King-Ruskai 99]
Maximal p-norm multiplicativity conjecture. Conjecture: The minimum entropy output state for the product channel N1 ⊗ N2 is attained by a product-state input ρ1 ⊗ ρ2. Rényi entropy (1 < p ≤ ∞): H_p(ρ) = (1/(1-p)) log Tr[ρ^p]. (Recover the von Neumann entropy as p → 1.) Norm? What norm? Since H_p(ρ) = (p/(1-p)) log ‖ρ‖_p, minimizing the output Rényi entropy is the same as maximizing the Schatten p-norm of the output, which is where the multiplicativity formulation comes from. [Amosov-Holevo-Werner 00]
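A hedged numerical sketch of these definitions: renyi_entropy implements H_p(ρ) = (1/(1-p)) log Tr[ρ^p], and min_output_renyi crudely estimates H_p^min by sampling random pure inputs. The dephasing channel and all names are my own illustrative choices; for this particular channel the true minimum is 0 (attained at |0⟩ or |1⟩), so the estimate should come out near zero.

import numpy as np

def renyi_entropy(rho, p):
    """H_p(rho) = log2(Tr[rho^p]) / (1 - p); approaches the von Neumann entropy as p -> 1."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(np.log2(np.sum(evals ** p)) / (1 - p))

def min_output_renyi(channel, dim_in, p, trials=5000, seed=0):
    """Crude Monte-Carlo estimate of H_p^min(N) over random pure inputs.
    (The true minimum is attained at a pure state, since ||N(rho)||_p is convex in rho.)"""
    rng = np.random.default_rng(seed)
    best = np.inf
    for _ in range(trials):
        v = rng.standard_normal(dim_in) + 1j * rng.standard_normal(dim_in)
        v /= np.linalg.norm(v)
        best = min(best, renyi_entropy(channel(np.outer(v, v.conj())), p))
    return best

# Example: a qubit dephasing channel that damps the off-diagonal terms by 1/2.
dephase = lambda rho: np.array([[rho[0, 0], 0.5 * rho[0, 1]],
                                [0.5 * rho[1, 0], rho[1, 1]]])
print(min_output_renyi(dephase, 2, p=2))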
Partial results: Additivity holds if one channel is
• Unitary
• A unital qubit channel
• A generalized depolarizing channel
• A generalized dephasing channel
• Entanglement-breaking
• A very noisy channel
• A complement of one of these channels
[Amosov, Devetak, Eisert, Fujiwara, Hashizume, Holevo, King, Matsumoto, Nathanson, Ruskai, Shor, Wolf, Werner] [See Holevo ICM 2006]
But... • 2002: Additivity fails for p > 4.79... [Holevo-Werner] • 2007: Additivity fails for p > 2. [Winter]
Counterexamples for 1 < p < 2!
• For all 1 < p < 2, there exist channels N1 and N2 mapping to C^d such that:
• H_p^min(N1), H_p^min(N2) ≥ log d - O(1)
• H_p^min(N1 ⊗ N2) ≤ p log d + O(1)
Additivity would have implied: H_p^min(N1 ⊗ N2) ≥ 2 log d - O(1)
Near p = 1, the minimum output entropy of N1 ⊗ N2 is not significantly greater than that of N1 or N2 alone!
Intuition: Channels that look very noisy (nearly depolarizing) need not be anywhere near depolarizing on entangled inputs.
The counterexamples. The channel is N(ρ) = Tr_B[ U (|0⟩⟨0|_R ⊗ ρ_S) U† ]: append an ancilla |0⟩ on R to the input on S, apply U, and trace out ("TRASH") the B system, leaving an output on A. Fix dimensions |R| << |S|, |A| = |B|, and choose U at random according to Haar measure. Demonstrate that the resulting channels violate Rényi additivity with non-zero probability. Two things to prove: • The product channel has low minimum output entropy. • The individual channels have high minimum output entropies.
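A minimal sketch of this construction, assuming ordering conventions of my own (the |0⟩ ancilla occupies the R slot, U acts on R ⊗ S and its output is read as A ⊗ B); the Haar sampling via QR and the function names are illustrative, not the talk's code.

import numpy as np

def haar_unitary(n, rng):
    """Haar-random n x n unitary via QR of a complex Gaussian matrix (phase-corrected)."""
    z = (rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))) / np.sqrt(2)
    q, r = np.linalg.qr(z)
    return q * (np.diag(r) / np.abs(np.diag(r)))

def random_channel(dS, dR, dA, dB, rng):
    """N(rho) = Tr_B[ U (|0><0|_R (x) rho_S) U^dagger ], U Haar-random on R (x) S -> A (x) B.
    Requires |R||S| = |A||B|.  Returns the channel as a function on dS x dS density matrices."""
    assert dR * dS == dA * dB
    U = haar_unitary(dR * dS, rng)
    zero_R = np.zeros((dR, dR), dtype=complex)
    zero_R[0, 0] = 1.0
    def N(rho_S):
        out = U @ np.kron(zero_R, rho_S) @ U.conj().T
        out = out.reshape(dA, dB, dA, dB)
        return np.trace(out, axis1=1, axis2=3)      # partial trace over B ("TRASH")
    return N

# Example: |R| = 2, |A| = |B| = 4, so |S| = 8.
rng = np.random.default_rng(0)
N = random_channel(dS=8, dR=2, dA=4, dB=4, rng=rng)
v = rng.standard_normal(8) + 1j * rng.standard_normal(8)
v /= np.linalg.norm(v)
print(np.linalg.eigvalsh(N(np.outer(v, v.conj()))))  # fairly flat spectrum: no eigenvalue close to 1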
N ⊗ N̄ has low output entropy. The key identity (v1): U I U† = I. The key identity (v2): (U ⊗ Ū)|Φ⟩ = |Φ⟩, where |Φ⟩ is the maximally entangled state and Ū denotes the entrywise complex conjugate of U; the conjugate channel N̄ is built from the same circuit with Ū in place of U. Easy calculation: feeding the maximally entangled state Φ_{SS'} into N ⊗ N̄ gives ⟨Φ_{AA'}| (N ⊗ N̄)(Φ_{SS'}) |Φ_{AA'}⟩ ≥ 1/|R|. This is BIG if |R| is small! (Compare 1/|A|² for the maximally mixed state.) Choose |R| ~ |A|^{p-1}.
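A small numerical check of the "easy calculation" above, under illustrative conventions of mine (U reshaped so that U[a,b,r,s] = ⟨ab|U|rs⟩, conjugate channel built from Ū): for a Haar-random U the overlap of (N ⊗ N̄)(Φ_SS') with Φ_AA' should indeed come out at least 1/|R|.

import numpy as np

def haar_unitary(n, rng):
    z = (rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))) / np.sqrt(2)
    q, r = np.linalg.qr(z)
    return q * (np.diag(r) / np.abs(np.diag(r)))

dR, dA, dB = 2, 4, 4
dS = dA * dB // dR                                        # |R||S| = |A||B|
rng = np.random.default_rng(1)
U = haar_unitary(dR * dS, rng).reshape(dA, dB, dR, dS)    # U[a,b,r,s] = <ab|U|rs>

# |psi> = (U (x) Ubar)(|0>_R |0>_R' (x) |Phi_SS'>), stored as a tensor psi[a, b, a', b'].
phi_SS = np.eye(dS) / np.sqrt(dS)
psi = np.einsum('abs,cdt,st->abcd', U[:, :, 0, :], U[:, :, 0, :].conj(), phi_SS)

# <Phi_AA'| (N (x) Nbar)(Phi_SS') |Phi_AA'> = (1/|A|) sum_{b,b'} | sum_a psi[a,b,a,b'] |^2
overlap = np.sum(np.abs(np.einsum('abac->bc', psi)) ** 2) / dA
print(overlap, ">= 1/|R| =", 1 / dR, "(vs 1/|A|^2 =", 1 / dA ** 2, "for maximally mixed)")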
N and N̄ have high output entropy. If U is selected at random, what can be said about U(|φ⟩|0⟩)? U(|φ⟩|0⟩) is highly entangled between A and B: H_p(N(φ)) ≥ log|A| - O(1). (Compare the maximally mixed state: log|A|.) Is this true simultaneously for all |φ⟩ ∈ S with a typical U? i.e. Is min_{|φ⟩ ∈ S} H_p(N(φ)) ≥ log|A| - O(1)? [Lubkin, Lloyd, Page, Foong & Kanno, Sanchez-Ruiz, Sen…]
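A hedged sketch of the "typical input" half of this question: for one fixed Haar-random U it samples random |φ⟩ ∈ S and computes H_p(N(φ)), which should sit within O(1) of log|A|. This only probes typical inputs; making the bound uniform over all of S is exactly what the concentration-plus-net argument on the next slides supplies. The dimensions and names are my own choices.

import numpy as np

def haar_unitary(n, rng):
    z = (rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))) / np.sqrt(2)
    q, r = np.linalg.qr(z)
    return q * (np.diag(r) / np.abs(np.diag(r)))

dR, dA, dB, p = 2, 16, 16, 1.5
dS = dA * dB // dR
rng = np.random.default_rng(2)
U = haar_unitary(dR * dS, rng).reshape(dA, dB, dR, dS)    # one fixed random U

entropies = []
for _ in range(200):                                      # sample random pure inputs |phi> in S
    v = rng.standard_normal(dS) + 1j * rng.standard_normal(dS)
    v /= np.linalg.norm(v)
    M = np.einsum('abs,s->ab', U[:, :, 0, :], v)          # U(|0>_R |phi>_S) as an |A| x |B| matrix
    evals = np.linalg.svd(M, compute_uv=False) ** 2       # spectrum of N(phi) = Tr_B[...]
    evals = evals[evals > 1e-15]
    entropies.append(np.log2(np.sum(evals ** p)) / (1 - p))

print(min(entropies), "vs log|A| =", np.log2(dA))         # typically within O(1) of log|A|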
Concentration of measure. [Sketch: for the sphere S^n and the coordinate function f(x) = x_1, the measure of the set where f deviates from its median by more than ε is < exp[-n g(ε)] for some g(ε) independent of n.] LÉVY: Given an η-Lipschitz function f : S^n → R with median M, the probability that, for a random x ∈_R S^n, f(x) is further than ε from M is bounded above by exp(-nε²C/η²) for some C > 0. Just need a Lipschitz constant: choosing f to be the map from |φ⟩ to H_p(N(φ)), a Lipschitz constant of order |A|^{p-1} suffices. Pr[ H_p(N(φ)) < log|A| - const - ε ] ~ exp(-const ε² |A|^{3-p}).
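Not from the talk: a tiny empirical illustration of this kind of concentration, using the 1-Lipschitz coordinate function f(x) = x_1 on spheres of growing dimension; the spread and the tail probability both shrink rapidly as n grows.

import numpy as np

rng = np.random.default_rng(3)
for n in (10, 100, 1000, 5000):
    x = rng.standard_normal((2000, n))
    x /= np.linalg.norm(x, axis=1, keepdims=True)   # uniform samples on the sphere S^{n-1}
    f = x[:, 0]                                     # a 1-Lipschitz function with median 0
    print(n, "std =", round(float(f.std()), 4),
          " P(|f| > 0.2) =", float(np.mean(np.abs(f) > 0.2)))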
Connect the dots. [Diagram: U maps the subspace S ⊗ |0⟩ into A ⊗ B.]
• Choose a fine net F of states on the unit sphere of S ⊗ |0⟩.
• P(not all states in UF highly entangled) ≤ |F| · P(one state isn't); see the sketch after this slide.
• Highly entangled for a sufficiently fine net F implies the same for all states in S.
THEOREM: If |R| ~ |A|^{p-1}, then |S| ~ |A|^{3-p} and, w.h.p. as |A| → ∞, min_{|φ⟩ ∈ S} H_p(N(φ)) ≥ log|A| - O(1). N and N̄ have high minimum output entropy.
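In LaTeX, a hedged sketch of the union-bound step referenced above; the covering-number estimate (C/ε)^{2|S|} is a standard bound I am supplying, not a figure from the talk.

% Union bound over an epsilon-net F on the unit sphere of S, combined with the
% concentration estimate from the previous slide:
\[
  \Pr\Big[\exists\,|\varphi\rangle\in\mathcal{F}:\;
          H_p\big(\mathcal{N}(\varphi)\big) < \log|A| - \mathrm{const} - \epsilon\Big]
  \;\le\; |\mathcal{F}|\cdot
          \Pr\Big[H_p\big(\mathcal{N}(\varphi)\big) < \log|A| - \mathrm{const} - \epsilon\Big]
  \;\lesssim\; \Big(\tfrac{C}{\epsilon}\Big)^{2|S|}
          \exp\!\big(-c\,\epsilon^{2}\,|A|^{3-p}\big).
\]

Since |S| ~ |A|^{3-p}, the logarithm of the net size and the concentration exponent scale the same way, so for suitable constants the exponential wins; passing from the net to all of S then costs only a further O(1) in the entropy.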
Done!
• For all 1 < p < 2, there exist channels N1 and N2 mapping to C^d such that:
• H_p^min(N1), H_p^min(N2) ≥ log d - O(1)
• H_p^min(N1 ⊗ N2) ≤ p log d + O(1)
Additivity would have implied: H_p^min(N1 ⊗ N2) ≥ 2 log d - O(1)
Near p = 1, the minimum output entropy of N1 ⊗ N2 is not significantly greater than that of N1 or N2 alone!
What about von Neumann (p = 1)??? Method fails: recall |R| ~ |A|^{p-1}. The constants depend on p and blow up as p → 1. Artifact of the analysis, or does the conjecture survive at p = 1?
[Figure: (N ⊗ N̄)(Φ) for |R| = 3, |A| = |B| = 24.]
What about von Neumann (p = 1)??? Method fails: recall |R| ~ |A|^{p-1}. The constants depend on p and blow up as p → 1. Artifact, or does the conjecture survive at p = 1? H_p for p > 1 is very sensitive to a single large eigenvalue, but H_1 is not.
Do some calculating. The output spectrum splits into a contribution from the single eigenvalue ~1/|R| and a contribution from all the others. For H_p with p > 1 the first term dominates, but the second term dominates H_1: H_1((N ⊗ N̄)(Φ)) = 2 log|A| - O(1) is BIG, not small, so no additivity violations at p = 1. To be sure, can anyone calculate the O(1) terms?
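A toy-model sketch of this comparison (the parameters are illustrative choices of mine, not numbers from the talk): take one eigenvalue of size 1/|R| and spread the remaining weight flat over the other |A|² - 1 levels, then compare H_p and H_1.

import numpy as np

# Toy spectrum for (N (x) Nbar)(Phi): one eigenvalue of size ~1/|R| plus a flat bulk.
# Illustrative parameters: p = 1.5, |A| = 1024, |R| ~ |A|^(p-1) = 32.
p, dA, dR = 1.5, 1024, 32
bulk = np.full(dA * dA - 1, (1 - 1 / dR) / (dA * dA - 1))
spec = np.append(bulk, 1 / dR)

def H(ev, p):
    ev = ev[ev > 0]
    return float(-np.sum(ev * np.log2(ev))) if p == 1 else float(np.log2(np.sum(ev ** p)) / (1 - p))

print("H_p =", H(spec, p), "  compare p log|A| =", p * np.log2(dA))   # single eigenvalue dominates
print("H_1 =", H(spec, 1), "  compare 2 log|A| =", 2 * np.log2(dA))   # flat bulk dominates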
Summary
• Additivity fails for 1 < p < 2. This closes the main approach to additivity for the capacity itself.
• Further developments:
• Winter tightened the Lipschitz bound, showing the same examples work for all 1 < p < ∞.
• Dupuis showed the orthogonal group can replace the unitary group, giving N1 = N2.
• Cubitt, Harrow, Leung, Montanaro & Winter have found violations for 0 ≤ p ≤ 0.12.