Entropy and Majorisation in probabilistic theories - an introduction
oscar.dahlsten@physics.ox.ac.uk
Entropy and Majorisation - why care? One of the greatest technological challenges is that current nano-electronics runs as hot as a light-bulb filament; the problem is intimately tied to work and heat, i.e. to thermodynamics.
Part 1: Information Entropy

"If the number of messages in the set is finite then this number or any monotonic function of this number can be regarded as a measure of the information produced when one message is chosen from the set, all choices being equally likely. As was pointed out by Hartley the most natural choice is the logarithmic function." - Shannon 1948

"The great advance provided by information theory lies in the discovery that there is a unique, unambiguous criterion for the 'amount of uncertainty' represented by a discrete probability distribution [...]" - Jaynes 1956
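Shannon's logarithmic measure can be sketched in a few lines. This is a minimal illustration, not part of the original slides: for a uniform distribution over n equally likely messages, the entropy reduces to log2(n) bits, exactly the "logarithmic function" Shannon and Hartley point to.

```python
import math

def shannon_entropy(p, base=2):
    """Shannon entropy H(p) = -sum_i p_i log(p_i), in bits by default.
    Terms with p_i = 0 contribute nothing (0 log 0 := 0)."""
    return -sum(x * math.log(x, base) for x in p if x > 0)

# A uniform choice among 8 equally likely messages carries log2(8) = 3 bits.
print(shannon_entropy([1/8] * 8))

# A biased coin carries less than 1 bit.
print(shannon_entropy([0.9, 0.1]))
```

For non-uniform distributions the same formula interpolates smoothly between 0 (a certain outcome) and log2(n) (complete uncertainty).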
Conceptual point: probabilities are subjective, so the entropy S is associated with both an object and an observer.
Beyond Shannon entropy and the large-n i.i.d. limit
[Diagram contrasting the general case with the i.i.d. large-n limit]
A useful way of showing non-negativity: Jensen's inequality. For a convex function f and p in [0,1], f(pa + (1-p)b) <= p f(a) + (1-p) f(b). [Figure: the chord between (a, f(a)) and (b, f(b)) lies above the graph of f]
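The inequality above can be checked numerically. This is a small sanity-check sketch (not from the slides), using f(x) = x^2 as a stand-in convex function: the value of f at any mixture of a and b never exceeds the corresponding mixture of f(a) and f(b).

```python
import random

def f(x):
    """A convex function; x^2 is chosen purely for illustration."""
    return x * x

random.seed(0)
for _ in range(1000):
    a, b = random.uniform(0, 10), random.uniform(0, 10)
    p = random.random()
    lhs = f(p * a + (1 - p) * b)       # f evaluated at the mixture
    rhs = p * f(a) + (1 - p) * f(b)    # mixture of the values of f
    assert lhs <= rhs + 1e-12          # Jensen's inequality
print("Jensen's inequality held in all 1000 random trials")
```

The same convexity argument, applied to x log x, is what delivers the non-negativity of relative entropy.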
Coarse-graining reduces classical entropy: merging two outcomes with probabilities p_i and p_i' into a single outcome of probability p_i + p_i' can only lower the Shannon entropy.
Coarse-graining reduces classical entropy (continued). Next slide: a more general statement.
Data processing inequality (DPI). Research question: what is the relation to Baez et al.'s axiomatisation of Shannon entropy?
Aside: quantifying quantum correlations (discord). See Modi et al. for a review of discord: http://arxiv.org/pdf/1112.6238v3.pdf
What about that data processing inequality? Data processing can only degrade data: if X -> Y -> Z form a Markov chain, then I(X;Z) <= I(X;Y).
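The DPI can be illustrated numerically. A sketch under assumed parameters (the channels and flip probability are illustrative choices, not from the slides): pass a fair bit through two binary symmetric channels in series and compare mutual informations.

```python
import math
from itertools import product

def mutual_info(pxy):
    """I(X;Y) in bits from a joint distribution dict {(x, y): prob}."""
    px, py = {}, {}
    for (x, y), p in pxy.items():
        px[x] = px.get(x, 0) + p
        py[y] = py.get(y, 0) + p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in pxy.items() if p > 0)

# Markov chain X -> Y -> Z, each arrow a binary symmetric channel
# that flips the bit with probability eps.
eps = 0.1
pX = {0: 0.5, 1: 0.5}
flip = lambda a, b: 1 - eps if a == b else eps

pXY = {(x, y): pX[x] * flip(x, y) for x, y in product((0, 1), repeat=2)}
pXZ = {}
for x, y, z in product((0, 1), repeat=3):
    pXZ[(x, z)] = pXZ.get((x, z), 0) + pX[x] * flip(x, y) * flip(y, z)

# DPI: the second channel can only degrade the correlation with X.
assert mutual_info(pXZ) <= mutual_info(pXY) + 1e-12
print(mutual_info(pXY), mutual_info(pXZ))
```

Two noisy channels in series act like one noisier channel, so I(X;Z) is strictly smaller here.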
Part III: Post-quantum entropy
If information theory is a meta-theory, it should hold in any probabilistic theory(?)
Interesting example: the "square bit", also known as a "gbit". [Figure: a square state space with measurement axes x and y]
Violating DPI? Research question: why does the conditional entropy S(A|B) := S(AB) - S(B) satisfy the DPI in the quantum case but not in general probabilistic theories?
Entropy and Majorisation in probabilistic theories - an introduction
PART II: MAJORISATION
oscar.dahlsten@physics.ox.ac.uk
Majorisation: presentation overview. First, a story about wealth inequality.
Majorisation and income inequality (Lorenz 1905). The Gini index is the percentage of the area between the equality line and the Lorenz curve. [Figure: Lorenz curve of income; source: World Bank]
Lorenz curves of probabilities: sort the probability vector in decreasing order and plot the running partial sums; L(x) rises from 0 to 1. [Plot: L(x) against x]
Majorisation in terms of Lorenz curves: p majorises q exactly when the Lorenz curve of p lies on or above that of q everywhere. [Left plot: black majorises red. Right plot: the curves cross, so neither majorises the other.]
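The Lorenz-curve criterion translates directly into code. A minimal sketch (the example vectors are illustrative choices, not from the slides): compare the partial sums of the decreasingly sorted vectors.

```python
from itertools import accumulate

def lorenz(p):
    """Heights of the Lorenz curve: partial sums of p sorted decreasingly."""
    return list(accumulate(sorted(p, reverse=True)))

def majorises(p, q, tol=1e-12):
    """p majorises q iff every partial sum of p dominates that of q."""
    return all(a >= b - tol for a, b in zip(lorenz(p), lorenz(q)))

p = [0.7, 0.2, 0.1]
q = [0.5, 0.3, 0.2]

print(majorises(p, q))  # True: p's Lorenz curve lies above q's everywhere
print(majorises(q, p))  # False
```

Note that majorisation is only a partial order: when two Lorenz curves cross, `majorises` returns False in both directions.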
Visualising majorisation geometrically. [Figure: in the (P(0), P(1)) plane, the set of distributions accessible from a given one via doubly stochastic (DS) matrices]
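The "accessible with DS matrices" region can be probed concretely. A sketch with an illustrative doubly stochastic matrix (not from the slides), checking the Hardy-Littlewood-Polya fact that q = Dp is always majorised by p:

```python
from itertools import accumulate

def lorenz(p):
    """Partial sums of p sorted in decreasing order."""
    return list(accumulate(sorted(p, reverse=True)))

def apply_matrix(D, p):
    """Matrix-vector product q = Dp."""
    return [sum(D[i][j] * p[j] for j in range(len(p))) for i in range(len(p))]

# A doubly stochastic matrix: every row and every column sums to 1.
D = [[0.6, 0.3, 0.1],
     [0.3, 0.4, 0.3],
     [0.1, 0.3, 0.6]]

p = [0.7, 0.2, 0.1]
q = apply_matrix(D, p)

# Mixing flattens the distribution: p majorises q = Dp.
assert all(a >= b - 1e-12 for a, b in zip(lorenz(p), lorenz(q)))
print(q)
```

This is why the accessible set in the figure sits "inside" the original distribution: doubly stochastic mixing can only move a distribution toward the uniform one.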