conditional entropy, mutual information
[Diagram: entropy Venn diagram, with H(X|Y), I(X;Y) and H(Y|X) as the regions making up H(X) and H(Y)]
• X and Y are independent: H(X|Y) = H(X), I(X;Y) = 0
• X is determined by Y: H(X|Y) = 0, I(X;Y) = H(X)
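A minimal Python sketch (assuming only NumPy; not part of the original slides) that computes these quantities from a joint distribution and checks both extreme cases:

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability vector (zero terms dropped)."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def information_quantities(joint):
    """H(X), H(Y), H(X|Y), H(Y|X) and I(X;Y) from a joint distribution P(X,Y)."""
    px = joint.sum(axis=1)              # marginal P(X)  (rows index X)
    py = joint.sum(axis=0)              # marginal P(Y)  (columns index Y)
    hx, hy = entropy(px), entropy(py)
    hxy = entropy(joint.ravel())        # joint entropy H(X,Y)
    return {"H(X)": hx, "H(Y)": hy,
            "H(X|Y)": hxy - hy,         # H(X|Y) = H(X,Y) - H(Y)
            "H(Y|X)": hxy - hx,         # H(Y|X) = H(X,Y) - H(X)
            "I(X;Y)": hx + hy - hxy}    # I(X;Y) = H(X) + H(Y) - H(X,Y)

# X and Y independent:  H(X|Y) = H(X), I(X;Y) = 0
independent = np.outer([0.5, 0.5], [0.25, 0.75])
print(information_quantities(independent))

# X determined by Y:    H(X|Y) = 0, I(X;Y) = H(X)
determined = np.array([[0.25, 0.0],
                       [0.0,  0.75]])
print(information_quantities(determined))
```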
example: evolution [Adami 2002]
• consider information in the genome in the context of information in the environment
• the same genome in a different environment wouldn't be as fit
• evolution is increasing mutual information (see the toy sketch below)
• fitter organisms exploit their environment better, so must contain more information about their environment
• (total information in a genome can change, as the genome changes size, etc.)
[Diagram: increasing fitness of genome G in Env E, from G1 to G3, with labels H(G1|E) and I(G3;E)]
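A heavily simplified toy sketch of this idea (an illustrative assumption in the spirit of [Adami 2002], not his actual model): a population of bit-string genomes evolves against a fixed target environment, and the information the population carries about that environment is estimated as L minus the population's summed per-site entropy.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model: the environment is a fixed target bit string; a genome's fitness
# is the number of sites that match it.  The information the evolving
# population holds about the environment is estimated as
#   I(G;E) ~ L - H(G|E),
# where H(G|E) is the summed per-site entropy of the population in that
# fixed environment.
L, POP, GENS, MU = 32, 200, 61, 0.01
env = rng.integers(0, 2, L)                     # the environment E
pop = rng.integers(0, 2, (POP, L))              # random initial genomes

def info_about_env(pop):
    p1 = pop.mean(axis=0)                       # per-site frequency of allele 1
    p = np.clip(np.stack([p1, 1.0 - p1]), 1e-12, 1.0)
    h_per_site = -(p * np.log2(p)).sum(axis=0)  # per-site entropy in bits
    return L - h_per_site.sum()                 # ~ I(G;E) for binary sites

for gen in range(GENS):
    fitness = (pop == env).sum(axis=1)
    if gen % 15 == 0:
        print(f"gen {gen:3d}  mean fitness {fitness.mean():5.1f}  "
              f"I(G;E) ~ {info_about_env(pop):5.1f} bits")
    # fitness-proportional selection followed by point mutation
    parents = rng.choice(POP, size=POP, p=fitness / fitness.sum())
    pop = pop[parents]
    pop = pop ^ (rng.random((POP, L)) < MU)
```

As selection fixes the sites that match the environment, the population's conditional entropy H(G|E) falls and the estimated mutual information rises, mirroring the "evolution is increasing mutual information" bullet.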
example: emergence [Weeks et al 2007]
• consider information in the high-level description S in the context of information in the low-level description E
• the same high-level model in a different low-level environment wouldn't be as good
• modelling / engineering as increasing mutual information
• small H(S|E): good model / implementation
• large H(E|S): redundancy
• use I as a fitness function to search for better models / implementations? (sketched below)
[Diagram: increasing fit of model S to Env E, from S1 to S3, with labels H(S1|E) and I(S3;E)]
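A small hypothetical sketch of the last bullet, using an empirical estimate of I(S;E) to rank candidate high-level models of a toy low-level system (the setup, variable names and candidate models are illustrative assumptions, not taken from Weeks et al.):

```python
import numpy as np

rng = np.random.default_rng(1)

def entropies(s, e):
    """Estimate H(S|E), H(E|S) and I(S;E) in bits from paired samples."""
    si = np.unique(s, return_inverse=True)[1]
    ei = np.unique(e, return_inverse=True)[1]
    joint = np.zeros((si.max() + 1, ei.max() + 1))
    np.add.at(joint, (si, ei), 1)                  # joint counts
    joint /= joint.sum()
    def H(p):
        p = p[p > 0]
        return -(p * np.log2(p)).sum()
    hs, he, hse = H(joint.sum(1)), H(joint.sum(0)), H(joint.ravel())
    return {"H(S|E)": hse - he, "H(E|S)": hse - hs, "I(S;E)": hs + he - hse}

# Hypothetical low-level description E: a 3-bit state whose top bit is the
# "macro" behaviour and whose lower bits are micro-level noise.
N = 20000
macro = rng.integers(0, 2, N)
E = (macro << 2) | rng.integers(0, 4, N)

# Candidate high-level models S, differing in how well they track the macro
# behaviour of E.
candidates = {
    "tracks_macro": macro,                          # good model of E
    "noisy_macro":  macro ^ (rng.random(N) < 0.2),  # degraded model
    "unrelated":    rng.integers(0, 2, N),          # ignores E entirely
}

# I(S;E) as the fitness function: the fittest model shares the most
# information with the low-level description; small H(S|E) marks a good
# model, large H(E|S) is the low-level detail the model abstracts away.
scores = {name: entropies(S, E) for name, S in candidates.items()}
for name, q in scores.items():
    print(name, {k: round(v, 3) for k, v in q.items()})
print("fittest model:", max(scores, key=lambda n: scores[n]["I(S;E)"]))
```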