Kolmogorov Complexity Presented by: Wei Peng
Introduction • Origin: Alan Turing's 1936 invention of the universal Turing machine • Kolmogorov complexity can be measured by the length of the shortest program for a universal Turing machine T that correctly reproduces the observed data S: K_T(S) = |d_T(S)| = |<T, w>|
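To make the definition concrete, here is a toy brute-force sketch (not from the presentation): Python's eval plays the role of the universal machine T, and "programs" are short expression strings. The function name toy_K, the alphabet, and max_len are all illustrative choices.

```python
import itertools

def toy_K(target, alphabet='"01*4', max_len=6):
    """Brute-force the shortest Python expression evaluating to `target`
    -- a toy stand-in for K_T(S), with eval() as the 'machine' T."""
    for n in range(1, max_len + 1):                  # shortest programs first
        for prog in itertools.product(alphabet, repeat=n):
            src = "".join(prog)
            try:
                if eval(src, {"__builtins__": {}}) == target:
                    return n, src                    # length and program found
            except Exception:
                continue                             # not a valid program
    return None

print(toy_K("01010101"))   # e.g. (6, '"01"*4'): shorter than the literal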
Relationship between two streams • Kolmogorov • Initiated by Kolmogorov, with important later developments by Martin-Löf and Chaitin. The body of information is usually assumed to be a finite string of binary digits S. • Motivation: information theory • Solomonoff • Chronologically the first, springing from the work of Solomonoff. The intent is not to measure the complexity of S, but rather to develop a probability distribution over the set of finite binary strings. • Motivation: inductive inference and artificial intelligence
Relationship between two streams • Common machine model: Input tape: one-way, read-only. Output tape: one-way, write-only (output cannot be overwritten). Both tapes use the binary alphabet, with no blank or delimiter symbol in addition to zero and one. • Stream one (Kolmogorov): the Turing machine is required to stop, and K_T(S) = |d_T(S)| • Stream two (Solomonoff): after producing S the machine's further action is unspecified, and P_T(S) = Σ_{w : T(w) = S} 2^{−|w|}, the total probability that a fair-coin binary input makes T output S
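The two quantities are directly related: the shortest program for S contributes one term to Solomonoff's sum. Assuming the same machine T is used in both definitions (a simplification; the two streams actually use slightly different machine models), we get:

```latex
% d_T(S), the shortest w with T(w) = S, contributes 2^{-|d_T(S)|} to the sum:
\[
  P_T(S) \;=\; \sum_{w \,:\, T(w) = S} 2^{-|w|}
         \;\ge\; 2^{-|d_T(S)|} \;=\; 2^{-K_T(S)},
  \qquad \text{i.e.} \quad -\log_2 P_T(S) \;\le\; K_T(S).
\]
```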
Definition of Information A = 0101010101010101010101010101 B = 1110010110100011101010000111 • A admits a short description ("repeat 01 fourteen times"), while B has no obvious pattern and seems to require listing its digits; intuitively, B contains more information. • A description of x is a TM M together with an input w; the length of the description is the combined length of representing M and w. • Definition 6.20: Let x be a binary string. The minimal description of x, written d(x), is the shortest string <M, w> where TM M on input w halts with x on its tape. If several such strings exist, select the lexicographically first among them. The descriptive complexity (Kolmogorov complexity) of x, written K(x), is K(x) = |d(x)|
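Exact K(x) is uncomputable, but any off-the-shelf compressor gives an upper bound in the same spirit: a decompressor plus the compressed bytes is a description of x. A minimal sketch using Python's zlib (the strings are scaled-up versions of A and B, since compressor overhead swamps 28-bit inputs):

```python
import random
import zlib

A = "01" * 500                                         # regular, like A above
random.seed(0)
B = "".join(random.choice("01") for _ in range(1000))  # patternless, like B

# len(zlib.compress(x)) is a crude upper bound on K(x) relative to the
# 'machine' (decompressor) zlib: A compresses drastically, B barely.
for name, s in [("A", A), ("B", B)]:
    print(name, len(s), "->", len(zlib.compress(s.encode(), 9)))
```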
Some proofs of Kolmogorov complexity • Theorem: there is a constant c such that for every x, K(x) ≤ |x| + c. Proof: Let M be a Turing machine that halts as soon as it is started. This machine computes the identity function – its output is the same as its input. A description of x is simply <M>x. Letting c be the length of <M> completes the proof. • Theorem: there is a constant c such that for every x, K(xx) ≤ K(x) + c. Proof: M = "On input <N, w>, where N is a TM and w is a string: 1. Run N on w until it halts and produces an output string s. 2. Output the string ss." Then <M>d(x) is a description of xx, so K(xx) ≤ |<M>| + |d(x)| = c + K(x)
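A minimal sketch of the doubling machine from the second proof, with Python callables standing in for encoded TMs (the names are illustrative, not from the slides):

```python
# M from the K(xx) proof: run the encoded machine N on w, then double
# its output. The description <M>d(x) of xx exceeds d(x) by only |<M>|.
def doubling_machine(N, w):
    s = N(w)        # step 1: run N on w until it halts with output s
    return s + s    # step 2: output the string ss

identity = lambda w: w                      # the TM from the first proof
print(doubling_machine(identity, "0101"))   # -> '01010101'
```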
Some proofs of Kolmogorov complexity • Theorem: there is a constant c such that for all x and y, K(xy) ≤ 2K(x) + K(y) + c. Proof: We construct a TM M that breaks its input w into two separate descriptions. The bits of the first description d(x) are all doubled and terminated with the string 01 before the second description d(y) appears. Once both descriptions have been obtained, they are run to obtain the strings x and y, and the output xy is produced. • A more efficient encoding, which prefixes d(x) with its length |d(x)| in self-delimiting form instead of doubling every bit of d(x), improves the bound to K(xy) ≤ 2 log₂(K(x)) + K(x) + K(y) + c
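A sketch of the self-delimiting pairing the proof describes, assuming descriptions are plain '0'/'1' strings (encode_pair and decode_pair are illustrative names):

```python
def encode_pair(dx: str, dy: str) -> str:
    """Double every bit of d(x), mark its end with '01', append d(y)."""
    return "".join(b + b for b in dx) + "01" + dy

def decode_pair(s: str):
    """Recover (d(x), d(y)): doubled bits read as '00'/'11', never '01'."""
    i, dx = 0, []
    while s[i:i + 2] != "01":
        dx.append(s[i])
        i += 2
    return "".join(dx), s[i + 2:]

enc = encode_pair("101", "0011")            # -> '110011' + '01' + '0011'
assert decode_pair(enc) == ("101", "0011")
print(enc, len(enc))                        # 2*3 + 2 + 4 = 12 bits total
```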
Incompressible strings and randomness • Let x be a string. Say that x is c-compressible if K(x) ≤ |x| − c. If x is not c-compressible, say that x is incompressible by c. If x is incompressible by 1, say that x is incompressible. • Theorem: incompressible strings of every length exist. Proof: The number of binary strings of length n is 2^n. Each description is a nonempty binary string, so the number of descriptions of length less than n is at most the sum of the number of strings of each length up to n−1, or Σ_{i=0}^{n−1} 2^i = 2^n − 1. Since each short description describes at most one string, some string of length n has no description of length less than n; that string is incompressible.
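The counting argument can be checked numerically; a small sketch tabulating both sides of the inequality:

```python
# 2**n strings of length n vs. at most 2**n - 1 descriptions shorter
# than n bits: the gap guarantees an incompressible string at every n.
for n in range(1, 8):
    strings = 2 ** n
    short_descs = sum(2 ** i for i in range(n))   # lengths 0 .. n-1
    print(f"n={n}: strings={strings}, short descriptions<={short_descs}")
```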
Incompressible strings and randomness • Theorem: for some constant b, for every string x, the minimal description d(x) of x is incompressible by b. Proof: Consider the following TM M: M = "On input <R, y>, where R is a TM and y is a string: 1. Run R on y and reject if its output is not of the form <S, z>. 2. Run S on z and halt with its output on the tape." Let b be |<M>| + 1. We show that b satisfies the theorem. Suppose to the contrary that d(x) is b-compressible for some string x. Then |d(d(x))| ≤ |d(x)| − b. But then <M>d(d(x)) is a description of x whose length is at most |<M>| + |d(d(x))| ≤ |<M>| + |d(x)| − b = |d(x)| − 1. This description of x is shorter than d(x), contradicting the latter's minimality.
Randomness • Random • Complexity theory: a string S is random for a UTM T if K_T(S) ≥ |S| − c, where c is a small constant chosen to impose a 'significance' requirement on the concept of non-randomness • Work on applying the algorithmic theory of randomness to practical problems of computer learning is under way in the Computer Learning Research Centre at Royal Holloway, University of London. More details can be found at http://www.clrc.rhbnc.ac.uk
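The constant c quantifies significance: by the same counting argument as on the previous slides (a standard corollary, not stated in the presentation), c-compressible strings make up only a tiny fraction of all n-bit strings:

```latex
\[
  \#\{\, x \in \{0,1\}^n : K(x) \le n - c \,\}
  \;\le\; \sum_{i=0}^{\,n-c} 2^i \;=\; 2^{\,n-c+1} - 1
  \;<\; 2^{\,1-c} \cdot 2^n ,
\]
% so fewer than a 2^{1-c} fraction of n-bit strings fail the randomness
% test K_T(S) >= |S| - c; for c = 10, that is fewer than 0.2% of strings.
```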