
Section 13.2 The Church-Turing Thesis






  1. Section 13.2 The Church-Turing Thesis
The Church-Turing Thesis: Anything that is intuitively computable can be computed by a Turing machine. It is a thesis rather than a theorem because it relates the informal notion of intuitively computable to the formal notion of a Turing machine.
Computational Models
A computational model is a characterization of a computing process that describes the form of a program and how its instructions are executed.
Example. The Turing machine computational model describes the form of TM instructions and how to execute them.
Example. If X is a programming language, the X computational model describes the form of an X program and how each instruction is executed.
Equivalence of Computational Models
Two computational models are equivalent in power if they solve the same class of problems. Any piece of data for a program can be represented by a string of symbols, and any string of symbols can be represented by a natural number (see the sketch after this slide). So even though computational models may process different kinds of data, they can still be compared with respect to how they process natural numbers. We make the assumption that an unlimited amount of memory is available, so we can represent any natural number or any finite string. Each of the following models of computation is equal in power to the TM model.
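To make the string-to-number claim concrete, here is a small Python sketch (not from the slides; the names encode and decode are illustrative) of a bijection between strings over a finite alphabet and the natural numbers, using bijective base-k numbering.

    def encode(s: str, alphabet: str = "ab") -> int:
        """Map a string over `alphabet` to a unique natural number."""
        k = len(alphabet)
        n = 0
        for ch in s:
            n = n * k + alphabet.index(ch) + 1   # digits run 1..k (bijective base k)
        return n

    def decode(n: int, alphabet: str = "ab") -> str:
        """Inverse of encode: recover the string from its code."""
        k = len(alphabet)
        s = []
        while n > 0:
            n, d = divmod(n - 1, k)              # undo the 1..k digit shift
            s.append(alphabet[d])
        return "".join(reversed(s))

    assert all(decode(encode(w)) == w for w in ["", "a", "b", "ab", "bba"])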

  2. The Simple Programming Language
• This imperative programming model processes natural numbers. The language is defined as follows:
• Variables have type N.
• Assignment statements: X := 0; X := succ(Y); X := pred(Y). (Assume pred(0) = 0.)
• Composition of statements: S1; S2.
• while X ≠ 0 do S od.
• This simple language model has the same power as the Turing machine model.
• For input and output, use the values of the variables before and after execution.
• Example/Quiz. To demonstrate the power of this language, define the following macros (a simulation sketch follows this slide).
Some Macros
Macro: X := Y
Expansion: X := succ(Y); X := pred(X).
Macro: X := 2
Expansion: X := 0; X := succ(X); X := succ(X).
Macro: Z := Z + X
Expansion: C := X; while C ≠ 0 do Z := succ(Z); C := pred(C) od.
Macro: Z := X + Y
Expansion: Z := X; C := Y; while C ≠ 0 do Z := succ(Z); C := pred(C) od.
Macro: Z := X monus Y
Expansion: Z := X; C := Y; while C ≠ 0 do Z := pred(Z); C := pred(C) od.
Macro: while X < Y do S od
Expansion: T := Y monus X; while T ≠ 0 do S; T := Y monus X od.
Macro: if X ≠ 0 then S1 else S2 fi
Expansion: U := X; V := 1; while U ≠ 0 do S1; V := 0; U := 0 od; while V ≠ 0 do S2; V := 0 od.
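As a rough illustration of how the macros expand into the primitive statements, the following Python sketch (not part of the slides; succ, pred, add, and monus are illustrative names) mimics the expansions of Z := X + Y and Z := X monus Y using only succ, pred, and a while-≠-0 loop.

    def succ(n): return n + 1
    def pred(n): return max(n - 1, 0)       # the slides assume pred(0) = 0

    def add(x, y):
        """Expansion of the macro Z := X + Y."""
        Z = x                               # Z := X
        C = y                               # C := Y
        while C != 0:                       # while C ≠ 0 do
            Z = succ(Z)                     #   Z := succ(Z)
            C = pred(C)                     #   C := pred(C)
        return Z                            # od

    def monus(x, y):
        """Expansion of the macro Z := X monus Y."""
        Z, C = x, y
        while C != 0:
            Z = pred(Z)
            C = pred(C)
        return Z

    assert add(3, 4) == 7 and monus(2, 5) == 0 and monus(5, 2) == 3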

  3. Partial Recursive Functions
• This model consists of a set of functions that take natural numbers as arguments and as values. The functions are defined as follows, where x can represent zero or more arguments.
• Initial functions: zero(x) = 0, succ(n) = n + 1, and projections (e.g., p2(a, b, c) = b).
• Composition: e.g., ƒ(x) = h(g1(x), …, gm(x)), where h, g1, …, gm are partial recursive.
• Primitive recursion:
ƒ(x, 0) = h(x) (h is partial recursive)
ƒ(x, succ(y)) = g(x, y, ƒ(x, y)) (g is partial recursive).
• Unbounded search (minimalization):
ƒ(x) = min(y, g(x, y) = 0) (g is total and partial recursive).
This means that ƒ(x) = y is the minimum y such that g(x, y) = 0, if such a y exists.
• This model for constructing functions has the same power as the Turing machine model.
• Example. The predecessor and monus functions are partial recursive as follows:
pred(0) = 0
pred(succ(y)) = y, which can be written in the form p1(y, pred(y)).
monus(x, 0) = x
monus(x, succ(y)) = pred(monus(x, y)), which can be written pred(p3(x, y, monus(x, y))).
• Quiz. Show that pred(monus(x, y)) = monus(pred(x), y).
• Proof: The equation is true if x = 0 or y = 0. So assume x > 0 and y > 0. Then we have:
pred(monus(x, y)) = if x > y then x – y – 1 else 0
monus(pred(x), y) = if x – 1 > y then x – 1 – y else 0.
Although the tests x > y and x – 1 > y are different, the if-then-else values are equal: in the only case where they disagree (x = y + 1), both sides are 0. QED.
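The slide's definitions of pred and monus, together with unbounded search, can be sketched directly in Python; the helper minimize below is an illustrative stand-in for min(y, g(x, y) = 0) and may loop forever when no such y exists.

    def pred(y):
        return 0 if y == 0 else y - 1       # pred(0) = 0, pred(succ(y)) = y

    def monus(x, y):
        # monus(x, 0) = x;  monus(x, succ(y)) = pred(monus(x, y))
        return x if y == 0 else pred(monus(x, y - 1))

    def minimize(g, *x):
        """Unbounded search: the least y with g(x, y) = 0 (may not terminate)."""
        y = 0
        while g(*x, y) != 0:
            y += 1
        return y

    # quiz check: pred(monus(x, y)) = monus(pred(x), y) on a few values
    assert all(pred(monus(x, y)) == monus(pred(x), y)
               for x in range(6) for y in range(6))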

  4. Quiz. Let p and q be partial recursive with p(x), q(x) ∈ {0, 1}, for false and true. Show that the logical operations p(x) ∧ q(x), ¬p(x), and p(x) ∨ q(x) are partial recursive functions.
Solutions:
p(x) ∧ q(x) = p(x)q(x)
¬p(x) = monus(1, p(x))
p(x) ∨ q(x) = p(x) + monus(1, p(x))q(x).
Examples (Unbounded Search, Minimalization)
ƒ(x) = min(y, xy = 0) defines ƒ(x) = 0.
ƒ(x) = min(y, x + y = 0) defines ƒ(x) = if x = 0 then 0 else undefined.
ƒ(x) = min(y, monus(x, y) = 0) defines ƒ(x) = x.
ƒ(x) = min(y, monus(y, x) = 0) defines ƒ(x) = 0.
ƒ(x, y) = min(z, monus(x + z, y) = 0) defines ƒ(x, y) = if x ≤ y then 0 else undefined.
Example/Quiz. For y ≠ 0, let ƒ(x, y) = min(z, monus(x, yz) = 0). What function does ƒ compute?
Answer: ƒ(x, y) = ⌈x/y⌉. To see this, notice that the definition ƒ(x, y) = min(z, monus(x, yz) = 0) means that ƒ(x, y) = z is the smallest natural number such that monus(x, yz) = 0. The definition of monus tells us that monus(x, yz) = 0 means that x ≤ yz. So z is the smallest natural number such that x ≤ yz. In other words, we have y(z – 1) < x ≤ yz. Divide by y to obtain z – 1 < x/y ≤ z. Therefore z = ⌈x/y⌉.
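Continuing the sketch from the previous slide's code (and assuming its monus and minimize helpers), the minimalization examples and the logical operations can be spot-checked like this:

    def ceil_div(x, y):
        """ƒ(x, y) = min(z, monus(x, y*z) = 0), i.e. the ceiling of x/y for y ≠ 0."""
        return minimize(lambda x, y, z: monus(x, y * z), x, y)

    assert ceil_div(7, 2) == 4 and ceil_div(8, 2) == 4 and ceil_div(0, 3) == 0
    assert minimize(lambda x, y: monus(x, y), 5) == 5        # ƒ(x) = x
    assert minimize(lambda x, y: monus(y, x), 5) == 0        # ƒ(x) = 0

    # the quiz's logical operations on 0/1 values
    AND = lambda p, q: p * q
    NOT = lambda p: monus(1, p)
    OR  = lambda p, q: p + monus(1, p) * q
    assert [OR(p, q) for p in (0, 1) for q in (0, 1)] == [0, 1, 1, 1]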

  5. Markov Algorithms
This model processes strings. An algorithm consists of a finite, ordered sequence of productions of the form x → y, where x, y ∈ A* for some alphabet A. Any production can be suffixed with (halt), although this is not required.
Execution: Given an input string w ∈ A*, perform the following execution step repeatedly. Scan the productions x → y sequentially to see whether x occurs as a substring of w. If so, replace the leftmost occurrence of x in w by y and reset w to this string; otherwise halt. If the applied production x → y is labeled with (halt), then halt.
Assumption: w = Λw, where Λ is the empty string. So a production of the form Λ → y would transform w to yw.
The Markov algorithm model has the same power as the Turing machine model.
Example. The Markov algorithm consisting of the single production a → Λ will delete all a's from any string.
Example. A more instructive Markov algorithm to delete all a's from any string over {a, b} can be written as the following sequence of productions (# is an extra symbol; an interpreter sketch follows this slide):
1. #a → #
2. #b → b#
3. # → Λ (halt)
4. Λ → #.
An example trace: abab (input), #abab (by 4), #bab (by 1), b#ab (by 2), b#b (by 1), bb# (by 2), bb (by 3, halt).
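A minimal Python sketch of a Markov algorithm interpreter (not from the slides; run_markov and the (lhs, rhs, halt) rule format are illustrative), used here to replay the trace above. Λ is written as the empty string "".

    def run_markov(rules, w, max_steps=10_000):
        for _ in range(max_steps):
            for lhs, rhs, halt in rules:
                if lhs in w:                       # leftmost occurrence of lhs
                    w = w.replace(lhs, rhs, 1)
                    if halt:
                        return w
                    break                          # restart the scan from rule 1
            else:
                return w                           # no production applies: halt
        raise RuntimeError("step limit exceeded")

    # the slide's algorithm for deleting all a's from a string over {a, b}
    delete_a = [("#a", "#", False), ("#b", "b#", False),
                ("#", "", True),    ("", "#", False)]
    assert run_markov(delete_a, "abab") == "bb"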

  6. Quiz (1 minute). Find a Markov algorithm to replace each a with aa in strings over {a, b}.
Answer. Modify the previous example:
1. #a → aa#
2. #b → b#
3. # → Λ (halt)
4. Λ → #.
Example/Quiz. Find a Markov algorithm to delete the rightmost b from strings over {a, b}.
Solution:
1. #a → a#
2. #b → b#
3. # → @
4. a@ → @a
5. b@ → Λ (halt)
6. @ → Λ (halt)
7. Λ → #.
Example/Quiz. Find a Markov algorithm to implement succ(x), where x is a natural number represented in binary. Assume no leading zeros (except 0 itself).
Solution (see the check after this slide):
1. #0 → 0#
2. #1 → 1#
3. 0# → 1 (halt)
4. 1# → @0
5. 1@ → @0
6. 0@ → 1 (halt)
7. @ → 1 (halt)
8. Λ → #.
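Assuming the run_markov sketch from the previous slide, the binary successor algorithm can be checked on a sample input (rule order matters, since productions are scanned in sequence):

    succ_bin = [("#0", "0#", False), ("#1", "1#", False),
                ("0#", "1", True),   ("1#", "@0", False),
                ("1@", "@0", False), ("0@", "1", True),
                ("@", "1", True),    ("", "#", False)]
    assert run_markov(succ_bin, "1011") == "1100"   # 11 + 1 = 12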

  7. Post Algorithms
This model processes strings. An algorithm consists of a finite set of productions of the form s → t, where s and t are strings over the union of an input alphabet A with a set of variables and other symbols. Restriction: if a variable X occurs in t, then X occurs in s. A production can be suffixed with (halt), although this is not required.
Execution: Given an input string w ∈ A*, perform the following execution step repeatedly. Find a production x → y such that w matches x. If so, use the match and y to construct a new string w; otherwise halt. If the applied production x → y is labeled with (halt), then halt.
Assumption: A variable may match Λ.
Example. If the input string is 1, then 1 matches 1X. So a production like 1X → 1X0 would transform 1 into 10.
The Post algorithm model has the same power as the Turing machine model.
Example. A Post algorithm with the single production XaY → XY will delete all a's from any string. Notice the nondeterminism (a matching sketch follows this slide).
Example. A Post algorithm to replace each a with aa in strings over {a, b}.
Solution:
1. aX → @aa#X
2. bX → @b#X
3. X#aY → Xaa#Y
4. X#bY → Xb#Y
5. @X# → X (halt).
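Post productions match the whole input string, with variables standing for arbitrary (possibly empty) substrings. A rough Python sketch of a single step, assuming single-letter uppercase variables that each occur at most once on the left side (post_step is an illustrative name):

    import re

    def post_step(lhs, rhs, w):
        """Apply production lhs -> rhs to w; return None if w does not match lhs."""
        # Each variable becomes a named group matching any (possibly empty) substring;
        # other characters are taken literally (assumed not to be regex metacharacters).
        pattern = "^" + re.sub(r"[A-Z]", lambda m: f"(?P<{m.group(0)}>.*)", lhs) + "$"
        m = re.match(pattern, w)
        if m is None:
            return None
        # Substitute the matched values for the variables in rhs.
        return re.sub(r"[A-Z]", lambda v: m.group(v.group(0)), rhs)

    # the slide's one-production algorithm XaY -> XY deletes one a per step
    w = "abab"
    while (step := post_step("XaY", "XY", w)) is not None:
        w = step
    assert w == "bb"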

  8. Example. A Post algorithm to delete the rightmost b from any string over {a, b}.
Solution:
1. Xb → X (halt)
2. Xa → X#a@
3. Xa#Y → X#aY
4. Xb#Y@ → XY (halt)
5. #X@ → X (halt).
Example/Quiz. Find a Post algorithm to implement succ(x), where x is a natural number represented in binary. Assume no leading zeros (except 0 itself).
Solution:
1. X0 → X1 (halt)
2. X1 → X#0#
3. X1#Y# → X#0Y#
4. X0#Y# → X1Y (halt)
5. #Y# → 1Y (halt).
Post Systems
This model generates a set of strings from axioms (a given set of strings) and inference rules (a given set of productions that map strings to strings by matching).
Execution: Match the left side of a production with an axiom string or a string that has already been constructed. Then use the match and the right side to construct a new string.
Example. A Post system to generate the binary representations of natural numbers (a generator sketch follows this slide):
Axioms: 0, 1
Inference rules: 1X → 1X0, 1X → 1X1.
The system generates the set {0, 1, 10, 11, 100, 101, 110, 111, … }.
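A hedged sketch of generation by breadth-first rule application, reusing the post_step helper from the previous slide's code (generate and its limit parameter are illustrative):

    from collections import deque

    def generate(axioms, rules, limit=15):
        """Breadth-first generation of strings from axioms by the inference rules."""
        seen, queue, out = set(axioms), deque(axioms), []
        while queue and len(out) < limit:
            w = queue.popleft()
            out.append(w)
            for lhs, rhs in rules:
                new = post_step(lhs, rhs, w)
                if new is not None and new not in seen:
                    seen.add(new)
                    queue.append(new)
        return out

    print(generate(["0", "1"], [("1X", "1X0"), ("1X", "1X1")]))
    # ['0', '1', '10', '11', '100', '101', '110', '111', '1000', ...]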

  9. Quiz. Find a Post system to generate {a}*.
Solution: Axiom: Λ. Inference rule: X → aX.
Quiz. Find a Post system to generate the set {a^n b^n c^n | n ∈ N}.
One of many solutions: Axioms: Λ, abc. Inference rule: aXbcY → aaXbbccY.
The Post system model has the same power as the Turing machine model in the following sense: a function ƒ : A* → A* is Post-computable if there is a Post system to compute the function as a set of ordered pairs in the form {x#ƒ(x) | x ∈ A*}. Post-computable functions coincide with Turing-computable functions.
Example/Quiz. Generate the function ƒ : {a}* → {a}* where ƒ(a^n) = a^(2n).
Solution: Axiom: Λ#Λ. Inference rule: X#Y → Xa#Yaa (or the simpler X → aXaa).
This system generates the set {Λ#Λ, a#aa, aa#aaaa, …, a^n#a^(2n), … }.
Example/Quiz. Generate the function ƒ : N → N defined by ƒ(n) = n^2, where n is represented as a string over {a} of length n.
A solution: Axiom: Λ#Λ. Inference rule: X#Y → Xa#YXXa.
Proof: In terms of natural numbers, from the pair n#n^2 we must construct (n + 1)#(n + 1)^2. We can write (n + 1)^2 = n^2 + 2n + 1 = n^2 + n + n + 1. So from n#n^2 we must get (n + 1)#(n^2 + n + n + 1). In terms of strings over {a}, this transforms X#Y into Xa#YXXa: Xa is one a longer than X, and YXXa appends n + n + 1 more a's to Y. QED.
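Assuming the post_step and generate sketches from the two previous slides, the systems on this slide can be spot-checked (Λ is written as the empty string ""):

    print(generate(["", "abc"], [("aXbcY", "aaXbbccY")], limit=4))
    # ['', 'abc', 'aabbcc', 'aaabbbccc']

    print(generate(["#"], [("X#Y", "Xa#YXXa")], limit=4))
    # ['#', 'a#a', 'aa#aaaa', 'aaa#aaaaaaaaa']   (the pairs n#n^2 as strings of a's)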
