
Computational Models – Lecture 7

Turing Machine – More Definition and Examples

Notion of an Algorithm

Hilbert’s Tenth Problem

Decidability of DFA and PDA Questions

Slides modified by Benny Chor, based on original slides by Maurice Herlihy, Brown University.

Non-Deterministic Turing Machines (reminder)

Transition function:

δ : Q × Γ → P(Q × Γ × {L, R})

Computation is a tree.

Accepts if there is (∃) an accepting branch.


Equivalence

Theorem: A language is enumerable if and only if there is some non-deterministic Turing machine that accepts it.

One direction is trivial: a deterministic TM is a special case of a non-deterministic one.

To prove the other direction, we will show how to convert a non-deterministic TM, N, into an equivalent deterministic TM, D.

D will simulate N.


Simulating Non-Determinism

Basic idea:

D tries all possible branches.

If D finds any accepting branch, it accepts.

If all branches reject, D rejects.

If all branches reject or loop, D loops.

Of course, D does not “know” this (loop) is the case. It just “follows” N.


Simulating Non-Determinism (2)

[Figure: D’s three tapes: the input tape, the simulation tape, and the address tape.]

D has three tapes:

the input tape is never altered

the simulation tape is a copy of N’s tape

the address tape keeps track of D’s location in N’s computation tree.
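To make the simulation concrete, here is a minimal Python sketch of the idea, with breadth-first search over N’s computation tree playing the role of the address-tape bookkeeping. The names start_config, step and is_accepting are assumed interfaces for a toy machine N, not anything from the slides: step(c) returns the successor configurations of c (the nondeterministic choices), and is_accepting(c) tells whether c is accepting.

from collections import deque

def simulate_ntm(start_config, step, is_accepting):
    # Breadth-first search of N's computation tree.
    # If some branch loops and no branch accepts, the loop below never ends,
    # which mirrors the behaviour of D described above.
    frontier = deque([start_config])
    while frontier:
        config = frontier.popleft()
        if is_accepting(config):
            return True                  # some branch accepts, so D accepts
        frontier.extend(step(config))    # enqueue all nondeterministic successors
    return False                         # every branch halted without accepting, so D rejects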


Deciders

Definition: A non-deterministic TM is a decider if on all inputs, all branches halt (in either state qa or qr).

Definition (reminder): A language is decidable if some deterministic Turing machine decides it.

Theorem: A language is decidable if and only if there is a non-deterministic Turing machine that decides it.


Enumerators

A language is enumerable if it is accepted by some Turing machine. But why “enumerable”?

[Figure: a TM connected to a printer, printing strings such as 1, 010, aaabba.]

Definition: An enumerator is a TM with a printer.

TM sends strings to printer

may create infinite list of strings

TM enumerates a language – all strings produced.
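As a concrete picture, the printer can be thought of as an output stream. In the Python sketch below, a generator stands in for the enumerator and yields every string over a two-letter alphabet in shortlex order; like the printer, it produces an infinite list and never halts. The function name and the choice of alphabet are illustrative assumptions.

from itertools import count, product

def enumerate_all_strings(alphabet=("0", "1")):
    # An "enumerator" for Sigma*: yields every string over the alphabet,
    # shortest strings first, and runs forever.
    yield ""                                  # the empty string first
    for length in count(1):
        for letters in product(alphabet, repeat=length):
            yield "".join(letters)

# for s in enumerate_all_strings(): print(s)   # prints "", 0, 1, 00, 01, ... forever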


Theorem

Theorem: A language is accepted by some Turing machine if and only if some enumerator enumerates it.

Will show:

If E enumerates language A, then some TM M accepts A.

If M accepts A, then some enumerator E enumerates it.


Theorem

Claim: If E enumerates language A, then some TM M accepts A.

On input w, TM M:

Runs E. Every time E outputs a string v, M compares it to w.

If v = w, M accepts.

If v ≠ w, M continues running E.
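A Python sketch of this acceptor. Here enumerator is assumed to be a generator function such as the one sketched earlier, and accept_via_enumerator is an illustrative name, not notation from the slides.

def accept_via_enumerator(w, enumerator):
    # TM M on input w: run E and compare every string it prints to w.
    # If w is never printed, this loop never returns True, matching M,
    # which accepts exactly the strings that E eventually prints.
    for v in enumerator():        # E's stream of printed strings
        if v == w:
            return True           # v = w: accept
        # v != w: keep running E

# accept_via_enumerator("010", enumerate_all_strings)  -> True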


Theorem

Claim: If M accepts A, then some enumerator E enumerates it.

Let s1, s2, s3, . . . be a list of all strings in Σ∗ (e.g. strings in lexicographic order). The enumerator, E:

repeat the following for i = 1, 2, 3, . . .

run M for i steps on each input s1, s2, . . . , si.

if any computation accepts, print out the corresponding s. ♣

Note that with this procedure, each output is duplicated infinitely often.

Think: how can this duplication be avoided?
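A Python sketch of this dovetailing procedure. The interface accepts_within(s, i), meaning “M, run on s for i steps, has accepted”, and the generator all_strings of s1, s2, s3, . . . (e.g. the string generator sketched earlier) are assumptions made for the sketch.

from itertools import count, islice

def enumerate_language(accepts_within, all_strings):
    # Enumerator E for the language of M; yield plays the role of the printer.
    # As noted above, every member of the language gets printed infinitely often.
    for i in count(1):                            # i = 1, 2, 3, ...
        for s in islice(all_strings(), i):        # s1, ..., si
            if accepts_within(s, i):              # M accepts s within i steps
                yield s                           # print the corresponding s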


Decidability vs. Enumerability

Decidability is a stronger notion than enumerability.

If a language L is decidable then clearly it is enumerable (the other direction does not hold, as we’ll show in a couple of lectures).

It is also clear that if L is decidable then so is its complement L̄, and thus L̄ is also enumerable.

Let RE denote the class of enumerable languages, and let coRE denote the class of languages whose complement is enumerable.

Let R denote the class of decidable languages. Then what we just saw is R ⊆ RE ∩ coRE.


Decidability vs. Enumerability (2)

Theorem: R = RE ∩ coRE.

Proof: We should prove the ⊇ direction. Namely, if L ∈ RE ∩ coRE, then L ∈ R.

In other words, if both L and its complement are enumerable, then L is decidable.

Let M1 be a TM that accepts L.

Let M2 be a TM that accepts L̄, the complement of L.

We describe a TM, M, that decides L.

On input x, M runs M1 and M2 in parallel.

If M1 accepts, M accepts.

If M2 accepts, M rejects.

Should now show that indeed M decides L. Not too hard... ♣
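A Python sketch of the decider M. The step-bounded interfaces m1_accepts_within(x, i) and m2_accepts_within(x, i), meaning that the corresponding acceptor has accepted x within i steps, are assumptions made for the sketch.

from itertools import count

def decide(x, m1_accepts_within, m2_accepts_within):
    # Run M1 (accepting L) and M2 (accepting the complement of L) "in parallel",
    # giving each one more step per round.  Since x lies in L or in its complement,
    # one of the two eventually accepts, so the loop terminates on every input.
    for i in count(1):
        if m1_accepts_within(x, i):
            return True       # M1 accepted, so x is in L: M accepts
        if m2_accepts_within(x, i):
            return False      # M2 accepted, so x is not in L: M rejects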


Some Perspective

Many models have been proposed for general-purpose computation.

Remarkably, all “reasonable” models are equivalent to Turing machines.

All “reasonable” programming languages (e.g. Java, Pascal, C, Scheme, Mathematica, Maple, Cobol, . . . ) are equivalent.

The notion of an algorithm is model-independent!

We don’t really care about Turing machines per se, we care about understanding computation.


What is an Algorithm?

Informally:

a recipe

a procedure

a computer program

who cares? I know it when I see it

Historically,

the notion has a long history in Mathematics (starting with Euclid’s gcd algorithm), but

it was not precisely defined until the 20th century

informal notions were rarely questioned,

still, they were insufficient


Dilbert’s Problems

Too much beer last night? It is D. Hilbert’s problems, not Dilbert’s problems, that we are supposed to talk about...


Hilbert’s 10th Problem

In 1900, David Hilbert delivered a now-famous address at the International Congress of Mathematicians in Paris, France.

Presented 23 central mathematical problems

a challenge for the next (20th) century

the 10th problem directly concerned algorithms

In November 2003, significant progress was made on the 6th problem.

But it is the 10th problem we care about. We will start with some background.


Polynomials

A term is a product of variables and a constant coefficient, e.g. 6x³yz².

A polynomial is a sum of terms, e.g. 6x³yz² + 3xy² − x³ − 10.

A root of a polynomial is an assignment of values to variables so that the polynomial equals zero.

For example, x = 5, y = 3, and z = 0 is a root of the polynomial above.

Here, we are interested in integral roots, namely an assignment of integers to all variables.

Some polynomials have integral roots, some don’t (e.g. x² − 2).
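A quick check of the claimed root, writing the example polynomial as a Python function:

def p(x, y, z):
    return 6 * x**3 * y * z**2 + 3 * x * y**2 - x**3 - 10

assert p(5, 3, 0) == 0    # 0 + 135 - 125 - 10 = 0, so x = 5, y = 3, z = 0 is an integral root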


Hilbert’s Tenth Problem

The Problem: Devise an algorithm that tests whether a polynomial has an integral root.

Actually, what he said (translated from German) was

“to devise a process according to which it can be determined by a finite number of operations”.

Note that

Hilbert explicitly asks that an algorithm be “devised”

apparently Hilbert assumes that such an algorithm must exist, and someone “only” needs to find it.


Hilbert’s Tenth Problem

We now know no algorithm exists for this task.

Mathematicians of 1900 could not have proved this, because they didn’t have a formal notion of an algorithm.

Intuitive notions work fine for constructing algorithms (we know one when we see it).

Formal notions are required to show that no algorithm exists.


Church-Turing Thesis

Formal notions appeared in 1936:

λ-calculus of Alonzo Church

Turing machines of Alan Turing

Recursive functions of Stephen Kleene

Correspondence systems of Emil Post

These definitions look very different and have very different characteristics, yet they are provably equivalent.

Church-Turing Thesis:

“The intuitive notion of algorithms equals Turing machine algorithms”.


Hilbert’s Tenth Problem

In 1970, the 23-year-old Yuri Matijasevič, building on work of Martin Davis, Hilary Putnam, and Julia Robinson, proved that no algorithm exists for testing whether a polynomial has integral roots (a survey of the proof).

Reformulating Hilbert’s Tenth Problem

Consider the language:

D = {p | p is a polynomial with an integral root}

Hilbert’s tenth problem asks whether this language is decidable.

We now know it is not decidable, but it is enumerable!


Univariate Polynomials

Consider the simpler language:

D1 = {p | p is a polynomial over x with an integral root}

Here is a Turing machine, M1, that accepts D1. On input p,

evaluate p with x set successively to 0, 1, −1, 2, −2, . . .

if p evaluates to zero, accept.
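A Python sketch of this acceptor. The polynomial is assumed to be given as a coefficient list [c0, c1, . . . , cd], meaning c0 + c1·x + · · · + cd·x^d; like the machine above, the function returns True when a root is found and otherwise runs forever.

from itertools import count

def integer_candidates():
    # yields 0, 1, -1, 2, -2, ... forever
    yield 0
    for n in count(1):
        yield n
        yield -n

def accepts_D1(coeffs):
    for x in integer_candidates():
        if sum(c * x**j for j, c in enumerate(coeffs)) == 0:
            return True       # p evaluates to zero: accept

# accepts_D1([-4, 0, 1])  -> True, since x**2 - 4 has the integral root 2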


Univariate Polynomials (2)

D1 = {p | p is a polynomial over x with an integral root}

Note that

If p has an integral root, the machine accepts.

If not, M1 loops.

M1 is an acceptor, but not a decider.



Univariate Polynomials (4)

In fact, D1 is decidable.

Can show that all real roots of p(x) lie inside the interval

( −|k · cmax / c1| , |k · cmax / c1| ),

where k is the number of terms, cmax is the coefficient of largest absolute value, and c1 is the high-order (leading) coefficient.

By Matijasevič’s theorem, such effective bounds on the range of real roots cannot be computed for multivariable polynomials.
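A Python sketch of such a decider, using the bound quoted above. The coefficient-list encoding is the same assumption as before, and the sketch additionally assumes a nonzero leading coefficient.

def decides_D1(coeffs):
    k = sum(1 for c in coeffs if c != 0)       # number of terms
    cmax = max(abs(c) for c in coeffs)         # largest coefficient in absolute value
    c1 = coeffs[-1]                            # high-order coefficient (assumed nonzero)
    bound = k * cmax // abs(c1) + 1            # integers outside [-bound, bound] cannot be roots
    for x in range(-bound, bound + 1):
        if sum(c * x**j for j, c in enumerate(coeffs)) == 0:
            return True
    return False                               # bounded search exhausted: no integral root

# decides_D1([-2, 0, 1])  -> False, since x**2 - 2 has no integral root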


Wild Models

What about “unreasonable” models of computation? Consider MUntel’s ℵ-AXP© processor (to be released XMAS 2003).

Like a Turing machine, except

Takes first step in 1 second.

Takes second step in 1/2 second.

Takes i-th step in 2^(−i) seconds . . .

After 2 seconds, the ℵ-AXP© decides any enumerable language!

Question: Does the ℵ-AXP© invalidate the Church-Turing Thesis?


Encoding

Input to a Turing machine is a string of symbols.

But we want algorithms that work on graphs, matrices, polynomials, Turing machines, etc.

Need to choose an encoding for objects.

Can often be done in many reasonable ways.

Sometimes distinguish between X, the object, and 〈X〉, its encoding.


Encoding

Consider strings representing undirected graphs.

A graph is connected if every node can be reached from any other node by traveling along edges.

Define the language:

A = {〈G〉 | G is a connected undirected graph}

High-Level Description

High-level description of a machine that decides

A = {〈G〉 | G is a connected undirected graph}

On input 〈G〉, the encoding of a graph G:

select the first node of G and mark it.

repeat until no new nodes are marked:

for each node in G, mark it if it is attached by an edge to a node already marked.

scan the nodes of G to determine whether they are all marked. If so, accept; otherwise reject.
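A Python sketch of this marking procedure, run on an already-decoded graph (nodes as a list, edges as a list of pairs); the treatment of the empty graph is an assumption of the sketch.

def is_connected(nodes, edges):
    if not nodes:
        return True                       # assume the empty graph counts as connected
    marked = {nodes[0]}                   # step 1: mark the first node
    changed = True
    while changed:                        # step 2: repeat until no new nodes are marked
        changed = False
        for (u, v) in edges:
            if u in marked and v not in marked:
                marked.add(v)
                changed = True
            if v in marked and u not in marked:
                marked.add(u)
                changed = True
    return set(nodes) <= marked           # step 3: accept iff every node is marked

# is_connected([1, 2, 3, 4], [(1, 2), (2, 3), (3, 1), (1, 4)])  -> True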


Some Details

Question: How is G encoded?

[Figure: a graph G with nodes 1, 2, 3, 4 and edges (1,2), (2,3), (3,1), (1,4), and its encoding 〈G〉 = (1,2,3,4)((1,2),(2,3),(3,1),(1,4)), i.e. the list of vertexes followed by the list of edges.]

Answer: A list of nodes, followed by a list of edges.

More Details

On input, M first checks that the input is a valid graph encoding:

two lists

the first is a list of numbers

the second is a list of pairs

the first list contains no duplicates (element distinctness subroutine)

every node in the second list appears in the first

Now M is ready to start “step one”.
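A Python sketch of this validity check, for encodings shaped like the one on the previous slide, e.g. (1,2,3,4)((1,2),(2,3),(3,1),(1,4)). The function name and the use of regular expressions are implementation choices made for the sketch, not part of the slides’ machine.

import re

def is_valid_graph_encoding(s):
    # two lists: a parenthesised list of numbers, then a parenthesised list of pairs
    m = re.fullmatch(r"\((\d+(?:,\d+)*)\)\((\(\d+,\d+\)(?:,\(\d+,\d+\))*)?\)", s)
    if m is None:
        return False
    nodes = [int(x) for x in m.group(1).split(",")]
    edges = re.findall(r"\((\d+),(\d+)\)", m.group(2) or "")
    if len(nodes) != len(set(nodes)):
        return False                      # duplicate node: fails element distinctness
    node_set = set(nodes)
    # every node mentioned in the edge list must appear in the node list
    return all(int(u) in node_set and int(v) in node_set for u, v in edges)

# is_valid_graph_encoding("(1,2,3,4)((1,2),(2,3),(3,1),(1,4))")  -> True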


Detailed Algorithm

On input 〈G〉, the encoding of a graph G:

1. Mark the first node with a dot on its leftmost digit.

2. Loop:

M scans the node list and “underlines” an undotted node n1.

M rescans and “underlines” a dotted node n2.

M scans the edges, testing each edge to see whether it is (n1, n2).

If so, M dots n1, removes the underlines, and goes back to the start of Step 2.

If not, M checks the next edge. When no more edges remain, M moves the underline to the next dotted n2.

When no more dotted nodes remain, M moves the underlines: the new n1 is the next undotted node and the new n2 is the first dotted node, and Step 2 is repeated. When no more undotted nodes remain, go to Step 3.

3. M scans the list of nodes. If all are dotted, accept; otherwise reject.


Decidability of Languages

Question: Why study decidability?

Good for improving your imagination.

Some of the most beautiful and important mathematics of the 20th century, and you can actually understand it!

Your boss (who never took this course...) orders you to solve Hilbert’s 10th, or else.

Slides modified by Benny Chor, based on original slides by Maurice Herlihy, Brown University. – p.34


Examples of Decidable Languages

Finite automata problems can be reformulated as languages.

Does DFA, B, accept input string w?

Consider the language:

ADFA = {〈B,w〉 | B is a DFA that accepts w}

The following are equivalent:

B accepts w

〈B,w〉 ∈ ADFA

Slides modified by Benny Chor, based on original slides by Maurice Herlihy, Brown University. – p.35


Decidability of DFA Acceptance

Theorem: ADFA is a decidable language.

Proof: On input 〈B,w〉, where B is a DFA and w a string:

1. Simulate B on input w.

2. If the simulation ends in an accepting state, accept; otherwise reject. ♣

Remarks

The “where” clause means: scan the input and check that it satisfies the stated condition.

B is represented by a list of its components (Q, Σ, δ, q0, F ).

The simulation is straightforward (see the sketch below).

Slides modified by Benny Chor, based on original slides by Maurice Herlihy, Brown University. – p.36
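A minimal Python sketch of this decider, assuming 〈B,w〉 is given as explicit Python data: a set of states, an alphabet, a transition dict delta mapping (state, symbol) to a state, a start state, and a set of accept states (an assumed encoding, not the one written on the tape).

def dfa_accepts(Q, Sigma, delta, q0, F, w):
    """Simulate the DFA B = (Q, Sigma, delta, q0, F) on the string w
    and report whether the run ends in an accepting state.
    `delta` is a dict mapping (state, symbol) -> state -- an assumed
    concrete stand-in for the encoded transition table."""
    q = q0
    for a in w:
        if a not in Sigma:
            return False          # symbol outside the alphabet: reject
        q = delta[(q, a)]         # follow the unique transition
    return q in F                 # accept iff the run ends in an accept state

# Example: a DFA over {0,1} accepting strings with an even number of 1s.
Q = {"even", "odd"}
Sigma = {"0", "1"}
delta = {("even", "0"): "even", ("even", "1"): "odd",
         ("odd", "0"): "odd",   ("odd", "1"): "even"}
assert dfa_accepts(Q, Sigma, delta, "even", {"even"}, "1001")
assert not dfa_accepts(Q, Sigma, delta, "even", {"even"}, "10")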


Decidability of NFA Acceptance

Does an NFA accept a string?

ANFA = {〈B,w〉|B is an NFA that accepts w}

Theorem: ANFA is a decidable language.

On input 〈B,w〉, where B is an NFA and w a string:

1. Convert the NFA B into an equivalent DFA C.

2. Run the previous TM (the ADFA decider) on input 〈C,w〉.

3. If that TM accepts, accept; otherwise reject. ♣

Note the use of a subroutine in step 2.

Slides modified by Benny Chor, based on original slides by Maurice Herlihy, Brown University. – p.37
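A minimal Python sketch, assuming the NFA is encoded as a dict delta mapping (state, symbol) to a set of states, with the empty string "" standing for ε-moves (this encoding is an assumption of the sketch). Instead of building the DFA C explicitly, it tracks the set of currently possible states, which is the subset construction performed on the fly.

def epsilon_closure(states, delta):
    """All states reachable from `states` using only epsilon-moves
    (epsilon is encoded as the empty string "")."""
    closure, stack = set(states), list(states)
    while stack:
        q = stack.pop()
        for r in delta.get((q, ""), set()):
            if r not in closure:
                closure.add(r)
                stack.append(r)
    return closure

def nfa_accepts(delta, q0, F, w):
    """Simulate the NFA on w by tracking the set of currently possible
    states -- the subset construction done on the fly.
    `delta` maps (state, symbol) -> set of states (assumed encoding)."""
    current = epsilon_closure({q0}, delta)
    for a in w:
        step = set()
        for q in current:
            step |= delta.get((q, a), set())
        current = epsilon_closure(step, delta)
    return bool(current & F)      # accept iff some branch reaches an accept state

# Example: an NFA for strings over {0,1} ending in "01".
delta = {("s", "0"): {"s", "p"}, ("s", "1"): {"s"}, ("p", "1"): {"q"}}
assert nfa_accepts(delta, "s", {"q"}, "1101")
assert not nfa_accepts(delta, "s", {"q"}, "110")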


Decidability of Reg. Exp. Generation

Does a regular expression generate a string?

AREX = {〈R,w〉 | R is a regular expression that generates w}

Theorem: AREX is a decidable language.

On input 〈R,w〉, where R is a regular expression and w a string:

1. Convert regular expression R into DFA C .

2. Run earlier TM on input 〈C,w〉.

3. If that TM accepts, accept, otherwise reject. ♣

Slides modified by Benny Chor, based on original slides by Maurice Herlihy, Brown University. – p.38
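A hedged sketch of a decider for AREX. Rather than converting R into a DFA as in the proof, it decides membership directly by structural recursion on the regular expression; the tuple encoding of R below (node kinds "empty", "eps", "sym", "union", "concat", "star") is an assumption of the sketch, not a notation used in the course.

from functools import lru_cache

def regex_generates(regex, w):
    """Decide whether the regular expression `regex` generates the string w.
    The regex is an assumed AST of tuples:
      ("empty",)         the empty language
      ("eps",)           the empty string
      ("sym", a)         a single symbol a
      ("union", r1, r2)  r1 | r2
      ("concat", r1, r2) r1 r2
      ("star", r)        r*                                            """
    @lru_cache(maxsize=None)
    def match(r, s):
        kind = r[0]
        if kind == "empty":
            return False
        if kind == "eps":
            return s == ""
        if kind == "sym":
            return s == r[1]
        if kind == "union":
            return match(r[1], s) or match(r[2], s)
        if kind == "concat":
            # try every split of s between the two sub-expressions
            return any(match(r[1], s[:i]) and match(r[2], s[i:])
                       for i in range(len(s) + 1))
        if kind == "star":
            if s == "":
                return True
            # peel off a non-empty prefix matched by r, then recurse on the rest
            return any(match(r[1], s[:i]) and match(r, s[i:])
                       for i in range(1, len(s) + 1))
        raise ValueError("unknown regex node: " + repr(kind))
    return match(regex, w)

# Example: (a|b)* generates "abba" but a* does not.
assert regex_generates(("star", ("union", ("sym", "a"), ("sym", "b"))), "abba")
assert not regex_generates(("star", ("sym", "a")), "abba")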


Decidability of DFA Emptiness

Does a DFA accept the empty language, ∅?

Define EDFA = {〈A〉|A is a DFA and L(A) = ∅}

Theorem: EDFA is a decidable language.

On input 〈A〉, where A is a DFA:

1. Mark the start state of A.

2. Repeat until no new states are marked:

3. Mark any state that has a transition coming into it from any already marked state.

4. If no accept state is marked, accept; otherwise reject. ♣

This TM actually just tests whether any accepting state is reachable from the initial state (a reachability problem in digraphs); see the sketch below.

Slides modified by Benny Chor, based on original slides by Maurice Herlihy, Brown University. – p.39
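A minimal Python sketch of the marking procedure, reusing the assumed dict-based DFA encoding from the simulation sketch above.

def dfa_language_empty(Q, Sigma, delta, q0, F):
    """Decide whether L(A) is empty by marking every state reachable
    from the start state and checking whether an accept state got marked.
    Uses the same assumed encoding as the DFA-simulation sketch."""
    marked = {q0}                     # step 1: mark the start state
    changed = True
    while changed:                    # step 2: repeat until nothing new is marked
        changed = False
        for q in list(marked):
            for a in Sigma:
                r = delta[(q, a)]     # step 3: mark any state with a transition
                if r not in marked:   #         coming into it from a marked state
                    marked.add(r)
                    changed = True
    return not (marked & F)           # step 4: accept iff no accept state is marked

# Example: the only accept state is unreachable, so the language is empty.
Q = {"q0", "dead"}
Sigma = {"a"}
delta = {("q0", "a"): "q0", ("dead", "a"): "dead"}
assert dfa_language_empty(Q, Sigma, delta, "q0", {"dead"})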


Decidability of DFA Equivalence

EQDFA = {〈A,B〉 | A,B are DFAs and L(A) = L(B)}

Theorem: EQDFA is a decidable language.

We construct a new DFA C from A and B such that C accepts exactly those strings accepted by A or by B, but not by both. In other words,

L(C) = (L(A) ∩ L(B)ᶜ) ∪ (L(A)ᶜ ∩ L(B)),

where Xᶜ denotes the complement of X.

L(C) is the symmetric difference of L(A) and L(B).

Use the constructions from the closure theorems for regular languages (complement, intersection, union).

The construction can be expressed as a TM (see the sketch below).

Slides modified by Benny Chor, based on original slides by Maurice Herlihy, Brown University. – p.40
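A minimal sketch of the construction of C, assuming both DFAs are total, share the alphabet Sigma, and use the dict-based encoding of the earlier sketches. A product state (p, q) accepts exactly when p and q disagree about acceptance, which yields the symmetric difference directly.

def symmetric_difference_dfa(Q1, delta1, s1, F1, Q2, delta2, s2, F2, Sigma):
    """Build the product DFA C with L(C) equal to the symmetric difference
    of the two languages: a product state (p, q) accepts iff exactly one of
    p in F1, q in F2 holds. Assumes both DFAs share the alphabet Sigma."""
    Q = {(p, q) for p in Q1 for q in Q2}
    delta = {((p, q), a): (delta1[(p, a)], delta2[(q, a)])
             for (p, q) in Q for a in Sigma}
    F = {(p, q) for (p, q) in Q if (p in F1) != (q in F2)}
    return Q, Sigma, delta, (s1, s2), F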


DFA Equivalence (continued)

Theorem: EQDFA is a decidable language.

On input 〈A,B〉, where A,B are DFAs:

1. Construct the DFA C as described.

2. Run the previous “emptiness” TM on input 〈C〉.

3. If that TM accepts, accept; otherwise reject. ♣

Slides modified by Benny Chor, based on original slides by Maurice Herlihy, Brown University. – p.41
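Putting the pieces together gives the full decider; the helper names symmetric_difference_dfa and dfa_language_empty are the ones introduced in the sketches above (assumed names, not part of the slides).

def dfas_equivalent(Q1, delta1, s1, F1, Q2, delta2, s2, F2, Sigma):
    """Decide L(A) = L(B): build the symmetric-difference DFA C and
    test whether L(C) is empty, reusing the sketches above."""
    Q, Sig, delta, q0, F = symmetric_difference_dfa(
        Q1, delta1, s1, F1, Q2, delta2, s2, F2, Sigma)
    return dfa_language_empty(Q, Sig, delta, q0, F)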


Decidability of CFG Generation

Does a CFG generate a given string?

Define ACFG = {〈G,w〉 | string w is generated by CFG G}

Theorem: The language ACFG is decidable.

Slides modified by Benny Chor, based on original slides by Maurice Herlihy, Brown University. – p.42

Decidability of CFG Generation (2)

Does a CFG generate a given string?

Initial Idea: Design a TM, M, to try all derivations.

Problem: M accepts, but does not decide. (Why?)

Slides modified by Benny Chor, based on original slides by Maurice Herlihy, Brown University. – p.43


Decidability of CFG Generation (3)

Lemma: If G is in Chomsky normal form, |w| = n, and w is generated by G, then w has a derivation of length 2n − 1 or less.

We won’t prove this (go ahead — try it at home!).

Algorithm’s idea:

First, convert G to Chomsky normal form.

Now we need only consider a finite number of derivations – those of length 2n − 1 or less.

Slides modified by Benny Chor, based on original slides by Maurice Herlihy, Brown University. – p.44
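A quick counting hint, in case you want a head start on the exercise: in Chomsky normal form every rule has the shape A → BC or A → a, so deriving a string of length n ≥ 1 takes exactly n − 1 applications of rules of the first kind (growing the sentential form from 1 symbol to n symbols) and exactly n applications of rules of the second kind (turning each variable into a terminal), for a total of

\[
  (n - 1) + n \;=\; 2n - 1 .
\]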


Decidability of CFG Generation (4)

Theorem: ACFG is a decidable language.

On input 〈G,w〉, where G is a grammar and w a string:

1. Convert G to Chomsky normal form.

2. List all derivations with 2n − 1 steps, where n = |w| (if n = 0, list all derivations with one step instead).

3. If any of them generates w, accept; otherwise reject. ♣

Remarks:

Related to the problem of compiling programming languages.

Would you want to use this algorithm at work? (See the CYK sketch below for a practical alternative.)

Every theorem about CFLs is also a theorem about PDAs.

Slides modified by Benny Chor, based on original slides by Maurice Herlihy, Brown University. – p.45
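The remark above invites a more practical algorithm. The sketch below is the standard CYK dynamic-programming membership test for a grammar already in Chomsky normal form; it is a deliberate substitute for the brute-force derivation listing in the proof, and the rule encoding (sets of unit and binary rules) is an assumption of the sketch.

def cnf_generates(unit_rules, binary_rules, start, w):
    """CYK membership test for a CFG in Chomsky normal form.
    `unit_rules`   : set of pairs (A, a)      for rules A -> a
    `binary_rules` : set of triples (A, B, C) for rules A -> B C
    (an assumed encoding; an epsilon rule S -> eps is not modelled here).
    Runs in roughly O(n^3 * |G|) time instead of listing all derivations."""
    n = len(w)
    if n == 0:
        return False                                     # empty string not handled by this sketch
    # T[(i, l)] = set of variables that generate the substring w[i : i + l]
    T = {}
    for i in range(n):                                   # substrings of length 1
        T[(i, 1)] = {A for (A, a) in unit_rules if a == w[i]}
    for l in range(2, n + 1):                            # longer substrings
        for i in range(n - l + 1):
            vars_here = set()
            for k in range(1, l):                        # split w[i:i+l] after k symbols
                left, right = T[(i, k)], T[(i + k, l - k)]
                vars_here |= {A for (A, B, C) in binary_rules
                              if B in left and C in right}
            T[(i, l)] = vars_here
    return start in T[(0, n)]

# Example: S -> AB | BA, A -> a, B -> b generates exactly {ab, ba}.
unit = {("A", "a"), ("B", "b")}
binary = {("S", "A", "B"), ("S", "B", "A")}
assert cnf_generates(unit, binary, "S", "ab")
assert not cnf_generates(unit, binary, "S", "aa")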
