Proof Methods for Propositional Logic
Russell and Norvig Chapter 7
CS440 Fall 2015 1
Logical equivalence
- Two sentences are logically equivalent iff they are true in the same models: α ≡ β iff α ⊨ β and β ⊨ α
Validity and satisfiability
A sentence is valid (a tautology) if it is true in all models,
e.g., True, A ∨ ¬A, A ⇒ A, (A ∧ (A ⇒ B)) ⇒ B.
Validity is connected to inference via the Deduction Theorem:
KB ⊨ α if and only if (KB ⇒ α) is valid.
A sentence is satisfiable if it is true in some model, e.g., A ∨ B.
A sentence is unsatisfiable if it is false in all models, e.g., A ∧ ¬A.
Satisfiability is connected to inference via the following:
KB ⊨ α if and only if (KB ∧ ¬α) is unsatisfiable (known as proof by contradiction).
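Both connections can be checked mechanically by enumerating models. A minimal sketch (function and variable names are my own; sentences are represented as Python predicates over a model dictionary):

```python
from itertools import product

def entails(kb, alpha, symbols):
    """KB |= alpha iff alpha is true in every model of KB
    (equivalently, KB AND NOT alpha is unsatisfiable)."""
    for values in product([True, False], repeat=len(symbols)):
        model = dict(zip(symbols, values))
        if kb(model) and not alpha(model):
            return False  # a model of KB where alpha is false: no entailment
    return True

# KB = (A => B) AND A; query alpha = B
kb = lambda m: (not m["A"] or m["B"]) and m["A"]
alpha = lambda m: m["B"]
print(entails(kb, alpha, ["A", "B"]))  # True: modus ponens as entailment
```

This brute-force model checking is exponential in the number of symbols; the rest of the chapter develops inference rules that avoid enumerating all models.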
Inference rules
- Modus Ponens:
    A ⇒ B,   A
    ----------
         B
  Example: “raining implies soggy courts”, “raining”. Infer: “soggy courts”.
Example
- KB: {A ⇒ B, B ⇒ C, A}
- Is C entailed?
- Yes:

  Given      Rule          Inferred
  A ⇒ B, A   Modus Ponens  B
  B ⇒ C, B   Modus Ponens  C
Inference rules (cont.)
- Modus Tollens:
    A ⇒ B,   ¬B
    -----------
         ¬A
  Example: “raining implies soggy courts”, “courts not soggy”. Infer: “not raining”.
- And-elimination:
    A ∧ B
    -----
      A
Reminder: The Wumpus World
- Performance measure
  - gold: +1000, death: -1000
  - -1 per step, -10 for using the arrow
- Environment
  - Squares adjacent to wumpus: smelly
  - Squares adjacent to pit: breezy
  - Glitter iff gold is in the same square
  - Shooting kills wumpus if you are facing it
  - Shooting uses up the only arrow
  - Grabbing picks up gold if in same square
- Sensors: Stench, Breeze, Glitter, Bump
- Actuators: Left turn, Right turn, Forward, Grab, Release, Shoot
Inference in the wumpus world
Given:
1. ¬B1,1
2. B1,1 ⇔ (P1,2 ∨ P2,1)
Let’s make some inferences:
1. (B1,1 ⇒ (P1,2 ∨ P2,1)) ∧ ((P1,2 ∨ P2,1) ⇒ B1,1) (by definition of the biconditional)
2. (P1,2 ∨ P2,1) ⇒ B1,1 (and-elimination)
3. ¬B1,1 ⇒ ¬(P1,2 ∨ P2,1) (equivalence with contrapositive)
4. ¬(P1,2 ∨ P2,1) (modus ponens, with ¬B1,1)
5. ¬P1,2 ∧ ¬P2,1 (De Morgan’s rule)
6. ¬P1,2 (and-elimination)
7. ¬P2,1 (and-elimination)
More inference
- Recall that when we were at (2,1) we could not decide on a safe move, so we backtracked and explored (1,2), which yielded ¬B1,2. This yields ¬P2,2 ∧ ¬P1,3.
- Now we can consider the implications of B2,1.
[Figure: wumpus world exploration, panels (a) and (b). Legend: A = Agent, B = Breeze, G = Glitter/Gold, P = Pit, S = Stench, W = Wumpus, OK = safe square, V = visited, ? = possible, ! = confirmed.]
Resolution
1. ¬P2,2, ¬P1,1
2. B2,1 ⇔ (P1,1 ∨ P2,2 ∨ P3,1)
3. B2,1 ⇒ (P1,1 ∨ P2,2 ∨ P3,1) (biconditional elimination)
4. P1,1 ∨ P2,2 ∨ P3,1 (modus ponens, with B2,1 from the percept at (2,1))
5. P1,1 ∨ P3,1 (resolution rule, using ¬P2,2)
6. P3,1 (resolution rule, using ¬P1,1)
The resolution rule: if there is a pit in (1,1) or (3,1), and it’s not in (1,1), then it’s in (3,1).
    P1,1 ∨ P3,1,   ¬P1,1
    --------------------
           P3,1
Resolution
Unit resolution inference rule:
    l1 ∨ … ∨ lk,   m
    ------------------------------
    l1 ∨ … ∨ li-1 ∨ li+1 ∨ … ∨ lk
where li and m are complementary literals.
Full resolution:
    l1 ∨ … ∨ lk,   m1 ∨ … ∨ mn
    --------------------------------------------------------------
    l1 ∨ … ∨ li-1 ∨ li+1 ∨ … ∨ lk ∨ m1 ∨ … ∨ mj-1 ∨ mj+1 ∨ … ∨ mn
where li and mj are complementary literals.
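The full resolution rule is a one-liner over sets of literals. A minimal sketch (representation is my own: a clause is a frozenset of literal strings, with '~' marking negation):

```python
def resolve(ci, cj):
    """Return all resolvents of two clauses. A clause is a frozenset of
    literals; a literal is a string, negation marked by a leading '~'."""
    resolvents = set()
    for lit in ci:
        comp = lit[1:] if lit.startswith("~") else "~" + lit
        if comp in cj:  # lit and comp are complementary: drop both, merge rest
            resolvents.add(frozenset((ci - {lit}) | (cj - {comp})))
    return resolvents

# Unit resolution from the pit example: P1,1 v P3,1 with ~P1,1 yields P3,1
print(resolve(frozenset({"P11", "P31"}), frozenset({"~P11"})))
# -> {frozenset({'P31'})}
```

Resolving two unit clauses that are complementary yields the empty clause, frozenset(), which plays the role of a contradiction below.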
Resolution is sound
For simplicity, let’s consider clauses of length two:
    l1 ∨ l2,   ¬l2 ∨ l3
    -------------------
          l1 ∨ l3
To derive the soundness of resolution, consider the values l2 can take:
- If l2 is True, then since we know that ¬l2 ∨ l3 holds, it must be the case that l3 is True.
- If l2 is False, then since we know that l1 ∨ l2 holds, it must be the case that l1 is True.
Either way, the resolvent l1 ∨ l3 is True in every model of the premises.
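The case analysis above can also be verified exhaustively over all eight truth assignments (a quick sanity check, not part of the original slides):

```python
from itertools import product

def resolution_sound():
    # Every model of (l1 v l2) and (~l2 v l3) must satisfy the
    # resolvent l1 v l3.
    return all(l1 or l3
               for l1, l2, l3 in product([True, False], repeat=3)
               if (l1 or l2) and (not l2 or l3))

print(resolution_sound())  # True
```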
Resolution
- Properties of the resolution rule:
  - Sound
  - Complete
- Resolution can be applied only to disjunctions of literals. How can it be complete?
- It turns out any knowledge base can be expressed as a conjunction of disjunctions (conjunctive normal form, CNF).
- Example: (A ∨ ¬B) ∧ (B ∨ ¬C ∨ ¬D)
Conversion to CNF
B1,1 ⇔ (P1,2 ∨ P2,1)
1. Eliminate ⇔, replacing α ⇔ β with (α ⇒ β) ∧ (β ⇒ α):
   (B1,1 ⇒ (P1,2 ∨ P2,1)) ∧ ((P1,2 ∨ P2,1) ⇒ B1,1)
2. Eliminate ⇒, replacing α ⇒ β with ¬α ∨ β:
   (¬B1,1 ∨ P1,2 ∨ P2,1) ∧ (¬(P1,2 ∨ P2,1) ∨ B1,1)
3. Move ¬ inwards using De Morgan’s rules and double negation:
   (¬B1,1 ∨ P1,2 ∨ P2,1) ∧ ((¬P1,2 ∧ ¬P2,1) ∨ B1,1)
4. Apply the distributive law (∨ over ∧) and flatten:
   (¬B1,1 ∨ P1,2 ∨ P2,1) ∧ (¬P1,2 ∨ B1,1) ∧ (¬P2,1 ∨ B1,1)
Converting to CNF
- Every sentence can be converted to CNF:
  1. Replace α ⇔ β with (α ⇒ β) ∧ (β ⇒ α)
  2. Replace α ⇒ β with ¬α ∨ β
  3. Move ¬ “inward”:
     1. Replace ¬(¬α) with α
     2. Replace ¬(α ∧ β) with ¬α ∨ ¬β
     3. Replace ¬(α ∨ β) with ¬α ∧ ¬β
  4. Replace α ∨ (β ∧ γ) with (α ∨ β) ∧ (α ∨ γ)
Converting to CNF (II)
- While converting expressions, note that
  - ((α ∨ β) ∨ γ) is equivalent to (α ∨ β ∨ γ)
  - ((α ∧ β) ∧ γ) is equivalent to (α ∧ β ∧ γ)
- Why does this algorithm work?
  - Because ⇒ and ⇔ are eliminated
  - Because ¬ ends up attached directly to literals
  - Because what is left must be ∧’s and ∨’s, and ∨ can be distributed over ∧ to produce CNF clauses
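The four conversion steps translate almost directly into code. A minimal sketch (representation is my own: a formula is an atom string or a tuple like ('iff', p, q), ('imp', p, q), ('not', p), ('and', p, q), ('or', p, q)):

```python
def eliminate(f):
    """Steps 1-2: rewrite <=> and => in terms of and/or/not."""
    if isinstance(f, str):
        return f
    op, *args = f
    args = [eliminate(a) for a in args]
    if op == 'iff':
        p, q = args
        return ('and', ('or', ('not', p), q), ('or', ('not', q), p))
    if op == 'imp':
        p, q = args
        return ('or', ('not', p), q)
    return (op, *args)

def push_not(f, negate=False):
    """Step 3: move negations inward (De Morgan, double negation)."""
    if isinstance(f, str):
        return ('not', f) if negate else f
    op, *args = f
    if op == 'not':
        return push_not(args[0], not negate)
    if negate:  # De Morgan: flip and/or, negate both children
        op = 'and' if op == 'or' else 'or'
    return (op, push_not(args[0], negate), push_not(args[1], negate))

def distribute(f):
    """Step 4: distribute or over and."""
    if isinstance(f, str) or f[0] == 'not':
        return f
    op, p, q = f
    p, q = distribute(p), distribute(q)
    if op == 'or':
        if isinstance(p, tuple) and p[0] == 'and':
            return ('and', distribute(('or', p[1], q)), distribute(('or', p[2], q)))
        if isinstance(q, tuple) and q[0] == 'and':
            return ('and', distribute(('or', p, q[1])), distribute(('or', p, q[2])))
    return (op, p, q)

def to_cnf(f):
    return distribute(push_not(eliminate(f)))

# The slide's example: B1,1 <=> (P1,2 v P2,1)
print(to_cnf(('iff', 'B11', ('or', 'P12', 'P21'))))
```

Running this on the example reproduces the three clauses of the previous slide, modulo nesting of binary connectives.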
Using resolution
- Even if our KB entails a sentence α, resolution is not guaranteed to produce α.
- To get around this we use proof by contradiction, i.e., show that KB ∧ ¬α is unsatisfiable.
- Resolution is complete with respect to proof by contradiction.
Example of proof by resolution
KB = (B1,1 ⇔ (P1,2 ∨ P2,1)) ∧ ¬B1,1, in CNF:
(¬B1,1 ∨ P1,2 ∨ P2,1) ∧ (¬P1,2 ∨ B1,1) ∧ (¬P2,1 ∨ B1,1)
α = ¬P1,2, so we add ¬α = P1,2 to the clauses and resolve.
Resolution yields the empty clause, so KB ⊨ α. The empty clause is False (a disjunction is True only if at least one of its disjuncts is true).
Automated Theorem Proving
- How do we automate the inference process?
  - Step 1: assume the negation of the consequent and add it to the knowledge base
  - Step 2: convert the KB to CNF, i.e., a collection of disjunctive clauses
  - Step 3: repeatedly apply resolution until:
    - it produces an empty clause (contradiction), in which case the consequent is proven, or
    - no more clauses can be resolved, in which case the consequent cannot be proven
Resolution Algorithm
function PL-RESOLUTION(KB, α) returns true or false
  inputs: KB, the knowledge base, a sentence in propositional logic
          α, the query, a sentence in propositional logic
  clauses ← the set of clauses in the CNF representation of KB ∧ ¬α
  new ← {}
  loop do
    for each pair of clauses Ci, Cj in clauses do
      resolvents ← PL-RESOLVE(Ci, Cj)
      if resolvents contains the empty clause then return true
      new ← new ∪ resolvents
    if new ⊆ clauses then return false
    clauses ← clauses ∪ new
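The pseudocode above maps naturally onto Python sets. A minimal sketch (representation is my own: clauses are frozensets of literal strings, '~' marks negation; the caller supplies the CNF of KB ∧ ¬α):

```python
from itertools import combinations

def pl_resolve(ci, cj):
    """All resolvents of two clauses."""
    out = set()
    for lit in ci:
        comp = lit[1:] if lit.startswith('~') else '~' + lit
        if comp in cj:
            out.add(frozenset((ci - {lit}) | (cj - {comp})))
    return out

def pl_resolution(clauses):
    """clauses: CNF clauses of KB AND NOT alpha. Returns True iff the
    empty clause is derivable, i.e. KB |= alpha."""
    clauses = set(clauses)
    while True:
        new = set()
        for ci, cj in combinations(clauses, 2):
            resolvents = pl_resolve(ci, cj)
            if frozenset() in resolvents:
                return True          # contradiction: entailment proven
            new |= resolvents
        if new <= clauses:
            return False             # nothing new: cannot be proven
        clauses |= new

# The slide's example: KB = (B11 <=> (P12 v P21)) AND ~B11 in CNF,
# plus the negated query P12 (alpha = ~P12):
kb_and_not_alpha = [frozenset({'~B11', 'P12', 'P21'}),
                    frozenset({'~P12', 'B11'}),
                    frozenset({'~P21', 'B11'}),
                    frozenset({'~B11'}),
                    frozenset({'P12'})]
print(pl_resolution(kb_and_not_alpha))  # True: KB entails ~P12
```

This unoptimized version rechecks old pairs on every pass; real provers track only pairs involving a new clause and discard tautologies.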
Another example
- If it rains, I get wet.
- If I’m wet, I get mad.
- Given that I’m not mad, prove that it’s not raining.
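One possible refutation, worked step by step (symbol names R, W, M for raining/wet/mad and the '~'-marks-negation convention are my own shorthand):

```python
# CNF of the KB plus the negated goal:
#   rain => wet   ~>  {~R, W}
#   wet => mad    ~>  {~W, M}
#   not mad       ~>  {~M}
#   negated goal ~(~R)  ~>  {R}
c1, c2, c3, c4 = ({'~R', 'W'}, {'~W', 'M'}, {'~M'}, {'R'})

# Resolve until the empty clause appears:
c5 = (c1 - {'~R'}) | (c4 - {'R'})   # resolve c1, c4 on R  -> {W}
c6 = (c2 - {'~W'}) | (c5 - {'W'})   # resolve c2, c5 on W  -> {M}
c7 = (c3 - {'~M'}) | (c6 - {'M'})   # resolve c3, c6 on M  -> set(), empty
print(c7 == set())  # True: contradiction, so the KB entails ~R
```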
Inference for Horn clauses
Horn Form (special form of CNF):
KB = conjunction of Horn clauses
Horn clause = disjunction of literals of which at most one is positive
Example: C ∨ ¬B ∨ ¬A
Modus Ponens is a natural way to make inferences in Horn KBs (recall a ⇒ b is equivalent to ¬a ∨ b):
    α1, …, αn,   α1 ∧ … ∧ αn ⇒ β
    -----------------------------
                 β
- Successive application of modus ponens leads to algorithms that are sound and complete, and run in linear time
Forward chaining
- Idea: fire any rule whose premises are satisfied in the KB
  - add its conclusion to the KB, until the query is found
And-or graph
Forward chaining example
[Figure sequence: and-or graph for the forward chaining example, followed by step-by-step snapshots of forward chaining on it.]
Forward chaining algorithm
Forward chaining is sound and complete for Horn KBs.

function PL-FC-ENTAILS?(KB, q) returns true or false
  inputs: KB, the knowledge base, a set of propositional definite clauses
          q, the query, a propositional symbol
  count ← a table, where count[c] is the number of symbols in c’s premise
  inferred ← a table, where inferred[s] is initially false for all symbols
  agenda ← a queue of symbols, initially symbols known to be true in KB
  while agenda is not empty do
    p ← POP(agenda)
    if p = q then return true
    if inferred[p] = false then
      inferred[p] ← true
      for each clause c in KB where p is in c.PREMISE do
        decrement count[c]
        if count[c] = 0 then add c.CONCLUSION to agenda
  return false
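A minimal Python sketch of this algorithm (representation is my own: each definite clause is a (premises, conclusion) pair, and known facts are passed separately):

```python
from collections import deque

def pl_fc_entails(clauses, facts, q):
    """Forward chaining over definite clauses.
    clauses: list of (premise_set, conclusion); facts: symbols known true."""
    count = {i: len(prem) for i, (prem, _) in enumerate(clauses)}
    inferred = set()
    agenda = deque(facts)
    while agenda:
        p = agenda.popleft()
        if p == q:
            return True
        if p not in inferred:
            inferred.add(p)
            for i, (prem, concl) in enumerate(clauses):
                if p in prem:
                    count[i] -= 1            # one fewer premise outstanding
                    if count[i] == 0:        # all premises satisfied: fire
                        agenda.append(concl)
    return False

# A => B, B => C, with A known: is C entailed?
rules = [({'A'}, 'B'), ({'B'}, 'C')]
print(pl_fc_entails(rules, ['A'], 'C'))  # True
```

Each symbol is processed at most once and each clause's counter is decremented at most once per premise symbol, which is where the linear-time bound comes from.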
Backward chaining
aka goal-directed reasoning
Idea: work backwards from the query q:
- check if q is known already, or
- prove by backward chaining all premises of some rule concluding q
Avoid loops: check if a new subgoal is already on the goal stack.
Avoid repeated work: check if a new subgoal has already been proved true, or has already failed.
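A minimal recursive sketch of this idea (representation matches the forward chaining sketch: (premises, conclusion) pairs; the loop check from the slide is the `stack` parameter, while memoization of proved/failed subgoals is omitted for brevity):

```python
def bc_entails(rules, facts, q, stack=frozenset()):
    """Backward chaining over definite clauses.
    rules: list of (premise_set, conclusion); facts: set of true symbols."""
    if q in facts:
        return True
    if q in stack:            # q is already a pending subgoal: avoid the loop
        return False
    for prem, concl in rules:
        if concl == q and all(bc_entails(rules, facts, p, stack | {q})
                              for p in prem):
            return True       # some rule concluding q has all premises proven
    return False

rules = [({'A'}, 'B'), ({'B'}, 'C')]
print(bc_entails(rules, {'A'}, 'C'))  # True
```

Unlike forward chaining, only rules whose conclusion matches the current (sub)goal are ever examined.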
Backward chaining example
[Figure sequence: step-by-step snapshots of backward chaining on the and-or graph.]
Forward vs. backward chaining
- FC is data-driven
  - May do lots of work that is irrelevant to the goal
- BC is goal-driven, appropriate for problem-solving
  - e.g., What courses do I need to take to graduate? How do I get into a PhD program?
- Complexity of BC can be much less than linear in the size of the KB
Efficient propositional inference
Two families of efficient algorithms for satisfiability:
- Complete backtracking search algorithms:
  - DPLL algorithm (Davis, Putnam, Logemann, Loveland)
- Local search algorithms:
  - WalkSAT algorithm:
    - Start with a random assignment
    - At each iteration pick an unsatisfied clause and pick a symbol in the clause to flip; alternate between:
      - picking the symbol that minimizes the number of unsatisfied clauses
      - picking a random symbol
Is WalkSAT sound? Complete?
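A minimal sketch of WalkSAT (representation is my own: a clause is a list of (symbol, is_positive) pairs). It is sound for satisfiability, since a model is returned only after checking every clause, but incomplete: returning None proves nothing about unsatisfiability.

```python
import random

def walksat(clauses, p=0.5, max_flips=10_000):
    """Local search for a satisfying assignment; None on giving up."""
    symbols = {s for c in clauses for s, _ in c}
    model = {s: random.choice([True, False]) for s in symbols}
    satisfied = lambda c: any(model[s] == pos for s, pos in c)
    for _ in range(max_flips):
        unsat = [c for c in clauses if not satisfied(c)]
        if not unsat:
            return model                       # all clauses satisfied
        clause = random.choice(unsat)
        if random.random() < p:                # random-walk step
            sym = random.choice(clause)[0]
        else:                                  # greedy step: min unsat clauses
            def broken(s):
                model[s] = not model[s]
                n = sum(not satisfied(c) for c in clauses)
                model[s] = not model[s]
                return n
            sym = min((s for s, _ in clause), key=broken)
        model[sym] = not model[sym]
    return None

# (A v B) and (~A v B): any satisfying model must set B = True
m = walksat([[('A', True), ('B', True)], [('A', False), ('B', True)]])
print(m)
```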
In the wumpus world
A wumpus-world agent using propositional logic:
¬P1,1
¬W1,1
Bx,y ⇔ (Px,y+1 ∨ Px,y-1 ∨ Px+1,y ∨ Px-1,y)
Sx,y ⇔ (Wx,y+1 ∨ Wx,y-1 ∨ Wx+1,y ∨ Wx-1,y)
W1,1 ∨ W1,2 ∨ … ∨ W4,4
¬W1,1 ∨ ¬W1,2
¬W1,1 ∨ ¬W1,3
…
⇒ 64 distinct proposition symbols, 155 sentences
Summary
- Logical agents apply inference to a knowledge base to derive new information and make decisions
- Basic concepts of logic:
  - syntax: formal structure of sentences
  - semantics: truth of sentences wrt models
  - entailment: truth of one sentence given another
  - inference: deriving sentences from other sentences
  - soundness: derivations produce only entailed sentences
  - completeness: derivations can produce all entailed sentences
- Resolution is complete for propositional logic
- Forward and backward chaining are linear-time and complete for Horn clauses
- Propositional logic lacks expressive power