CS 245 — Lecture 10

Completeness of natural deduction

Completeness is the complementary direction to soundness: if a statement is true, then it must be provable. In other words, if $\Sigma \models \psi$, then $\Sigma \vdash \psi$. Combined with soundness, this makes proofs a reliable method for establishing entailment: to show that an entailment holds, all I need to do is show you a proof for it, and completeness guarantees that such a proof exists whenever the entailment holds.

Theorem 10.1. (Completeness) If $\Sigma \models \psi$, then $\Sigma \vdash \psi$.

The proof of this can be broken up into three lemmas.

  1. If $\{\varphi_1, \varphi_2, \dots, \varphi_n\} \models \psi$, then $\emptyset \models (\varphi_1 \rightarrow (\varphi_2 \rightarrow ( \dots (\varphi_n \rightarrow \psi ) \dots )))$.
  2. If $\emptyset \models \alpha$, then $\emptyset \vdash \alpha$.
  3. If $\emptyset \vdash (\varphi_1 \rightarrow (\varphi_2 \rightarrow ( \dots (\varphi_n \rightarrow \psi ) \dots )))$, then $\{\varphi_1, \varphi_2, \dots, \varphi_n\} \vdash \psi$.

The first and third parts are not so bad. The heavy lifting comes with the second part, which, as you may have guessed, is where most of the work lies.

Lemma 10.2. If $\{\varphi_1, \varphi_2, \dots, \varphi_n\} \models \psi$, then $\emptyset \models (\varphi_1 \rightarrow (\varphi_2 \rightarrow ( \dots (\varphi_n \rightarrow \psi ) \dots )))$.

Proof (Sketch). Suppose that the entailment $\varphi_1, \varphi_2, \dots, \varphi_n \models \psi$ holds but the entailment $\emptyset \models (\varphi_1 \rightarrow (\varphi_2 \rightarrow ( \dots (\varphi_n \rightarrow \psi ) \dots )))$ doesn't. This means that the formula $(\varphi_1 \rightarrow (\varphi_2 \rightarrow ( \dots (\varphi_n \rightarrow \psi ) \dots )))$ must be false under some valuation $t$; that is, there is a valuation $t$ such that $(\varphi_1 \rightarrow (\varphi_2 \rightarrow ( \dots (\varphi_n \rightarrow \psi ) \dots )))^t = F$.

Let's consider the top-level implication. If our formula evaluates to $F$ under $t$, we must have $\varphi_1^t = T$ and $(\varphi_2 \rightarrow (\dots(\varphi_n \rightarrow \psi) \dots ))^t = F$. If we wanted to be fully formal about this process, we could show by induction on $n$ that unpacking the formula in this way ultimately gives us $\varphi_1^t = \varphi_2^t = \dots = \varphi_n^t = T$ and $\psi^t = F$.

Instead, we'll leave it at that and observe that this means the set $\{\varphi_1, \dots, \varphi_n\}$ is satisfied by $t$ but $\psi$ is not. This contradicts our assumption that the entailment $\{\varphi_1, \dots, \varphi_n\} \models \psi$ holds, and therefore our assumption that the entailment $\emptyset \models (\varphi_1 \rightarrow (\varphi_2 \rightarrow ( \dots (\varphi_n \rightarrow \psi ) \dots )))$ doesn't hold was incorrect. $$\tag*{$\Box$}$$
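To see the lemma in action, here is a quick brute-force check on a small, hypothetical instance ($\varphi_1 = p$, $\varphi_2 = (p \rightarrow q)$, $\psi = q$); this is an illustration of the statement, not part of the proof:

```python
from itertools import product

# Check that {p, (p -> q)} |= q holds and that the corresponding
# nested implication (p -> ((p -> q) -> q)) is a tautology.
def implies(a, b):
    return (not a) or b

entailment_holds = True  # does {phi1, phi2} |= psi hold?
tautology_holds = True   # does |= (phi1 -> (phi2 -> psi)) hold?
for p, q in product([True, False], repeat=2):
    phi1, phi2, psi = p, implies(p, q), q
    # The entailment fails if some valuation makes all premises true but psi false.
    if phi1 and phi2 and not psi:
        entailment_holds = False
    # The tautology fails if some valuation falsifies the nested implication.
    if not implies(phi1, implies(phi2, psi)):
        tautology_holds = False

print(entailment_holds, tautology_holds)  # → True True
```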

Lemma 10.3. If $\emptyset \vdash (\varphi_1 \rightarrow (\varphi_2 \rightarrow ( \dots (\varphi_n \rightarrow \psi ) \dots )))$, then $\{\varphi_1, \varphi_2, \dots, \varphi_n\} \vdash \psi$.

Proof (Sketch). You may already have an idea of what this involves, since I briefly mentioned it near the end of our discussion of the natural deduction proof rules. So, suppose that we do indeed have a proof witnessing $\emptyset \vdash (\varphi_1 \rightarrow (\varphi_2 \rightarrow ( \dots (\varphi_n \rightarrow \psi ) \dots )))$. We can take this proof and add each of $\varphi_1, \dots, \varphi_n$ to the beginning as premises. Then there is a line that concludes with the formula $(\varphi_1 \rightarrow (\varphi_2 \rightarrow ( \dots (\varphi_n \rightarrow \psi ) \dots )))$. After this line, we can apply $\rightarrow\!e$ to this formula repeatedly until we end up with $\psi$ as our conclusion. In this way, we can construct a proof for $\{\varphi_1, \dots, \varphi_n\} \vdash \psi$. Again, we can make this much more formal by setting up an induction proof. $$\tag*{$\Box$}$$
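To make the construction concrete, here is a sketch for $n = 2$; the line numbering and layout are hypothetical, and the rule citations follow the usual Fitch-style conventions:

```latex
\begin{array}{lll}
1. & \varphi_1 & \text{premise} \\
2. & \varphi_2 & \text{premise} \\
\vdots & \vdots & \text{(the given proof, unchanged)} \\
k. & (\varphi_1 \rightarrow (\varphi_2 \rightarrow \psi)) & \text{conclusion of the given proof} \\
k+1. & (\varphi_2 \rightarrow \psi) & \rightarrow\!e,\ k, 1 \\
k+2. & \psi & \rightarrow\!e,\ k+1, 2
\end{array}
```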

Lemma 10.4. If $\emptyset \models \alpha$, then $\emptyset \vdash \alpha$.

Proof. Suppose that $\alpha$ has $n$ propositional variables. Then there are $2^n$ different valuations of these variables, and since $\emptyset \models \alpha$, the formula $\alpha$ evaluates to true under every one of them. We will construct $2^n$ subproofs, one for each valuation.

Let $t$ be a valuation. Then we define a formula $\hat p_i$ for $1 \leq i \leq n$ by $$\hat p_i = \begin{cases} p_i & \text{if $p_i^t = T$,} \\ (\neg p_i) & \text{if $p_i^t = F$.} \end{cases}$$ This gives us $\hat p_i^t = T$ for all $1 \leq i \leq n$. We will now need to make use of another lemma.
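The construction of the literals $\hat p_i$ is mechanical. A minimal sketch in Python (the variable names and the string encoding of negation are assumptions for illustration):

```python
# Given a valuation (a dict from variable names to truth values),
# return the literal \hat{p}: the variable itself if it is true
# under the valuation, and its negation otherwise.
def hat(var, valuation):
    return var if valuation[var] else f"(~{var})"

t = {"p1": True, "p2": False, "p3": True}  # an example valuation
hats = [hat(v, t) for v in ["p1", "p2", "p3"]]
print(hats)  # → ['p1', '(~p2)', 'p3']
```

By construction, each literal in `hats` is true under `t`, matching the property $\hat p_i^t = T$.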

Lemma 10.5. Let $\eta$ be a formula with propositional variables $p_1, \dots, p_n$ and $t$ a valuation. If $\eta^t = T$, then $\{\hat p_1, \dots, \hat p_n\} \vdash \eta$ and if $\eta^t = F$, then $\{\hat p_1, \dots, \hat p_n\} \vdash (\neg \eta)$.

Proof of Lemma 10.5. We will show this by structural induction on $\eta$.

For our base case, $\eta = p$ for some propositional variable $p$. Then $\{p\} \vdash p$ and $\{(\neg p)\} \vdash (\neg p)$. Recall that these have single-line proofs.

For our inductive case, let $\beta$ and $\gamma$ be formulas and assume that we have $\{\hat q_1, \dots, \hat q_k \} \vdash \beta$ and $\{\hat r_1, \dots, \hat r_\ell\} \vdash \gamma$ (or the corresponding sequents with $(\neg \beta)$ and $(\neg \gamma)$, depending on their truth values under $t$). Observe that since $\beta$ and $\gamma$ are subformulas of $\eta$, we have $(\{\hat q_1, \dots, \hat q_k\} \cup \{\hat r_1, \dots, \hat r_\ell\}) \subseteq \{\hat p_1, \dots, \hat p_n\}$, so each of these proofs can also be carried out from the premises $\{\hat p_1, \dots, \hat p_n\}$.

Suppose that $\eta = (\neg \beta)$. If $\eta^t = T$, then $\beta^t = F$, and by the inductive hypothesis we have $\{\hat p_1, \dots, \hat p_n\} \vdash (\neg \beta)$, which is exactly $\eta$. If $\eta^t = F$, then we have $\beta^t = T$ and $\{\hat p_1, \dots, \hat p_n\} \vdash \beta$. Applying double negation introduction, we also have $\{\hat p_1, \dots, \hat p_n\} \vdash (\neg (\neg \beta))$ and therefore $\{\hat p_1, \dots, \hat p_n\} \vdash (\neg \eta)$.

Next, suppose that $\eta = (\beta \rightarrow \gamma)$. First, we consider $\eta^t = F$. In this case, $\beta^t = T$ and $\gamma^t = F$. Via the inductive hypothesis, we have $\{\hat q_1, \dots, \hat q_k\} \vdash \beta$ and $\{\hat r_1, \dots, \hat r_\ell\} \vdash (\neg \gamma)$. As we observed above, this gives us $\{\hat p_1, \dots, \hat p_n\} \vdash \beta$ and $\{\hat p_1, \dots, \hat p_n\} \vdash (\neg \gamma)$. Using these proofs, we construct a new proof with an additional application of $\wedge i$ to give us $\{\hat p_1, \dots, \hat p_n\} \vdash (\beta \wedge (\neg \gamma))$. From this, we simply need to show that $\{(\beta \wedge (\neg \gamma))\} \vdash (\neg (\beta \rightarrow \gamma))$ to complete the proof of this case.
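A sketch of that remaining sequent, $\{(\beta \wedge (\neg \gamma))\} \vdash (\neg (\beta \rightarrow \gamma))$; the rule names here follow common Fitch-style conventions and may differ slightly from the course's:

```latex
\begin{array}{lll}
1. & (\beta \wedge (\neg \gamma)) & \text{premise} \\
2. & \beta & \wedge e,\ 1 \\
3. & (\neg \gamma) & \wedge e,\ 1 \\
4. & \quad (\beta \rightarrow \gamma) & \text{assumption} \\
5. & \quad \gamma & \rightarrow\!e,\ 4, 2 \\
6. & \quad \bot & \neg e,\ 5, 3 \\
7. & (\neg (\beta \rightarrow \gamma)) & \neg i,\ 4\text{--}6
\end{array}
```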

Now, we must consider $\eta^t = T$. There are three subcases to consider.

  1. $\beta^t = T$ and $\gamma^t = T$,
  2. $\beta^t = F$ and $\gamma^t = T$,
  3. $\beta^t = F$ and $\gamma^t = F$.

By the same kind of reasoning, we can arrive at the following sequents.

  1. $\{\hat p_1, \dots, \hat p_n\} \vdash (\beta \wedge \gamma)$,
  2. $\{\hat p_1, \dots, \hat p_n\} \vdash ((\neg \beta) \wedge \gamma)$,
  3. $\{\hat p_1, \dots, \hat p_n\} \vdash ((\neg \beta) \wedge (\neg \gamma))$.

And for each of these, we must show that

  1. $\{(\beta \wedge \gamma)\} \vdash (\beta \rightarrow \gamma)$,
  2. $\{((\neg \beta) \wedge \gamma)\} \vdash (\beta \rightarrow \gamma)$,
  3. $\{((\neg \beta) \wedge (\neg \gamma))\} \vdash (\beta \rightarrow \gamma)$.
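As one representative, here is a sketch of the second sequent, $\{((\neg \beta) \wedge \gamma)\} \vdash (\beta \rightarrow \gamma)$ (rule names as before may vary by convention); the other two are similar:

```latex
\begin{array}{lll}
1. & ((\neg \beta) \wedge \gamma) & \text{premise} \\
2. & \gamma & \wedge e,\ 1 \\
3. & \quad \beta & \text{assumption} \\
4. & \quad \gamma & \text{copy},\ 2 \\
5. & (\beta \rightarrow \gamma) & \rightarrow\!i,\ 3\text{--}4
\end{array}
```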

After finishing the case for $(\beta \rightarrow \gamma)$, we must tackle the cases of $(\beta \wedge \gamma)$ and $(\beta \vee \gamma)$ to complete the proof. Again, the same kind of reasoning applies to all of these cases. $$\tag*{$\Box$}$$

Now, we return to the lemma we were originally trying to prove. Since $\alpha$ is a tautology, $\alpha$ will evaluate to true under all $2^n$ possible valuations. This means that we can use Lemma 10.5 to get a proof for $\{\hat p_1, \dots, \hat p_n\} \vdash \alpha$ for every possible choice of $\hat p_1, \dots, \hat p_n$.

So how do we get all of these literals $\hat p_1, \dots, \hat p_n$ as premises when our proof must start from the empty set? By repeatedly applying LEM. To see how this works, we first apply LEM to get $(p_1 \vee (\neg p_1))$. Then, since we want to apply $\vee e$ on this formula, we begin two subproofs: one assuming $p_1$ and one assuming $(\neg p_1)$. Inside both of these subproofs, we apply LEM to get $(p_2 \vee (\neg p_2))$, set up $\vee e$ on that formula, and open two more subproofs. We repeat this process until we have $2^n$ innermost subproofs, one for every combination of $\hat p_1, \dots, \hat p_n$.

Each of the $2^n$ innermost subproofs is contained in a series of $n$ nested subproofs, whose assumptions give some choice of $\hat p_1, \dots, \hat p_n$. We use this, together with Lemma 10.5, to construct a proof of $\alpha$ inside each innermost subproof. Then, having concluded with $\alpha$ in all of the innermost subproofs, we can finish applying $\vee e$ and propagate the conclusion $\alpha$ upward until all of the subproofs are closed.
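The shape of this LEM tree can be mirrored by a short recursion: each level corresponds to one application of LEM and $\vee e$, and the $2^n$ leaves are exactly the possible sets $\hat p_1, \dots, \hat p_n$. This is an illustration only, with a hypothetical string encoding of negation:

```python
# Enumerate the leaves of the LEM tree: at each level, branch on
# p_i versus (~p_i), exactly as LEM followed by ∨e opens two subproofs.
def lem_leaves(variables):
    if not variables:
        return [[]]  # one empty leaf: no more variables to split on
    first, rest = variables[0], variables[1:]
    leaves = []
    for literal in (first, f"(~{first})"):  # the two branches of LEM
        for tail in lem_leaves(rest):
            leaves.append([literal] + tail)
    return leaves

leaves = lem_leaves(["p1", "p2"])
print(len(leaves))  # → 4, i.e. 2^2 innermost subproofs
print(leaves)
```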

Though you will likely want to do everything in your power to avoid coming up with a proof like this, this construction does give us a valid proof for $\emptyset \vdash \alpha$. Therefore, we have shown that if $\emptyset \models \alpha$, then $\emptyset \vdash \alpha$. $$\tag*{$\Box$}$$

Proof of Theorem 10.1. Let $\Sigma = \{\varphi_1, \dots, \varphi_k\}$. Then since $\Sigma \models \psi$, by Lemma 10.2, we obtain the entailment $$\emptyset \models (\varphi_1 \rightarrow (\varphi_2 \rightarrow \dots (\varphi_k \rightarrow \psi) \dots )).$$ Then by Lemma 10.4, we obtain a proof for $$\emptyset \vdash (\varphi_1 \rightarrow (\varphi_2 \rightarrow \dots (\varphi_k \rightarrow \psi) \dots )).$$ From this, we obtain a proof of $\{\varphi_1, \dots, \varphi_k\} \vdash \psi$ by Lemma 10.3. Therefore, $\Sigma \vdash \psi$. $$\tag*{$\Box$}$$