## Thursday, 1 March 2012

### Is arithmetic a necessary condition for Gödel incompleteness?

This is really a question for Jeff, but I hope others will be interested as well. Here is the thing: next week NewAPPS will be hosting a symposium on a text by Paul Livingston which presents a comparative analysis of diagonalization in Gödel's incompleteness theorems, Graham Priest, and Derrida (yes indeed!). It is an interesting text, even though some of the more metaphysical claims seem a bit over-the-top to me (as I will argue in my contribution to the symposium). But anyway, I've been thinking about the whole Gödel thing again, and about how wide-ranging the conclusions drawn by Livingston really are, and this got me thinking about the conditions that a system must satisfy for the Gödel argument to go through. It is clear that containing arithmetic is a sufficient condition for the argument to go through, as arithmetic allows for the encoding which then gives rise to the Gödel sentence.

But my question now is: is containing arithmetic a necessary condition in a system for the Gödel argument to go through? It seems to me that the answer should be negative. In fact, I recall that many years ago I heard Haim Gaifman saying that any other suitable encoding technique would be enough for the argument to run; arithmetization is just a particularly convenient encoding method. So my first question is whether this is indeed correct, i.e. that containing arithmetic is not a necessary condition for a Gödel incompleteness argument to take off. The second question is whether there are interesting examples of systems that can be proved to be incomplete by a Gödel argument even though they do not ‘contain arithmetic’ (I have the feeling that even the concept of ‘containing arithmetic’ might need to be clarified).

Thoughts, anyone?

1. I confess to not really understanding the question; Gödel's incompleteness theorem is fundamentally about arithmetic. To quote Kleene's statement of the theorem:

"Any effectively generated theory capable of expressing elementary arithmetic cannot be both consistent and complete. In particular, for any consistent, effectively generated formal theory that proves certain basic arithmetic truths, there is an arithmetical statement that is true, but not provable in the theory" (Kleene, Mathematical Logic 1967, p. 250).

If you have a theory which is not capable of expressing basic arithmetic, then it doesn't make sense to speak of Gödel's results with respect to it, because the theorem doesn't apply.

1. The point is that you could run an analogous argument (encoding, diagonalization, etc.) for other theories/formal systems, so as to generalize Gödel's results, which were indeed originally intended to be about arithmetic specifically. Something like using the Henkin method to prove the completeness of different logics, but here using encoding and diagonalization to prove the incompleteness of a formal system.

2. As far as I remember, Gödel's proof is about coding and decoding, using arithmetic to ensure the uniqueness of the codes. But it is rare to be able to code 'all', as in 'all Chinese eat rice', unless 'all' is just a formal expression.

3. I believe Sara's comment above is entirely correct. Moreover, let me state the obvious. The coding language, which is arithmetic, is only half of the story; there is of course a coded language, which is also arithmetic. It will contain primitive recursive functions (in particular arithmetic ones) and a provability relation for that language. If you forget about this, the whole diagonalization makes no sense: it is generated precisely by using arithmetic to express formal properties of arithmetic (in particular about provability). This will hold for any formal language that is as expressive as arithmetic, i.e. that contains it. In other words, yes, it is a necessary condition.

4. Not sure if it's relevant to your question, but there's a very interesting paper called "Undecidability Without Arithmetization" (http://philpapers.org/rec/GRZUWA) by Andrzej Grzegorczyk that answers (positively) a similar-sounding question with respect to undecidability, i.e.: "Are there interesting examples of systems that can be proved to be undecidable by a Gödel argument even though they do not ‘contain arithmetic’?". In the paper he proves the undecidability of a finitely axiomatizable first-order theory of concatenation on texts, i.e., strings in a formal language, and hence the undecidability of first-order logic, in a metatheory that is itself based on a (richer) theory of concatenation that includes no arithmetic, and hence in a metatheory that makes no use of arithmetization. As decidability in logic is standardly defined via arithmetization and recursive functions, the trick is coming up with an appropriate (and extensionally equivalent) notion of decidability in such a metatheory. He argues (persuasively, it seems to me) that this is a more natural approach to the question of decidability in logic, as formulas in logical languages are (intuitively) texts.
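To make the contrast vivid, here is a toy sketch (purely illustrative, and not Grzegorczyk's actual construction) of the two styles of coding: arithmetization turns a text into a number, while a concatenation-based approach treats the text itself as the code, with concatenation as the basic operation:

```python
# Toy contrast between arithmetical coding and concatenation-based coding.
# Neither is a faithful rendering of Gödel's or Grzegorczyk's constructions;
# the base-256 scheme below is just an illustrative stand-in for Gödel numbering.

def goedel_number(s: str, base: int = 256) -> int:
    """Arithmetical coding: read a string as digits of a number in some base."""
    n = 0
    for ch in s:
        n = n * base + ord(ch)
    return n

def concat_code(a: str, b: str) -> str:
    """Concatenation coding: the 'code' of a text is the text itself,
    and the only structural operation needed is concatenation."""
    return a + b

# "AB" becomes the number 65*256 + 66 under the arithmetical coding,
# while under concatenation coding it is simply the string "AB".
```

The point of the sketch is that everything the arithmetical coding achieves via multiplication and addition, the string-based approach achieves directly on texts, which is why a theory of concatenation can play the metatheoretic role usually played by arithmetic.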

1. Chris, that's excellent! That's exactly the kind of thing I was looking for, so thanks so much for the reference. If anyone knows of other papers along these lines, I'd be very interested.

5. I'm sure you already think about it but your idea seems close to what Priest defends. For him, diagonalization and encoding are the sources of many paradoxes.

Perhaps you could find more technical papers from his bibliography (see e.g. In Contradiction and Beyond the Limits of Thought).

Mathieu

6. There's a nice, general statement of the Second Incompleteness Theorem in Visser's paper "Can we make the Second Incompleteness Theorem coordinate free?" using the notion of interpretability. It says:

No recursively enumerable theory U interprets Q + Con(U).

Interpretability might thus be seen as an elaboration of the notion of "containment" between theories.

7. It's nice of you to mention me, and I agree with you, Catarina (and with Gaifman too, I suppose). A theory of syntax with concatenation, such as Grzegorczyk's syntax theory TC that Chris mentions, is incomplete: on Visser's formulation, TC has some minimal strings (including the null string) and concatenation. So I think the essence of incompleteness of a theory is a certain kind of syntactical/arithmetic/computational interpretability. It will apply to certain theories of syntax and to certain theories of numbers, so long as they interpret, say, Q (arithmetic) or TC (syntax).

For the case of incompleteness of a formal system, diagonalization is syntactic. The diagonalization of an expression is the result of concatenating its quotation name with (or substituting its quotation name into some free position of) itself. If $\phi$ is a formula with $x$ free, its diagonalization is $\phi[x/\ulcorner \phi \urcorner]$. E.g., the diagonalization of $x = \underline{0}$ is $\ulcorner x = \underline{0} \urcorner = \underline{0}$. (Well, Tarski introduced a slightly different definition, which is a bit easier to work with: the diagonalization of $\phi$ is $\exists x(x = \ulcorner \phi \urcorner \wedge \phi)$. Amounts to the same thing.)
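As a toy sketch of the substitution version of diagonalization just described (not any standard implementation; Python's `repr` stands in for the quotation name $\ulcorner \cdot \urcorner$):

```python
def quote(s: str) -> str:
    """Quotation name of an expression: here, simply its string literal (repr)."""
    return repr(s)

def diagonalize(phi: str, var: str = "x") -> str:
    """Substitute the quotation name of phi for occurrences of var in phi.
    (A naive textual replace: it ignores bound variables, which is fine
    for this illustration but not for a real formal system.)"""
    return phi.replace(var, quote(phi))

# The diagonalization of "x = 0" is "'x = 0' = 0":
# the formula applied to its own quotation name.
diagonalize("x = 0")  # → "'x = 0' = 0"
```

The self-reference arises exactly as in the text: the output talks about the quotation name of the very formula it came from.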

Quine's version is something like:
"yields falsehood when prefixed by its own quotation" yields falsehood when prefixed by its own quotation.
In Quine's Mathematical Logic (1940), he emphasizes that this all applies to what he calls "protosyntax", but unfortunately he never really properly specified an axiom system. The last few years have seen some interesting work on this. Google for "Visser, syntax, TC".
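Quine's sentence above can be built mechanically: take the phrase, and prefix it with its own quotation. A minimal sketch (again with `repr` as a stand-in for quotation):

```python
# Construct Quine's self-referential sentence by prefixing a phrase
# with its own quotation name.
phrase = "yields falsehood when prefixed by its own quotation"
quines_sentence = repr(phrase) + " " + phrase

# quines_sentence now reads:
# 'yields falsehood when prefixed by its own quotation' yields falsehood
# when prefixed by its own quotation
```

Note that the construction uses only quotation and concatenation, which is precisely why it lives at the level of protosyntax rather than arithmetic.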

But there's a more general notion of diagonalization though, going back to Cantor's original diagonal argument; I think Priest sees this as lying behind the incompleteness theorems and semantic paradoxes.

The thing is that arithmetic, syntax and computation are all heavily intertwined. It's what they have in common which is at the heart of the phenomenon, and this can be clarified by thinking of interpretability (Visser again). And, of course, Gödel himself stated that the "true reason" for incompleteness is in fact the undefinability of truth, which he discovered prior to discovering the incompleteness results.

Jeff

8. Thanks everyone for the comments, and Jeff in particular. I knew you'd have the answer! :) This is really cool stuff: I like in particular your "It's what they have in common which is at the heart of the phenomenon", so ultimately my question is a quest for this je ne sais quoi that they have in common.

And obviously I should have thought of Albert Visser (well, that holds of pretty much any technical question of this nature: he's an oracle). I think I'll drop him a note and ask whether he would be interested in joining the discussion.

9. Re Jeff's remark about diagonalization, Gaifman has a nice little paper on his web site showing how Gödel's incompleteness proof "arises naturally from Cantor’s diagonalization method":

http://www.columbia.edu/~hg17/Diagonal-Cantor-Goedel-05.pdf

Also, expanding on the point about interpretability, Q is interpretable in TC. (I have only read this; have not studied the proof.)

1. I took a course on Gödel's theorem with Gaifman in 2003, when I was visiting NYC as a grad student. I realize now more and more that my take on Gödel's results is still very much shaped by Gaifman's, including the points he makes in this paper. But it was all sort of 'dormant', so it's good to brush it up a bit!

10. There is a paper by John Bell titled "Incompleteness in a General Setting" that may be helpful as well. I happen to have printed it off the other day, but haven't read it yet.

1. Thanks, didn't know about this paper. John Bell happens to have been Graham Priest's supervisor, so there might be some interesting biographical/conceptual connection here too, in the background of Graham's own thinking about incompleteness, diagonalization and paradox.

11. "It is clear that containing arithmetic is a sufficient condition for the argument to go through, as arithmetic allows for the encoding which then gives rise to the Gödel sentence."
Nitpicking here... Dan Willard's systems have arithmetic of a sort, yet do not fall under the incompleteness theorem (his "addition" and "multiplication" are not total functions). I personally feel that they help clarify where the boundary of "necessary" conditions lies.

12. There is also some older work by Ray Smullyan along the lines of Grzegorczyk's paper, showing that a theory of concatenation is essentially incomplete (I am working from memory, so I don't have a precise reference). And of course, any theory of hereditarily finite sets interprets Q and is therefore incomplete. In fact, Visser has shown that even a theory as weak as adjunction can be bootstrapped to something for which incompleteness can be shown (adjunction is the function $f(x, y) = x \cup \{y\}$).
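To see how little adjunction asks for, here is a toy sketch (illustrative only, not Visser's bootstrapping) of the operation itself and of building the first few von Neumann numerals with it:

```python
# Adjunction: f(x, y) = x ∪ {y}, modelled on hereditarily finite sets
# using Python frozensets (hashable, so sets can contain sets).

def adjoin(x: frozenset, y) -> frozenset:
    """Return x ∪ {y}: adjoin the single element y to the set x."""
    return x | frozenset([y])

empty = frozenset()

# The von Neumann numerals arise by repeated adjunction:
zero = empty                # ∅
one = adjoin(empty, zero)   # {∅}
two = adjoin(one, one)      # {∅, {∅}}
```

The sketch only shows that numeral-like structure is already available from adjunction and the empty set; the nontrivial part of Visser's result, of course, is getting enough coding out of so weak a theory.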

13. Here's another thought. One version of G1 (from Boolos & Jeffrey) is:

(G1) No recursively axiomatizable extension of Q is both consistent and complete.

If this is to apply to a theory $T$, it follows that $T$ must have no finite models. For suppose $\mathcal{M} \models T$ is finite. Let $T^{+} = Th(\mathcal{M})$. Then $T^{+}$ is a complete, consistent, recursively axiomatizable extension of $T$.

E.g., let $T$ be all validities. Then $T$ has a 1-element model, say $\mathcal{M}$. Then $Th(\mathcal{M})$ is complete, consistent and recursively axiomatizable (it is axiomatized by a single non-logical axiom: $\forall x \forall y(x = y)$).

The theories Q, TC and AST (Visser's adjunctive set theory, which Aldo mentions) have no finite models.

Jeff

14. "it is axiomatized by a single non-logical axiom: $\forall x \forall y(x = y)$."
Oops, too quick. That's not quite right. It's axiomatized by $\forall x \forall y(x = y)$, plus axioms like $\exists x Px$ or $\forall x \neg Px$, or $\exists x Pxx$ or $\forall x \neg Pxx$, etc., for each primitive predicate $P$, saying whether the extension of $P$ is empty or not.

15. One thing worth pointing out/discussing is that although one can prove analogues of the incompleteness theorems in theories of concatenation or tree structures or whatever, it isn't obvious that such theories (really: all "sufficiently strong" formal theories) aren't simply (or don't simply contain) "arithmetic in disguise" anyway. I'm pretty sure I'm getting this from my own conversations with Haim, as a matter of fact.

To put it even more sloppily, it seems that arithmetic, formal syntax, and computability are all intimately tied together, ontologically speaking. There seems to be a sense in which theories of all three are "about the same sorts of things".

So long as we're plugging Gaifman papers, I strongly recommend "On Ontology and Realism in Mathematics", which addresses some of these issues. It's posted on his website. I think it's the best paper I've read in the Philosophy of Mathematics in a very long time, but I'm a bit biased.

16. Nate, thanks for your comment, and the suggestion of Gaifman's article. I read it a while back and thought it was interesting.
"it seems that arithmetic, formal syntax, and computability are all intimately tied together, ontologically speaking."

Right - yes, I think that gets to the heart of it.
(Just up above I said,
"The thing is that arithmetic, syntax and computation are all heavily intertwined. It's what they have in common which is at the heart of the phenomenon, and this can be clarified by thinking of interpretability".)

1. Ah, of course. I read your comment fast and ended up saying something very similar. Ummm...I agree.

17. Panu Raatikainen, 19 March 2012 at 06:49

I'd like to emphasize that the theory does not have to have anything to do with arithmetic; it can deal (in its intended interpretation) with, e.g., grandmothers and other ancestors, if you like.
All that matters is that its language is formalized, and that elementary arithmetic, as an uninterpreted formal theory, can be relatively interpreted in it. Then the theory is incomplete and essentially undecidable.

1. Yes, definitely - interpretability is the key issue - and whatever it is that arithmetic, syntax and computation all have in common.
An example of the kind of thing you have in mind is the Field/Shapiro/Burgess debate in the early 80s concerning the conservativeness of adding mathematics to a physical geometric theory N which interprets arithmetic but doesn't prove (the coded formulation of) its own consistency, Con(N). Adding a set-theoretic comprehension principle leads to a proof of this geometric statement.

Jeff

18. Luis Estrada-González, 25 March 2012 at 18:42

I find Gaifman's treatment very similar to William Lawvere's 1969 category-theoretic approach ("Diagonal arguments and Cartesian closed categories", http://www.tac.mta.ca/tac/reprints/articles/15/tr15.pdf). The similarities can be made more visible by consulting Noson Yanofsky's 2003 paper in the BSL explaining Lawvere's work in non-categorical terms ("A universal approach to self-referential paradoxes", a preprint is available here: http://arxiv.org/pdf/math/0305282v1.pdf).

1. Many thanks for this, Luis. Really interesting.