[FOM] Axiomatic Syntax

Harvey Friedman friedman at math.ohio-state.edu
Tue Sep 2 09:15:13 EDT 2003

Reply to Avron. 

This should take care of your questions. Again, I still think that this was
clear already in the 50's. Did you ask Sol?

On 9/2/03 7:39 AM, "Arnon Avron" <aa at tau.ac.il> wrote:

> Here is my first, superficial reaction to Friedman's Posting. I say in advance
> that I need time to think about it, so maybe some questions and points
> I raised are total nonsense.

I would have preferred that you think about it some more. But since you made
the posting anyway, I will respond.

I trust that you will tell us whether they are in fact "total nonsense".
>> 1. Show that under any two reasonable codings of the syntax of PA, the
>> consistency statements are provably equivalent within PA.
>> 2. Show that under any two reasonable codings of the syntax of PA, the self
>> referential constructions due to Godel yield statements provably equivalent
>> within PA.
>> Now 2 follows from 1, since the Hilbert-Bernays and Jeroslow work directly
>> show that the self referential constructions in question, with respect to
>> any particular reasonable codings of syntax, give the consistency statement
>> with respect to that coding. So we have only to worry about 1.
> Definitely, as far as simple equivalence goes. I should warn however
> again that when it comes to a debate concerning the *content* of a
> sentence (like: whether a given sentence is self-referential), such
> equivalence is only a necessary condition. After all, all theorems of PA
> are equivalent in PA. Does this mean that they all have the same content?

There is no issue of content here, so your remark seems irrelevant.

>> The strategy in giving an appropriate set of conditions is to first, at the
>> conceptual level, identify the syntax of PA with an obvious many sorted
>> relational structure, in a finite relational type. The sorts correspond to
>> the relevant syntactic categories, the functions correspond to the basic
>> syntactic operations, and the relations correspond to the basic syntactic
>> relations. 
>> It is important that this structure be presented in a certain way. It must
>> be INDUCTIVELY PRESENTED. I do not have the time to describe, in general,
>> what this means. I would bet that an appropriate notion of this kind is
>> already in the literature. But in any case, it will be obvious that the
>> structure we are interested in, which is to be identified with the syntax of
>> PA, is in fact inductively presented. NOTE: I don't quite see that the well
>> known notion of finitely presented is enough for this.
> Although certainly sound in the right direction,

... and is a complete solution to what you ask for.

> this is still
> a little vague. What is the "*obvious*" many sorted relational structure
> (is there only one?), what is the "certain way" that this
> structure should be presented? Especially the last remark
> (with which I tend to agree) indicates that we might still not
> have a satisfactory solution!

I will make ONE MORE attempt NOT TO HAVE TO write this down in detail. It
looks like I am stuck writing SOME details. Sorts for variables, constants,
terms, atomic formulas, formulas, variable substitutions, propositional
formulas, propositional formula substitutions. Function symbols for next
variable, certain identity maps, term combinings, formula combinings,
formula decompositions, evaluation of substitutions, and the actions of
variable substitutions and propositional formula substitutions. Relation
symbols for substitutability, being a logical axiom, being a quantifier free
axiom of PA, being an induction axiom of PA, being a Hilbert style proof.

This may not be 100% accurate, as I am trying to avoid doing this.
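To make the shape of such an inductive presentation concrete, here is a sketch of my own (covering only a fragment of the sorts and operations listed above, with names I chose for illustration) that renders the syntax as freely generated datatypes:

```python
from dataclasses import dataclass
from typing import Union

# Sort of variables, generated from v0 by the "next variable" operation.
@dataclass(frozen=True)
class Var:
    index: int

def next_var(v: Var) -> Var:
    return Var(v.index + 1)

# Sort of terms: freely generated by 0, successor, and addition.
@dataclass(frozen=True)
class Zero:
    pass

@dataclass(frozen=True)
class Succ:
    arg: "Term"

@dataclass(frozen=True)
class Plus:
    left: "Term"
    right: "Term"

Term = Union[Var, Zero, Succ, Plus]

# Sort of formulas (fragment): atomic equations and universal quantification.
@dataclass(frozen=True)
class Eq:
    left: Term
    right: Term

@dataclass(frozen=True)
class Forall:
    var: Var
    body: "Formula"

Formula = Union[Eq, Forall]

# "Inductively presented" cashes out as: every basic syntactic function or
# relation is definable by structural recursion over the constructors.
def free_vars(f) -> frozenset:
    if isinstance(f, Var):
        return frozenset([f])
    if isinstance(f, Zero):
        return frozenset()
    if isinstance(f, Succ):
        return free_vars(f.arg)
    if isinstance(f, (Plus, Eq)):
        return free_vars(f.left) | free_vars(f.right)
    if isinstance(f, Forall):
        return free_vars(f.body) - frozenset([f.var])
    raise TypeError(f)

# Example: "forall v0. v0 + 0 = v0" is a closed formula.
v0 = Var(0)
phi = Forall(v0, Eq(Plus(v0, Zero()), v0))
```

The principle of induction for the structure is then just structural induction over these constructors, applied to whatever class of predicates one chooses to admit.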

>> In any case, the natural presentation of our structure is in fact an
>> inductive presentation.
>> It will be clear that our particular inductively presented structure
>> satisfies the obvious PRINCIPLE(S) OF INDUCTION - with respect to all
>> predicates. This is a second order condition, and MUCH TOO CRUDE for our
>> purposes. The principle of induction should be clear for any inductively
>> defined structure.
>> We can turn the inductive presentation of our structure into finitely many
>> BASIC UNIVERSAL SENTENCES true in the structure. These universal sentences
>> are very special - they correspond to an inductive presentation.
>> However, we can define what we mean by an ARITHMETIC INTERPRETATION of our
>> structure (or any inductively presented structure). The domains, relations,
>> and functions are all given by arithmetic formulas, and the structure must
>> obey those finitely many basic universal sentences.
> Don't you need to redefine here the concept of an arithmetic formula
> so that it will make sense for any inductively presented structure
> (independent of any coding)?

ABSOLUTELY NOT. Probably one has to be much more careful under some really
general notion of inductively presented structure - I have not thought about
this carefully. But the one used here causes no difficulties. One just means
an assignment of arithmetic formulas to every notion so that the obvious
universal conditions hold. In our case, the universal conditions are clear.
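As a toy illustration of such an assignment (my own, not from the posting, and covering only a term fragment): interpret each syntactic operation by an arithmetic function on numbers, say via Cantor pairing, with a tag keeping the constructors apart. The universal conditions then come out as identities about the pairing function:

```python
def pair(a: int, b: int) -> int:
    """Cantor pairing: an arithmetically definable bijection N x N -> N."""
    return (a + b) * (a + b + 1) // 2 + a

# One arithmetic interpretation of a term fragment: each syntactic
# operation becomes a function on codes; the first component tags the
# constructor, so distinct constructors have disjoint ranges.
def code_var(i: int) -> int:
    return pair(0, i)

def code_zero() -> int:
    return pair(1, 0)

def code_succ(t: int) -> int:
    return pair(2, t)

def code_plus(t: int, u: int) -> int:
    return pair(3, pair(t, u))
```

The corresponding universal sentences, e.g. that code_succ is injective and that no value of code_succ equals code_zero(), are elementary facts about pair of the kind provable in PA.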
>> It now makes sense to impose the PRINCIPLE OF ARITHMETIC INDUCTION on the
>> arithmetic interpretations. This is the principle of induction stated only
>> for all arithmetic formulas.
>> *FACT 2* Any two arithmetic interpretations of our structure that satisfy
>> these basic universal sentences AND THE PRINCIPLE OF ARITHMETIC INDUCTION
>> are ARITHMETICALLY ISOMORPHIC.
> I am a little bit confused. I apologize. What does "ARITHMETICALLY ISOMORPHIC"
> exactly mean?

Semantically, it means that there is an actual isomorphism between the two
arithmetic structures, that is in fact, an arithmetic bijection.

Proof theoretically, we will have an arithmetic definition of an isomorphism
which, in fact, can be proved in PA to be an isomorphism.
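Here is a toy computation of such an isomorphism (my own construction, meant only to illustrate the semantic reading): two codings of a tiny term fragment, and a map on numbers sending each term's code under the first coding to its code under the second:

```python
import math

def pair(a: int, b: int) -> int:
    # Cantor pairing and its inverse: both arithmetically definable.
    return (a + b) * (a + b + 1) // 2 + a

def unpair(n: int):
    w = (math.isqrt(8 * n + 1) - 1) // 2
    a = n - w * (w + 1) // 2
    return a, w - a

# Two codings of the same fragment (variables, 0, successor), differing
# only in how constructors are tagged. Terms are nested tuples such as
# ("succ", ("zero",)).
def code_a(t) -> int:
    if t[0] == "var":
        return pair(0, t[1])
    if t[0] == "zero":
        return pair(1, 0)
    if t[0] == "succ":
        return pair(2, code_a(t[1]))
    raise ValueError(t)

def code_b(t) -> int:
    if t[0] == "var":
        return pair(2, t[1])
    if t[0] == "zero":
        return pair(1, 1)
    if t[0] == "succ":
        return pair(0, code_b(t[1]))
    raise ValueError(t)

# The isomorphism: defined by recursion on A-codes, it sends the A-code
# of every term to the B-code of the same term. Being given by
# course-of-values recursion, it is arithmetically definable.
def iso(n: int) -> int:
    tag, payload = unpair(n)
    if tag == 0:
        return pair(2, payload)       # variable
    if tag == 1:
        return pair(1, 1)             # zero
    if tag == 2:
        return pair(0, iso(payload))  # successor
    raise ValueError(n)
```

On every term t, iso(code_a(t)) equals code_b(t); verifying the defining equations inside PA is what the proof-theoretic reading asks for.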
>> *FACT 3* There exists an arithmetic interpretation of our structure that
>> provably in PA satisfies the basic universal sentences AND THE PRINCIPLE OF
>> ARITHMETIC INDUCTION.
> I am even more confused now. In what sense is it "provably in PA"?
> Is (as I suspect) a natural coding needed to make this statement
> meaningful? And if so, are we not back to the question: what
> is a natural coding?

I mean what I said. There exists an arithmetic interpretation for which it
is provable in PA that ... This is needed in order to argue that:

there exists at least one adequate arithmetization of syntax.

This statement is entirely meaningful, independent of any notion of natural coding.
>> So we define an adequate arithmetization of syntax simply as an
>> arithmetically presented structure such that in PA, we can prove that the
>> basic universal sentences and the principle of arithmetic induction hold.
> I am not sure that I completely understand this definition at the moment.
> Does it supply a constructive criterion for adequate arithmetization?

Yes. That we have a system of arithmetic formulas for which certain
associated arithmetic formulas are provable in PA. The constructive
criterion is the data, which includes the proof in PA of the associated
arithmetic formulas.

> And might it not be too broad? For example: is the composition of one
> adequate encoding with a p.r. function which is (provably in PA)
> 1-1 and onto also adequate according to this definition? (Note that
> if this is the case then for *every* \phi and n we can find an
> adequate encoding such that the code of \phi(n) is exactly n!)

It cannot be too broad for our purposes, since it is a Theorem that the
consistency statements associated with any two adequate arithmetizations are
provably equivalent in PA. Furthermore, the equivalence proof in PA is
straightforward and also has a kind of uniformity that is probably of
independent interest beyond what you wanted.
>> There is a form of the self reference lemma that makes the self referential
>> nature of this business as clear as I have ever seen it made in print. I saw
>> this first in Jeroslow.
>> SELF REFERENCE LEMMA. Let phi(x) be a formula of PA with only the free
>> variable x. There exists a closed term t such that the godel number of the
>> sentence phi(t) IS the correct value of t.
>> It is compelling to say that:
>> The sentence phi(t) asserts that the property phi holds of the sentence
>> itself. 
> I don't find it compelling at all, and for two reasons. One is more
> or less the one already noted by Heck: the fact that you can use
> arbitrary (or even not so arbitrary) p.r. functions to construct such
> a sentence is hardly surprising, and it only means that the value
> of $t$ and the number which is the code of phi(t) are the same -
> but this equality of the term with the number is itself a
> contentful proposition (in which most ingredients of the
> "self-reference" have been built!).

No, it is not contentful. We are talking about closed terms of low level in
the Ackermann hierarchy, which can be computed in a presumably very feasible
amount of time. (There may be a detailed complexity issue here. I am not
sure.) The onus is on you to describe why this IS contentful.

Is this contentful?

25 x 15 = 375.


2 x 2 = 4.


1 x 2 = 2.
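The mechanism behind the lemma can even be run as a toy computation, with strings serving as their own Godel codes, so that the closed term t is just a quotation. This is my own sketch of the standard diagonal construction, not a claim about Jeroslow's exact formulation; "Provable" is a placeholder for an arbitrary phi:

```python
# Formulas are strings; "X" marks the sole free variable; a formula's
# code is the string itself (the simplest imaginable Godel numbering).
def quote(s: str) -> str:
    return '"' + s + '"'  # a closed term denoting the string s

def subst(formula: str, s: str) -> str:
    # Substitute the quotation of s for the free variable.
    return formula.replace("X", quote(s))

def diagonal(phi: str) -> str:
    """Produce a sentence g = phi(t) whose term t evaluates to g's own code."""
    d = phi.replace("X", "subst(X, X)")
    return subst(d, d)

phi = "Provable(X)"  # placeholder property of codes
d = phi.replace("X", "subst(X, X)")
g = diagonal(phi)
# The term inside g is subst("<d>", "<d>"); evaluating that very term,
# i.e. computing subst(d, d), returns g itself.
```

Evaluating the term occurring inside g reproduces g, which is the precise sense in which phi(t) holds of the sentence itself; and the evaluation is a mechanical computation, of the same character as 25 x 15 = 375.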

> Hence phi(t)
> cannot be said to assert a property of itself. The second reason
> is that this lemma is again code-dependent. Change the code, and the
> same sentence is not asserting anything about itself anymore,
> but another one does. The two might be equivalent in PA, but
> self-reference is not a property that is preserved by equivalence in PA
> (well, in my opinion it is not a property that a sentence of PA
> can possibly have, so I am myself talking on a rather semi-intuitive,
> vague level).

It is prima facie pointless to me to complain that the usual self reference
lemma in PA is not really self reference at all, by creating some special
standard for self reference.

How do you propose to explain how Godel actually thought of proving his 2nd
incompleteness theorem? You don't believe him when he says that he was
thinking about what self reference means in PA? And when he talks about the
liar paradox guiding him?

Why not simply accept that this is the appropriate form of self reference in
PA? You don't like the way Godel talks about his own business?

When I talk about this, I refer to DIRECT self reference, in deference to
the well established usual self reference in PA, in the sense of Godel.

I also think that you are not taking into account what I said about normal
forms in my response to Heck.

In any case, it is true that I implicitly take this issue of DIRECT self
reference somewhat seriously as a challenge. That is reflected in my writing
several times about the possibility of doing DIRECT self reference in a
suitable extension of PA.

HOWEVER: as I always emphasize, this has nothing to do with any flaw or
drawback in the matters surrounding our understanding of Godel's
incompleteness theorems.

BUT: There might be a flaw in our understanding IF there were genuine
difficulties in establishing the equivalence over PA of the consistency
statements that you get out of different arithmetizations. Since there is no
problem in establishing that, we see that Godel, once again, seems to
transcend all proposed objections.

Harvey Friedman 
