[FOM] Concerning definition of formulas

Vladimir Sazonov Vladimir.Sazonov at liverpool.ac.uk
Sun Oct 7 20:27:16 EDT 2007


To the Moderator: One more attempt, using the neutral mailing system Pine.


Vaughan Pratt suggested an approach to defining first-order
logic (FOL) formulas (in prenex normal form) which assumes
only naive set theory and "no recursion or induction, either
explicit, or implicit".

Yes, this definition is based on very simple ideas of (finite)
set theory and is quite interesting in itself. But an implicit use
of induction/recursion is nevertheless necessary when defining an
appropriate FOL calculus (proof theory) and a semantics for such
formulas, and especially when reasoning about these concepts. All
of this should be taken into account, because Saurav's original
question was essentially about circularity:

    to define FOL and any kind of finiteness and recursion concepts
    related to it we need set theory (or something like it)
    which, in turn, depends on FOL, finiteness, recursion, the
    concept of truth in FOL (as other FOMers suggested), etc.

Therefore, it seems to me that such an approach does not answer this
question. I believe (and something similar was already suggested, e.g.
by Hazen and Avron) that the right answer does not lie in giving one or
another mathematical definition of FOL. The question, and therefore the
answer, are of a methodological nature. The problem is not 'how to
define' but rather 'what are the role, meaning and proper place' of the
definitions, concepts and ideas we use. We can avoid circularity only
by realizing clearly the difference and interrelation between the
formal and intuitive (highly inseparable!) sides of mathematics.

The following is essentially based on my posting to FOM from Thu 9/27/2007
(with some further editing and an addition on Tarski's truth definition,
etc.), which was rejected by the Moderator for technical reasons (something
related to my University e-mail system, which I have hopefully corrected
now).

---

Dear Saurav,

Just for the sake of the argument, imagine you are a typical student
of computer science who knows essentially nothing serious about the
induction axiom or about (axiomatic or intuitive) set theory, but who
is able to understand what a syntactically correct program in a
programming language is -- just from examples and practical experience.
(Anyway, the compiler will correct you if you make a formal syntactic
mistake.) That is, with a minimum of the abstract (mathematical) style
of thought, most of us are able to understand at least sufficiently
simple formal syntax. We do not even need to realize in any serious way
what (potential) infinity is (or what the set of all formal expressions
of some kind is), because any real (physical) computational process
will halt anyway (the computer will either halt normally, hang up, or
be switched off). We have and understand in some way only a small
number of formal examples, and we have some imagination and
expectations (probably initially wrong) of what would happen in the
case of further similar examples.

Take this primitive knowledge/understanding (requiring no lofty
matters such as set theory or the like) as the basis. Then work with
various formal systems (naively, in the style described above)
for Formal Arithmetic or ZFC, etc. Deduce theorems, etc.
Of course, also use your (i.e. the imaginary student's) intuition
- whether this intuition is strong or weak does not matter - about
numbers, sets, and any abstract mathematical objects implicit in these
formal derivations. (Just extrapolate from your intuition about the
real-world meaning of 'finite', 'discrete', 'forall', 'one', 'two',
'three', ... etc. to the imaginary world of mathematical objects.)
During these formal/informal exercises your initial naive intuition
may change (becoming more and more like the intuition of
professional mathematicians). For example, at some moment it will
become quite intuitively reasonable that "there are" continuous
nowhere differentiable functions, etc.
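
(A standard concrete example of such a function, for illustration:
the classical Weierstrass function

    W(x) = \sum_{n=0}^{\infty} a^n \cos(b^n \pi x),
    with 0 < a < 1, b an odd integer, and ab > 1 + 3\pi/2,

is continuous everywhere but differentiable nowhere.)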

Then what is the problem? You need a basis? I think the above naive
view is what you need, and it is completely sufficient. Otherwise you
will hardly avoid a vicious circle in your attempts to find solid
ground. What I wrote above is not an ABSOLUTELY solid ground, but it
is, no doubt, solid enough. (Something like our experience of running
a deterministic program on a computer many times - it will always give
the same result! Say, run a big LaTeX file to get a PostScript file. It
will always have exactly the same size in bytes. Practically,
HIGHLY RELIABLE!) And I do not think that we will be able to find
anything more solid and reliable in our physical and (primitive)
mental world to serve as a formal basis of mathematical thought.

---------

Now about Tarski's definition of truth (what does M |= A mean?).
I consider this a formal (recursive) definition in model theory,
typically given within the framework of ZFC. In this sense
there is no vicious circle in this definition at all. Some
philosophers or philosophically inclined mathematicians consider
Tarski's definition as something given outside of ANY formal framework
such as ZFC. I believe that this is a wrong idea, inevitably involving
a vicious circle and a lot of poorly understandable quasi-argumentation
appealing to Platonism and other doubtful ideas. Along the lines
described above, we evidently have (or can easily develop) some kind
of "proto" or naive intuition about various mathematical and logical
concepts, say, about the meaning of quantifiers. (Everybody knows
at least a little bit about the meaning of the English - or any other
human language - phrase 'for all'.) Tarski's definition just
extrapolates this intuition quite naturally and smoothly to an
imagined (but formalized!) 'world' of ZFC in a formal way. Nor do I
see here any necessity of appealing to mysterious Meta- and
Meta-Meta-...-Tarski worlds (having no formalisation like ZFC),
as was mentioned in one posting. We have a primitive, initial,
naive, probably poor intuition about truth in FOL -- and that is
enough to start making formal derivations with some kind of
understanding of what happens (not just a purely formal game
with formulas).
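
To make this concrete, here is a minimal executable sketch of Tarski's
recursive clauses for a tiny first-order language (one binary relation
R) evaluated over a finite structure. It is only an illustration: the
constructor names Rel, Not, And, Forall and the function sat are ad
hoc, not taken from any particular text.

from dataclasses import dataclass

@dataclass
class Rel:                 # atomic formula  R(x, y)
    x: str
    y: str

@dataclass
class Not:                 # negation  ~phi
    sub: object

@dataclass
class And:                 # conjunction  phi & psi
    left: object
    right: object

@dataclass
class Forall:              # universal quantification  (forall v) phi
    var: str
    body: object

def sat(domain, R, assignment, phi):
    """Tarski's clauses: truth of phi in the structure (domain, R)
    under a variable assignment, by recursion on the build-up of phi."""
    if isinstance(phi, Rel):
        return (assignment[phi.x], assignment[phi.y]) in R
    if isinstance(phi, Not):
        return not sat(domain, R, assignment, phi.sub)
    if isinstance(phi, And):
        return (sat(domain, R, assignment, phi.left)
                and sat(domain, R, assignment, phi.right))
    if isinstance(phi, Forall):
        return all(sat(domain, R, {**assignment, phi.var: d}, phi.body)
                   for d in domain)
    raise ValueError("unknown formula constructor")

# Does ({0,1,2}, <) |= (forall x)(exists y) x < y ?  ('exists' is
# written here as not-forall-not.)
lt = {(a, b) for a in range(3) for b in range(3) if a < b}
phi = Forall('x', Not(Forall('y', Not(Rel('x', 'y')))))
print(sat(range(3), lt, {}, phi))    # False: nothing is above 2

The point is only that these clauses form an ordinary recursive
definition, formalizable in ZFC (with 'finite structure' replaced by an
arbitrary set-sized one); nothing beyond the naive understanding of
'for all' described above is needed to read them.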

Formally defining, say in ZFC, what formal systems, Tarski
semantics, etc., are means doing some kind of metamathematics. But
formal systems defined in this formal way should not be confused
with our initial intuition described above. For example, Con_PA
does not exactly mean that 'Peano Arithmetic is consistent' --
it is incomparably stronger than what we could mean naively,
because we originally understood formal expressions (or programs
in a programming language, as discussed above) as something
feasible (potentially physically existing - some real examples).
Mathematical formal theories describe imaginary objects
(e.g. possibly non-feasible formal derivations, having no direct
counterpart in the real world of formal expressions). It is
typical for mathematics to deal with imaginary objects (such as
the square root of -1, etc.), even when we did not intend to
introduce them. In a sense they appear by themselves (even
unexpectedly, possibly in a tricky way), either because of the
chosen logical rules and mathematical axioms or because of a
'logic of development' of mathematical ideas and concepts.
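
(For concreteness, in one standard arithmetization Con_PA can be taken
to be the sentence

    (forall x) ~Prf_PA(x, #'0=1'),

where Prf_PA(x, y) is the primitive recursive predicate 'x codes a
PA-derivation of the formula with code y' and #'0=1' is the numeral
for the code of '0=1'. The variable x ranges over all 'imaginary'
natural numbers, including codes of derivations far too long to ever
be physically written down, which is why Con_PA says more than the
naive claim that no real, feasible derivation of 0=1 will ever be
produced.)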

An important note: depending on the context, I understand 'formal'
either literally or in the sufficiently flexible sense in which the
word is used in mathematical practice -- not necessarily writing out
all formal symbols (as in practical computer programming). It is
potential (feasible) formalizability that is important in mathematics.
(A third way of understanding 'formal' is according to a
metamathematical theory, that is, 'imaginary formal'.)

I see no circularity in this 'formal/informal' picture of mathematics.
However, I realise that it is somewhat oversimplified, because people
studying or doing mathematics may know essentially nothing about the
formal systems we are talking about. Rather, they use something like
these formal systems (as a mixture of formal steps and intuition)
subconsciously, in the course of training in reading and writing
mathematical definitions and proofs. But after some training, people
eventually start writing mathematical texts which are potentially
formalizable, say, in ZFC (even if they still know essentially
nothing about formal systems -- training and the mathematical
community will help and correct them if something is wrong). The
historical aspect of mathematics should also be reflected upon. Was
mathematics in the time of Euclid or of Newton and Leibniz formal or
not? I assert that it was sufficiently close to being formal and had
a tendency in that direction. We know that the process of formalizing
Analysis took a rather long time, and that mathematicians worried
about rigour. Now we have quite developed machinery for formalizing
any reasonable mathematical idea to the highest standards almost
immediately. The situation now is radically different from that of
Newton and Leibniz's time. It is in this sense that mathematics has
found a good foundation. I mean the formal approach (to formalizing
any kind of intuition) in general - not just one formalism like ZFC.
Also, I do NOT mean here anything like a Platonistic world of
mathematics. We have various vague ideas. If we succeed in formalising
them, they are transformed by this process into genuinely mathematical
ideas (always depending on some formalization - maybe up to replacing
one formalization by an 'equivalent' or close one - and never existing
separately).


Vladimir Sazonov

