[FOM] Fwd: Formal Math Without Types?

Harvey Friedman hmflogic at gmail.com
Sat Feb 27 00:35:56 EST 2016


On Fri, Feb 26, 2016 at 6:36 AM, Lawrence Paulson <lp15 at cam.ac.uk> wrote:
>> In the meantime, as I indicated earlier, I would like to explore just
>> what problems arise when one wishes to work in a straightforwardly
>> sugared form of ZFC (or ZC) without any serious typing.
>> The first issue that I see is the overuse of the plus sign, +, across
>> all these number systems, and even abstract ones.
>>
>> Mathematicians use "where clauses". E.g., they would write
>>
>> (a + b) + c = a + (b + c), where a,b,c are real numbers and + is
>> addition on the real numbers.
>
> This example illustrates perfectly why we tend to prefer typed systems: the phrase “are real numbers” is taken care of by typing and then so is the disambiguation of the + symbol. In Isabelle/ZF, which is essentially typeless, these constraints are instead expressed explicitly using set membership, so that the associative law above becomes a conditional statement and the + operator becomes a partial function, undefined when applied outside its domain. And that is why in Isabelle/ZF there is a construction of the closed and unbounded class of fixedpoints of the Aleph operator, but not of the rational number ½.

**********************************
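To fix intuitions about the contrast Paulson describes, here is a rough
rendering (my notation, not Isabelle's; +_{\mathbb{R}} abbreviates
addition on the real numbers):

    typed:    \forall a\, b\, c : \mathbb{R}.\ (a + b) + c = a + (b + c)

    untyped:  \forall a\, b\, c\ ( a \in \mathbb{R} \wedge b \in \mathbb{R}
              \wedge c \in \mathbb{R} \rightarrow
              (a +_{\mathbb{R}} b) +_{\mathbb{R}} c
                  = a +_{\mathbb{R}} (b +_{\mathbb{R}} c) )

In the untyped reading, the membership hypotheses do the work that the
type annotation does in the typed one, and +_{\mathbb{R}} is a partial
function, undefined outside \mathbb{R} \times \mathbb{R}.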

Let me reemphasize that my particular interest here may be different
from other people's. It is:

1. Semantic Representation (i.e., without proofs) that exactly
corresponds to the way actual mathematics is written.
2. More ambitiously, Proof Writing that exactly corresponds to the way
actual mathematical proofs are written.

Now these ideals are obviously not literally entirely realizable, since

3. There is some non-uniformity in the way mathematics is written.
4. In particular, mathematics is written at various levels of less
than absolute rigor.

I have no doubt that there is a lot of merit in all of the main
working systems people are using, and for most of them there are
a great many person-hours invested. So there remains the
question of just what the point is of this individual person, me, thinking
about this in a very part-time way.

I am guessing that there is some point to this, namely to see what
happens when one tries to get a lot closer to ideals 1 and 2 than anything
that I normally see. I think I know what 1 and 2 mean, especially 1,
which is all that I have been talking about here recently. In fact, I
am persuaded that if one wishes to push 1 very seriously, then one
does not merely have a hopeless mess, but rather interesting logical
challenges of new kinds. At least, that is what my instincts tell me.

I also believe that the tolerance in the general mathematical
community for divergence from 1 goes down exponentially as a function
of that divergence. So questioning just about every aspect of how one
does Semantic Representation (and also proofs) may be well worth it.

To fix some notation, I will write ZFC for the kind of lightly sugared
ZFC I have been talking about, which, for the purposes of this
discussion, we can take to be part of normal mathematics. Key features
are:

i. Only variables over sets.
ii. Set abstraction as the only variable binding operator.
iii. Hierarchical definitions of constant, relation, and function
symbols. The extensions of the relation and function symbols are
classes (possibly proper) if they exist. E.g., V can be used in this
framework as a unary relation symbol (illustrated just below).
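As a small illustration of how iii might look (my formulation, and only
a sketch): a new function symbol F is introduced by first proving

    \forall x\ \exists! y\ \varphi(x, y)

for a formula \varphi built from previously introduced symbols, and
then adopting the defining axiom

    \forall x\ \forall y\ ( F(x) = y \leftrightarrow \varphi(x, y) ).

Similarly, V can be introduced as the unary relation symbol with
defining axiom \forall x\ ( V(x) \leftrightarrow x = x ), whose
extension is the proper class of all sets; and ii refers to terms of
the form \{ x \in A : \varphi(x) \}.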

The set of real numbers is a constant symbol R that is introduced
hierarchically, on the basis of earlier constant symbols N, Z, Q, and
some other constant, relation, and function symbols. So the real
number line, or being a real number, is handled in the most obvious
way within ZFC.
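For instance (one standard choice; nothing in this discussion depends
on which construction is used), \mathbb{R} could be introduced from
\mathbb{Q} and <_{\mathbb{Q}} by a set abstraction term for the
Dedekind cuts:

    \mathbb{R} = \{ x \in \mathcal{P}(\mathbb{Q}) :
        x \neq \emptyset \wedge x \neq \mathbb{Q}
        \wedge (\forall p \in x)(\forall q \in \mathbb{Q})
            (q <_{\mathbb{Q}} p \rightarrow q \in x)
        \wedge (\forall p \in x)(\exists q \in x)(p <_{\mathbb{Q}} q) \}

with \mathbb{Q} and <_{\mathbb{Q}} themselves introduced earlier in the
same hierarchical fashion.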

So far, for Semantic Representation, I don't see any need whatsoever
for going beyond ZFC.

However, there is the matter of mathematicians using + for addition on
N, Z, Q, R. The most faithful approach is to have four different
partial binary functions, all of which are called addition.
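Concretely (just as a sketch), each of these is a different set of
ordered pairs whose domain is the corresponding square, e.g.

    +_{\mathbb{N}} : \mathbb{N} \times \mathbb{N} \rightarrow \mathbb{N},
    \qquad \mathrm{dom}(+_{\mathbb{N}}) = \mathbb{N} \times \mathbb{N},

so that a term like a +_{\mathbb{N}} b is defined only when
a, b \in \mathbb{N}, and similarly for +_{\mathbb{Z}}, +_{\mathbb{Q}},
+_{\mathbb{R}}.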

The most obvious mathematical writing is simply going to ignore that
there are four different +'s here, and sort of act as if there is only
one. If that is what appears on the page, and we are talking about
traditional text made of pages, then the obvious attitude is going to
be that + is defined only at elements of N, and what is done on N is
valid, but what is done on Z, Q, R is not valid.

So one early question is just how we want to handle this very situation.

My approach is simply to ask what the mathematician would generally do
if you said to them: you know, this is not rigorous, because the
statements are true about N and false about Z, Q, R. I ask the
mathematician to fix this.

The most common approach, if it is done at all, is, I think, as follows.

a. Write +N, +Z, +Q, +R. So when addition is successively introduced,
it is introduced as +N, +Z, +Q, +R.
b. Then in theorems, just write +, ignoring the issue, but add a where
clause. E.g., a + b = b + a, where + is +N, +Z, +Q, +R (spelled out
just below).
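Spelling this out (one natural official reading), the displayed
sentence abbreviates four relativized statements:

    (\forall a, b \in \mathbb{N})\ (a +_{\mathbb{N}} b = b +_{\mathbb{N}} a)
    (\forall a, b \in \mathbb{Z})\ (a +_{\mathbb{Z}} b = b +_{\mathbb{Z}} a)
    (\forall a, b \in \mathbb{Q})\ (a +_{\mathbb{Q}} b = b +_{\mathbb{Q}} a)
    (\forall a, b \in \mathbb{R})\ (a +_{\mathbb{R}} b = b +_{\mathbb{R}} a)

with the where clause telling the reader which expansions are intended.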

I am sure that you can construe this as some sort of very special case
of a much more general and elaborate approach, which may go well beyond
what is universally familiar to mathematicians. But, right or wrong, I am
interested in sticking as close as possible to actual math writing,
with the expectation that I am talking about rather limited things
that are still completely friendly to mathematicians.

Of course, a more complicated case is where there is a parameter
involved referring to the number system. E.g., there is addition on
N^k. Even before this, there is the issue of how to handle N^k, which
is a term with the constant N and the variable k. One would ideally
want to write N^k and not something like FCN(k,N) = the set of
functions from k into N. Depending upon how close we want to adhere to
the ideal, we would accept, at least for the moment, FCN(k,N). So we
would write

+ on FCN(k,N)

and use +, with the where clause "where + is + on FCN(k,N)".
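A sketch of one way this could be made official: + on FCN(k,N) is
pointwise addition, with k a free set variable,

    (f + g)(i) = f(i) +_{\mathbb{N}} g(i)
    \quad \text{for } f, g \in \mathrm{FCN}(k, \mathbb{N}),\ i \in k,

so that "f + g = g + f, where + is + on FCN(k,N)" expands to

    \forall k\ (\forall f, g \in \mathrm{FCN}(k, \mathbb{N}))\ (f + g = g + f),

with the + in the expansion understood as the parametrized addition
just defined.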

I find the whole business of where clauses and friendly disambiguation
nontrivial, syntactically and semantically. I think there are
systematic solutions that are really very close to the way mathematics
is written.

I don't like the idea of jumping to something elaborate (dependent
type theory, etc.), until and unless it becomes apparent that very
limited devices are not going to be sufficient.

>> I gather that in the MetaMath approach, there are variables over wffs.
>> In the relatively purist approach I am talking about, I do not allow
>> variables over wffs.
>
> Isabelle implements a logical framework in which there can be variables over any syntactic category, but they do not have the same status as variables in the embedded calculus (first-order logic in the case of ZF). This makes it possible to express the axiom scheme of separation, for example, as a single assertion, without extending the class of formulas that can be used for separation. Then entire developments can be schematic in such parameters. In particular, reasoning about classes can be done in a fairly natural way, but not allowing quantification over classes or even writing something like V = V.
>

For present purposes, variables over syntactic categories are prima
facie not appropriate, as in a typical Math Dept, nobody outside logic
will be comfortable using them in their capacity as a mathematician.
In ZFC, the axiom scheme of separation is the principal workhorse,
formulated through set abstraction, and expressibility as a single
assertion is not close to a sufficient reason to introduce variables
over syntactic categories. Of course, there is an important
foundational idea here for theory, namely axiomatization by
schemes, where one uses schematic letters. However, I don't see this
(yet) as important for the issues at hand.
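To be concrete about the workhorse (a sketch of the standard setup, not
a new proposal): separation enters as a scheme, one instance for each
formula \varphi of the language, packaged through set abstraction as

    \forall y\ ( y \in \{ x \in A : \varphi(x) \} \leftrightarrow
        y \in A \wedge \varphi(y) ),

where A and any parameters of \varphi are ordinary set variables, so no
variable over formulas occurs in any single instance.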

Harvey Friedman

