# [FOM] Computational Nonstandard Analysis

Harvey Friedman hmflogic at gmail.com
Tue Sep 1 14:41:15 EDT 2015

```
On Tue, Sep 1, 2015 at 10:21 AM,  <katzmik at macs.biu.ac.il> wrote:
...
> OK. In 1948 Edwin Hewitt published an article where he both introduced the
> term "hyper-real" (to describe the appropriate ideal), and constructed
> a field using a kind of ultrapower construction.
>
> In 1955 Jerzy Los proved "Los's theorem" to the effect that the
> ultrapower preserves all first-order properties (from which the
> transfer principle is an immediate corollary).
>
> In 1961 Abraham Robinson published his first article on non-standard
> analysis (NSA), eventually followed by the 1966 book.
>
> In the meantime, Wim Luxemburg popularized the ultrapower approach in a
> simpler setting than Hewitt (i.e., indexing over the natural numbers).
>
> In 1977 Edward Nelson provided an axiomatisation of Robinson's
> framework called Internal Set Theory (IST) which can be thought of as
> a syntactic approach to NSA.  Karel Hrbacek independently developed a
> syntactic axiomatisation at about the same time.

In 1966 when I was an undergrad at MIT, I formulated the following
system PA*, and proved that it is a conservative extension of PA. The
language is 0,S,+,dot, and a predicate for "being standard".

1. The usual axioms for successor, +, dot.
2. Induction for all formulas in the extended language.
3. There exists a nonstandard integer.

I remember sitting in Gerald Sacks' office at MIT and telling him
about it. Sacks was interested, and spoke to A. Robinson about it;
Sacks told me that A. Robinson was disappointed that it was a
conservative extension. All three of us were of course well aware that
this did not preclude that nonstandard arguments could be useful, even
though they can be systematically eliminated a priori.

Does this fit into the history?

CONJECTURE. All uses of logically oriented machinery and deep
pathology of the usual kind, for concrete statements, can not only be
eliminated a priori, but can be eliminated in a productive way, where
one gains new insights beyond the machinery/deep pathology proof.
Furthermore, it is always productive and interesting to make these
eliminations.

NOTE. For other kinds of machinery/pathology, in particular large
cardinals, the above Conjecture fails. Not only can't you eliminate in
an interesting way, you can't eliminate at all! In this one kind of
application of machinery/pathology, elimination is intrinsically
impossible, and this is the only case of it.

>> Because Nelson was so involved in the "claim" that the exponential
>> function is not total on the natural numbers, I automatically assumed
>> that he must have an idiosyncratic setup for nonstandard analysis.
>> Especially, that he must be working with a construction that can be
>> done in a tiny fragment of arithmetic.
>
> No, Nelson's work related to NSA involves a conservative extension of ZFC.
> There are some finitist ideas at the basis of IST but Nelson's work on
> fragments of PA is not directly related to this.

So you can see how I got confused here. By the way, the nonstandard
arithmetic thingie of mine that I just mentioned could be combined
with ultrafinitist ideas, getting much finer conservative extension
results. I think it must be the case that, in that context at least,
you are always going to get conservative extensions, even if you go
to weak and extremely weak systems. Also, I don't know whether what I
did there has been looked at from the intuitionistic point of view.
>
>>> (4) Contrary to Friedman's claim, NSA has NOT been used "mostly for doing
>>> rather abstract analysis".
>>
>> What I meant was that for mathematical applications of note to
>> mathematicians, the target mathematics is rather abstract analysis.
>>
>> Now this may still be false or misleading?
>
> One of the recent results that the experts in the field like to talk about is
> the recent solution of the local version of Hilbert's 5th problem (on the
> characterisation of Lie groups) by Lou van den Dries and Isaac Goldbring.  Lie
> groups sound rather concrete.  This is a result that does not have an
> alternative proof in an Archimedean setting.

There is "nonstandard analysis" as a field in the hands of logicians,
and there is "nonstandard analysis" as a tool of occasional use by the
general mathematics community. I was of course talking about the
second, as I occasionally meet mathematicians who have some sort of
reasonable working knowledge of the basics. I wasn't aware that there
is enough work being done in NSA by logicians to warrant using the
term "experts in the field", but it looks like I may well be quite
wrong.
Incidentally, I believe that anything like the development you quoted
is a perfect opportunity to test my CONJECTURE above. E.g., I have an
idea about how to go further than Towsner on eliminating ultrafilters
in combinatorics in favor of more concrete, lower-level tools. Of
course, in that situation, a number of different low-level ways to
deal with, say, Hindman's theorem have evolved - but perhaps not for
a lot of other applications.
>
>> Of course, I view ZFC as, at this time, the most convenient,
>> robust, coherent, useable, general foundational scheme for general
>> mathematics. However, I am deeply interested in any alternatives
>> that are based on at least a prima facie coherent idea. Taking
>> infinitesimals is such an idea. That doesn't mean that we are in a
>> position to offer it up as a good alternative to ZFC. Instead, it
>> seems to be an alternative among others which needs to be explored,
>> and is being explored. There probably are many avenues of serious
>> foundational interest to go with it that have not yet been explored.
>
> Indeed IST is offered as such an alternative.

It looks more like an extension, not an alternative. I would call it
an alternative if you didn't layer it on top of ZF(C).
>
>> The lack of categoricity and the intrinsic undefinability issues are
>> both major drawbacks for NSA as any kind of prima facie replacement of
>> ZFC. However, for me, that does not mean that there aren't some very
>> good reasons for taking a hard look at NSA or NSM. After all, long
>> before we really had a decent grasp of epsilon/delta we (e.g., Newton
>> and Leibniz) were casting our mathematics (in and around calculus) in
>> these terms. Obviously it has a lot of conceptual attractions. There
>> are other things that we have wrung out of the mathematical setup that
>> also need to be revisited. I won't go into them here.
>
> There is no lack of categoricity since you are working with the
> ordinary real numbers when you are in IST.  All the theorems proved in
> ZFC are still valid, and in particular the categoricity result.  What
> is new is a one-place predicate "st" which enriches the language of
> the theory.  This can be thought of as an implementation of Leibniz's
> ideas on the distinction between "assignable" and "inassignable"
> numbers.  In modern terminology, the assignable ones would be the "st"
> (standard) ones.

Then the categoricity problem shifts to whether there is a categorical
way to pick out the two relevant ordered fields, R and R* - or if you
take this idealistic stance, is there a categorical way of picking out
R from R*? It seems like you are going to have to take an aggressive
philosophical stance in order to say that the categoricity issue has
been buried.

> An infinitesimal will be a nonstandard real number smaller than every
> positive standard real number.  Again, "standard" means a number on
> which the predicate "st" tests positive, and "nonstandard" otherwise.
> The "enrichment of the language" in question is the introduction of
> the new predicate.  The new axiomatisation satisfies all the ZFC
> axioms without change, plus three new axioms that involve the new
> predicate.  Any theorem you prove in ZFC is still true in the new
> framework, but the richer language enables a broader range of
> arguments, including those exploiting infinitesimals.
>
Just give a little more detail, including those three new axioms you
refer to. Then we can hopefully take this to a productive level.

And I think the FOM subscribers would be interested in Sanders
continuing with a SLOW explanation of at least a couple of
fundamental results of his, or more or less his. I think Sanders
thinks a lot about Reverse NSA? But I don't think Sanders wants to
sit on top of ZFC.

Harvey Friedman
```
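
For reference, the system PA* described in the message above can be written out as follows. This is a reconstruction from the prose description only, not Friedman's original 1966 formulation; in particular, the message does not say whether any axioms governing the predicate "st" beyond axiom 3 are intended.

```latex
% Language of PA*: 0, S, +, \cdot, plus a unary predicate st(x)
% read as "x is standard".

% 1. The usual axioms for successor, + and \cdot:
\forall x\, \neg (Sx = 0) \qquad
\forall x\, \forall y\, (Sx = Sy \rightarrow x = y)
\forall x\, (x + 0 = x) \qquad
\forall x\, \forall y\, (x + Sy = S(x + y))
\forall x\, (x \cdot 0 = 0) \qquad
\forall x\, \forall y\, (x \cdot Sy = x \cdot y + x)

% 2. Induction for every formula \varphi of the extended language
%    (occurrences of st in \varphi are allowed):
\bigl( \varphi(0) \land \forall x\, (\varphi(x) \rightarrow \varphi(Sx)) \bigr)
  \rightarrow \forall x\, \varphi(x)

% 3. There exists a nonstandard integer:
\exists x\, \neg\, \mathrm{st}(x)
```

Conservativity over PA here means: any sentence in the language of PA (i.e., not mentioning "st") that is provable in PA* is already provable in PA.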