[FOM] Computational Nonstandard Analysis

katzmik at macs.biu.ac.il
Tue Sep 1 10:21:50 EDT 2015

On Tue, September 1, 2015 08:26, Harvey Friedman wrote:
> Thanks! So A. Robinson used the compactness route and Luxemburg the
> ultrapower route? Might be convenient to have you mention a readable
> history of nonstandard analysis for people who don't work in it, like
> me and most subscribers.

OK. In 1948 Edwin Hewitt published an article in which he both introduced
the term "hyper-real" (to describe the appropriate ideal) and constructed
such a field using a kind of ultrapower construction.

In 1955 Jerzy Łoś proved what is now called Łoś's theorem, to the effect
that the ultrapower preserves all first-order properties (from which the
transfer principle is an immediate corollary).
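For readers who have not seen it, the statement can be sketched as follows (a standard textbook formulation, assuming structures M_i indexed by a set I and an ultrafilter U on I; this is not quoted from Łoś's paper):

```latex
% Łoś's theorem: a first-order formula holds in the ultraproduct
% exactly when it holds in U-almost-all factors.
\prod_{i \in I} M_i \,/\, \mathcal{U} \;\models\; \varphi\bigl([f_1],\ldots,[f_n]\bigr)
\quad\Longleftrightarrow\quad
\bigl\{\, i \in I : M_i \models \varphi\bigl(f_1(i),\ldots,f_n(i)\bigr) \,\bigr\} \in \mathcal{U}
```

Taking each M_i to be the real field R, indexed over the natural numbers as in Luxemburg's setting, yields the transfer principle for the hyperreal field *R = R^N/U.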

In 1961 Abraham Robinson published his first article on non-standard
analysis (NSA), eventually followed by the 1966 book.

In the meantime, Wim Luxemburg popularized the ultrapower approach in a
simpler setting than Hewitt's (i.e., indexing over the natural numbers).

In 1977 Edward Nelson provided an axiomatisation of Robinson's
framework called Internal Set Theory (IST) which can be thought of as
a syntactic approach to NSA.  Karel Hrbacek independently developed a
syntactic axiomatisation at about the same time.

> Because Nelson was so involved in the "claim" that the exponential
> function is not total on the natural numbers, I automatically assumed
> that he must have an idiosyncratic setup for nonstandard analysis.
> Especially, that he must be working with a construction that can be
> done in a tiny fragment of arithmetic.

No, Nelson's work related to NSA involves a conservative extension of ZFC.
There are some finitist ideas at the basis of IST, but Nelson's work on
fragments of PA is not directly related to this.

>> (4) Contrary to Friedman's claim, NSA has NOT been used "mostly for doing
>> rather abstract analysis".
> What I meant was that for mathematical applications of note to
> mathematicians, the target mathematics is rather abstract analysis.
> Now this may still be false or misleading?

One result that experts in the field like to point to is the recent
solution of the local version of Hilbert's 5th problem (on the
characterisation of Lie groups) by Lou van den Dries and Isaac Goldbring.
Lie groups sound rather concrete.  This result does not have an
alternative proof in an Archimedean setting.

> Of course, I view ZFC as, at this time, the most convenient, robust,
> coherent, useable, general foundational scheme for general
> mathematics. However, I am deeply interested in any alternatives that
> are based on at least a prima facie coherent idea. Taking
> infinitesimals is such an idea. That doesn't mean that we are in a
> position to offer it up as a good alternative to ZFC. Instead, it
> seems to be an alternative of among others which needs to be explored,
> and is being explored. There probably are many avenues of serious
> foundational interest to go with it that have not yet been explored.

Indeed IST is offered as such an alternative.

> The lack of categoricity and the intrinsic undefinability issues are
> both major drawbacks for NSA as any kind of prima facie replacement of
> ZFC. However, for me, that does not mean that there aren't some very
> good reasons for taking a hard look at NSA or NSM. After all, long
> before we really had a decent grasp of epsilon/delta we (e.g., Newton
> and Leibniz) were casting our mathematics (in and around calculus) in
> these terms. Obviously it has a lot of conceptual attractions. There
> are other things that we have wrung out of the mathematical setup that
> also need to be revisited. I won't go into them here.

There is no lack of categoricity since you are working with the
ordinary real numbers when you are in IST.  All the theorems proved in
ZFC are still valid, and in particular the categoricity result.  What
is new is a one-place predicate "st" which enriches the language of
the theory.  This can be thought of as an implementation of Leibniz's
ideas on the distinction between "assignable" and "inassignable"
numbers.  In modern terminology, the assignable ones would be the "st"
(standard) ones.

> Better take a complete ordered field.

It still is, in the usual sense.

> You have stopped your account at the interesting place. Rather than
> simply refer the FOM readers to an article, it would be interesting
> for you to elaborate. For example, do you have an example of an
> infinitesimal "within the real numbers themselves"? And if you must
> enrich the language, you could explain briefly just what enrichment of
> the language you have in mind?
> Harvey Friedman

An infinitesimal will be a nonstandard real number smaller than every
positive standard real number.  Again, "standard" means a number on
which the predicate "st" tests positive, and "nonstandard" otherwise.
The "enrichment of the language" in question is the introduction of
the new predicate.  The new axiomatisation satisfies all the ZFC
axioms without change, plus three new axioms that involve the new
predicate.  Any theorem you prove in ZFC is still true in the new
framework, but the richer language enables a broader range of
arguments, including those exploiting infinitesimals.
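As a sketch of the formal setup (the usual textbook rendering of Nelson's Transfer schema and of the infinitesimal definition; the three axiom schemata referred to above are Idealization, Standardization, and Transfer):

```latex
% Transfer: for an internal formula phi whose parameters are standard,
% truth for all standard x implies truth for all x.
\forall^{\mathrm{st}} x \,\varphi(x) \;\Longrightarrow\; \forall x \,\varphi(x)

% Infinitesimal: smaller in absolute value than every positive
% standard real; a nonzero such epsilon is necessarily nonstandard.
\varepsilon \approx 0 \;\iff\; \forall^{\mathrm{st}} r \in \mathbb{R}\,\bigl( r > 0 \;\Rightarrow\; |\varepsilon| < r \bigr)
```

Note that the quantifier "for all standard r" is what makes the definition possible within the ordinary real numbers: without the predicate "st" the condition would force epsilon to be 0.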
