[FOM] Computational Nonstandard Analysis

katzmik at macs.biu.ac.il
Mon Aug 31 11:05:12 EDT 2015


Harvey's recent post on NSA (see below) seems to contain a number of
inaccuracies.  Some NSA/IST experts that I contacted did not seem interested
in providing a correction.  Therefore I will volunteer a reply.

As I see it the inaccuracies are at least the following:

(1) Contrary to Friedman's claim, Robinson did not use the ultrapower
construction to build the non-standard reals.  Luxemburg popularized this
construction, but in any case the experts in the field do not follow this
route, which is less powerful than the compactness-theorem route.

(2) Contrary to Friedman's claim, IST is not an obscure framework that "very
few" people other than Sanders know about, but rather a main point of
reference in contemporary NSA, as witnessed for example by the Kanovei-Reeken
book or by Gordon-Kusraev-Kutateladze.

(3) Contrary to Friedman's claim, a proper extension of R of the type
constructed by Robinson CANNOT be exhibited in ZF.

(4) Contrary to Friedman's claim, NSA has NOT been used "mostly for doing
rather abstract analysis".

(5) Harvey has requested a clarification with regard to Nelson's system in a
SLOW mode.  Reproduced below are some relevant comments from a recent article.

To elaborate on Nelson's IST approach to infinitesimals in a non-technical
way, note that the general mathematical public often takes the
Zermelo-Fraenkel theory with the Axiom of Choice (ZFC) to be THE foundation of
mathematics.  How much ontological reality one assigns to this varies from
person to person.  Some mathematicians distance themselves from any kind of
ontological endorsement, which is a formalist position in line with
Robinson's.  On the other hand, many do assume that the ultimate test of truth
and/or verifiability is in the context of ZFC, so that in this sense ZFC still
is THE FOUNDATION, though it is unclear whether Robinson himself would have
subscribed to such a view.

Much of the mathematical public attaches more significance to ZFC than merely
a formalist acceptance of it as the ultimate test of provability, and tends to
accept the existence of sets (where exactly they are to be found is a separate
question), including infinite sets, and particularly the existence of the set
of REAL numbers, which is often assumed to be built into the nature of REALITY
itself, as it were.  When such views are challenged, what one often hears in
response is an indignant monologue concerning the categoricity of the real
numbers, etc.  As G. F. Lawler put it in his
"Comments on Edward Nelson's `Internal Set Theory: a new approach to
nonstandard analysis'":

"Clearly, the real numbers exist and have these properties. Indeed, many
courses in elementary analysis choose not to construct the reals but rather to
take the existence of an ordered field as given. This is reasonable: we are
implicitly assuming such an object exists, otherwise, why are we studying it?"
(Lawler 2011).
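For reference, the categoricity fact being invoked is that the axioms for a
complete ordered field determine the real numbers up to unique isomorphism:

    % Any two Dedekind-complete ordered fields are isomorphic,
    % and the isomorphism is unique:
    (R_1, +, \cdot, <) \;\cong\; (R_2, +, \cdot, <).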

In such a situation infinitesimals may not thrive: the real numbers are
thought truly to EXIST, infinitesimals emphatically NOT.  This is where Nelson
comes in with his syntactic novelty and declares: Guess what, we can find
infinitesimals WITHIN THE REAL NUMBERS THEMSELVES if we are only willing to
enrich the language we speak!  The perceived ontological differences between
the real numbers and infinitesimals are therefore seen to be merely a function
of the technical choices, including syntactic limitations, imposed by Cantor,
Dedekind, Weierstrass, and their followers in establishing the foundations of
analysis purged of infinitesimals.
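Concretely, here is a minimal sketch of Nelson's formalism.  IST keeps the
language of ZFC and adds a single unary predicate st(x), read "x is
standard", governed by three axiom schemas: Idealization, Standardization,
and Transfer.  Transfer, for instance, reads:

    % For every internal formula phi (one not mentioning 'st')
    % with standard parameters:
    \forall^{\mathrm{st}} x\, \varphi(x) \;\rightarrow\; \forall x\, \varphi(x).

An infinitesimal is then an ordinary real x such that |x| < epsilon for every
standard real epsilon > 0, and Idealization guarantees that such x exist
among the real numbers of ZFC themselves, which is precisely the point made
above.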

The article in question can be viewed at
http://dx.doi.org/10.1017/S1755020315000015

MK

------------------------------ Original Message ------------------------------
Subject: [FOM] Computational Nonstandard Analysis
From:    "Harvey Friedman" <hmflogic at gmail.com>
Date:    Mon, August 31, 2015 00:13
To:      "Foundations of Mathematics" <fom at cs.nyu.edu>
------------------------------------------------------------------------------

SANDERS wrote (http://www.cs.nyu.edu/pipermail/fom/2015-August/018948.html)
re: [FOM] CFR and Programme: SoTFoM III and The Hyperuniverse Programme,
Vienna, 21-23 September 2015.

> Disclaimer:  Any mention of "Nonstandard Analysis" below refers to Nelson's
> internal set theory IST (and fragments), i.e. the syntactic approach to
> NSA.
> As Harvey pointed out, nonstandard models have all kinds of undefinability
> properties, the most well-known one is Tennenbaum's theorem.  As already
> noted by Nelson in his BAMS paper introducing IST, the latter DOES NOT
> suffer from Tennenbaum's theorem (and related undefinability issues).
> In other words, IST does not have the oft-cited drawbacks of the
> model-theoretic approach to NSA.  (IST does have other drawbacks though,
> see the literature).


Most people have something very very different in mind when they use
the phrase "nonstandard analysis". Following A. Robinson, they assume
one is talking about at least a proper elementary extension of
(R,Z,+,dot), which can even be proved to exist in ZF. As I wrote, there is
the impressive Kanovei/Shelah result showing that there is a formula of ZFC
which, provably in ZFC, defines such things and much more. However, as
I pointed out, getting at even some nonstandard integer like this is
impossible, and having the construction provably lie in L(R) is also
impossible. There are other impossibility statements I am looking at,
but am not ready to report further.
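For reference, the Kanovei/Shelah result has roughly the following shape:

    % Kanovei--Shelah, "A definable nonstandard model of the reals",
    % J. Symbolic Logic 69 (2004): there is a formula \Phi of set theory,
    % with no parameters, such that ZFC proves that \Phi defines a
    % proper elementary extension of (\mathbb{R}, +, \cdot, <).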

This usual kind of nonstandard analysis has been mostly used for doing
rather abstract analysis, far far far above what you are concerned with,
which is computational issues. HOWEVER, I am vaguely aware of some
situations where sometimes people like to use nonstandard arguments to get
some computational information out at the end, arguably "easier" or
"differently" than they would if they got their hands really dirty. WHAT
are the best examples of this? WHAT about forging tools to eliminate this,
and retain any computational content?

There are some conservativity results in Reverse Mathematics relevant here,
and probably they need to be fine-tuned. One is the conservativity of WKL_0
over RCA_0 for Pi02 sentences. There are others. There are also general
speedup issues as to whether the length of proofs blows up
exponentially, or even iterated-exponentially, etcetera.
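For reference, one standard formulation of the WKL_0 conservativity result
mentioned here:

    % If phi is Sigma^0_0 (say, with a primitive recursive matrix), then
    \mathrm{WKL}_0 \vdash \forall n\, \exists m\, \varphi(n,m)
    \;\Longrightarrow\;
    \mathrm{RCA}_0 \vdash \forall n\, \exists m\, \varphi(n,m).

In fact, by a result of Harrington, WKL_0 is conservative over RCA_0 for the
much larger class of Pi^1_1 sentences.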

So I can tell that you are living in a different galaxy than classical
nonstandard analysis in the sense of Abraham Robinson with his ultrapower
constructions.

So I would like to understand what you are doing SLOWLY. Very few readers
have any idea what IST is and what Nelson was up to. We need to start with
your explanation of Nelson.

It would be much more helpful to state Tennenbaum's Theorem which, I think,
is something about all nonstandard models of weak fragments of PA being
nonrecursive (their atomic diagram, not their theory). What is the threshold
for having a recursive nonstandard model or not? Also I gather you are NOT
just looking at nonstandard models of fragments of arithmetic, but rather at
fragments of second order arithmetic? This should be discussed, including
the possibility of introducing interesting primitives. Maybe that is what
you do.
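For reference, the usual statement, for full PA, is the following; how far it
descends to weak fragments is exactly the threshold question just raised:

    % Tennenbaum (1959): if M is a countable nonstandard model of PA,
    % then neither the addition nor the multiplication of M is recursive:
    M \models \mathrm{PA},\ M \text{ countable nonstandard}
    \;\Longrightarrow\; {+}^{M} \text{ and } {\times}^{M}
    \text{ are not recursive}.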

Since you are talking about nonstandard analysis in a very fine-grained,
delicate setting, we need to proceed with SLOW explanations.

2) In the case of Nelson's IST, the following is meant by "computational
> foundation":
> From the proof of a theorem of *pure* Nonstandard Analysis (i.e. only
> involving the nonstandard notions of continuity, compactness, ...), one
> can extract the proof of the associated 'effective' theorem.  The latter
> is either a theorem of constructive/computable analysis, OR an
> equivalence/implication from Reverse Math.  The 'effective' theorem has
> its existential quantifiers witnessed by terms also extracted from the
> original proof.
> Conclusion: In other words, one can just 'do pure Nonstandard Analysis'
> and extract the effective version at the end.  This effective version
> gives one a lot of computational content (as good as the proof can
> provide).  In this way, NSA provides a 'computational' foundation.
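Schematically, and glossing over the precise base theory (this is my
paraphrase, not Sanders' exact formulation), the term extraction described
above has roughly the following shape:

    % From a proof in the nonstandard system of a 'pure' statement
    %   \forall^{st} x\, \exists^{st} y\, \varphi(x,y)   (phi internal),
    % one extracts a term t, giving a finite list of candidate witnesses,
    % such that the effective base system proves
    \forall x\, \exists y \in t(x)\ \varphi(x,y).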

We need some slowly presented nice examples of such extractions. Also,
"computational" is a very multifaceted word, covering everything from just
about any countable piece of mathematics all the way down to something that
can be implemented in less than a day on a computer available from a major
computer company in 2015 for a reasonable price.

Harvey Friedman