[FOM] CFR and Programme: SoTFoM III and The Hyperuniverse Programme, Vienna, 21-23 September 2015.

Sam Sanders sasander at cage.ugent.be
Sat Aug 29 09:11:35 EDT 2015


Dear Harvey,

> Sam Sanders: ‘Non-standard analysis as a computational foundation.’
> 
> I INVITE ALL OF THE ABOVE AUTHORS TO POST ON THE FOM GIVING AT LEAST A
> SYNOPSIS OF THEIR MAIN POINTS AND FINDINGS THAT ARE RELEVANT TO F.O.M.

A great idea!  

> Posting only links without discussion can be very valuable, but the
> inclusion of generally understandable elaborations is generally much
> more helpful.

I post a link to a paper below, together with some explanation.  First, let me fix the topic of discussion:

Disclaimer:  Any mention of “Nonstandard Analysis” below refers to Nelson’s internal set theory IST (and fragments thereof), i.e. the syntactic approach to NSA.   
As Harvey pointed out, nonstandard models have all kinds of undefinability properties, the best-known being Tennenbaum’s theorem.  As already
noted by Nelson in the BAMS paper introducing IST, the latter DOES NOT suffer from Tennenbaum’s theorem (or the related undefinability issues).  
In other words, IST does not have the oft-cited drawbacks of the model-theoretic approach to NSA.  (IST does have other drawbacks, though; see the literature.)  
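
To fix notation for what follows: IST extends ZFC with a single new unary predicate st(x), read “x is standard”, governed by the axiom
schemas Idealisation, Standardisation, and Transfer.  The nonstandard notions mentioned below are then *defined* syntactically in terms
of “st”.  A rough sketch (my formulation, for illustration only; see Nelson’s paper for the official definitions):

   x \approx y  :=  (\forall^{st} \varepsilon > 0)(|x - y| < \varepsilon)        [“x and y are infinitely close”]

   f is nonstandard uniformly continuous on [0,1]  :=  (\forall x, y \in [0,1])(x \approx y \rightarrow f(x) \approx f(y))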

In a nutshell, the main point of my talk is the following:  

<begin>

HoTT and Univalent Foundations are usually promoted as a new *computational* foundation for mathematics.
The implicit suggestion seems to be that ZFC, the “old” foundation, is somehow not a computational foundation.  
Following new results on Nelson’s IST, I show that the latter *already* provides a computational
foundation for mathematics, and that it does so inside the “old” foundation ZFC.  

<end>

Of course, the notion of “computational foundation” must be made precise here:

1) In the case of HoTT, the use of Martin-Löf type theory is (to the best of my knowledge) what is meant by “computational foundation”.  
(There is also the formalisation in Coq, but ZFC has lots of formalised theorems too.)

2) In the case of Nelson’s IST, the following is meant by “computational foundation”:

From a proof of a theorem of *pure* Nonstandard Analysis (i.e. a theorem involving only the nonstandard notions of continuity, compactness, …), one can extract 
a proof of the associated “effective” theorem.  The latter is either a theorem of constructive/computable analysis, OR an equivalence/implication from 
Reverse Mathematics.  The effective theorem has its existential quantifiers witnessed by terms which are also extracted from the original proof.  
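
To make 2) concrete, the extraction takes (roughly, and glossing over the exact systems involved; see the paper for precise statements)
the following shape, going back to the nonstandard functional interpretation of van den Berg, Briseid and Safarik:

   If the nonstandard system proves  (\forall^{st} x)(\exists^{st} y)\,\varphi(x,y)   with \varphi internal,
   then from that proof one can extract a term t (of Goedel’s T) such that the
   “effective” system proves  (\forall x)(\exists y \in t(x))\,\varphi(x,y).

Note that t provides a *finite list* of candidate witnesses rather than a single witness; this is the Herbrand-style feature alluded to near the end of this post.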

Conclusion: In other words, one can just “do pure Nonstandard Analysis” and extract the effective version at the end.  This effective version carries
as much computational content as the original proof can provide.  In this way, NSA provides a “computational” foundation.  
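
As a loose illustration of the previous paragraph: the nonstandard definition of uniform continuity sketched earlier corresponds, after extraction,
to the usual epsilon-delta definition together with an explicit modulus, i.e. something of the form

   (\forall \varepsilon > 0)(\forall x, y \in [0,1])(|x - y| < t(\varepsilon) \rightarrow |f(x) - f(y)| < \varepsilon),

where the extracted term t (which in general also depends on the other data appearing in the theorem at hand) plays the role of a modulus of uniform continuity.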


The relevant paper regarding 2) should appear on arXiv soon and can be found here in the meantime:

https://dl.dropboxusercontent.com/u/4558565/Sanders_Unreasonable%20effectiveness%20of%20NSA.pdf

A lot more can be said (Herbrandisations!), but that will be the topic of other postings.  

Best,

Sam




