[FOM] Solution (?) to Mathematical Certainty Problem

Harvey Friedman friedman at math.ohio-state.edu
Thu Jun 26 23:37:29 EDT 2003


Reply to Lindauer 5:05AM 6/26/03.

I am not sure that we have, to quote myself,

"agreed on these fundamental aspects of the setup, so that there is 
any point in continuing the discussion."

Nevertheless, I will continue the discussion a bit.

It appears that the thrust of what I am saying is that

**in order for us to achieve absolute certainty, or some particular 
brand of certainty, about ALL of our rigorous mathematical knowledge, 
we need only achieve absolute certainty, or some particular brand of 
certainty, about a TINY PORTION of our rigorous mathematical 
knowledge, together with very weak assumptions about physical 
processes, far weaker than is contained in any substantial physical 
theory. In fact, it is promising that all, or almost all, of the 
physical component can be replaced by weak statistical reasoning.**

This seems to be a genuine reduction, and as such, is very striking. 
Of course, in some sense, this idea has been lurking around for quite 
some time, well before I came to see it. But perhaps it has not been 
stated so explicitly, together with a program by which one can, in 
fact, make the relevant TINY PORTION more convincing than would be 
a priori expected. I have not gotten into the serious matter of 
analyzing how weak the rigorous physical process theory can be, nor 
how weak the rigorous statistical reasoning can be.

In fact, this may be a perfect place to do some things that have never 
been done successfully before. Why? Because this context is very 
specific, and the cards are stacked very much in our favor, since one 
has only to "verify" just a little bit of stuff.

1. Write a generally applicable multitape Turing machine program 
(CORE PROOF CHECKER) and rigorously specify its behavior and verify 
it.

2. Analyze actual electronic hardware to run it on, and see how weak 
the physical statements have to be in order to argue that we have the 
appropriate kind of infallibility.

3. Do 2 rigorously, perhaps invoking various physical redundancy 
procedures, as well as error correcting codes and the like, but in a 
totally rigorous, verified setting.

4. Analyze all relevant statistical reasoning rigorously. Rigorous 
foundations for statistical reasoning don't really exist up to the 
standards we expect here in such a project. But in this narrowly 
focused situation, they probably can be handled appropriately. This 
might lay the groundwork for pathbreaking foundational ideas 
concerning statistical reasoning.
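Friedman gives no details of the CORE PROOF CHECKER beyond its being a verified multitape Turing machine program. Purely as an illustration of the kind of object item 1 asks for, here is a toy checker (my own sketch, not his) for Hilbert-style proofs in the implicational fragment of propositional logic: axiom schemas K and S plus modus ponens. The point is how little code the trusted core needs to be.

```python
# Toy illustration only, NOT Friedman's CORE PROOF CHECKER.
# Formulas are atoms (strings) or tuples ("->", A, B).

def match_K(f):
    # K schema: A -> (B -> A)
    return (isinstance(f, tuple) and len(f) == 3 and f[0] == "->"
            and isinstance(f[2], tuple) and len(f[2]) == 3
            and f[2][0] == "->" and f[2][2] == f[1])

def match_S(f):
    # S schema: (A -> (B -> C)) -> ((A -> B) -> (A -> C))
    try:
        op1, left, right = f
        op2, A, BC = left
        op3, B, C = BC
    except (ValueError, TypeError):
        return False
    if not (op1 == op2 == op3 == "->"):
        return False
    return right == ("->", ("->", A, B), ("->", A, C))

def check_proof(lines):
    """True iff each line is a K or S instance, or follows from two
    earlier lines by modus ponens (from A and A -> B, infer B)."""
    proved = []
    for f in lines:
        ok = match_K(f) or match_S(f) or any(
            ("->", a, f) in proved for a in proved)
        if not ok:
            return False
        proved.append(f)
    return True

# The classical five-line proof of p -> p:
imp = lambda a, b: ("->", a, b)
pp = imp("p", "p")
proof = [
    imp(imp("p", imp(pp, "p")), imp(imp("p", pp), pp)),  # S instance
    imp("p", imp(pp, "p")),                              # K instance
    imp(imp("p", pp), pp),                               # modus ponens
    imp("p", pp),                                        # K instance
    pp,                                                  # modus ponens
]
print(check_proof(proof))            # True
print(check_proof([imp("p", "q")]))  # False: not an axiom, no premises
```

A real core checker would of course cover full first-order logic over ZFC and be specified and verified at the machine level; the sketch only shows that the checking logic itself is a small, fixed piece of code.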

But a skeptic would say: where are you going to get any absolute 
certainty out of this? Once you admit statistical reasoning as part 
of the mix, you have already admitted defeat, since you don't get 
100% certainty from such considerations.

I was waiting for this question.

It is considered very reasonable - if not very likely - that the 
universe is COARSE: that the universe has been around for only a 
reasonably small amount of time, that there have been only a small 
number of states, a small number of events, etcetera.

So under this viewpoint, if a probability is rigorously established, 
in some appropriate sense, to be less than, say, 2^-1,000,000, then 
that is not distinguishable in reality from 0, perhaps in much the 
same way that a distance of that size in meters is not 
distinguishable from 0, because of coarseness.

So once one admits coarseness, according to this point of view, in 
order to achieve absolute certainty, we need only get down to 
probabilities of failure at levels like 2^-1,000,000, or some such 
number.

But that is easily imaginable with very clever redundancies, error 
correction, etcetera.
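Friedman leaves the statistical model open, but a back-of-the-envelope sketch shows why the target is imaginable. Assume - and this independence assumption is exactly what the rigorous analysis of items 2-4 would have to justify - that each run of the verified checker fails independently with probability p, and that the overall procedure fails only if every run fails. Then k runs fail with probability at most p^k, and solving p^k <= 2^-1,000,000 gives the required redundancy:

```python
from math import ceil, log2

def runs_needed(per_run_error, target_exponent=1_000_000):
    """Crude independence model (an illustrative assumption, not a
    physical analysis): each run fails independently with probability
    p, and the procedure fails only if all runs fail, so k runs fail
    with probability at most p**k.  Return the smallest k with
    p**k <= 2**(-target_exponent)."""
    return ceil(target_exponent / -log2(per_run_error))

print(runs_needed(2**-20))  # 50000 runs at a roughly one-in-a-million error rate
print(runs_needed(2**-40))  # 25000 runs at a one-in-a-trillion error rate
```

So even with a per-run error rate as pessimistic as 2^-20, a few tens of thousands of independent runs would drive the failure probability below the coarseness threshold.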

Harvey Friedman
