[FOM] Machine-checkable proofs: NP, or worse?
Timothy Y. Chow
tchow at alum.mit.edu
Thu Jun 14 19:31:17 EDT 2007
Tacitly, many people think of the problem of finding feasible proofs of
mathematical theorems as an NP problem. That is, it might be hard to find
a proof, but once the proof is found, it is in principle straightforward
to write it down and check its correctness. I believe that this is also
one of the unstated assumptions behind the various projects that aim to
encode all the proofs in the mathematics textbooks and journals in
machine-checkable form. The idea is that we can achieve unparalleled
(albeit not perfect) certainty about the correctness of our collective
mathematical knowledge, because the program to verify a proof can be made
extremely short and simple. One can then achieve extremely high
confidence that the verifier itself is correct, and by running it on
different platforms, etc., one can be highly confident that when it
declares a certain proof correct, the proof is in fact correct.
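
To make the NP picture concrete, here is a toy sketch in Python (purely
illustrative; it is not the kernel of any actual proof assistant) of how
short such a verifier can be. It checks Hilbert-style proofs in which
every line is either a given axiom or follows from two earlier lines by
modus ponens, in time polynomial in the length of the proof.

    # Toy proof checker (illustration only).  Formulas are nested
    # tuples: ('->', A, B) means "A implies B"; atoms are strings.

    def check_proof(proof, axioms):
        """Return True iff every line of `proof` is an axiom or
        follows from two earlier lines by modus ponens."""
        verified = []
        for formula in proof:
            ok = formula in axioms or any(
                ('->', a, formula) in verified  # modus ponens step
                for a in verified
            )
            if not ok:
                return False
            verified.append(formula)
        return True

    # From the axioms p and p -> q, the proof [p, p -> q, q] checks.
    p, q = 'p', 'q'
    print(check_proof([p, ('->', p, q), q], {p, ('->', p, q)}))  # True

The entire trusted program fits on a page, and checking is fast once
the proof is in hand. That is the NP intuition.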
Numerous practical objections to this utopian dream can be raised. Once
these databases start becoming a practical reality, managing them can
become a complex project. Multiple copies may be produced; there may be
an irresistible temptation to economize by using abbreviations (thereby
increasing the complexity of the proof checkers) or by using databases of
theorems whose proofs are stored "elsewhere"; slightly different dialects
may emerge and it may be difficult to keep track of exactly which
definition of (say) "manifold" is intended; and so on. I'm sure these
concerns are well known to those who think about such things.
Here I want to raise another issue which I suspect has not received as
much attention. Namely, I want to question whether NP is really the
correct complexity class to have in mind here. What made me think about
this was Jonathan Schaeffer's announcement that the initial position of
checkers may be solved within a year.
If and when he does announce a solution, should we believe him? From the point
of view of machine-checkable proofs, we should really demand a
machine-checkable certificate that a standard proof verifier (knowing
nothing special about checkers) can validate. The catch, of course, is
that for a game like checkers, such proofs might be exponentially long.
The certificate for the initial position in checkers is probably not all
that big, but it does raise the question of whether the NP way of thinking
is really correct. One can certainly imagine mathematical theorems that
have feasible proofs in the sense that the "polynomial space" and the
"exponential time" required to verify their correctness are both within
the realm of feasibility, but where the "exponential space" required to
store the certificate is not feasible.
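
To see the shape of the problem in miniature, here is a toy in Python
(invented for illustration; it stands in for checkers only
schematically): a subtraction game in which a move takes 1 or 2 from n,
and a player with no move loses. Re-verifying a claimed game value by
depth-first search needs only space proportional to the depth, but
exponential time; the NP-style certificate (a strategy tree answering
every opponent reply) avoids the recomputation but can itself be
exponentially large.

    # Toy subtraction game: from position n, subtract 1 or 2; a player
    # with no move (n == 0) loses.  Illustration only.

    def moves(n):
        return [n - k for k in (1, 2) if n - k >= 0]

    def first_player_wins(n):
        """Re-verify "n is a first-player win" by depth-first search:
        space O(n) for the stack ("polynomial space"), but time
        exponential in n without memoization ("exponential time")."""
        return any(not first_player_wins(m) for m in moves(n))

    def certificate(n):
        """NP-style certificate of a win: one winning move, plus a
        sub-certificate for *every* opponent reply.  Stored explicitly,
        this strategy tree can be exponentially large."""
        for m in moves(n):
            if not first_player_wins(m):
                return (m, {r: certificate(r) for r in moves(m)})
        return None  # losing position: nothing to certify

    print(first_player_wins(7))  # True (a win iff n % 3 != 0)
    print(certificate(4))        # (3, {2: (0, {}), 1: (0, {})})

The strategy tree branches at every opponent reply, so its size grows
exponentially with the depth, while the depth-first re-check never needs
more than n stack frames.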
One might respond to this objection by dismissing checkers as an anomalous
instance and maintaining that "most mathematical proofs aren't like that." I would
personally be rather wary of such a generalization. Who knows what the
future may bring?
What I would like to see is either
1. an argument that this is not really a problem, for example because any
such scenario could be handled by a "universal" verifier that can
simulate an arbitrarily complex calculation while maintaining a fixed
upper bound on the amount of code that is required to be bug-free (a
toy sketch of this idea appears below); or
2. an argument that, subject to some complexity-theoretic assumptions,
this really *is* a potential problem and there's not much we can do
about it.
Does anyone see how to make either one of these arguments?
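
For concreteness, here is a toy sketch in Python of what option 1 might
look like (the three-instruction register machine is invented purely for
this illustration): the only code that must be bug-free is a small fixed
interpreter, and a claimed result, however complex the computation
behind it, is accepted only if re-executing the supplied program
reproduces it.

    # Toy "universal verifier" (illustration only).  The fixed
    # interpreter below is the entire trusted computing base, no matter
    # how complex the untrusted program it runs.

    def run(program, registers, fuel=10**6):
        """Trusted core: execute ('inc', r), ('dec', r), and
        ('jnz', r, target) instructions until the program counter
        leaves the program.  Returns the final registers, or None if
        the fuel budget is exhausted."""
        pc = 0
        while 0 <= pc < len(program):
            if fuel == 0:
                return None  # too long to re-check; reject the claim
            fuel -= 1
            op = program[pc]
            if op[0] == 'inc':
                registers[op[1]] += 1
            elif op[0] == 'dec':
                registers[op[1]] = max(0, registers[op[1]] - 1)
            elif op[0] == 'jnz' and registers[op[1]] != 0:
                pc = op[2]
                continue
            pc += 1
        return registers

    def verify_claim(program, inputs, claimed_output):
        """Accept a claimed result only if re-executing the (untrusted)
        program on the trusted interpreter reproduces it."""
        return run(program, dict(inputs)) == claimed_output

    # Untrusted program that moves register a's value into register b.
    prog = [('jnz', 'a', 2),    # 0: if a != 0, enter the loop
            ('jnz', 'one', 5),  # 1: else jump past the end (halt)
            ('dec', 'a'),       # 2: loop body: a -= 1 ...
            ('inc', 'b'),       # 3: ... b += 1
            ('jnz', 'a', 2)]    # 4: repeat while a != 0
    print(verify_claim(prog, {'a': 3, 'b': 0, 'one': 1},
                       {'a': 0, 'b': 3, 'one': 1}))  # True

Note the tradeoff: the trusted code base stays fixed and small, and
nothing need be stored beyond the program itself, but the checker must
repeat the entire computation, which brings back the "exponential time"
issue above.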
Tim